Estimating Asset Allocation Inputs
A number of different techniques are also used to improve the estimates of future asset class returns, standard deviations, correlations, and other inputs used by various asset allocation methodologies. Of these variables, future returns are the hardest to predict. One approach to improving return forecasts is to use a model containing a small number of common factors to estimate future returns on a larger number of asset classes. In some models, these factors are economic and financial variables, such as the market/book ratio, industrial production, or the difference between long- and short-term interest rates. Perhaps the best-known factor model is the capital asset pricing model (CAPM). It is based on the assumption that, in equilibrium, the return on an asset will equal the risk-free rate of interest plus a risk premium proportional to the asset’s riskiness relative to the overall market portfolio. Although they simplify the estimation of asset returns, factor models also have limitations, including the need to forecast their input variables accurately and their assumption that markets are usually in equilibrium.
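As a minimal sketch of the CAPM relationship described above, the following Python snippet computes expected returns from a risk-free rate, a market return forecast, and asset betas. All numbers here are assumed purely for illustration, not recommendations:

```python
def capm_expected_return(risk_free, beta, market_return):
    """CAPM: E[R_i] = r_f + beta_i * (E[R_m] - r_f)."""
    return risk_free + beta * (market_return - risk_free)

# Illustrative (assumed) inputs: 3% risk-free rate, 7% expected market return,
# and hypothetical betas for three broad asset classes.
for name, beta in [("Equities", 1.2), ("Real estate", 0.8), ("Bonds", 0.3)]:
    print(name, capm_expected_return(0.03, beta, 0.07))
```

Note that the model reduces the forecasting problem to two market-level inputs plus one beta per asset class, which is exactly why factor models simplify estimation, and why errors in those few inputs propagate to every asset class at once.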
The latter assumption lies at the heart of another approach to return estimation, known as the Black–Litterman (BL) model. Assuming that markets are in equilibrium enables one to use current asset class market capitalizations to infer expectations of future returns. BL then combines these, in a consistent manner, with an investor’s own subjective views to arrive at a final return estimate. More broadly, BL is an example of a so-called shrinkage estimation technique, whereby more extreme estimates (for example, the highest and lowest expected returns) are shrunk toward a more central value (for example, the average return forecast across all asset classes, or BL’s equilibrium market-implied returns). At a still higher level, shrinkage is but one version of model averaging, which has been shown to increase forecast accuracy in multiple domains. An example of this could be return estimates that are based on the combination of historical data and the outputs from a forecasting model.
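To make the shrinkage idea concrete, the sketch below pulls a set of stand-alone return forecasts toward their cross-sectional average. The forecasts and the shrinkage intensity are assumed for illustration only; the full BL model, which blends market-implied returns with subjective views via their respective uncertainties, is considerably more involved:

```python
def shrink_estimates(raw, intensity):
    """Shrink each raw estimate toward the cross-sectional mean.

    intensity = 0 leaves the estimates unchanged;
    intensity = 1 collapses them all to the mean.
    """
    target = sum(raw) / len(raw)
    return [intensity * target + (1 - intensity) * r for r in raw]

raw = [0.12, 0.06, 0.03]          # assumed stand-alone return forecasts
print(shrink_estimates(raw, 0.5))  # extremes pulled toward the 7% average
```

The highest and lowest forecasts move the most, which is the defining property of shrinkage: it tempers exactly the estimates most likely to reflect estimation error.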
When it comes to improving estimates of standard deviation (volatility) and correlations, one finds similar techniques employed, including factor and shrinkage models. In addition, a number of traditional (for example, moving averages and exponential smoothing) and advanced (for example, GARCH and neural network models) time-series forecasting techniques have been used as investors search for better ways to forecast volatility, correlations, and more complicated relationships between the returns on different assets. Finally, copula functions have been employed with varying degrees of success to model nonlinear dependencies between different return series.
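The simplest of the time-series techniques mentioned above, exponential smoothing of squared returns (the EWMA model popularized by RiskMetrics, and a close relative of GARCH), can be sketched as follows. The return series and decay factor are illustrative assumptions:

```python
import math

def ewma_volatility(returns, lam=0.94):
    """Exponentially weighted moving-average volatility:
    var_t = lam * var_{t-1} + (1 - lam) * r_{t-1}^2,
    seeded here with the first squared return.
    """
    var = returns[0] ** 2
    for r in returns[1:]:
        var = lam * var + (1 - lam) * r ** 2
    return math.sqrt(var)

returns = [0.01, -0.02, 0.015, -0.005, 0.03]  # assumed daily returns
print(ewma_volatility(returns))
```

The decay factor lam controls how quickly old observations are forgotten; 0.94 is the value RiskMetrics proposed for daily data, but it is an assumption here, not a universal constant.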
In summary, although they are improving and becoming more robust to uncertainty, almost all quantitative approaches to asset allocation still suffer from various limitations. In a complex adaptive system this seems unavoidable, since its evolutionary processes make accurate forecasting extremely difficult with existing techniques. This argues strongly for averaging the outputs of different methodologies as the best way to make asset allocation decisions in the face of uncertainty. Moreover, these same evolutionary processes can sometimes give rise to substantial asset class over- or undervaluation that lies outside the input assumptions used in the asset allocation process. Given this, relatively passive risk management approaches such as diversification and rebalancing occasionally need to be complemented with active hedging measures such as moving to cash or buying options. Implementing this process effectively requires not only ongoing attention to asset class valuations, but also a shift in focus from external performance metrics to achieving the long-term portfolio return required to reach one’s goals. When the objective is to outperform peers or an external benchmark, it is tempting to stay too long in overvalued asset classes, as many investors painfully learned in 2001 and again in 2008.
Making It Happen
Using broadly defined asset classes minimizes correlations and creates more robust solutions by reducing the sensitivity of results to deviations from assumptions about future asset class returns, which are the most difficult to forecast.
Equal dollar weighting is the logical default asset allocation when one assumes that none of the inputs can be reliably predicted.
However, there is considerable evidence that the relative riskiness of different asset classes is reasonably stable over time and therefore predictable. This makes it possible to move beyond equal weighting and to use risk budgeting. There is also evidence that different asset classes perform better under different economic conditions, such as high inflation or high uncertainty. This makes it possible to use scenario-based weighting.
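One simple form of risk budgeting is inverse-volatility weighting, which exploits the relative stability of asset class riskiness noted above. The long-run volatilities below are assumed for illustration:

```python
def inverse_volatility_weights(vols):
    """Risk-budgeting sketch: weight each asset class in inverse
    proportion to its volatility, so that each contributes a similar
    amount of stand-alone risk to the portfolio."""
    inv = [1.0 / v for v in vols]
    total = sum(inv)
    return [i / total for i in inv]

# Assumed volatilities: equities 15%, real estate 10%, bonds 5%.
vols = [0.15, 0.10, 0.05]
print(inverse_volatility_weights(vols))
```

Unlike mean–variance optimization, this rule needs no return forecasts at all, only relative risk estimates, which is precisely why it is more robust; its limitation is that it ignores correlations as well as returns.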
Techniques such as mean–variance optimization and stochastic search are more problematic, because they depend on the accurate prediction of future returns. Although new approaches can help to minimize estimation errors, they cannot eliminate them or change the human behavior that gives rise to bubbles and crashes. For that reason, all asset allocation approaches require not only good quantitative analysis, but also good judgment and continued risk monitoring, even after the initial asset allocation plan is implemented.
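The sensitivity of mean–variance optimization to return forecasts can be demonstrated with a two-asset sketch, using the closed-form unconstrained solution w = (1/γ)·Σ⁻¹μ. All inputs are assumed; note how a one-percentage-point change in a single return forecast materially shifts the "optimal" weights:

```python
def mv_weights_2asset(mu, vols, rho, risk_aversion=3.0):
    """Unconstrained two-asset mean-variance weights,
    w = (1 / gamma) * inverse(Sigma) @ mu,
    with the 2x2 covariance matrix inverted in closed form."""
    s1, s2 = vols
    cov = rho * s1 * s2
    det = (s1 ** 2) * (s2 ** 2) - cov ** 2
    inv = [[s2 ** 2 / det, -cov / det],
           [-cov / det, s1 ** 2 / det]]
    return [(inv[i][0] * mu[0] + inv[i][1] * mu[1]) / risk_aversion
            for i in (0, 1)]

# Assumed inputs: 15%/10% volatilities, 0.3 correlation.
print(mv_weights_2asset([0.06, 0.04], [0.15, 0.10], 0.3))
print(mv_weights_2asset([0.07, 0.04], [0.15, 0.10], 0.3))  # +1pt on asset 1
```

The large swing in weights from a small forecast change is the classic "error maximization" critique of the optimizer, and it is why the judgment and monitoring described above remain essential.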