Study Identifies Enormous Risk in Public Pension Investments

Commentary

While the accounting practices adopted by most public pension plans have been extensively condemned in recent research, few people pay attention to the investment risks those plans take on. For example, a common criticism of public pension plan discount rates is that they understate pension liabilities by failing to reflect the risk of those liabilities, and that the expected returns embedded in those rates are unrealistically high. Yet the asset allocations designed to achieve those returns can yield a wide range of outcomes, and the distribution of those outcomes can significantly affect pension costs.

A recent paper by James Farrell at Florida Southern College and Daniel Shoag at Harvard’s Kennedy School explores this issue. The paper is based on a stochastic model that simulates investment returns, along with the resulting funded ratios and amortization costs, in present-discounted terms. (In a stochastic model the final output is not a single, predetermined result but a range of outcomes influenced by random elements, capturing the risky nature of investment returns.) This simulation approach contrasts with current GASB accounting guidelines, which focus exclusively on assumed deterministic outcomes and ignore investment risk. The Farrell/Shoag model also differs from much of the existing literature, which often assumes a normal distribution of returns or treats the plan’s discount rate as the long-term expected rate of return: instead, the simulation builds on a multi-asset-class portfolio, its historical cross-class correlations, and the non-normal distribution of returns.
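To make the mechanics concrete, here is a minimal sketch of this kind of bootstrap simulation in Python. The asset classes, return figures, and 60/40 weights below are illustrative assumptions, not the paper’s actual data or methodology:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical annual returns for a two-asset portfolio (stocks, bonds).
# The paper draws on actual 1986-2013 multi-asset history; these numbers
# are made up for illustration only.
hist = np.array([
    [0.18, 0.05], [-0.37, 0.09], [0.26, 0.03], [0.12, 0.06],
    [0.02, 0.08], [0.15, -0.02], [-0.09, 0.07], [0.31, 0.01],
])
weights = np.array([0.6, 0.4])   # illustrative 60/40 allocation
n_sims, horizon = 10_000, 25

# Resample whole years at a time, so each draw keeps that year's
# cross-asset relationship intact -- no normality assumption needed.
draws = rng.integers(0, len(hist), size=(n_sims, horizon))
port_returns = hist[draws] @ weights          # shape (n_sims, horizon)
growth = np.prod(1 + port_returns, axis=1)    # 25-year asset growth factors

print(np.percentile(growth, [25, 50, 75]))
```

Because entire years are resampled together, the historical cross-class correlations survive in the simulated paths without imposing a normal distribution on returns.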

There are several helpful and important insights to be gleaned from the Farrell/Shoag simulation model. First, the current state of a typical public pension fund is substantially risky. In their baseline simulation, the model produces a median annualized return of 10% (see more on this figure in the discussion below) with a standard deviation of 1.8%[1]. Over a twenty-five-year period this is significant because of compounding. Consider that the difference in asset growth between the 25th- and 75th-percentile outcomes is more than ten times the size of the initial asset value in the simulation. That is a lot of volatility.
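The power of compounding is easy to illustrate with simple arithmetic. The 7% and 10% rates below are chosen purely for illustration and are not the paper’s figures:

```python
# A 3-percentage-point gap in annualized returns, compounded over 25
# years, produces a large gap in final assets per dollar invested.
low = 1.07 ** 25    # 7% annualized for 25 years
high = 1.10 ** 25   # 10% annualized for 25 years
print(round(low, 2), round(high, 2), round(high - low, 2))
# prints: 5.43 10.83 5.41
```

A gap that looks modest year by year widens to more than five times the initial asset value by the end of the horizon, which is why return volatility matters so much over long funding periods.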

How does this return volatility affect funding costs? The authors find that under the same baseline simulation, the sum of the discounted amortization payments and the final unfunded liability has a standard deviation of roughly 7.25% of the total present discounted payroll over that period.

Moreover, the model shows that while shifting the asset allocation towards riskier assets can increase the median outcome, it also significantly increases the left-tail risk of the distribution. In other words, riskier investments improve the expected results while magnifying the potential losses. Yet this standard risk-reward tradeoff is mostly ignored in existing public accounting guidelines on setting the discount rate/assumed rate of return.
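A quick simulation shows the pattern. The two return distributions below are made up for illustration, not taken from the paper’s calibration; the point is only the shape of the tradeoff:

```python
import numpy as np

rng = np.random.default_rng(1)
n, horizon = 100_000, 25

# Hypothetical annual returns for a conservative and a riskier
# allocation (illustrative means and volatilities; returns are
# floored at -99% since assets cannot lose more than 100%).
safe = np.clip(rng.normal(0.06, 0.08, size=(n, horizon)), -0.99, None)
risky = np.clip(rng.normal(0.09, 0.20, size=(n, horizon)), -0.99, None)

g_safe = np.prod(1 + safe, axis=1)    # 25-year growth factors
g_risky = np.prod(1 + risky, axis=1)

for name, g in [("safe", g_safe), ("risky", g_risky)]:
    print(name, round(np.median(g), 2), round(np.percentile(g, 5), 2))
```

The riskier allocation ends with a higher median 25-year outcome but a noticeably worse 5th-percentile outcome: exactly the tradeoff the accounting guidelines fail to capture.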

One important caveat is that because the simulation model is based on historical returns between 1986 and 2013, even the baseline scenario is optimistic. That time frame was generally favorable for investment performance thanks to exceptional economic conditions that are unlikely to recur over the next few decades. This means that the median annualized return and the accompanying return distribution produced by the model are probably higher than what forward-looking financial forecasts suggest for the future.

As might be expected, the model shows that adopting a lower discount rate and paying the required contributions in full generally lead to better funding status. This holds true even after separating the discount rate’s planning effect, which determines the level of required contributions, from its accounting effect, which determines the size of the estimated unfunded liability.

Ultimately, the Farrell/Shoag paper demonstrates the importance of reporting the risks public pension plans currently face, and of having the right risk measures in place for that reporting. One key lesson is that measures like funded ratios and amortization payments cannot, by themselves, be used to assess how actuarial assumptions (such as the discount rate and amortization period) affect the distribution of outcomes, or to compare riskiness across plans with different assumptions, because these measures are themselves sensitive to those very assumptions. It is therefore important to calibrate such risk measures using a constant set of assumptions and a clear baseline when performing these analyses.

[1] Standard deviation is a statistical measure of volatility, indicating how “spread out” the data are from the mean. In finance, the standard deviation is a measure of risk.

Truong Bui is a policy analyst at Reason Foundation, where he works on the Pension Integrity Project.