Commentary

Comparing the Performance of Private and Public Prisons

If you can't win, change the rules

The Arizona Department of Corrections (ADC) recently released its latest cost comparison study between ADC-operated facilities and private facilities operated for the state. State law requires these periodic reviews, and the previous studies found that the private facilities operated with significantly fewer tax dollars than their government-run counterparts, achieving cost savings of 17, 13.6, and 10.8 percent in 1997, 1998, and 1999, respectively. Despite this history and a clear track record of success throughout the country, the latest ADC review suggests a curious shift: that private prison costs were 8.5 and 13.5 percent higher than state costs.

Even a cursory review of the study finds a logical explanation for the quick turnaround: If you change the rules, you can change the results.

After several years of study using a widely accepted methodology that consistently demonstrated the success of private prisons, the ADC chose to change the rules rather than compete. Put simply, the new cost analysis removes costs traditionally attributed to the ADC from its bottom line while adding costs to the private facilities. This type of accounting can only be described as “Enron-like”: the accounting rules are determined by the desired outcome.

To be fair, not all of the changes are necessarily wrong or unfair. For example, contract administration and oversight costs are added to the average daily cost of private facilities, adding $2 to the daily bottom-line cost. This is a common and accepted practice; indeed, it was undertaken in the previous two reviews as well.

However, this raises a question: Who is more accountable, public or private facilities?

In this review, no additional oversight costs are charged to ADC facilities. It seems that public and private facilities are held to different standards, and that the private facilities receive a higher level of oversight, given the extra cost to administer and monitor them. Since actual contract administration represents only a tiny fraction of the cost added to private facilities, ADC facilities must not have the same level of oversight, or the same costs would have been applied to them. Thus, the extra cost reflects an extra benefit of private prisons: closer scrutiny.

In another example of Enron accounting, the review also eliminates some costs for the ADC facilities, pretending that money was not spent by the state. For example, it removes $1.09 from the daily cost of ADC beds for the work incentive plan because this cost is “entirely borne by the state.” Well, of course it is; it’s a state program, isn’t it? For comparison’s sake, you can’t just ignore some costs. If the ADC were operating the private facilities, this program would remain in place, and those costs would continue to be borne by the state. Thus, they should be included in any review of the true cost of operating an ADC facility.

The study’s authors suggest that these changes should be made because “historical” data can only tell you so much, and because ADC may develop a better design and generate savings in future projects. While this may be true, historical data is terribly important: it provides a baseline and an expectation. For example, since the first introduction of private prisons in Arizona, the real cost has declined from $47.27 a day in 1995 to $44.77 in 2004. Over the same period, public facilities have seen a double-digit percentage increase ($43.79 to $53.63).

If the data were allowed to speak for itself, before adjustments (additions and subtractions), the private facilities would continue to fare very well. Consider this: the average costs of ADC facilities in 2003 ($46.90) and 2004 ($47.30) are both higher than the average private costs ($43 and $46.57), even after adding the oversight charge to the bottom line.

Another key consideration is that private facilities can control only their own costs; they have no influence over state administrative or oversight costs. The data suggest that they have a superior ability to control costs and prevent escalation. The average daily rates over the course of the three studies are evidence enough: where private facilities have had direct control over their costs, those costs have gone down, while ADC’s went up in every category.

Beyond this, the new study format also fails to consider the relative quality of public and private prisons, a true disservice. While costs are important, quality and performance are just as important, if not more so, and should be included in any consideration.

That’s where a closer look identifies some fundamental flaws in the analysis.

First, the formula used to generate the average daily cost is inconsistent, and the authors fail to justify it. The state’s own Per Capita Cost Documentation reports that private prisons managed 1,678 inmates in 2003 and 1,685 in 2004, including temporary and emergency beds that the private facilities operated inside existing facilities. However, the analysis uses only 1,250 as the bed count for private facilities. This inflates the true per-bed/per-inmate cost by several dollars a day.

Second, the analysis leaves out construction and start-up costs. Private facilities factor these costs into their contract price up front and recover them over the life of the contract. The state, however, separates capital and operational expenditures and does not account for these costs in its per-bed/per-day figure. These costs conservatively add between $3 and $5 per bed and should be added to the in-house cost.

Third, while the analysis attempts to capture only those costs borne by both types of facilities, it fails to subtract the special treatment costs that each private facility provides under contract but the state does not. While there is no way to account for these costs given the available data, they certainly skew the results; private facilities’ price would drop by at least $1 if these services were removed from the evaluation.

At the end of the day, citizens don’t care who is providing a service as long as it is being provided effectively. Results and performance are what ultimately matter to taxpayers, not whether a private or public employee does the work. In order to generate real discussions about improving performance in our correctional systems, we need to get away from ideology and partisanship. Policy debates need to focus on results, performance, and achieving the best outcome with the limited resources available. Failing programs, whether public or private, should be halted in favor of better-performing programs. The debate should move away from public vs. private and toward performing vs. non-performing. It shouldn’t matter what type of organization produces the best outcomes, so long as outcomes are achieved.

Just as in the cost category, the private prisons fared exceptionally well in the previous years’ studies. In fact, the 1997 review determined that the private facility (there was only one in the state at the time) was superior in public safety, in protecting staff and inmates, and in compliance with professional standards.

While not as one-sided, the 2000 findings were significant. Government and private prisons were compared on 10 dimensions, including security, food service, facility safety and sanitation, and inmate health services. In the first year, the private prisons outperformed ADC prisons in seven of the 10 dimensions; in the second year, private facilities and ADC split them five to five.

Perhaps the Maximus analysis did not set out to review relative quality because its authors couldn’t “justify” changing those rules?