Last week, the Oakland-based company SustainLane released the results of their research attempting to rank 25 U.S. cities with regard to their "sustainability." According to their website, "the peer-reviewed study is the first and most comprehensive US city sustainability performance benchmark." Not surprisingly, we see Smart Growth vanguards like Portland, Berkeley, Santa Monica, and Austin among those at the top of the list, while the much-maligned, "sprawl" poster children Detroit and Houston reside at the bottom.
But this sustainability ranking unfortunately seems to be based more on intent and commitment to the progressive agenda than on any objective basis of measurement or coherent definition of sustainability.
This leads to a natural starting point: how do they define sustainability?
- The SustainLane US City Rankings focus on healthy regional economic development, vibrant communities and quality of life measurements. Our viewpoint of sustainable practices is weighted toward ideas borrowed from our natural systems and implemented in our cities, particularly those geared toward the revitalization of our economy and public health.
That sounds nice, but what does it mean? Sustainability discussions usually revolve around balancing the 3E's (environment, economy, and social equity). But SustainLane's definition -- which one would assume would be clearly stated given the attempt to construct a meaningful ranking system from it -- is full of style and devoid of substance.
For example, it's reasonable to assume that an index based on "healthy regional economic development" might include some measure of the sectoral diversity of the economy (e.g., is the economy broad-based or overly reliant on one sector?). But a glance at the methodology reveals the complete omission of any local or regional economic indicators. Social equity appears to be similarly absent from the index. You'd think that a sustainability index that truly considered the equity component would include such basic factors as income, educational attainment, housing affordability, rents, and community health.
The narrow range of data inputs they do include is puzzling indeed. One-third of the index is devoted solely to qualitative survey results that basically measure a city's intent to plan for a number of sustainability-related factors. This raises a natural question: does the mere existence of a sustainability, bike, or mixed-use development plan have any real bearing on a city's actual performance in these areas? We all know that planning and implementation are two entirely different things.
There also seem to be some basic omissions. For example, they include solid waste diversion rates in their index, but ignore the total solid waste generation (either in total or per-capita). Similarly, they include a measure of tap water quality, but exclude the total volume of water use.
And the number of farmers' markets and community gardens as an indicator of Food & Agriculture? Personally, I love browsing farmers' markets and think they add a lot of community value, but is it conceivable that any significant portion of a city's food would ever be purchased there? The commute-to-work numbers are similarly insubstantial (probably < 10% for most cities), as is the percentage of alternative-fuel vehicles in the city fleet. Even if every city vehicle were switched to non-polluting, alternative fuels, would it really have any remotely identifiable impact on pollution or energy usage relative to the total number of vehicles on the road?
And commitment to the Kyoto Protocol? Frankly, that's just laughable as an evaluation criterion. The European Union, the great champion of the Kyoto Protocol, enthusiastically committed to its greenhouse gas reductions, but most of its member countries have had to concede that they are nowhere close to meeting their emissions reduction targets. The lesson there is that good intentions don't necessarily translate into desired, real-world outcomes.
As for objectivity, the non-governmental data sources used in the rankings include Smart Growth America, the Natural Resources Defense Council, the Trust for Public Land, and the Environmental Working Group. With all due respect to these organizations, is it not apparent that these might not be the most objective data sources?
What I really get a kick out of is that Houston -- the bottom of the list (imagine that) -- actually ranked higher (#10) in ZONING than the top four cities on their list! Hmmmm...a city with no zoning rates higher than Portland or Berkeley in zoning (which they proxy using scores from Smart Growth America's 2002 Mixed Use Development index).
The obvious implication to me is that they've inadvertently validated the idea that the market can indeed provide mixed-use development without planning and micromanagement from above. That's an argument for market-oriented planning if I've ever heard one. Funny that SustainLane conveniently left that metric out of its slam on Houston and its fawning praise of Portland in the individual city descriptions.
Back to the bigger picture, it seems to me that SustainLane does a disservice to the concept of sustainability, and to the cities evaluated, by cherry-picking a very limited set of indicators that tell us next to nothing about how these cities balance the 3E's in any tangible, real-world sense. This shortchanges the debate and misleads the public as to what "sustainability" really implies, particularly as it relates to the important role of market forces in coordinating human activities.