In today’s New York Times, Lauren Morello of ClimateWire asks, “Is 350 [parts per million] the New 450 [ppm] When It Comes to Capping Carbon Emissions?”
The answer is yes, suggests Morello, a reporter with a keen eye for the shifting fashions of climate chic.
The older viewpoint was that if the world cuts back its CO2 emissions at least 50% by 2050, with industrial countries cutting their emissions by 80% or more, we could stabilize CO2 concentrations at 450 ppm, and that, in turn, would limit global warming to 2 degrees Celsius above pre-industrial levels.
But a 450 ppm stabilization target is increasingly regarded as too weak and unacceptably risky. Twenty scientists, in an open letter to the President and Congress, contend that the Waxman-Markey legislation, with its emission reduction target of 83% by 2050, should be considered “only a first step.”
Then there’s the 350 or Bust campaign led by the Center for Biological Diversity. CBD and its comrades demand that U.S. environmental statutes be “fully implemented” to lower CO2 concentrations to 350 ppm. In June, CBD issued a report advising EPA to establish National Ambient Air Quality Standards (NAAQS) for CO2 set at 350 ppm.
Morello quotes Stanford University scientist Stephen Schneider on why 350 ppm is better than 450 ppm: “We’re betting the planet. There’s no such thing as a safe level [of CO2 concentrations]. There’s a level of very risky, versus mildly risky.”
This is the familiar rhetoric that we’re “gambling with the only planet we have.” As should be obvious by now (alas, it isn’t), Schneider and other cap-and-traders propose to gamble with the only economy we have. They talk as if there are no risks of climate policy, only risks of climate change. I would paraphrase Schneider as follows: There’s economically hazardous (stabilization at 450 ppm by 2050) and there’s economically ruinous (stabilization at 350 ppm).
In “We Can’t Get There From Here” (Mar. 14, 2009), Newsweek columnist Sharon Begley describes what it would take to stabilize CO2 concentrations at 450 ppm by 2050:
[Cal Tech chemist Nate] Lewis’s numbers show the enormous challenge we face. The world used 14 trillion watts (14 terawatts) of power in 2006. Assuming minimal population growth (to 9 billion people), slow economic growth (1.6 percent a year, practically recession level) and—this is key—unprecedented energy efficiency (improvements of 500 percent relative to current U.S. levels, worldwide), it will use 28 terawatts in 2050. (In a business-as-usual scenario, we would need 45 terawatts.) Simple physics shows that in order to keep CO2 to 450 ppm, 26.5 of those terawatts must be zero-carbon. That’s a lot of solar, wind, hydro, biofuels and nuclear, especially since renewables kicked in a measly 0.2 terawatts in 2006 and nuclear provided 0.9 terawatts. Are you a fan of nuclear? To get 10 terawatts, less than half of what we’ll need in 2050, Lewis calculates, we’d have to build 10,000 reactors, or one every other day starting now. Do you like wind? If you use every single breeze that blows on land, you’ll get 10 or 15 terawatts. Since it’s impossible to capture all the wind, a more realistic number is 3 terawatts, or 1 million state-of-the art turbines, and even that requires storing the energy—something we don’t know how to do—for when the wind doesn’t blow. Solar? To get 10 terawatts by 2050, Lewis calculates, we’d need to cover 1 million roofs with panels every day from now until then. “It would take an army,” he says. Obama promised green jobs, but still.
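Lewis’s nuclear figure is easy to check on the back of an envelope. A minimal sketch, assuming an average output of roughly 1 gigawatt per reactor (a typical large plant) and a build window running from 2009 to 2050:

```python
# Back-of-the-envelope check of Nate Lewis's nuclear arithmetic.
# Assumptions (not from Lewis directly): ~1 GW average output per
# reactor, and a 2009-2050 construction window.

target_tw = 10.0        # terawatts of nuclear power wanted by 2050
gw_per_reactor = 1.0    # assumed average output of one reactor, in GW
reactors_needed = target_tw * 1000 / gw_per_reactor  # 1 TW = 1000 GW

years = 2050 - 2009
days = years * 365
days_per_reactor = days / reactors_needed

print(f"Reactors needed: {reactors_needed:,.0f}")
print(f"Build rate: one reactor every {days_per_reactor:.1f} days")
```

Under those assumptions you get 10,000 reactors at a pace of roughly one every day and a half, which matches Lewis’s “one every other day” characterization.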
The sacrifices required of developing countries would be immense, because 90% of the growth in global CO2 emissions is expected to occur in developing countries. Here’s a graph former CEQ Chairman Jim Connaughton prepared for the December 2007 major emitters conference:
Stephen Eule of the U.S. Chamber of Commerce shows that to lower global emissions 50% below today’s levels by 2050 (the minimum reduction required to stabilize CO2 at 450 ppm), developing countries would have to reduce their emissions 62% below the baseline projection even if developed countries magically reduce their emissions to zero. They’d have to cut emissions 71% below baseline if developed countries cut their emissions “only” 84% below current levels (essentially the Waxman-Markey reduction target).
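Eule’s figures can be back-solved to see what they imply about the developing-country baseline. A sketch, using only the numbers stated above (the 50% global target and the 62% cut in the zero-developed-emissions case); the interpretation of the baseline is my own:

```python
# Back-solving the developing-country 2050 baseline implied by Eule's
# figures. Given: the global 2050 budget is 50% of today's emissions,
# and if developed-country emissions were zero, developing countries
# would still need a 62% cut below their baseline projection.

global_target = 0.50         # 2050 budget, as fraction of today's global emissions
dev_cut_if_rich_zero = 0.62  # required developing-country cut below baseline

# allowed = (1 - cut) * baseline  =>  baseline = allowed / (1 - cut)
developing_baseline = global_target / (1 - dev_cut_if_rich_zero)
print(f"Implied developing-country 2050 baseline: "
      f"{developing_baseline:.2f}x today's total global emissions")
```

That works out to a projected developing-country baseline of roughly 1.3 times today’s entire global emissions, which is consistent with the expectation that 90% of emissions growth occurs in the developing world.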
Absent technological miracles (which in their nature can’t be planned or predicted), lowering CO2 to 350 ppm by 2050 would probably require a global depression sustained over several decades.
Along with the push to make 350 the new 450, I detect a shift in climate alarmist rhetoric.
If I’m not mistaken, there is a new and greater emphasis on the so-called precautionary principle. We don’t really know that limiting CO2 concentrations to 450 ppm would keep a safe lid on global warming, so we should err on the side of caution; 350 ppm is a more protective goal, argue NASA’s James Hansen and Gavin Schmidt. Again, this completely ignores the perils of the political interventions and fossil-energy restrictions required to achieve either of those targets.
Another rhetorical shift is a subtle revision in the concept of climate sensitivity. Climate sensitivity used to mean how much global warming you get from a given increase in CO2 concentrations. However, since 2001, although CO2 concentrations have increased at an accelerating rate, global temperatures have been stagnant or even declined slightly. To my knowledge, no scientist in the late 1990s predicted a roughly 10-year period of no warming at the start of the 21st Century. This suggests that the climate is less sensitive (less reactive to CO2 emissions) than the alleged “scientific consensus” has been telling us.
That’s inconvenient if the only way to sell energy rationing to a reluctant populace is to claim, over and over again, that climate change is “even worse than scientists previously predicted.”
So the new rhetoric emphasizes the alleged damages of global warming — melting Arctic sea ice, drought in Australia, species migration. And we’re told that these impacts are occurring faster than climate models have predicted. Dr. Brenda Ekwurzel of the Union of Concerned Scientists argued along those lines at a Ways and Means Committee hearing earlier this year on “Scientific Objectives in Climate Change Legislation.”
Climate sensitivity is thus redefined to mean climate impacts per a given increment of warming rather than temperature change per a given increment of CO2. In short, we’re supposed to believe that less warming than the IPCC predicts leads to worse impacts than the IPCC predicts. Hence the need to make 350 ppm the new 450 ppm.
All of which is obviously question-begging, because if the world isn’t warming, how do we know that, say, drought in Southern California is due to CO2 emissions rather than to ocean cycles or some other factor not related to the greenhouse effect? Indeed, if a change in weather or climatic conditions occurs faster than greenhouse climate models project, that is prima facie evidence that the change is not due to greenhouse gas emissions.
The older view of climate sensitivity – that X amount of CO2 produces Y amount of warming — is the correct one, because it alone allows scientists to frame testable hypotheses. Scientists can measure CO2 concentrations, and they can measure global temperatures, and they can test whether a given increment in CO2 concentrations does or does not yield a hypothetical increase in global temperature.
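The testable form of that hypothesis can be written down explicitly. In the standard logarithmic approximation, equilibrium warming is ΔT = S × log2(C/C0), where S is the sensitivity per doubling of CO2. A sketch, using the conventional 280 ppm pre-industrial concentration and the IPCC’s 3°C-per-doubling “best estimate” (note that equilibrium warming overstates what would actually be observed to date, since the oceans delay the response):

```python
import math

def equilibrium_warming(c_now_ppm, c0_ppm=280.0, sensitivity_per_doubling=3.0):
    """Warming (deg C) in the standard logarithmic forcing approximation."""
    return sensitivity_per_doubling * math.log2(c_now_ppm / c0_ppm)

# Predicted equilibrium warming at ~387 ppm (roughly the 2009 level),
# for the IPCC "best estimate" of 3 C per doubling versus a sensitivity
# six times lower:
print(f"S = 3.0 C/doubling: {equilibrium_warming(387):.2f} C")
print(f"S = 0.5 C/doubling: "
      f"{equilibrium_warming(387, sensitivity_per_doubling=0.5):.2f} C")
```

Given measured concentrations and measured temperatures, the value of S is exactly the kind of quantity observation can constrain.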
As discussed in a previous post, a recent observational study by Richard Lindzen and Yong-Sang Choi of MIT indicates that the actual climate is about six times less sensitive to CO2 emissions than the IPCC’s “best estimate.”