Despite lack of warming, alarmists predictably predict warming worse than predicted

by Marlo Lewis on February 24, 2009

As you may have heard, there has been no net warming of the planet since 2001, and no subsequent year has been as warm as 1998 (admittedly a year with a major El Niño). A recent study by Keenlyside et al. (2008) concludes that “global surface temperature may not increase over the next decade” due to natural oscillations in the Atlantic and Pacific Oceans.

As Patrick Michaels of the Cato Institute explained at a recent congressional hearing, the suite of 21 climate models used in the IPCC’s mid-range emissions scenario (A1B) is on the verge of failing to reproduce actual climate data.

During the past 5 to 20 years, the observed trend in average global temperature has been so low that it is starting to push the lower bounds of the climate models’ range of temperature predictions for that period. If 2009 is as cool as 2008 (a real possibility, with a La Niña brewing in the Pacific Ocean), then even the least sensitive of these models will be overestimating the actual amount of warming. And if Keenlyside is correct and another decade elapses without significant warming, the models will have clearly failed.
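
To make “pushing the lower bounds” concrete, here is a minimal sketch of the comparison being described: fit a least-squares trend to recent annual temperature anomalies and check it against the bottom of the model envelope. The anomaly values and the model range below are illustrative placeholders of mine, not real observational data or actual output from the 21 IPCC models.

```python
import numpy as np

# Hypothetical annual global temperature anomalies (deg C) for 2001-2008.
# Illustrative placeholders only -- NOT a real observational record.
years = np.arange(2001, 2009)
anomalies = np.array([0.40, 0.46, 0.46, 0.44, 0.47, 0.42, 0.40, 0.33])

# Least-squares linear trend, converted to deg C per decade.
slope_per_year = np.polyfit(years, anomalies, 1)[0]
trend_per_decade = slope_per_year * 10

# Hypothetical envelope of model-projected trends for the same window
# (deg C per decade) -- again, illustrative numbers, not IPCC output.
model_low, model_high = 0.10, 0.40

print(f"Observed trend: {trend_per_decade:+.2f} C/decade")
print(f"Model envelope: {model_low:.2f} to {model_high:.2f} C/decade")
print(f"Observations below the envelope: {trend_per_decade < model_low}")
```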

The most important point for policymakers and citizens, as Michaels notes, is that if the models predict too much warming, then all model-based assessments of global warming impacts on agriculture, human health, extreme weather, and the like will similarly overstate those impacts.

So what do you do if you’re a climate alarmist and the world isn’t warming up as much as you said it would? Why, you redefine “climate sensitivity.” You claim that agriculture, health, weather, etc. are more “sensitive” to increases in global temperature than scientists once believed. You say that less warming than the IPCC warned us about will lead to worse impacts than the IPCC warned us about. That’s the gist of a recent IPCC-sponsored study, as summarized here by AP/MSNBC.com.

Well, I’m skeptical! First, the IPCC study claims that a 21st century temperature increase of only 1.8 to 3.6 degrees Fahrenheit could significantly increase the severity of extreme weather. But Knutson et al. (2008), a leading modeling study of global warming and Atlantic tropical cyclones, projects that the same amount of warming will increase the average intensity of Atlantic hurricanes by only 1.7% while decreasing hurricane frequency by 18%, producing a cumulative 25% decrease in Atlantic hurricane power (see the back-of-envelope sketch below for why the frequency drop dominates). That’s a significant net climate benefit!
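
As a rough illustration of how a small intensity increase can be swamped by a large frequency decrease, here is a back-of-envelope sketch. The cube-law scaling is my assumption (the standard power-dissipation relation for tropical cyclones), not something stated in the post, and this crude two-term product yields roughly a 14% reduction rather than the cumulative 25% figure cited above, which evidently rests on a fuller accounting of the study’s projections than a simple product can capture.

```python
# Back-of-envelope combination of the two Knutson et al. (2008) figures
# cited above, assuming (my assumption, not the post's) that aggregate
# hurricane power scales as storm frequency times the cube of wind speed.
intensity_change = 0.017   # +1.7% average hurricane intensity
frequency_change = -0.18   # -18% hurricane frequency

net_power_ratio = (1 + frequency_change) * (1 + intensity_change) ** 3
print(f"Net change in aggregate hurricane power: {net_power_ratio - 1:+.1%}")
# Prints roughly -13.7%: the frequency decrease dominates the intensity rise.
```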

The IPCC study reportedly warns that even a wee bit of warming will produce deadly heat waves, suggesting (but not bluntly saying) that the 2003 killer heat wave in Europe was due to the atmospheric buildup of greenhouse gases. Yet, as Pat Michaels and Robert Balling document in their new book, Climate of Extremes, the European heat wave of 2003 was an atmospheric circulation anomaly: a bubble of hot air trapped over Europe during a summer that was slightly cooler than average worldwide.

The IPCC study, at least as summarized, also ignores research showing that as hot weather becomes more frequent, heat-related mortality declines. Cities with the hottest summer temperatures, such as Phoenix, AZ, and Tampa, FL, both of which have substantial elderly populations, have almost no heat-related mortality.

AP/MSNBC.com quotes the researchers as saying: “For example, events such as Hurricane Katrina and the 2003 European heat wave have shown that the capacity to adapt to climate-related extreme events is lower than expected and, as a result, their consequences and associated vulnerabilities are higher than previously thought.”

This is complete rubbish. In 2006, Europe experienced a heat wave of comparable magnitude to the 2003 heat wave, yet you probably read nothing about it, because this time far fewer people died. What’s more, fewer people died than heat-related mortality models predicted (see here and here). Contrary to the IPCC bunch, the capacity to adapt to climate-related extreme events is higher than expected.

As for Katrina, invoking it bespeaks a fundamental confusion on the part of the IPCC researchers. They confuse climate-related risk with climate-change risk. Climate-related risk is chiefly determined by where you live and by the existing social infrastructure, not by gradual changes in the atmosphere’s CO2 content.

New Orleans has always sat in a hurricane corridor, and it has always lain below sea level. That, not global warming, is why people in New Orleans are at risk.

For example, a recent study by C.J. Mock finds no trend since 1800 in the frequency of major hurricanes striking Louisiana. The most active hurricane years in the record (1812, 1830, and 1860) long predate the modern era of anthropogenic global warming.

Katrina was the costliest natural disaster in U.S. history not because of any extra oomph it allegedly got from global warming (it was a Category 1 storm by the time it hit New Orleans), but because government officials at all levels failed to provide adequate flood defenses for a city well known to be vulnerable to hurricane-driven storm surges.

