Chip Knappenberger

Hurricane Sandy and Global Warming

Both the blogosphere and the mainstream media have been abuzz with commentary blaming global warming for Hurricane Sandy and the associated deaths and devastation. Bloomberg BusinessWeek epitomizes this brand of journalism. Its magazine cover proclaims the culpability of global warming as an obvious fact:

Part of the thinking here is simply that certain aspects of the storm (lowest barometric pressure for a winter cyclone in the Northeast) and its consequences (worst flooding of the New York City subway system) are “unprecedented,” so what more proof do we need that our fuelish ways have dangerously loaded the climate dice to produce ever more terrible extremes?

After all, argues Climate Progress blogger Brad Johnson, quoting hockey stick inventor Michael Mann, “climate change is present in every single meteorological event.” Here’s Mann’s explanation:

The fact remains that there is 4 percent more water vapor – and associated additional moist energy – available both to power individual storms and to produce intense rainfall from them. Climate change is present in every single meteorological event, in that these events are occurring within a baseline atmospheric environment that has shifted in favor of more intense weather events.

Well sure, climate is average weather over a period of time, so as climate changes, so does the weather. But that tautology tells us nothing about how much — or even how — global warming influences any particular event. Moreover, if “climate change is present in every single meteorological event,” then it is also present in “good” weather (however defined) as well as “bad.”

Anthony Watts makes this criticism on his indispensable blog, noting that as carbon dioxide (CO2) concentrations have risen, the frequency of hurricanes making landfall in the U.S. has declined.

The US Has Had 285 Hurricane Strikes Since 1850: ‘The U.S. has always been vulnerable to hurricanes. 86% of U.S. hurricane strikes occurred with CO2 below [NASA scientist James] Hansen’s safe level of 350 PPM.’

If there’s anything in this data at all, it looks like CO2 is preventing more US landfalling hurricanes.

Data Source: NOAA; Figure Source: Steve Goddard

John Christy on Summer Heat and James Hansen’s PNAS Study

In a recent study published in Proceedings of the National Academy of Sciences (PNAS), NASA scientist James Hansen and two colleagues find that whereas “extremely hot” summer weather “practically did not exist” during 1951-1980, such weather affected between 4% and 13% of the Northern Hemisphere land area during 2006-2011. The researchers infer that human-caused global warming is “loading” the “climate dice” towards extreme heat anomalies. They conclude with a “high degree of confidence” that the 2003 European heat wave, the 2010 Russian heat wave, and the 2011 Texas-Oklahoma drought were a “consequence of global warming” and have (as Hansen put it in a recent op-ed) “virtually no explanation other than climate change.”
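For readers who want the arithmetic behind the “loaded dice” metaphor, here is a minimal sketch (my own, not Hansen et al.’s code or data). The study classes summer anomalies more than three standard deviations above the 1951-1980 mean as “extremely hot”; assuming a normal distribution, the sketch shows how shifting the mean changes the share of area beyond that +3-sigma threshold. The shift values are invented for illustration.

```python
# Minimal sketch of the "loaded climate dice" arithmetic (not Hansen et al.'s
# code or data). Anomalies are assumed normally distributed and expressed in
# units of the 1951-1980 baseline standard deviation; the mean shifts below
# are hypothetical.
from scipy.stats import norm

BASELINE_SIGMA = 1.0  # anomalies measured in baseline standard deviations

for shift in (0.0, 0.5, 1.0):  # hypothetical mean warming, in baseline sigmas
    frac_hot = 1.0 - norm.cdf(3.0, loc=shift, scale=BASELINE_SIGMA)
    print(f"mean shift = {shift:.1f} sigma -> "
          f"{frac_hot * 100:.2f}% of area beyond +3 sigma")

# With no shift, only ~0.13% of a normal distribution lies beyond +3 sigma;
# a 1-sigma shift raises that to ~2.3%, an order-of-magnitude increase.
```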

In a recent post, I reviewed studies finding that the aforementioned anomalies were chiefly due to natural variability. In another post, I summarized an analysis by Patrick Michaels and Chip Knappenberger, who conclude that “the 2012 drought conditions, and every other [U.S.] drought that has come before, is the result of natural processes, not human greenhouse gas emissions.”

But what about the very hot weather afflicting much of the U.S. this summer? Greenhouse gas concentrations keep rising, heat spells are bound to become more frequent and severe as the world warms, and the National Oceanic and Atmospheric Administration (NOAA) reports that July 2012 was the hottest July ever in the U.S. instrumental record. Isn’t this summer what greenhouse warming “looks like”? What else could it be?

University of Alabama in Huntsville (UAH) climatologist John Christy addressed these questions last week in a two-part column. In Part 1, Christy argues that U.S. daily mean temperature (TMean) data, on which NOAA based its report, “do not represent the deep atmosphere where the enhanced greenhouse effect should be detected, so making claims about causes is unwise.” A better measure of the greenhouse effect is daily maximum temperature (TMax), and TMax records set in the 1930s remain unbroken. In Part 2, Christy argues that Hansen’s 10% estimate of the portion of land affected by extreme heat during 2006-2011 shrinks down to 2.9% when anomalies are measured against a longer, more representative climate baseline.
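Christy’s baseline argument can be illustrated with a toy calculation (again my own sketch, not his analysis). The warming and standard-deviation values below are invented, but they show the direction of the effect: computing sigma from a longer, more variable reference period raises the +3-sigma bar and shrinks the land fraction counted as extreme, much as Hansen’s 10% figure shrinks to 2.9%.

```python
# Toy illustration of the baseline-choice effect (invented numbers, not
# Christy's analysis). A longer reference period that captures more natural
# variability yields a larger sigma, so the same warming clears the
# +3-sigma bar over a much smaller fraction of the area.
from scipy.stats import norm

MEAN_SHIFT = 0.8  # hypothetical recent summer warming, deg C

baselines = {
    "short baseline sigma (e.g., 1951-1980)": 0.5,  # deg C, hypothetical
    "longer baseline sigma": 0.7,                   # deg C, hypothetical
}

for label, sigma in baselines.items():
    frac = 1.0 - norm.cdf(3.0 * sigma, loc=MEAN_SHIFT, scale=sigma)
    print(f"{label}: {frac * 100:.1f}% of area beyond +3 sigma")
# The identical warming looks far less "extreme" against the longer baseline.
```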

Hansen on Extreme Weather — Pat and Chip Respond

Last week, I posted a commentary on NASA scientist James Hansen’s study and op-ed, which attribute recent extreme weather to global climate change. In the op-ed, Hansen stated:

The deadly European heat wave of 2003, the fiery Russian heat wave of 2010 and catastrophic droughts in Texas and Oklahoma last year can each be attributed to climate change. And once the data are gathered in a few weeks’ time, it’s likely that the same will be true for the extremely hot summer the United States is suffering through right now.

My commentary concluded: “Hansen’s sweeping assertion that global warming is the principal cause of the European and Russian heat waves, and the Texas-Oklahoma drought, is not supported by event-specific analysis and is implausible in light of previous research.”

Although Hansen does not explicitly attribute the ongoing U.S. drought to global warming, he does blame global warming for both the 2011 Texas-Oklahoma drought and the current summer heat. And in his study, Hansen states: “With the temperature amplified by global warming and ubiquitous surface heating from elevated greenhouse gas amounts, extreme drought conditions can develop.”

This week on World Climate Report, Pat Michaels and Chip Knappenberger argue that the current U.S. drought “is driven by natural variability not global warming.” Their post (“Hansen Is Wrong”) is concise and layman-friendly. Here I offer an even briefer summary.

A standard measure of drought in the U.S. is the Palmer Drought Severity Index (PDSI), which measures the combined effects of temperature (hotter weather = more soil evaporation) and precipitation (more rainfall = more soil moisture). “The more positive the PDSI values, the wetter conditions are, the more negative the PDSI values, the drier things are.” The PDSI for the past 117 years (1895-2011) shows a small non-significant positive trend (i.e. towards wetter conditions). There is no greenhouse warming signal in this data.
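To make “small non-significant positive trend” concrete, the sketch below fits an ordinary least-squares line to an annual PDSI series and reports the slope and p-value. The data are randomly generated placeholders, not the actual 1895-2011 record; substituting the real annual PDSI values would reproduce the kind of test Michaels and Knappenberger describe.

```python
# Sketch of the trend test behind the PDSI claim. The series below is a
# random placeholder; substitute the actual annual PDSI values (1895-2011).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
years = np.arange(1895, 2012)                           # 117 years
pdsi = rng.normal(loc=0.0, scale=2.0, size=years.size)  # stand-in data

slope, intercept, rvalue, pvalue, stderr = stats.linregress(years, pdsi)
print(f"trend: {slope * 10:+.3f} PDSI units per decade, p = {pvalue:.2f}")
# A small positive slope with p well above 0.05 is what a "non-significant
# trend toward wetter conditions" looks like numerically.
```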


The Greenland Ice Melt: Should We Be Alarmed?

If you follow global warming news at all, you’ve probably seen the NASA satellite images (above) many times. The images show the extent of Greenland surface ice melt on July 8 (left) and July 12 (right). In just a few days, the area of the ice sheet with surface melting increased from about 40% to 97%, including Summit Station, Greenland’s highest and coldest spot.

NASA took a drubbing from Patrick Michaels and Chip Knappenberger at World Climate Report (“Illiteracy at NASA”) for describing the ice melt as “unprecedented” in the title of the agency’s press release. The word literally means without precedent, and properly refers to events that are unique and never happened before. In reality, as one of NASA’s experts points out in the press release, over the past 10,000 years, such events have occurred about once every 150 years:

“Ice cores from Summit show that melting events of this type occur about once every 150 years on average. With the last one happening in 1889, this event is right on time,” says Lora Koenig, a Goddard glaciologist and a member of the research team analyzing the satellite data.

Equating ‘rare yet periodic’ with ‘unprecedented’ is incorrect and misleading. “But apparently,” comment Michaels and Knappenberger, “when it comes to hyping anthropogenic global warming (or at least the inference thereto), redefining English words in order to garner more attention is a perfectly acceptable practice.” New York Times blogger Andrew Revkin also chided NASA for an “inaccurate headline” and the associated “hyperventilating coverage,” but for a different reason: NASA provided “fodder for those whose passion or job is largely aimed at spreading doubt about science pointing to consequential greenhouse-driven warming.”

Enough on the spin. Let’s examine the real issues: (1) Did anthropogenic global warming cause the extraordinary increase in surface melting between July 8 and July 12? (2) How worried should we be about Greenland’s potential impact on sea-level rise?

Historical Perspective on the Recent Heat Wave

Over at World Climate Report, the indefatigable Pat Michaels and Chip Knappenberger review a new study updating National Climatic Data Center (NCDC) data on U.S. State climate extremes. I’ll cut right to the chase. The paper, “Evaluating Statewide Climate Extremes for the United States,” published in the Journal of Applied Meteorology and Climatology, finds that far more State-wide all-time-high temperature records were set in the 1930s than in recent decades.

From Pat and Chip’s review:

Despite the 24/7 caterwauling, only two new state records—South Carolina and Georgia—are currently under investigation. And, looking carefully at Shein et al. dataset, there appears to be a remarkable lack of all-time records in recent years. This is particularly striking given the increasing urbanization of the U.S. and the consequent “non climatic” warming that creeps into previously pristine records. . . .

Notice that the vast majority of the all-time records were set more than half a century ago and that there are exceedingly few records set within the past few decades. This is not the picture that you would expect if global warming from greenhouse gas emissions were the dominant forcing of the characteristics of our daily weather. Instead, natural variability is still holding a strong hand.

The chart below shows the number of State heat records and the year in which they were set. (When the same all-time high occurs in two or more years in the same State, each of those years gets a fraction of one point.)

[Chart: number of statewide all-time high temperature records, by year set]
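The tallying rule in the parenthetical above can be sketched in a few lines. The states and years below are hypothetical, not the Shein et al. data; the point is simply that a tie splits one point equally among the years involved.

```python
# Sketch of the fractional-point tally for statewide heat records
# (hypothetical records, not the Shein et al. dataset).
from collections import defaultdict

# state -> years in which its all-time high temperature was reached
record_years = {
    "State A": [1936],
    "State B": [1936, 1954],        # tie: each year gets 0.5 point
    "State C": [1930, 1936, 2011],  # tie: each year gets 1/3 point
}

points_by_year = defaultdict(float)
for years in record_years.values():
    for year in years:
        points_by_year[year] += 1.0 / len(years)

for year in sorted(points_by_year):
    print(year, round(points_by_year[year], 2))
# Summing these fractional points across all states gives the bar heights.
```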

Should We Fear the Methane Time Bomb?

A favorite doomsday scenario of the anti-carbon crusade hypothesizes that global warming, by melting frozen Arctic soils on land and the seafloor, will release billions of tons of carbon locked up for thousands of years in permafrost. Climate havoc ensues: The newly exposed carbon oxidizes and becomes carbon dioxide (CO2), further enhancing the greenhouse effect. Worse, some of the organic carbon decomposes into methane, which, pound for pound, packs 21 times the global warming punch of CO2 over a 100-year time span and more than 100 times the CO2-warming effect over a 20-year period.
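One quick way to see what those global warming potential (GWP) multipliers imply is to convert a methane release into CO2-equivalent terms over the two time horizons. The sketch below uses the figures cited in this post (21 over 100 years, roughly 100 over 20 years); the size of the release is made up.

```python
# CO2-equivalent arithmetic using the GWP multipliers cited in this post.
# The 1 Mt methane pulse is hypothetical.
GWP_100YR = 21   # 100-year horizon (the IPCC AR4 value cited above)
GWP_20YR = 100   # approximate 20-year figure cited above

methane_release_tonnes = 1_000_000  # hypothetical 1 Mt CH4 release

co2e_100yr = methane_release_tonnes * GWP_100YR
co2e_20yr = methane_release_tonnes * GWP_20YR
print(f"100-yr horizon: {co2e_100yr / 1e6:.0f} Mt CO2-equivalent")
print(f" 20-yr horizon: {co2e_20yr / 1e6:.0f} Mt CO2-equivalent")
# The choice of time horizon alone changes methane's apparent potency ~5x.
```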

The fear, in short, is that mankind is fast approaching a “tipping point” whereby outgassing CO2 and methane cause more warming, which melts more permafrost, which releases even more CO2 and methane, which pushes global temperatures up to catastrophic levels.

In a popular YouTube video, scientists flare outgassing methane from a frozen pond in Fairbanks, Alaska. A photo of the pond, with methane bubbling up through holes in the ice, appears in the marquee for this post. Are we approaching the End of Days?

New York Times science blogger Andrew Revkin ain’t buying it (“Methane Time Bomb in Arctic Seas – Apocalypse Not,” 14 Dec. 2011), and neither is his colleague, science reporter Justin Gillis (“Arctic Methane: Is Catastrophe Imminent?” 20 Dec. 2011).


Arctic Ice Loss: Portent of Doom or Reason to Rethink IPCC Climate Sensitivity Assumptions?

“Two ice shelves that existed before Canada was settled by Europeans diminished significantly this summer, one nearly disappearing altogether, Canadian scientists say in new research,” reports an Associated Press (AP) article in the San Francisco Chronicle.

“The impact is significant and yet only a piece of the ongoing and accelerating response to warming of the Arctic,” Dr. Robert Bindschadler, emeritus scientist at NASA’s Goddard Space Flight Center, told the AP.

The Canadian team’s research confirms MIT scientists’ recent finding that the Arctic is shedding ice much faster than forecast by the UN Intergovernmental Panel on Climate Change’s (IPCC) Fourth Assessment Report (AR4), published just four years ago (2007). Which of course is taken to mean that global warming ‘is even worse than scientists previously believed.’

Not so fast, say climatologists Patrick Michaels and Chip Knappenberger, editors of World Climate Report (WCR). Paradoxically, more-rapid-than-projected Arctic ice loss is additional evidence that IPCC climate models are too “hot” — that is, overestimate climate sensitivity and forecast too much warming.

In IPCC climate models, decline of Arctic sea ice is treated as both a consequence of rising greenhouse gas (GHG) concentrations and as an important “positive feedback” that amplifies the direct GHG warming effect. Al Gore popularized this idea in An Inconvenient Truth, noting that as Arctic ice melts, less solar energy is reflected back to space and more absorbed by the oceans.

But, as WCR points out, if the IPCC models’ climate sensitivity estimates were correct, then the greater-than-expected positive feedback from greater-than-expected Arctic ice loss should be producing greater-than-expected global warming. Yet, despite the extra unanticipated warming influence from accelerating ice loss, the world is warming more slowly than IPCC models project.

Far from being a portent of doom, greater-than-projected ice loss, coinciding as it does with smaller-than-projected warming, indicates that actual climate sensitivity is less than model-estimated sensitivity.
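The logic can be restated in simple energy-balance terms: if warming scales as a sensitivity parameter times total forcing, then a larger-than-modeled feedback contribution combined with smaller-than-modeled warming forces the inferred sensitivity downward. The numbers in the sketch below are invented purely for illustration; they are not WCR’s or the IPCC’s estimates.

```python
# Illustrative energy-balance arithmetic (invented numbers, not WCR's or the
# IPCC's). delta_T = lam * (F_ghg + F_ice): if the ice-albedo feedback forcing
# is larger than modeled while observed warming is smaller, the implied
# sensitivity parameter lam must be lower than the modeled one.
F_GHG = 2.0        # hypothetical greenhouse forcing, W/m^2
F_ICE_MODEL = 0.3  # ice-albedo feedback forcing assumed by a model, W/m^2
F_ICE_OBS = 0.5    # larger feedback implied by faster-than-projected ice loss
DT_MODEL = 1.0     # warming the model projects, deg C
DT_OBS = 0.7       # smaller observed warming, deg C

lam_model = DT_MODEL / (F_GHG + F_ICE_MODEL)
lam_implied = DT_OBS / (F_GHG + F_ICE_OBS)
print(f"model sensitivity parameter:   {lam_model:.2f} C per W/m^2")
print(f"implied sensitivity parameter: {lam_implied:.2f} C per W/m^2")
# More feedback forcing plus less warming -> lower inferred sensitivity.
```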

Similarly, argues WCR in a related post, had IPCC models properly accounted for the planet’s recovery from the cooling effect of aerosols blown into the stratosphere by the eruption of Mount Pinatubo, they would be projecting even more warming than they do now. Yet current model projections already exceed the observed warming of the past 10-15 years.

The relevance to the survival of civilization and the habitability of the Earth? WCR explains:

The reason that all of this is important is that climate models which produce too much warming quite possibly are doing so because they are missing important processes which act to counteract the warming pressure exerted by increasing greenhouse gas concentrations—in other words, the climate sensitivity produced by the climate models is quite possibly too high.

If this proves to be the case, it means that there will be less future warming (and consequently less “climate disruption”) as greenhouse gas emissions continue to increase as a result of our use of fossil fuels.

Evidence continues to mount that this is indeed the case.

 

Endangered? “U.S. Death Rate Falls for 10th Straight Year” – CDC

“U.S. Death Rate Falls for 10th Straight Year,” the Centers for Disease Control and Prevention (CDC) announced in a recent press release. The release goes on to note that the “age-adjusted death rate for the U.S. population fell to an all-time low of 741 deaths per 100,000 people in 2009 — 2.3 percent lower than the 2008 rate, according to preliminary 2009 death statistics released today [March 16, 2011] by the CDC’s National Center for Health Statistics.” This news is so good it bears repeating: The U.S. death rate fell for the “10th straight year” and is now at “an all-time low.”

Deutsche Bank Climate Change Advisors (DBCCA) have just published Growth of U.S. Climate Change Litigation: Trends and Consequences.  My thanks to climate scientist Chip Knappenberger for spotlighting the DBCCA report in his column yesterday on MasterResource.Org.

DBCCA offer a bird’s eye view of the U.S. climate litigation landscape, provide data on the numbers and types of climate-related lawsuits, discuss their prospects for success and potential consequences, and emphasize that, absent congressional intervention, courts “will make the final decisions” about climate policy.

DBCCA summarize their findings as follows:

  • The number of climate change filings doubled between 2006 and 2007, then plateaued for three years; filings in 2010, however, are already on a path to triple 2009 levels.
  • The largest increase in litigation has been in the area of challenges to federal action, specifically industry challenges to proposed EPA efforts to regulate greenhouse gas emissions.
  • From 2001 to date, 24% of total climate change-related cases were filed by environmental groups aiming to prevent or restrict the permitting of coal-fired power plants.
  • Approximately 37 states have joined, or have stated their intention to join, either side of the EPA litigation challenge.

Especially useful are two charts on p. 5. The first breaks down, by number and type, the climate cases filed through Oct. 8, 2010.

[Chart: types of climate cases filed through Oct. 8, 2010]

Challenges to federal action (91 cases, 27%) make up the largest category of cases, followed by anti-coal litigation (74 cases, 22%).

The second chart shows the trend in climate-related filings since 1989:

[Chart: climate-related litigation filings by year, 1989-2010]

The striking fact here is the upsurge in lawsuits filed by industry. During 2004-2008, industry filed between 1 and 4 climate-related lawsuits per year. In 2009, industry filed 9 such lawsuits, and in 2010, a whopping 82 lawsuits, about 76% of the year’s total.

DBCCA expect more industry litigation in the future: “EPA is now proceeding to issue technology standards on a sector-by-sector basis, and will continue unless Congress acts or the Court of Appeals issues a stay or annuls the tailoring rule. Every further move by EPA is likely to be challenged in court by industry.”