
Revised 10/28/09

At the first Senate Environment and Public Works Committee hearing on S. 1733, the Kerry-Boxer “Clean Energy Jobs and American Power Act,” Department of Energy Secretary Steven Chu explained the economic rationale for adopting a Kyoto-style cap-and-trade program.

His argument, in a nutshell, goes like this:

  1. Reducing emissions globally will require a massive investment in “clean technologies” — an estimated $2.1 trillion in wind turbines and $1.5 trillion in solar photovoltaic panels by 2030. These investments will create many green jobs.
  2. “The only question is — which countries will invent, manufacture, and export these clean technologies and which will become dependent on foreign products.”
  3. The United States is falling behind. “The world’s largest turbine manufacturing company is headquartered in Denmark. 99 percent of the batteries that power America’s hybrid cars are made in Japan. We manufactured more than 40 percent of the world’s solar cells as recently as the mid-1990s; today we produce just 7 percent.”
  4. To seize the opportunity of clean tech and keep from falling farther behind, “we must enact comprehensive climate legislation,” the most important element of which is a “cap on carbon emissions that ratchets down over time. That critical step will drive investment decisions towards clean energy.”

There is so much silliness packed into Chu’s testimony that it’s hard to know where to begin.

Let’s start with Step 1: The world will need $3.6 trillion worth of clean tech by 2030. Suppose the world does decide to reduce emissions. There’s no good reason to suppose that wind turbines and solar panels will ever contribute more than a small fraction of the “solution,” because these technologies are not economically “sustainable” — they consume more wealth than they produce.

A recent report by the Rheinisch-Westfälisches Institut (RWI) finds that Germany’s Renewable Energy Sources Act (EEG) has utterly failed to make wind and solar power either commercially viable or cost-effective as an emission-reduction strategy. Herewith a few highlights.

First, renewable power is a net drain on Germany’s economy:

  • Germany subsidizes solar photovoltaics (PVs) at a rate of 59¢ per kWh. That is “more than eight times higher than the wholesale electricity price at the power exchange and more than four times the feed-in tariff [subsidy] paid for electricity produced by on-shore wind turbines.”
  • “Even on-shore wind, widely regarded as a mature technology, requires feed-in tariffs [subsidies] that exceed the per-kWh cost of conventional electricity by up to 300% to remain competitive.”
  • Germany has the second-largest installed wind capacity in the world, “behind the United States,” and the largest installed PV capacity in the world. However, installed capacity is not the same as production or contribution, and “by 2008 the estimated share of wind power in Germany’s electricity production was 6.3% . . . The amount produced by solar photovoltaics was a negligible 0.6% despite being the most subsidized renewable energy, with a net cost of about 8.4 Bn € (US 12.4 Bn) for 2008.”
  • “The total net cost of subsidizing electricity production by PV modules is estimated to reach 53.3 Bn € (US $73.2 Bn) for those modules installed between 2000 and 2010. . . .wind power subsidies may total 20.5 Bn € (US $28.1 Bn) for wind converters installed between 2000 and 2020.”

Even as a carbon-reduction strategy, wind and solar power are uneconomic:

  • “Given the net cost of 41.82 Cents/kWh for PV modules installed in 2008, and assuming that PV displaces conventional electricity generated from a mixture of gas and hard coal, abatement costs are as high as 716 € (US $1,050) per tonne [of carbon dioxide].”
  • “Using the same assumptions and a net cost for wind of 3.10 Cents/kWh, the abatement cost is approximately 54 € (US $80) [per tonne CO2]. While cheaper than PV, this cost is still nearly double the ceiling of the cost of a per-ton permit under Europe’s cap-and-trade scheme.”
  • Carbon permits are trading at 13.4 € per ton. “Hence, the cost from emission reductions as determined by the market is about 53 times cheaper than employing PV and 4 times cheaper than using wind power.”
  • Germany’s “increased use of renewable energy technologies generally attains no additional emission reductions beyond those achieved by ETS [European Trading System] alone. In fact, since establishment of the ETS in 2005, the EEG’s net climate effect has been equal to zero.”
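As a sanity check, the ratios quoted in these bullets follow directly from the per-tonne figures. A minimal back-of-the-envelope sketch (the euro figures are simply those quoted from the RWI report above, not independently verified):

```python
# Back-of-the-envelope check of the RWI abatement-cost ratios quoted above.
# All figures are in euros per tonne of CO2, as quoted in the text.

pv_abatement_cost = 716.0    # cost of avoiding one tonne of CO2 via solar PV
wind_abatement_cost = 54.0   # cost of avoiding one tonne via onshore wind
ets_permit_price = 13.4      # market price of an EU ETS emission permit

print(f"PV vs. permit:   {pv_abatement_cost / ets_permit_price:.0f}x")
print(f"Wind vs. permit: {wind_abatement_cost / ets_permit_price:.0f}x")
# PV works out to roughly 53 times the permit price and wind roughly 4 times,
# matching the report's "53 times cheaper" and "4 times cheaper" claims.
```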

Although the EEG creates some “green jobs,” the net impact on wealth and jobs is negative:

  • “While employment projections in the renewable sector convey seemingly impressive prospects for gross job growth, they typically obscure the broader implications for economic welfare by omitting any accounting of off-setting impacts. These impacts include, but are not limited to, job losses from crowding out of cheaper forms of conventional energy generation, indirect impacts on upstream industries, additional job losses from the drain on economic activity precipitated by higher electricity prices, and consumers’ overall loss of purchasing power due to higher electricity prices, and diverting funds from other, possibly more beneficial investment.”
  • “Proponents of renewable energies often regard the requirement for more workers to produce a given amount of energy as a benefit, failing to recognize that it lowers the output potential of the economy and is hence counterproductive to net job creation.”

As my colleague Don Hertzmark observes: “If you must continually pour external resources into an energy source, then it cannot be a net source of jobs in the economy, since those resources could have gone somewhere else to create real work.”

So, yes, via mandates and subsidies, governments around the world could pump $2.1 trillion into wind turbines and $1.5 trillion into PVs. But this is an unsustainable market that will make the world poorer, not wealthier, as Chu imagines.

Okay, now for Step 2: We must choose either to make clean tech or become dependent on foreign producers. This point is silly on many levels.

  • If we don’t enact cap-and-trade, then we won’t even have to consider buying or making trillions of dollars worth of “clean tech.”
  • Even if we choose to limit emissions, the German experience indicates that investing billions (let alone trillions) in clean tech is not cost-effective.
  • Even if we do enact a cap-and-trade program, and even if clean tech becomes cost-effective, why would we want to make our own wind turbines and PVs if imported products are cheaper?
  • Chu worries the United States could become “dependent on foreign products” — as if Denmark or Japan might refuse to sell us wind turbines or hybrid cars. Even oil is not the “energy weapon” it is sometimes cracked up to be, as Jerry Taylor and Peter Van Doren of the Cato Institute explain.
  • Besides, Toyota makes lots of cars — including hybrids — in the United States. Similarly, although Vestas, the world’s largest wind turbine manufacturer, is, as Chu says, “headquartered” in Denmark, it is investing $1 billion in four Colorado plants. Chu’s fear of “dependence on foreign products” makes no sense in a globalized economy.

Step 3: The United States is falling behind in clean-tech manufacture. If we’re “falling behind,” then why do Toyota and Vestas build factories here? Besides, “falling behind” is a problem only if the clean-tech industry is a net wealth-creator. As we have seen, this is not the case for wind turbines and PVs, which is why they require market-rigging subsidies, mandates, and penalties (caps or carbon taxes) levied against carbon-based energy.

If clean tech ever does become sustainable, the only legitimate role for policymakers would be to eliminate political impediments to market-driven investment. As MIT’s Thomas Lee, Ben Ball, Jr., and Richard Tabors wrote in the conclusion of Energy Aftermath, a retrospective on Carter-era energy policies:

The experience of the 1970s and 1980s taught us that if a technology is commercially viable, then government support is not needed; and if a technology is not commercially viable, no amount of government support will make it so.

Step 4: To be leaders in clean tech manufacture, we must put a price on carbon — a cap that ratchets down every year.

This is convoluted. Chu began by arguing that we needed to invest in clean tech in order to reduce emissions. Now, he says we must reduce emissions to spur investment in clean tech! Apparently, if you can’t sell cap-and-trade on the basis of climate alarm, claim that it’s “about jobs.”

Another confusion — Chu suggests U.S. firms can’t or won’t develop clean-tech products for sale in the global marketplace unless the federal government boosts domestic market share by putting a price on carbon. Two problems here. First, a price on carbon does relatively little to increase the market share of wind and solar power, because even with a price on carbon to handicap fossil energy, renewable power is still uncompetitive. That’s why the Waxman-Markey bill includes a renewable portfolio standard in addition to a cap-and-trade program.

Second, a booming domestic market for a product is not a prerequisite to success in exporting that product. In the 1980s, the Asian Tigers produced enormous quantities of exports that were not widely purchased, and in some cases not even offered for sale, in domestic markets. If clean-tech products yield high returns in the global marketplace, enterprising U.S. firms will get into the game even if the products do not have a big market in the United States.

The irony is that a cap-and-trade program could actually be counter-productive to the development of an export-oriented clean-tech sector. Low-cost energy is a source of competitive advantage. By increasing energy costs, cap-and-trade would make all U.S.-based manufacture less competitive, including companies specializing in clean-tech products.

In the News

It’s Raining, You’re Snoring
Chris Horner, Washington Times, 16 October 2009

Can a Deal Be Reached at Copenhagen?
Myron Ebell, GlobalWarming.org, 16 October 2009

Big Chill on Global Warming
Washington Examiner, 16 October 2009

Climate Change Dominos Fall
Lawrence Solomon, Financial Post, 16 October 2009

Obama Administration: Seals Can Adapt to Climate Change
Patrick Reis, Green Wire, 16 October 2009

Challenging Al Gore’s “Truth”
Phelim McAleer, Investor’s Business Daily, 15 October 2009

Kerry & Graham Get It Wrong
Marlo Lewis, OpenMarket.org, 15 October 2009

CBO: Cap-and-Trade Kills Jobs
Iain Talley, Wall Street Journal, 15 October 2009

Carbon Offsets Fail in First Trial
Juliet Eilperin, Washington Post, 15 October 2009

The Global Gas Shale Revolution
Donald Hertzmark, MasterResource.org, 14 October 2009

Soros Invests $1 Billion in Green Tech
Stanford Daily News, 12 October 2009

News You Can Use

Antarctic Ice Melt at Lowest Level in Satellite History

This week World Climate Report drew attention to a new study by Marco Tedesco and Andrew Monaghan in the journal Geophysical Research Letters showing that ice melt across Antarctica during the austral summer of 2008-2009 (October-January) was the lowest recorded in the satellite era.

BBC Reporter Can Read a Thermometer

The most popular story on the BBC website this week is about the absence of global warming since 1998. According to the Daily Telegraph, “What Happened to Global Warming,” by BBC climate correspondent Paul Hudson, has an altogether different tone than the BBC’s previous climate reporting, which had been characterized by alarmism and advocacy.

Inside the Beltway

Myron Ebell

Senate Hearings Scheduled for Energy-Rationing Bill

The Chairman of the Senate Environment and Public Works Committee, Barbara Boxer (D-Calif.), announced this week that the committee will hold hearings on the Kerry-Boxer energy-rationing bill beginning on Tuesday, 27th October. That day will be devoted to official witnesses from the Obama Administration. Then on Wednesday and Thursday, the 28th and 29th, the committee will hear from a variety of supporters, as well as a few Republican-requested witnesses opposed to the bill. Senator John Kerry (D-Mass.) has officially introduced the bill as S. 1733. However, there is already a “chairman’s mark” that is not available for public inspection. The chairman’s mark is no doubt being re-drafted as deals are made to win votes. It is that version, rather than S. 1733, that will be marked up in committee in November.

Graham Joins Kerry in Bi-partisan Hooey

The other big news on the Kerry-Boxer bill this week was an incoherent op-ed published in Sunday’s New York Times by Senators John Kerry (D-Mass.) and Lindsey Graham (R-SC) titled, “Yes We Can (Pass Climate Legislation).” They announce that they have come together in the spirit of bi-partisanship to support an energy-rationing bill — a bill that has yet to be written and that bears only a family resemblance to the Kerry-Boxer bill. Critical commentary on their op-ed can be found here, here, and here. The op-ed was enthusiastically received by the mainstream media as evidence that the Senate logjam has broken and a bi-partisan coalition can now be created to reach the sixty votes necessary to pass energy-rationing legislation.

You Can Ask Gore, But He Doesn’t Have To Answer

Phelim McAleer, the producer of Not Evil Just Wrong, the documentary film premiering on Sunday, 18th October, mixed it up with former Vice President Al Gore at the Society of Environmental Journalists’ annual meeting in Madison (where it snowed) last Friday.  After Gore’s speech, McAleer had a chance to ask him about the British High Court’s verdict that there were nine substantial scientific errors in “An Inconvenient Truth.”  Why, he asked, hadn’t Gore done anything to correct those errors but instead continued to repeat them?  Gore changed the subject, and when McAleer persisted, the SEJ cut off his microphone.  McAleer’s op-ed in Investor’s Business Daily explains what happened and draws some conclusions about environmental reporting.  I hope lots of people have a chance to watch Not Evil Just Wrong.  The DVD can be purchased here.

Socialist International Unveils Climate Strategy Eerily Similar to Obama’s…

…Not coincidentally, Carol Browner, Obama’s “climate czar,” is a card-carrying member of the Socialist International. To read more about SI’s climate plan, as well as Carol Browner’s history with the group, click here.

Kerry-Boxer Puts EPA in Charge of Building Codes

Julie Walsh

The House-passed Waxman-Markey energy-rationing bill, H.R. 2454, sets specific federal housing standards that would increase the cost of a home by $4,000 to $10,000 and price more than 1,000,000 people out of the market, according to Bill Killmer, a vice president of the National Association of Home Builders. Beginning in 2014 for new residential buildings and in 2015 for new commercial buildings, a 50 percent increase in energy efficiency (relative to the baseline code) is required, with further increases each year thereafter. Waxman-Markey also adopts California’s portable lighting fixture standard as the national standard. And it mandates efficiency improvements for many new appliances, including spas, water dispensers, and dishwashers.

But the Senate’s Kerry-Boxer energy-rationing bill, S. 1733, goes much further; it gives an unelected federal official a regulatory blank check:

“The (EPA) Administrator, or such other agency head or heads as may be designated by the President, in consultation with the Director of the National Institute of Standards and Technology, shall promulgate regulations establishing building code energy efficiency targets for the national average percentage improvement of buildings energy performance.” And, “The Administrator, or such other agency head or heads as may be designated by the President, shall promulgate regulations establishing national energy efficiency building codes for residential and commercial buildings.” Pp. 173-174

Federal building codes would be in the hands of the EPA.

Across the States

California

California Governor Arnold Schwarzenegger this week signed into law S.B. 32, which establishes a feed-in tariff that forces utilities to pay for surplus electricity generated by solar roof-top panels. Previously, California ratepayers subsidized the purchase of solar panels; now, they must pay above-market prices for power generated by those panels. The upshot is that most ratepayers will pay more for electricity in order to subsidize the green lifestyle of Californians wealthy enough to afford solar panels.

Mr. Krugman, in Sunday’s New York Times, is worried.

In his article “Cassandras of Science” he says, “What’s driving this new pessimism? Partly it’s the fact that some predicted changes, like a decline in Arctic Sea ice, are happening much faster than expected. Partly it’s growing evidence that feedback loops amplifying the effects of man-made greenhouse gas emissions are stronger than previously realized. For example, it has long been understood that global warming will cause the tundra to thaw, releasing carbon dioxide, which will cause even more warming, but new research shows far more carbon locked in the permafrost than previously thought, which means a much bigger feedback effect.”

He’s worried about the Arctic ice. Here’s the latest, though. Data from the National Snow and Ice Data Center show that Arctic sea ice has been rebounding for the past two years. (It hasn’t fully recovered yet.) The minimum sea ice extent in September 2007 was 4.3 million square kilometers. In 2008, it was 4.7 million square kilometers. And in 2009, it was 5.1 million square kilometers. If the Arctic ice continues to rebound at this rate of 0.4 million square kilometers per year, in two years it will be back to the 2006 level of 5.9 million square kilometers. And if it continues at this rate for three years, it will pass the 1995 Arctic sea ice minimum of 6.1 million square kilometers.
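That projection is nothing more than a straight-line extrapolation, and it is easy to reproduce. A minimal sketch, assuming the NSIDC September minima as cited in the text and a constant 0.4 million sq km per year rebound (the author’s assumption, not a forecast):

```python
# Reproduce the simple linear extrapolation of Arctic sea-ice minima above.
# September minimum extents (million sq km) as cited from NSIDC.
minima = {2007: 4.3, 2008: 4.7, 2009: 5.1}

RATE = 0.4  # million sq km per year: the average gain over 2007-2009

def extrapolate(year):
    """Naive linear projection from the 2009 minimum at the observed rate."""
    return minima[2009] + RATE * (year - 2009)

print(f"2011: {extrapolate(2011):.1f}")  # 5.9 -> back to the cited 2006 level
print(f"2012: {extrapolate(2012):.1f}")  # 6.3 -> above the 1995 minimum of 6.1
```

Of course, a two-point trend says nothing about whether the rebound will continue; the sketch only shows where the numbers in the paragraph come from.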

Krugman is also worried about the warming tundra releasing carbon dioxide and methane. But CO2Science.org says, “Another scare story came from a scientist who said the last IPCC report underestimated the vast amount of carbon contained in the world’s permafrost, which could be released to the air by rising temperatures. However, a detailed study of this phenomenon (Delisle, 2007) indicates that ‘permafrost will mostly prevail in this century in areas north of 70°N,’ even for an unbelievable warming of 8°C, and that ‘permafrost will survive at depth in most areas between 60° to 70°N.’ This scenario is also supported by the small amount of organic carbon released from permafrost during previous periods of warming, such as the Medieval Warm Period and Holocene Climatic Optimum, when no significant methane excursions were detected in ice core records of either Antarctica or Greenland.” If the Medieval Warm Period, which was warmer than today, didn’t have increased methane, then we won’t see it either.

If Mr. Krugman is concerned about the seabed methane deposits called clathrates, he would be comforted by this six-year study by Petrenko at the University of Colorado. Petrenko says, “The results definitely help us to say that it doesn’t seem methane clathrates respond to warming by releasing lots of methane into the atmosphere, which is really good news for global warming.” Petrenko also said that temperatures in Greenland 12,000 years ago increased about 10 degrees Celsius in 20 years, yet it took 150 years for atmospheric methane levels to increase by 50 percent. Therefore, the methane did not contribute to that temperature increase.

Arctic hockey stick graphs that claim the Arctic is warmer now than at any time in the past two thousand years, such as this one, rely upon “previously published data from glacial ice and tree rings that were calibrated against the instrumental temperature record.” That tree ring data is now known to have been incorrect. When those graphs are corrected, they will likely show that around 1000 AD the Arctic was warmer but that runaway global warming obviously did not occur.

I can understand that Krugman hasn’t followed the science, but to make comments like this one, Krugman just looks very deceived: “And the industries of the past have armies of lobbyists in place right now; the industries of the future don’t.” The money behind “green” is actually enormous.

Yesterday, in State of Connecticut et al. v. American Electric Power et al., the 2nd U.S. Circuit Court of Appeals decided that states and other plaintiffs have the right to sue five electric utilities — American Electric Power, Cinergy, Southern Co., Xcel Energy, and the Tennessee Valley Authority — for creating a “public nuisance” by emitting CO2 and, thus, contributing to global warming.

With regard to American Electric Power (AEP) and Cinergy, I am tempted to say, it couldn’t happen to a nicer bunch of guys. These utilities for years have lobbied for carbon cap-and-trade schemes. Instead of opposing climate alarmism, they have helped promote it. Boys, you reap what you sow!  How are you going to deny plaintiffs’ allegations that your CO2 emissions are a public nuisance, when you have repeatedly stated on the record that man-made global warming is a big, big problem?

In the 139-page decision, Judges Joseph McLaughlin and Peter Hall (appointed by Presidents George H.W. Bush and George W. Bush, respectively) rejected the lower court’s opinion (and the utilities’ argument) that the relief sought by plaintiffs — a gradually decreasing cap on the utilities’ CO2 emissions — raised “non-justiciable political questions.”

In a sane universe, the Appellate Court would have upheld the lower court’s decision. Energy policy is manifestly a political question — perhaps the most politicized issue to come down the pike in decades. If courts and litigators can dictate energy policy (actually, anti-energy policy) to the nation, then constitutional self-government is at an end.

The Court held that granting plaintiffs’ proposed remedy would not “decide overarching policy questions such as whether other industries or emission sources not before the court must also reduce emissions or determine how across-the-board emissions reductions would affect the economy and national security.” Rather, the Court said, granting the remedy sought would only require the lower court to “resolve the particular nuisance issue before it” involving just the five utilities in the case (p. 30).

Who do Judges McLaughlin and Hall think they’re fooling? If plaintiffs sue the utilities and win, the precedent they establish would have enormous policy consequences. That’s the whole point. Setting the precedent for additional “public nuisance” litigation to restructure energy markets and the economy is what the case is all about.

Nobody seriously believes that capping the five utilities’ emissions would in itself provide any measurable relief from climate change, or from any damages allegedly resulting from climate change. The litigation is either political grandstanding and ambulance chasing, or it is designed to set the stage for a broader, policy-changing litigation campaign.

Once a court actually determines that CO2 emissions are a public nuisance, the same plaintiffs — or others — could argue that nothing less than eliminating AEP and Cinergy’s emissions is adequate to avoid dangerous “tipping points” and reduce “injuries” to the public (p. 8). Logically, if lower emissions is better, zero emissions is best.

Surely there is no shortage of eco-litigation groups willing to press the legal logic as far as it will go. The Center for Biological Diversity, for example, leads a coalition calling itself 350 or Bust. The idea is to use all available legal means to bring atmospheric CO2 concentrations down to 350 parts per million (today’s level is about 387 ppm). Accomplishing that goal would likely require a global depression over many decades. Pardon me if I view the alliance of climate alarmism and judicial activism as one of the biggest public nuisances we face.

It’s easy to suppose that public nuisance litigation will target only major emitters such as coal-burning utilities. But remember, utilities emit CO2 only in the process of serving customers who consume electricity. People powering their factories, lighting their homes, and running their laptops are ultimately to blame for destroying the planet, according to the “science” invoked by plaintiffs. In their worldview, everybody is injuring everybody else. So, shouldn’t everybody have the right to sue everybody else?

I am reminded of the South Park episode “Two Days Before the Day After Tomorrow,” a parody of the preachy, global warming, sci-fi disaster film The Day After Tomorrow.

Stan and Cartman crash a speed boat into the world’s largest beaver dam, flooding the people of Beaverton out of their homes. Later that night, Stan, feeling guilty, asks his parents what’s being done to rescue the flood victims. Stan’s father says that’s not as important as finding out who deserves the blame. Some in South Park accuse George Bush; others accuse Al Qaeda. Stan’s father and other Colorado scientists announce they have found the real culprit: global warming.

Then comes the really bad news: Global warming will strike not the day after tomorrow, as scientists had previously thought, but two days before the day after tomorrow – today! There is panic in the streets, not just in South Park but all around the country. Fearing that global warming will shift the climate into an ice age, Stan’s father dons Arctic weather gear and nearly perishes in the summer heat. 

The Army rescues the Beaverton residents still stranded on their rooftops and ends the global warming panic — but only by blaming the flood on yet another bogeyman: six-legged, pincered “Crab People.” Unable to live with the guilt any longer, Stan confesses to the people of South Park: “I broke the dam.” One of the adults translates: “Don’t you see what this child is saying … we all broke the dam.” Another adult steps forward and says, “I broke the dam.” Then another and another.

We all emit CO2. We all consume electricity. Even if our utility generates juice from nukes or hydro, we drive CO2-emitting cars and consume goods and services made either directly or indirectly with CO2-emitting fossil energy. According to the “science” underpinning plaintiffs’ lawsuit, we’re all responsible for every damage and harm that anyone can plausibly (or implausibly) blame on global warming — every flood, every eroded beach, every summer dry spell, every tornado and hurricane. We have met the public nuisance, and it is us!

South Park explains the two-fold appeal of global warming hysteria. First, warmism feeds and legitimizes the desire to punish and blame. It justifies and focuses political indignation. It incites political and legal attacks on coal-fired power plants and oil companies — key sources of our prosperity.

Second, warmism gratifies the need to feel connected to something really big and important, usually on the cheap. It feeds feelings of collective guilt (“we all broke the dam”) while offering a number of easy expiation rituals (“I recycle,” “I voted for Obama,” “I support cap-and-trade”).

In light of this, ahem, analysis, we should expect future common law CO2 litigation cases to (a) demand bigger penalties for major emitters and bigger cuts in their CO2 emissions than plaintiffs in State of Connecticut currently demand, and (b) target smaller and smaller entities as public nuisances.

Saluting Norman Borlaug’s scientific, agricultural and humanitarian legacy

“Since when did you become a global warming alarmist?” I kidded Norman midway into our telephone conversation a few weeks before this amazing scientist and humanitarian died.

“What are you talking about?” Dr. Borlaug retorted. “I’ve never believed that nonsense.”

I read a couple sentences from his July 29 Wall Street Journal article. “Within the next four decades, the world’s farmers will have to double production … on a shrinking land base and in the face of environmental demands caused by climate change. Indeed, [a recent Oxfam study concludes] that the multiple effects of climate change might reverse 50 years of work to end poverty.”

I mentioned that my own discussions of those issues typically emphasize how agricultural biotechnology, modern farming practices and other technological advances will make it easier to adapt to any climate changes, warmer or colder, whether caused by humans or by the same natural forces that brought countless climate shifts throughout Earth’s history.

“You’re right,” he said. “I should have been more careful. Next time, I’ll do that. And I’ll point out that the real disaster won’t be global warming. It’ll be global cooling, which would shorten growing seasons, and make entire regions less suitable for farming.”

I was amazed, as I was every time we talked. Here he was, 95 years old, “retired,” still writing articles for the Journal, and planning what he’d say in his next column.

The article we were discussing, “Farmers can feed the world,” noted Norman’s deep satisfaction that G-8 countries have pledged $20 billion to help poor farmers acquire better seeds and fertilizer. “For those of us who have spent our lives working in agriculture,” he said, “focusing on growing food versus giving it away is a giant step forward.”

Our previous conversations confirm that he would likewise have applauded the World Bank’s recent decision to subsidize new coal-fired power plants, to generate jobs and reduce poverty, by helping poor countries bring electricity to 1.5 billion people who still don’t have it. For many poor countries, a chief economist for the Bank observed, coal is the only option, and “it would be immoral at this stage to say, ‘We want to have clean hands. Therefore we are not going to touch coal.'” Norman would have agreed.

“Governments,” he argued,  “must make their decisions about access to new technologies … on the basis of science, and not to further political agendas.” That’s why he supported DDT to reduce malaria, biotechnology to fight hunger, and plentiful, reliable, affordable electricity to modernize China, India and other developing nations.

His humanitarian instincts and commitment to science and poverty eradication also drove his skepticism about catastrophic climate change.

He was well aware that recent temperature data and observations of solar activity and sunspots indicate that the Earth could be entering a period of global cooling. He had a healthy distrust of climate models as a basis for energy and economic policy. And he knew most of Antarctica is gaining ice, and it would be simply impossible for Greenland or the South Pole region to melt under even the more extreme temperature projections from those questionable computer models.

He also commented that humans had adapted to climate changes in the past, and would continue to do so. They would also learn from those experiences, developing new technologies and practices that would serve humanity well into the future.

The Ice Ages doubtless encouraged people to unlock the secrets of fire and sew warm clothing. The Little Ice Age spawned changes in societal structure, housing design, heating systems and agriculture. The Dust Bowl gave rise to contour farming, crop rotation, terracing and other improved farming practices.

Norman’s dedication to science, keen powers of observation, dogged perseverance, and willingness to live for years with his family in Mexico, India and Pakistan resulted in the first Green Revolution. It vastly improved farming in many nations, saved countless lives, and converted Mexico and India from starving grain importers to self-sufficient exporters.

In his later years, he became a champion of biotechnology, as the foundation of a second Green Revolution, especially for small-holder farmers in remote parts of Africa. Paul Ehrlich and other environmentalists derided his ultimately successful attempt to defuse “The Population Bomb” through his initial agricultural advances, and attacked him for his commitment to biotechnology.

His response to the latter assaults was typically blunt. “There are 6.6 billion people on the planet today. With organic farming, we could only feed 4 billion of them. Which 2 billion would volunteer to die?”

The Atlantic Monthly estimated that Norman’s work saved a billion lives. Leon Hesser titled his biography of Borlaug The Man Who Fed the World. Competitive Enterprise Institute senior fellow Greg Conko dubbed him a “modern Prometheus.” Science writer Gregg Easterbrook saluted him as the “forgotten benefactor of mankind.” And the magician-comedy-political team of Penn and Teller said he was “the greatest human being who ever lived.”

He deserved every award and accolade – and merited far more fame in the United States than he received, though he was well known in India, Mexico and Pakistan, where his work had made such a difference.

Norman was also a devoted family man and educator. He served as Distinguished Professor of International Agriculture at Texas A&M University into his nineties. A year and a half ago, he gladly spent 40 minutes on the telephone with my daughter, who interviewed him for a high school freshman English “true hero” paper – and did so just after returning from the hospital and on the one-year anniversary of his beloved wife Margaret’s death.

He told my daughter it was because of Margaret, “and her faith in me and what I was doing, that we were able to live in Mexico, under conditions that weren’t nearly as good as what we could have had in the United States, and I was able to do my work on wheat and other crops.”

I sent him occasional articles, and we talked every few months, about biotech, global warming, malaria eradication, some new scientific report one of us had seen, or some website he thought I should visit. As we wrapped up our early August chat, we promised to talk again soon. Sadly, he entered a hospice and passed away before that could happen.

His mind was “still as clear as ever,” his daughter Jeanie told me, but his body was giving out. To the very end, Norman was concerned about Africa and dedicated to the humanitarian and scientific principles that had guided his life and research, and earned him the 1970 Nobel Peace Prize.

Norman left us a remarkable legacy. But as he told my daughter, “There is no final answer. We have to keep doing research, if we are to keep growing more nutritious food for more people.”

The world, its climate and insect pathogens will continue to change. It is vital that we sustain the incredible agricultural revolution that Norman Borlaug began.

Paul Driessen is senior policy advisor for the Committee For A Constructive Tomorrow and Congress of Racial Equality, and author of Eco-Imperialism: Green power ∙ Black death.

9/18/2009

Last week Senator Blanche Lincoln (D-AR) became chairman of the Agriculture Committee, after Senator Tom Harkin (D-IA), the previous chair, accepted the gavel at the Health, Education, Labor and Pensions Committee (vacated by the passing of Ted Kennedy).

Lincoln becomes the first woman to chair this powerful committee, and her ascension to the top spot will have a big impact on the country’s energy policy.

For almost a decade, the Senate Ag Committee has been the primary benefactor of ethanol, a fuel made from corn. Regardless of whether the Ag chair was a Republican or a Democrat, the Committee, which is dominated by corn-belt politicians, showered ethanol with subsidies and giveaways, and even a Soviet-style production quota that forces consumers to use it. Government support for ethanol has been great for corn growers (they’ve seen demand increase by almost 50% since 2005), but it’s awful for livestock farmers, who have seen the cost of corn feed skyrocket. Consumers have also been harmed: the price of corn derivatives (meat, dairy, soda, etc.) rose so sharply that food-price inflation ran at double the historical rate in 2008.

With Lincoln taking the gavel of the Ag Committee, however, the ethanol gravy train might be coming to an end. That’s because Lincoln doesn’t represent the corn-belt. To be sure, they grow corn in Arkansas, primarily in the eastern part of the state. But in western Arkansas, farmers raise chickens. In fact, the Natural State is the nation’s #2 producer of broiler chickens. America’s ethanol policy has seriously compromised the chicken industry, so we can expect Lincoln to take a more conservative approach with fuels made out of food.

Lincoln is also likely to affect the climate debate. The Ag Committee has some jurisdiction over climate change legislation, and Lincoln’s vote on cap-and-trade is a priority for her caucus leadership, which is having a tough time finding support for a climate bill among Senate Democrats. But Arkansas politics are decidedly unfavorable to global warming alarmism. Rep. Vic Snyder (D-AR), who represents Little Rock and much of Pulaski County, was the only member of his state’s delegation to vote for the American Clean Energy and Security Act, the cap-and-trade legislation that passed the House of Representatives in late June, and he has been hammered over the airwaves by utilities, agriculture interests, and political opponents ever since. Now there is considerable speculation that his seat is in jeopardy, all thanks to his vote for cap-and-trade. No doubt Lincoln has noticed Snyder’s plight.

Announcements

This week the American Energy Alliance launched a four-week American energy bus tour to build public awareness of what cap-and-trade is, how it works, and the serious damage it could inflict on the American economy. Click here to learn more.

Freedom Action is a new political advocacy organization that aims to build a grassroots movement of free-market activists who will make their voices heard above special interests and big-government advocates. Freedom Action’s first project, Stop Al Gore’s Electricity Tax, can be found here.

Americans For Prosperity is hosting grassroots demonstrations against cap-and-trade energy rationing in cities across the country. Learn more about the Hot Air Tour by clicking here.

The Center for Data Analysis (CDA) at the Heritage Foundation last week published state-by-state analysis of what the American Clean Energy and Security Act would cost consumers. Click here to find out how much cap-and-trade would raise energy prices in your state.

In the News

Carbongate
Investor’s Business Daily, 28 August 2009

Greens Threaten Native American Prosperity
William Yeatman & Jeremy Lott, Washington Examiner, 28 August 2009

Why the Electric Industry Supports Energy Rationing
Robert Peltier, MasterResource.org, 27 August 2009

Biofuels Are Going Bust
Ann Davis & Russell Gold, Wall Street Journal, 27 August 2009

GE’s Climate Scam
Timothy Carney, Washington Examiner, 26 August 2009

Carbon Baron Gore
Lawrence Solomon, Financial Post, 26 August 2009

EPA Looking To Shut Down Whistleblower’s Office
Gary Howard, GlobalWarming.org, 26 August 2009

Spiking the Road to Copenhagen
Deepak Lal, Business Standard, 25 August 2009

Counting the Costs
Paul Chesser, American Spectator, 25 August 2009

The Cap-and-Trade Bait and Switch
David Schoenbrod & Richard Stewart, Wall Street Journal, 24 August 2009

12 Facts about Global Warming That You Won’t See in the Mainstream Media
Joseph D’Aleo, Energy Tribune, 18 August 2009

Energy Workers Rally against Climate Plan
Tom Fowler, Houston Chronicle, 18 August 2009

5 Things Congress and the President Are Doing to Keep Gas Prices High
Ben Lieberman, Heritage Webmemo, 13 August 2009

News You Can Use

UN Exaggerates Global Warming 6 Fold

The UN has exaggerated global warming 6-fold, according to a recent peer-reviewed paper by Professor Richard Lindzen of the Massachusetts Institute of Technology.

The Science and Public Policy Institute has reprinted this important new study, which is available here.

Inside the Beltway

Climate Science on Trial

The U.S. Chamber of Commerce this week threatened the Environmental Protection Agency with a lawsuit unless the EPA publicly defends the science it used to conclude last April that carbon dioxide “endangers” health and human welfare.

An “endangerment” ruling might sound like harmless bureaucratese, but it’s actually a clear and present danger to all Americans, because it would trigger provisions of the Clean Air Act that would send the U.S. economy back to the Stone Age (to learn more about this regulatory nightmare, click here).

Despite the far-reaching economic consequences of an “endangerment” ruling, there was virtually no transparency in the EPA’s decision-making process. Earlier this summer, the Competitive Enterprise Institute uncovered evidence that the EPA actually suppressed a dissenting voice from a career official for political reasons. In light of these troubling procedures and tactics, the EPA should grant the Chamber’s request, and put alarmist climate science on trial.

A Crestfallen Greenpeace Activist

Julie Walsh

I recently spoke to a pro-climate-bill kid on the street. He had all of Greenpeace’s talking points down: climate refugees, wind and solar’s future, the European heat wave, and so on, all of which I easily refuted. When he came to Exxon’s past support of climate realists, he looked truly heartbroken when I told him that Exxon now supports the Nature Conservancy and the Conservation Fund. By the alarmists’ logic, if we were shills then, they’re the shills now.

And I went on to explain that, according to the draft, the current energy-rationing bill was “modeled closely” on the recommendations of big corporations: GE, Shell, Duke Energy, and the like. I think I may have ruined his day.

This is why Poland’s new proposal cuts to the true motives of the Big Money behind this scheme: Poland may ban utilities from selling the European Union carbon emission permits that many of them will receive for free starting in 2013. No more windfall profits.

Today’s excerpt from CEI’s film, Policy Peril: Why Global Warming Policies Are More Dangerous Than Global Warming Itself, is on two global warming policies Congress has adopted: fuel economy standards and biofuel mandates.

To watch today’s film excerpt, click here. To watch the entire film, click here.

The text of today’s film clip immediately follows. It includes footnotes to additional commentary and supporting information.

Narrator: If stopping new coal is the global warming movement’s top priority, a close second is jump-starting a ‘beyond petroleum’ transport system. They propose to do this by tightening new-car fuel economy standards. Why?

A car that gets more miles to the gallon emits less CO2 per mile [1]. But the federal fuel economy program, also known as CAFE, has serious downsides.

Sam Kazman (General Counsel, Competitive Enterprise Institute): Now there are lots of problems with fuel economy mandates. One thing, they raise new car prices. [2] Secondly, they restrict consumer choice. [3] But the worst thing is an effect you never hear their advocates talking about. Namely, fuel economy mandates kill people. [4]

Narrator: Here’s why. Heavier cars provide more mass to absorb collision forces, and bigger cars provide more space between the occupant and the point of impact. [5] Make a car smaller and lighter, and it will go farther on a gallon of gas.

Kazman: But you also make it less safe. According to the National Academy of Sciences, the current CAFE standard, by downsizing cars, contributes to about 2,000 fatalities per year. [6]

Narrator: Legislation Congress passed in December 2007 requires a 40% increase in fuel economy by 2020. [7] In model year 2008, only two of 1,153 vehicle models met the new standard. [8] So expect more downsizing in the years ahead.

Another ‘beyond petroleum’ policy is to require the sale of alternative fuels. In December 2007 Congress also mandated that motor fuel producers sell 36 billion gallons of ethanol a year by 2022, with 15 billion gallons coming from corn kernels. [9] The result: we’re diverting massive quantities of grain from food to auto fuel. This contributes to the surge in global grain prices that is pushing millions of the world’s poorest people to the brink of starvation. [10]

But at least ethanol cuts down on CO2 emissions, right? Actually, no.

Dr. Dennis Avery (Hudson Institute): As we expand the cropland, then we get into the real trouble, because we release the greenhouse gas that’s stored in the soil as carbon. And with corn, we release twice as much gas as we would have released if we burned gasoline in the first place. [11]

[1] A gallon of gasoline (which weighs about 6.3 lbs.) produces 20 lbs. of CO2 when burned. If a car gets more miles to the gallon, it will emit fewer lbs. of CO2 per mile driven. The relationship between fuel economy (mpg) and lbs. CO2/mile is so direct that EPA bases its fuel economy ratings of vehicle models on tests that measure the carbon content of the emissions, principally CO2.

Unsurprisingly, virtually all CO2-reduction options for new motor vehicles are fuel-economy-increasing options. See p. 10 of the National Automobile Dealer Association’s comment on EPA’s reconsideration of California’s request for a waiver to establish greenhouse gas emission standards for new motor vehicles. 
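The arithmetic behind footnote [1] can be sketched in a few lines. The 20 lbs.-per-gallon figure is the one cited above; the function name is invented for illustration:

```python
# CO2 emitted per mile follows directly from fuel economy:
# burning one gallon of gasoline yields roughly 20 lbs. of CO2,
# so lbs. CO2 per mile is simply 20 divided by miles per gallon.

GALLON_CO2_LBS = 20.0  # approximate figure cited in the footnote

def co2_lbs_per_mile(mpg: float) -> float:
    """Pounds of CO2 emitted per mile at a given fuel economy (mpg)."""
    return GALLON_CO2_LBS / mpg

for mpg in (20, 27.5, 35):
    print(f"{mpg} mpg -> {co2_lbs_per_mile(mpg):.2f} lbs CO2/mile")
```

At 20 mpg this gives 1.00 lb of CO2 per mile, and doubling fuel economy halves emissions per mile, which is why a CO2-per-mile standard and a fuel economy standard are effectively the same regulation.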

[2] There are basically two ways to increase fuel economy–downsizing (making cars smaller and lighter) and new technology. Typically, advanced technology costs more than conventional technology. The Energy Information Administration, for example, estimates that California’s greenhouse gas/fuel economy standards, which President Obama recently adopted, will increase the average price of a new car by $1,860 in 2016. [Obama’s program will also impose heavy burdens on the nearly prostrate U.S. auto industry, as economist Keith Hennessey explains.]

[3] The CAFE program all but killed the market for large station wagons, because automakers could not produce millions of these once popular “family cars” and meet the CAFE standard for their vehicle fleets.

In addition, as a general matter, because fuel economy mandates increase vehicle cost, they inevitably price some consumers out of the market for certain vehicle models, restricting their choices.

Ironically, the federal fuel economy program boosts the production and sale of gas-guzzling SUVs. Consumers who might otherwise have purchased big station wagons instead bought large SUVs. Congress regulated SUV fuel economy less stringently because (1) SUVs are built on a light-truck chassis and thus are classified as trucks rather than as passenger cars, and (2) most SUVs traditionally were used for farming and business rather than commuting. Fuel economy standards helped create the boom market for low-mpg SUVs–a classic case of the law of unintended consequences.

[4] Sam debates the issue of whether CAFE kills with an analyst from Natural Resources Defense Council (NRDC) here.

[5] I am always amazed when people with scientific credentials deny the safety implications of regulatory-induced vehicle downsizing. How can they claim that size and weight don’t matter? That’s denying the laws of physics. There’s a reason why boxing matches don’t pit lightweights against heavyweights, or why marathon runners don’t play professional football.

Yes, new technology can improve the crashworthiness of small cars. But, as Sam explains elsewhere, a large car with new technology will still be safer than a small car with new technology. To the extent that CAFE constrains the production and sale of larger, heavier vehicles, it limits auto safety.

[6] Sam refers to a National Academy of Sciences/National Research Council (NRC) study, Effectiveness and Impact of Corporate Average Fuel Economy (CAFE) Standards. See pp. 25-29, especially p. 27. The NRC estimates that in 1993, a typical year, downweighting and downsizing of cars contributed to 1,300 to 2,600 auto fatalities, 13,000 to 26,000 incapacitating injuries, and 97,000 to 195,000 total injuries.  

[7] The so-called Energy Independence and Security Act (EISA). Click here to read the Congressional Research Service’s summary of the EISA provisions.

[8] Prior to investigating, I had assumed there must be at least 30-50 models on the road that met the fuel economy standards mandated by the 2007 Energy Independence and Security Act. But EPA’s fuel economy ratings for model year 2008 reveal that only two of 1,153 models, the Toyota Prius and Honda Civic Hybrid, met or exceeded the standard (35 mpg for both city and highway driving conditions).

[9] Click here to read the Congressional Research Service’s summary of the EISA provisions.

[10] I provide references here on biofuel policy and world hunger. In May 2008, the International Food Policy Research Institute estimated that biofuel demand accounted for 30% of the increase in world cereal prices during 2007-2008. For further discussion, see Dennis Avery’s October 2008 paper for the Competitive Enterprise Institute. 

[11] Dennis’s CEI paper recaps the literature on CO2 increases from biofuel policy-induced land-use changes, including Searchinger et al. (2008) and Fargione et al. (2008). Additional reviews of these studies are available on World Climate Report and CO2Science.Org.

Today’s post in my series of commentaries on excerpts from CEI’s film, Policy Peril: Why Global Warming Policies Are More Dangerous Than Global Warming Itself, challenges the Gorethodox dogma that the science debate on global warming is “over.”

There are three basic issues in the climate change science debate:

  • Detection – Has the world warmed, and if so, by how much?
  • Attribution – How much of the observed warming (especially since the mid-1970s) is due to increases in atmospheric greenhouse gas concentrations?
  • Sensitivity – How much additional warming should we expect from continuing increases in greenhouse gas concentrations?

Despite what you’ve heard over and over again, these basic issues are unsettled, and more so now than at any time in the past decade. The science debate is not “over.” Reports of the death of climate skepticism have been greatly exaggerated.

Because of time constraints (Policy Peril runs under 40 minutes), the film briefly explores only the most important of the three basic issues: climate sensitivity. Today’s clip comes from that part of the film: an interview with University of Alabama in Huntsville atmospheric scientist Dr. Roy Spencer. To watch the Spencer interview, click here. To watch the entire movie, click here.

Here’s how this post is organized. First, I’ll reproduce the text of Spencer’s interview. Then, I’ll review some recent research bearing on the three fundamental science issues: detection, attribution, and sensitivity.

Text of today’s film clip:

Narrator: All the IPCC models assume that a CO2-induced warming will produce more high-altitude cirrus clouds, which then trap even more heat in the atmosphere. This is what’s called a positive climate feedback. Roy Spencer and his colleagues use satellites to study cirrus cloud behavior.

Dr. Roy Spencer (University of Alabama in Huntsville): Last August, August of 2007, we published research which showed from a whole bunch of satellite data that when the tropical atmosphere heats up–there are these periods when the atmosphere heats up from more rain activity or cools down from less rain activity–that when it heats up, the skies actually open up. The cirrus clouds that are up high, in the troposphere, in the upper atmosphere, open up and let more cooling infrared radiation escape to space. And it was a very strong effect.

Narrator: Spencer says that if climate models incorporated the negative feedback his team discovered, the models might forecast 75% less warming.

This is definitely not the Al Gore view of climate sensitivity. In fact, in An Inconvenient Truth (p. 67), Gore suggests we could get “three times as much” warming by mid-century as has occurred since the “depth of the last ice age.” That would mean a warming of 10ºC-12ºC by mid-century! Gore’s implicit warming forecast goes way beyond the IPCC best-estimate forecast range of 1.8ºC to 4.0ºC (IPCC WGI AR4, Summary for Policymakers, p. 13). As we’ll see below, several strands of evidence suggest that the IPCC models are also too “hot.”

Detection

The world has warmed overall during the past 130 years, as evidenced by melting glaciers, longer growing seasons, and both proxy and instrumental data. However, the main era of “anthropogenic” global warming supposedly began in the mid-1970s, and ongoing research by retired meteorologist Anthony Watts leaves no doubt that in recent decades, the U.S. surface temperature record–reputed to be the best in the world–is unreliable and riddled with false warming biases.

Watts and a team of more than 650 volunteers have visually inspected and photographically documented 1,003, or 82%, of the 1,221 climate monitoring stations overseen by the National Weather Service. In a report summarizing an earlier phase of the team’s investigation (a survey of 860+ stations), Watts says, “We were shocked by what we found.” He explains:

We found stations located next to exhaust fans of air conditioning units, surrounded by asphalt parking lots and roads, on blistering-hot rooftops, and near sidewalks and buildings that absorb and radiate heat. We found 68 stations located at wastewater treatment plants, where the process of waste digestion causes temperatures to be higher than in surrounding areas.

In fact, we found that 89 percent of the stations–nearly 9 of every 10–fail to meet the National Weather Service’s own siting requirement that stations must be 30 meters (about 100 feet) or more away from an artificial heating or radiating/reflecting heat source. In other words, 9 of every 10 stations are likely reporting higher or rising temperatures because they are badly sited.

“It gets worse,” Watts continues:

We observed that changes in the technology of temperature stations over time also have caused them to report a false warming trend. We found gaps in the data record that were filled in with data from nearby sites, a practice that propagates and compounds errors. We found adjustments to the data by both NOAA and another government agency, NASA, cause recent temperatures to look even higher.

How big a problem is this? According to Watts, “The errors in the record exceed by a wide margin the purported rise in temperature of 0.7ºC (about 1.2ºF) during the twentieth century.” Based on analysis of 948 stations rated as of May 31, 2009, Watts estimates that 22% of stations have an expected error of 1ºC, 61% have an expected error of 2ºC, and 8% have an expected error of 5ºC.

[Figure: distribution of surveyed stations by expected temperature error (Watts, Figure 23)]

Watts concludes that “this record should not be cited as evidence of any trend in temperature that may have occurred across the U.S. during the past century.” He further concludes: “Since the U.S. record is thought to be ‘the best in the world,’ it follows that the global database is likely similarly compromised and unreliable.”

A related issue is the influence of urban heat islands on long-term temperature records. Climate Change Reconsidered, a report by the Nongovernmental International Panel on Climate Change (NIPCC), written by Drs. Craig Idso and S. Fred Singer with 35 contributors and reviewers, reviews more than 40 studies on urban heat islands. For example, a study by Oke (1973) of the urban heat island strength of 10 settlements in the St. Lawrence Lowlands of Canada found that a population as small as 1,000 people could generate a heat island effect of 2ºC-2.5ºC. From this study and the others reviewed, the NIPCC concludes:

It appears almost certain that surface-based temperature histories of the globe contain a significant warming bias introduced by insufficient corrections for the non-greenhouse-gas-induced urban heat island effect. Furthermore, it may well be impossible to make proper corrections for the deficiency, as the urban heat island of even small towns dwarfs any concomitant augmented greenhouse effect that may be present [p. 95; emphasis in original].

In a comment submitted to EPA regarding its proposed endangerment finding, University of Alabama in Huntsville (UAH) atmospheric scientist John Christy notes two additional reasons to conclude that the IPCC surface data records exaggerate warming trends:

As a culmination of several papers and years of work, Christy et al. (2009) demonstrates that popular surface datasets overstate the warming that is assumed to be greenhouse related for two reasons. First, these datasets use only stations that are electronically (i.e. easily) available, which means the unused, vast majority of stations (usually more rural and representative of actual trends but harder to find) are not included. Secondly, these popular datasets use the daily mean surface temperature (TMean) which is the average of the daytime high (TMax) and nighttime low (TMin). In this study (and its predecessors, Christy 2002, Christy et al. 2006, Pielke Sr. et al. 2008, Walters et al. 2007 and others) we show that TMin is seriously impacted by surface development, and thus its rise is not an indicator of greenhouse gas forcing. Some have called this the Urban Heat Island effect, but, as described in Christy et al. 2009, it is much more than this and encompasses any development of the surface (e.g. irrigated agriculture).

For example, the UK Hadley Center, relying on two electronic surface stations, computed a TMax temperature trend in East Africa of 0.14ºC per decade during 1905-2004. Christy, using data from 45 stations, found a trend of only 0.02ºC per decade.

[Figure: East Africa TMax trends, Christy (45 stations) vs. Hadley Center (2 stations)]

In California, Christy found that the only significant warming trend is for TMin in the irrigated San Joaquin Valley. Note, in the non-irrigated Sierra mountains, where models project a greenhouse gas-induced warming should occur, there is actually a decreasing temperature trend.

[Figure: California TMin trends (Christy)]

Obviously, temperature data are the starting point of any analysis of global warming. But if we can’t trust the U.S. and IPCC temperature records, how do we know how much global warming has actually occurred?

Satellite observations are not influenced by heat islands and irrigation, or subject to the quality control problems detailed by Watts. Moreover, satellite records tally well with weather balloon observations–an independent database. So maybe detection should be based solely on satellite data, which do show some warming over the past 30 years. However, the “debate is over” crowd is unlikely to embrace this solution. The satellite record shows a relatively slow rate of warming–about 0.13ºC per decade–hence a relatively insensitive climate.

[Figure: UAH satellite temperature anomalies, January 1979 to June 2009]

Moreover, as can be seen in the above chart of the University of Alabama-Huntsville (UAH) satellite record, some of the 0.13ºC/decade “trend” comes from the 1998 El Nino warming pulse. Remove 1998, and the 30-year satellite record trend drops to 0.12ºC/decade.
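The leverage a single hot year exerts on a 30-year least-squares trend can be illustrated with a small sketch. The anomaly series below is synthetic, invented purely for illustration; it is not the actual UAH record:

```python
# Sketch: how one anomalously warm year inflates a short
# ordinary-least-squares trend. Synthetic data only.

def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(1979, 2009))              # 30 "years" of data
anoms = [0.012 * (y - 1979) for y in years]  # 0.12 C/decade baseline
anoms[years.index(1998)] += 0.4              # El Nino-like spike

trend_all = ols_slope(years, anoms) * 10     # C per decade, spike included
kept = [(y, a) for y, a in zip(years, anoms) if y != 1998]
trend_no98 = ols_slope([y for y, _ in kept], [a for _, a in kept]) * 10

print(f"with 1998: {trend_all:.3f} C/decade, without: {trend_no98:.3f} C/decade")
```

With these invented numbers, the fitted trend falls from about 0.128 to 0.120 ºC/decade once the spike year is excluded, mirroring the 0.13-versus-0.12 comparison in the text.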

Attribution

The IPCC, the leading spokesman for the alleged scientific consensus, claims that, “Most of the observed increase in global average temperatures since the mid-20th century is very likely due to the observed increase in anthropogenic greenhouse gas concentrations.” How does the IPCC know this? The IPCC offers three main reasons.

First, according to the IPCC, “Paleoclimate reconstructions show that the second half of the 20th century was likely the warmest 50-year period in the Northern hemisphere in 1300 years” (IPCC AR4, WGI, Chapt. 9, p. 702). The warmth of recent decades coincided with a rapid increase in GHG concentrations. Therefore, the IPCC reasons, most of the recent warming is likely due to anthropogenic GHG emissions.

This argument is unconvincing if the warming of recent decades is not unusual or unprecedented in the past 1300 years. As it happens, numerous studies indicate that the Medieval Warm Period (MWP)–roughly the period from AD 800 to 1300, with peak warmth occurring about AD 1050–was as warm as or warmer than the Current Warm Period (CWP).

The Center for the Study of Carbon Dioxide and Global Change has analyzed more than 200 peer-reviewed MWP studies produced by more than 660 individual scientists working in 385 separate institutions from 40 countries. The Center divides these studies into three categories–those with quantitative data enabling one to infer the degree to which the peak of the MWP differs from the peak of the CWP (Level 1), those with qualitative data enabling one to infer which period was warmer (Level 2), although not by how much, and those with data enabling one to infer the existence of a MWP in the region studied (Level 3). An interactive map showing the sites of these studies is available at CO2Science.org.

Only a few Level 1 studies determined the MWP to have been cooler than the CWP; the vast majority indicate a warmer MWP. On average, the studies indicate that the MWP was 1.01ºC warmer than the CWP.

[Figure: distribution of Level 1 MWP studies]

Figure Description: The distribution, in 0.5ºC increments, of Level 1 studies that allow one to identify the degree to which peak MWP temperatures either exceeded (positive values, red) or fell short of (negative values, blue) peak CWP temperatures.

Similarly, the vast majority of Level 2 studies indicate a warmer MWP:

[Figure: distribution of Level 2 MWP studies]

Figure Description: The distribution of Level 2 studies that allow one to determine whether peak MWP temperatures were warmer than (red), equivalent to (green), or cooler than (blue), peak CWP temperatures.

The IPCC’s second main reason for attributing most recent warming to the increase in GHG concentrations is that climate models “cannot reproduce the rapid warming observed in recent decades when they only take into account variations in solar output and volcanic activity. However . . . models are able to simulate observed 20th century changes when they include all of the most important external factors, including human influences from sources such as greenhouse gases and natural external factors” (IPCC, AR4, WGI, Chapt. 9, p. 702).

This would be decisive if today’s models accurately simulate all important modes of natural variability. In fact, models do not accurately simulate the behavior of clouds and ocean cycles. They may also ignore important interactions between the Sun, cosmic rays, and cloud formation.

Richard Lindzen of MIT spoke to this point at the Heartland Institute’s recent (June 2, 2009) Third International Conference on Climate Change:

What was done [by the IPCC], was to take a large number of models that could not reasonably simulate known patterns of natural behavior (such as ENSO, the Pacific Decadal Oscillation, the Atlantic Multi-Decadal Oscillation), claim that such models nonetheless adequately depicted natural internal climate variability, and use the fact that models could not replicate the warming episode from the mid seventies through the mid nineties, to argue that forcing was necessary and that the forcing must have been due to man. The argument makes arguments in support of intelligent design seem rigorous by comparison.

“Fingerprint” studies are the third basis on which the IPCC attributes most recent warming to anthropogenic greenhouse gases. Climate models project a specific pattern of warming through the vertical profile of the atmosphere–a greenhouse “fingerprint.” If the observed warming pattern matches the model-projected fingerprint, that would be strong evidence that recent warming is anthropogenic. Conversely, notes the NIPCC, “A mismatch would argue strongly against any significant contribution from greenhouse gas (GHG) forcing and support the conclusion that the observed warming is mostly of natural origin” (NIPCC, p. 106).

Douglass et al. (2007) compared model-projected and observed warming patterns in the tropical troposphere. The observed pattern is based on three compilations of surface temperature records, four balloon-based records of the surface and lower troposphere, and three satellite-based records of various atmospheric layers–10 independent datasets in all.

“While all greenhouse models show an increasing warming trend with altitude, peaking around 10 km at roughly two times the surface value,” observes the NIPCC, “the temperature data from balloons give the opposite result; no increasing warming, but rather a slight cooling with altitude” (p. 107). See the figures below.

[Figure: hot-spot]

The mismatch between the model-predicted greenhouse fingerprint and the observed pattern is profound. As the Douglass team explains: “Model results and observed temperature trends are in disagreement in most of the tropical troposphere, being separated by more than twice the uncertainty of the model mean. In layers near 5 km, the modeled trend is 100% to 300% higher than observed, and above 8 km, modeled and observed trends have opposite signs.”

[Figure: douglass]

Figure description: Temperature trends for satellite era (ºC/decade). HadCRUT, GHCN and GISS are compilations of surface temperature observations. IGRA, RATPAC, HadAT2, and RAOBCORE are balloon-based observations of surface and lower troposphere. UAH, RSS, UMD are satellite-based data for various layers of the atmosphere. The 22-model average comes from an ensemble of 22 model simulations from the most widely used models worldwide. The red lines are the +2 and -2 standard errors of the mean from the 22 models. Source: Douglass et al. 2007.

The NIPCC concludes that the mismatch of observed and model-calculated fingerprints “clearly falsifies the hypothesis of anthropogenic global warming (AGW)” (p. 108). I would put the state of affairs more cautiously. In view of (1) significant evidence that the MWP was as warm as or warmer than the CWP, (2) the inability of climate models to simulate important modes of natural variability, and (3) the failure of observations to confirm a greenhouse fingerprint in the tropical troposphere, the IPCC claim that “most” recent warming is “very likely” anthropogenic should be considered a boast rather than a balanced assessment of the evidence.

Climate Sensitivity

The most important unresolved scientific issue in the global warming debate is how sensitive (reactive) the climate is to increases in GHG concentrations.

Climate sensitivity is typically defined as the global average surface warming following a doubling of carbon dioxide (CO2) concentrations above pre-industrial levels. The IPCC says a doubling is likely to produce warming in the range of 2ºC to 4.5ºC, with a most likely value of about 3ºC (IPCC, AR4, WGI, Chapt. 10, p. 749). The IPCC presents a range rather than a specific value because of uncertainty regarding the strength of the relevant feedbacks.

In a hypothetical climate with no feedbacks, positive or negative, a CO2 doubling would produce 1.2ºC of warming (IPCC, AR4, WGI, Chapt. 8, p. 631). In most climate models, the dominant feedbacks are positive, meaning that the warmth from rising GHG levels causes other changes (in water vapor, clouds, or surface reflectivity, for example) that either increase the retention of outgoing long-wave radiation (OLR) or decrease the reflection of incoming short-wave radiation (SWR).
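Since most of the projected warming comes from the feedbacks rather than from CO2 alone, it may help to see the arithmetic. Below is a minimal sketch using the standard zero-dimensional gain relation (warming with feedbacks = no-feedback warming / (1 - f), where f is the combined feedback factor); the function name and the illustrative feedback factors are mine, and the relation is a textbook simplification, not anything quoted from the IPCC report:

```python
# Zero-dimensional feedback "gain" relation, a standard textbook
# approximation (not taken from the IPCC text itself): warming after
# feedbacks equals the no-feedback warming divided by (1 - f),
# where f is the combined feedback factor.

def equilibrium_warming(no_feedback_warming_c: float, feedback_factor: float) -> float:
    """Equilibrium surface warming for a CO2 doubling under a simple gain model."""
    if feedback_factor >= 1.0:
        raise ValueError("A feedback factor of 1 or more implies a runaway response.")
    return no_feedback_warming_c / (1.0 - feedback_factor)

NO_FEEDBACK_WARMING_C = 1.2  # deg C per CO2 doubling with no feedbacks (IPCC AR4 WGI, Ch. 8)

print(round(equilibrium_warming(NO_FEEDBACK_WARMING_C, 0.0), 2))   # no feedbacks: 1.2
print(round(equilibrium_warming(NO_FEEDBACK_WARMING_C, 0.6), 2))   # strong positive feedback: 3.0
print(round(equilibrium_warming(NO_FEEDBACK_WARMING_C, -0.5), 2))  # net negative feedback: 0.8
```

The point of the sketch is simply that the same 1.2ºC base response yields the IPCC’s “most likely” 3ºC if feedbacks are strongly positive, but well under 1ºC if the net feedback is negative, which is why the sign of the feedbacks dominates the debate.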

In his speech at the June 2 Heartland Institute conference, Professor Lindzen summarized his research on climate sensitivity, which has since been accepted for publication by Geophysical Research Letters. Lindzen argues that climate feedbacks and sensitivity can be inferred from observed changes in OLR and SWR following observed changes in sea-surface temperatures. For fluctuations in OLR and SWR, Lindzen and his colleagues used the 16-year record (1985-1999) from the Earth Radiation Budget Experiment (ERBE), as corrected for altitude variations associated with satellite orbital decay. For sea surface temperatures, they used data from the National Centers for Environmental Prediction. For climate model simulations, they used 11 IPCC models forced with the observed sea-surface temperature changes.

The results are striking. All 11 IPCC models show positive feedback, “while ERBE unambiguously shows a strong negative feedback.”

[Figure: lindzen-erbe-vs-models1]

Figure description: ERBE data show increasing top-of-the-atmosphere radiative flux (OLR plus reflected SWR) as sea surface temperatures rise whereas models forecast decreasing radiative flux. Source: Lindzen and Choi 2009.

The ERBE data indicate that the sensitivity of the actual climate system “is narrowly constrained to about 0.5ºC,” Lindzen estimates. “This analysis,” says Lindzen in a recent commentary, “makes clear that even when all models agree, they can be wrong, and that this is the situation for the all-important question of climate sensitivity.”

[Figure: erbe-v-model-sensitivity4]

At the Heartland Institute’s Second International Conference on Climate Change (March 2009), Dr. William Gray of Colorado State University presented satellite-based research that may explain the low climate sensitivity the Lindzen team infers from the ERBE data.

The IPCC climate models assume that CO2-induced warming significantly increases upper troposphere clouds and water vapor, trapping still more OLR that would otherwise escape to space. Most of the projected warming in the models comes from this positive water vapor/cloud feedback, not from the CO2. Satellite observations do not support this hypothesis, Gray contends:

Observations of upper tropospheric water vapor over the last 3-4 decades from the National Centers of Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis data and the International Satellite Cloud Climatology Project (ISCCP) data show that upper tropospheric water vapor appears to undergo a small decrease while Outgoing Longwave Radiation (OLR) undergoes a small increase. This is the opposite of what has been programmed into the GCMs [General Circulation Models] due to water vapor feedback.

The figure below comes from the NCEP/NCAR reanalysis of upper troposphere water vapor and OLR.

[Figure: reanalysis-olr-and-water-vapor-50]

Figure description: NCEP/NCAR reanalysis of standardized anomalies of 400 mb (~7.5 km altitude) water vapor content (i.e., specific humidity, in blue) and OLR (in red) from 1950 to 2008. Note the downward trend in moisture and the upward trend in OLR.

Gray’s paper deals with water vapor in the upper troposphere. What about high-altitude cirrus clouds, which climate models also predict will increase and trap more OLR as GHG concentrations increase?

Spencer et al. (2007), the study Dr. Spencer spoke about in today’s Policy Peril film clip, found a strong negative cirrus cloud feedback mechanism in the tropical troposphere. Instead of steadily building up as the tropical oceans warm, cirrus cloud cover suddenly contracts, allowing more OLR to escape. As mentioned, Spencer estimates that if this mechanism operates on decadal time scales, it would reduce model estimates of global warming by 75%.

A 2008 study by Spencer and colleague William D. Braswell examines the issue of climate feedbacks related to low-level clouds. Lower troposphere clouds tend to cool the Earth by reflecting incoming SWR. Observations indicate that warmer years have less cloud cover compared to cooler years. Modelers have interpreted this correlation as a positive feedback effect in which warming reduces low-level cloud cover, which then produces more warming.

Spencer and Braswell found that climate modelers could be mixing up cause and effect. Random variations in cloudiness can cause substantial decadal variations in ocean temperatures. So it is equally plausible that the causality runs the other way, and increases in sea-surface temperature are an effect of natural cloud variations. If so, then climate models forecast too much warming. For more on this, visit Spencer’s Web site.

In a study now in peer review for possible publication in the Journal of Geophysical Research, Spencer and colleagues analyzed 7.5 years of NASA satellite data and “discovered,” he reports on his Web site, “that, when the effects of clouds-causing-temperature-change is accounted for, cloud feedbacks in the real climate system are strongly negative.” “In fact,” he continues, “the resulting net negative feedback was so strong that, if it exists on the long time scales associated with global warming, it would result in only 0.6ºC of warming by late in this century.”

In related ongoing satellite research, Spencer finds new evidence that “most” warming of the past century “could be the result of a natural cycle in cloud cover forced by a well-known mode of natural climate variability: the Pacific Decadal Oscillation (PDO).”

Whether or not the PDO proves to be a major player in climate change, Spencer has identified a potentially serious error in all IPCC modeling efforts:

Even though they never say so, the IPCC has simply assumed that the average cloud cover of the Earth does not change, century after century. This is a totally arbitrary assumption, and given the chaotic variations that the ocean and atmosphere circulations are capable of, it is probably wrong. Little more than a 1% change in cloud cover up or down, and sustained over many decades, could cause events such as the Medieval Warm Period or the Little Ice Age.

Finally, recent temperature history also suggests that most climate models are too “hot.” Dr. Patrick Michaels touched on this topic in Policy Peril (albeit not in today’s excerpt).

Carbon dioxide emissions and concentrations are increasing at an accelerating rate (Canadell, J.G. et al. 2008). Yet, there has been no net warming since 2001 and no year was as warm as 1998.

[Figure: global-temperature-past-decade]

Figure description: Observed monthly global temperature anomalies, January 2001 through April 2009, as compiled by the Climate Research Unit. Source: Paul C. Knappenberger.

Paul C. Knappenberger (“Chip” to his friends) quite reasonably wonders, “[H]ow long a period of no warming can be tolerated before the forecasts of the total warming by century’s end have to be lowered?” After all, he continues, “We’re already into the ninth year of the 100 year forecast and we have no global warming to speak of.” It is instructive to compare these data with climate model projections.

A good place to start is with the climate model projections that NASA scientist James Hansen presented in his 1988 congressional testimony, which launched the modern global warming movement.

The figure below, from congressional testimony by Dr. John Christy, a colleague of Roy Spencer at the University of Alabama in Huntsville, shows how Hansen’s model and reality diverge.

[Figure: hansen-models-vs-reality1]

Figure description: The red, orange, and purple lines are Hansen’s model forecasts of global temperatures under different emission scenarios. The green and blue lines are actual temperatures from two independent satellite records. Source: John Christy.

“All model projections show high sensitivity to CO2 while the actual atmosphere does not,” Christy notes. “It is noteworthy,” he adds, “that the model projection for drastic CO2 cuts still overshot the observations. This would be considered a failed hypothesis test for the models from 1988.”

What about the models used by the IPCC in its 2007 Fourth Assessment Report (AR4)? How well are they replicating global temperatures?

[Figure: ipcc-models-vs-recent-temperatures]

This figure, also from Dr. Christy’s testimony, is adapted from Dr. Patrick Michaels’s testimony of February 12, 2009. The red and orange lines show the upper and lower significant range (95% of all model runs are between the lines) of global temperature trends calculated by 21 IPCC AR4 models for multi-year segments ending in 2020. The blue and green lines show observed temperatures ending in 2008 from satellite (University of Alabama in Huntsville) and surface (U.K. Hadley Center for Climate Change) records.

Christy comments:

The two main points here are (1) the observations are much cooler than the mid-range of the model spread and are at the minimum of the model simulations and (2) the satellite adjustment for surface comparisons is exceptionally good. The implication of (1) is that the best estimates of the IPCC models are too warm, or that they are too sensitive to CO2 emissions.

Christy illustrates this another way in his comment on EPA’s endangerment proposal.

[Figure: christy-models-standard-error1]

Figure description: Mean and standard error of 22 IPCC AR4 model temperature projections in the mid-range (A1B) emissions scenario. From 1979 to 2008, the mean projection of the models is a warming of 0.22ºC per decade. HADCRUT3v (green) is a surface dataset, UAH (blue) and RSS (purple) are satellite data sets.

Christy comments:

. . . even with these likely spurious warming effects in HADCRUT3v and RSS, the mean model trends are still significantly warmer than the observations at all time scales examined here. Thus, the model mean sensitivity, a quantity utilized by the IPCC as about 2.6ºC per doubled CO2, is essentially contradicted in these comparisons.

Michaels, in his testimony, shows that if year 2008 temperatures persist through 2009, then the observed temperature trend will fall below the 95% confidence range of model projections. In other words, the models will have less than a 5% probability of being correct.

[Figure: ipcc-models-vs-temperatures-through-2009]

Although the IPCC AR4 models have not failed yet, they are, in Michaels’s words, “in the process of failing,” and the longer the current temperature regime persists, the worse the models will perform.

Conclusion

The climate science debate is not “over.” In fact, it is just starting to get very, very interesting. All the basic issues–detection, attribution, and sensitivity–are unsettled and more so today than at any time in the past decade.

A final thought–anyone who wants further convincing that the debate is not over should read the marvelous NIPCC report. On a wide range of issues (nine main topics and 60 sub-topics), the report demonstrates that the scientific literature allows, and even favors, reasonable alternative assessments to those presented by the IPCC.

P.S. Previous posts in this series are available below:

  • Policy Peril: Looking for an antidote to An Inconvenient Truth? Your search is over
  • Policy Peril Segment 1: Heat Waves
  • Policy Peril Segment 2: Air Pollution
  • Policy Peril Segment 3: Hurricanes
  • Policy Peril Segment 4: Sea-Level Rise
  • In An Inconvenient Truth, Al Gore warns that global warming could raise sea levels by 20 feet, and he implies that this could happen quite suddenly–in our lifetimes or those of our children.

    Specifically, on pp. 204-206 of the book version of AIT, Gore warns that if half the Greenland Ice Sheet and half the West Antarctic Ice Sheet melted or broke up and slipped into the sea, some 100 million people living in Beijing, Shanghai, Calcutta, and Bangladesh would  “be displaced,” “forced to move,” or “have to be evacuated.”  Is there any truth to it?

    Today’s clip from CEI’s film Policy Peril: Why Global Warming Policies Are More Dangerous Than Global Warming Itself, again features Dr. Patrick Michaels of the Cato Institute, former Virginia State Climatologist, author of several superb books (most recently, Climate of Extremes: The Global Warming Science They Don’t Want You to Know), and prolific blogger on World Climate Report.

    To watch today’s video clip, click here. To watch Policy Peril from start to finish, click here.

    The text of today’s excerpt immediately follows. It includes some graphics from the film and footnotes to the pertinent scientific literature.

    Dr. Patrick Michaels: This even as there is a purported large melt of ice from Greenland. It turned around — the thermohaline circulation became stronger. [1] 

    Narrator: Hmm. These facts are inconvenient only for the makers of An Inconvenient Truth. But who can forget the scenes where a 20-foot wall of water rolls across the world’s coastal communities? In the book version [of AIT], Gore says, “If Greenland melted or broke up and slipped into the sea–or if half of Greenland and half of Antarctica melted or broke up and slipped into the sea–sea levels worldwide would increase by 18 to 20 feet.” Reality check! How much ice is Greenland shedding?

    Dr. Michaels: The actual loss of ice from Greenland is about 25 cubic miles per year. [2] Now, if that seems like a lot, there are about 700,000 cubic miles of ice on Greenland. The loss rate is four-tenths of one percent of its ice mass, per century. I didn’t say per year. I didn’t say per decade. I said four-tenths of one percent per century. [3]

    Narrator: That translates into how much sea-level rise?

    Dr. Michaels: If you take a look at the IPCC’s latest volume, by the year 2100, they have two inches of sea-level rise resulting from the loss of Greenland ice. Not two feet. Not 20 feet. Two inches! [4] That’s the “consensus of scientists,” okay. Whether or not we believe in consensus science, that’s what they say.

    Narrator:  Gore says global warming could melt half of Greenland. Is that plausible?

    Dr. Michaels: The United Nations [IPCC] projects that if we raise carbon dioxide to four times the background level–that would be about 1,100 parts per million, right now we’re at about 385 parts per million–and maintain that for 1,000 years, that Greenland would lose about half its ice in a millennium. [5] Now, we don’t have enough fossil fuel to maintain that concentration for 1,000 years.

    Narrator: Gore also says half the ice sheet could break off because of “moulins.” For me, this was the scariest part of  An Inconvenient Truth. Moulins are cracks that channel meltwater from the surface of the ice sheet to the bedrock below. By lubricating the bedrock, moulins could destabilize the ice sheet, Gore says. [6]

    [Figure: moulin]

    Well, a recent study in Science magazine lays that fear to rest. A small meltwater lake poured down a moulin at a flow rate exceeding that of Niagara Falls. [7] Yet, Science magazine reports, “For all the lake’s water dumped under the ice that day, and all the water drained into new moulins in the following weeks, the ice sheet moved only an extra half meter near the drained lake.” [8] An extra half meter. [9]

    Notes

    [1] The thermohaline circulation “became stronger.” Dr. Michaels (Pat to his friends) just finished debunking Gore’s claim that ice melt from Greenland will inject enough fresh water into the North Atlantic to disrupt the thermohaline circulation (THC, a.k.a. oceanic “conveyor belt”), which most scientists–though not Richard Seager of Columbia University’s Lamont-Doherty Earth Observatory–believe is responsible for Europe’s mild winters. There was a brief scare about THC shutdown a few years ago when Bryden et al. (2005) reported that the Atlantic branch of the conveyor belt slowed by 30% between 1957 and 2004. But one year later, Richard Kerr of Science magazine reported on new data showing that the Bryden study was a “false alarm.” In fact, Dr. Michaels says, alluding to Boyer et al. (2006) and Latif et al. (2006), the THC became stronger. The Center for the Study of Carbon Dioxide and Global Change reviews Latif et al. (2006) here, and reviews many other THC studies. I discuss Gore’s warming-causes-cooling fantasy on pp. 11-12 of my April 2007 testimony before the Colorado Republican Study Committee.

    [2] The estimate of 25 cubic miles of Greenland ice loss per year comes from Luthcke et al. (2006), a study summarized here on NASA’s Web page.

    [3] Greenland has approximately 3 million cubic kilometers of ice. To convert cubic kilometers into cubic miles, you multiply the number of cubic kilometers by 0.2399. Hence, Greenland has about 719,000 cubic miles of ice. It is losing about 25 cubic miles of ice per year, which translates into a rate of 2,500 cubic miles per century. 2,500 is 4/10ths of 1% of 719,000.
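The arithmetic in this note is easy to reproduce; here is a quick sketch, with all figures as given in the post:

```python
# Back-of-envelope check of the figures in note [3], using the
# conversion factor quoted above (0.2399 cubic miles per cubic kilometer).

KM3_TO_MI3 = 0.2399

greenland_ice_km3 = 3_000_000                        # approximate total ice volume
greenland_ice_mi3 = greenland_ice_km3 * KM3_TO_MI3   # about 719,700 cubic miles

loss_per_year_mi3 = 25
loss_per_century_mi3 = loss_per_year_mi3 * 100       # 2,500 cubic miles per century

percent_of_total_per_century = 100 * loss_per_century_mi3 / greenland_ice_mi3

print(round(greenland_ice_mi3))                      # ~719,700 cubic miles
print(round(percent_of_total_per_century, 2))        # ~0.35, i.e., roughly four-tenths of one percent
```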

    [4] In the IPCC’s mid-range emissions scenario (A1B), Greenland ice loss contributes between 1 centimeter (cm) and 8 cm of sea-level rise in the 21st Century; in the IPCC’s high-end emissions scenario (A1FI), Greenland ice loss contributes between 2 cm and 12 cm (IPCC AR4 WGI, Chapter 10: Global Climate Projections, Table 10.7, p. 820). Translating into inches, Greenland ice loss contributes between 0.4 and 3.1 inches in the mid-range A1B emissions scenario and between 0.8 and 4.7 inches in the high-end A1FI emissions scenario.
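For readers who want to verify the unit conversions, here is a minimal sketch (scenario labels are mine; the centimeter ranges are as quoted from the IPCC table; note that 2 cm works out to about 0.8 inches):

```python
# Converting the IPCC Greenland sea-level contributions from
# centimeters to inches (2.54 cm per inch).

CM_PER_INCH = 2.54

def cm_to_inches(cm: float) -> float:
    return cm / CM_PER_INCH

scenarios = {"A1B (mid-range)": (1, 8), "A1FI (high-end)": (2, 12)}
for name, (low_cm, high_cm) in scenarios.items():
    print(f"{name}: {cm_to_inches(low_cm):.1f} to {cm_to_inches(high_cm):.1f} inches")
```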

    [5] Pat here refers to Ridley et al. (2005), as reviewed in Chapter 10 of IPCC AR4 WGI, on p. 830. The figure below shows what the researchers project would happen to Greenland’s ice if carbon dioxide concentrations increase to four times pre-industrial levels and stay there for 3,000 years.

    [Figure: greenland-ice-melt-ipcc-751]

    [6] Gore’s photograph and diagram of moulins come from Zwally et al. (2002), published in Science magazine.

    [Figure: moulin-diagram]

    In AIT, Gore animates the diagram so that the ice sheet begins to break apart at the E.Q. (equilibrium) line. This is thoroughly misleading. The E.Q. line of an ice sheet is the elevation at which glacier melting and snow accumulation are equal. Above the E.Q. line, snow accumulation exceeds ice melt; below it, ice melt exceeds snow accumulation. The E.Q. line is not a fault line or fissure in the ice.

    More importantly, Zwally et al. (2002) is not evidence of an impending ice sheet crackup. The researchers found that moulins associated with summer ice melt accelerate glacial flow, but only by a few percent. For example, the flow rate of one outlet glacier increased from 31.3 cm/day in winter to 40.1 cm in July, falling back to 29.8 cm in August, increasing annual movement by about 5 meters. Apocalypse not!

    [7] In a study updating the Zwally team’s research, Joughin et al. (2008) found somewhat more glacier acceleration associated with summer ice melt and moulins. However, the study’s bottom-line conclusion is pointedly non-apocalyptic:

    Surface-enhanced basal lubrication has been invoked previously as a feedback that would hasten the ice sheet’s demise in a warming climate. Our results show that several fast-flowing outlet glaciers, including Jakobshavn Isbrae, are relatively insensitive to this process . . . Our results thus far suggest that surface-melt enhanced lubrication will have a substantive but not catastrophic effect on the Greenland Ice Sheet’s future evolution.

    [8] In a companion article (cited in this Policy Peril excerpt), Science magazine reporter Richard Kerr quotes Pennsylvania State University glaciologist Richard Alley on moulin-induced ice sheet lubrication:

    “Is it run for the hills, the ice is falling into the ocean?” asks Alley. “No, it matters but it’s not huge.”

     Kerr goes on to observe, as noted above, that an entire 4 km-long, 8 m-deep melt-water lake disappeared down a moulin in about 1.4 hours–at an average rate of 8,700 cubic meters per second, “exceeding the average flow rate of Niagara Falls.” Yet, despite all the water dumped under the ice that day and all the water drained into new moulins in the following weeks, the ice sheet moved only “an extra half meter near the drained lake.” 
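The volume implied by those figures is straightforward to check; here is a back-of-envelope sketch (flow rate and duration as quoted from Kerr’s article; the resulting total is roughly 44 million cubic meters):

```python
# Back-of-envelope check on the drainage figures quoted from Kerr's
# article: roughly 8,700 cubic meters per second sustained for ~1.4 hours.

flow_m3_per_s = 8_700
duration_s = 1.4 * 3600            # about 1.4 hours, in seconds

drained_volume_m3 = flow_m3_per_s * duration_s
print(round(drained_volume_m3 / 1e6, 1))   # ~43.8 (million cubic meters)
```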

    [9] To put the extra half meter of glacial movement in perspective, consider that the Greenland Ice Sheet extends 2,530 kilometers (1,570 miles) north-south and has a maximum width of 1,094 kilometers (680 miles) near its northern margin.

    In a segment of Policy Peril immediately following today’s film excerpt, Pat also discusses studies in Science magazine indicating that the West Antarctic Ice Sheet (WAIS) is more stable than scientists previously believed. The researchers found that outlet glaciers drag debris under the ice that piles up into “wedges.” These hidden land forms then prop up and stabilize the ice shelf.

    [Figure: stabilizing-wedges]

    The significance? Scientists once worried that sea-level rise of just a few feet could lift the WAIS off its island moorings, hastening its break up and demise. However, as Anderson (2007) reports in Science magazine, in a review of Anandakrishnan et al. (2007) and Alley et al. (2007) and their discovery of stabilizing land forms under the WAIS, “At the current rate of sea level rise, it would take several thousand years to float the ice sheet off its bed.”

    A more recent study by Pollard and DeConto (2009), reviewed by the Center for the Study of Carbon Dioxide and Global Change, concludes that “the WAIS will begin to collapse when nearby ocean temperatures warm by roughly 5ºC.” How long would that take? 

    In a companion article, Huybrechts (2009) estimates that, “The required ocean warmings, on the order of 5ºC, may well take several centuries to develop.” He asserts that “such an outcome could result from the accumulation of greenhouse-gas emissions projected for the twenty-first Century, if emissions are not greatly reduced.” His source here, however, is simply the IPCC report with its questionable assumptions about climate feedbacks and sensitivity. Huybrechts continues:

    The implied transition time for a total collapse of the West Antarctic ice sheet of one thousand to three thousand years [in Pollard and DeConto (2009)] seems rapid by Antarctic standards. But it is nowhere near the century timescales of West Antarctic ice sheet decay based on simple marine ice-sheet models. 

    And one to three thousand years is certainly nowhere near the years-to-decades inundation of the world’s coastal communities that Al Gore conjures up in An Inconvenient Truth.