Marlo Lewis

Proponents of the Waxman-Markey (W-M) cap-and-trade bill assure us it will cost the average household less than a postage stamp a day. The Heritage Foundation’s energy team — David Kreutzer, Ben Lieberman, Karen Campbell, William Beach, and Nicolas Loris — have rebutted this claim six ways from Sunday (see here, here, here, and here).

Some postage stamps, of course, cost more than most people’s homes. For example, this rather plain looking item, a two-pence stamp issued by the Mauritius post office in 1847, sells for $600,000 or more.


Now, nobody is saying that Waxman-Markey will cost the average household what it costs to buy a mansion, but the National Association of Home Builders (NAHB) estimates that W-M could increase the purchase price of a new home by $1,371 to $6,387, and that this would have the effect of making 337,000 to 1.57 million households unable to qualify for a home mortgage. Repeat after me: “Law of Unintended Consequences!”

NAHB summarizes its analysis on pp. 13-14 of its December 30, 2009 comment on various EPA rulemakings regarding greenhouse gases (GHGs) under the Clean Air Act. Here are the main steps:

  1. To produce the materials used to construct a typical single-family home (2,420 square feet plus two-car garage), manufacturers emit 55.42 metric tons (MT) of carbon dioxide-equivalent (CO2-e) GHGs.
  2. The U.S. Energy Information Administration (EIA), using a 4% discount rate, projects that under W-M, carbon allowances in 2030 would cost between $19 and $87 per MT.
  3. Manufacturers’ costs for producing homebuilding materials would increase by $1,037 to $4,831 per single family home (when I do the arithmetic, I get an increase of $1,052 to $4,821).
  4. Factor in additional financing and broker commissions, and the price of a typical single-family home would increase by $1,371 to $6,387.
  5. To qualify for a mortgage, borrowers may not exceed a specific “front end ratio” — the percentage of income that would be consumed paying principal and interest on the mortgage, plus property taxes and insurance. A common standard is that these payments should not exceed 28% of household income.
  6. In the low-cost case (carbon permit price = $19/MT CO2-e), roughly 337,000 households that would qualify for a mortgage before the W-M-induced price increase, no longer qualify. In the high-cost case (carbon permit price = $87/MT CO2-e), approximately 1.57 million U.S. households are priced out.
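The materials-cost arithmetic in step 3 is easy to check. A minimal sketch (using only the figures cited above — 55.42 MT of embodied CO2-e per home and EIA’s $19–$87 allowance-price range):

```python
# Back-of-envelope check of the NAHB materials-cost figures cited above.
EMISSIONS_PER_HOME = 55.42          # metric tons CO2-e embodied in materials
LOW_PRICE, HIGH_PRICE = 19.0, 87.0  # EIA-projected allowance prices, $/MT

low = EMISSIONS_PER_HOME * LOW_PRICE
high = EMISSIONS_PER_HOME * HIGH_PRICE
print(f"materials cost increase: ${low:,.2f} to ${high:,.2f}")
# 55.42 x 19 = 1,052.98 and 55.42 x 87 = 4,821.54 -- matching the
# recomputed range ($1,052 to $4,821) rather than NAHB's $1,037 to $4,831.
```

The small gap against NAHB’s published $1,037–$4,831 range presumably reflects rounding or intermediate adjustments in NAHB’s own model, which the comment does not spell out.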

Some enterprising reporter should jump on this. What do Reps. Waxman and Markey have to say about NAHB’s analysis? When they drafted the bill, what assumptions did they make about its potential impacts on housing prices and homeownership? Indeed, can they adduce any evidence that they gave even a moment’s consideration to these important matters?

Today, I submitted a comment on EPA’s proposed Prevention of Significant Deterioration and Title V Greenhouse Gas Tailoring Rule.  The gist of my argument is as follows:

In Massachusetts v. EPA, the Supreme Court legislated from the bench, authorizing and indeed pushing EPA to control emissions of greenhouse gases (GHGs) for climate change purposes. This is a policy decision of immense economic and political magnitude that Congress never intended or approved when it enacted and amended the Clean Air Act (CAA or Act).

Regulating GHGs under the CAA leads inexorably to “absurd results,” including an economically-chilling administrative quagmire. To prevent GHG regulation from overwhelming agency administrative resources and stifling economic development, EPA proposes to suspend, for six years, the “major” source applicability thresholds for the CAA pre-construction and operating permits programs. That is, EPA proposes to amend the Act. This violation of the separation of powers compounds the constitutional crisis inherent in the Court’s substitution of its will for that of the people’s elected representatives.

The small-business protections proposed in the Tailoring Rule are temporary, legally dubious, and incomplete. Even if courts uphold the Tailoring Rule, despite its flouting of clear statutory language, it will not avert the most absurd result of the Court’s misreading of the CAA:  regulation of carbon dioxide (CO2) and other greenhouse gases under the National Ambient Air Quality Standards (NAAQS) program.

EPA runs enormous political risks leading the charge for GHG regulations not approved by Congress. It is in the Agency’s best interest not to oppose legislative action to overturn the endangerment finding and Mass. v. EPA.

The full text of my comment is available here.

Rep. Joe Barton (R-TX), ranking member of the House Energy and Commerce Committee, announced today that he plans to introduce a “resolution of disapproval” to overturn the Environmental Protection Agency’s (EPA’s) recently finalized endangerment finding on greenhouse gases.

This is huge. It means that Republicans are going to insist that climate and energy policy be made by the people’s elected representatives rather than by non-elected judges, litigators, and bureaucrats. It means that EPA regulation of carbon dioxide (CO2) under the Clean Air Act (CAA or Act) will be an issue in the 2010 elections. It means that citizens will be able to hold accountable — and punish at the ballot box — any Member of Congress who votes against Barton’s resolution of disapproval and in favor of the compliance burdens, rising energy costs, and litigation risks to the economy that EPA regulation of CO2 unavoidably entails.

In a press release issued today, Barton stated:

“I want to announce that I and others on the Republican side will ask the House of Representatives to consider and pass a resolution strongly disapproving the discreditable decision by the Obama administration to outlaw carbon dioxide and with it, millions of jobs in America.

“The Environmental Protection Agency’s endangerment finding plainly was intended to make the president’s policies look good in advance of his visit to the Copenhagen global warming conference, not to advance any public good in America, but it also has policy implications that threaten serious damage to the economy for generations to come.

“The EPA’s finding accurately reflects the thousands of candid, outrageous e-mails that EPA’s allies in the global warming community sent to each other by demonstrating that public relations priorities rather than straightforward science are driving U.S. policymaking on global warming, and no where did anyone demonstrate a whiff of concern for who pays the bill or how they earn their living.

“Everybody also understands that the endangerment finding is supposed to prod Congress into resuscitating cap-and-trade legislation that is dying from overexposure to public scrutiny. The social cost of this public relations effort, however, will dwarf the hundreds of billions of dollars already spent by the most profligate administration in history.

“Worst of all, the policy envisioned by the Obama administration will treat the recession by committing the country to living with fewer jobs instead of more, and to taking even more money out of the pockets of those lucky enough to have jobs so that radical environmentalists can wage a war against nature.

“Congress has the right and the responsibility to nullify the decisions of the bureaucracy when they run counter to the people’s interests, and a formal Resolution of Disapproval is fully warranted in this instance.”

Why is EPA inaugurating a regime of global warming regulations that Congress never voted for or approved?  Because the Supreme Court, in Massachusetts v. EPA (April 2007), decided to legislate global warming policy from the bench.

In Mass. v. EPA, eco-litigation groups, led by a baker’s dozen state attorneys general, attempted to do an end run around Congress and impose Kyoto-like policies on the U.S. economy through judicial fiat. They found five willing accomplices on the Court, who essentially ruled that Congress authorized EPA to regulate GHGs for climate change purposes when it enacted the CAA in 1970 — decades before global warming became a public concern. The Court’s decision — an affront to common sense — all but ensured that EPA would issue an endangerment finding for greenhouse gases. That, in turn, would compel EPA, under CAA Sec. 202, to establish first-ever GHG emission standards for new motor vehicles.

However, what none of the principals in the case bothered to mention is that once EPA adopts the GHG motor vehicle standards sought by plaintiffs, CO2 automatically becomes a pollutant “subject to regulation” under the Act’s Prevention of Significant Deterioration (PSD) pre-construction permitting program and Title V operating permits program. Under the CAA, firms must obtain a PSD permit in order to construct or modify a “major emitting facility,” and a Title V permit in order to operate such a facility. A facility is major under PSD if it is in one of 28 categories and has a potential to emit 100 tons per year (TPY) of a regulated pollutant, or 250 TPY if it is any other type of establishment. A facility is major under Title V if it has the potential to emit 100 TPY of a regulated pollutant. As it happens, millions of previously unregulated buildings and facilities — office buildings, apartment complexes, big box stores, enclosed malls, heated agricultural facilities, small manufacturing firms, even commercial kitchens — emit enough CO2 to meet these thresholds.

EPA estimates that if PSD and Title V are applied as written to CO2 sources, the number of PSD permit applications per year would jump from 280 to 41,000, and the number of Title V permit applications would jump from 14,700 to 6.1 million! The CAA permitting programs would crash under their own weight, putting a freeze on new construction, and thrusting millions of firms into legal limbo. Thanks to Mass. v. EPA, the CAA is about to become an economic wrecking ball aimed straight at small business.
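The scale of that jump is worth putting in multiples. Using only EPA’s own estimates as cited above:

```python
# EPA's permitting estimates, as cited above: applications per year
# before and after applying PSD and Title V as written to CO2 sources.
psd_now, psd_after = 280, 41_000
titlev_now, titlev_after = 14_700, 6_100_000

print(f"PSD applications:     {psd_after / psd_now:.0f}x increase")
print(f"Title V applications: {titlev_after / titlev_now:.0f}x increase")
# roughly a 146-fold and a 415-fold increase, respectively
```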

EPA’s October 2009 proposed Tailoring Rule attempts to avoid these “absurd results” by suspending the PSD and Title V requirements for any source emitting less than 25,000 tons per year (TPY) of CO2-equivalent GHGs. EPA hopes in this way to have its cake (the power to regulate CO2) and eat it (avoid an uncontrollable regulatory cascade that would provoke a backlash against the Obama administration, the eco-litigation fraternity, and the Court). But in order to pull off this trick, EPA must play lawmaker, effectively amend the Act, and violate the separation of powers.

Rep. Barton is right not to put his trust in the efficacy of this solution to the regulatory nightmare the Court conjured up in Mass. v. EPA. For one thing, it is unclear whether the Tailoring Rule will survive judicial challenge, because it flouts clear statutory language. Secondly, to preserve the fiction that EPA is not amending the Act, the Agency claims in the Tailoring Rule that its goal is to apply PSD and Title V to smaller and smaller CO2 sources over time, eventually including sources emitting 250 TPY and 100 TPY. EPA proposes to spend five years developing “streamlined” permitting procedures for smaller sources, but the legality of such contrivances is dubious as well, and at best streamlining would reduce irrational regulatory burdens on small business, not avoid them.

Finally, and most importantly, the Tailoring Rule, even if upheld by courts, would provide no protection from the most “absurd result” of the endangerment finding: Imposition of national ambient air quality standards (NAAQS) for CO2 that essentially require the de-industrialization of the United States.

The endangerment finding that EPA has just finalized substantively satisfies the endangerment test in CAA Sec. 108 that governs the first phase of a NAAQS rulemaking. The endangerment finding asserts that current atmospheric CO2 concentrations endanger public health and welfare, so logically, a NAAQS for CO2 would have to be set below current levels. Two eco-litigation groups, the Center for Biological Diversity (CBD) and 350.org, have already petitioned EPA to establish NAAQS for CO2 set at 350 parts per million (PPM). Their motto is “350 or Bust!”

The present atmospheric CO2 level is 390 PPM. Even if the entire world met the emissions reduction target of the Waxman-Markey bill — 83% below 2005 levels by 2050 — this would only “stabilize” CO2 concentrations at 450 PPM. Not even a global depression lasting many decades would be enough to reduce CO2 concentrations to 350 PPM. Yet under established legal interpretation, EPA is prohibited from considering compliance costs when establishing NAAQS.

Clearly, the only solid protection against Mass. v EPA’s “absurd results” is to nip the regulatory mischief in the bud. Barton’s resolution of disapproval would do just that. CBD and its allies have their slogan, and now the friends of liberty have one too: Barton or Bust!

Earlier this week, at an American Geophysical Union meeting in San Francisco, NASA unveiled new data on atmospheric greenhouse gases (GHGs), notably carbon dioxide (CO2) and water vapor, from its Atmospheric Infrared Sounder (AIRS) unit on the agency’s Aqua spacecraft. NASA touted two main findings as “breakthroughs” in GHG research.

One supposed breakthrough is the discovery that CO2 is not “well-mixed” through the global troposphere (mid-level atmosphere), but is actually “lumpy” — distributed in higher concentrations in two “belts” circling the globe, especially in the Northern Hemisphere, which is more heavily industrialized. Now, I suppose this is a breakthrough in the sense that it will allow researchers to improve CO2 “transport models,” which hitherto have assumed that CO2 concentrations are uniform throughout the troposphere. But it would be surprising indeed if scientists did not know until now that industrialized regions have higher CO2 levels than non-industrialized areas.

The second supposed breakthrough is the claim that the AIRS data remove “most of the uncertainty about the role of water vapor [feedback]” in climate change. “AIRS temperature data have corroborated climate model predictions that the warming of our climate produced as carbon dioxide levels rise will be greatly exacerbated — in fact, more than doubled — by water vapor,” said climate scientist Andrew Dessler of Texas A&M University. According to Dessler, “We are virtually certain to see Earth’s climate warm by several degrees Celsius in the next century, unless some strong negative feedback mechanism emerges elsewhere in the Earth’s climate system.” Dessler is talking about the assumption, common to all IPCC climate models, that the initial warming from rising CO2 levels increases concentrations of the atmosphere’s main greenhouse gas, water vapor, trapping more outgoing longwave (heat or infrared) radiation (OLR) and increasing global average rainfall.

William Gray of Colorado State University, perhaps the world’s leading hurricane forecaster, offers a different perspective on the NASA water vapor data. Gray’s comment follows:

I have just heard that NASA has a new satellite in orbit that can directly measure CO2 content in the atmosphere and that these new measurements are beginning to show that there is a positive association between increased rainfall (from higher CO2 gas amounts) and Outgoing Longwave Radiation (OLR) suppression. This is to be expected in and around the areas of precipitation — but not necessarily in global areas surrounding precipitation where return flow mass subsidence is driving the water vapor radiation emission level to a lower and somewhat warmer temperature.

I and a colleague, Barry Schwartz, have been analyzing 21 years (1984-2004) of ISCCP (International Satellite Cloud Climatology Project) outgoing longwave radiation on various space scales as related to precipitation differences. We have investigated how OLR changes with variations in precipitation from NOAA reanalysis data on time scales from 3 hours, a day, a month, and up to a year scale.

We find that on a small space scale where rainfall is occurring OLR is greatly suppressed. But on the larger regional to global scales, OLR rises with increasing precipitation. This is due to increased return flow subsidence in the surrounding cloud free and partly cloudy areas. Globally, we are finding that net OLR increases with net increased amounts of global precipitation. This is the opposite of what most GCMs [general circulation models] have programmed into their models and, if I’m interpreting the new NASA announcement correctly, opposite to what they are currently reporting to the media.

Dr. Gray presents a more detailed examination of these issues in his March 2009 Heartland Institute climate conference paper, available here.

RGGI: A tax is a tax is a tax

by Marlo Lewis on December 4, 2009


President Obama and other cap-and-trade advocates assured us they had “learned from Europe’s mistakes” and would auction all emission permits rather than hand them out at no charge to favored constituencies. Then the sausage factory known as Congress took over. The Waxman-Markey cap-and-trade bill proposes to dole out 85% of emission permits to preferred interest groups during the first several years of the program.

What explains this flip-flop? 100% auctioning turns a cap-and-trade program into an energy tax by another name. Actually, whether the permits are auctioned or not, cap-and-trade still raises consumer energy prices, but when permits are auctioned cap-and-trade is nakedly a revenue raiser for the bureaucratic sector. So Waxman-Markey drafters got cute and decided to phase in the auctions over time, on the theory, apparently, that we’re dumb as proverbial frogs in a pot of slowly boiling water and won’t notice being taxed by increments.

Another of the supposed “mistakes” Europe made in setting up its emissions trading system (ETS) was to “over-allocate” emission permits. This crashed the market for energy ration coupons, undercutting any incentive to reduce emissions or invest in lower-carbon energy technologies.

Well, ten Northeastern states, keen to demonstrate their climate leadership and solidarity with the Kyoto Protocol, got together and enacted the Regional Greenhouse Gas Initiative (RGGI), a multi-state cap-and-trade program in which almost all carbon permits are auctioned.

However, as Greenwire reports today (subscription required), “RGGI emission prices continue to slide in sixth auction”: 

Prices slid again in the Regional Greenhouse Gas Initiative’s (RGGI) sixth auction for 2009 emissions allowances to $2.05 per short ton of carbon dioxide equivalent, the Northeast pact announced here today. The previous auction netted $2.19 per ton in September.

More importantly, “… 2012 allowances fell slightly in the Wednesday auction, to $1.86 per ton, from $1.87 in September.”

Why so?  “There’s way too much supply, and there is no demand,” said Tim Cheung, an analyst with New Energy Finance. “You’re going to have these excess allowances that will continue to carry over to future years, which is why we think that prices will remain depressed going forward.”

RGGI avoided one of Europe’s “mistakes” only to repeat another. Among other things, the plunge in permit prices means RGGI is doing and will do squat to reduce emissions.

So even when politicians auction permits, rather than hand them out as freebies, they can still run a system as ineffectual (in terms of its stated purposes) as Europe’s ETS.

Which raises an obvious question: Besides giving New England politicos a platform on which to preen and prate about their efforts to save the planet, what is RGGI good for?

Raising taxes, of course. Greenwire reports that:

About 31 million allowances were sold this week, mostly to energy producers facing RGGI compliance rules and secondary market traders. Participants bought 28.5 million 2009 allowances and just under 2.2 million 2012 allowance futures, with cash-strapped state governments garnering $61.6 million [emphasis added].
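The reported proceeds can be roughly cross-checked against the clearing prices quoted earlier in this post. Note the simplifying assumption (mine, not Greenwire’s) that all 2009 allowances cleared at $2.05 and all 2012 futures at $1.86, and that the reported volumes are rounded:

```python
# Rough cross-check of the $61.6 million proceeds figure, assuming all
# 2009 allowances cleared at $2.05/ton and all 2012 futures at $1.86/ton.
allowances_2009 = 28.5e6   # 2009 allowances sold
futures_2012 = 2.2e6       # 2012 allowance futures sold

rev = allowances_2009 * 2.05 + futures_2012 * 1.86
print(f"implied proceeds: ${rev / 1e6:.1f} million")  # ~$62.5 million
```

The implied total of roughly $62.5 million lands in the neighborhood of the reported $61.6 million; the gap presumably reflects the rounded volume figures.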

All told, RGGI has raised about $500 million for state governments in auction proceeds. But here’s where the story gets really interesting. “The 10 RGGI state governments are supposed to use auction proceeds to fund renewable energy or energy efficiency initiatives, but governments are using that cash to plug holes in their budgets.” 

Greenwire mentions two examples:

Earlier this week, the research firm Point Carbon pointed to New York as the latest to cheat, with Albany passing a bill that will allow it to tap $90 million of RGGI auction proceeds to help fill its $5 billion budget shortfall. Today, New York’s Department of Environmental Conservation said the state drew $25.4 million in Wednesday’s auction.

Maryland became the first state to break ranks and use RGGI cash for a project not related to clean energy promotion. In April, Bloomberg reported that Maryland’s Legislature voted to use $70 million of its auction revenue for a rebate program designed to help low-income residents pay electricity bills.

It’s just like my colleague Myron Ebell likes to say. “There are three things you need to know about cap-and-trade: It’s a tax, it’s a tax, it’s a tax.”

Yesterday, the Center for Biological Diversity (CBD) and 350.org petitioned the Environmental Protection Agency (EPA) to establish National Ambient Air Quality Standards (NAAQS) for carbon dioxide (CO2) pegged at 350 parts per million (ppm). CO2 concentrations are currently about 387 ppm. The CBD is the eco-litigation group that successfully sued the Fish and Wildlife Service to list the polar bear as a threatened species under the Endangered Species Act.

I’ll have more to say about the specifics of the CBD-350.org petition (available here) in a later post. For now, I just want to note that the petition is additional confirmation that Massachusetts v. EPA, the April 2007 Supreme Court global warming case, is a bottomless well of absurd results that imperil both our economy and the U.S. Constitution.

CEI has been saying from day one – in our comment on EPA’s July 2008 Advanced Notice of Proposed Rulemaking, our comment on EPA’s April 2009 Endangerment Proposal, our comment on EPA’s September 2009 Motor Vehicle Greenhouse Gas Emissions Standards Proposal, and in columns about Mass. v. EPA when the case was still pending – that an endangerment finding under Sec. 202 of the Clean Air Act (CAA) would satisfy the endangerment test in CAA Sec. 108 and, thus, trigger a NAAQS rulemaking.

Not even a global economic depression sustained over many decades would be enough to stabilize atmospheric CO2 levels at 350 ppm — the goal of the CBD-350.org petition. For example, even if the world’s governments could somehow dial back global CO2 emissions to 1957 levels, when the global economy was smaller than one-third its present size, and then hold CO2 emissions constant for the next nine decades, global concentrations would still increase to 455 ppm by 2100.

Obviously, when Congress enacted the Clean Air Act, it did not authorize EPA to squash the U.S. economy. Indeed, one of the Act’s main purposes is to protect the “productive capacity” of the American people (CAA Sec. 101).

Nonetheless, by misreading the Act to include authority to regulate CO2 as an “air pollutant,” the Supreme Court set the stage for a regulatory chain reaction, including establishment of NAAQS for CO2 set below current atmospheric levels, which would effectively turn the CAA into a national economic suicide pact. 

This is not the only “absurd result” that follows from the Court’s misreading of the Act in Mass. v. EPA. According to EPA’s proposed Tailoring Rule, “literal” (i.e. lawful) application of the CAA to greenhouse gases would annually require 41,000 small firms to apply for Prevention of Significant Deterioration (PSD) pre-construction permits and 6.1 million firms to apply for Title V operating permits. In other words, EPA and its state counterparts would have to process 140 times as many PSD permits and 400 times as many Title V permits per year as they do now. The permitting programs would crash under their own weight, construction activity would grind to a screeching halt, and millions of firms would suddenly find themselves operating in legal limbo. A more potent Anti-Stimulus Package would be hard to imagine.

To avoid these problems, EPA’s Tailoring Rule proposes, over the next six years, to exempt firms emitting less than 25,000 tons per year (TPY) of CO2-equivalent greenhouse gases, even though the statute specifies that PSD and Title V shall apply to sources with potential to emit 250 TPY and 100 TPY of any regulated pollutant, respectively. The Tailoring Rule is actually an Amending Rule. To prevent Mass. v. EPA from turning the CAA into an economic wrecking ball, EPA proposes to play lawmaker and suspend provisions it doesn’t like, violating the separation of powers.

Even if the Tailoring Rule survives judicial challenge, which is doubtful, because it flouts clear statutory language, it would in no way lessen the threat of economy-crushing NAAQS regulation of CO2.

There is only one sensible course for policymakers to take: Overturn Mass. v. EPA. Congress should enact legislation, such as H.R. 391 introduced by Rep. Marsha Blackburn (R-TN), clarifying that CO2 is not subject to regulation under the CAA for climate change purposes.

This post is a follow-up to my previous post on why Climategate is the real war on science.

My earlier post commented on Willis Eschenbach’s excellent column on Anthony Watts’s blog (WUWT). Here I want to reproduce a comment Eschenbach posted in the back and forth on WUWT. In it, he explains why hiding computer code is lethal in a field, like climate science, where results hinge on statistical methods and mathematical techniques.  Eschenbach’s comment follows.

Willis Eschenbach (17:15:51):

Part of the difficulty with climate science is that, unlike all other physical sciences, it does not study things — instead it studies averages.

This is because climate by definition is the average of weather over a suitably long period of time (typically taken as a minimum of 30 years).

As a result, much of the study that goes on, and the papers that are written, deal almost exclusively with mathematics and statistics. This is the reason that access to the computer codes is so critical.

It’s simple in the physical sciences to describe an experiment, e.g. “I took three grams of carbon and subjected them to a pressure of 50,000KPa and a temperature of 500C. Unfortunately, the experiment did not succeed, I could not replace the diamond I had lost from my wife’s wedding ring.” Anyone can reproduce that experiment (and get the same results).

But when you say “I took the raw temperature data, variance-adjusted it, averaged it, gridded it, area-adjusted it, extrapolated results to data-free areas within 250 km, and made a global temperature record”, that’s far from enough information. In order to determine what was done, we need far more detailed information in climate science, because in general we are describing intricate mathematical operations. These are often very hard to describe clearly in spoken or written language.

And even a crystal-clear description is not enough. Despite what he says he has done, if the scientist has inadvertently used an improper procedure (e.g. the uncentered principal components analysis used in Mann’s Hockeystick), we’ll never be able to determine that the answer is demonstrably wrong unless we have the actual code that he used. Otherwise, we could spend years trying to guess where he went wrong, but we would never be able to show that he went wrong as science demands.

This is why the insistence of scientists that their computer codes are sacrosanct private secret documents best kept under Hermetic seal in a clandestine vault is lethal to good science. Without the codes, we can’t tell if what has been done is correct and free from hidden mathematical error. Of course, this may be unconnected with the reason that Jones et. al are hiding their codes …

Given that climate science is not the study of things but of the averages of things, and that as a result math and statistics are central to climate science, the findings of the Wegman Report are now seen to be even more insightful, trenchant, and valid. They said:

It is important to note the isolation of the paleoclimate community; even though they rely heavily on statistical methods they do not seem to be interacting with the statistical community. Additionally, we judge that the sharing of research materials, data and results was haphazardly and grudgingly done. In this case we judge that there was too much reliance on peer review, which was not necessarily independent. Moreover, the work has been sufficiently politicized that this community can hardly reassess their public positions without losing credibility. Overall, our committee believes that Mann’s assessments that the decade of the 1990s was the hottest decade of the millennium and that 1998 was the hottest year of the millennium cannot be supported by his analysis.

And presciently, that was written three years ago, well before we got the CRU emails …
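The uncentered principal components issue Eschenbach mentions can be illustrated with a toy computation. This is not a reconstruction of the Mann et al. procedure — just a minimal sketch, on synthetic data, of the statistical point: when the centering step is skipped, the leading component can be dominated by an artifact of the method rather than by any shared signal in the data.

```python
import numpy as np

rng = np.random.default_rng(0)

# 50 synthetic "proxy" series, 100 time steps each: pure noise around a
# common nonzero baseline, i.e. no genuine shared signal at all.
data = rng.normal(loc=5.0, scale=1.0, size=(100, 50))

def leading_share(x):
    """Fraction of the total sum of squares captured by the first
    singular (principal) direction of x."""
    s = np.linalg.svd(x, compute_uv=False)
    return s[0] ** 2 / (s ** 2).sum()

share_uncentered = leading_share(data)                     # no mean removal
share_centered = leading_share(data - data.mean(axis=0))   # proper centering

print(f"uncentered PC1 share: {share_uncentered:.2f}")  # close to 1
print(f"centered   PC1 share: {share_centered:.2f}")    # small
```

Without centering, the first component mostly captures the common baseline and appears to “explain” nearly all of the data; after centering, no component stands out, correctly reflecting that the series share no signal. This is why, as Eschenbach argues, the exact code matters: a description like “we computed the principal components” cannot reveal which of these two very different computations was actually performed.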

The huge pile of emails purloined or leaked from the Climate Research Unit (CRU) last week does indeed “give every appearance of testifying to concerted and coordinated efforts by leading climatologists to fit the data to their conclusions while attempting to silence and discredit their critics,” as the Wall Street Journal stated yesterday. However, the main issue brought to light by these emails is even more serious.

In a column posted yesterday on Anthony Watts’s blog, amateur scientist Willis Eschenbach documents the many ruses and excuses CRU director Phil Jones and his allies employed over several years to deny outsiders access to the CRU gang’s temperature data and computer codes.

Skeptics have been accused of waging a “war on science” because they frequently question the Intergovernmental Panel on Climate Change’s (IPCC’s) interpretation of the rapidly expanding field of climate change research.

But science is not a set of dogmas certified by government-funded bodies. Rather, as Mr. Eschenbach points out, science is fundamentally an “adversarial process” whereby competing scientists attempt to reproduce — that is, invalidate — each other’s results. This process absolutely depends on each combatant allowing the others to examine his data and methods. Tactics designed to hide data and methods are anti-science even if — nay, especially if — those resorting to such tricks are big-name scientists.

“Science,” writes Eschenbach, “works by one person making a claim, and backing it up with data and methods they used to make the claim. Other scientists then attack the claim by (among other things) trying to replicate the first scientist’s work. If they can’t replicate it, it doesn’t stand.”

This means, says Eschenbach, that researchers who hide their data and computer codes to prevent others from replicating/invalidating their results “attack . . . the heart of science.” Such behavior is unethical and, as Eschenbach notes, likely illegal as well.

If you read only one commentary on Climategate, read this one. It is an eye-opener.

Real Climate Spin

by Marlo Lewis on November 24, 2009


RealClimate.org is the chief defender of “consensus” climatology on the Internet. One of its enduring missions has been to defend the dubious, indeed discredited “Hockey Stick” reconstruction of Northern hemisphere temperature history. The Hockey Stick was the basis for the IPCC’s claim in its 2001 report that the 1990s were the warmest decade and 1998 the warmest year of the past millennium.

That Real Climate (RC) should feel special solicitude for the Hockey Stick is no accident, comrade. Two of the five principals at RC — Michael Mann and Raymond Bradley — were among the three researchers (Mann, Bradley, and Malcolm Hughes) who authored the Hockey Stick.

All of the RC principals (Gavin Schmidt, Caspar Ammann, Rasmus Benestad, Mann, and Bradley) are frequent senders and recipients of the thousands of emails and other documents, now posted on many Web sites, that were hacked or leaked last week from the University of East Anglia’s Climate Research Unit (CRU).

The Wall Street Journal today published a selection of the leaked emails and an editorial concluding that the emails “give every appearance of testifying to concerted and coordinated efforts by leading climatologists to fit the data to their conclusions while attempting to silence and discredit their critics.”

Even eco-radical George Monbiot says he is “dismayed and deeply shaken” by the emails, because, “There appears to be evidence here of attempts to prevent scientific data from being released, and even to destroy material that was subject to a freedom of information request.”

So far, the only email on which RC has seen fit to comment is one from CRU director Phil Jones dated Nov. 16, 1999, 13:31. It’s gotten a lot of buzz on the Internet, because it appears to advocate the use of a “trick” to “hide” a “decline” in global temperatures.

In a post titled “The CRU Hack” (November 20), RC writes:

No doubt, instances of cherry-picked and poorly-worded “gotcha” phrases will be pulled out of context. One example is worth mentioning quickly. Phil Jones in discussing the presentation of temperature reconstructions stated that “I’ve just completed Mike’s Nature trick of adding in the real temps to each series for the last 20 years (ie from 1981 onwards) and from 1961 for Keith’s to hide the decline.” The paper in question is the Mann, Bradley and Hughes (1998) Nature paper on the original multiproxy temperature reconstruction, and the ‘trick’ is just to plot the instrumental records along with reconstruction so that the context of the recent warming is clear. Scientists often use the term “trick” to refer to a “a good way to deal with a problem”, rather than something that is “secret”, and so there is nothing problematic in this at all. As for the ‘decline’, it is well known that Keith Briffa’s maximum latewood tree ring density proxy diverges from the temperature records after 1960 (this is more commonly known as the “divergence problem”–see e.g. the recent discussion in this paper) and has been discussed in the literature since Briffa et al in Nature in 1998 (Nature, 391, 678-682). Those authors have always recommend not using the post 1960 part of their reconstruction, and so while ‘hiding’ is probably a poor choice of words (since it is ‘hidden’ in plain sight), not using the data in the plot is completely appropriate, as is further research to understand why this happens.

So a “trick” is just scientific shorthand for a “good way to deal with a problem,” not something “secret.” But RC ducks the real issue. Is the “trick” Phil Jones learned from Hockey Stick author Michael Mann a form of trickery? Does it create a false impression, as an illusionist does on stage, right out in the open, in front of an audience?

The trick, according to RC, is to splice onto the end of a temperature reconstruction, built on proxy data going back several centuries, the data from instrumental records starting in 1960 and 1981.

Now this is quite a trick, because it involves comparing apples (proxy data) to oranges (instrumental data) and pretending that the composite forms a continuous record.

As the Center for the Study of Carbon Dioxide and Climate Change observed years ago, researchers attempting to construct long-term (centuries to millennia) temperature records should “finish the dance” with the (proxy) data they started with.

Grafting instrumental data onto proxy data to produce a seemingly continuous record is trickery, because instrumental data, unlike proxy data, are massively influenced by land-use changes and site-specific quality control issues.

Urban heat islands and irrigated agriculture can inject false warming biases into instrumental data that are absent from proxy data taken from remote forests or sediment cores at the bottom of lakes, for example. Improper placement of temperature sensing equipment near local heat sources (e.g. air conditioning vents, asphalt parking lots, waste water treatment plants) also generates significant false warming signals, as retired meteorologist Anthony Watts documents in gory detail.

So RC’s “nothing to see here” argument based on the alleged insider meaning of “trick” raises rather than allays suspicion that CRU is attempting to fit data to a predetermined conclusion.

Note also that RC says nothing about Phil Jones’s advice to backdate correspondence (Sept. 12, 2007, 11:30 a.m.), to delete emails related to the 2007 IPCC report (May 29, 2008, 11:04), and to evade FOIA requests, if necessary by deleting files (Feb. 2, 2005, 9:41 a.m.). RC also says nothing about Mann’s call to delegitimize the Journal of Climate for publishing papers critical of his work (March 11, 2003, 8:14).

The Wall Street Journal editorial’s concluding comment is spot on: “In the department of inconvenient truths, this one surely deserves a closer look by the media, the U.S. Congress and other investigative bodies.”

In recent testimony before the Senate Environment and Public Works Committee, Energy Secretary Steven Chu makes a convoluted case for S. 1733, the Clean Energy Jobs and American Power Act, a.k.a. the Kerry-Boxer cap-and-trade bill.

Chu argues roughly as follows. Global investment in wind turbines and solar panels could reach $3.6 trillion by 2030. China is investing heavily. If we don’t ramp up our investment in “clean tech” products, we’ll be left behind, become increasingly dependent on foreign producers, and China will eat our lunch. The key to growing the U.S. clean-tech sector is to “put a price on carbon” — establish a “cap on carbon emissions that ratchets down over time.”

This is poppycock, as I explain today on MasterResource.org, the free-market energy blog.

Yes, China is investing heavily in solar panel and wind turbine manufacture, but China does not cap carbon. Also, only a small fraction of China’s production of solar photovoltaic generators — 20 megawatts out of 820 megawatts produced in 2007 — is for China’s domestic market. So capping domestic carbon emissions is not a prerequisite to success in exporting clean-tech products, nor is having a large domestic market for such products. The experience of the very country Chu spotlights as model and threat rebuts rather than supports the case he wants to make.

A key point Chu completely ignores is that, apart from certain niche markets, “clean tech” products consume more wealth than they create. That’s why they cannot “compete” without benefit of market-rigging mandates, subsidies, and penalties levied against fossil energy.

A fresh example of this inconvenient fact comes to us today from the great state of Massachusetts, home of Sen. John Kerry, chief sponsor of S. 1733, and Rep. Ed Markey, co-sponsor of the House companion bill, H.R. 2454, a.k.a. Waxman-Markey.

The Boston Globe reports that “A little more than a year after cutting the ribbon of a new factory in Devens built with $58 million in state aid, Evergreen Solar has announced it will shift its assembly of solar panels from there to China.”

Evergreen received “$58.6 million in grants, loans, land, tax incentives, and other support,” says the Globe. Yet, “Through the first nine months of this year, Evergreen lost $167 million, compared with $33.6 million for the same period last year.”

What would Chu have to say about this? Evergreen is not losing money because there’s no cap on carbon. Massachusetts is one of several states participating in a cap-and-trade program known as the Regional Greenhouse Gas Initiative (RGGI).

Why is Evergreen expanding operations in China? “Lower costs.” Such lower costs include lower-cost energy. To repeat, China does not have cap-and-trade; it does not put a price on carbon.

Now, I’ll wager that Evergreen would be losing money even if Massachusetts were a Kyoto-free zone. But we may surmise that Evergreen would not shift its operations to China if China’s economy were carbon-constrained.

Chu should at least consider the possibility that pricing carbon would vitiate what little competitiveness the U.S. clean-tech sector has. Low-cost energy is a source of competitive advantage, as China powerfully demonstrates. By increasing energy costs, cap-and-trade would make all U.S.-based manufacture less competitive, including companies specializing in clean-tech products.