March 2003

The history of science and politics explains a lot about the manifold distortions, exaggerations, and outright hokum that regularly appear in some of our major journals on the subject of global warming.

Specifically, Thomas Kuhn’s landmark book The Structure of Scientific Revolutions (first published in 1962 and reprinted about a zillion times) demonstrates that scientists have a herd mentality in which almost everyone believes in some reigning “paradigm” (in this case, disastrous global warming). Careers are spent proving that paradigm and discounting anything that disagrees with prevailing theory. That’s “science.”

As for “politics,” the two intersect, writes James Buchanan, when researchers find they can sell their ideas to policymakers for mutual gain: If a little distortion is required, so be it. That’s a powerful prism through which we must view so-called “facts”: a throng of scientists acting in concert with the political process.

Prism in hand, we can’t say we were surprised when the science community’s weekly lobbying newsletter, Science magazine, couldn’t stop gushing about recent research purportedly demonstrating that climate models of gloom and doom are right after all and the satellite temperature records reporting precious little warming were wrong.

Don’t misunderstand: Science does also report on scientific research. But anyone who thinks it isn’t lobbying hasn’t read the news or editorial sections and just about anything else in it that isn’t hard science. And any idea that Science is fair and balanced on the subject of global warming can be thrown into the trash along with previous issues.

At least that’s all we can take from their most recent handling of a new take on satellite-measured global temperature data. On May 1, Sciencexpress (a web page where the magazine posts “selected” papers, chosen for their “timeliness and importance,” before they actually appear in print) featured research led by Lawrence Livermore’s Ben Santer showing that climate models were in better agreement with a newly produced, and as yet unpublished, version of the satellite-measured temperature history that includes a warming trend about three times that of the oft-peer-reviewed satellite temperature history constructed by John Christy and Roy Spencer.

Although the paper itself does not come right out and say it, a between-the-lines read suggests that because the new satellite dataset is in closer agreement with model results, it is likely the more accurate reflection of the actual state of the lower atmosphere. And just in case you didn’t get that message from the paper itself, the accompanying Science press release makes it clear: “A stubborn argument against global warming may be discredited by a reanalysis of the raw data central to its claims.”

Santer used a climate model to discriminate between two versions of the satellite data: the one published by Christy that shows very little warming, and a modified, unpublished version by Frank Wentz that shows much more heating. That was certainly a novel approach because it wasn’t science. In science, we don’t bend data to fit models and then pronounce those models correct. Rather, we bend models to accommodate data, that is, reality.

If Science had any intent to be fair and balanced, it would have noted that John Christy and colleagues published a paper at the exact same time that checked the satellite data the old-fashioned (i.e., scientific) way. He and Roy Spencer compared their record with a totally independent measurement of lower-atmosphere temperature, taken from daily weather balloons, and found the two to be in strong agreement.

Weather balloons are launched twice daily from sites around the world; as they ascend through the atmosphere, they radio back observations of temperature, humidity, and pressure that are used to initialize models of daily weather forecasts. Balloon observations can be compiled into a record of lower-atmospheric temperatures that can then be compared with the satellite measurements.

It is important to realize that the way the weather-balloon data are collected is completely different from how we obtain satellite observations, and thus represents an independent measurement of the same quantity (atmospheric temperature). Of course, as is the case with any measurement, there are complications that must be considered when compiling the raw data.

For that reason, there are several different research groups that have released their own versions of the weather-balloon temperature history. To protect against any accusation of picking only the particular data set that best matches their satellite observations, Christy and Spencer compared the temperature trend during the past 24 years derived from their observations with the trend during the same period as calculated from four different manifestations of the global weather-balloon history.

Figure 1 shows what they found. The trend in their satellite record is 0.06°C per decade. The trends from the various weather-balloon records range from 0.02°C per decade to 0.05°C per decade. In each case, the trend in the satellite record was slightly greater than that in the weather-balloon records, and the match with the two weather-balloon records with the most complete coverage of the globe was within 0.02°C. That close correspondence is remarkable: Remember, these are independent observations.

 


Figure 1. A comparison of trends in satellite-measured temperatures (two columns on left) and weather-balloon-measured temperatures (four columns on right, including one equal to zero).
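
As a purely illustrative aside (this is not the authors’ code, and the series below is synthetic), trends like those in Figure 1 are ordinarily computed as least-squares slopes over a monthly anomaly series and rescaled to degrees per decade:

```python
# Minimal sketch of a decadal trend calculation; the anomaly series
# below is synthetic, not actual MSU or radiosonde data.
import numpy as np

def trend_per_decade(monthly_anomalies):
    """Least-squares slope of a monthly series, in degrees C per decade."""
    months = np.arange(len(monthly_anomalies))
    slope_per_month, _intercept = np.polyfit(months, monthly_anomalies, 1)
    return slope_per_month * 120.0  # 120 months per decade

# A 24-year series (e.g., 1979-2002) with a built-in 0.05 C/decade trend.
rng = np.random.default_rng(0)
months = np.arange(24 * 12)
anomalies = 0.05 * (months / 120.0) + rng.normal(0.0, 0.1, months.size)
print(f"estimated trend: {trend_per_decade(anomalies):+.3f} C/decade")
```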

How does the Wentz and Schabel data set compare with the weather balloons? In a word, poorly. So poorly, in fact, that Santer and colleagues, rather than performing such a comparison themselves (as Christy and Spencer did), evidently anticipated the results and simply took a swipe at the veracity of the weather-balloon data, saying that “As with [satellite] data, however, there is evidence that the choice of the ‘adjustment pathway’ for radiosonde data markedly influences the size and even the sign of the estimated global-scale trend.” The need to discredit the satellite data is just as obvious. As Tom Wigley stated in the press release from the U.S. National Center for Atmospheric Research concerning the Santer paper: “The real issue is the trend in the satellite data from 1979 onward. If the original analysis of the satellite data were right, then something must be missing in the models.”

Obviously, the global warming community as a whole cannot take such a possibility lightly. Still, it is behaving in a much more predictable fashion than the climate, precisely, in fact, as Kuhn and Buchanan would predict. Faced with mounting evidence, it resorts to increasingly bizarre excuses as to why the models are still right, even to the point of disparaging actual real-world observations when necessary.

The question remains, and since they didn’t ask it, we will: Which do you believe, models or reality?

We’ll take reality every time.

References

Christy, J.R., et al., 2003. Error estimates of version 5.0 of MSU/AMSU bulk atmospheric temperatures. Journal of Atmospheric and Oceanic Technology, 20, 613–629.

Santer, B.D., et al., 2003. Influence of satellite data uncertainties on the detection of externally forced climate change. Sciencexpress, 1 May.

There are several hurdles that must be overcome before a worldwide carbon emission trading system can become viable, according to experts who spoke last week at the World Resources Institute’s Sustainable Enterprise Summit in Washington, D.C.

Ongoing attempts to create voluntary greenhouse gas emission trading markets are being undermined by the lack of consistency that well-developed rules would provide. Two primary types of emissions trading have emerged so far: the awarding of credits for baseline reduction projects, and trading under a cap-and-trade scheme.

The baseline-and-credit system awards credits to companies that reduce emissions through the Kyoto Protocol’s Clean Development Mechanism and Joint Implementation provisions. These allow companies to offset emissions through investment in non-emitting technologies. Companies are awarded credits based on the amount of emissions reduced below business-as-usual levels.

The cap-and-trade system, on the other hand, allows companies to trade emissions credits that are allotted based on a predetermined emission cap. Companies that exceed their targets can sell credits to companies that may not be able to meet theirs.
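
To make the mechanics concrete, here is a toy sketch of cap-and-trade accounting; the firm names, caps, and carbon price are invented for illustration and are not from the summit:

```python
# Toy cap-and-trade accounting: a firm under its cap sells surplus
# credits to a firm over its cap. All names and numbers are invented.
from dataclasses import dataclass

@dataclass
class Firm:
    name: str
    cap: float        # allotted allowance, tonnes of CO2
    emissions: float  # actual emissions, tonnes of CO2

    @property
    def surplus(self) -> float:
        # Positive means credits available to sell; negative, a shortfall.
        return self.cap - self.emissions

seller = Firm("FirmA", cap=100_000, emissions=90_000)   # under its cap
buyer = Firm("FirmB", cap=100_000, emissions=108_000)   # over its cap

price_per_tonne = 5.0  # hypothetical carbon price, dollars
traded = min(seller.surplus, -buyer.surplus)
print(f"{seller.name} sells {traded:,.0f} t of credits to {buyer.name} "
      f"for ${traded * price_per_tonne:,.0f}")
```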

Although there has been some increase in trading, the market will not be truly viable without market-wide trading rules, consistent pricing, and standardized verification methods, according to the summit participants. “The success of the market really hinges on the ability to develop rules of the game,” said Janet Ranganathan, a senior associate with WRI’s Sustainable Enterprise Program. Veronique Bishop, principal finance specialist with the World Bank’s Carbon Fund, agrees on the need for consistent pricing but believes that “We are light years away from that.”

What these experts are calling for is a worldwide regulatory regime to make emissions trading viable. In the absence of a real asset, there can be no “voluntary” market with both buyers and sellers. Emissions trading markets only work when artificial scarcity is created through regulatory fiat and all emitters are required to participate. Otherwise you have a lot of sellers, but few if any buyers.

Indeed, Ranganathan fingers the U.S. rejection of the Kyoto Protocol as the cause of the weak trading market. “The U.S. would have been the major buyer in the Kyoto Protocol had it ratified [the treaty],” she said. “The fact that it has withdrawn now casts doubt over carbon prices, and low carbon prices can create dysfunctions in the market, ultimately undermining the potential environmental gains.”

Robert Routliffe, manager of greenhouse gas emissions trading at DuPont, said that low participation is a “characteristic of any voluntary market. It’s hard to get folks to spend money.” Developing carbon trading schemes is a “big, capital-intensive” process, which presents a major obstacle for most companies. Nor would the cost be lessened by mandatory controls (Greenwire, March 17, 2003).

Japan’s industrial sector is beginning to grouse about its obligations under the Kyoto Protocol, which the government ratified last year. According to Taishi Sugiyama, a senior researcher at Japan’s independent Central Research Institute of the Electric Power Industry, industry is putting considerable pressure on the government to rethink the Kyoto Protocol. Apparently, the government is listening.

Japan was one of the last countries to ratify Kyoto, partly due to strong opposition from industry groups and the Japanese Conservative Party, which favored voluntary reductions. But the government also felt obligated to ratify a treaty named for its ancient capital. Now, nearly a year later, industry has become increasingly resentful of the Kyoto Protocol, said Sugiyama, who spoke at the World Resources Institute in Washington, D.C. last week.

Now the government is looking ahead to the 2005 negotiations, when Kyoto signatories will discuss actions to be taken beyond 2013. Experts such as Sugiyama expect the government to push for voluntary emissions reduction targets. Others disagree, however, saying that it would be very difficult for Japan to back away from the treaty.

Part of the resentment of the treaty comes from the assumptions the government used to determine its ability to meet the targets. For example, it assumed that cuts in industrial emissions would be accomplished in large part through carbon leakage. In other words, heavy industry would close plants in Japan and open new plants on the Asian mainland, which the affected industries may have been surprised to learn. There was also widespread doubt that Japan would be able to meet its Kyoto targets, a sentiment the government apparently ignored.

Industry leaders also feel that the treaty is unfair. They argue that Japan is the only country that has enacted truly aggressive implementation policies, while the Kyoto Protocol allows European Union countries to buy emissions credits from less industrialized Eastern European countries, thereby avoiding the need for significant emissions reductions. Moreover, the EU has replaced much of its coal-fired capacity with natural gas since 1990, the baseline year for Kyoto reductions, thereby making the EU’s target much less onerous.

Finally, industry argues that Japan made significant emissions reductions prior to 1990, when the government embarked on a tremendously costly twenty-year program to cope with the Arab oil embargo, making the 1990 baseline unfair to Japan. “We have already done much,” said Sugiyama. “Still, Kyoto requires [Japan] to reduce emissions 6 percent. Given that situation, it’s going to be extremely difficult to reduce emissions further.”

Last October the government organized a committee to revisit the Kyoto agreement. The committee, made up of 30 stakeholders, half of whom are industry representatives, will present its findings to the government this month. It is likely, said Sugiyama, that it will call for a new protocol or an amended agreement with a combination of voluntary and mandatory targets (Greenwire, March 6, 2003).

On February 27, Christopher Essex, a professor in the Department of Applied Mathematics at the University of Western Ontario, and Ross McKitrick, an associate professor in the Department of Economics at the University of Guelph, gave a Cooler Heads Coalition briefing for congressional staff and media on their new book, Taken By Storm: The Troubled Science, Policy and Politics of Global Warming.

Essex, who studies the underlying mathematics, physics, and computation of complex dynamical processes, raised some very fundamental scientific issues with regard to the science of global warming. Take, for instance, the “average global temperature,” which is a mainstay of the debate. Such a thing doesn’t exist, according to Essex. You can’t add up temperatures and take their average the way you can extensive physical quantities such as energy or length.

“Thermodynamic variables are categorized as extensive or intensive,” said Essex. “Extensive variables occur in amounts. Intensive variables [such as temperature] refer to conditions of a system, defined continuously throughout its extent.” For example, one could add the temperature of a cup of ice water to the temperature of a cup of hot coffee, but what does that number mean? It doesn’t mean anything, because there is no such thing as total temperature. Dividing that number by two to get the average doesn’t mean anything either. Yet that is exactly what occurs when the average global temperature is computed.

Essex also pointed out that the internal energy of a system can change without changing the temperature and the temperature can change while the internal energy of the system remains the same. “This disconnect happens routinely in the natural world around us all the time,” said Essex. “Ultimately this has to be so because temperature and energy belong to two fundamentally different classes of thermodynamic variables.”

Global warming enthusiasts want us to believe that average temperature can tell us something about what is going on in the climate, but it is just a number with no physical content. To add insult to injury, Essex explained that there are literally an infinite number of averaging rules that could be used, some of which will show “warming” and others that will show “cooling,” but the “physics doesn’t say which one to use.”
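
A toy illustration of that claim, with two invented station records (in kelvin): under the ordinary arithmetic mean the pair “warms,” while under a fourth-power mean (the weighting suggested by radiative emission) the very same data “cool.”

```python
# Essex's averaging-rule point, illustrated with invented numbers:
# a cold station warms by 2.0 K while a hot station cools by 1.9 K.
def power_mean(temps, r):
    """Power mean M_r = (mean of T^r)^(1/r), temperatures in kelvin."""
    return (sum(t ** r for t in temps) / len(temps)) ** (1.0 / r)

before = [280.0, 300.0]  # K
after = [282.0, 298.1]   # K

for r, label in [(1, "arithmetic mean (r=1)"), (4, "fourth-power mean (r=4)")]:
    change = power_mean(after, r) - power_mean(before, r)
    print(f"{label}: {change:+.3f} K")
# arithmetic mean (r=1):   +0.050 K  -> "warming"
# fourth-power mean (r=4): about -0.13 K -> "cooling"
```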

Essex also explained that the earth’s so-called greenhouse effect does not work like a greenhouse. “Incoming solar radiation adds energy to the Earth’s surface,” he said. To restore radiative balance, the energy must be transported back to space in roughly the same amounts in which it arrived. The energy is transported via two processes: infrared radiation (heat transfer) and fluid dynamics (turbulence).

A real greenhouse works by preventing fluid motions, such as the wind, by enclosing an area with plastic or glass. To restore balance, infrared radiation must increase, thereby causing the temperature to rise. Predicting the resulting temperature increase is a relatively straightforward process.

But the “greenhouse effect” works differently. Greenhouse gases slow down outgoing infrared radiation, which forces the fluid dynamics to adjust. What happens then cannot be predicted, because the equations that govern fluid dynamics cannot be solved! Scientists cannot even predict the flow of water through a pipe from first principles, let alone the vastly more complex fluid dynamics of the climate system. “No one can compute from first principles what the climate will do,” said Essex. “It may warm, or cool, or nothing at all!” Saying that the greenhouse effect works the same way as a greenhouse, which is a solvable problem, creates certainty where none exists, said Essex.
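
The radiative side of the balance is the tractable part. As a textbook illustration (not from the briefing itself), the standard zero-dimensional energy balance pins down an effective radiating temperature in a few lines; no comparable closed form exists for the fluid-dynamical adjustment Essex describes.

```python
# Standard zero-dimensional radiative balance: S(1 - a)/4 = sigma * T^4.
# Textbook values; this is the solvable piece of the problem, in contrast
# to the fluid dynamics, which has no such closed-form solution.
SOLAR_CONSTANT = 1361.0  # W/m^2 at the top of the atmosphere
ALBEDO = 0.30            # fraction of sunlight reflected back to space
SIGMA = 5.670e-8         # Stefan-Boltzmann constant, W/m^2/K^4

absorbed = SOLAR_CONSTANT * (1.0 - ALBEDO) / 4.0  # ~238 W/m^2
t_effective = (absorbed / SIGMA) ** 0.25
print(f"effective radiating temperature: {t_effective:.0f} K")  # ~255 K
```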

Surely scientists are aware of the issues that Essex brings up (and several other equally devastating points that aren’t discussed here). If so, then how have we come to a place where the media and politicians repeatedly state that there is a scientific consensus that the planet is warming up, that it is caused by man, and that the effects will be catastrophic? McKitrick offered a very convincing explanation. He discussed several relevant groups, but we’ll focus on politicians and what McKitrick calls “Official Science.”

Politicians need big issues around which they can form winning coalitions. Global warming is a good issue because, “It is so complex and baffling the public still has little clue what it’s really about. It’s global, so you get to have your meetings in exotic locations. Policy initiatives could sound like heroic measures to save the planet, but on the other hand the solutions are potentially very costly. So you need a high degree of scientific support if you are going to move on it. There’s a premium on certainty.”

This is where Official Science comes in. Official Science is made up of the staffs of scientific bureaucracies, editors of prominent magazines, directors of international panels, and so on. These members of Official Science aren’t appointed by scientists to speak on their behalf, but are appointed by governments. They have the impossible job of striking “a compromise between the need for certainty in policymaking and the aversion to claims of certainty in regular science.” What happens is that science ends up serving a political agenda rather than a scientific one. “If things were as they should be, leaders would want a treaty because they observe that scientists are in agreement. What happens instead is that Official Science orchestrates agreement because leaders want to make a treaty.” The presentation will soon be available at www.cei.org. Taken By Storm may be ordered at www.amazon.ca.

Etc.

On March 13, Hans Blix, the U.N.’s chief weapons inspector in Iraq, was interviewed on MTV about his thoughts regarding war with Iraq and weapons of mass destruction. During the interview he stated: “On big issues like war in Iraq, but in many other issues, they simply must be multilateral. There’s no other way around. You have the instances like the global warming convention, the Kyoto protocol, when the U.S. went its own way. I regret it. To me the question of the environment is more ominous than that of peace and war. We will have regional conflicts and use of force, but world conflicts I do not believe will happen any longer. But the environment, that is a creeping danger. I’m more worried about global warming than I am of any major military conflict.” Presumably, the risks of war, weapons of mass destruction, and terrorist acts such as 9/11 pale in comparison to the threat of global warming.

UK Prime Minister Tony Blair has thrown his support behind a government plan that would severely restrict greenhouse gas emissions, require large increases in the use of renewable energy, and block any further construction of nuclear power plants. The plan, which was set out in a white paper policy document released by the government on Feb. 24, was hailed by the prime minister as a “step change in the UK’s energy strategy over the next 50 years.”

The plan calls for a reduction in carbon dioxide emissions of 60 percent by the year 2050. The massive reductions would take place without the aid of nuclear power, relying heavily on new renewable energy capacity and on energy efficiency. To that end, the plan requires that 20 percent of the nation’s energy be produced from renewable sources by 2020.

In a speech endorsing the plan, Blair claimed that the technology is available to make the steep reductions in CO2 emissions without hurting economic growth. He also stated that, “It is clear Kyoto is not radical enough” and that he will “continue to make the case to the U.S. and to others that climate change is a serious threat that we must address together as an international community.”

The Financial Times criticized Mr. Blair in a Feb. 25 editorial, stating that, “Having fixed the end, he has not willed the means.” It goes on to say that the white paper “opens a necessary debate on the conflict between energy and the environment but does not provide an answer on how to combine them.” The editorial noted that a Downing Street document published last year said that, “It would be unwise for the UK now to take a unilateral decision to meet the [60 percent] target in advance of international negotiations on longer-term targets.”

It concludes that, “Eventually, the government will have to temper its moral passion for renewables with certain realities,” namely with “awareness that renewable energies can never be a complete solution, because most of them do not work on calm or cloudy days. If avoiding carbon emissions is the priority, this is better performed by nuclear reactors than anything else.”

Court Orders EPA to Hand Over Climate Change Documents

The Environmental Protection Agency has been ordered by the U.S. District Court for the District of Columbia to produce “climate change” documents requested under the Freedom of Information Act by the Competitive Enterprise Institute (CEI), or to justify their withholding. CEI, a non-profit free market advocacy group, requested the documents to determine whether the agency was engaging in activities to implement the Kyoto Protocol “through the backdoor,” in violation of a congressional prohibition.

“Now we can finally begin assessing how far the agency has gone toward backdoor implementation of the Kyoto Protocol,” said Christopher C. Horner, the CEI counsel who filed the lawsuit. “We also remain fascinated by a point of which the Court took particular note: How does EPA explain their shift in alarmism from the global cooling scare of years past to the current emphasis on catastrophic global warming?”

The documents that the EPA has been ordered to hand over are expected to show that the agency has violated the “Knollenberg Provision,” originally sponsored by Rep. Joe Knollenberg (R-MI). The provision prohibits the federal government from spending money to implement the Kyoto Protocol, which has not been ratified by the U.S. Senate.

“By this Order, the D.C. District Court joins CEI’s puzzlement over the Administration’s refusal to turn over documents on the basis that their release may potentially harm U.S. interests in ongoing Kyoto negotiations,” said Horner. “And it adds to the mounting public embarrassments over the refusal by various officials to execute the President’s rejection of Kyoto, instead continuing to try to cut a deal for a treaty the President assured the public he rejected in America’s interest.”

The court ruling, said Horner, will likely expose attempted backdoor implementation during the Clinton Administration. The EPA has until March 31 to either produce the documents or explain to the satisfaction of the court why they are withholding them.

Analyst Shreds AGs’ CO2 Case

Attorneys general from several states have filed notice on two separate occasions this year of their intent to sue the U.S. Environmental Protection Agency for failing to regulate carbon dioxide. The first notice came on January 30, informing EPA Administrator Christine Todd Whitman that the attorneys general of Massachusetts, Connecticut, and Maine planned to sue under Section 108 of the Clean Air Act (CAA), which they claim obligates Whitman to list CO2 as a pollutant that endangers public health and safety.

The second notice came on February 20, when the three AGs, joined by four others representing New York, New Jersey, Rhode Island, and Washington, informed Whitman of their intent to sue unless she promulgates New Source Performance Standards for power plant emissions of CO2 under Section 111 of the CAA.

In a critique of the two letters, Marlo Lewis, a senior fellow at the Competitive Enterprise Institute, accuses the AGs of engaging in “mere word play” and a “sophomoric attempt to turn statutory construction into a game of gotcha.”

The question, argues Lewis, is “Did Congress delegate to EPA the power to regulate CO2? When Congress enacted and amended the CAA, did it intend for EPA to set up a mandatory greenhouse gas control program?” The answer is clearly no, according to Lewis. As he has noted elsewhere and repeats in the current critique, CO2 is not mentioned in any CAA regulatory provisions and only once in a non-regulatory provision. The clincher, however, is the statement within the non-regulatory provision that, “Nothing in this subsection shall be construed to authorize the imposition on any person of air pollution control requirements.”

Moreover, the AGs want the EPA to declare CO2 a pollutant under the National Ambient Air Quality Standards (NAAQS) program. But NAAQS is a program that deals with “place-specific air quality programs,” which “measures local pollution levels against national air quality standards and seeks to remedy local problems via state implementation plans.”

It doesn’t make any sense to attempt to regulate CO2 under the NAAQS provision, because regardless of where the CO2 is emitted, it has the same potential impact on the climate. “If EPA set NAAQS for CO2 above current atmospheric levels, the entire country would be in attainment, even if U.S. consumption of hydrocarbon fuels suddenly doubled,” says Lewis. “Conversely, if EPA set a NAAQS for CO2 below current levels, the entire country would be out of attainment, even if all power plants, factories, and automobiles shut down.”
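
Lewis’s point reduces to a one-line check: because CO2 is well mixed, every location sees essentially the same concentration, so any single standard classifies every area identically. A sketch, with invented place names and round numbers:

```python
# Because CO2 is globally well mixed, attainment under any NAAQS-style
# standard is all-or-nothing. Names and numbers are illustrative only.
CO2_PPM = 375.0  # approximate global concentration circa 2003

def in_attainment(standard_ppm, local_ppm=CO2_PPM):
    return local_ppm <= standard_ppm

areas = ["Los Angeles", "Houston", "rural Montana"]
for standard in (400.0, 350.0):  # one above, one below current levels
    verdicts = {area: in_attainment(standard) for area in areas}
    print(f"standard {standard:.0f} ppm -> {verdicts}")
# 400 ppm: every area attains; 350 ppm: every area fails, uniformly.
```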

The second notice of intent to sue is a new tactic in the AGs’ attempt to force the EPA to regulate CO2. This one seeks to force Administrator Whitman to set New Source Performance Standards (NSPS) for CO2 emissions from electric generating units. The NSPS program requires different categories of stationary sources to meet certain performance standards. Lewis points out that it was enacted in 1970, “years before global warming was even a gleam in Al Gore’s eye.” Nor did Congress instruct the EPA to address global warming in the NSPS program when it amended the CAA in 1977 and 1990.

Sen. Patrick Leahy (D-VT) introduced legislation to amend the NSPS to cap CO2 from power plants in the 105th, 106th, and 107th Congresses. Each time the bill attracted zero co-sponsors. It’s absurd, says Lewis, to argue that Congress implicitly empowered EPA to cap CO2 in 1970, given Leahy’s efforts to provide that very authority and Congress’s flat rejection of those efforts. “The phrase laughed out of court was invented for just such inanities.” Lewis makes several other cogent and damning critiques of the AGs’ arguments.

He concludes by challenging EPA Administrator Whitman to show leadership in the face of these attacks. The notices are designed to force her to choose between the President’s opposition to CO2 regulation and the career bureaucrats who want to increase their power over the U.S. economy, says Lewis. “Whitman must decide where her loyalties lie: with the rule of law, economic growth, and affordable energy, or with the rule of bureaucrats, regulatory excess, and Kyoto-style energy rationing.” The critique, The Anti-Energy Litigation of the State Attorneys General: From Junk Science to Junk Law, is available at www.cei.org.

Another Hit for the Climate Models

It’s not every day that the climate models take it on the proverbial chin. It just seems like it. In a paper presented at the Annual Meeting of the American Meteorological Society, Dr. Junhong Wang of the National Center for Atmospheric Research discussed the research team’s finding that the amount of water vapor in the upper atmosphere is much greater than previously thought, at least over Oklahoma and Kansas.

The researchers have built a new radiosonde instrument, called Snow White (SW), which measures relative humidity more accurately than the old instruments that have been the basis for all upper-atmosphere climate records. The new radiosonde will serve as the reference against which all previous measurements will be calibrated.

In test runs over Kansas and Oklahoma, the researchers found that the old and new radiosondes agree reasonably well below six kilometers but diverge at altitudes above that. At about 11.4 to 12.7 kilometers, SW found a supersaturated layer, which could be the cirrus cloud layer. Previous instruments had measured relative humidity there at below 30 percent.
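
As a sketch of the kind of comparison described above (with invented profiles, not Wang’s data), one can ask at what altitude two humidity records begin to disagree beyond some tolerance:

```python
# Find where two relative-humidity profiles diverge; data are invented
# to mimic the reported behavior (agreement below ~6 km, divergence above).
import numpy as np

altitude_km = np.arange(0.0, 14.0, 0.5)
old_rh = np.clip(80.0 - 5.0 * altitude_km, 5.0, None)  # dries with height
new_rh = np.where(altitude_km < 6.0, old_rh,
                  old_rh + 4.0 * (altitude_km - 6.0))  # moister aloft

tolerance = 5.0  # percent relative humidity
diverged = np.abs(new_rh - old_rh) > tolerance
first_km = altitude_km[diverged][0] if diverged.any() else None
print(f"profiles diverge above {first_km} km")  # 7.5 km with these numbers
```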

This supersaturation finding is important because high-altitude cirrus clouds do not block sunlight (indeed, they are often invisible to the naked eye) but very efficiently block outgoing infrared radiation, causing a net warming. Where humidity is high, however, the relative effect of greenhouse gases, such as carbon dioxide, on temperature is smaller than in low-humidity areas.

That’s why most anthropogenic warming is predicted to take place in extraordinarily dry (and cold) regions such as Siberia. If the humidity data used in a computer model are too low, the model will overestimate the effect of greenhouse gases and predict too much warming. The paper is available at www.ametsoc.org/AMS/index.html.

Melting in Arctic May Be Natural

Researchers from the Norwegian Polar Institute and the Norwegian Meteorological Institute have compiled data from the ship logs of early Arctic explorers and whalers to determine the sea ice extent from 1553 to 2002.

What they have found is that the current retreat of ice observed in the Arctic occurred before, in the early 1700s. While this evidence doesn’t rule out that the current melting is due to man’s greenhouse gas emissions, it certainly suggests that it may be entirely natural. “If you go back to the early 1700s you find that sea ice extent was about the same as it is now,” said Chad Dick of the Arctic Climate Systems Study.

The researchers also found that sea ice has declined by about 33 percent over the past 135 years, but that most of that retreat occurred before there were significant manmade emissions of greenhouse gases. This too suggests that the current melting could be due to natural cycles. “The evidence at the moment is fairly inconclusive,” said Mr. Dick. “The fact is there are natural cycles in sea ice extent and we’re not outside the range of those natural cycles at the moment.”

Mr. Dick also noted that if the current warming is indeed due to natural cycles, we should begin to see ice thickening again in the near future. It will take about ten more years at the current rate of thinning to get beyond the range that we’d expect if the decline in sea ice is due to natural cycles (Globe and Mail, February 27, 2003). The World Wildlife Fund is publishing the sea charts on CD-ROM (www.panda.org).

Here is what experts have to say on the issue of alternative and renewable energy sources.