Science

The history of science and politics explains a lot about the manifold distortions, exaggerations, and outright hokum that regularly appear in some of our major journals on the subject of global warming.

Specifically, Thomas Kuhn’s landmark book The Structure of Scientific Revolutions (first published in 1962 and reprinted about a zillion times) demonstrates that scientists have a herd mentality in which almost everyone believes in some reigning “paradigm” (in this case, disastrous global warming). Careers are spent proving that paradigm and discounting anything that disagrees with prevailing theory. That’s “science.”

As for “politics,” the two intersect, writes James Buchanan, when researchers find they can sell their ideas to policymakers for mutual gain: If a little distortion is required, so be it. That’s a powerful prism through which we must view so-called “facts”: a throng of scientists acting in concert with the political process.

Prism in hand, we can’t say we were surprised when the science community’s weekly lobbying newsletter, Science magazine, couldn’t stop gushing about recent research purportedly demonstrating that climate models of gloom and doom are right after all and the satellite temperature records reporting precious little warming were wrong.

Don’t misunderstand: Science does also report on scientific research. But anyone who thinks it isn’t lobbying hasn’t read the news or editorial sections and just about anything else in it that isn’t hard science. And any idea that Science is fair and balanced on the subject of global warming can be thrown into the trash along with previous issues.

At least that’s all we can take from their most recent handling of a new take on satellite-measured global temperature data. On May 1, Sciencexpress (a web page where the magazine posts “selected” papers, chosen for their “timeliness and importance,” before they actually appear in print) featured research led by Lawrence Livermore’s Ben Santer showing that climate models were in better agreement with a newly produced, and as yet unpublished, version of the satellite-measured temperature history that includes a warming trend about three times that of the oft-peer-reviewed satellite temperature history constructed by John Christy and Roy Spencer.

Although the paper itself does not come right out and say it, a between-the-lines read suggests that because the new satellite dataset is in closer agreement with model results, it is likely the more accurate reflection of the actual state of the lower atmosphere. And just in case you didn’t get that message from the paper itself, the accompanying Science press release makes it clear: “A stubborn argument against global warming may be discredited by a reanalysis of the raw data central to its claims.”

Santer used a climate model to discriminate between two versions of the satellite data: the one published by Christy that shows very little warming, and a modified, unpublished version by Frank Wentz that shows much more heating. That was certainly a novel approach because it wasn’t science. In science, we don’t bend data to fit models and then pronounce those models correct. Rather, we bend models to accommodate data, that is, reality.

If Science had any intent to be fair and balanced, it would have noted that John Christy and colleagues published a paper at the exact same time that checked the satellite data the old-fashioned (i.e., scientific) way. He and Roy Spencer compared their record with a totally independent measurement of lower-atmosphere temperature, taken from daily weather balloons, and found the two to be in strong agreement.

Weather balloons are launched twice daily from sites around the world; as they ascend through the atmosphere, they radio back observations of temperature, humidity, and pressure that are used to initialize daily weather-forecast models. Balloon observations can be compiled into a record of lower-atmospheric temperatures that can then be compared with the satellite measurements.

It is important to realize that the way the weather-balloon data are collected is completely different from how we obtain satellite observations, and thus represents an independent measurement of the same quantity (atmospheric temperature). Of course, as is the case with any measurement, there are complications that must be considered when compiling the raw data.

For that reason, there are several different research groups that have released their own versions of the weather-balloon temperature history. To protect against any accusation of picking only the particular data set that best matches their satellite observations, Christy and Spencer compared the temperature trend during the past 24 years derived from their observations with the trend during the same period as calculated from four different manifestations of the global weather-balloon history.

Figure 1 shows what they found. The trend in their satellite record is 0.06°C per decade. The trends from the various weather-balloon records range from 0.02°C per decade to 0.05°C per decade. In each case, the trend in the satellite record was slightly greater than that in the weather-balloon records, and the match with the two weather-balloon records with the most complete coverage of the globe was within 0.02°C. That close correspondence is remarkable: Remember, these are independent observations.

 


Figure 1. A comparison of trends in satellite-measured temperatures (two columns on left) and weather balloon-measured temperatures (four columns on right, including one equal to zero).
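For readers curious how a “trend per decade” of the sort shown in Figure 1 is computed, here is a minimal sketch. The numbers are invented for illustration only: a least-squares line is fit to a monthly temperature-anomaly series, and its slope is rescaled from degrees per year to degrees per decade.

```python
import numpy as np

# Hypothetical 24-year monthly temperature-anomaly series (deg C),
# built with a known trend of 0.06 C/decade plus random noise. We then
# recover the trend by ordinary least squares, the standard method for
# trends like those compared in Figure 1.
rng = np.random.default_rng(0)
years = np.arange(24 * 12) / 12.0          # time in years, monthly steps
true_trend = 0.06 / 10.0                   # 0.06 C/decade in C/year
anomalies = true_trend * years + rng.normal(0.0, 0.2, years.size)

slope_per_year = np.polyfit(years, anomalies, 1)[0]
trend_per_decade = slope_per_year * 10.0
print(f"fitted trend: {trend_per_decade:.3f} C/decade")
```

With 288 monthly points, the fitted slope lands close to the built-in 0.06°C per decade despite the noise, which is why two dozen years of data suffice to distinguish trends that differ by a few hundredths of a degree per decade.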

How does the Wentz and Schabel data set compare with the weather balloons? In a word, poorly. So poorly, in fact, that Santer and colleagues, instead of performing such a comparison themselves (as Christy and Spencer did), obviously anticipated the results and simply resorted to taking a swipe at the veracity of the weather-balloon data, saying that “As with [satellite] data, however, there is evidence that the choice of the ‘adjustment pathway’ for radiosonde data markedly influences the size and even the sign of the estimated global-scale trend.” The need to trash the satellite data is just as obvious. As Tom Wigley stated in the press release from the U.S. National Center for Atmospheric Research concerning the Santer paper: “The real issue is the trend in the satellite data from 1979 onward. If the original analysis of the satellite data were right, then something must be missing in the models.”

Obviously, the global warming community as a whole cannot take such a possibility lightly. Still, it is behaving in a much more predictable fashion than the climate, precisely, in fact, as Kuhn and Buchanan would predict. Faced with mounting evidence, it resorts to increasingly bizarre excuses as to why the models are still right, even to the point of disparaging actual real-world observations when necessary.

The question remains, and since they didn’t ask it, we will: Which do you believe, models or reality?

We’ll take reality every time.

References

Christy, J.R., et al., 2003. Error estimates of version 5.0 of MSU/AMSU bulk atmospheric temperatures. Journal of Atmospheric and Oceanic Technology, 20, 613-629.

Santer, B.D., et al., 2003. Influence of satellite data uncertainties on the detection of externally forced climate change. Sciencexpress, 1 May.

On February 27, Christopher Essex, a professor in the Department of Applied Mathematics at the University of Western Ontario, and Ross McKitrick, an associate professor in the Department of Economics at the University of Guelph, gave a Cooler Heads Coalition congressional staff and media briefing on their new book, Taken By Storm: The Troubled Science, Policy and Politics of Global Warming.

Essex, who studies the underlying mathematics, physics, and computation of complex dynamical processes, raised some very fundamental scientific issues with regard to the science of global warming. Take, for instance, the “average global temperature,” which is a mainstay of the debate. Such a thing doesn’t exist, according to Essex. You can’t add up temperatures and take their average like you can with physical quantities such as energy, length, and so on.

“Thermodynamic variables are categorized as extensive or intensive,” said Essex. “Extensive variables occur in amounts. Intensive variables [such as temperature] refer to conditions of a system, defined continuously throughout its extent.” For example, one could add the temperature of a cup of ice water to the temperature of a cup of hot coffee, but what does that number mean? It doesn’t mean anything, because there is no such thing as total temperature. Dividing that number by two to get the average doesn’t mean anything either. Yet that is exactly what occurs when the average global temperature is computed.

Essex also pointed out that the internal energy of a system can change without changing the temperature, and the temperature can change while the internal energy of the system remains the same. “This disconnect happens routinely in the natural world around us all the time,” said Essex. “Ultimately this has to be so because temperature and energy belong to two fundamentally different classes of thermodynamic variables.”

Global warming enthusiasts want us to believe that average temperature can tell us something about what is going on in the climate, but it is just a number with no physical content. To add insult to injury, Essex explained that there are literally an infinite number of averaging rules that could be used, some of which will show “warming” and others that will show “cooling,” but the “physics doesn’t say which one to use.”
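Essex’s point about averaging rules can be made concrete with a toy example. The numbers, and the second averaging rule, are our own illustration, not his: compare the ordinary arithmetic mean of temperature with a “radiative” mean built from the fourth power of temperature (the quantity that governs radiated energy). The two rules can disagree even about the direction of change.

```python
import numpy as np

def arithmetic_mean(temps_k):
    """Plain average of absolute temperatures (kelvin)."""
    return float(np.mean(temps_k))

def radiative_mean(temps_k):
    """Temperature of a body radiating the average T^4 energy
    (Stefan-Boltzmann weighting) -- just one of many possible rules."""
    return float(np.mean(np.asarray(temps_k, float) ** 4) ** 0.25)

before = [260.0, 340.0]   # two regions, one cold and one hot (kelvin)
after  = [301.0, 301.0]   # a later state: both regions near 301 K

# Arithmetic rule says the system warmed (300 -> 301)...
print(arithmetic_mean(before), arithmetic_mean(after))
# ...while the radiative rule says it cooled (~307.7 -> 301).
print(radiative_mean(before), radiative_mean(after))
```

Both rules are mathematically legitimate averages of the same two temperatures, yet one reports warming and the other cooling, which is exactly the ambiguity Essex describes.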

Essex also explained that the Earth’s so-called greenhouse effect does not work like a greenhouse. “Incoming solar radiation adds energy to the Earth’s surface,” he said. To restore radiative balance, the energy must be transported back to space in roughly the same amounts that it arrived in. The energy is transported via two processes: infrared radiation (heat transfer) and fluid dynamics (turbulence).

A real greenhouse works by preventing fluid motions, such as the wind, by enclosing an area with plastic or glass. To restore balance, infrared radiation must increase, thereby causing the temperature to rise. Predicting the resulting temperature increase is a relatively straightforward process.

But the “greenhouse effect” works differently. Greenhouse gases slow down outgoing infrared radiation, which causes the fluid dynamics to adjust. But what will happen cannot be predicted, because the equations that govern fluid dynamics cannot be solved! Scientists cannot even predict the flow of water through a pipe, let alone the vastly more complex fluid dynamics of the climate system. “No one can compute from first principles what the climate will do,” said Essex. “It may warm, or cool, or nothing at all!” Saying that the greenhouse effect works the same way as a greenhouse, which is a solvable problem, creates certainty where none exists, said Essex.

Surely scientists are aware of the issues that Essex brings up (and several other equally devastating points that aren’t discussed here). If so, then how have we come to a place where the media and politicians repeatedly state that there is a scientific consensus that the planet is warming up, that it is caused by man, and that the effects will be catastrophic? McKitrick offered a very convincing explanation. He discussed several relevant groups, but we’ll focus on politicians and what McKitrick calls “Official Science.”

Politicians need big issues around which they can form winning coalitions. Global warming is a good issue because, “It is so complex and baffling the public still has little clue what it’s really about. It’s global, so you get to have your meetings in exotic locations. Policy initiatives could sound like heroic measures to save the planet, but on the other hand the solutions are potentially very costly. So you need a high degree of scientific support if you are going to move on it. There’s a premium on certainty.”

This is where Official Science comes in. Official Science is made up of staffs of scientific bureaucracies, editors of prominent magazines, directors of international panels, and so on. These members of Official Science aren’t appointed by scientists to speak on their behalf, but are appointed by governments. They have the impossible job of striking “a compromise between the need for certainty in policymaking and the aversion to claims of certainty in regular science.” What happens is that science ends up serving a political agenda rather than a scientific one. “If things were as they should be, leaders would want a treaty because they observe that scientists are in agreement. What happens instead is that Official Science orchestrates agreement because leaders want to make a treaty.” The presentation will soon be available at www.cei.org. Taken By Storm may be ordered at www.amazon.ca.

Etc.

On March 13, Hans Blix, the U.N. weapons inspector in Iraq, was interviewed on MTV about his thoughts regarding war with Iraq and weapons of mass destruction. During the interview he stated that, “On big issues like war in Iraq, but in many other issues, they simply must be multilateral. There’s no other way around. You have the instances like the global warming convention, the Kyoto protocol, when the U.S. went its own way. I regret it. To me the question of the environment is more ominous than that of peace and war. We will have regional conflicts and use of force, but world conflicts I do not believe will happen any longer. But the environment, that is a creeping danger. I’m more worried about global warming than I am of any major military conflict.” Presumably, the risks of war, weapons of mass destruction, and terrorist acts such as 9/11 pale in comparison to the threat of global warming.

Another Hit for the Climate Models

It’s not every day that the climate models take it on the proverbial chin. It just seems like it. In a paper presented at the Annual Meeting of the American Meteorological Society, Dr. Junhong Wang of the National Center for Atmospheric Research discussed his research team’s findings that the amount of water vapor in the upper atmosphere is much greater than previously thought, at least over Oklahoma and Kansas.

The researchers have built a new radiosonde instrument, called Snow White (SW), which measures relative humidity more accurately than the old instruments, which have been the basis for all upper-atmosphere climate records. The new radiosonde will serve as the reference against which all previous measurements will be calibrated.

In test runs over Kansas and Oklahoma, the researchers found that below six kilometers the old and new radiosondes agree reasonably well, but they diverge at altitudes above six kilometers. At about 11.4 to 12.7 kilometers, SW found a supersaturation layer, which could be the cirrus cloud layer. Previous measurements there found relative humidity below 30 percent.

This finding is important because high-altitude cirrus clouds do not block sunlight (indeed, they are often invisible to the naked eye) but very efficiently block outgoing infrared radiation (heat), causing a net warming. Where humidity is high, however, the relative effect of greenhouse gases, such as carbon dioxide, on temperature is smaller than in low-humidity areas.

That’s why most anthropogenic warming is predicted to take place in extraordinarily dry (and cold) regions such as Siberia. If the humidity data used in a computer model are too low, then the model will overestimate the effect of greenhouse gases, and the climate models will predict too much warming. The paper is available at www.ametsoc.org/AMS/index.html.

Melting in Arctic May be Natural

Researchers from the Norwegian Polar Institute and the Norwegian Meteorological Institute have compiled data from the ship logs of early Arctic explorers and whalers to determine the sea ice extent from 1553 to 2002.

What they have found is that the current retreat of ice observed in the Arctic occurred before, in the early 1700s. While this evidence doesn’t rule out that the current melting is due to man’s greenhouse gas emissions, it certainly suggests that it may be entirely natural. “If you go back to the early 1700s you find that sea ice extent was about the same as it is now,” said Chad Dick of the Arctic Climate Systems Study.

The researchers also found that sea ice has declined by about 33 percent over the past 135 years, but that most of that retreat occurred before significant manmade emissions of greenhouse gases. This also means that the current melting could be due to natural cycles. “The evidence at the moment is fairly inconclusive,” said Mr. Dick. “The fact is there are natural cycles in sea ice extent and we’re not outside the range of those natural cycles at the moment.”

Mr. Dick also noted that if the current warming is indeed due to natural cycles, we should begin to see ice thickening again in the near future. It will take about ten more years at the current rate of thinning to get beyond the range that we’d expect if the decline in sea ice is due to natural cycles (Globe and Mail, February 27, 2003). The World Wildlife Fund is publishing the sea charts on CD-ROM (www.panda.org).


A new report by Dr. David Wojick, which reviews six major National Academy of Sciences studies published over the last five years, argues that a new understanding of climate change has emerged as scientists have grappled with the question of mans influence on the climate.

Wojick states, “The issues in the NAS reports and recent research are far more fundamental and clash with an underlying premise of much climate modeling over the past decade: that climate over the past century and a half has been effectively constant and any changes are primarily because of man’s activity.”

One of the NAS reports, Decade-to-Century-Scale Climate Variability and Change: A Science Strategy, published in 1998, effectively debunks that premise. “The evidence of natural variations in the climate system, which was once assumed to be relatively stable, clearly reveals that climate has changed, is changing, and will continue to do so with or without anthropogenic influences.”

The review goes on to quote several more passages from NAS studies, which simply do not offer any confirmation of the claims that the science is settled. The previously mentioned study also states, “Without a clear understanding of how climate has changed naturally in the past, and the mechanisms involved, our ability to interpret any future change will be significantly confounded and our ability to predict future change severely curtailed.”

Another NAS report, The Atmospheric Sciences: Entering the Twenty-First Century, also published in 1998, states, “Large gaps in our knowledge of interannual and decade-to-century natural variability hinder our ability to provide credible predictive skill or to distinguish the role of human activities from natural variability.” In 2001, the NAS admitted in a study titled Climate Change Science: An Analysis of Some Key Questions that “the observing system available today is a composite of observations that neither provide the information nor the continuity in the data needed to support measurements of climate variables.”

Far from being settled, the science is still in its infancy. The NAS’s Global Environment Change: Research Pathways for the Next Decade, published in 1999, argues that, “Climate research is only at the beginning of its learning curve, with dramatic findings appearing at an impressive rate. In this area even the most fundamental scientific issues are evolving rapidly.”

The NAS studies reveal, according to Wojick, that there has been a quiet revolution in climate science. “It seems that we have discovered or confirmed a number of natural mechanisms of climate change, at least ten in fact. These mechanisms provide alternative, competing explanations for global warming; alternative to, and competing with, the theory of human-induced warming. Also alternative to, and competing with, each other.

Each of these mechanisms can in theory explain all of the changes in 20th century climate. Human greenhouse gas emissions are therefore just one of many alternative hypotheses. In addition, the evidence for warming due to greenhouse gas emissions is no greater than for any of the other mechanisms” (Electricity Daily, February 3, 2003). As a result of this revolution, increases in our understanding of climate change have been paralleled by increases in the uncertainty about man’s contribution, if indeed there is one.

The mechanisms include solar variation, emergence from the Little Ice Age, lunar energy variation, internal oscillations (such as El Niño), Milankovitch forcing (variations in the Earth’s orbit), ocean variation, biospheric variation, cryogenic variation (variations in the amount and distribution of ice), surface versus satellite temperature variation, and aerosol forcing mechanisms.

Wojick concludes that, “We do not know the extent of climate change in the past, we do not know why climate changes, and we must focus our research on this issue. Only then can we integrate the potential role of past increases in GHG [greenhouse gas] emissions into recent climate history, and only then can we begin to assess the outlook for future climate.”

Announcements

Christopher Essex of the University of Western Ontario and Ross McKitrick of the University of Guelph will give a Cooler Heads Coalition congressional and media briefing on their new book, Taken By Storm: The Troubled Science, Policy and Politics of Global Warming, on Thursday, February 27, from 2:30 to 4:00 PM in Room 406 of the Senate Dirksen Office Building. Reservations are requested. To attend, please contact Myron Ebell at mebell@cei.org or (202) 331-2256. Include your name, telephone number, e-mail address, and institutional affiliation. Registered attendees will receive copies of the book, compliments of the Competitive Enterprise Institute.

Attorneys General Threaten Lawsuit

Attorneys general from three states (Connecticut, Massachusetts, and Maine) have notified the Bush administration of their plan to sue the U.S. Environmental Protection Agency unless it classifies carbon dioxide as a pollutant under the Clean Air Act, which would allow the agency to begin regulating emissions of the gas.

In a letter to EPA administrator Christie Whitman, the attorneys general, all Democrats, warned the EPA that if the agency does not act within 60 days they will bring the suit. “We have not seen any appreciable progress on the development of a national program to address carbon dioxide emissions,” says the letter. “In seeking to protect the health and welfare of our citizens from the impacts of climate change, we are left to fall back on our available remedies under existing law.”

The attorneys general claim that the EPA is violating federal law by not regulating CO2. “The Clean Air Act requires the EPA to take certain actions when it determines that a pollutant may cause or contribute to air pollution which may reasonably be anticipated to endanger public health or welfare.”

The attorneys general base their argument on the administration’s Climate Action Report 2002 (CAR), which was submitted to the United Nations Framework Convention on Climate Change in May 2002. They claim that the admission that climate changes “are likely due mostly to human activities” obligates the EPA to regulate CO2. But the CAR should never have been released, because it was based on the thoroughly discredited National Assessment, a report prepared by a federal advisory committee appointed by the Clinton Administration.

Marlo Lewis, a senior fellow at the Competitive Enterprise Institute in Washington, D.C., argues that the attorneys general ignore the plain language, structure, and legislative history of the law. The AGs build their case on Section 103(g), which refers to CO2 as a “pollutant.” However, the context of that sole reference to CO2 in the Act is a discussion of “nonregulatory strategies,” and the passage concludes with the admonition that, “Nothing in this subsection shall be construed to authorize the imposition on any person of air pollution control requirements.” According to Lewis, “If nothing in that subsection gives EPA authority to impose new control requirements, then the passing reference therein to CO2 as a pollutant cannot provide such authority.”

Lewis also accused the attorneys general of trying to end-run the legislative process. “They want to legislate energy taxes or their regulatory equivalent through the courts rather than allow Congress to make the law.”

The threatened lawsuit follows a letter that 11 attorneys general sent to the administration last July urging it to regulate CO2. It also follows a lawsuit against the EPA filed by three environmental groups to force the agency to regulate automotive emissions of CO2 using the same legal argument as the state attorneys general.

An industry source told Greenwire (January 31, 2003) that there is little to worry about. “If I’m an EPA attorney, I’m not losing any sleep over defending this lawsuit,” he said. “Of course, the issues would have to be briefed properly. But the risk of EPA losing this case, if it’s even litigated, is very, very low. I just think arguing that CO2 is a pollutant is too big a legal hurdle to get over.”

NC Approves Renewable Energy Plan

The North Carolina Utilities Commission approved a plan on Jan. 28 that will allow the residents of that state to purchase electricity produced from renewable sources. It is the first statewide program of its kind in the nation, according to the Charlotte Observer (January 29, 2003).

The NC Green Power program, to be launched in six months, will initially allow residential customers to buy power generated mostly from landfill gases. Wind and solar power are slated to make up 15 percent of the state’s electricity supply by the third year of the plan. Customers who choose to purchase electricity from renewable sources will buy it in blocks of 100 kilowatt-hours for an additional $4 per month.

Large commercial and industrial customers will pay an additional $2.50 per block of electricity and will have access to a greater variety of energy sources, including animal wastes and small hydroelectric plants. The rate premiums are meant to cover the higher costs of producing renewable electricity, but similar rate premiums in other states have proved insufficient.
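The premium arithmetic is easy to check. A quick sketch (the 1,000 kWh/month household is our own illustrative assumption, not part of the program description): $4 per 100-kWh block works out to 4 cents per kWh for residential customers, and $2.50 per block to 2.5 cents per kWh for large customers.

```python
# NC Green Power premiums as reported: $4.00 per 100-kWh block
# (residential) and $2.50 per block (large commercial/industrial).
BLOCK_KWH = 100

def premium_per_kwh(block_price_dollars):
    """Surcharge per kilowatt-hour implied by a per-block price."""
    return block_price_dollars / BLOCK_KWH

res_premium = premium_per_kwh(4.00)    # $0.04/kWh residential
big_premium = premium_per_kwh(2.50)    # $0.025/kWh large customers

# Hypothetical household using 1,000 kWh/month that "greens" all of it:
blocks = 1000 / BLOCK_KWH
monthly_extra = blocks * 4.00          # $40 on top of the normal bill
print(res_premium, big_premium, monthly_extra)
```

A 4-cent premium is a substantial markup over typical retail rates, which helps explain the warning below that the product “isn’t going to sell itself.”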

Nearly every proposed source of renewable energy has its detractors, since every source of energy carries some environmental costs. The environmental group Appalachian Voices, for example, objected to the burning of biomass on the grounds that it produces air emissions.

“The problem is that everybody wants solar or wind power, but we don’t have any,” said Karen King of Advanced Energy. Currently, North Carolina has only a couple of solar power plants and no wind power connected to the grid.

Dan Lieberman of the Center for Resource Solutions, a San Francisco-based nonprofit that certifies renewable energy programs, warns that the program will require massive marketing to be successful. “The product isn’t going to sell itself,” he said.

A new study appearing in the January 10 issue of Geophysical Research Letters has found that the greenhouse effect may be weakening due to changes in cloud cover that do not correspond with climate model predictions.

The study, conducted by Robert Cess of the State University of New York at Stony Brook and his late colleague Petra Udelhofen, looked at the role of clouds on climate in the tropics and subtropics by comparing the results of a climate model to observations. The model shows temperature rising since 1970 along with the strength of the greenhouse effect.

As greenhouse gas concentrations increase, less heat, or outgoing longwave radiation (OLR), should be able to escape the atmosphere. If this is the case, then observations from space should measure a long-term decline in OLR. The model also indicates, however, that there should be a decline in incoming sunlight, or absorbed shortwave radiation (ASW). This would come about through increases in cloud cover.

Cess and Udelhofen used satellite data from 1985 to the present to test this model result. What they found was that even though there has been a general warming since 1985, both OLR and ASW are increasing, not decreasing. In other words, the greenhouse effect is weakening. With the exception of a few years after the eruption of Mt. Pinatubo, the increase in OLR has been fairly steady. Moreover, increases in ASW correspond with an observed decline in cloudiness. With fewer clouds, it is easier to raise the temperature at the surface, and there is no need to invoke greenhouse gases to explain the change.

There are several possible explanations for the decline in tropical and subtropical cloudiness. Cloudiness may be responding to climate change, as suggested by Richard S. Lindzen at MIT. His research has found that high altitude clouds that block OLR decrease as temperature increases. Cess and Udelhofen argue, on the other hand, that the change in cloud cover is due to natural variation. “If the change in cloud cover is the result of natural variability acting over decadal time scales, this could considerably hamper efforts at detecting the radiative signature of future global warming.” Regardless of the reason, this is yet another instance where the models do not match reality.

Announcements

  • David Wojick, founder and president of ClimateChangeDebate.org, has published a blockbuster study on climate change uncertainties. The study, The New View of Natural Climate Variation: Fundamental Climate Science Issues Raised in Six Major National Academy of Science Studies, shows that there are still major uncertainties in the climate science. According to Wojick, scientists “have discovered or confirmed about ten natural mechanisms of climate variation, each of which can in theory explain all of the changes in 20th century climate. Human GHG emissions are therefore just one of many alternative theories.” The study is available at www.nam.org.

  • Jesse Ausubel of the Rockefeller University will give a Cooler Heads Coalition congressional and media briefing on “Climate Change: the Known, the Unknown, the Unknowable” on Friday, February 7, from noon to 1:30 in Room 628 of the Senate Dirksen Office Building. Lunch will be provided, and reservations are required. To attend, please contact Paul Georgia at pgeorgia@cei.org or (202) 331-2257. Include your name, telephone number, e-mail address, and institutional affiliation.

Many children have been taught to fear the supposedly imminent arrival of global warming even though no one really knows if the world is getting hotter. While it is important to make children aware that current scientific evidence is inconclusive, it also may be helpful to put to rest some of the irrational fears of global warming. Facts, Not Fear, a guidebook on teaching children about the environment by Jane Shaw and Michael Sanera, offers several suggestions.

  • First, they suggest teaching children about dinosaurs. Ask children to describe the environment the dinosaurs lived in, including the vegetation, and ask them whether the world was warmer or cooler than it is now. Explain to them that at the time dinosaurs lived, the atmosphere had CO2 levels at least five times greater than what we now have and that these high levels of CO2 contributed to the rich vegetation.
  • The book also suggests a field trip to a greenhouse to learn about the “greenhouse effect.” Ask the greenhouse manager to explain how the conditions in the greenhouse are controlled to help plants and ask if the greenhouse adds carbon dioxide. Many greenhouses do add CO2 because it is a vital component of photosynthesis. This can help children learn that CO2 isn’t the dangerous gas that it is often portrayed as.
  • Finally, Sanera and Shaw suggest teaching children about former predictions of a coming ice age. Have children read articles and books such as “The Ice Age Cometh?” from Time in January 31, 1994, The Cooling by Lowell Ponte, and “Brace Yourself for Another Ice Age,” from Science Digest in February of 1975.

Glenn Schleede, the intrepid energy analyst, has done another bang-up job of identifying the weaknesses of yet another wind power project. This time his sights are set on West Virginia, and the prognosis is bleak.

One wind farm is already in operation in West Virginia, another has been approved by the Public Service Commission, and a third application is still pending. The electricity produced by the three plants, assuming a generous 30 percent capacity factor, would equal a little over 1.6 billion kWh per year. The plants would occupy 30 to 40 square miles, yet would produce energy equal to only 1.7 percent of the total electricity generated in West Virginia in 2000. A new 265 MW gas-fired combined-cycle generating plant, on the other hand, would produce slightly more electricity on just a few acres.
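The comparison rests on simple capacity-factor arithmetic: annual output equals nameplate capacity times the hours in a year times the capacity factor. The sketch below is our illustration; the roughly 70 percent capacity factor assumed for the gas plant is our assumption, not a figure from Schleede's analysis.

```python
# Annual generation from nameplate capacity (MW) and capacity factor.
HOURS_PER_YEAR = 8760

def annual_kwh(capacity_mw: float, capacity_factor: float) -> float:
    """Nameplate MW x hours per year x capacity factor, converted to kWh."""
    return capacity_mw * HOURS_PER_YEAR * capacity_factor * 1000  # MWh -> kWh

# The three wind projects: ~1.6 billion kWh/yr at a generous 30% capacity
# factor implies roughly 610 MW of nameplate wind capacity.
implied_wind_mw = 1.6e9 / (HOURS_PER_YEAR * 0.30 * 1000)

# A 265 MW combined-cycle gas plant at an assumed ~70% capacity factor:
gas_kwh = annual_kwh(265, 0.70)

print(f"Implied wind nameplate capacity: {implied_wind_mw:.0f} MW")
print(f"Gas plant annual output: {gas_kwh / 1e9:.2f} billion kWh")
```

At these assumed numbers the gas plant's output works out to about 1.62 billion kWh per year, consistent with the "slightly more electricity" claim above.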

Not only will these wind farms produce paltry amounts of electricity, the electricity produced will be of lower value due to the intermittent, volatile, and unpredictable nature of wind-generated electricity. To offset these characteristics and maintain the reliability of the grid, they will have to be backed up with dispatchable generating units. “Units serving this backup role must be on line (connected to the grid and producing electricity) and running below their peak capacity and efficiency, or in a spinning reserve mode (i.e., connected to the grid and synchronized but not putting electricity into the grid),” according to Schleede.

Electricity from wind farms also increases the cost of electricity by adding to the burden of keeping the grid in balance and by making it difficult to keep transmission lines from being overloaded. Moreover, mountaintop wind farms require additional transmission capacity, which will be used only 25 to 35 percent of the time due to wind power’s low capacity factors. All of these costs are part of the true cost of wind power, but they are usually ignored when the projects are being sold.

Wind power receives generous subsidies from both federal and state governments. The subsidies available to the West Virginia wind farms include federal accelerated depreciation (5 years as opposed to 20 years for other electric generating facilities), production tax credits, a reduction in the West Virginia Corporate Net Income Tax (due to accelerated depreciation), an 87.5 to 93.75 percent reduction in West Virginia’s Business and Occupation Tax, and a 91.67 percent reduction in West Virginia property taxes.
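To see why the depreciation schedule matters, consider a simplified straight-line sketch (our illustration with an assumed capital cost; actual federal accelerated depreciation uses a declining-balance method, and none of these figures come from Schleede's study): writing off the same cost over 5 years instead of 20 concentrates the deductions, and the tax shield they create, in the project's earliest years.

```python
def straight_line_deduction(cost: float, recovery_years: int) -> float:
    """Annual deduction under simple straight-line depreciation."""
    return cost / recovery_years

# Assumed $100 million capital cost (illustrative only).
cost = 100e6

wind_yr1 = straight_line_deduction(cost, 5)    # 5-year recovery for wind
other_yr1 = straight_line_deduction(cost, 20)  # 20-year recovery for others

print(f"Year-1 deduction, 5-year schedule:  ${wind_yr1 / 1e6:.0f} million")
print(f"Year-1 deduction, 20-year schedule: ${other_yr1 / 1e6:.0f} million")
# The 5-year schedule deducts four times as much in each early year.
```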

These subsidies shift the tax burden to other taxpayers and electric customers. As Schleede notes, “The total of $69.7 million in tax liability that could be avoided by the wind farm in the first year, as well as the liability avoided in subsequent years represents a tax burden that would be shifted to remaining taxpayers.”

Global Warming Exonerated in Resurgence of Malaria

“Highland malaria has returned to the tea estates of western Kenya after an absence of nearly 30 years,” begins a new study in Emerging Infectious Diseases, a journal published by the Centers for Disease Control. Many researchers have speculated that the return of this dreaded disease to the East African Highlands is yet another indicator that man is dangerously warming the planet. The new study, however, concludes that, “The results of our work do not support these conclusions.”

The study, led by Dennis Shanks at the U.S. Army Medical Research Unit in Kenya, used long-term malaria illness and total hospital admissions data (January 1966 to December 1995) from a large tea plantation in Kericho, Kenya, located in the Rift Valley highlands. The plantation covers 141 square kilometers and has employed about 50,000 people throughout the study period. The employees receive their health care from the company-operated medical system.

The mean monthly temperature and monthly total rainfall data used in the study came from a meteorological station located at the tea estates, as well as from global climatology data for a larger area of 3,025 square kilometers, which contains the plantation. A secondary variable considered for the study was vapor pressure. The researchers also used temperature and precipitation thresholds to classify each month as suitable or unsuitable for malaria transmission, producing a monthly suitability index.
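A suitability index of this kind amounts to a simple monthly classification. The sketch below is our illustration only: the thresholds used (mean temperature of at least 18°C and at least 80 mm of monthly rainfall, values often cited in African malaria-transmission modeling) are assumptions, and the paper's own thresholds may differ.

```python
# Classify each month as suitable (1) or unsuitable (0) for malaria
# transmission, based on temperature and rainfall thresholds.
TEMP_THRESHOLD_C = 18.0   # illustrative threshold, not the paper's
RAIN_THRESHOLD_MM = 80.0  # illustrative threshold, not the paper's

def monthly_suitability(mean_temp_c, total_rain_mm):
    """Return a 0/1 suitability index for each (temperature, rainfall) pair."""
    return [
        1 if t >= TEMP_THRESHOLD_C and r >= RAIN_THRESHOLD_MM else 0
        for t, r in zip(mean_temp_c, total_rain_mm)
    ]

# Hypothetical six months of station data:
temps = [16.5, 18.2, 19.0, 17.8, 18.5, 20.1]
rains = [95, 60, 120, 140, 85, 30]
print(monthly_suitability(temps, rains))  # → [0, 0, 1, 0, 1, 0]
```

A month counts as suitable only when both thresholds are met, which is why a warm but dry month (or a wet but cool one) scores zero.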

What the researchers found was that, “During the period 1966-1995, malaria incidence increased significantly while total (malarial and other) admissions to the tea estate hospital showed no significant change. Measurements of mean monthly temperature and total monthly rainfall [at the local meteorological station] also showed no significant changes.” For the larger study area as well, “Mean, maximum, and minimum monthly temperatures; precipitation; and vapor pressure all demonstrated no significant trends.” Nor were significant changes evident when the “meteorologic data were transferred into months when malaria transmission is possible.”

So what is the cause of the resurgence of malaria transmission in the East African highlands? The most likely culprit is resistance of the disease to the malaria drug chloroquine, especially “since all other relevant environmental and sociological factors are unchanged.” The researchers also note that travel to and from the Lake Victoria region by some of the tea estate workers “exerts an upward influence on malaria transmission in Kericho.” The study notes that similar conclusions have been reached in detailed analyses of other areas of the East African highlands that have experienced a resurgence of malaria transmission.

CO2 Emissions May Lower Ozone and Methane Concentrations

A new study in Nature, posted online in advance of publication, shows newly discovered benefits from anthropogenic carbon dioxide emissions. The study’s authors, led by Todd Rosenstiel with the Cooperative Institute for Research in Environmental Sciences, planted three cottonwood plantations of fifty trees each in the forestry section of Columbia University’s Biosphere 2 Center in Arizona to take advantage of its controlled environment.

One plantation’s carbon dioxide concentration was maintained at 450 parts per million (ppm), while the other two were maintained at 800 and 1200 ppm, respectively. What the authors found was that increased concentrations of carbon dioxide lowered isoprene emissions from the trees’ leaves by 21 percent at a CO2 concentration of 800 ppm and by 41 percent at 1200 ppm. Isoprene is a highly reactive non-methane hydrocarbon that is emitted in large quantities from trees and contributes significantly to the production of ground-level ozone.

As the study notes, the expansion of tree farms with fast-growing species “has been suggested as a way to ameliorate increases in atmospheric CO2 concentrations.” Objections have been raised, however, because “the continued expansion of agriforest plantations has the potential to affect significantly the oxidative behavior of the atmosphere, in turn negatively affecting local air quality.” But the new study puts that fear to rest: not only did higher concentrations of CO2 reduce isoprene emissions, they also boosted biomass production by 60 to 82 percent.

Isoprene also boosts the atmospheric lifetime of methane, a greenhouse gas 21 times more powerful than CO2, by 14 percent. So increased carbon dioxide concentrations would also inhibit the negative influence of isoprene emissions on atmospheric methane. The study concludes that, “The negative air-quality impacts (that is, isoprene emissions) of proliferating agriforests may be partially offset by increases in global CO2.”
