Science

A review at CO2science.org of the latest paper by Michael Mann reconstructing historic temperatures (Geophysical Research Letters 30) casts doubt on his continued assertion that the data prove that the warmth in recent decades is unprecedented.

A graph available on the web site (http://www.co2science.org/journal/2003/v6n34c4.htm) reconstructed from Mann’s data shows that, “The end point of their reconstructed global mean temperature history is not the warmest period of the prior 1800 years. In fact, their treatment of the data depicts three earlier warmer periods: one just prior to AD 700, one just after AD 700 and one just prior to AD 1000.”

The review goes on, “The globe only becomes warmer in the 20th century when its measured temperatures are substituted for its reconstructed temperatures. This approach is clearly unacceptable; it is like comparing apples and oranges. If one has only reconstructed temperatures from the distant past, one can only validly compare them with reconstructed temperatures from the recent past.”

Sea Levels See-Saw

A July 7 article from Japan’s Kyodo News attributed a 2002 sea-level rise of more than 5 cm above the century’s average to global warming. However, the report added that “the new figure topped the previous highest rise of 5.07 cm recorded in 1948,” also pointing out that the latest rise began only in 1985.

In this regard, an article on sea-level rise in Science (July 1) makes for interesting reading. Scientists are unable to reconcile the “steric” rise (caused by thermal expansion of a warming ocean), now estimated at 0.5 mm/year, and the “eustatic” rise (caused by the addition of fresh water), normally estimated at 0.2 mm/year [the IPCC figure], with the total IPCC figure for sea-level rise of 1.5-2.0 mm/year during the 20th century. Some have suggested that the tide gauge readings come predominantly from areas unrepresentative of global sea level.
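For readers who want to check the arithmetic, the budget gap described above can be reproduced in a few lines. This is only a back-of-the-envelope sketch using the figures quoted in the article.

```python
# Back-of-the-envelope check of the 20th-century sea-level budget
# described above. All figures are the ones quoted in the article.

steric_mm_per_yr = 0.5        # thermal expansion of a warming ocean
eustatic_mm_per_yr = 0.2      # added fresh water (IPCC figure)
observed_range = (1.5, 2.0)   # total IPCC estimate, mm/year

explained = steric_mm_per_yr + eustatic_mm_per_yr
gap_low = observed_range[0] - explained
gap_high = observed_range[1] - explained

print(f"Explained rise: {explained:.1f} mm/yr")
print(f"Unexplained gap: {gap_low:.1f}-{gap_high:.1f} mm/yr")
```

The quoted components explain only about 0.7 mm/year, leaving 0.8-1.3 mm/year unaccounted for — the “enigma” the Science author describes.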

The author believes the likely explanation is that more fresh water than previously thought is running off the continents, but concludes: “Global coverage by satellite altimetry…shows a notably larger than average level rise in the last decade of the century. The detection of the relatively slow century-scale trend is plagued by the dominance of high (decadal) frequencies in the spectrum of the rate of sea-level variability. It will take several decades to obtain good estimates of the role of global warming in sea level rise.

“In the meantime, 20th century sea level remains an enigma — we do not know whether warming or melting was dominant, and the budget is far from closed.”

According to London’s Guardian (July 19), scientists from Australia’s National Tidal Facility suggest that, while the sea level around the Pacific atoll nation of Tuvalu (which is only 3 meters above the sea) has risen by about 5 cm since 1993, this may not be anything to worry about. One scientist said, “We’ve had a large El Nino which appears to have raised sea levels across the western Pacific, so rises in future may well not be as dramatic.” Previous estimates suggested that the aftermath of El Nino could see a fall of up to 30 cm in the waters around Tuvalu.

Nevertheless, the Tuvaluan Congregational Church has asked Australia to give the Tuvaluan government an island to which they can evacuate the entire nation.

Sequestration Appears Sustainable

The idea that carbon sequestration via forests is a sustainable option for reducing the amount of CO2 in the atmosphere has come under attack in recent years. Critics theorize that new forest growth will quickly become saturated and will start returning stored carbon to the atmosphere by 2050. New research from Luo et al., published in Global Biogeochemical Cycles, suggests that this may not be the case.

The researchers examined Duke Forest, a new forest established in 1983 in North Carolina. Beginning in 1996, they started enriching 30-meter-diameter plots with CO2 to a concentration 200 ppm above ambient, while maintaining control plots at the ambient level. The studies revealed “sustained photosynthetic stimulation at leaf and canopy levels which resulted in sustained stimulation of wood biomass increment and a larger C[arbon] accumulation in the forest floor at elevated CO2 than at ambient CO2.”

The researchers then developed a model for studying the long-term sustainability of sequestration. In a scenario where atmospheric CO2 concentration gradually rises from 378 ppm in 2000 to 710 ppm in 2100, they calculated sustained carbon sequestration rising from 69 units in 2000 to 201 in 2100. (co2science.org, July 16)

Extreme Weather Events Reportedly Increase

On July 2, the World Meteorological Organization (WMO – a UN agency) issued a press release blaming global warming for an observed increase in extreme weather events, citing the heat wave in Europe and the busy tornado season in the central US. The WMO also blamed the cooler-than-average spring in the eastern US on rising temperatures. Hedging its predictions with “mights” and “coulds,” the WMO suggested that, “Recent scientific assessments indicate that, as the global temperatures continue to warm due to climate change, the number and intensity of extreme events might increase.” The press release also claimed that, “considering land temperatures only, last May was the warmest on record.”

However, as John Daly, who runs the Still Waiting for Greenhouse web site from Tasmania, pointed out, much of the seeming increase in extreme weather events could be attributable to increased reporting of the events rather than to an actual increase in their occurrence. When this possibility was put to the Director of the World Climate Program for the WMO, Ken Davidson, he replied, “You are correct that the scientific evidence (statistical and empirical) are (sic) not present to conclusively state that the number of events have (sic) increased. However, the number of extreme events that are being reported and are truly extreme events has increased both through the meteorological services and through the aid agencies as well as through the disaster reporting agencies and corporations. So, this could be because of improved monitoring and reporting.”

Daly also pointed out that, although the scattered surface temperature stations in their urban heat islands may have suggested that this was the warmest May on record, the satellite temperature measurements place this May as only the 4th warmest in the last 25 years. Finally, Daly reminded us that recent temperatures are pushed upward by the current El Nino. (<http://www.john-daly.com>)

Etc.

Does McCain-Lieberman Cover This?

Tristram West of the Oak Ridge National Laboratory has calculated that barbecue grilling on Independence Day burns the equivalent of 2,300 acres of forest and consumes enough energy to power a town the size of Flagstaff, Ariz., for an entire year.

Not only that, but the grilling emits 225,000 metric tons of carbon dioxide into the atmosphere. West based his calculations on the relative use of gas and charcoal grills. If the nearly 34 million liquefied petroleum and natural gas grills used on July 4 were instead charcoal grills, they would emit an additional 89,000 metric tons of carbon dioxide, West said (a 40 percent increase in emissions). If, however, the nearly 23 million charcoal grills were fueled by liquefied petroleum gas, carbon dioxide emissions could be reduced by about 26 percent, or about 59,000 metric tons. (Eurekalert press release, July 3).
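West’s percentages can be checked against the tonnage figures quoted above. The sketch below is merely a consistency check on the article’s numbers, not a reconstruction of West’s own calculation.

```python
# Quick consistency check of the grilling-emissions percentages quoted
# above. Tonnages are the figures reported in the article.

baseline_tons = 225_000          # CO2 from July 4 grilling as-is
all_charcoal_extra = 89_000      # extra CO2 if gas grills were charcoal
all_gas_saved = 59_000           # CO2 saved if charcoal grills went to gas

increase_pct = 100 * all_charcoal_extra / baseline_tons
decrease_pct = 100 * all_gas_saved / baseline_tons

print(f"All-charcoal scenario: +{increase_pct:.0f}%")
print(f"All-gas scenario: -{decrease_pct:.0f}%")
```

The ratios come out at roughly 40 percent and 26 percent, matching the figures in the press release.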

Less Developed Argument

Klaus Toepfer, head of the UN Environment Program, appears to want to keep the Chinese people in poverty. According to Reuters (July 17), he criticized Chinese plans for economic growth on Malthusian grounds, arguing that the world does not possess enough resources to meet China’s aim of quadrupling its economy by 2020.

Saying that this was part of the “rationality of economics,” Toepfer also appeared to cast doubt on the idea that anywhere else had achieved economic progress. He pronounced, “Quadrupling the GDP of a country of 1.3 billion, can you imagine what are the consequences if you go in the same structure as was done in the so-called developed countries?” Toepfer was speaking from Australia. He presumably arrived there by so-called airplane, rather than by outrigger canoe.

Land of the Midsummer Snow

[Editor’s note: Cooler Heads does not stoop to the methods of global warming alarmists, who send out a press release every time there’s a hot spell. We therefore make no claims for the following item. It’s merely anecdotal and tells us nothing about global temperature trends.]

An inch of snow fell on July 16 at the headquarters of Denali National Park in central Alaska. The Fairbanks Daily News-Miner (July 18) reported that it was the first snow ever recorded there in the month of July. The high temperature of 42 degrees F was also the lowest July high ever recorded.

Similar low temperature marks were set throughout central Alaska. In Fairbanks, the high was 48 degrees, making it only the third July day in 99 years on which the thermometer failed to reach 50 degrees. Snow was also reported for the first time ever in July at several other locations. According to the U. S. Naval Observatory web site, on July 16 the sun rose in Fairbanks at 3:56 AM and set at 11:56 PM.

Announcements

* “The Climate Conflict”, the award-winning Danish documentary, will be aired on the Science Channel on Friday, August 8, from 9 to 10 PM ET. It will also be shown five times on Saturday, August 9, at midnight, 5 AM, 8 AM, 1 PM, and 4 PM. The broadcast schedule may be consulted at <http://science.discovery.com>. The Cooler Heads Coalition has sponsored two showings of “The Climate Conflict” with introductory remarks by solar physicist Paal Brekke of the European Space Agency, who is interviewed in the documentary. The Science Channel is broadcasting a new English-language version that has been updated. Danish scientist Henrik Svensmark’s theory that solar variation is the main climate driver is investigated.

* The Fraser Institute in Canada has published a paper titled “Greenhouse Gas Reductions: Not Warranted, Not Beneficial” by Dr. Kenneth Green, Fraser’s chief scientist and director of its Risk and Environment Centre. The paper may be found on the web at http://www.fraserinstitute.ca/admin/books/files/Climate.pdf.

* The Center for Science and Public Policy, a project of Frontiers of Freedom Institute, has published a paper titled “EPA Mercury MACT Regulation Rulemaking Not Justified by Science.” Its authors are Dr. Willie Soon and Robert Ferguson. It will soon be posted at <http://www.ff.org>.

Corrections

* In our article in the last issue on S. 139, the Lieberman-McCain bill to regulate CO2 emissions, we said that national disposable income “would take fifteen years to return to the amount reached in 2000” and that, “By 2025, the country’s GDP would be $106 billion lower in real terms than it is today.” Both these statements are incorrect, owing to a misreading of graphs in the report. They should read: “would take fifteen years to return to the levels envisaged without McCain-Lieberman,” and “… would be $106 billion lower in real terms than it would have been without McCain-Lieberman.”

* In vol. VII, no. 12, we also undercounted the number of scientists who signed the open letter to Canadian PM-in-waiting Paul Martin (now available on the web at <http://www.sepp.org/NewSEPP/LttrtoPaulMartin.html>). There were 46 signatories at our last count.

We regret the errors and thank our readers for pointing them out. We encourage our readers to point out any future errors.

Cosmic Influence on Climate

In new research published in GSA Today, a publication of the Geological Society of America, researchers Nir Shaviv and Jan Veizer conclude that cosmic rays emanating from dying stars account for 75 percent of the variation in the Earth’s climate over the past 500 million years. This means that carbon dioxide accounts for much less of the recent mild warming trend than commonly postulated.

The theory is that cosmic rays increase the number of charged particles in the atmosphere, which then leads to the formation of more low-level clouds that cool the atmosphere. Shaviv and Veizer have put together a model that looks at the interaction of cosmic rays with historical climate data.

The researchers were able to place an upper limit on the role of CO2 that translates to a temperature increase of about 0.75 C associated with a doubling of CO2. This is about one-third the warming from a CO2 doubling assumed in most general circulation models. The findings are also consistent with the suggestion that much of the warming seen over the last century is associated with increased solar activity rather than greenhouse gases. (Nature, July 8).
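The 0.75 C upper limit can be put in perspective with the standard logarithmic rule of thumb for CO2 warming, ΔT = S × log2(C/C0). Both the rule and the illustrative concentration values below are assumptions chosen for illustration; they are not taken from the Shaviv and Veizer paper.

```python
import math

# Warming attributed to a CO2 change under a given per-doubling
# sensitivity, using the common logarithmic rule of thumb
# (an assumption for illustration, not the paper's own model).
def warming(c_new_ppm, c_old_ppm, sensitivity_per_doubling):
    return sensitivity_per_doubling * math.log2(c_new_ppm / c_old_ppm)

# Hypothetical rise from 280 ppm (often cited as pre-industrial) to
# 375 ppm (an early-2000s level); both values are assumptions here.
for s in (0.75, 2.25):   # Shaviv/Veizer upper limit vs. a value 3x larger
    print(f"S = {s} C per doubling -> {warming(375, 280, s):.2f} C")
```

Under the 0.75 C limit, the assumed concentration rise yields only about 0.3 C of warming, illustrating why the authors attribute most of the recent trend to other causes.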

Mann Reacts to Paleoclimate Study

Michael Mann of the University of Virginia and a group of other paleoclimatologists have responded to the recent study by Willie Soon et al. on the evidence that the Medieval Warm Period and subsequent Little Ice Age were worldwide phenomena. The Soon study refutes the “hockey stick” graph contained in the Third Assessment Report of the IPCC, which appears to show that the 20th century experienced unusual warming.

The critique by Mann et al. appears in the July 8 issue of the American Geophysical Union publication Eos. It alleges three main flaws in Soon et al.’s work. First, that it misinterprets proxy data indicative of drought or excess moisture as evidence of temperature.

Second, that the specific time periods of warmth or coolness in the two alleged climate eras varied from place to place, meaning that applying the labels “medieval warm period” or “little ice age” to them reflected Eurocentrism. The claim of Eurocentrism, at least concerning the Little Ice Age, was demolished in a book, The Little Ice Age, published in 2000 by archaeologist Brian Fagan, which finds evidence for that climatic event all over the world.

Finally, Mann alleges that using the entire twentieth century as the temperature base with which to compare previous periods is inappropriate. Soon and his colleagues are preparing a scientific response to the criticisms, which they hope will be published in Eos.

Scientific American Charge Refuted

In a sidebar to an article also criticizing the Soon study in the June 24 issue, Scientific American repeated allegations that the publication of the study in the journal Climate Research was influenced by politics. The sidebar suggested that peer review had failed in this instance.

However, as Ross McKitrick pointed out in a July 10 article on Tech Central Station, “Prof. Otto Kinne, the Director of Inter-Research (the publisher of Climate Research) personally reviewed the file, including the four referee reports and the process leading up to the publication decision. He dismissed the misconduct accusation, finding that the article was properly reviewed and that the editor, Prof. Chris de Freitas, did a good and correct job as editor.”

Urban Heat Islands Mean Fewer Ice Storms

A new study in the Journal of Applied Meteorology adds more details about the urban heat island effect. Researchers from the University of Illinois found that large cities such as New York or Chicago experience significantly fewer days of freezing rain and ice storms than surrounding rural areas. Smaller cities experience less of an effect, although it is still noticeable.

Hashem Akbari of the Heat Island Group at the Lawrence Berkeley National Laboratory in California told Nature that studies of this type, “help us understand how heat islands generate their own weather patterns, change wind direction and modify air quality.” As the magazine pointed out, “Some cities are up to 11 C warmer than the surrounding suburbs. Traffic, buildings, and air-conditioning units all release heat. Tarred roofs and roads soak up solar energy which they give up at night, when the largest temperature differences between city and country occur.” (Nature, July 1)

Hydrogen Poses Risks to Ozone Layer

A report in the June 13 issue of Science entitled “Potential Environmental Impact of a Hydrogen Economy on the Stratosphere” suggests that hydrogen fuel cells could pose environmental risks.

The study theorizes that systems of molecular hydrogen (H2) production, storage, and transport will almost certainly involve some of the hydrogen escaping into the atmosphere. Current losses suggest that 10-20% of all H2 will escape, which implies that if all oil or gasoline combustion technologies were replaced with hydrogen fuel cells, anthropogenic H2 emissions would increase four to eight times.

The researchers suggest that the H2 would rise and mix with air in the stratosphere, where it would oxidize to form water vapor. This would result in a cooling of the lower stratosphere and would also enhance the chemical processes that destroy ozone. A fourfold increase in the amount of H2 in the stratosphere would lead to a stratospheric temperature decrease of about 0.25 C and ozone depletion of around five percent. These effects would rise to a 1 C decrease and over 15 percent depletion with a sevenfold increase in H2.

The researchers also suggest that an increase in water vapor in the mesosphere could lead to an increase in noctilucent clouds, potentially affecting the earth’s albedo.

Atmospheric Mercury Declining

A new study published in Geophysical Research Letters (May 22) by researchers at the Max Planck Institute for Chemistry has found that total gaseous mercury levels have declined since their peak in the early 1980s. The researchers measured mercury levels at eight locations in both hemispheres, as well as on eight trans-Atlantic cruises, over the course of more than twenty years. The findings correlate well with earlier research that found decreasing levels of mercury deposition dating back as far as fifty years.

The study called into question the reliability of mercury models, saying that either “the ratio of man-made to natural emissions (including re-emissions) has been underestimated or the natural emissions undergo large temporal variations.” If the discrepancy were natural, it would indicate a far greater degree of natural fluctuation than previously believed.

Where Have All the Flowers Gone?

A June 17 story in the Independent of London was headlined, “Global warming may wipe out a fifth of wild flower species, study warns.” The actual study (published in the Proceedings of the National Academy of Sciences), however, suggested rather less.

The scientists looked at the effects of increased temperature, CO2, precipitation, and nitrogen on a small patch of California meadow, divided into plots of approximately one square yard. They found that under certain conditions, including increases in all four factors, the average number of forb species in the plots (small flowering plants like buttercups) decreased from about 4 to 3.2 over three years. This did not mean that the plots became less diverse, however, as other plant species took their place, leading in some cases to an increase in biodiversity.
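The headline’s “fifth” traces directly to the plot-level numbers quoted above; a one-line check confirms that a drop from about 4 to 3.2 forb species per plot is a 20 percent decline.

```python
# The headline's "fifth of wild flower species" corresponds to the
# quoted plot-level figures: average forb count fell from ~4 to 3.2.

before, after = 4.0, 3.2
fraction_lost = (before - after) / before
print(f"Fraction of forb species lost per plot: {fraction_lost:.0%}")
```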

The study’s authors acknowledged that not all areas would respond to the effects of climate change like California meadows, something that did not make it through to the Independent’s coverage.

Global Warming Caused Permian Mass Extinction, Researcher Warns

A new book from Michael Benton, head of earth sciences at Bristol University in England, suggests that the mass extinction at the end of the Permian period, 250 million years ago, was caused by a global warming of about 6 C. When Life Nearly Died: The Greatest Mass Extinction of All Time suggests that a series of volcanic eruptions caused a runaway greenhouse effect that led to the death of the vast majority of the species alive at the time.

Professor Benton told the Press Association: “The Permian crisis nearly marked the end of life. It’s estimated that fewer than one in 10 species survived. Geologists are only now coming to appreciate the severity of this global catastrophe and to understand how and why so many species died out so quickly.”

An advisor on the science behind the award-winning TV series Walking with Dinosaurs, Professor Benton is also the author of the Encyclopedia of Awesome Dinosaurs. (Sydney Morning Herald, June 20).

Earth Greening Rapidly Since 1980

Something remarkable happened between 1980 and 2000. Researchers from a variety of institutions published a study, funded by NASA and the Department of Energy, in the June 6 issue of Science that found that, “Global changes in climate have eased several critical climatic constraints to plant growth, such that net primary production increased 6% globally.” The Amazon rain forests accounted for 42 percent of the observed increase in plant growth.

The Christian Science Monitor (June 6) related how unexpected this result was: “The surprise was twofold. The growth rate far exceeded what most scientists expected. Many models indicated that additional growth in the tropics would be minimal, given the fairly constant temperatures from one season to the next. In addition, many researchers had held that any increased productivity in the tropics would largely be driven by a rise in atmospheric CO2 rather than changes in climate itself.”

The scientists found that this increase was not necessarily due to the direct impact of increased take-up of atmospheric carbon dioxide (CO2 fertilization). According to Roger Highfield, writing in London’s Daily Telegraph (June 6), “In general, where temperatures restricted plant growth, it became warmer; where sunlight was needed, clouds dissipated; and where it was too dry, it rained more. In the Amazon, plant growth was limited by sun-blocking cloud cover, but the skies have become less cloudy. In India, where a billion people depend on rain, the monsoon was more dependable in the 1990s than in the 1980s.”

Commenting on what the study means for claims about deforestation, the lead author, Dr Ramakrishna Nemani, of the University of Montana, told the Telegraph that “the role of deforestation may have been overplayed a bit,” although he added that he felt that current forests ought to be preserved. Other team members expressed cautionary notes about the study, noting that the sustainability or otherwise of increased vegetation growth had not been assessed.

However, the most interesting comment on the study from one of its authors came from Dr Charles Keeling, of the Scripps Institution of Oceanography, who told the Telegraph that, “The 36 per cent increase in global population, from 4.45 billion in 1980 to 6.08 billion in 2000, overshadowed the benefits that might have come from increases in plant growth.”

Hazy Aerosol Picture Continues

Confusion appears to reign over what the various recent reports on aerosols mean for the debate over global warming (see the past two issues). New Scientist (June 4) reports that “top atmospheric scientists got together, including Nobel laureate Paul Crutzen and Swedish meteorologist Bert Bolin, former chairman of the UN’s Intergovernmental Panel on Climate Change,” at a workshop in Berlin in late May to assess the implications of Anderson et al.’s Perspectives piece in Science magazine, which cautioned that the sulfate aerosol cooling effect may be greater than models predict.

The Perspectives piece had said that this might mean either that the earth’s temperature is more naturally variable than thought or that the climate is more sensitive to forcing than thought. The Berlin workshop settled on the latter and produced the prediction that, when sulfate aerosol production wanes, the earth might warm by 7 to 10 C under the IPCC’s worst-case scenario. Readers may remember that the worst-case scenario assumes that the entire world will raise itself above the current economic output levels of the United States, that population will continue to increase rapidly, and that there are no major technological advances.

New Scientist admits that the calculations on which these dire predictions were based were “back-of-the-envelope” figures. Despite this extreme uncertainty, Will Steffen of the Swedish Academy of Sciences was quoted as saying that, “The message for policy makers is clear: We need to get on top of the greenhouse gas emissions problem sooner rather than later.”

Effect of Land Use Change on Climate Greater than Thought

Last year, Roger Pielke, Sr., of Colorado State University added another complicating factor to the debate over what causes rising surface temperatures when he coauthored a major study that found that land use changes may be at least as important as greenhouse gas emissions in accounting for climate change. Growing urban areas, deforestation and reforestation, agriculture and irrigation can have strong influences on regional temperatures, precipitation and large-scale atmospheric circulation.

Now, new research from Eugenia Kalnay and Ming Cai of the University of Maryland backs up Pielke’s conclusions. By comparing observed surface temperatures with a reconstruction of global weather over the past 50 years, they were able to estimate the impact of land-use changes on surface warming.

They concluded that there had been an average increase of 0.27 C in surface temperature per century attributable to urban and other land-use changes. This represents half the observed decrease in the diurnal temperature range and is twice as high as previous estimates based on urbanization alone. (Nature, May 29)

Hazy Picture over Aerosols

Coming hard on the heels of the findings that soot may be responsible for more atmospheric warming than was previously thought (see previous issue), a team of researchers has looked again at the question of how much the atmosphere might be cooled by the presence of sulfate aerosols.

Their research, published in the Perspectives section of Science magazine, compared the likely cooling effects of aerosols worked out from first principles with the likely effects predicted by climate models. They found a discrepancy between the two that they were unable to explain.

The authors argue that their findings suggest that anthropogenic activity will certainly lead to a strong “forcing” of the Earth’s climate between 2030 and 2050. However, they also admit that the discrepancy means that “the possibility that most of the warming to date is due to natural variability must be kept open.” (Science, May 16)

Etc.

Point / Counterpoint

“[Kyoto] is about trying to create a level playing field for big businesses throughout the world.”

– EU Environment Commissioner Margot Wallstrom, (quoted in the Independent, Mar. 19, 2002)

“Of course it’s about money, about rubles. They are trying to calculate how much [the Kyoto protocol] will give.”

– Wallstrom, in response to Russian reluctance to ratify the protocol, (quoted by Reuters, May 12)

Making the Data Fit the Model

When it published a paper by Benjamin Santer of the Lawrence Livermore National Laboratory in California on May 1, via Scienceexpress.org, Science magazine claimed, “A stubborn argument against global warming may be discredited by a re-analysis of the data central to its claims.” Santer’s team was tackling the well-known argument that atmospheric temperature data from satellites fail to show the warming trend found in surface-level observations. By comparing a new dataset to a model that predicts warming in the troposphere, the team claimed to have detected a warming trend of 0.1 degrees Celsius per decade.

This is considerably above the previously accepted level of +/- 0.05 degrees C per decade demonstrated by the satellite data. The standard dataset is produced by a team at the University of Alabama in Huntsville (UAH), led by John Christy. The new dataset, produced by Frank Wentz of Remote Sensing Systems (RSS) of Santa Rosa, California, is based on the idea that variations between satellites and their orbits can cause variations in the data that need to be accounted for. The RSS data remain unpublished, however, and Christy’s team has amended its data to account for the factors highlighted by Wentz.

The new study by Santer is based on the idea that there must still be something wrong with the UAH dataset because it fails to match the consequences for the troposphere proposed by the climate model Santer uses. That model appears to predict consequences for the stratosphere quite accurately. Because the RSS data match the predicted warming trend better, Santer suggests that the failure to find a warming trend in the UAH data may be due to “an artifact of data uncertainties.”

Christy, however, was already undertaking a rigorous analysis of the UAH data to estimate its error range. His re-examination was published in the May 2003 Bulletin of the American Meteorological Society. By comparing the satellite data to independent data obtained from weather balloons, he was able to re-affirm the reliability of the UAH data. Santer’s paper suggested that there might be a problem with the balloon data as well.

Christy cast doubt on the reliability of Santer’s model, telling Reason magazine science correspondent Ron Bailey, “It’s a lot easier to model the stratosphere because you only have to consider radiational effects. The troposphere is much messier. It contains complicated things like clouds, convection, moisture and dust.”

He went on to tell the Oakland Tribune, “It does not bother me that our data do not agree with their virtual model of the world… It’s a curious way to do science, to use a model to verify data rather than the other way around. If you follow this too far down that road, you’re in danger of saying, ‘It’s my theory that’s correct and the real world that’s wrong.’” (Ron Bailey, Tech Central Station, May 1).

Soot May Pose More Problems than Previously Thought

The IPCC’s Third Assessment Report, issued in 2001, argued that “sulfate aerosol” emissions from burning coal helped cool the atmosphere, accounting for the lower-than-expected warming trend so far detected. Further research, however, from such individuals as James Hansen of the Goddard Institute for Space Studies, suggested that the presence of “black carbon” aerosols (soot) raises atmospheric temperatures as the particles absorb solar radiation. It was initially thought that the cooling effect of sulfate aerosols and the warming effect of black carbon cancelled each other out.

Hansen and his colleagues published further research on the subject in the May 13 issue of the Proceedings of the National Academy of Sciences. The paper looked at how smoke and other black carbon in the atmosphere interacts with sunlight and chemicals to contribute to climate change. The research team, from NASA, Columbia University and Lawrence Berkeley National Laboratory, found that “a rapid rise in worldwide temperatures over the last 50 years may be largely due to smoky particles in the air.”

Co-author and NASA scientist Dorothy Koch told the Los Angeles Times, “All black carbon does is absorb sunlight If you put more into the atmosphere, you increase the warming.” Other experts, like Stanford University climatologist and leading global warming alarmist Stephen Schneider, urged caution.

The study speaks directly to one of the central issues surrounding the adoption or rejection of the Kyoto Protocol. One reason the administration has given for rejecting the deal is the large accumulation of atmospheric pollutants over southern Asia known as the “Asian Brown Cloud.” The presence of the two-mile-thick phenomenon appears to be due to forest fires, the burning of wood and dung in stoves, and the increased use of fossil fuels in developing countries. The Kyoto Protocol exempts developing countries from having to reduce their emissions from such sources. (Greenwire, May 6, 2003).

Announcement

With this issue, Iain Murray becomes Managing Editor of Cooler Heads. He has also joined CEI as a Senior Fellow in Environmental Policy. Iain was formerly Director of Research at the Statistical Assessment Service, where he examined a wide range of scientific problems in public policy and the media.

Iain writes regularly for Techcentralstation.com and for United Press International. As a British citizen, he worked for the UK Department of Transport on railroad privatization before coming to the US in 1997. He holds an MA from Oxford University, an MBA from London University, and the Diploma of Imperial College of Science, Technology, and Medicine.

A study by Joel Schwartz challenges the scientific basis of both the Bush Administration's Clear Skies Initiative and Senator Jim Jeffords's (I-Vt.) Clean Power Act. The analysis has implications for climate policy because Jeffords's legislation includes regulation of carbon dioxide emissions and Bush's plan could serve as a proxy climate policy by forcing utilities to close coal-fired power plants in order to reach the limits on mercury emissions.

Clear Skies and Clean Power would impose tough new controls on power plants to reduce levels of fine particle (PM2.5) pollution, which both sides claim kills tens of thousands of people per year. Supporters of these bills promise substantial benefits from additional PM reductions.

Schwartz's new study, published by the Competitive Enterprise Institute, argues that Clear Skies and Clean Power rest on a weak scientific foundation. The U.S. Environmental Protection Agency (EPA) based its new annual fine PM (PM2.5) standard on a study known as the American Cancer Society (ACS) study of PM and mortality, which assessed the association between PM2.5 levels in dozens of American cities and the risk of death between 1982 and 1998.

Although the ACS study reported an association between PM and mortality, some odd features of the ACS results suggest that PM is not the culprit. For example, higher PM levels increased mortality in men, but not women; in those with no more than a high school degree, but not those with at least some college; in former-smokers, but not current- or never-smokers; and in those who said they were moderately active, but not those who said they were very active or sedentary.

These odd variations in the relationship between PM2.5 and mortality seem biologically implausible. Even more surprising, the ACS study reported that higher PM2.5 levels were not associated with an increased risk of mortality due to respiratory disease, a surprising finding given that PM would be expected to exert its effects through the respiratory system.

EPA also ignored the results of another epidemiological study that found no effect of PM2.5 on mortality in a cohort of veterans with high blood pressure, even though this relatively unhealthy cohort should have been more susceptible to the effects of pollution than the general population. The evidence therefore suggests that EPA's annual standard for PM2.5 is unnecessarily stringent. Attaining the standard will be expensive, but is unlikely to improve public health.

Air pollution has declined dramatically over the past 30 years, and will continue to decline, both because more recent vehicle models start out cleaner and stay cleaner as they age than earlier ones, and also because already-adopted standards for new vehicles and existing power plants will come into effect in the next few years.

If policymakers feel they must do something to speed up PM reductions, Schwartz advises they offer people tax incentives to scrap high-polluting older vehicles that account for a substantial portion of ambient PM levels in metropolitan areas. This flexible, cost-effective approach is more likely to result in net public health benefits than either Clear Skies or Clean Power.

The study is available at http://www.cei.org/gencon/025,03452.cfm.

Announcement

The Cooler Heads Coalition and the George C. Marshall Institute will host a congressional staff and media briefing on Friday, May 16, from Noon to 1:30 PM in Room G-50 of the Senate Dirksen Office Building. Dr. Willie Soon, a research physicist with the Harvard-Smithsonian Center for Astrophysics, will speak on, “Was the Twentieth Century Climate Unusual? Exploring the Lessons and Limits of Climate History.”

Lunch will be provided, and reservations are required. To register, please telephone the Marshall Institute at (202) 296-9655 or e-mail them at info@marshall.org.

Soon's talk will be based on a recent major review article of which he was the lead author. See the April 16 issue for more details. The article has been posted on the web at http://cfa-www.harvard.edu/~wsoon/1000yrclimatehistory-d/. A less technical version is available on the Marshall Institute's web site at www.marshall.org/pdf/materials/136.pdf.

Evidence Confirms Existence of Naturally-Occurring, Large-Scale Climate Change

A new study reviewing over 240 climate studies shows that the 20th century is neither the warmest century nor the century with the most extreme weather of the last 1000 years, as has been argued by some scientists. Michael Mann and his collaborators, for example, have published studies using proxy data to reconstruct past global temperatures suggesting that the Medieval Warm Period never existed and that the Little Ice Age was not a global phenomenon. They also argued that the 20th century was the warmest in the last 1000 years. Their research and the resulting “hockey stick” graph showing significant and anomalous 20th century warming were featured prominently in the IPCC's Third Assessment Report.

The new study, conducted by Willie Soon and Sallie Baliunas with the Harvard-Smithsonian Center for Astrophysics, Craig and Sherwood Idso with the Center for the Study of Carbon Dioxide and Global Change, and David Legates with the Center for Climatic Research at the University of Delaware, sets out to answer three questions.

1) “Is there an objectively discernible climatic anomaly occurring during the Little Ice Age, defined as 1300-1900 A.D.?”

2) “Is there an objectively discernible climatic anomaly occurring during the Medieval Warm Period, defined as 800-1300 A.D.?”

3) “Is there an objectively discernible climatic anomaly occurring within the 20th century that may validly be considered the most extreme (i.e., the warmest) period in the record?”

To answer these questions, the researchers reviewed the extensive scientific literature on the available climate proxy data. “Many true research advances in reconstructing ancient climates have occurred over the past two decades,” said Soon, “so we felt it was time to pull together a large sample of recent studies from the last 5-10 years and look for patterns of variability and change.”

The researchers analyzed numerous climate indicators including: borehole data; cultural data; glacier advances or retreats; geomorphology; isotopic analyses from lake sediments or ice cores, tree or peat celluloses (carbohydrates), corals, stalagmite or biological fossils; net ice accumulation rates, including dust or chemical counts; lake fossils and sediments; river sediments; melt layers in ice cores; phenological (recurring natural phenomena in relation to climate) and paleontological fossils; pollen; seafloor sediments; luminescent analysis; tree ring growth, including either ring width or maximum late-wood density; and shifting tree line positions plus tree stumps in lakes, marshes and streams.

What they found from analyzing many more proxy records than did Mann flatly contradicts his conclusions. “Climate proxy research does yield an aggregate and broad perspective on questions regarding the reality of the Little Ice Age, the Medieval Warm Period and the 20th century surface thermometer global warming. The picture emerges from many localities that both the Little Ice Age and Medieval Warm Period are widespread…. [and] are worthy of their respective labels. Furthermore, thermometer warming of the 20th century across the world seems neither unusual nor unprecedented within the more extended view of the last 1000 years. Overall, the 20th century does not contain the warmest or most extreme anomaly of the past millennium in most of the proxy records” (Energy and Environment, March 2003).

Urban Heat in Houston

The urban heat island (UHI) effect is a well-known phenomenon that causes an upward bias in the temperature records used to determine whether the earth is warming up. Because concrete, asphalt, and steel retain heat better than natural landscapes, cities are warmer than their surrounding areas. As cities grow, this temperature bias grows as well, making it difficult to determine whether global temperatures are increasing due to greenhouse gases or urbanization. Attempts have been made to account for these biases, but there is simply no way to know whether the methods used are adequate.

New research appearing in the March issue of Remote Sensing of Environment suggests that the UHI can have a large influence on temperatures. David Streutker, with the Department of Physics and Astronomy at Rice University, used satellite remote sensing to determine the effect of urban growth on temperatures in Houston, Texas. Data were collected from two discrete time periods, March 1985 to February 1987 and July 1999 to June 2001. Eighty-two relatively cloud-free images were obtained from the first period and 125 from the second.

What Streutker found upon analyzing the data is that rural areas around Houston experienced no change in temperature. But for the Houston metropolitan area, “Over the course of 12 years, between 1987 and 1999, the mean nighttime surface temperature heat island of Houston increased 0.82 ± 0.10 [degrees C] in magnitude.” This is a rather striking increase, especially when compared to the general belief that global temperatures have only risen about 0.6 degrees C over the last 100 years. Streutker also noted that, “The growth in UHI, both in magnitude and spatial extent, scales roughly with the increase in population, at approximately 30 percent.” World population has increased 280 percent over the last century.

Irrigation Blamed for Warming San Joaquin Valley

A new study funded by the National Science Foundation suggests that the warming experienced in the San Joaquin Valley in California is due to increasing amounts of irrigated land rather than carbon dioxide emissions. One of the researchers, John Christy, director of the Earth System Science Center at the University of Alabama in Huntsville, said that increases in humidity from irrigation may be the culprit in rising temperatures.

“One of the big issues right now is human-induced climate change from carbon dioxide,” said Christy. “Temperatures in the valley are used to assess climate change, and the typical view has been that if there is warming, it must be due to carbon dioxide, therefore we must reduce the use of fossil fuels. Actually, it appears temperature in the valley could be due to a different human factor, and that is irrigation.”

Night-time temperatures in the valley have risen by about 4 degrees over the past 70 years, according to Christy. The study area contains two million irrigated acres and the resulting increases in humidity may be preventing nighttime air from cooling off. “The evidence shows that if this were a large scale climate change caused by carbon dioxide, it would affect the valley, the foothills and the mountains. But we have not seen these changes in the higher elevations,” said Christy.

Environmentalists questioned the usefulness of the research. Bernadette Del Chiaro, a spokeswoman for Environment California, worried that this might derail efforts to suppress energy usage. “There may be multiple factors causing climate change,” said Del Chiaro. “But regardless, there is still the larger problem, which is our dependence on fossil fuels.”

When the United Nations Intergovernmental Panel on Climate Change released its Third Assessment Report (TAR) in 2001, many were surprised that its projections for temperature increases had risen substantially. The IPCC's 1996 Second Assessment Report (SAR) predicted that the earth's temperature could increase by as much as 0.9 to 3.5 degrees Celsius. The TAR, however, predicted a rise of 1.4 to 5.8 degrees C. In a paper published in the Journal of Climate (October 15, 2002), Thomas Wigley with the National Center for Atmospheric Research and Sarah Raper with the Climatic Research Unit at the University of East Anglia ask the question, “Why are the more recent projections so much larger?”

The authors attempt to quantify how much of the change in projections was due to the new emissions scenarios presented in the IPCC's Special Report on Emissions Scenarios (SRES), and how much was due to differences in the science used in the climate models. To determine this, the authors plugged the emissions scenarios responsible for the high and low ends of the temperature projections into the models used for the TAR, what Wigley and Raper call the “TAR science.”

For the TAR's high-end, coal-intensive scenario, the “CO2 concentrations are remarkably similar” to those used for the high-end scenario in the SAR. The biggest differences between the two high-end scenarios are the assumptions about sulfate aerosol concentrations, which are thought to offset warming. “The large aerosol forcing differences arise because the SRES scenarios account for likely policy responses to sulfur pollution…. This leads to substantially lower SO2 emissions than for the [SAR scenarios].” There are also some differences in methane forcing and tropospheric ozone forcing. This exercise revealed a difference in forcing from changes in the TAR greenhouse gas cycle ranging from 0.5 Watts per meter squared (W/m2) at the low end to 2 W/m2 at the high end.

The differences in science between the two reports refer to changes in the way the models handle complex climate processes. So it is not so much a change in science as a change in modeling. To determine how these changes affect the projections, Wigley and Raper compare the low- and high-end scenarios using SAR science and TAR science. What they found was that “the effects on concentration projections for any given emissions scenario are relatively small.”

In fact, there was actually a reduction in CO2 forcing combined with increased warming. This was due primarily to two things: a change in a parameter that defines the relationship between CO2 concentrations and forcing, and a change in how the thermohaline circulation (THC) was modeled. A slowdown in the THC, for example, would offset some of the projected warming due to higher greenhouse gas concentrations. In the TAR, the THC does not slow down as much as assumed in the SAR.

The result of these exercises reveals that very little of the change in temperature projections is due to changes in scientific understanding or better modeling; it is due almost entirely to different emissions scenarios. “At the low warming limit, TAR science inflates the 1990-2100 warming for the [low-end SAR scenario] by around 34 percent,” say Wigley and Raper. “At the high end, TAR science inflates the 1990-2100 warming for the [high-end SAR scenario] by around 4 percent.” The rest of the high-end alarmist projection comes from changes in the worst-case storyline, which has little basis in reality. A full 79 percent of the change in the high-end projection came from the changed assumptions about sulfate aerosols alone, about which we know very little.