Science

Russia Wants to Burn More Coal

Russian President Vladimir Putin wants his country to make greater use of its coal reserves, which are estimated at 3 trillion tons. But this would make it difficult for Russia to meet its Kyoto target, not to mention that it would eat up all of its available “hot air” emissions credits that many countries are counting on to meet their own targets.

The Kyoto Protocol cannot come into effect unless either the U.S. or Russia ratifies the treaty. Since the U.S. has already said it won’t ratify, it is up to Russia to win the day. At the World Summit on Sustainable Development in Johannesburg, Russian Prime Minister Mikhail Kasyanov said that “ratification will take place in the very near future.” But Putin’s call for exploiting coal puts this commitment into doubt yet again.

Environmentalists are not pleased. “By preparing to burn more coal for its energy needs, Russia aims to free more natural gas for lucrative exports to Western markets,” said Natalia Olefirenko, climate programs coordinator with Greenpeace Russia. “It is a flawed approach, and it amounts to a sell-out of the Russian environment because growing use of coal is likely to adversely affect the country’s ecological balance and cause acid rain” (Asia Times, October 3, 2002).

EU to Miss Kyoto Target

The International Energy Agency’s Chief Economist, Fatih Birol, said that the European Union will not be able to meet its Kyoto targets even with new policies to promote the use of renewable energy.

Even if the EU were able to increase the share of renewable energy used to produce electricity to 30 percent by 2030, it would still fall short of Kyoto. “Fossil fuels will still dominate,” said Birol. “Even with these alternative policies [on renewables] we don’t reach the Kyoto targets.”

In a business-as-usual scenario, the EU’s emissions of carbon dioxide would rise to 3,146 million tons in 2010 and to 3,829 million tons by 2030, compared to the 1990 baseline of 3,080 million tons. With the above-mentioned renewable policies, emissions would be only 4.9 percent less than the business-as-usual scenario in 2010, but still higher than the 1990 baseline. In 2030, emissions would be 19 percent less than business as usual, but still higher overall (Reuters, October 2, 2002).

EU to Chase the Hydrogen Holy Grail

The European Union has announced that it will initiate a major investment project to develop hydrogen power. According to European Commission President Romano Prodi, this effort will be just as important to Europe as the space program was for the United States in the 1960s, but “We expect an [even] better technological fallout,” he said.

The EU plans to spend $2.09 billion from 2003 to 2006 on hydrogen-related renewable energy development. Apparently the EU sees this as a hydrogen race with the U.S. Earlier this year the Bush administration launched the Freedom Car project, a fuel cell research effort, and has asked Congress to provide $150 million in funding.

But there is little reason to believe that such efforts will yield dividends. As noted by the Wall Street Journal (October 16, 2002), “Meanwhile, for all the hope surrounding hydrogen, it is still years, if not decades, away from making significant inroads into the power and transport markets, which currently account for most of the world’s oil and gas use.” Hydrogen is much more expensive than traditional fuels and would require massive infrastructure investments.

What the Journal fails to mention is that hydrogen is not an energy source, but merely a way to store and transport energy. That is because there are no freestanding sources of hydrogen. It must be extracted from water or other compounds, which requires energy, and it must then be re-oxidized to release that energy. Energy is lost at every step, so that when all is said and done, less energy comes out than was used to produce the hydrogen in the first place. The only way to overcome that would be to circumvent the laws of thermodynamics.
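The round-trip loss can be made concrete with a back-of-the-envelope sketch. The efficiency figures below are illustrative assumptions, not numbers from the article; real values vary by technology, but any pair of efficiencies below 100 percent produces the same qualitative result.

```python
# Back-of-the-envelope accounting for hydrogen as an energy carrier.
# The two efficiencies are ASSUMED, illustrative values.

ELECTROLYSIS_EFF = 0.70  # fraction of input energy stored in the hydrogen
FUEL_CELL_EFF = 0.50     # fraction of the hydrogen's energy recovered


def round_trip(energy_in_kwh: float) -> float:
    """Electricity recovered after making hydrogen and re-oxidizing it."""
    stored = energy_in_kwh * ELECTROLYSIS_EFF
    return stored * FUEL_CELL_EFF


recovered = round_trip(100.0)  # start with 100 kWh of primary energy
print(f"{recovered:.0f} kWh recovered from 100 kWh input")
```

Under these assumed figures, only about a third of the input energy comes back out; the rest is lost to the conversion steps, which is the sense in which hydrogen is a carrier rather than a source.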

Solar Power Plant Not So Impressive

Arizona Public Service Co. has begun construction on what will become one of the largest solar power plants in the world and could supply electricity for up to 3,000 homes. Solar power advocates are “oohing” and “aahing” over the power plant, but even this solar power behemoth is not that impressive.

Herb Hayden, director of APS’s solar energy program, acknowledges that the biggest obstacle for solar power is that it costs twice as much as electricity produced from conventional fuels, but he thinks the project will break even. APS had been able to cover most of its costs on other solar projects by selling the power at a premium through its Solar Partners Program. Participating customers, about 3,000 of them, pay a monthly premium (read: donation) for 15-kilowatt-hour blocks of renewable power. “It doesn’t cover the costs, but it shows the product has value,” Hayden said.

The power plant, which will eventually produce 5.5 MW of electricity, will occupy 50 acres near the airport in Prescott, Ariz. By comparison, a small-to-midsize natural gas power plant that produces 250 MW of electricity occupies only a few acres. To equal that output using solar technology would require the use of about 2,300 acres.
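The acreage comparison follows from a simple linear scaling of the article’s own figures, assuming the solar footprint grows in proportion to capacity:

```python
# Scale the Prescott plant's footprint (article figures: 5.5 MW on
# 50 acres) up to the output of a 250 MW gas plant. Assumes acreage
# scales linearly with capacity.

solar_mw, solar_acres = 5.5, 50.0
gas_mw = 250.0

acres_per_mw = solar_acres / solar_mw  # roughly 9 acres per MW
acres_needed = gas_mw * acres_per_mw   # roughly 2,300 acres
print(f"{acres_needed:.0f} acres of solar panels to match 250 MW")
```

The result, about 2,270 acres, matches the article’s rounded figure of 2,300.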

APS currently has seven other solar power projects, which produce a total of 1.7 MW of electricity, and has committed to spending $12 million a year on solar power through 2004. A large part of the cost will be covered by surcharges on electricity bills, from 35 cents per month for residential customers to $39 per month for industrial customers. The surcharge raises about $20 million per year.

Hayden may be forgiven his somewhat misguided optimism since his company has little choice in the matter. In 2000, the Arizona Corporation Commission approved the Environmental Portfolio Standard, which forces investor-owned utilities to generate at least 1 percent of the electricity they sell from renewables. APS must generate 25 MW of electricity from renewable resources. They might as well smile about it.

Melting Sea Ice Causes Arctic Warming

Melting sea ice in the Arctic is claimed to be one of the major signals that man is dangerously warming the planet. Indeed, one of the more amusing episodes of the global warming chronicles was the discovery of open water at the North Pole, which the New York Times claimed had not been seen for 50 million years. It had to retract the story, of course, because open water at the North Pole in summer is not unusual. It is becoming less and less clear, however, whether melting Arctic sea ice is a symptom of warming temperatures or the cause.

A study in the Journal of Climate (September 2002) finds that global warming is not the cause of melting sea ice, but that melting sea ice is causing the warming. Changes in sea ice extent, it turns out, are related to the well-known Arctic Oscillation (AO) index, which is a measure of atmospheric circulation in the Arctic. Much of the observed thinning in the Arctic is due to changing wind patterns that rearrange the sea ice. When the AO index is in its positive phase, sea ice thins and retreats; in its negative phase it thickens and advances. The AO is an entirely natural process for which we have measurements for the last 100 years. Its current values are about the same as they were 100 years ago.

The study concludes, “Intuitively, one might have expected the warming trends in SAT [surface air temperature] to cause the thinning of sea ice, but the results presented in this study imply inverse causality; that is, that the thinning ice has warmed SAT by increasing the heat flux from the ocean.” In other words, a change in the AO index thins the ice, exposing the warmer ocean water to the cold air above and warming it. It turns out that melting Arctic sea ice is responsible for Arctic warming, not the reverse.

Announcement

The Cooler Heads Coalition will hold a screening of The Climate Conflict, on October 7, from 3:15 to 4:15 p.m., in Room 385 of the Russell Senate Office Building. The Climate Conflict is an award-winning Danish documentary about the global warming debate in general and about the role of solar variability in particular. Although it has won six awards in Europe and has been shown on major networks in most European countries, no American network or station has picked it up. Our special screening will be introduced by Dr. Paal Brekke of the European Space Agency, who is also interviewed in the film.

New York Wind Farms a Bad Decision

In August, New York Governor George Pataki announced a $17 million aid package to four private companies to develop wind farms in various parts of the state. But, according to Glenn Schleede, president of Energy Market & Policy Analysis, New Yorkers should be wary of the environmental claims of wind power.

The New York Energy Plan estimates that the eight wind farms, with a combined 250 wind turbines, would produce approximately 900,000 kilowatt-hours (kWh) of electricity per year. But this is a drop in the bucket compared to the state’s total electricity demand. For example, this amount equals 0.58 percent of the total electricity imported into New York in 2000. It is only 15 percent of the energy that will be produced by a single gas-fired combined-cycle plant that is scheduled to come online in Athens, NY in 2003.

The wind power industry often claims that “electricity generated by the wind turbines will displace on a kWh for kWh basis electricity that would be generated by fossil-fuel generating units and any associated emissions.” But that simply is not true, says Schleede. “Such claims are generally exaggerated. For example, they do not take into account that any fossil-fueled generating unit that is kept available to back up the intermittent electricity from the wind farm will be giving off emissions while it is running at less than peak efficiency or in spinning reserve mode. Nor do they take into account the fact that other alternatives for reducing emissions are likely to be far more cost-effective.”

New Yorkers should also be aware that there is growing opposition to wind farms wherever they are proposed, in Europe, Australia and in nearly every state in the U.S., says Schleede. “Opposition is due to a variety of reasons including scenic and property value impairment, noise, bird kills, flicker effects of spinning blades after sunrise and before sunset, potential safety hazards from blade and ice throws, interference with telecommunications, and higher costs of electricity.”

Full Expensing of Capital Will Reduce Carbon Intensity

Several climate-related initiatives pose a serious threat to America’s economic future, according to Marlo Lewis, a senior fellow at the Competitive Enterprise Institute. One such scheme is President Bush’s proposal to expand the Department of Energy’s Voluntary Reporting of Greenhouse Gases program to include the awarding of transferable carbon credits for voluntary greenhouse gas reductions.

Currently, the DOE program is a simple voluntary reporting program with no regulatory significance. But, says Lewis, writing for Tech Central Station (September 10, 2002), awarding credits to companies that report greenhouse gas reductions will corrupt the “politics of U.S. energy policy” and “grow the greenhouse lobby.”

Under Bush’s proposal, companies that begin to comply with Kyoto before it is ratified would be awarded credits that they could sell or use to offset future regulatory obligations. In the absence of a regulatory cap on carbon emissions, the credits are worthless. Only if Kyoto or a similar regulatory program were enacted would the credits yield dividends. “Credit-holders thus acquire cash incentives to support Kyoto, or lobby for its domestic equivalent,” says Lewis.

A credit scheme would be a zero-sum game in which one company’s gain is another’s loss. Every credit awarded in the voluntary early action period is one that won’t be available during the mandatory period. Companies that don’t or can’t “volunteer” to reduce greenhouse gas emissions now will be penalized later under the mandatory cap, which means that the program isn’t really voluntary.

Lewis argues that the Bush administration should stop legitimizing climate alarmism by playing games within the Kyoto framework. Instead, it should embrace non-regulatory, pro-growth policies that would also have the side benefit of reducing carbon intensity. Bush should lower tax barriers to investment by allowing companies to “deduct from current-year revenues, the full cost of capital investment,” says Lewis. Replacing the current system of capital depreciation with full expensing for all types of capital investment would eliminate barriers to economically efficient capital turnover.

Scientists Still Baffled by Surface-Atmosphere Discrepancy

A new study in the September 2002 issue of the Journal of Climate takes another look at the discrepancy in temperature trends between the surface, measured by ground-based thermometers, and the atmosphere (more specifically, the troposphere), measured by satellite-borne instruments, and concludes that we don’t know why there is a discrepancy.

The temperature differential between the surface and the atmosphere is known as the lapse rate. From 1964 to 1979 the lapse rate decreased, meaning that surface and atmospheric temperatures were converging. However, beginning in 1980 the lapse rate began to increase and has continued to do so until the present time. Much of the winter-to-winter lapse rate variability in the high latitudes is dynamically induced, according to the study, but most of the change in lapse rate is over the lower latitudes or tropics.

The researchers, Gabriele C. Hegerl of Duke University and John M. Wallace of the University of Washington, attempted to account for this change by comparing the pattern to the El Niño-Southern Oscillation and other factors, but found that, “Trends in these patterns can account for only a small fraction of the observed trend in lapse rate.”

The researchers then ran the data through a climate model, in both a control run and a run with greenhouse gas and aerosol forcings. The model did a decent job of simulating short-term, monthly changes in lapse rate, but failed to simulate decadal-scale changes; it shows a tighter long-term coupling between surface and atmospheric temperatures than is observed in nature. As this study shows, our understanding of heat transfer between the surface and the atmosphere is still incomplete, and until this problem is resolved there is little hope that climate models can tell us anything about what the climate may be like in 10, 50, or 100 years.

Etc.

  • The September 2002 issue of The Washington Monthly ran an article, reminiscent of the “ozone hole over Kennebunkport” flap under Bush I, about the possible effects of global warming on President George W. Bush’s ranch in Crawford, Texas.

The article begins with an account of British Prime Minister Tony Blair’s visit to the Bush ranch for a meeting with the “cowboy president.” President Bush’s plans to take Blair on a tour of the ranch were ruined by severe thunderstorms and golf-ball-sized hail. The article fingers global warming as the culprit. “But that possibility apparently seemed as remote to Bush as the likelihood that the storm was a sign from God,” it said.

There’s a good reason why this twaddle may not have crossed President Bush’s mind. It turns out that, according to data from the United States Historical Climatology Network, it’s getting cooler around Crawford. The nearest long-term temperature station to the Bush ranch is in Temple, Texas, 34 miles south of Crawford. It shows a cooling trend since 1890, and since 1920 the yearly average temperature has fallen by well over 2 degrees Celsius.

Announcements

The Cooler Heads Coalition will hold a congressional and media briefing by Professor Richard S. Lindzen of MIT on September 30, from noon to 1:30 p.m., in Room 345 of the Cannon House Office Building. Lunch will be provided. Reservations are required. Those wishing to attend should e-mail their name, affiliation, and phone number to mebell@cei.org or telephone Myron Ebell at (202) 331-1010, ext. 216. Dr. Lindzen will be speaking “On the meaning of global warming claims.”

Climate Models Get Polar Temperatures Wrong

A study in the August 28 issue of Geophysical Research Letters finds that there is a serious error in global circulation models when it comes to predicting temperatures in the Earth’s polar regions. The study measured atmospheric temperatures at the stratopause and mesopause regions (the atmospheric layers at about 30 and 50 miles altitude, respectively) at the Earth’s poles. What the researchers found was that atmospheric temperatures at the South Pole are about 40-50 degrees Fahrenheit cooler than model predictions.

“Our results suggest that wintertime warming due to sinking air masses is not as strong as the models have assumed,” according to Chester Gardner, a professor of electrical and computer engineering and coauthor of the study. “But in all fairness, since no one had made these measurements before, modelers have been forced to estimate the values. And, in this case, their estimates were wrong.”

The researchers made temperature measurements from December 1999 to October 2001 using a laser radar system in combination with weather balloon measurements of the troposphere and lower stratosphere. Temperatures were recorded from the surface to an altitude of 70 miles.

It was discovered that at about 30 miles altitude it was much colder than model predictions, said Gardner. “The greatest difference occurred in July, when the measured stratopause temperature was about 0 degrees Fahrenheit, compared to about 40 degrees Fahrenheit predicted by the models.”

Gardner explains the significance of this finding: “After the autumnal equinox in March, radiative processes begin cooling the polar atmosphere. During the long polar night, the atmosphere above Antarctica receives little sunlight and is sealed off by a vortex of winds that spins counterclockwise. This stable polar vortex prevents the transport of warmer air from lower latitudes into the pole, and leads to extreme cooling of the lower stratosphere.”

The only source of heat during the winter is down-welling air masses, which warm the air by compressing it. “Current global circulation models apparently overpredict the amount of down-welling, because they show warmer temperatures than we observed,” said Gardner.

When the researchers plugged their results into the climate model at the National Center for Atmospheric Research in Boulder, Colorado, the difference was significant. “With the reduced down-welling, the predicted mesopause temperature near 60 miles altitude decreased from about minus 120 degrees Fahrenheit to about minus 140 degrees Fahrenheit, in better agreement with our measurements for mid-winter conditions,” Gardner said. “In the stratopause region, the predicted temperature decreased from about 35 degrees Fahrenheit to about 12 degrees Fahrenheit, also in better agreement with our measurements.”

Etc.

  • The Bush Administration’s Climate Action Report 2002 continues to undermine its position on global warming. In a major speech in Mozambique on September 1, just before his appearance at the World Summit, British Prime Minister Tony Blair stated, “They [the Bush Administration] accept the science, but they believe the targets are unachievable without unacceptable economic consequences.”

New York Times Silly Season: Killer Heat Waves!

In its August 13 issue, the New York Times continued its tradition of peddling bogus global warming scare stories during the late-summer silly season. This time the threat isn’t anything dramatic like the North Pole melting. “Heat waves come on subtly, raising summer temperatures just a little higher than normal and then receding,” opined the Times. “But they kill more people in the United States than all other natural disasters combined.”

That’s a striking, but erroneous, statement. The deadliest killer is cold weather, not hot weather, as a number of studies have shown. For example, a study conducted in Europe several years ago found that, “Mean annual heat related mortalities were 304 in North Finland, 445 in Athens, and 40 in London. Cold related mortalities were 2457, 2533, and 3129 respectively.” The researchers argue that, “Our data suggest that any increases in mortality due to increased temperatures would be outweighed by much larger short term declines in cold related mortalities” (British Medical Journal, September 16, 2000).
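Summing the mortality figures quoted from the British Medical Journal study makes the size of the imbalance plain; this is nothing more than a check of the arithmetic:

```python
# Totals for the mortality figures quoted from the British Medical
# Journal study (North Finland, Athens, London, in that order).

heat_deaths = [304, 445, 40]
cold_deaths = [2457, 2533, 3129]

heat_total = sum(heat_deaths)  # 789
cold_total = sum(cold_deaths)  # 8119
print(f"heat: {heat_total}, cold: {cold_total}, "
      f"cold/heat ratio: {cold_total / heat_total:.1f}")
```

Across the three locations in the study, cold kills roughly ten times as many people as heat.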

The Times couldn’t resist making the connection to global warming. But if climate models are correct, global warming will not lead to increases in heat-related deaths, since the majority of the predicted warming would occur during the winter, at night, and in high latitudes. This would actually save lives by lessening the severity of the coldest weather.

The Times, whose fact checkers (if they still have any) are not up to the standard of the National Enquirer’s, also failed to mention that, even though heat waves are indeed deadly, the deaths they cause are also among the most easily preventable. Those who succumb to heat waves have two things in common: they are elderly or in poor health, and they are too poor to afford air conditioning. But if the pro-Kyoto Times gets its way, energy prices will soar and even fewer people will have access to the one thing they need to beat the heat.

Flooding Not Caused by Warming

European leaders, like the big re-insurance companies, have been quick to link the catastrophic flooding this month in central Europe to global warming. And some have then used that to blame the United States. According to an August 15 Reuters story, German Environment Minister “Juergen Trittin, a Green, said higher global temperatures in recent decades had led to rising sea levels and increased rainfall and were at least partially to blame for a bout of unpredictable weather seen in recent years.”

“Mankind shares a real co-responsibility,” Trittin told a news conference. “It is unacceptable for American citizens to pump twice as much carbon dioxide into the atmosphere as Europeans,” he said earlier in an ARD television interview.

However, according to Tim Osborn, a climatologist at the University of East Anglia, global warming is an unlikely culprit in this case. “If this had happened in winter, then it might be reasonable to talk about global warming,” he said. “However, the models suggest that rainfall in summer is likely to remain the same, or perhaps even fall, if climate change continues” (www.bbc.co.uk, August 13, 2002).

What is really going on, says the BBC, is that the jet stream, which determines the rate of progress of weather systems, is in an unusual position and is pushing the weather systems out of their normal paths. “Instead of moving eastwards across the north Atlantic, picking up relatively little water because of the low temperatures at those latitudes, the system crossed into Europe at a lower point, carrying far more moisture as a result.”

Geoff Jenkins from the UK’s Met Office says it is wrong to “jump to conclusions” about global warming in this case. “We have to be careful about ascribing all these changes to global warming, because the Earth is a very variable system already. We do get these events from year to year; they are unusual, but not unprecedented. The weather at any particular point at any particular time is determined in our latitudes by the jet stream, and it just happens that at the moment the jet stream is in a very unusual position.”

Etc.

  • Greenpeace, the radical environmental pressure group, gained major media coverage around the world for before-and-after photos of an Arctic glacier that purport to show dramatic shrinking due to global warming. The photos are of a glacier on Svalbard, a 63,000-square-kilometer archipelago well north of the Arctic Circle. The 2002 photo shows that the glacier has receded a long way since a 1918 photo.

The two photos were published across the world with the statement, “The blame can be put squarely on human activity. Our addiction to fossil fuels releases millions of tons of greenhouse gases into the atmosphere and this is what is causing temperatures to rise and our future to melt before our eyes.”

But according to Professor Ole Humlum, a leading glaciologist in Svalbard, “That glacier had already disappeared in the early 1920s as a result of a perfectly natural rise in temperature that had nothing to do with man-made global warming.” The photos are misleading, he said. “They should have asked the specialist on Svalbard first” (Daily Telegraph, August 17, 2002).

  • Britain’s Environment Minister Michael Meacher, who likes to berate the U.S. for its backward stance on global warming, made a fool of himself in the August 9 issue of the London Sunday Times. Here’s an excerpt from the interview:

Meacher:  “I mean floods in Britain is one we are having to explain, rising sea levels, but in America quite serious things are happening, certainly stronger hurricanes on the east coast which are to do with, what is the name of that hurricane that comes every 2-3 years?”

Interviewer: “They call them different names.”

Meacher: “No, no, there is a name which is the Spanish word for a young child, what is it called?”

Interviewer: “El Niño.”

Meacher: “The El Niño is becoming more frequent and more violent.”

El Niño, of course, is not a hurricane, nor is it becoming more frequent or more violent. The last one began in 1997, five years before the current El Niño, which began this year. That’s the average interval between El Niños. Moreover, the current El Niño is significantly less powerful than the one in 1997. Nor are hurricanes becoming more frequent or more intense.

Discrepancy in Temperature Trends Explained

A new study appearing in Geophysical Research Letters (June 26, 2002) by Richard Lindzen and Constantine Giannitsis at MIT proposes an explanation for the discrepancy between surface and satellite temperature data. It also argues for lower climate sensitivity than is generally assumed.

Scientists have been puzzling over the difference in temperature trends between the surface layer of the atmosphere up to about 5,000 feet and the layer above that known as the troposphere. Since 1979, when scientists began using satellites to take the temperature of the troposphere, it appears that even while the surface has apparently warmed, tropospheric temperatures have remained steady. This is puzzling because greenhouse theory says that the troposphere should warm first, followed by the surface layer.

This scientific controversy even merited special attention from the National Research Council, which assembled a panel to assess the situation. It concluded that both the surface data and the satellite data are correct, but only speculated about the possible causes.

According to the study, “The surface data suggests a warming of about 0.25 degrees C, while the satellite data shows no significant increase.” Because the satellite data began in 1979, however, it has been noted that it is too short to “infer trends from any of the series since the trends estimated depend greatly on the subintervals chosen.” Fortunately, the close agreement between the satellite and weather balloon data, which also measures tropospheric temperatures, allows for a longer time period to be considered.

Looking at the balloon data, the study notes that there was a pronounced jump in atmospheric temperature of about 0.25 degrees C in 1976. The surface followed suit but at a slower pace, taking about ten years to catch up. The delay in the surface data is probably due to the heat capacity of the oceans, which is related to overall climate sensitivity. The delayed response also accounts for the discrepancy between the surface-based temperature data and the satellite data: since the satellite record began in 1979, it missed the 1976 jump, which shows up instead in the slower surface warming.

More importantly, the rate at which the surface temperature caught up with the tropospheric temperature can be used to calculate climate sensitivity. The study finds that the surface temperature will rise about one degree C for a doubling of carbon dioxide levels. This is significantly lower than predictions made by the United Nations Intergovernmental Panel on Climate Change, and provides empirical support to climate skeptics who argue that climate sensitivity has been significantly overstated.

The study concludes, “The longer series suggests that the increase in tropospheric temperature occurred rather abruptly around 1976, three years before microwave observations began. The suddenness of the tropospheric temperature change seems distinctly unlike what one expects from greenhouse warming, while the relative rapidity with which the surface temperature caught up with the troposphere, less than about 10 years, suggests low climate sensitivity for a wide range of choices for thermocline diffusion.”

New York Times Halfway Corrects Alaska Story

On June 16, the New York Times ran a story on the front page that claimed that average temperatures in Alaska had risen 7 degrees over the last thirty years. The figure was later repeated in a June 24 Times column written by Bob Herbert.

The article was criticized by scientists who keep track of Alaska’s temperatures, prompting the paper to correct the story on July 11. The Times wrote, “A front-page article on June 16 about climate change in Alaska misstated the rise in temperatures there in the last 30 years. … According to an assessment by the University of Alaska’s Center for Global Change and Arctic System Research, the annual mean temperature has risen 5.4 degrees Fahrenheit over 30 years, not 7 degrees.”

But according to the Alaska Climate Research Center, this figure is still too high. Using data from temperature stations in Barrow, Fairbanks, Anchorage, and Nome, which represent a cross-section of Alaska from north to south, the researchers found that “the value of 5.4 degrees F [is] too great by a factor of two for the 1971 to 2000 period, the last 30 years.” These stations were chosen because they are professionally maintained and yield high-quality, long-term data.

Australia’s CSIRO Plays Games with Sea-Level Numbers

The Commonwealth Scientific & Industrial Research Organization (CSIRO) in Australia has published a brochure on sea-level rise in support of the Intergovernmental Panel on Climate Change. The brochure does make some good points, but it also presents a phony picture of sea-level changes in Australia.

It notes, for instance, “No acceleration [in sea-level rise] has been detected during the 20th Century.” It also points out that, “If greenhouse gases are stabilized today, sea-level rise will continue for hundreds of years.”

When it comes to describing sea-level change in Australia, however, CSIRO presents only a small amount of the available data, which gives an erroneous picture of the true changes. The brochure states, “In Australia the rise in sea-level relative to the land (June 2001) at Fremantle is 1.38 mm/year with more than 90 years of data, and at Sydney 0.86 mm/year from 82 years of data. (Corrected for land motion, the rate of sea-level rise at these two locations is estimated to be about 1.6 and 1.2 mm/year, respectively.)”

John Daly, who runs the Still Waiting for Greenhouse (www.john-daly.com) website, has pointed out that there are 23 long term tide gauges in Australia, not just two. Using all of the data, the National Tidal Facility of Australia has determined that there has been a sea-level rise of only 0.3 mm/year. Moreover, the two tide gauges cited by CSIRO are outliers due to land subsidence caused by ground water withdrawal.

CSIRO may have also erred in its calculation of land motion at the Fremantle site, says Daly. According to data from Perth, which is 20 miles away, the land has been subsiding at a rate of 2.08 mm/year since 1996. If that rate of land subsidence has been going on for some time, it would mean that sea level at Fremantle has actually been falling by about 0.7 mm/year. Further support for this can be found by examining tide gauges on either side of Fremantle, Bunbury to the south and Geraldton to the north. These show a sea-level change of +0.4 mm/year and -0.95 mm/year, respectively.
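Daly’s figure is just the tide-gauge trend corrected for subsidence; the arithmetic is a single subtraction, with the Perth subsidence rate assumed (as Daly does) to apply at Fremantle as well:

```python
# Absolute sea-level change = relative change measured by the tide
# gauge minus the rate at which the land itself is sinking. The
# subsidence rate is the Perth value, ASSUMED to apply at Fremantle.

relative_rise = 1.38    # mm/year, Fremantle tide gauge (CSIRO brochure)
land_subsidence = 2.08  # mm/year, measured at Perth since 1996

absolute_change = relative_rise - land_subsidence
print(f"absolute sea-level change: {absolute_change:.1f} mm/year")
```

A negative result means the sea surface is falling relative to a fixed reference, which is the basis for the -0.7 mm/year figure cited above.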

In a presentation to the Cooler Heads Coalition in May, Daly noted the importance of Australia in determining rates of sea-level change since it is fronted by three oceans. The data from Australia suggest that there is little to no sea-level change. The CSIRO brochure is available at www.marine.csiro.au.

Etc.

Thursday, July 25, is the fifth anniversary of Senate passage of the Byrd-Hagel Resolution by a 95 to 0 vote. Exercising its constitutional role of advising the president on international treaties, the Senate warned then-President Clinton not to negotiate a protocol to the U.N. Framework Convention on Climate Change that would significantly harm the U.S. economy and that did not include significant participation by developing nations. The U.S. negotiating team took this position to Kyoto in late November 1997, but then-Vice President Al Gore flew to Kyoto in early December and successfully demanded that these two conditions be abandoned. Negotiations on the Kyoto Protocol were concluded on December 11, 1997.

More CO2 Helps Plants, Not Bugs

Two studies appearing in the July issue of Global Change Biology have concluded that increasing CO2 levels will have negative effects on insects, but positive effects for plants. One report by Peter Stiling and colleagues found that, “More herbivores die of host plant-induced death in elevated CO2 than in ambient CO2,” and that these plant-eating insects are more likely to be attacked because they have to feed for longer periods to get the nutrients they need.

A second study by David Stacey and Mark Fellowes concluded that there is now “empirical evidence that changes in host plant quality by increased levels of CO2 can alter the outcome of interspecific competition among insect herbivores.” As Robert Balling of Arizona State University points out, both studies show that plants would benefit from higher CO2 levels. Balling states, “These two articles add to the evidence that elevated CO2 will benefit plants without giving herbivores any competition advantage over the plant.”

Climate Models Fail to Reproduce Natural Temperature Fluctuations

Climate models that serve as the basis for global warming predictions fail to reproduce correctly the fluctuations in atmospheric temperatures over time scales of months and years, according to new research appearing in the July 8 issue of Physical Review Letters.

The study explains that large-scale atmospheric and oceanic dynamics are solved in the models using highly sophisticated numerical solutions, but that there are also “subgrid-scale processes” that are too small to be modeled. These are handled by “parameterization schemes,” which amount to little more than arbitrarily assigning a value to the particular process being considered. Some of these subgrid-scale processes include, surprisingly enough, the roles of various greenhouse gases, including carbon dioxide, and the effects of aerosols.

In earlier research, the authors discovered a universal mathematical relationship, known as a scaling law, which describes the correlations between temperature fluctuations. What they found was that temperature variations from their average values exhibit persistence that decays at a well-defined rate. “The range of this persistence law exceeds ten years, and there is no evidence for a breakdown of the law at even larger timescales,” according to the study.

Using this scaling law, the researchers tested seven general circulation models, including the U.S.-based model at the National Center for Atmospheric Research, against historical atmospheric temperature data from six representative sites. What they found was that the models “fail to reproduce the universal scaling behavior observed in the real temperature records.”

The researchers explain that, “It is possible that the lack of long-term persistence is due to the fact that certain forcings such as volcanic eruptions or solar fluctuations have not been incorporated in the models.” But they cannot “rule out that systematic model deficiencies (such as the use of equivalent CO2 forcing to account for all other greenhouse gases or inaccurate spatial and temporal distributions of sulfate aerosols) prevent the [climate models] from correctly simulating the natural variability of the atmosphere.”

They conclude, “Since the models underestimate the long-range persistence of the atmosphere and overestimate the trends, our analysis suggests that the anticipated global warming is also overestimated by the models.”

More National Assessment Shenanigans

The National Assessment (NACC), a Clinton administration report on the possible impacts of climate change that has resurfaced in the Bush administration, has come under repeated and heavy criticism due to its sloppy research, absurd computer modeling, and political bias. Now it appears that the NACC involves scientific fraud as well. According to Tech Central Station (www.techcentralstation.com, June 28, 2002), the U.S. Global Change Research Program altered the color scheme of the graphics used in the assessment to hide large discrepancies between the two models, one from the Canadian Climate Centre and the other from the UK-based Hadley Centre, that were used to make 100-year forecasts.

When the results from the two models were completed, they showed very different trends for future warming. Typically, when modelers show their results graphically, they use colors on a map to show temperature variations. The color scale goes from dark blue to lighter blues, which represent cooling, to green, yellow, orange and red to show progressively warmer temperatures. Dark blue sections represent about a 5 degree cooling, whereas red represents about a 15 degree warming.

Putting the two model results side by side, one could see that there were huge discrepancies between them. Several of the technical reviewers commented that the differences between the models cast doubt on the quality of the predictions. To “fix” the problem, the USGCRP changed the color scheme so that the scale used was mostly red and orange with a little blue at the bottom. Indeed, the orange extended all the way into the cooling part of the scale. The result was to eliminate blues, greens and yellows from the maps, which became nearly all orange or red, thereby obscuring the differences between the model results. The before and after pictures are available at the Tech Central Station website. A more detailed discussion of the changes can be found at www.john-daly.com.

Etc.

At a press conference held by Environmental Media Services in Washington on July 2, it was claimed that U.S. businesses could be harmed by the Bush Administration's decision to reject the Kyoto Protocol. John Palmisano, whose company Evolution Markets advises corporations on how to trade and reduce emissions, asserted that even if global warming is not true, business will benefit through emissions trading because, well, because they will have the ability to track emissions. Palmisano was previously a leading promoter of emissions trading at Enron Corporation.

Joseph Romm, of the Center for Energy and Climate Solutions and a former Assistant Secretary of Energy in the Clinton Administration, said: “There is no type of business in this country that cannot make a profit through reducing carbon emissions.” Romm claimed that consumers prefer to buy products from “cool companies”, a sentiment echoed by Michael Marvin of the Business Council for Sustainable Energy. Marvin claimed that both consumers and shareholders want emissions reductions.

Environmental Media Services passed out copies of a recent report by the Conference Board titled, “Global Climate Change: Fact or Fiction? It Doesn't Matter, the Issue is Here to Stay.” For those interested in basing public policy or investment decisions on fantasy, the report may be found at http://www.conference-board.org/ea_reports/EA_23.pdf.

Announcement

The Cato Institute is sponsoring a lunchtime briefing on Global Warming: Rational Science, Rational Policy on July 19 in Room B-338 of the Rayburn House Office Building. The speakers will be Patrick Michaels of the University of Virginia and Cato and Jerry Taylor of Cato. Registration is required and may be completed by calling Julie Cullifer at (202) 789-5229 or online at www.cato.org/events.

What Constitutes “Dangerous Anthropogenic Interference?”

The stated goal of the United Nations Framework Convention on Climate Change is to stabilize greenhouse gas concentrations at levels that avoid dangerous anthropogenic interference with the atmosphere. As President George W. Bush has rightly pointed out, the Kyoto targets “were arbitrary and not based upon science” and “no one can say with any certainty what constitutes a dangerous level of warming, and therefore what level must be avoided.”

Brian C. O'Neill of the Watson Institute for International Studies and the Center for Environmental Studies at Brown University, and Michael Oppenheimer of the Woodrow Wilson School of Public and International Affairs and the Department of Geosciences at Princeton University, attempt to answer that question in a Policy Forum paper in the June 14 issue of Science.

The authors admit that this is a difficult task that must “ultimately involve a mixture of scientific, economic, political, ethical, and cultural considerations, among others. In addition, the links among emissions, greenhouse gas concentrations, climate change, and impacts are uncertain. Furthermore, what might be considered dangerous could change over time.”

The Intergovernmental Panel on Climate Change has identified two criteria for defining “dangerous” interference: “warming involving risk to unique and threatened systems and warming engendering a risk of large-scale discontinuities in the climate system.”

The authors offer three risks that would fall under these criteria: (1) “Large-scale eradication of coral reef systems,” (2) the “disintegration of the Western Antarctic Ice Sheet (WAIS),” and (3) “the weakening or shutdown of the density-driven, large scale circulation of the oceans (thermohaline circulation or THC).”

To prevent severe damage to reef systems, the authors recommend a long-term target of one degree C above 1990 global temperatures. To protect the WAIS would require a limit of two degrees C above 1990 temperatures. And to stop the shutdown of the THC would require a limit of three degrees C warming over the next 100 years.

None of these cases is particularly convincing, however. As noted in a critique of this study by The Center for the Study of Carbon Dioxide and Global Change (www.co2science.org), the evidence shows that coral reefs are very resilient in the face of temperature change. Coral reefs have survived several past interglacial periods that experienced warmer temperatures than now. The previous interglacial, for example, was a full three degrees C warmer than the current one, yet coral reefs survived.

Recent studies have cast serious doubt on the likelihood of a collapse of the WAIS or the shutdown of the THC. A June 19 study in Science found that the WAIS is actually thickening. A January 31 study in Nature found that the Antarctic has been cooling since 1966. And another study in the February 21 issue of Nature found that the palaeoclimate data show that abrupt changes to the THC are not characteristic of the current interglacial, but are characteristic of the latest glaciation. In other words, it is unlikely that global warming will cause the THC to shut down.

ENSO Not Linked to Global Warming

With El Niño returning to the tropical Pacific, it won't be long before we begin hearing about the ill effects of global warming and its possible link to this large-scale natural weather phenomenon. A study in the February 2002 issue of Paleoceanography shows that there is no connection between global warming and the frequency or strength of El Niño.

Using palaeoclimate data, the researchers reconstructed a record of El Niño activity, the latter 400 years of the record being the most reliable. The data show that El Niño activity “was more frequent after 1980, lower in the 1940-1975 epoch, and again more frequent around the beginning of the 1900s.” It also showed that “the 1860-1880 period was relatively quiescent,” while “the 1820-1860 period was also a period of relatively vigorous ENSO [El Niño Southern Oscillation] activity.” Indeed, the 1820-1860 period was “similar to [that] observed in the past two decades.”

The study concludes, “As these observations extend at least into the preindustrial period, attribution of the unusually ENSO-rich past few decades may lie in part with natural variability.” This is a bit of an understatement given the evidence. It looks like El Niño activity is likely due entirely to natural variability.

At a briefing on Capitol Hill on October 5, Danish statistician Bjorn Lomborg, once a member of Greenpeace, argued that predictions of the world heading for ruin are wrong. In 1997 he set out to challenge acclaimed economist Julian Simon, who refuted environmentalist claims that the world was running out of resources. Lomborg discovered that the data on the whole supported Simon. Lomborg's new book, “The Skeptical Environmentalist,” is a compendium of graphs, charts and statistics showing that the earth's environment is steadily improving.

His book asserts, among other things, that the global warming issue is overblown. In short, he attests, “Things are getting better.” In his presentation, Lomborg said that global warming is a real issue, but suggested that the prime danger is the Kyoto Treaty, which he cites as a grand waste of money. He said, “Essentially Kyoto will do very little to change global warming. On the other hand Kyoto will be very expensive. It will cost anywhere from $150-350 billion a year, and that's a lot of money when compared to the total global aid of $50 billion a year. Basically, just for one year of Kyoto, we could give clean drinking water and sanitation to every person on earth. This would avoid 2 million deaths a year, and keep half a billion people from getting seriously ill each year.”

Environmentalists tend to be ecologically pessimistic about the future. Veterans of the environmental movement such as Paul Ehrlich of Stanford University and Lester Brown of the Worldwatch Institute have formed a litany of fears. These fears include depletion of natural resources, ever-growing population, extinction of species and pollution of the planet’s air and water. However, Lomborg’s approach is decidedly different. He says, “We must remove our myths about an imminent doomsday and remember we do not have to act in total desperation. Essential information is necessary to making the best possible decisions. Statistics tell you how the world is. Resources have become even more abundant and things are likely to progress in the future.”

The briefing was sponsored by the Cooler Heads Coalition, made up of 23 non-profit organizations that work on global warming issues.