February 2009

Who Cares About the Consumer?

by Iain Murray on February 13, 2009


Electricity consumers beware! The so-called stimulus bill includes a provision for something called “decoupling.” E&E Daily reports:

Also included in the final version is a requirement that governors who want additional state energy efficiency grants ensure that their state regulators guarantee revenue to utilities to support efficiency programs.

State regulators and consumer advocates strongly opposed the provision, saying it ties regulators’ hands and is not the best tool to promote efficiency.

The National Association of Regulatory Utility Commissioners said many regulators cannot assure that “decoupling” requirements will be met. “These ambiguous conditions will create confusion and legal uncertainty and will likely delay or preclude the release of these critical funds,” NARUC said in a statement. “This benefits neither the States, the utilities, nor, most importantly, the citizens they serve.”

“Decoupling” is a mystifying-sounding name for an economically terrifying concept. This is how it is described in government/regulatory jargon:

In order to motivate utilities to consider all the options when planning and making resource decisions on how to meet their customers’ needs, the sales-revenue link in current rate design must be broken. Breaking that link between the utility’s commodity sales and revenues, removes both the incentive to increase electricity sales and the disincentive to run effective energy efficiency programs or invest in other activities that may reduce load. Decision-making then refocuses on making least-cost investments to deliver reliable energy services to customers even when such investments reduce throughput. The result is a better alignment of shareholder and customer interests to provide for more economically and environmentally efficient resource decisions.

Now, in English: the laws of supply and demand mean that if the quantity demanded goes down, you sell less of the product you supply. In energy supply terms, this means that if conservation works, energy utilities see their profits decline, because in general they are regulated so tightly that they cannot raise their prices, which is the usual response to declining demand. Therefore, if there is a policy goal of increasing energy conservation, then utilities are likely to stand in the way, because their profits depend on selling more energy; they are unlikely to install technologies that reduce the need for energy, for example. Accordingly, the link between quantity sold and profits must be broken, or “decoupled.” This is normally done by regulating rates such that if more energy is sold, the marginal rate goes down and, if less is sold, the marginal rate goes up.
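To make the mechanics concrete, here is a minimal sketch of a stylized decoupling “true-up” (all figures hypothetical): the regulator fixes the utility’s allowed revenue, and the per-kWh rate is recalculated from actual sales so that revenue lands on the target no matter how much electricity is sold.

```python
# A stylized revenue-decoupling "true-up" (hypothetical numbers).
# The regulator fixes the utility's allowed revenue; the per-kWh rate
# is then back-calculated from actual sales, so selling less energy
# simply raises the rate until the revenue target is restored.

ALLOWED_REVENUE = 100_000_000.0   # regulator-approved revenue requirement ($)
FORECAST_SALES = 1_000_000_000.0  # forecast annual sales (kWh)

base_rate = ALLOWED_REVENUE / FORECAST_SALES  # $0.10/kWh if the forecast holds

def trued_up_rate(actual_sales_kwh: float) -> float:
    """Rate that recovers the allowed revenue given actual sales."""
    return ALLOWED_REVENUE / actual_sales_kwh

for sales in (1_000_000_000.0, 950_000_000.0, 900_000_000.0):
    rate = trued_up_rate(sales)
    print(f"sales {sales / 1e9:.2f} TWh -> rate ${rate:.4f}/kWh, "
          f"revenue ${sales * rate:,.0f}")

# If conservation cuts sales 10%, the rate rises about 11% and revenue is
# unchanged: the consumer pays more per kWh for using less electricity.
```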

Now, to some this may sound like supply and demand at work, but it is actually a market manipulation aimed at achieving a policy goal. In fact, it most resembles a supply-side reform designed by someone who doesn’t understand supply-side economics. The utility remains regulated and the incentive structure is designed such that the utility is more inclined to respond to the regulator rather than the consumer. When profits are essentially guaranteed at a certain level, the utility will be more likely to spend money pleasing the regulator and delivering service improvements to that body than to the consumer. The consumer may end up paying more money for less electricity and the utility and regulator will both be happy. The dangers here are obvious; insulating the supplier from the consumer is a terrible idea.

Here is a useful paper from the Electricity Consumers Resource Council that raises several further objections to decoupling, which it says is a blunt instrument. They are:

1. Decoupling Promotes Mediocrity In The Management Of A Utility.
2. Decoupling Shifts Significant Business Risk From Shareholders To Consumers With Only Dubious Opportunities For Net Increases In Consumer Benefits.
3. Decoupling Eliminates A Utility’s Financial Incentive To Support Economic Development Within Its Franchise Area. This Includes The Incentive To Support The Well Being Of Manufacturers And Their Workforce.
4. Revenue Decoupling Mechanisms Tend To Address ‘Lost Revenues’ And Not The Real Issue, Which Is Lost Profits.
5. The First And Most Important Step Regulators Can Take To Promote Energy Efficiency Is To Send The Proper Price Signals To Each Customer Class.
6. Several States Have Successfully Used Alternative Entities—Including Government Agencies—For Unselling Energy. This Creates An Entity Whose Sole Mission Is To Promote Energy Efficiency, And Retains A Separate Entity Whose Responsibility Is To Efficiently Sell And Deliver Energy.

(Not sure about that last one, but if there’s a policy goal to reduce energy consumption, that’s certainly a better way to go about it than decoupling.)

A true supply-side reform would actually reduce regulation to the basics (reasonable safety requirements, etc.) and thereby not only allow but encourage the best conservation measure of all: demand-based pricing. This would allow rates to increase and decrease not according to some bureaucrat’s assessment of whether a policy goal is being met, but hourly, according to whether the system is being over- or under-used. Less energy would be consumed at peak times, reducing the need for back-up generation, and more would be used at off-peak times, reducing the amount of energy wasted then. Overall, as long as consumers respond to the price signal, they will probably use less electricity but also see their bills drop, while the utilities will save through lower production costs. Decoupling-style rate regulation actually stands in the way of this win-win outcome.
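By way of contrast, here is an equally minimal sketch of demand-based pricing, again with hypothetical load and price figures: the hourly rate tracks how heavily the system is loaded, so peak-hour use costs more and off-peak use costs less, with no policy target anywhere in the loop. The quadratic price curve is an illustrative assumption, not a claim about how any actual utility prices power.

```python
# Stylized demand-based (real-time) pricing: the hourly price scales with
# system utilization, so consumers see expensive peaks and cheap off-peak
# hours. All numbers, and the quadratic curve, are illustrative assumptions.

BASE_PRICE = 0.08      # $/kWh when the grid is lightly loaded
PEAK_MULTIPLIER = 4.0  # price multiple when the system is fully loaded

def hourly_price(load_mw: float, capacity_mw: float) -> float:
    """Price rises from BASE_PRICE toward BASE_PRICE * PEAK_MULTIPLIER
    as utilization approaches 100%."""
    utilization = min(load_mw / capacity_mw, 1.0)
    return BASE_PRICE * (1.0 + (PEAK_MULTIPLIER - 1.0) * utilization ** 2)

CAPACITY_MW = 10_000.0
for hour, load in [("3 a.m.", 4_000.0), ("noon", 7_500.0), ("6 p.m. peak", 9_800.0)]:
    print(f"{hour}: load {load:,.0f} MW -> ${hourly_price(load, CAPACITY_MW):.3f}/kWh")
```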


    Tom Peterson

I’ve catalogued a lot of evidence that the Center for Climate Strategies, the so-called unbiased “technical consultant” to states for their global warming policy commissions, is totally controlled by the Pennsylvania Environmental Council, an alarmist advocacy group. This is despite denials by CCS’s executive director, Tom Peterson. Well, in the research for my American Spectator piece yesterday (which explains how CCS and PEC are now running away from one another — looks bad, ya know) I discovered yet another clear statement that CCS is totally controlled by PEC.

    It turns out that last year PEC enlisted an executive search agency to find them a new president and CEO. The job listing (PDF) explained as follows with regard to the position’s responsibilities:

    The President & CEO is responsible for managing the overall staff of 27, including four affiliated entities – Enterprising Environmental Solutions, Inc….The position has 8 direct reports, (including) Executive Director of Enterprising Environmental Solutions, Inc., chief operating officer, (etc.)…

Enterprising Environmental Solutions is the nonprofit that PEC created and totally controls (see its tax returns), and it is where CCS resides as a policy center. So…not only was there board control and sharing of staff, but in this clear statement the executive director of EESI/CCS was to report directly to the president of PEC. A reminder of what Peterson told me back in April 2007:

    “The idea that we have advocates for PEC working on the North Carolina project is incorrect,” he said. “(EESI) does not have an advocacy mission, and it doesn’t have an advocacy history.”

    By the way, EESI’s Web site has been taken down — I wonder why? Were there some legal problems with this arrangement?

    Last night in Hickory, N.C., in a forum co-sponsored by the John Locke Foundation and the Reese Institute for Conservation of Natural Resources, atmospheric scientist John Christy debated William Schlesinger, former dean of the Nicholas School of the Environment at Duke University. It was a skeptic vs. alarmist smackdown, and the local newspaper of record, the Daily Record, thinks that Christy may have prevailed (see sidebar):

    An informal, unscientific survey revealed that many attendees, regardless of age, think global warming is overblown and people should not panic about the future.

    Schlesinger said most people don’t want to recognize the problem, that they hope it will just go away. “It won’t,” he said. “It will only get worse.”

“Show me the evidence” was Christy’s mantra. The data doesn’t justify the gloom and doom, he said. Wishful thinking or not, a lot of people who attended the forum agree with Christy.

    My colleague at the Locke Foundation, vice president for research (and debate timekeeper) Roy Cordato, had a much stronger take with which I agree, as you might expect:

    …Schlesinger said that he was not going to discuss the science. He then went directly to rattling off scary scenarios about the future. So about two thirds of his talk was scare mongering with no actual defense of the hypothesis that human induced catastrophic global warming is in the process of occurring. What is interesting is that in “skipping over the science” he flipped through a number of slides that he had prepared to use including the now infamous and discredited “hockey stick” graph showing 900 years of no climate change and the last 100 years of dramatic warming.

    You can watch the whole 75-minute debate on the Locke Foundation’s blog. It’s also posted in eight pieces on YouTube.

    I just watched the Energy & Commerce Subcommittee hearing on “The Climate Crisis: National Security, Public Health, and Economic Threats.”

Committee rules allow the minority one-third of the witnesses. Originally, there were to be four majority witnesses, which works out to only one minority witness, or one-fifth of the panel (a second minority witness would have made two of the five slots, two-fifths, more than one-third). However, when Chairman Markey learned that Dr. Patrick Michaels of the Cato Institute was to be the minority witness, he added a fifth majority witness, Prof. Daniel Schrag of Harvard University. So the deck was stacked against Michaels 5 to 1.

    However, even that was not enough to satisfy Rep. Jay Inslee (D-WA). He attacked Michaels personally, accusing him of not being “forthright” with the Committee, trying to “pull a fast one,” and treating the Members like “chumps.” Inslee demanded to know why it was even necessary to have witnesses like Michaels on the panel, when it’s so obvious that global warming is bad and nothing could be more costly than inaction on climate change.

    Michaels’s oral testimony may be summarized as follows: (1) Forecasts of the impacts of climate change on national security, public health, and the economy cannot be better than the temperature projections on which they are based; (2) the 21 models used in the IPCC’s mid-range greenhouse gas emissions scenario project a constant, not accelerating rate of global warming through the 21st century; (3) the observed rate of temperature change over the past 20 years has been remarkably constant; (4) however, the observed rate is at or below the low-end of the range forecast by the models; (5) therefore, the models are too sensitive and likely over-predict future warming; (6) hence, also, impact assessments based on those model projections are unlikely to be correct.

    In his fulmination, Inslee claimed (a) that Michaels compared apples (observed temperatures) to oranges (model projections of future warming), and (b) that global warming is accelerating. He is wrong on both counts. Michaels compared observed temperatures with model projections over the same period. Finding a poor fit, he drew the only reasonable conclusion: Model projections of future warming are also likely to be erroneous. Also, global warming is not accelerating. Since 1976, the observed rate has been about 0.17 degrees Celsius per decade. So, on the basis of two falsehoods, Inslee essentially called Michaels a liar.
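Michaels’s test is simple enough to sketch: take the observed trend and ask whether it falls within the range of model-projected trends over the same period. In the illustration below, the 0.17 °C/decade figure comes from the post, while the model trend values are made-up stand-ins, not actual IPCC numbers.

```python
# Sketch of the apples-to-apples comparison Michaels describes: observed
# warming trend vs. model-projected trends over the SAME period. The
# observed rate (~0.17 C/decade since 1976) is from the post; the model
# ensemble below is illustrative, not actual IPCC model output.

observed_trend = 0.17                           # deg C per decade
model_trends = [0.18, 0.21, 0.25, 0.30, 0.34]   # hypothetical ensemble

low, high = min(model_trends), max(model_trends)
if observed_trend < low:
    verdict = "at or below the bottom of the model range"
elif observed_trend <= high:
    verdict = "within the model range"
else:
    verdict = "above the entire model range"

print(f"observed {observed_trend} C/decade vs. models [{low}, {high}]: {verdict}")

# If observations sit at or below the bottom of the range, the models are
# likely too sensitive, and so are impact forecasts built on their output.
```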

Then, instead of letting Michaels respond, Inslee asked for commentary by Prof. Schrag. This left Michaels exactly 15 seconds to respond to 4-plus minutes of verbiage from Inslee and Schrag.

The contrast between Dr. Michaels’s calm, clear, patient exposition of scientific basics and Inslee’s rude, arrogant intolerance of dissenting views could not have been starker. Global warming zealotry is poisoning the atmosphere of public discourse–that is probably the main conclusion Web viewers will draw from this hearing.

    In a letter dated 5 February 2009, 17 state attorneys general (AGs) plus three other non-federal officials urge EPA Administrator Lisa Jackson to respond to the Supreme Court case of Massachusetts v. EPA (2007) by issuing a finding that greenhouse gas (GHG) emissions from new motor vehicles cause or contribute to “air pollution” that may reasonably be anticipated to endanger public health and welfare.

    To explain why EPA should make an endangerment finding, the AGs quote from EPA’s July 2008 Advanced Notice of Proposed Rulemaking (ANPR): “The IPCC projects with virtual certainty (i.e., a greater than 99% likelihood) declining air quality in cities due to warmer days and nights, and fewer cold days and nights, and/or more frequent hot days and nights over most land areas, including the U.S.” In the ANPR, EPA goes on to say that the increase in air pollution from global warming will lead to “increases in regional ozone pollution, with associated risks for respiratory infection, aggravation of asthma, and potential premature death, especially for people in susceptible groups.”

    This chain of reasoning flies in the face of history and public policy reality.

As American Enterprise Institute scholar Joel Schwartz documents, air quality in U.S. cities has improved steadily over the past three decades even as urban air temperatures have increased. Nobody should know this better than EPA, because EPA deserves much of the credit and regularly publishes the relevant data. From 1980 to 2006, emissions of the criteria pollutants and their precursors fell by the following amounts: lead, 97%; oxides of nitrogen, 33%; volatile organic compounds, 52%; sulfur dioxide, 47%; carbon monoxide, 50%; PM10, 28%; and PM2.5, 31%. As a consequence, ambient concentrations of these pollutants also declined. From 1980 to 2007, air pollution levels fell by the following amounts: nitrogen dioxide, 43%; sulfur dioxide, 68%; and ground-level ozone, 21%.

    More importantly, under existing regulatory requirements, air pollution emissions and concentrations will continue to decline despite potential climate change. Schwartz explains:

EPA’s Clean Air Interstate Rule (CAIR) requires power plant SO2 and NOX emissions to decline more than 70% and 60%, respectively, during the next two decades, when compared with 2003 emissions. This is a cap on total emissions from power plants that remains in place independent of growth in electricity demand. [Note: in July 2008, the D.C. Circuit Court of Appeals overturned CAIR, but whatever EPA puts in its place will likely be even more stringent.]

    Recently implemented requirements for new automobiles and diesel trucks, and upcoming standards for new off-road diesel equipment will eliminate more than 80% of their VOC, NOX, and soot emissions during the next few decades, even after accounting for growth in total driving. Dozens of other federal and state requirements will eliminate most remaining emissions from other sources of air pollution.

    We may “reasonably anticipate” that in 20 years most U.S. air pollution problems will have been solved, and that by mid-century significant air pollution will exist only in history books.

    So the AGs are advising Jackson to act on the basis of bogus “science” that EPA parroted from the IPCC without due diligence.

    The AGs are a notoriously unreliable bunch. When litigating Massachusetts v. EPA, they said that the case posed no risks to the U.S. economy because it solely concerned one subset of mobile emission sources (new motor vehicles) under one provision of the Clean Air Act (§202), which requires EPA to consider compliance costs when setting emission standards. The only significant consequence of promulgating first-ever GHG emission standards for new cars and trucks, they said, was that we’d all get better gas mileage and suffer less pain at the pump.

    In reality, as EPA’s ANPR and numerous comments thereon reveal, the Clean Air Act is a highly interconnected statute. Setting GHG emission standards for new motor vehicles would initiate a regulatory cascade through multiple provisions of the Act, exposing 1.2 million previously unregulated buildings and facilities to costly and time-consuming regulation under the Act’s Prevention of Significant Deterioration (PSD) pre-construction permitting program, and millions of such sources to pointless paperwork burdens and emission fees under the Title V operating permits program.

    Nor is that all. The endangerment finding that would compel EPA to establish GHG emission standards for new motor vehicles would also set a precedent for establishing National Ambient Air Quality Standards (NAAQS) for greenhouse gases. As I explain here, this could lead to the promulgation of NAAQS for carbon dioxide and other GHGs that the United States could not attain even through outright de-industrialization.

    In short, if Ms. Jackson acts on the AGs’ advice, she would start a process that could turn the Clean Air Act into a gigantic de-stimulus package.

    Environmentalist activists must certainly mean well.  But, at times, some are so silly that all you can do is laugh.  Consider a recent Tree Hugger post comparing bottled water to orange juice and its lament about carbon footprints!  The post points out that orange juice has an even bigger footprint than—brace yourself—Fiji water! Fiji water is supposedly the world’s “most wasteful” water because it is shipped across continents.

    Alas, if you don’t live in a community that grows oranges organically for locally produced juice, the carbon footprint is just unacceptable. In fact, the post concludes, all citrus products are “an imported luxury” that responsible environmentalists shouldn’t be drinking every day!

What the greens have discovered here is no great revelation. The reality is: Everything in life has a carbon footprint! And bottled water probably has one of the lower ones. Unfortunately for so many well-intentioned greens, having a light carbon footprint requires considerable self-denial. If orange juice is so bad, just consider the carbon footprint of the computers used to produce Tree Hugger posts, the coffee consumed (do they really need coffee anyway?) while writing such posts, and yes, even that morning McMuffin!

Fortunately for market advocates, we understand the value of globalization—the opportunity to eat bananas from Brazil, drink wine from Australia, and, yes, even consume water all the way from Fiji. We recognize that a better world is one in which more people have access to such goods, so they can eat well, heat their homes, and live well. Drinking orange juice, water, or whatever, from the places where it is most efficiently produced around the globe is a blessing, not a curse. In fact, CEI has shown many times over that the best approach to climate change is not to get wrapped up in such foolish worries or the policies they produce, like bans on bottled water!

    In a report titled “Beyond Transport Policy,” the European Environment Agency (EEA) bemoans the fact that European transport sector CO2 emissions increased by 26% during 1990-2006. The report is called “Beyond Transport Policy” because–hold on to your hat–the “drivers” of transport demand growth are “external” to the transport sector itself. For example, people don’t fly for the sheer thrill of flying, but in order to vacation or conduct business in an increasingly global economy.

    Consequently, traditional transport policies such as fuel economy regulations, motor fuel taxes, and infrastructure upgrades have had little impact on transport demand and the associated emissions.

    This implies that in order to achieve what the EEA calls a “sustainable transport system,” politicians and bureaucrats must control those pesky “external drivers”–basically the totality of things that constitute work and play in the modern world.

    But, as I discuss here, the EEA’s proposed solutions are not “beyond transport policy,” but are the same old, same old: new taxes on fuels, vehicles, passengers, and imports. The EEA is stuck in a mental rut; it has taxes on the brain.

    On Tuesday (Feb. 10), USDA Secretary Tom Vilsack urged EPA to increase the quantity of ethanol blended into gasoline from the current amount–10% ethanol per gallon–to some higher percentage, Reuters reports.

    Will EPA heed Vilsack’s request or heed the clear implication of www.fueleconomy.gov, a Web site EPA administers jointly with the Department of Energy?

    The EPA/DOE Web site reveals that filling up with ethanol is a big fat money-loser. To see for yourself, click on “Flex-Fuel Vehicles,” then click on “Fuel Economy Information for Flex-Fuel Vehicles,” and then click on “Go.”

    EPA and DOE compare the average annual cost of using regular gasoline and E-85 (motor fuel blended with 85% ethanol) for 90 different flex-fuel models. In every case, regardless of make or model, fueling the vehicle with E-85 costs more than gasoline—lots more.

    Consider a few examples:

Vehicle                         E-85 Annual Cost    Regular Gas Annual Cost
Chevrolet HHR 4WD               $2,225              $1,063
Chrysler Avenger                $2,644              $1,256
Mercedes-Benz C300 4Matic       $2,821              $1,553
Dodge Caravan 4WD               $3,253              $1,452
Lincoln Town Car FFV            $3,020              $1,452
GMC Sierra C15 2WD Pickup       $3,524              $1,725
Dodge Ram 1500 Pickup 2WD       $4,230              $1,841
Jeep Grand Cherokee 4WD         $4,230              $1,841
Toyota Sequoia 4WD              $4,230              $1,971
Nissan Titan 4WD Pickup         $4,230              $1,971

    Another neat thing about this site is that it compares the annual carbon footprint of using E-85 versus regular gasoline for each vehicle. In every case, ethanol has a lower carbon footprint (emits fewer annual tons of CO2). This is controversial in light of research (see here, here, and here) indicating that ethanol is a net contributor to greenhouse gas emissions when you take into account emissions from fertilizer used to grow corn and the carbon released from forests and soils as corn cultivation expands into previously unfarmed areas.

    Nonetheless, even if one eschews a lifecycle analysis and considers only the direct emissions released by burning equal volumes of gasoline and ethanol, the cost per ton of CO2 avoided by using E-85 is ridiculously expensive.

Consider the Nissan Titan 4WD. According to EPA and DOE, using regular gasoline, the Titan emits 13.1 tons of CO2 per year; using E-85, it emits 11.1 tons of CO2 per year. So fueling the Titan with E-85 instead of gasoline reduces the vehicle’s annual CO2 emissions by 2 tons. However, the E-85 costs $2,259 more, which means the per-ton cost of reducing CO2 by using E-85 instead of gasoline is $1,129.50. That’s a dozen times more costly than the “social cost of carbon” (how much damage each ton of CO2 allegedly does) as estimated by Richard Tol, perhaps the world’s leading climate economist, in a major literature review.

For additional perspective, the Energy Information Administration estimated that emission permits under the Lieberman-Warner Climate Security Act (S. 2191) would cost $16.88 per ton in 2012, $29.88 in 2020, and $61.01 in 2030. So reducing CO2 emissions by switching your Nissan Titan from gasoline to E-85 costs 18 to 66 times as much per ton as emission permits under Lieberman-Warner, a bill the U.S. Senate did not see fit to pass.
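The arithmetic behind those multiples is worth laying out explicitly; every input below is one of the EPA/DOE or EIA figures quoted above.

```python
# Cost per ton of CO2 avoided by running a Nissan Titan 4WD on E-85
# instead of gasoline, using the EPA/DOE and EIA figures quoted above.

e85_annual_cost = 4_230.0   # $/year on E-85
gas_annual_cost = 1_971.0   # $/year on regular gasoline
gas_co2_tons = 13.1         # annual CO2 emissions on gasoline (tons)
e85_co2_tons = 11.1         # annual CO2 emissions on E-85 (tons)

extra_cost = e85_annual_cost - gas_annual_cost   # $2,259
tons_avoided = gas_co2_tons - e85_co2_tons       # 2.0 tons
cost_per_ton = extra_cost / tons_avoided         # $1,129.50
print(f"E-85 abatement cost: ${cost_per_ton:,.2f} per ton of CO2")

# EIA's projected Lieberman-Warner permit prices ($/ton):
for year, permit_price in [(2012, 16.88), (2020, 29.88), (2030, 61.01)]:
    print(f"{year}: {cost_per_ton / permit_price:.1f}x the permit price")
```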

    The EPA/DOE Web site demolishes claims that ethanol reduces pain at the pump or provides a cost-effective antidote to global warming. Yet the agencies established the Web site partly to promote E-85 and flex-fuel vehicles, and neither agency advises consumers not to use ethanol–quite the reverse.

    So EPA will probably side with Vilsack against the clear implication of its own analysis that ethanol is a consumer ripoff.

    Tucker 1 Lovins 0

    by Iain Murray on February 10, 2009

Those who have been following the “alternative energy” fantasists for a while will recognize the name of Amory Lovins, the so-called “sage” (yet another pseudo-religious title utilized by liberal environmentalists for their heroes) of the Rocky Mountain Institute. They will also remember that he regularly advances marvelous-sounding schemes for re-imagining America’s energy mix, which never seem to go anywhere. He’s at it again, this time on the popular Freakonomics blog, where he suggests that renewable “micropower” is the future of energy:

    Power plants also got irrationally big, upwards of a million kilowatts. Buildings use about 70 percent of U.S. electricity, but three-fourths of residential and commercial customers use no more than 1.5 and 12 average kilowatts respectively. Resources better matched to the kilowatt scale of most customers’ needs, or to the tens-of-thousands-of-kilowatts scale of typical distribution substations, or to an intermediate “microgrid” scale, actually offer 207 hidden economic advantages over the giant plants. These “distributed benefits” often boost economic value by about tenfold. The biggest come from financial economics: for example, small, fast, modular units are less risky to build than big, slow, lumpy ones, and renewable energy sources avoid the risks of volatile fuel prices. Moreover, a diversified portfolio of many small, distributed units can be more reliable than a few big units. Bigger power plants’ hoped-for economies of scale were overwhelmed by diseconomies of scale.

    Thankfully, William Tucker, author of the excellent new book Terrestrial Energy, has responded in the comments section. His comment is worth reproducing in full:

    Quite briefly, Lovins is drawing a false analogy between the miniaturization and distribution of computing and telecom instruments and the production of energy. Computers and telephones can be miniaturized and distributed according to Moore’s Law because they involve information. You can use less and less energy to store each bit. For that reason you can have as much computing power on your desktop today as Univac had in an entire room in the 1960s. Computers can be distributed because they have become so powerful.

But things don’t work that way with energy. A kilowatt is a kilowatt, whether it’s generated in your backyard or at a power station. You can “distribute” generation anywhere you want but you still have to use the same amount of fuel or wind or whatever. We could replace central thermal stations with gas turbines on every street corner, but the fuel is going to be expensive and produce a lot more carbon emissions, which is something Lovins conveniently overlooks.

    The real irony, however, is his suggestion that wind fits this small-is-beautiful scenario. Sure wind is “distributed.” After all, you need 125 square miles of 45-story windmills to generate the same 1000 megawatts that can be generated in one square mile at a central thermal station. You’ve got to put them somewhere! And that’s just their nameplate capacity. To produce 1000 MW of base load electricity, you’d need at least three or four 125-square-mile wind farms scattered at diverse locations around the country.

    That’s the reason Lovins himself has suggested covering all of North and South Dakota with wind farms. Al Gore matches him by asking for 1/5 of New Mexico, the fifth largest state, for solar collectors. On top of this, they want to rebuild the entire national grid to 765 kilovolts in order to ferry all this electricity from the remote areas where it’s best generated to population centers. And Lovins calls 1000-MW power plants operating on the current transmission system “irrationally big!”

    What Lovins never wants to acknowledge is the energy density of nuclear power. With nuclear, the energy produced from 500 square miles of windmills can be generated with a fuel assembly that would fit in the average living room. Why “distribute” all this generating capacity into big, ugly structures that litter the landscape and only work when the wind blows? Why not concentrate it all in one place? Then once every 18 months a single tractor-trailer can come in with a new set of fuel rods.

    In one respect, though, Lovins may be right. Maybe we shouldn’t be building nuclear reactors to 1500 MW. Hyperion, a New Mexico company, has invented an 80-MW mini-reactor the size of a gazebo that can power a town of 20,000. You could put it in someone’s basement and no one would ever notice. While “alternate energy” has gotten more and more gigantic, nuclear is getting smaller and smaller.

    Who would have thought it would be nuclear that is small and beautiful?

Indeed. As Mr. Tucker explains at greater length in his book, the real problem with “renewable” energy is that it is so distributed and dispersed that collecting it in the quantity and quality we need is a real problem, and one that only size can solve. Lovins’s argument is just about the reverse of reality.
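Tucker’s land-use point reduces to a capacity-factor calculation, sketched below using the figures from his comment (125 square miles of turbines per 1,000 MW of nameplate wind, roughly one square mile for a comparable thermal station). The 30% capacity factor is an assumed figure, chosen because it is consistent with his “three or four farms” estimate.

```python
# Back-of-envelope version of Tucker's land-use argument. The 125 sq. mi.
# per 1,000 MW nameplate figure is from his comment; the 30% capacity
# factor is an assumption consistent with his "three or four farms" claim.

AREA_PER_1000MW_NAMEPLATE = 125.0  # square miles of turbines
WIND_CAPACITY_FACTOR = 0.30        # assumed average output / nameplate
TARGET_BASELOAD_MW = 1_000.0

nameplate_needed = TARGET_BASELOAD_MW / WIND_CAPACITY_FACTOR  # ~3,333 MW
farms_needed = nameplate_needed / 1_000.0                     # ~3.3 farms
area_needed = farms_needed * AREA_PER_1000MW_NAMEPLATE        # ~417 sq. mi.

print(f"{nameplate_needed:,.0f} MW nameplate, ~{farms_needed:.1f} farms, "
      f"~{area_needed:,.0f} square miles of wind turbines")
# Versus roughly one square mile for a 1,000-MW central thermal station.
```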

    Our very own Christopher C. Horner explains the hype behind global warming and talks about his new book, Red Hot Lies: How Global Warming Alarmists Use Threats, Fraud, and Deception to Keep You Misinformed on Living the Life, available on ABC Family and CBN.com. Learn more about the climate debate at GlobalWarming.org.