June 2014

In a recent influential article in the Energy Law Journal, former FERC General Counsel William Scherman made a number of bombshell claims regarding what he and two co-authors described as the “lop-sided and unfair” FERC enforcement process. If true, these allegations raise troubling concerns about the absence of due process at FERC’s Office of Enforcement, including a putative practice of withholding exculpatory evidence from investigation subjects.

Unsurprisingly, the law review article has figured prominently in the confirmation process for President Obama’s nominee for FERC chair, Norman Bay, who has served as the head of FERC’s Office of Enforcement since 2009 and was, therefore, at the center of the allegations. During his May 20th confirmation hearing, Bay was questioned about Scherman’s article by Senator John Barrasso. In response, Bay said that he would be “very concerned” if the allegations in the article were true. And in follow-up written responses to questions from Barrasso, which I’ve excerpted below the break, Bay states that none of the allegations in the law review article are true.

So, a prominent person is lying. Either former FERC General Counsel William Scherman is telling fibs about FERC’s Office of Enforcement, or current FERC chair nominee Norman Bay is being duplicitous in rebutting the allegations. Given the gravity of the charges leveled in the law review article, I’d want to find out who’s lying before I supported Norman Bay’s nomination, if I were a Senator.


America’s electric grid is undergoing unprecedented changes that could threaten reliability in discrete markets. This was the take-away lesson from the Federal Energy Regulatory Commission’s annual reliability conference in Washington, D.C. yesterday. To watch an archived webcast of the conference, or read participant testimonies, click here.

The challenges facing the grid are many:

  • Due to breakthroughs in drilling, natural gas is historically cheap. Gas, in turn, is the benchmark that sets wholesale electricity rates. With prices depressed, utilities and independent power producers have retired almost 4,000 megawatts of nuclear power, due primarily to nuclear power’s relatively high operating costs, although Clean Water Act compliance costs contributed to these decisions.
  • At the same time, the EPA is waging a war on coal, which, when combined with depressed prices, has led to the retirement of 22,000 megawatts of coal-fired electricity. Due to one egregious regulation, known as the Utility MACT, many thousands more megawatts of coal-fired electricity will retire next spring.
  • Gas has been filling the void left by coal and nuclear retirements. Gas-fired generation capacity has grown to more than 40 percent of total capacity in North America, up from under 30 percent as recently as five years ago. This presents challenges of its own, because the natural gas pipeline system wasn’t built to accommodate electricity production. The ensuing bottlenecks can have reliability impacts, a major lesson from last winter’s polar vortex.
  • Finally, the enactment of green energy production quotas in 30 States has resulted in widespread use of intermittent renewable energy sources like solar and wind energy. Because the power supplied to the grid must be balanced carefully with power consumed from the grid, the unreliable nature of green energy presents engineering challenges.

Into this milieu will be thrust FERC’s next chair to succeed Jon Wellinghoff, whose term ended last November. FERC’s role in protecting the grid’s reliability stems from an amorphous mandate in the 2005 Energy Policy Act. In fact, FERC has limited power to effectuate policy that impacts reliability; however, in practice, the commission can perform a vital role as an information clearinghouse and a source of expertise. The chair, moreover, has a bully pulpit. Ideally, he or she would also act as the grown-up in the room when EPA contemplates politicized regulations with broad consequences for the electricity sector.

The President’s first pick for the job was Ron Binz, whose nomination was scuttled by the Senate Energy and Natural Resources (ENR) Committee when it became apparent that he was an opponent of all forms of conventional energy.

For his second pick, the President chose Norman Bay. And again, the President took the highly unusual step of nominating someone for FERC chair who isn’t a currently serving commissioner. Instead, Bay comes from FERC’s Office of Enforcement. Bay’s confirmation hearing before the Senate ENR Committee took place on May 20th (webcast here). A former prosecutor, Bay was unsurprisingly adept at handling questions in person. However, in answers to follow-up written questions, Bay raises serious red flags about whether he’s an appropriate pick for chair, especially in light of the tumult within the electric sector.


Back in January, in the midst of one incredibly cold winter, John Holdren, Director of the White House Office of Science and Technology Policy, posted a short video on the agency’s website entitled The Polar Vortex Explained in 2 Minutes.  In that video, Holdren claimed that a “growing body of evidence suggests that the kind of extreme cold being experienced by much of the United States as we speak is a pattern that we can expect to see with increasing frequency as global warming continues.”  In short, global warming was responsible for colder winters.

This, of course, would be yet another step towards galactic nonverifiability—if global warming is responsible for everything, it can never be tested empirically.

But as a number of climate scientists soon pointed out, Holdren’s claim of a growing body of evidence on this issue was simply false.  In fact, from September 2013 on, three peer-reviewed studies appeared debunking the notion that polar warming had led to an increase in what are known as winter blocking episodes—situations where extremely low temperatures become locked in for exceptionally long periods of time.  That was why, in April, we filed a formal request for correction with OSTP under what’s known as the federal Data Quality Act.

After we filed our petition, by the way, a fourth study appeared disputing the global warming/polar vortex connection.

Yesterday, shortly before OSTP’s 90-day deadline for responding to correction requests, we received the agency’s denial (see below).  OSTP claims that Holdren was simply expressing his “personal opinion” rather than any “comprehensive review of the scientific literature.”

On its face, this response is shovel-ready nonsense.  Holdren, and others at OSTP who parroted his claim, at no point suggested that they were speaking personally rather than as agency employees.  To the contrary, they employed both the agency’s resources and stature to disseminate the polar vortex claim.

More importantly, the specific contention—of a “growing body of evidence”—can be tested by any kindergartner.  Four recent studies on this issue all contradict the global warming/polar vortex connection, at least balancing, and more likely outweighing, the older studies Holdren was relying on.  The notion that the body of evidence supporting him is growing is nonsense.

If Holdren were selling pizza, the FTC would’ve been all over him long ago.

OSTP IQA Response

EPA’s Carbon “Pollution” Rules: War on Coal by the Numbers

How will EPA’s existing-source carbon “pollution” rule, released last week as a pre-publication document, affect U.S. power markets between now and 2030? The rule requires all states, on average, to reduce their power-sector carbon dioxide (CO2) emissions 30% below 2005 levels by 2030. Each state is, however, assigned a different “standard” (calibrated in pounds CO2 per megawatt hour), and the level of effort required to meet the standards will vary from state to state.

Two variables in particular affect both a state’s 2030 standard and the expense required to meet it: (1) how much of the state’s current generation comes from coal, and (2) how much idle natural gas combined cycle (NGCC), renewable, and nuclear generation capacity exists to meet consumer demand as the state ramps down and phases out coal generation.

The chart below presents some of the relevant data used to compute each state’s standard. It also confirms — despite EPA’s protestations to the contrary — that the agency is waging a war on coal.

As EPA reads the Clean Air Act, it must first propose emission performance standards for new sources under §111(b) before it may propose performance standard guidelines for existing sources under §111(d). Arguably, this means a court decision invalidating new source standards would invalidate existing source standards as well. EPA published its carbon “pollution” rule for new coal power plants in January. My colleague William Yeatman identifies six legal flaws in EPA’s new source carbon “pollution” standard.

Data in the accompanying chart make it even clearer that EPA’s new source rule is a de facto fuel-switching mandate — a policy Congress has not approved and which would be dead on arrival if introduced as a bill.

Cooler Heads Digest 6 June 2014

Carbon Rule Targets by State — Wall Street Journal Chart

The chart below comes from Wall Street Journal reporter Amy Harder. I have broken it into two pieces to make it more readable in a WordPress format. For understanding the level of effort EPA’s ‘Clean Power Rule’ will impose on each state, Harder’s chart is the place to begin.

Although EPA estimates its proposed rule will reduce power sector carbon dioxide (CO2) emissions 30% below 2005 levels by 2030, the agency did not use 2005 emissions data to set state-by-state CO2 reduction targets. Instead, Harder explains, EPA used 2012 and 2013 data to set different targets for each state “based on what it thinks each state can achieve by 2030, taking into account several factors including the energy mixes in the region and how much of its electricity each state can shift from coal to natural gas,” an electricity fuel that emits half as much CO2 as coal.

As shown in columns one and two of the chart, for 2012 EPA estimated each state’s CO2 emissions by weight in million metric tons and electricity output in terawatt-hours. From those data, EPA calculated each state’s 2012 CO2 emission rate or “standard” in pounds CO2 per megawatt hour (lbs/MWh), shown in column three. Columns four and five show each state’s standard for 2030 and the percent CO2 reduction required to meet it.
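To make the unit arithmetic concrete, here is a minimal sketch of how a rate standard in lbs CO2/MWh falls out of annual emissions in million metric tons and generation in terawatt-hours. The state figures below are hypothetical round numbers for illustration, not EPA’s actual data:

```python
# Hypothetical illustration of the rate calculation behind the chart's
# column three. The state figures are round numbers, not EPA data.

LBS_PER_METRIC_TON = 2204.62   # pounds in one metric ton
MWH_PER_TWH = 1_000_000        # megawatt-hours in one terawatt-hour

def emission_rate(mmt_co2: float, twh_generated: float) -> float:
    """Convert annual CO2 (million metric tons) and generation (TWh)
    into an emission rate in lbs CO2 per MWh."""
    lbs_co2 = mmt_co2 * 1_000_000 * LBS_PER_METRIC_TON
    mwh = twh_generated * MWH_PER_TWH
    return lbs_co2 / mwh

# A coal-heavy state emitting 90 million metric tons of CO2 while
# generating 100 TWh of electricity:
rate_2012 = emission_rate(90, 100)      # ~1,984 lbs/MWh
# A 30% cut from that baseline would imply roughly:
rate_2030 = rate_2012 * (1 - 0.30)      # ~1,389 lbs/MWh
print(round(rate_2012), round(rate_2030))
```

Since “million metric tons” and “terawatt-hours” each carry a factor of one million, the rate conveniently simplifies to (million metric tons × 2,204.62) ÷ TWh.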

A key factor in how hard it will be for a state to achieve the percent CO2 reduction required by the 2030 standard is the current (2013) share of the state’s electricity generated from coal, shown in column six. A relatively small percent CO2 reduction can be more difficult for a state where coal’s share of electric generation is high than a relatively large percent CO2 reduction for a state where coal’s share is low.

“For instance,” writes Harder, “Kentucky only needs to cut its carbon emissions 18%, compared with Washington state’s 72% reduction requirement. But coal provides 93% of Kentucky’s electricity and just 6% of Washington’s. The one coal plant in the Evergreen State is already scheduled to retire by 2025.”

CASAC Sows Confusion on Ozone by Playing Legal Word Games

Regulating ozone became much more confusing yesterday, as the EPA’s Clean Air Scientific Advisory Committee (CASAC) leaned on a muddled mandate from the D.C. Circuit Court to introduce even more uncertainty into the standard setting process.

Under the Clean Air Act, EPA must establish, and periodically review, a national standard for ambient air concentrations of ground-level ozone at a level “requisite to protect public health” with an “adequate margin of safety.” And in 1977, the Congress established CASAC to provide “independent” advice to EPA on the setting of national standards for pollutants like ozone. CASAC’s seven-member board is nominated annually, primarily from the ranks of epidemiologists and public health officials.

Setting an ozone standard at a level that protects public health with an adequate margin of safety sounds simple enough, but in practice it is an impossible task. There is, in fact, no threshold at which ambient air concentrations of ozone cease to have an effect on human health. To be sure, we’re not talking about mortality (at least, not in the U.S.). Instead, there’s evidence that ozone can be a non-mortal irritant to sensitive populations in rough proportion to air concentrations.

Because there’s no threshold below which there is zero impact, it’s absurd to require, as does the Clean Air Act, EPA to choose a specific level of ozone that is “requisite to protect public health” with an “adequate margin of safety.” Such levels simply don’t exist. In light of this inherent contradiction between the directives of the Clean Air Act and the physical realities of ozone pollution, courts are put in an unenviable position when they try to determine whether the agency’s ozone NAAQS adheres to the statute. The D.C. Circuit Court of Appeals has exclusive jurisdiction to hear a challenge to an ozone NAAQS promulgated by EPA, and the Court’s current approach to such a review is set out in a ruling delivered last summer, Mississippi et al. v. EPA.

The Mississippi ruling introduced a dichotomy between “science” considerations and “policy” considerations. The former (science considerations) is construed as pertaining to the component of the ozone NAAQS that is “requisite to protect public health.” The latter (policy considerations) is the component of the NAAQS that represents an “adequate margin of safety.”

For “science” considerations, the Court reasoned that CASAC’s recommendations are basically controlling. I’ve written before about the troubling ramifications of giving such a broad power to an unelected group of technocrats.

For “policy” considerations, the Court determined that EPA has a much greater degree of discretion. So, for these considerations, which pertain to establishing a margin of safety beyond an ozone standard that is “requisite” to protect public health, CASAC’s advice is not controlling.

As I explained at the time, the difference between “science” and “policy” considerations was less than clear cut as articulated by the Court. To my eyes, it looked like a “judgment call.”

Yesterday morning, CASAC convened to decide upon its first ozone NAAQS recommendation since Mississippi et al. v. EPA. Currently the ozone standard is set at 75 parts per billion (ppb). At the outset of yesterday’s meeting, all CASAC members agreed that the standard should be set at least as stringent as 70 ppb. However, the panel was divided on how stringent an ozone standard to recommend. Some members supported a standard near 60 ppb, while others supported a standard closer to the upper limit (i.e., 70 ppb).

In order to achieve a compromise and attain consensus, CASAC panelists agreed to get lawyerly. According to InsideEPA’s Lea Radick, they will recommend an upper limit of 70, “given that [the recommendation] to [EPA Administrator Gina] McCarthy is modified to indicate [that] 70 ppb has a ‘limited’ or ‘inadequate’ margin of safety.” Also, BNA’s Patrick Ambrosio reported that CASAC will identify a “policy preference” for an ozone standard set between 60 and 65 ppb.


Administration & Allies Ask: Won’t Somebody Please Think of the Children?

President Obama, the Environmental Protection Agency, and their allies are laying it on really thick with the use of children in general, and asthmatic children in particular, as political props in promotion of the agency’s just-released climate plan. In fact, the rule has nothing to do with either (1) children or (2) asthmatic children, because (1) the rule won’t impact global temperatures and (2) greenhouse gases don’t trigger asthma. Such inconvenient truths haven’t stopped proponents of the rule from running every cheap “it’s for the kids” trick in the political playbook.

“Congress, we have held, does not alter the fundamental details of a regulatory scheme in vague terms or ancillary provisions—it does not, one might say, hide elephants in mouseholes.” Whitman v. American Trucking Associations, 531 U.S. 457, 468 (2001)

During the last month, both Politico and the New York Times have published reports on the origins of Clean Air Act §111(d), the statutory provision that authorizes a major climate change regulation for existing power plants that EPA rolled out this week. Notably, both of these major media outlets chose the word “obscure” to describe §111(d).

The modifier is apt. At 291 words, §111(d) is a relatively tiny provision in the Act, a proportion that befits its limited purpose (as intended by the Congress). In fact, §111(d) is defined primarily by what it isn’t. The foundational air quality regulatory regime established by the Clean Air Act is the National Ambient Air Quality Standards program; NAAQS addresses six “criteria” pollutants. The other major air quality program for stationary sources in the Act targets hazardous air pollutants from industrial categories. The objective of §111(d) is to regulate existing sources of pollutants that are neither “criteria” pollutants (i.e., subject to a NAAQS) nor hazardous air pollutants.

Not surprisingly, applications of this catch-all provision have been few and far between. Since implementing regulations were first promulgated in 1975, EPA has used §111(d) to regulate four pollutants from five source categories: (1) sulfuric acid mist emissions from sulfuric acid production plants; (2) fluoride emissions from phosphate fertilizer plants; (3) fluoride emissions from primary aluminum production plants; (4) total reduced sulfur from kraft pulp mills; and (5) landfill gases from solid waste landfills. All told, these regulations have affected maybe 80 sources (and that’s a very conservative estimate).

So…§111(d) has been employed by EPA a handful of times to a few score sources during the last 40 years. Moreover, it has never been controversial. Indeed, many of EPA’s approvals of state plans to meet §111(d) requirements were promulgated as “direct final rules,” which the agency only uses when it’s confident the matter is ultra-mild and no one will object. The length of EPA approvals of State plans averages 2 pages in the Federal Register. This is truly an obscure provision.

And yet…§111(d) is, today, the basis of an EPA regulation that would overhaul electricity oversight in 50 States. Costs no doubt will be significant, but more important is the gross expansion in federal power, a subject that I explain here and here (audio). Eighty years ago, Congress explicitly barred federal energy regulators from interfering with State management of the electricity sector within that State’s borders.* Now, for the first time ever, EPA is claiming such powers over States—authority, again, that has long been denied the Federal Energy Regulatory Commission (and its precursors). And the legal foundation of this unprecedented (and expensive) growth of the federal government is…an “obscure” provision of the Clean Air Act. To my eyes, this is the quintessential elephant in a mousehole.

Today on E&E TV, Roger Martella, a partner at Sidley Austin and former EPA general counsel, discussed some of the legal and economic issues raised by EPA’s carbon “pollution” rule for existing power plants, which the agency released Monday (June 2, 2014).

The rule requires states, on average, to achieve a 30% reduction in power-plant carbon dioxide (CO2) emissions below 2005 levels by 2030. It allows states to meet their respective targets through four main strategies: increase the efficiency of coal electric generation, substitute natural gas generation for coal generation, substitute renewable and nuclear generation for fossil-fuel generation, and reduce electric demand by industry and other end-users. To implement those strategies, states may use various policy options including cap-and-trade, renewable electricity mandates, and demand-side management programs.

In this brief post, I will state the gist of some of interviewer Monica Trauzzi’s questions and excerpt from Martella’s responses.*

Q: How much flexibility is EPA actually giving states to comply with the rule?

RM: On the one hand, the word that was probably used the most during Administrator McCarthy’s presentation [on Monday] was “flexibility,” and she kept her promise to give the states as much flexibility as possible. And from one perspective that’s true. I think EPA’s basically saying any way you want to reduce greenhouse gas emissions, we’ll find a way to make it work. But what she didn’t discuss as much was the other side of that coin, which is the numeric targets that EPA’s set for individual states, now, I think 49 states, state-by-state numeric targets. In some cases those are extremely aggressive, not flexible, and EPA makes it very clear these will be mandatory, binding targets on these states.