
A new MIT study implicitly confirms the obvious: EPA’s “carbon pollution rule” — the agency’s proposed carbon dioxide (CO2) emission standards for new fossil-fuel power plants — is a fuel-switching mandate. Whether through miscalculation or design, the rule does not promote investment in new coal generation with carbon capture and storage (CCS) technology. Rather, the rule effectively bans investment in new coal power plants.

The study, “CO2 emission standards and investment in carbon capture,” puts the point more delicately:

First, the impact of the U.S. EPA’s proposed emission standard of 1100 lbs CO2/MWh is most likely to be an acceleration of the ongoing shift of generation from coal to natural gas. An emission standard of this level is unlikely to incentivize investment in coal with CCS, regardless of any stated intentions.

Why does the “carbon pollution” rule block investment in new coal generation? Coal power plants can meet the standard only by installing CCS technology, which adds significantly to the cost of coal generation. Natural gas combined cycle (NGCC) power plants, by contrast, already comply with EPA’s rule without additional technology or investment, and “even in the absence of the standard, low natural gas prices would favor natural gas-fired generation over coal fired generation.” Thus, “fuel switching, rather than investment in emissions control (i.e., CCS), is the lowest cost compliance strategy.”
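The asymmetry is easy to see with back-of-the-envelope arithmetic. The sketch below uses illustrative, assumed emission rates (roughly 2,100 lbs CO2/MWh for an uncontrolled coal plant and 850 lbs CO2/MWh for NGCC — not EPA’s official figures) to show how much CO2 each plant type would have to capture to meet the 1,100 lbs/MWh standard:

```python
# Illustrative sketch: why the proposed standard forces CCS on coal
# but asks nothing of natural gas. The emission rates below are
# assumed round numbers, not EPA's official factors.

STANDARD = 1100.0   # lbs CO2/MWh, EPA's proposed limit for new plants
COAL_RATE = 2100.0  # assumed uncontrolled coal plant, lbs CO2/MWh
NGCC_RATE = 850.0   # assumed natural gas combined cycle, lbs CO2/MWh

def required_capture(rate: float, standard: float = STANDARD) -> float:
    """Fraction of CO2 a plant must capture to comply (0 if already under)."""
    return max(0.0, 1.0 - standard / rate)

print(f"Coal must capture ~{required_capture(COAL_RATE):.0%} of its CO2")
print(f"NGCC must capture ~{required_capture(NGCC_RATE):.0%} (complies as built)")
```

Under these assumptions, a new coal plant must capture nearly half its CO2 — hence the mandatory CCS cost penalty — while NGCC clears the standard with no capture at all.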

The charts below show the cost penalties incurred by installing CCS technology. Both variable O&M costs and overnight capital costs (the full cost of building the plant if paid upfront) increase as the percentage of CO2 capture increases. [click to continue…]

In a new study published in the journal Environmetrics, economists Ross McKitrick and Timothy Vogelsang compare climate models and observations over a 55-year span (1958-2012). Observations are from three sets of weather balloon measurements of tropical troposphere temperatures. Those are compared with 57 runs of 23 CMIP3 models used by the IPCC in its 2007 Fourth Assessment Report (AR4).

In a lengthy post on Climate Audit, McKitrick explains that the study focuses on the tropical troposphere because “that is where most solar energy enters the climate system, where there is a high concentration of water vapour, and where the strongest feedbacks operate.” The two economists used AR4 climate models because they began the study years ago before a “library” of CMIP5 models used in the IPCC’s Fifth Assessment Report (AR5) was available. (Note: McKitrick plans to update the study using the CMIP5 library.)

The graphic below shows how model projections compare with balloon data in the lower- and mid-troposphere over the observation period.

McKitrick and Vogelsang, July 2014
McKitrick and Vogelsang’s paper is 20 pages long and heavy on the math. Here is the bottom line as McKitrick presents it on Climate Audit: [click to continue…]

EPA yesterday promulgated in the Federal Register the agency’s 52nd Clean Air Act takeover of a state air quality program, known as a “Federal Implementation Plan” (“FIP”). This time, the target was Arizona’s visibility improvement program, known as Regional Haze.

The agency’s latest takeover provides an unfortunate segue to a report I authored that was published this week by the Competitive Enterprise Institute, titled “How the Obama Administration Is Undermining Cooperative Federalism under the Clean Air Act.” The paper, which is reposted at the bottom of this blog, includes the latest survey of EPA regulatory takeovers of state air quality programs, known as federal implementation plans (“FIPs”). As noted above, Obama’s EPA has imposed 52 Clean Air Act FIPs. By comparison, the previous three presidential administrations—George H.W. Bush, William Clinton, and George W. Bush—accounted for a grand total of…FIVE! Mind you, there are still two lame duck years left of the current administration.

FIP Chart

This is not a welcome trend. As I explain in the paper, a FIP is the most aggressive action EPA can take against a State government. It’s a direct usurpation of a co-sovereign. This is why previous administrations have resorted to FIPs so sparingly. Moreover, the paper details how 98% of EPA’s Clean Air Act FIPs are of dubious legitimacy. Finally, the paper proposes a number of legislative solutions to reestablish Clean Air Act cooperative federalism as Congress intended it. The most provocative of these solutions is for Congress to level the deference accorded by Article III courts to agency decision-making when State and Federal Governments disagree about how to implement the Act, such that EPA’s factual determinations and textual interpretations are no longer controlling in this circumstance.

William Yeatman, “How the Obama Administration Is Undermining Cooperative Federalism under the Clean Air Act”

Last Friday, EPA’s staff issued its final recommendation for a revised national ambient air quality standard for ozone (“ozone NAAQS”), known as a Policy Assessment, which I’ve posted at the bottom of this blog. The document is supposed to represent “the latest scientific knowledge useful in indicating the kind and extent of all identifiable effects on public health or welfare which may be expected from [ozone],”* and thereby inform Administrator Gina McCarthy’s determination of where to set the standard. The ozone NAAQS was last revised to 75 parts per billion in 2008; on Friday, the EPA staff recommended that standard be revised to somewhere between 60 and 70 parts per billion.

But here’s the thing: The staff’s advice doesn’t matter. Thanks to a recent ruling in the D.C. Circuit Court of Appeals, the EPA—indeed, the federal government!—has no say in the setting of an ozone NAAQS. Instead, that prerogative has been bestowed on an obscure group of technocrats known as the Clean Air Scientific Advisory Committee.

This is to be feared. The economic consequences of a revised ozone NAAQS are tremendous. There are literally trillions of dollars at stake. Such a decision is unequivocally a POLICY determination, especially given that we’re talking about non-mortal health impacts. In America, a POLICY decision should be rendered by a branch of government with an electoral foundation, not a roomful of epidemiologists enamored with the “profound policy implications” of their research. I explain this here and here.

*But rarely does.

 

PA

“The Obama administration is working to forge a sweeping international climate change agreement to compel nations to cut their planet-warming fossil fuel emissions, but without ratification from Congress,” Coral Davenport reports in the New York Times.

Were you surprised? In domestic climate policy, Team Obama routinely flouts the separation of powers. Their M.O. from day one has been to ‘enact’ regulatory requirements that, if proposed in legislation, would be dead on arrival.

During this year and next, climate negotiators are again trying to work out a successor to the Kyoto Protocol, whose first commitment period expired at the end of 2012. Under the U.S. Constitution, a treaty enters into force only if ratified, and ratification requires the approval of “two-thirds of Senators present.”

Although Democrats control the Senate, a ratification vote on Kyoto II would fail if held today. With Republicans expected to pick up Senate seats in November, the constitutional path to a new climate treaty seems hopelessly blocked.

So, according to Davenport, the Obama administration plans to negotiate an agreement that is not a treaty, yet binding in effect:

To sidestep that [two-thirds] requirement, President Obama’s climate negotiators are devising what they call a “politically binding” deal that would “name and shame” countries into cutting their emissions. The deal is likely to face strong objections from Republicans on Capitol Hill and from poor countries around the world, but negotiators say it may be the only realistic path.

The agreement Obama seeks is no mere ‘coalition of the willing.’ Even though not ratified by the Senate, elements of the agreement would still be enforceable as a matter of international law. From the NYT article: [click to continue…]

Social Cost of Carbon: GAO Report Ignores Pro-Regulation Bias

An eight-month investigation conducted by the Government Accountability Office (GAO) finds no flaws in the Obama administration’s interagency process for developing social cost of carbon (SCC) estimates. Remarkably, GAO has “no recommendations” to improve the process.

GAO did not attempt to evaluate the “quality” of the administration’s SCC estimates. Even so, it’s unusual for GAO to review an agency, program, or policy and find no room for improvement.

Not that anyone should expect GAO to confront the inherent flaws of SCC analysis. As previously argued on this blog, carbon’s social cost is an unknown quantity, discernible in neither meteorological nor economic data. SCC estimates are perforce spun out of non-validated climate parameters and made-up social damage functions. Armed with such sophistry, climate campaigners can make renewable energy look like a bargain at any price and fossil fuels look unaffordable no matter how cheap.

But even taking SCC analysis at face value, the administration’s process is biased, and the evidence is right there in GAO’s report.

Before getting down to particulars, let’s recall why this topic matters. The SCC is an estimate of the dollar value of damages allegedly caused by an incremental ton of carbon dioxide (CO2) emitted in a given year. The higher the SCC estimate, the more plausible the claims of Obama administration officials and their allies that the benefits of CO2-reduction policies justify the costs.

The administration’s SCC interagency working group (IWG) has published two technical support documents (TSDs). SCC estimates in the 2013 TSD are roughly 50% higher than in the 2010 TSD. In just three years, CO2 reductions became 50% more valuable. Amazing!

Social Cost of Carbon 2010 and 2013 Central Estimates Compared, GAO August 2014

EPA, the Department of Energy, and/or the Department of Transportation have used SCC estimates in 68 rulemakings since May 2008, according to GAO (Appendix I). Fossil fuel foes now use SCC analysis to sell everything from carbon taxes to renewable energy mandates to regional cap-and-trade programs to EPA greenhouse gas regulations.

GAO says everything is hunky-dory because the administration “used consensus-based decision making” (several agencies participated), “relied on existing academic literature and models,” and “took steps to disclose limitations and incorporate new information.” Well, of course they did. These folks are professionals; they know how to check the requisite boxes.

Nonetheless, the administration’s process is biased in four ways. In both the 2010 and 2013 TSDs, the IWG:

  1. Inflated the perceived benefit of CO2 reductions to the U.S. economy by providing only higher global SCC values, not lower domestic SCC values, as required by OMB Circular A-4.
  2. Inflated the estimated benefit of CO2 reductions by using only low discount rates (2.5%, 3%, 5%) to estimate the present value of future CO2-related damages, not a 7% discount rate, as also required by OMB Circular A-4.
  3. Inflated the estimated benefit of CO2 reductions by including ‘worse than we thought’ climate impact projections but not ‘better than we feared’ projections.
  4. Inflated the estimated benefit of CO2 reductions by uncritically accepting the IPCC’s 2007 Fourth Assessment Report (AR4) climate sensitivity estimates despite growing evidence that IPCC models are tuned ‘too hot.’
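The second point — discount rates — does most of the work, because climate damages are assumed to occur decades or centuries from now. A minimal sketch (standard present-value arithmetic, not the IWG’s actual models) shows how much the choice of rate matters for a dollar of damage assumed to occur 100 years out:

```python
# Illustrative sketch: how the discount rate drives SCC estimates.
# A future climate damage is converted to today's dollars by standard
# present-value discounting; the IWG used 2.5%, 3%, and 5%, while OMB
# Circular A-4 also calls for 7%.

def present_value(damage: float, rate: float, years: int) -> float:
    """Discount a damage occurring `years` from now back to today's dollars."""
    return damage / (1.0 + rate) ** years

for rate in (0.025, 0.03, 0.05, 0.07):
    pv = present_value(1.0, rate, 100)
    print(f"rate {rate:.1%}: ${pv:.4f} today per $1 of damage in 100 years")
```

At 3%, a dollar of damage a century out is worth about five cents today; at 7%, about a tenth of a cent — roughly a 45-fold difference. Omitting the 7% rate thus keeps the headline SCC numbers high.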

[click to continue…]

Increase in Reported UK Floods Due to Population Growth, Not Climate Change – Study

For years, University of Colorado Prof. Roger Pielke, Jr. has been demonstrating that damages due to hurricanes are not increasing once economic data are adjusted (‘normalized’) for increases in population, wealth, and the consumer price index.

More people with more valuables at higher prices incur greater combined monetary losses when disaster strikes. There is no “greenhouse signal” in properly-adjusted hurricane loss data — no trend reflecting a potential warming-induced increase in hurricane frequency or power.
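The normalization idea can be sketched in a few lines. This is an illustrative toy, not Pielke’s actual dataset or methodology: a historical loss is scaled up by the growth in prices, population, and per-capita wealth since it occurred, so losses from different eras become comparable.

```python
# Minimal sketch of Pielke-style loss normalization (illustrative
# factors, not the published data): scale a past loss by growth in
# inflation, population, and wealth per person.

def normalize_loss(loss, cpi_then, cpi_now, pop_then, pop_now,
                   wealth_pc_then, wealth_pc_now):
    """Express a past loss in today's dollars at today's exposure levels."""
    return (loss
            * (cpi_now / cpi_then)               # inflation adjustment
            * (pop_now / pop_then)               # more people in harm's way
            * (wealth_pc_now / wealth_pc_then))  # more valuables per person

# Hypothetical example: a $100M loss in 1950, using rough assumed
# factors (CPI ~24 then vs ~233 now; population 151M vs 314M;
# real wealth per capita tripled).
print(f"${normalize_loss(100e6, 24.1, 233.0, 151e6, 314e6, 1.0, 3.0):,.0f}")
```

Under these assumed factors, yesterday’s $100 million storm becomes a multi-billion-dollar loss in today’s terms — which is why raw (unnormalized) loss series appear to trend upward even when storm activity does not.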

Pielke Jr Normalized US Hurricane Damage 1900-2012

Source: R. Pielke, Jr., Normalized U.S. Hurricane Damages: 1900-2012. The gray bar indicates estimated damages from Hurricane Sandy.

University of Amsterdam Prof. Laurens M. Bouwer reviewed 22 studies of damages from tropical storms, thunderstorms, tornadoes, floods, hail, brushfires, and earthquakes over multiple decades in the U.S., Europe, Asia, Latin America, and Australia. He came to the same conclusion:

The studies show no trends in losses, corrected for changes (increases) in population and capital at risk, that could be attributed to anthropogenic climate change. Therefore, it can be concluded that anthropogenic climate change so far has not had a significant impact on losses from natural disasters.

If that seems counterintuitive, it’s because detection and reporting of extreme weather have increased. The improved density and spatial coverage of monitoring systems, coupled with round-the-clock weather news, make extreme weather seem much more common today than it was perceived to be, say, in the 1970s.

Apparent increases in one type of extreme weather — flooding — may have an even simpler explanation: more people living in places where floods occur.

A new study by scientists at the University of Southampton Tyndall Center for Climate Change Studies finds that population growth and urban expansion account for the reported increase in damaging floods in the UK over the past 129 years. In the words of lead author Andrew Stevens, a growing population means “more properties exposed to flooding and more people to report flooding.” [click to continue…]

Guest Post by Dave Juday, commodity market analyst and principal of The Juday Group

“Government mandates like RFS, subsidies, loan guarantees, and investments have not proven any better than the market for developing new energy resources – just much more costly.  It is time to let the market sort things out.”

KiOR, once the darling of the renewable energy world, reported in a filing with the Securities and Exchange Commission (SEC) that it has a net deficit of $629.3 million and said it expects to continue incurring losses for the foreseeable future. The details of the filing are not shocking; in March of this year KiOR released a statement that it had “substantial doubts about our ability to continue as a going concern.”

KiOR Chart 2011 - 2014

KiOR’s problems have repercussions beyond just shareholders and employees. This isn’t just another high-tech start-up in the renewable fuels world. KiOR was considered the greatest thing since sliced bread and in many ways was the cornerstone of advanced renewable fuels policy. Following is a short recap of the KiOR story. [click to continue…]

Are Fossil Fuels the Past, Renewables the Future?

Prussian military theorist Carl von Clausewitz famously defined war as “the mere continuation of policy [politics] by other means.” An unstated implication of this oft-quoted maxim is that politics is a continuation of war by non-military means.

What is the optimal way to win wars, political or military? Chinese general Sun Tzu said that “supreme excellence” in the art of war “consists in breaking the enemy’s resistance without fighting.” Unsurprisingly, throughout history, political combatants have often tried to inculcate the belief that the future is already written, tomorrow belongs to them, and hence resistance is futile.

This psyops component of warfare explains one of the standard tropes of green rhetoric. Fossil fuels are belittled as outmoded energies destined for history’s dustbin whereas wind, solar, and biofuels – sources requiring Soviet-style production quotas and other policy privileges to capture significant market share — are hailed as technologies of tomorrow.

Consider two recent examples.

In a speech to the League of Conservation Voters declaring opposition to a proposed coal export terminal, Oregon Gov. John Kitzhaber stated:

First, it is time to once and for all to say NO to coal exports from the Pacific Northwest. It is time to say Yes to national and state energy policies that will transform our economy and our communities into a future that can sustain the next generation. . . . The future for Oregon and the West Coast does not lie in 19th century energy sources.

Yesterday, the Illinois Commerce Commission hosted a stakeholder meeting on EPA’s proposed guidelines to reduce carbon dioxide (CO2) emissions from existing power plants. Rebecca Stanfield of the Natural Resources Defense Council reportedly characterized “jockeying” by coal and nuclear interests as a “sideshow.” Climatewire (paywall protected) quotes her saying:

This is about leading the energy economy of the future, not about looking in the rearview mirror at the resources that powered the past.

The real “sideshow,” however, is you-are-obsolete rhetoric, which distracts public attention from the merits of competing energy technologies and, thus, from the costs and limitations of renewable energy. Whatever their date of origin, all energy technologies undergo continual modification and innovation. What matters is their value to consumers today and in the foreseeable future, not when they were first deployed at commercial scale.

Besides, people who live in glass houses shouldn’t throw stones. It’s not just coal-based power that got its start in the 19th century. So did renewables, especially hydropower and wind. [click to continue…]

Can Natural Variability Save Climate Models?

Climate scientists Patrick Michaels and Chip Knappenberger have a blockbuster post on the Cato Institute blog. They claim to have uncovered a “clear example of IPCC ideology trumping fact.”

As is widely known, global mean surface temperature (GMST) has not increased over the past 13-plus years, contributing to a growing divergence between global warming predictions and observations.

Christy and McNider, data vs. models

Figure source: John Christy and Richard McNider

While acknowledging there are “differences” between modeled and observed temperatures for “periods as short as 10 to 15 years,” the IPCC’s 2013 Fifth Assessment Report (AR5) claims models and observations “agree” over the 62-year period from 1951 to 2012 (Summary for Policymakers, p. 15). Moreover, the IPCC has “very high confidence” the models’ long-term GMST trends are “consistent with observations” (Chapter 9, p. 769). The chart below illustrates “model response error” during two 15-year periods and the longer 62-year period.

Models vs Observations IPCC AR5 Box 9.2

In each panel, red hatching shows observed temperatures as compiled by the UK Hadley Center; the gray bars show GMST trend distribution from 114 climate models. IPCC AR5, Chapter 9, Box 9.2.

Panel (c) appears to depict a close match between simulations and observations. But when Michaels and Knappenberger unpack the information incorporated in the graphic, they find that 90 out of 108 models hindcast more warming than actually occurred.

IPCC Model Simulated and Observed Temperatures 1951-2012 Disaggregated by Michaels and Knappenberger

Okay, that makes IPCC’s “very high confidence” seem misplaced, but why is Michaels and Knappenberger’s column a blockbuster? Because of what they show next.  [click to continue…]