Keystone XL: Does Hatred Blind Peace Prize Winners?

Former President Jimmy Carter and nine other Nobel Peace Prize winners this week called upon President Barack Obama and Secretary of State John Kerry to “do the right thing” and “reject” the Keystone XL Pipeline. The Nobel Laureates’ open letter, published in Politico, is worth reading because it epitomizes the intellectual poverty of the anti-Keystone faction.

Asserting that Obama and Kerry’s stand on Keystone XL will “define” their “legacy” on climate change, the Nobels claim that rejection of the pipeline would (1) “have meaningful and significant impacts in reducing carbon pollution,” (2) “pivot our societies away from fossil fuels and towards smarter, safer and cleaner energy,” and (3) “signal a new course for the world’s largest economy.” Wrong on all counts.

As Cato Institute scientist Chip Knappenberger shows, using a climate model developed by the EPA, even if we make the totally unrealistic assumption that all the oil shipped via Keystone XL is additional oil in the global supply that would otherwise never be produced, the pipeline would have to run at full capacity for 1,000 years just to raise average global surface temperature by one-tenth of a degree Celsius. Climatologically, Keystone XL is irrelevant.
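The implied warming rate is easy to check with a back-of-the-envelope sketch using only the figures above (the underlying climate modeling is Knappenberger's, not reproduced here):

```python
# Implied warming rate from the Knappenberger estimate cited above:
# ~0.1 degrees C of warming only after 1,000 years of full-capacity operation.
total_warming_c = 0.1      # degrees C, upper-bound scenario from the text
years_at_capacity = 1_000  # years of full-capacity pipeline operation

warming_per_year = total_warming_c / years_at_capacity
print(f"{warming_per_year:.4f} degrees C per year")  # 0.0001 degrees C per year
```

A ten-thousandth of a degree per year is far below the precision of any surface temperature record.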

The Nobels might reply that the pipeline’s emissions are not the issue. According to them, Keystone XL is the “linchpin for tar sands [oil] expansion,” hence approving the pipeline would commit the world to a “dangerous” development path while rejecting it would move the world towards a new, cleaner path. Okay, time for a restatement of the obvious.

The U.S. economy is in the midst of a revolution in unconventional oil and gas, and the global appetite for oil and gas is growing by leaps and bounds. These trends are determined by basic economic and technological realities, not by a political decision about one infrastructure project. The Nobels’ conceit that Keystone XL is a “pivot” for the global economy and, thus, for the climate system is a reversion to the magical thinking of children.

The Nobels assert that “The myth that tar sands development is inevitable and will find its way to market by rail if not pipeline is a red herring.” But alternate delivery via rail is not a myth; it’s a massive and growing reality. Maybe before writing to Secy. Kerry, the Nobels should read the State Department’s Final Supplemental Environmental Impact Statement (FSEIS) on Keystone XL, especially Chapter 4: Market Analysis.

As in the 2011 Final EIS and 2013 Draft Supplemental EIS, State again concludes that “the proposed Project is unlikely to significantly affect the rate of extraction in oil sands areas (based on expected oil prices, oil-sands supply costs, transport costs, and supply-demand scenario).” A big difference, though, is that whereas the 2011 and 2013 reports “discussed the transportation of Canadian crude by rail as a future possibility,” the FSEIS “notes that the transportation of Canadian crude by rail is already occurring in substantial volumes.” Indeed, from January 2011 through November 2013, rail transport of Canadian crude to U.S. refineries increased from practically zero barrels per day (bpd) to 180,000 bpd.


The completed Keystone XL Pipeline is estimated to have a capacity to deliver 830,000 bpd of crude oil. According to the FSEIS, rail-loading facilities in the Western Canadian Sedimentary Basin (WCSB) are already “estimated to have a capacity of approximately 700,000 bpd of crude oil, and by the end of 2014, this will likely increase to more than 1.1 million bpd.”
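Those FSEIS figures can be lined up directly (a quick sketch of the comparison, using only the capacity numbers quoted above):

```python
# Crude-delivery capacity comparison from the FSEIS figures quoted above
# (barrels per day, bpd).
KXL_CAPACITY_BPD = 830_000           # proposed Keystone XL pipeline
WCSB_RAIL_NOW_BPD = 700_000          # existing WCSB rail-loading capacity
WCSB_RAIL_END_2014_BPD = 1_100_000   # FSEIS projection for end of 2014

print(f"Existing rail capacity is {WCSB_RAIL_NOW_BPD / KXL_CAPACITY_BPD:.0%} "
      "of Keystone XL's planned capacity")
print(f"Projected rail capacity exceeds Keystone XL by "
      f"{WCSB_RAIL_END_2014_BPD - KXL_CAPACITY_BPD:,} bpd")
```

In other words, rail loading in the WCSB already approaches the pipeline's planned throughput and is projected to surpass it.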

Judge Kavanaugh’s Dissent in the EPA Mercury Rule Case: Will the Supreme Court Review EPA’s $9.6 Billion Reg?

Yesterday, the D.C. Circuit Court of Appeals ruled 2-1 to uphold the EPA’s nonsensical Mercury Air Toxics Standards (MATS) Rule. The MATS Rule requires electric utilities to install maximum achievable control technology (MACT) to reduce emissions of mercury and other hazardous air pollutants (HAPs) from coal-fired power plants.

The rule is nonsensical because, as explained below, although it is one of the most expensive regulations in history (officially estimated to cost $9.6 billion in 2016), its health benefits are illusory.

In the case, titled White Stallion Energy Center LLC et al. v. EPA et al., Judge Brett Kavanaugh wrote a powerful dissenting opinion, as my colleague William Yeatman noted yesterday. Kavanaugh agreed with industry petitioners that EPA unreasonably excluded cost considerations (economic impacts) when determining whether MACT regulation of power plant HAPs is “appropriate and necessary.”

The two-judge majority partly based their opinion on the Supreme Court’s ruling in Whitman v. American Trucking Ass’ns (2001) that EPA may not take costs into consideration when setting national ambient air quality standards (NAAQS).

Kavanaugh argues the majority “misreads” or “over-reads” Whitman by ignoring a key difference between the Clean Air Act provisions governing NAAQS rulemakings – §108(a) and §109(b) – and the provision addressing potential MACT regulation of power plant HAPs – §112(n)(1)(A).

The NAAQS provisions clearly allow no room for cost considerations. If an air pollutant is emitted by numerous or diverse mobile or stationary sources and the associated air pollution is reasonably anticipated to endanger public health or welfare, then, pursuant to §108(a), EPA must establish NAAQS for those pollutants, and, pursuant to §109(b), the standards must be requisite to protect public health with an adequate margin of safety. Period. End of story.

In contrast, §112(n)(1)(A) requires EPA to study and issue a report on the public health hazards anticipated to occur as a result of power plant HAP emissions, and then apply MACT regulation “if” the Administrator “finds such regulation is appropriate and necessary.” The provision does not define the terms “appropriate” and “necessary.” Common sense suggests that a regulation is “appropriate” if the benefits justify the costs.

Perhaps more importantly, §112 tasks EPA to determine whether MACT regulation of HAPs is “appropriate and necessary” only for “electric utility steam generating units.” For all other major sources of HAP emissions, EPA has no discretion and is simply required to promulgate MACT regulations. The statute thus seems to contemplate that, in the special case of coal power plants, MACT regulation may not be appropriate even if the associated HAP emissions pose public health hazards. In other words, a less stringent form of Clean Air Act regulation (such as new source performance standards) or state-level regulation might be “appropriate.”

Yeatman opines that Kavanaugh’s dissent may persuade the Supreme Court to review the case. If so, the Court might rule that EPA is allowed or even required to consider costs when determining what is “appropriate” when regulating HAP emissions from power plants. That, in turn, could set the stage for litigation on whether the MATS Rule is too costly to be “appropriate” within the meaning of the statute.

Of course, EPA contends the MATS Rule is a bargain at almost any price, delivering $33 billion to $89 billion in annual health benefits. Litigation reviving public debate on such claims could be a great teaching moment.

Our June 2012 study, All Pain and No Gain, provides a detailed critique of EPA’s MATS Rule health benefit estimates. Below is a summary of key points.

Will Cherry Blossoms Get Sucked into the Polar Vortex?

DC’s cherry trees hit their official peak blossom date last Thursday, April 10th.  That’s the latest peak bloom the capital has experienced in over two decades.  (For you botanical historians, the last time peak blossoming occurred this late or later was in 1993, when the event fell on April 11.)

In 2013 the blossoms were almost as late, hitting their peak on April 9.  That was a pretty dramatic change from 2012, when the date fell on March 20. This change was most disconcerting to two groups: tourists trying to plan their trips to DC in advance, and global warming alarmists who trumpeted every earlier-than-expected cherry blossom as yet further proof of global warming.  In fact, in a sizzling multi-part blog post series last year, followed by dozens of readers, we charted peak blossom dates against global warming data.  We even had graphs.  (See Adam Sandberg, Peak Bloom Is Here – DC’s Global Warming Canary Lands with Frost on its Feet, April 15, 2013.)

The past two years of unusually late blooms largely resulted from unusually cold weather.  But unusually cold weather, in the view of White House Science Advisor John Holdren, is actually yet another sign of global warming.  Holdren explained this to a freezing yet grateful nation in a two-minute video last January entitled The Polar Vortex Explained in 2 Minutes.

We suspect that Holdren’s agency, the Office of Science and Technology Policy (OSTP), may now have a second video in the works in this Blame-Everything-On-Global-Warming series.  Perhaps they’ll call it Delayed Peak Blossoming Explained in 2 Minutes.

Regardless, we think Holdren’s first video is scientifically bogus, and so today we’re filing a formal Information Quality Act Correction Request with OSTP on that very issue.  Who knows—we may yet nip this video series in the bud.

The Troubling Basis for EPA’s Rosy Cost-Benefit Analysis of the Clean Air Act

Perhaps you’ve heard or seen the eye-popping statistics, trumpeted by EPA and its supporters, regarding the incredible benefits supposedly wrought by the Clean Air Act.

In a recent study, for example, EPA claimed that in 2020 alone, the Clean Air Act would be responsible for “approximately $2 trillion” in benefits. Given that the costs of the Clean Air Act are estimated to be $65 billion in 2020, this represents a benefit-to-cost ratio of more than 30, which, of course, casts EPA’s work in a very favorable light. And it’s not just EPA trumpeting these numbers. Last week, Senate Environment and Public Works Chairwoman Barbara Boxer cited data from the aforementioned report to rebut Republican criticisms of EPA Administrator Gina McCarthy. Indeed, environmental special interests are always quick to cite these benefits when they defend the agency they’ve captured.

Eighty-five percent ($1.7 trillion) of the $2 trillion number is derived from two variables: (1) how many deaths EPA purports to prevent and (2) the supposed value of these prevented deaths. EPA forecasts that its regulations will prevent almost 240,000 deaths in 2020; the agency estimates that the value of a statistical life is about $7.4 million. Multiply those two data points, then adjust for inflation, and voila!—you’re at $2 trillion in “benefits” in 2020.
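The multiplication is simple enough to reproduce (a sketch of the arithmetic described above; the inflation adjustment that lifts the raw product toward $2 trillion is EPA's and is not shown):

```python
# Raw product behind EPA's headline mortality-benefit figure for 2020:
# prevented deaths times the value of a statistical life (VSL).
prevented_deaths = 240_000   # EPA forecast of deaths prevented in 2020
vsl_usd = 7_400_000          # EPA's value of a statistical life, in dollars

mortality_benefits_usd = prevented_deaths * vsl_usd
print(f"${mortality_benefits_usd / 1e12:.2f} trillion")  # $1.78 trillion
```

The raw product, roughly $1.8 trillion, already accounts for the bulk of the $2 trillion headline figure.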

In this post, I will briefly explain that this result is a total sham, because the underlying numbers are unreliable to the point of being meaningless.

Start with EPA’s calculation of “prevented deaths”—i.e., the mortality benefits of environmental regulations. In fact, this estimate is based almost entirely on controversial, “secret” science. To be precise, in establishing a relationship between decreased air pollution and decreased mortality, EPA relies on decades-old data from two reports: the Harvard Six Cities Study and the American Cancer Society’s Cancer Prevention Study II. So when EPA claims that it will prevent 240,000 deaths in 2020, this number is an extrapolation from these two key studies.

And yet, despite the evident importance of these two studies, EPA refuses to make publicly available the underlying data. For two years, House Science, Space and Technology Committee Chairman Lamar Smith has pressed the Agency to produce this “secret science.” And for two years, his requests have been rebuffed by the EPA. Remember, these studies were taxpayer funded. They serve, moreover, as the justification for public policy. And yet, EPA refuses to turn over the data to a Member of Congress. To read about Chairman Smith’s diligent efforts to unlock this secret science, see here, here, and here. Senate Environment & Public Works Ranking Member David Vitter also is investigating the matter. Suffice it to say for this post, EPA’s estimates of mortality avoided due to the Clean Air Act cannot be trusted.

What about the other variable, the value of a statistical life? How does the agency calculate this figure? The EPA does not place a dollar value on individual lives. Rather, when conducting a benefit-cost analysis of new environmental policies, the Agency uses estimates of how much people are willing to pay for small reductions in their risks of dying from adverse health conditions that may be caused by environmental pollution.

Below is the example provided by EPA:

“Suppose each person in a sample of 100,000 people were asked how much he or she would be willing to pay for a reduction in their individual risk of dying of 1 in 100,000, or 0.001%, over the next year. Since this reduction in risk would mean that we would expect one fewer death among the sample of 100,000 people over the next year on average, this is sometimes described as “one statistical life saved.” Now suppose that the average response to this hypothetical question was $100. Then the total dollar amount that the group would be willing to pay to save one statistical life in a year would be $100 per person × 100,000 people, or $10 million. This is what is meant by the “value of a statistical life.”

Simply put, this metric makes no sense. The “value” of each “prevented death” is ascertained by asking people how much hypothetical money they’d be willing to spend in order to avoid a fraction of one percent chance of death. How could this possibly have meaning? Absolutely nothing is concrete. The question doesn’t actually pertain to your money, after all. More importantly, there’s zero referent for estimating the value of reducing your mortality risk by a fraction of one percent. The “benefit” is a total abstraction.
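EPA's own illustration reduces to two multiplications (a direct transcription of the hypothetical quoted above, nothing more):

```python
# EPA's hypothetical: 100,000 people each say they would pay $100 for a
# 1-in-100,000 reduction in their individual risk of dying over the next year.
sample_size = 100_000
risk_reduction = 1 / sample_size   # 0.001% lower chance of dying
avg_wtp_usd = 100                  # average stated willingness to pay, dollars

# Expected deaths avoided across the sample: "one statistical life."
statistical_lives_saved = sample_size * risk_reduction

# Aggregate willingness to pay: the "value of a statistical life."
vsl_usd = avg_wtp_usd * sample_size
print(f"${vsl_usd:,}")  # $10,000,000
```

Note that nothing in the calculation anchors the $100 survey answer to anything observable, which is precisely the objection raised above.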


Do Skeptics ‘Reposition’ Warming as ‘Theory’ or Do Alarmists ‘Reposition’ Fear as ‘Fact’? Revisiting an Urban Legend

How many times have you heard climate activists claim skeptics are just latter-day “tobacco scientists”? Google “tobacco scientists” and “global warming,” and you’ll get about 1,110,000 results. With so much (ahem) smoke, surely there must be some fire, right?

Al Gore helped popularize this endlessly repeated allegation. In An Inconvenient Truth (p. 263), he contends that just as tobacco companies cynically funded corrupt scientists to cast doubt on the Surgeon General’s report linking cigarette smoking to cancer, so fossil fuel companies fund “skeptics” to create the appearance of scientific controversy where none exists.

Here’s the pertinent passage:

The misconception that there is serious disagreement among scientists about global warming is actually an illusion that has been deliberately fostered by a relatively small but extremely well-funded cadre of special interests, including Exxon Mobil and a few other oil, coal, and utilities companies. These companies want to prevent any new policies that would interfere with their current business plans that rely on the massive unrestrained dumping of global warming pollution into the Earth’s atmosphere every hour of every day.

One of the internal memos prepared by this group to guide the employees they hired to run their disinformation campaign was discovered by the Pulitzer Prize-winning reporter Ross Gelbspan. Here was the group’s stated objective: to “reposition global warming as theory, rather than fact.”

This technique has been used before. The tobacco industry, 40 years ago, reacted to the historic Surgeon General’s report linking cigarette smoking to lung cancer and other lung diseases by organizing a similar disinformation campaign. 

One of their memos, prepared in the 1960s, was recently uncovered during one of the lawsuits against the tobacco companies in behalf of the millions of people who have been killed by their product. It is interesting to read it 40 years later in the context of the global warming campaign:

“Doubt is our product, since it is the best means of competing with the ‘body of fact’ that exists in the mind of the general public. It is also the means of establishing controversy.” Brown and Williamson Tobacco Company memo, 1960s

There’s just one problem with this tale of corruption and intrigue — much of it is false and all of it is misleading. Let’s examine the flaws in this urban legend, going from minor to major.


Which Sovereign Merits Judicial Deference When State & Federal Governments Conflict under the Clean Air Act’s Cooperative Federalism Arrangement?

What is the proper scope of review when an Article III court adjudicates a federalism dispute under the Clean Air Act? Is a court supposed to review the reasonableness of the state’s determinations? Or is it supposed to review the reasonableness of the EPA’s review of the reasonableness of the state’s determination? Simply put, to which sovereign should courts defer?

In light of the Obama administration’s aggressive oversight of Clean Air Act programs operated by the States, this is a hugely consequential question, with billions of dollars at stake. Yet there exists little statutory direction and conflicting case law to guide lower courts in their review of State-Federal disagreements pursuant to the Clean Air Act. On January 29, the State of Oklahoma petitioned the Supreme Court to revisit this matter, and clarify which sovereign warrants ascendant respect from reviewing courts.

Below, in the first of a two-part series, I explore how the cooperative federalism regulatory regime established by the Clean Air Act confuses judicial deference to agency decision-making. Ultimately, I urge the Supreme Court to grant Oklahoma’s cert petition, in order to cut through the uncertainty and establish unequivocally the boundaries of authority between the State and Federal Governments under the Clean Air Act.

In the second part of the series, I will make the case that States, and not the EPA, are the proper recipients of the court’s respect.

Cooperative Federalism Conundrum: Delegation Can Be Split; Deference Can’t


Would Keystone XL Serve the U.S. National Interest?

Would the Keystone XL Pipeline (KXL) serve the U.S. national interest? If the State Department answers that question in the affirmative, TransCanada Corporation can finally begin building the pipeline, more than five and a half years after originally applying for a construction permit.

TransCanada recently submitted comments to State making the case for an affirmative “national interest determination” (NID). The comments are clear, comprehensive, accurate, and, in my opinion, compelling.

Inspired by those comments, I will attempt here to state the common sense of the issue in my own words.

The interminable controversy over the KXL is stunningly pointless. Do modern commerce and transport chiefly run on petroleum-based products? Yes. Are pipelines the most economic, efficient, and safe way to transport large volumes of petroleum? Yes. Is Canada our closest ally and biggest trading partner? Yes. Is Canada already the largest single source of U.S. petroleum imports? Yes. Would building the KXL enhance the efficiency of oil transport from Canada to U.S. markets? Yes. Would building the KXL support tens of thousands of American jobs and add billions to the GDP during the construction period? Yes. Would all the financing be private and not cost taxpayers a dime? Yes.

So how could building the KXL not be in the U.S. national interest?

In 2012, TransCanada sought permission to build the “Gulf Coast Project” (the green line in the map below), the southern leg of the 1,700-mile pipeline it originally proposed to build from Hardisty, Alberta, to Port Arthur, Texas. State environmental agencies and the U.S. Army Corps of Engineers granted all necessary permits for the Gulf Coast Project by August 2012.

Keystone XL Pipeline Gulf Coast Route, State Department Final EIS 2014

Construction began in August 2012 and the project commenced commercial service in January 2014. The earth did not shake, the sky didn’t fall, no one felt a “disturbance in the Force . . . as if millions of voices suddenly cried out in terror and were suddenly silenced.”

EPA: Artless Dodging on ‘Carbon Pollution’ Rule

Yesterday, the House Science Committee’s Energy and Environment Subcommittees held a joint hearing on the “Science of Capture and Storage: Understanding EPA’s Carbon Rules.” EPA Air Office Acting Administrator Janet McCabe was the sole witness on the second panel. Her testimony begins about one hour and 54 minutes (1:54) into the archived Webcast. Although calm and non-ideological in tone, McCabe’s responses in the lengthy Q&A were terse, usually uninformative, and often evasive.

The hearing focused on carbon capture and storage (CCS), the technology new coal-fired power plants will have to install to meet the carbon dioxide (CO2) New Source Performance Standards (NSPS) in EPA’s proposed “Carbon Pollution Rule.” Under the Clean Air Act, NSPS are to “reflect the degree of emission limitation achievable through the application of the best system of emission reduction which . . . has been adequately demonstrated.”

Environment Subcommittee Chair David Schweikert (R.-Ariz.) kicked off the Q&A (1:59) by noting that the “Carbon Pollution Rule” assumes CCS technology is “robust and ready to go,” yet the “previous panel was pretty crisp, even from right to left, that there are still some real concerns on the technology itself.” He asked for “technical” information clarifying how EPA set the CO2 standards.

McCabe responded by explaining that the “Carbon Pollution Rule” does not actually mandate the use of CCS; rather, it sets a performance standard based on CCS and lets covered facilities decide for themselves how to meet the standard. Okay, but that’s a distinction without a difference, since the only known technology that can reduce CO2 emissions from coal plants as much as CCS is CCS.

McCabe continued:

When it comes to the technology that we based those numbers on [i.e. 1,100 lbs. CO2 per MWh for new coal plants], we believe that if you look across all the information and data that’s available, that there is adequate and robust data showing that the various components that we based the standard on are in use, have been in use, and will be ready.

In other words, instead of providing technical information addressing the concerns raised during the previous panel, McCabe said, in effect, ‘Trust us, we’re the experts.’

Rep. Suzanne Bonamici (D-Ore.), noting GOP Members’ concerns about the cost of CCS, asked McCabe to discuss the “costs associated with the lack of action to address climate change and increasing emissions.” McCabe responded (2:10):

That’s a very good question. There are costs to our economy and to society from the impacts of climate change that is already happening. In 2013, there were seven extreme weather events. Which I think is a nice way of saying great, big, huge horrible storms that cost the economy over a billion dollars each. This is a real economic impact on our communities, our families across the country.

Prompted by Bonamici, McCabe went on to include “health care costs” and “disruption to families and whole communities” among the costs of inaction.

Whether deliberately or otherwise, McCabe blurs the distinction between climate risk and climate change risk. Hurricanes are not some new phenomenon unique to the Age of Global Warming. Huge, horrible storms have billion-dollar costs — that is the nature of the beast. Blaming hurricanes on CO2 emissions is unscientific. There has been no long-term trend in the strength or frequency of hurricanes, none in global accumulated cyclone energy, and none in hurricane damages once losses are adjusted to take into account increases in population, wealth, and the consumer price index. The U.S. is currently experiencing the longest period on record with no major (category 3-5) hurricane landfall strikes.

Blaming hurricane damages on congressional gridlock (“lack of action”) is loopy. Even complete elimination of U.S. CO2 emissions via immediate and total shutdown of the economy would avert only a hypothetical 0.19°C of warming by 2100 – too small a reduction to have any detectable effect on weather patterns. Ergo, no lesser emission reductions that might have been implemented during the past decade or two could provide any meaningful protection to people or property even if one assumes all seven billion-dollar storms in 2013 were ginned up by climate change.

More Studies Find Lower Climate Sensitivity

Secy. of State John Kerry last week exhorted all State Department officials to conclude a new international climate change agreement, integrate climate change with other priorities, and, in general, “elevate the environment in everything we do.” In the same week, climate researchers produced two more studies undercutting Kerry’s opinion that climate change is “perhaps the world’s most fearsome weapon of mass destruction.”

The studies address the core scientific issue of climate sensitivity — the question of how much warming results from a given increase in atmospheric carbon dioxide (CO2) concentrations.

There are two types of sensitivity estimates. Equilibrium climate sensitivity (ECS) is an estimate of the increase in ‘steady state’ surface temperature after the climate system has fully adjusted to a doubling of CO2 concentrations — a process assumed to take centuries or longer due to oceanic thermal inertia. Transient climate sensitivity (TCS) is the estimated increase in surface temperature during the 20-year period when CO2 doubling occurs, presumably during the final decades of this century.

ECS is the key variable in both climate model predictions of future global warming and model estimates of the “social cost of carbon” – the damage allegedly inflicted on society by an incremental ton of CO2 emissions.

The IPCC’s 2007 Fourth Assessment Report (AR4) estimated a “likely” ECS range of 2°C-4.5°C, with a “best estimate” of 3°C. Since 2011, however, the warming pause and the growing divergence of model predictions and observed global temperatures have been the impetus for several studies finding that IPCC sensitivity estimates are too hot.

Cato Institute scientists Patrick Michaels and Chip Knappenberger maintain a growing list of such studies, which totaled 18 as of February 2014.

Climate Sensitivity Michaels & Knappenberger 18 Studies Feb 2014

The average sensitivity estimate of the 18 studies is just under 2°C. In other words, the AR4 “best estimate” of 3°C is 50% higher than the mean estimate of the new studies. That may be why the IPCC’s 2013-2014 Fifth Assessment Report (AR5) declines to offer a “best estimate.”

A new “best estimate” of 2°C would deflate the scary climate change impacts featured elsewhere in AR5, but recycling the same old 3°C “best estimate” would deflate the IPCC’s claim to be an honest broker. So instead the IPCC chose to lower the low end of the “likely” sensitivity range. Whereas the “likely” range in AR4 was 2°C-4.5°C, in AR5 it is 1.5°C-4.5°C.

That small concession, however, does not dispel the growing challenge to consensus climatology. As indicated in the chart above, the average sensitivity of the climate models used in AR5 is 3.2°C. That is 60% higher than the mean of recent estimates (<2°C). Let’s take a quick look at three studies that have come out this year.
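The two percentage comparisons in this post check out (a sketch using the rounded figures quoted above; the mean of the 18 studies is approximate):

```python
# Percentage gaps between IPCC sensitivity figures and the ~2 degrees C mean
# of the 18 newer studies compiled by Michaels and Knappenberger.
recent_mean_ecs = 2.0    # degrees C, approximate mean of the 18 studies
ar4_best_estimate = 3.0  # degrees C, AR4 "best estimate"
ar5_model_average = 3.2  # degrees C, average ECS of the AR5 climate models

print(f"AR4 best estimate: {ar4_best_estimate / recent_mean_ecs - 1:.0%} higher")  # 50% higher
print(f"AR5 model average: {ar5_model_average / recent_mean_ecs - 1:.0%} higher")  # 60% higher
```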

Secretary Kerry Focuses on Climate Diplomacy While Russia Marches Into Crimea

U.S. Secretary of State John Kerry on March 7, 2014, issued his first official Policy Guidance to all Ambassadors and other heads of missions abroad.  It’s not about Russia’s aggressive takeover of the Crimea, a part of the sovereign state of Ukraine.  It’s not about China’s naval buildup.  It’s not about the implosion of Venezuela’s elected dictatorship.  It’s not about Iran’s ongoing program to build nuclear weapons.  It’s not about the continuing civil war in Syria.  It’s about what Secretary Kerry thinks is the major national security threat facing the United States—global warming!

Here is how Secretary Kerry introduces his Policy Guidance:

Leading the way toward progress on this issue is the right role for the United States, and it’s the right role for the Department of State.  That’s why I’ve decided to make climate change the subject of my first Policy Guidance as Secretary of State.  I have been deeply impressed by the way Secretary Clinton elevated global women’s issues as a top-tier diplomatic priority, and believe me, we’re committed to keeping them there.  When the opportunities for women grow, the possibilities for peace, prosperity, and security grow even more.  President Obama and I believe the same thing about climate change.  This isn’t just a challenge, it’s also an incredible opportunity.  And the Policy Guidance I’m issuing today is an important step in the right direction.

One thing’s for sure:  there’s no time to lose.  The scientific facts are coming back to us in a stronger fashion and with greater urgency than ever before.  That’s why I spoke in Jakarta about the threat of climate change and what we, as citizens of the world, can do to address it.  That’s why I raised this issue at our senior management retreat here in Washington, and why I’ll be raising it again at our Chiefs of Mission Conference next week.  This challenge demands elevated urgency and attention from all of us.

I’m counting on Chiefs of Mission to make climate change a priority for all relevant personnel and to promote concerted action at posts and in host countries to address this problem.  I’ve also directed all bureaus of the Department to focus on climate change in their day-to-day work.
