Cloud Computing: Friend or Foe of Kyotoism?

by Marlo Lewis on September 25, 2012



As I sit here typing away, Amazon.com’s Cloud Player serves up 320 tunes I’ve purchased over the past year and a half. I can play them anywhere, any time, on any computer with Internet access. I don’t have to lug around my laptop or even a flash drive. What’s not to like?

Our greener friends worry about all the power consumed by the data centers that deliver computer services over the Internet. Think of all the emissions!

A year-long New York Times investigation summarized in Saturday’s (Sep. 22) edition (“Power, Pollution and the Internet”) spotlights the explosive growth of the data storage facilities supporting our PCs, cell phones, and iPods — and the associated surge in energy demand. According to The Times:

  • In early 2006, Facebook had 10 million or so users and one main server site. “Today, the information generated by nearly one billion people requires outsize versions of these facilities, called data centers, with rows and rows of servers spread over hundreds of thousands of square feet, and all with industrial cooling systems.”
  • “They [Facebook’s servers] are a mere fraction of the tens of thousands of data centers that now exist to support the overall explosion of digital information. Stupendous amounts of data are set in motion each day as, with an innocuous click or tap, people download movies on iTunes, check credit card balances through Visa’s Web site, send Yahoo e-mail with files attached, buy products on Amazon, post on Twitter or read newspapers online.”
  • “To support all that digital activity, there are now more than three million data centers of widely varying sizes worldwide, according to figures from the International Data Corporation.”
  • “Worldwide, the digital warehouses use about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants, according to estimates industry experts compiled for The Times. Data centers in the United States account for one-quarter to one-third of that load, the estimates show.”
  • “Jeremy Burton, an expert in data storage, said that when he worked at a computer technology company 10 years ago, the most data-intensive customer he dealt with had about 50,000 gigabytes in its entire database. (Data storage is measured in bytes. The letter N, for example, takes 1 byte to store. A gigabyte is a billion bytes of information.)”
  • “Today, roughly a million gigabytes are processed and stored in a data center during the creation of a single 3-D animated movie, said Mr. Burton, now at EMC, a company focused on the management and storage of data.”
  • “Just one of the company’s clients, the New York Stock Exchange, produces up to 2,000 gigabytes of data per day that must be stored for years, he added.”
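A quick back-of-the-envelope check of the figures quoted above, sketched in Python; the one-gigawatt-per-plant conversion is my assumption, not The Times’s:

```python
# Sanity-check of the Times's figures. Assumption (mine, not the
# Times's): a typical nuclear plant generates roughly 1 GW of power.
GW_PER_NUCLEAR_PLANT = 1.0

cloud_load_gw = 30.0  # "about 30 billion watts" worldwide
print(f"Nuclear-plant equivalents: {cloud_load_gw / GW_PER_NUCLEAR_PLANT:.0f}")  # 30

# U.S. share: "one-quarter to one-third of that load"
print(f"U.S. data-center load: {cloud_load_gw / 4:.1f} to {cloud_load_gw / 3:.1f} GW")

# NYSE: "up to 2,000 gigabytes of data per day" retained for years
nyse_gb_per_day = 2_000
print(f"NYSE data per year: {nyse_gb_per_day * 365 / 1e6:.2f} petabytes")  # ~0.73
```

Even taking the low end of the U.S. share, that is 7.5 gigawatts of round-the-clock demand attributable to data centers alone.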

The impact of the Internet — or, more broadly, the proliferation of digital technology and networks — on energy consumption and greenhouse gas emissions has been a contentious topic since 1999, when technology analyst Mark P. Mills published a study provocatively titled “The Internet Begins with Coal” and co-authored with Peter Huber a Forbes column titled “Dig more coal: The PCs are coming.”

Mills and Huber argued that digital networks, server farms, chip manufacture, and information technology had become a new key driver of electricity demand. And, they said, as the digital economy grows, so does demand for super-reliable power — the kind you can’t get from intermittent sources like wind turbines and solar panels.

Huber and Mills touted the policy implications of their analysis. To wire the world, we must electrify the world. For most nations, that means burning more coal. The Kyoto agenda imperils the digital economy, and vice versa.

Others — notably Joe Romm and researchers at the Lawrence Berkeley National Laboratory (LBNL) — argued that the Internet was a minor contributor to electricity demand and potentially a major contributor to energy savings in such areas as supply-chain management, telecommuting, and online purchasing.

Although Mills’s “ballpark” estimates — 8% of the nation’s electric supply absorbed by Internet-related hardware and 13% of U.S. power consumed by all information technology — were likely much too high in 1999, they may now be close to the mark. On the question of basic trend and direction, Mills was spot on.

Critics scoffed at Mills’s contention that, in 1999, computers and other consumer electronics accounted for a significant share of household electricity consumption. Ten years later, in Gadgets and Gigawatts, the International Energy Agency (IEA) reported that in many OECD households, electronic devices — a category that includes televisions, desktop computers, laptops, DVD players and recorders, modems, printers, set-top boxes, cordless telephones, answering machines, game consoles, audio equipment, clocks, battery chargers, mobile phones, and children’s games — consumed more electricity than traditional large appliances did. The IEA projected that operating those devices would cost households around the world some $200 billion in electricity bills and require the addition of approximately 280 gigawatts (GW) of new generating capacity by 2030. The agency also projected that, even with foreseeable improvements in energy efficiency, electricity consumption by residential electronics would increase 250% by 2030. Saturday’s New York Times article further vindicates Mills’s central insight (even if not his specific estimates).
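For perspective, here is the growth rate the IEA projection implies, sketched under two assumptions of mine: that a 250% increase means consumption ends at 3.5 times its base level, and that the base year is 2009, when Gadgets and Gigawatts appeared:

```python
# Implied annual growth if residential-electronics consumption rises
# 250% (read here as ending at 3.5x the base level) by 2030.
# The 2009 base year is an assumption, not an IEA figure.
growth_multiple = 3.5
years = 2030 - 2009  # 21 years
cagr = growth_multiple ** (1 / years) - 1
print(f"Implied growth rate: {cagr:.1%} per year")  # ~6.1%
```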

Jonathan Koomey, one of the authors of the LBNL critique of Mills’s 1999 study, estimates that, nationwide, data centers consumed about 76 billion kilowatt-hours in 2010, or 2% of U.S. electricity use that year. In a Forbes column published last year, Mills opined that if we factor in three other components of the “digital energy ecosystem” — (1) the energy required to transport data from storage centers to end users, (2) the “electricity used by all the digital stuff on desks and in closets in millions of homes and businesses,” and (3) the energy required to “manufacture all the hardware for the data centers, networks, and pockets, purses and desktops” — then the digital economy’s total appetite “is north of 10% of national electricity use.”
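Koomey’s 2% figure checks out if we assume total U.S. retail electricity sales of roughly 3,800 billion kilowatt-hours in 2010 (a round figure of mine, not his):

```python
# Koomey's estimate: data centers used ~76 billion kWh in 2010.
# Assumed (mine): total U.S. consumption of ~3,800 billion kWh in 2010.
data_center_kwh = 76e9
us_total_kwh = 3_800e9
print(f"Data-center share: {data_center_kwh / us_total_kwh:.1%}")  # 2.0%
```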

The Times laments that data centers “waste” vast amounts of power. On a typical day, only about 6% to 12% of a center’s computing power is actually utilized, yet most of the facility’s servers will be kept running around the clock. To call that wasteful, however, is to confuse the engineering concept of efficiency with the economic concept. In economics, what matters is value to the consumer. Consumers demand reliable, uninterrupted access to data. Keeping all the servers humming ensures the center can handle unexpected peaks in demand without crashing. A center that saves energy but bogs down or crashes will lose customers or go out of business. As one industry analyst told The Times, “They [data center managers] don’t get a bonus for saving on the electric bill. They get a bonus for having the data center available 99.999 percent of the time.”
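That 99.999 percent figure is the industry’s “five nines” standard, and a one-line calculation shows how stringent it is:

```python
# Downtime budget implied by "five nines" (99.999%) availability.
availability = 0.99999
minutes_per_year = 365.25 * 24 * 60  # 525,960 minutes
downtime_minutes = (1 - availability) * minutes_per_year
print(f"Allowed downtime: {downtime_minutes:.2f} minutes per year")  # ~5.26
```

With barely five minutes of allowable downtime a year, there is little room for powering servers down to chase marginal energy savings.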

Obviously, it is in a center’s interest to find ways to provide the same (or greater) value to consumers at lower cost, including lower energy cost. But, notes Mills, efficiency tends to increase consumption, not reduce it:

Car engine energy efficiency improved 500 percent pound-for-pound from early years to the late 20th century. Greater efficiency made it possible to make better, more featured, safer, usually heavier and more affordable cars. So rising ownership and utilization led to 400 percent growth in transportation fuel use since WWII. The flattening of automotive energy growth in the West is a recent phenomenon as we finally see near saturation levels in road-trips per year and cars-per-household. We are a long way from saturation on video ‘trips’ on the information highways.

Efficiency gains are precisely what creates and increases overall traffic and energy demand; more so for data than for other services or products. From 1950 to 2010, the energy efficiency of information processing improved ten trillion-fold in terms of computations per kWh. So a whole lot more data-like machines got built and used — consequently, the total amount of electricity consumed to perform computations increased over 100-fold since the 1950s, if you count just data centers. Count everything we’re talking about here and the energy growth is beyond 300-fold.

Fundamentally, if it were not for more energy-efficient logic processing, storage and transport, there would be no Google or iPhone. At the efficiency of early computing, just one Google data center would consume more electricity than Manhattan. Efficiency was the driving force behind the growth of Internet 1.0 as it will be for the wireless video-centric Internet 2.0.
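Mills’s rebound arithmetic is easy to make explicit: if efficiency (computations per kWh) rose ten trillion-fold while the electricity devoted to computation still grew 100-fold, the number of computations performed must have grown by the product of the two. A minimal sketch, using only the figures Mills cites:

```python
# Mills's figures, 1950-2010: computations per kWh up ten trillion-fold;
# electricity used for computation up ~100-fold (data centers alone).
efficiency_gain = 10e12  # ten trillion-fold
energy_gain = 100.0      # 100-fold
computations_gain = efficiency_gain * energy_gain
print(f"Implied growth in computations performed: {computations_gain:.0e}-fold")
# -> 1e+15-fold
```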

So what’s the solution? Where Mills once argued that the “Internet Begins with Coal,” he now argues that “The Cloud Begins with Coal (and Fracking)”:

Some see the energy appetite of the Cloud as a problem. Others amongst us see it as evidence of a new global tech boom that echoes the arrival of the automotive age. We’re back to the future, where the microprocessor today as an engine of growth may not be new, any more than the internal combustion engine was new in 1958. It’s just that, once more, all the components, features and forces are aligned for enormous growth. With that growth we will find, at the bottom of this particular digital well, the need to dig more coal, frack more shale….

 

BobRGeologist September 25, 2012 at 10:51 pm

I find this heavy data-storage consumption of electric energy alarming. Over time it will keep expanding, maybe exponentially. There is a finite supply of electric energy, and world demand for all other uses must be met to maintain our civilization. It appears that data storage is approaching its limits. The solution seems to be a time limit on storage. If my reasoning is correct, only the vital material must be transferred to a medium such as tape. It seems the pet baby elephant we raised in our living room is now full grown.

BobRGeologist September 25, 2012 at 11:39 pm

The war on coal is pure stupidity on the part of our environmentalists. Some paleoclimates had as much as 20 times the CO2 of today’s atmosphere without causing undue strain on plant or animal life. If we carry on this war on coal, and other sources of CO2, we will be setting ourselves up for the next Pleistocene glacial Ice Age #6. Today our CO2 level is near 400 ppm. During the last glaciation, over 10,000 years ago, it had dropped to 150 ppm. Remember, our sun’s output is sometimes inadequate to prevent our planet icing up, due to the interaction of cosmic and natural forces, without a robust greenhouse gas, our security blanket.
