It's Too Darn Hot
From BusinessWeek, March 30, 2008.
The huge cost of powering—and cooling—data centers has the tech industry scrambling for energy efficiency
A 35-minute drive south of Iceland's capital, Reykjavik, lies the tiny fishing village of Grindavik. One January day, Kristinn Hafliðason steers his car a few minutes out of town to a vast, snow-swept expanse of volcanic rock that juts out into the Atlantic Ocean. He climbs out and launches into an unlikely sales pitch that he hopes will persuade corporations from the U.S. and Europe to locate operations there. "Dozens of companies have expressed interest," he says.
This is no joke. Hafliðason works for Invest in Iceland, a government agency. He's pitching the desolate spot outside of Grindavik as a site for data centers, the sprawling facilities chock-full of computers that tech companies build to handle the swelling oceans of digital information.
It's a testament to the challenges companies face in operating data centers that Google (GOOG), Yahoo! (YHOO), and Microsoft (MSFT) have all checked out this remote corner of the world (although none has made a commitment so far). The reason: Iceland has a rare combination of vacant land, cheap geothermal energy, and chilly climate that makes cooling a data center nearly free.
The tech industry is facing an energy crisis. The cost of powering U.S. data centers doubled between 2000 and 2006, to $4.5 billion, and could double again by 2011, according to the U.S. government. With energy prices spiking, the challenge of powering and cooling these SUVs of the tech world has become a major issue for corporations and utilities. "The digital economy is pervasive," says Andy Karsner, Assistant U.S. Energy Secretary for energy efficiency. "The demands for computing will grow exponentially, but electric consumption can't grow the same way."
The race is on to come up with creative solutions. Companies are scouring the globe for new technologies and advantageous locations. Iceland may have the ideal climate; Saudi Arabia may offer the lowest energy costs. Every company in the business is looking to squeeze expenses in hopes of becoming the low-cost producer in the Digital Age.
Where will the breakthroughs come from? Utilities, construction companies, and tech outfits all are working on the issue. Bruno Michel, a researcher at IBM's (IBM) Zurich lab, is developing ways to translate the biology of the human body into cooling systems for computers. Mark Bramfitt, principal program manager at Pacific Gas & Electric (PCG), is experimenting with incentives to curb overall energy use. Just a few watts per computer can add up. An efficient data center uses about 25% less electricity than a run-of-the-mill one. In a midsize facility, that could amount to $4.5 million a year in savings.
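How do a few watts per computer become millions of dollars? A rough back-of-the-envelope sketch makes the scale clear. The server count, per-server draw, cooling overhead, and electricity price below are illustrative assumptions, not figures reported in this article; only the 25% efficiency gain comes from the text.

```python
# Back-of-the-envelope estimate of what a 25% efficiency gain is worth.
# All inputs are illustrative assumptions, not reported figures.
servers = 25_000          # midsize facility (assumed)
watts_per_server = 400    # average draw per server, watts (assumed)
cooling_overhead = 2.0    # total facility watts per watt of IT load (assumed)
price_per_kwh = 0.10      # dollars (assumed)
hours_per_year = 24 * 365

facility_kw = servers * watts_per_server * cooling_overhead / 1000
annual_bill = facility_kw * hours_per_year * price_per_kwh
savings = 0.25 * annual_bill   # the 25% efficiency gain cited above

print(f"Annual electric bill: ${annual_bill:,.0f}")   # about $17.5 million
print(f"25% savings:          ${savings:,.0f}")       # about $4.4 million
```

With those assumed inputs, the savings land close to the $4.5 million figure; change any input and the number moves accordingly.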
STATE SECRETS
The modern data center is like a vast refrigerator with hundreds or thousands of ovens blazing away inside. Six-foot-tall metal racks stacked with pizza box-size computers, storage devices, and network-routing machines are lined up in rows. Chilled air blows through the equipment from vents in the floors of "cold aisles." Hot air blows out of the back ends into "hot aisles" and is drawn off and vented out of the building. Inside the centers, there's a dull roar as large quantities of air shoot through ducts, vents, and computers.
So intense is the competition among tech companies to lower their costs of processing data that some treat information about their energy use like state secrets. When Google built a data center along the Columbia River in Oregon a few years ago, it bought the land through a third party so its involvement was hidden, and the city manager had to sign a confidentiality agreement. In North Carolina, the state's sunshine laws forced officials to disclose the incentive package the state offered Google to locate a data center there, but the company's plans for power consumption were redacted as trade secrets. Little wonder, perhaps: In the future, the competition between Google and Microsoft in the Web search business might be determined as much by data center energy efficiency as by which company writes the best search algorithm.
Think of the data center as the factory for the info economy. Every day, in thousands of nondescript buildings in the hinterlands or in the basements of office towers, trillions of chunks of information coded in ones and zeros are moved, stored, and assembled into new shapes. Those electronic packets include millions of e-mails, Facebook pages, blog entries, and YouTube (GOOG) videos; vast quantities of electronic transactions; plus an ever-expanding universe of data that needs to be stored, sliced, diced, and analyzed. Tech market researcher IDC (IDC) estimates that, each year, the digital world creates about 3 million times the information contained in all the books ever written.
People don't typically think of information as having substance, but the essence of computing is physics. When bits of data in the form of tiny electrical charges move across the wires of semiconductor chips, with millions of transistors switching on and off, they meet resistance. Overcoming resistance requires energy and creates heat. (If they're run too fast, chips won't burst into flames, but they will crack like an overheated automobile engine block.) Move enough bits around, and you have one huge bill for powering and cooling the equipment: The $4.5 billion spent in the U.S. in 2006 is the equivalent of the electric bills for 5.8 million U.S. households.
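The household comparison can be checked from the article's own numbers; the only added assumption is an average residential electricity rate, used to translate dollars back into energy.

```python
# Sanity check: data center power spending vs. household electric bills.
national_bill = 4.5e9   # dollars spent powering U.S. data centers in 2006 (from the article)
households = 5.8e6      # equivalent number of households cited in the article

per_household = national_bill / households
print(f"Implied annual bill per household: ${per_household:,.0f}")   # about $776

price_per_kwh = 0.10    # assumed average residential rate, dollars per kWh
kwh_per_household = per_household / price_per_kwh
print(f"Implied usage per household: {kwh_per_household:,.0f} kWh/year")   # roughly 7,800 kWh
```

At roughly $776 and 7,800 kilowatt-hours per household per year, the implied figures sit in the ballpark of typical U.S. residential usage, so the comparison holds together.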
Energy consumption in data centers emerged as a major issue in the past couple of years. According to a January survey of its members by AFCOM, an association of data center managers, 88.5% of them described energy consumption as a serious or very serious concern, up from 36.1% five years ago. "You're realizing the cost of all that power. You look for opportunities to reduce costs, but now the cost of power is soaring," says Sal Cefalu, a vice-president for data center operations at telecom giant Verizon (VZ), which has 18 data centers nationwide. These days, for every $1 spent on computing equipment in data centers, an additional 50 cents is spent each year to power and cool them. About half of the energy is for air conditioning.
HEAT MAP
Three years ago, Microsoft began to tackle the energy issue in earnest. CEO Steve Ballmer and other top executives had decided to greatly expand the services offered over the Web to consumers and businesses, including e-mail and instant messaging, to parry threats from Google. So a group of about 20 people gathered at an off-site meeting at the Salish Lodge in Snoqualmie, Wash., with a view of a waterfall out the window, to hash out a forecast of data center needs. "We decided to take the most aggressive, craziest plan we could come up with," says Michael Manos, senior director of data center services. "In hindsight, we were conservative."
Since then, Microsoft has spent more than $2 billion building four gigantic facilities—in Chicago, Dublin, San Antonio, and Quincy, Wash. Now it's looking at locations overseas, including Iceland and Siberia. Its computing needs have doubled each year for the past three years, and Microsoft expects them to continue doubling each year for the foreseeable future.
Energy consumption is a major factor in Microsoft's planning. The company has created what it calls a "heat map" of the globe that takes into account 35 factors in site selection, including cost and availability of power. A group of scientists studies the latest advances in data center design and computing efficiency. And Manos looks for local resources he can take advantage of—such as water from a waste-treatment system that cools computers at the new data center in San Antonio. As a result of all of this effort, Manos believes his 20 data centers are 30% to 50% more efficient than the industry average.
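Microsoft hasn't published its 35 site-selection factors or how it weighs them, but the mechanics of such a heat map are simple enough to sketch: score each candidate location on each factor and combine the scores with weights that reflect priorities. The factors, weights, and sites below are entirely hypothetical.

```python
# Hypothetical sketch of a weighted site-selection score, loosely in the spirit
# of the "heat map" described above. Microsoft's real factors and weights are
# not public; every number here is invented for illustration.
weights = {"power_cost": 0.35, "climate": 0.25, "network": 0.20, "incentives": 0.20}

sites = {
    "Site A (cold climate, cheap hydro)": {"power_cost": 9, "climate": 9, "network": 5, "incentives": 6},
    "Site B (urban, strong fiber links)": {"power_cost": 4, "climate": 3, "network": 9, "incentives": 7},
}

def score(factor_scores):
    """Weighted sum of 0-10 factor scores."""
    return sum(weights[f] * factor_scores[f] for f in weights)

for name, factors in sites.items():
    print(f"{name}: {score(factors):.2f}")
```

In practice the hard part is not the arithmetic but settling on the factors and weights, which is why the company keeps them to itself.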
Solutions to tech's energy crisis won't come easily. The largest data centers can cost more than $200 million to construct and are expensive to upgrade to the latest energy-saving technology.
There are a number of relatively quick fixes, however. Most server computers use only about 10% of their capacity at any given time, so data center operators are using software that shifts computing jobs between servers to make the most of their capabilities. New power management systems shut off servers and other equipment automatically when not in use. And most computer companies have caught the green religion. They claim their newest equipment is ultra-energy-efficient.
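The consolidation idea behind those tools is easy to sketch: if each job needs only a slice of a server, a simple first-fit packing shows how many machines could be powered down. Real workload managers are far more sophisticated, and the job loads below are invented; only the roughly 10% utilization figure comes from the text.

```python
# First-fit consolidation sketch: pack lightly used jobs onto fewer servers so
# the idle machines can be switched off. Loads are percentages of one server,
# and the job sizes are invented for illustration.
def consolidate(job_loads, capacity=80):
    """Assign each job to the first server with headroom; return per-server loads."""
    servers = []
    for load in job_loads:
        for i, used in enumerate(servers):
            if used + load <= capacity:
                servers[i] += load
                break
        else:
            servers.append(load)   # no server has room: power up another one
    return servers

jobs = [10] * 50   # fifty jobs, each using about 10% of a server
packed = consolidate(jobs)
print(f"Servers needed after packing: {len(packed)} (down from 50 running one job apiece)")
```

With these made-up loads, 50 lightly used machines collapse onto seven busy ones, and the rest can be shut off or put into a low-power state.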
The problem is that widely accepted yardsticks don't exist for comparing one server computer with another to show which is greener. For servers or data centers, there's no equivalent of the miles-per-gallon ratings for cars. The Environmental Protection Agency and Energy Dept. are working with industry groups to set up benchmarks.
Although more progress is needed, there have been some major advances in data center energy conservation. In each case, the breakthroughs were the result of scientists and others questioning the conventional wisdom in their industries.
Take the design of the chips themselves. A decade ago the chip industry had a single focus: making chips process bits and bytes ever faster. But Marc Tremblay, a top scientist at Sun Microsystems (JAVA), saw a fatal flaw in that strategy. The faster chips ran, the hotter they got, and eventually they'd be too hot to work properly. So he designed a so-called multicore chip, which has several processors on a single sliver of silicon. Each core runs slower and cooler than a processor that does the same amount of work all by itself.
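The physics behind Tremblay's bet can be sketched with the standard relation for the dynamic power of CMOS logic, which scales linearly with clock frequency and with the square of supply voltage; slower cores can usually run at a lower voltage. The voltages and clock speeds below are illustrative round numbers, not Sun's actual specifications.

```python
# Illustrative comparison using the CMOS dynamic-power relation P ~ C * V^2 * f.
# The voltages and clock speeds are made-up round numbers, not Sun's figures.
def relative_power(voltage, freq_ghz, cores=1):
    """Dynamic power relative to an arbitrary 1 V, 1 GHz, single-core baseline."""
    return cores * (voltage ** 2) * freq_ghz

single_fast = relative_power(voltage=1.2, freq_ghz=2.0)            # one hot, fast core
dual_slow   = relative_power(voltage=1.0, freq_ghz=1.0, cores=2)   # roughly the same total throughput

print(f"Single 2 GHz core: {single_fast:.2f}")
print(f"Two 1 GHz cores:   {dual_slow:.2f}")
print(f"Power saved:       {1 - dual_slow / single_fast:.0%}")     # about 31% with these numbers
```

With these made-up numbers, two slower cores deliver roughly the same aggregate clock cycles for about 30% less power, which is the shape of the argument behind multicore designs even if the exact magnitude varies chip by chip.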
ABOUT-FACE
Because of the complexities of semiconductor technology, it took nearly a decade for Tremblay's innovation to come to market. Since Sun's server computers based on Tremblay's designs went on sale two years ago, they have helped turn the company around and have dramatically slashed energy consumption. Each chip in those servers consumes just 70 watts of power, about one-third that of a conventional microprocessor.
Even the traditionally slow-moving utility industry is showing signs of surprising change. Take PG&E. Two years ago, after Bramfitt was handed the job of designing conservation incentives for Northern California's tech companies, he came up with a creative solution. He decided to offer financial incentives to companies that adopt what's known as virtualization software. Without the software, each server computer typically handles only one program at a time, and only a fraction of its capacity is used. With virtualization software, many programs can be run on a single computer.
The result is that managers can pack more programs onto each server, thus burning less electricity and reducing the number of computers needed. Under Bramfitt's incentive program, PG&E pays customers for every kilowatt-hour of energy they save using the software.
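The article doesn't spell out PG&E's rebate formula, so the figures below, including the payout rate, are placeholders; the point is simply how retired servers translate into kilowatt-hours and then into an incentive check.

```python
# Rough estimate of energy saved by retiring servers after virtualization.
# Server count, wattage, cooling overhead, and the rebate rate are assumptions;
# PG&E's actual incentive terms are not described in the article.
servers_retired = 100
watts_per_server = 400    # assumed average draw of each retired server
cooling_overhead = 2.0    # assumed facility watts per watt of IT load
rebate_per_kwh = 0.08     # placeholder payout rate, dollars per kWh saved
hours_per_year = 24 * 365

kwh_saved = servers_retired * watts_per_server * cooling_overhead * hours_per_year / 1000
print(f"Energy saved: {kwh_saved:,.0f} kWh per year")        # about 700,000 kWh
print(f"Incentive:    ${kwh_saved * rebate_per_kwh:,.0f}")   # about $56,000 at the assumed rate
```

Even at a modest per-kilowatt-hour rate, retiring a hundred lightly used machines adds up quickly, which is exactly the behavior such a program is designed to reward.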
Bramfitt's program is a business model innovation, but most of the data center energy savings are likely to be found in technology advances. IBM researcher Michel is focusing on a seldom-trod territory—where biology and physics meet. Michel, who has a PhD in biochemistry from the University of Zurich, is designing devices that he expects will one day cool chips with a system modeled after the human body. While the processors in server computers are typically cooled with air, these chips are chilled with a liquid delivered through a system similar to the body's capillaries. One of Michel's inventions is a metal cap that fits over a processor and sprays jets of water out of some 50,000 nozzles into microscopic channels etched in the metal. The channels circulate the liquid efficiently and cut the amount of energy required to pump the water.
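The appeal of a water loop over blown air comes down to heat capacity: per unit volume, water can carry a few thousand times more heat than air for the same temperature rise, so far less fan or pump work is needed to move the same thermal load. The comparison below uses textbook properties of air and water, not measurements of IBM's device.

```python
# Heat carried by one liter of coolant warming by 10 degrees C.
# Uses standard material properties; generic physics, not IBM's design data.
delta_t = 10.0   # temperature rise of the coolant, in degrees C

# density (kg/m^3) and specific heat (J per kg per K) near room temperature
air   = {"density": 1.2,    "cp": 1005.0}
water = {"density": 1000.0, "cp": 4186.0}

def heat_per_liter(fluid):
    """Joules absorbed by one liter of fluid warming by delta_t."""
    mass_kg = fluid["density"] * 0.001   # one liter is 0.001 cubic meters
    return mass_kg * fluid["cp"] * delta_t

ratio = heat_per_liter(water) / heat_per_liter(air)
print(f"Air:   {heat_per_liter(air):.0f} J per liter")       # about 12 J
print(f"Water: {heat_per_liter(water):,.0f} J per liter")    # about 41,900 J
print(f"Water carries roughly {ratio:,.0f}x more heat per liter")
```

That gap is the basic reason liquid cooling can move the same heat with far less coolant, and far less energy spent moving it, than blown air.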
While there have been substantial improvements, more innovations will be necessary in fields from technology to utilities. Putting data centers in Iceland or Siberia may help, but that on its own won't be enough to solve tech's energy crisis.
With Kerry Capell in Grindavik, Iceland