We know that computers and related gear consume a lot of electricity. What we don’t know, with any precision, is what “a lot” is. Earlier this year, the Lawrence Berkeley National Laboratory conservatively estimated that running corporate server computers, and keeping them cool, accounted for about 1.2% of total electricity consumption in the United States in 2005. That’s equivalent, the researchers said, to the output of “about five 1000 MW power plants.” (The study excluded the hundreds of thousands – possibly millions, at this point – of custom-built servers that Google operates.)
Corporate servers, though, are just one component of the global computing grid. Now, as Slashdot notes, a journeyman researcher named David Sarokin has taken a crack at estimating the overall amount of energy required to power the country’s computing grid. He combines the Berkeley numbers on server consumption with estimates from other recent studies of the power used by PCs, networking gear, and the telephone network. The numbers, in billions of kilowatt-hours, break down as follows:
Data center servers: 45
PCs and monitors: 235
Networking gear: 67
Phone network: 0.4
That amounts to about 350 billion kWh a year, representing a whopping 9.4% of total US electricity consumption. On a global basis, Sarokin estimates that the computing grid consumes 868 billion kWh a year, or 5.3% of total consumption.
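Sarokin's arithmetic can be checked directly. A quick sketch of the tally, which also backs out the total US and world consumption figures implied by the 9.4% and 5.3% shares (the component numbers come from the list above; everything else is derived):

```python
# Sarokin's component estimates, in billions of kWh per year
components = {
    "Data center servers": 45,
    "PCs and monitors": 235,
    "Networking gear": 67,
    "Phone network": 0.4,
}

# Sum the components: roughly 347 billion kWh, i.e. "about 350"
us_computing = sum(components.values())
print(f"US computing grid: {us_computing:.1f} billion kWh")

# Back out the totals implied by the quoted percentage shares
us_total = us_computing / 0.094   # implied total US consumption
world_total = 868 / 0.053         # implied total world consumption
print(f"Implied total US consumption: {us_total:.0f} billion kWh")
print(f"Implied total world consumption: {world_total:.0f} billion kWh")
```

The implied totals (roughly 3,700 billion kWh for the US and 16,400 billion kWh worldwide) are in line with published consumption figures for the mid-2000s, so at least the percentages are internally consistent with the raw numbers.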
These are, it’s safe to say, rough estimates. Each of the studies Sarokin draws on makes various assumptions of greater or lesser reliability – and Sarokin adds a few assumptions of his own. Still, even if they’re just in the ballpark, his numbers underscore the huge amount of electricity that our modern reliance on computing and the Internet demands. It’s worth noting, moreover, that Sarokin’s study appears to overlook some related sources of power demand – not just custom-made servers but data center components beyond servers and air-conditioning, various peripherals such as printers and scanners, and all the devices that we’re always recharging and that increasingly draw on the Net, from BlackBerrys to iPods to iPhones.
Despite the increasing attention being paid to computing’s vast energy requirements, most companies remain more or less oblivious to how much power their own IT operations are using. According to a report on a survey of top IT executives issued yesterday by the Economist Intelligence Unit (and sponsored by IBM, which has a commercial stake in the issue):
Most IT executives say that their firm does not monitor its IT-related energy spending (and a further 9% don't know) [and] only 12% of respondents believe that the energy efficiency of IT equipment is a critical purchasing criterion.
“Although concerns about energy efficiency and global warming are now high on the political agenda, the spotlight has not yet been turned onto the IT function,” observes the report’s editor. That is sure to change as the public becomes more aware of computing’s oversized carbon footprint.
The EPA recently estimated U.S. data center use at 1.5% of total consumption. It also put the figure at 61 billion kWh, not 45. If Sarokin’s ratios are correct, that’s bad news…
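One way to read that "bad news": if the EPA's 61 billion kWh replaces the 45 billion server figure Sarokin used, and his ratios between servers and the rest of the grid hold, the whole total scales up in proportion. A rough sketch of that hypothetical rescaling (the proportional-scaling assumption is the commenter's, not a measured result):

```python
# Sarokin's original tally, in billions of kWh per year
sarokin_total = 45 + 235 + 67 + 0.4   # ~347.4 billion kWh

# Hypothetical: substitute EPA's 61 billion kWh server figure and
# assume the other components scale by the same factor (61/45)
scaled_total = sarokin_total * 61 / 45

# Total US consumption implied by Sarokin's 9.4% share
us_total = sarokin_total / 0.094

print(f"Rescaled computing grid: {scaled_total:.0f} billion kWh")
print(f"Share of US consumption: {scaled_total / us_total:.1%}")
```

Under that assumption the computing grid's share climbs from 9.4% to roughly 12.7% of US consumption, which is presumably the bad news the comment has in mind.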
From Service Level Automation in the Datacenter: The IT Power Divide:
“What I’ve learned is that there are deep divides between IT and facilities view of electrical efficiency…IT believes that once they get a 1 Mw data center, they should figure out how to efficiently use that 1 Mw–not how to squeeze efficiencies out of the equipment to run at some number measurably below 1 Mw. Meanwhile, facilities gets excited about any technology that reduces overall power consumption and maintains excess power capacity, but lacks the insight into what approaches can be taken that will not impact the business’s bottom line.”
Nick: Great post, as always. However, while we talk about the IT power drain, it would be telling to see data on how much power consumption IT is helping us to avoid as well. For example, how many physical stores have not been built, and do not operate, because of e-commerce? How many automobile trips are avoided, and how much additional office space is no longer necessary, thanks to telecommuting and remote work? Perhaps we’ll find that for every kWh IT consumes, it saves x number of kWhs. Is there any data on this anywhere? It would be interesting to look at the big picture from this perspective.
-Joe McKendrick
Hey, Nick,
Did you notice who commissioned this question? Part of my research into the One Machine.
I did indeed. I didn’t mention it because I was only 99.8% sure it was you and not some other Kevin Kelly.