Cloud Computing’s Coming Energy Crisis

The cloud’s electricity needs are growing unsustainably

Illustration: A cloud on power lines between two towers. Harry Campbell

How much of our computing now happens in the cloud? A lot. Providers of public cloud services alone take in more than a quarter of a trillion U.S. dollars a year. That's why Amazon, Google, and Microsoft maintain massive data centers all around the world. Apple and Facebook, too, run similar facilities, all stuffed with high-core-count CPUs, sporting terabytes of RAM and petabytes of storage.

These machines do the heavy lifting to support what's been called “surveillance capitalism”: the endless tracking, user profiling, and algorithmic targeting used to distribute advertising. All that computing rakes in a lot of dollars, of course, but it also consumes a lot of watts: Bloomberg recently estimated that about 1 percent of the world's electricity goes to cloud computing.

That figure is poised to grow exponentially over the next decade. Bloomberg reckons that, globally, we might exit the 2020s needing as much as 8 percent of all electricity to power the future cloud. That might seem like a massive jump, but it's probably a conservative estimate. After all, by 2030, with hundreds of millions of augmented-reality spectacles streaming real-time video into the cloud, and with the widespread adoption of smart digital currencies seamlessly blending money with code, the cloud will provide the foundation for nearly every financial transaction and user interaction with data.
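
A bit of back-of-envelope arithmetic shows how steep that implied curve is. Assuming the jump from roughly 1 percent to 8 percent plays out over the nine years to 2030 (a rough sketch of Bloomberg's figures, not an official projection):

    # Implied annual growth in the cloud's share of world electricity,
    # if it rises from ~1% to ~8% over nine years.
    start_share, end_share, years = 0.01, 0.08, 9
    annual_growth = (end_share / start_share) ** (1 / years) - 1
    print(f"implied growth in share: {annual_growth:.1%} per year")
    # -> about 26% per year, i.e. the cloud's slice of the grid
    #    doubling roughly every three years.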

How much energy can we dedicate to all this computing? In an earlier time, we could have relied on Moore's Law to keep the power budget in check as we scaled up our computing resources. But now, as we wring out the last bits of efficiency from the final few process nodes before we reach atomic-scale devices, those improvements will hit physical limits. It won't be long until computing and power consumption will once again be strongly coupled—as they were 60 years ago, before integrated CPUs changed the game.

We seem to be hurtling toward a brick wall, as the rising demand for computing collides with decreasing efficiencies. We can't devote the whole of the planet's electricity generation to support the cloud. Something will have to give.

The most immediate solutions will involve processing more data at the edge, before it goes into the cloud. But that only shifts the burden, buying time for rethinking how to manage our computing in the face of limited power resources.

Software and hardware engineering will no doubt reorient their design practices around power efficiency. More code will find its way into custom silicon. And that code will find more reasons to run infrequently, asynchronously, and as minimally as possible. All of that will help, but as software progressively eats more of the world—to borrow a now-famous metaphor—we will confront this challenge in ever-wider realms.
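
As a concrete illustration of that last point, here is a minimal sketch, with assumed names and intervals rather than any particular framework's API, of event handling restructured to run infrequently, asynchronously, and minimally:

    import asyncio

    # A sketch of the "run infrequently, asynchronously, minimally" idea:
    # coalesce events and wake on a slow timer instead of processing each
    # event on arrival. Fewer wakeups and bigger batches generally mean
    # less energy per unit of work.
    FLUSH_INTERVAL_S = 60.0  # assumed: the application tolerates a minute of latency

    async def batch_worker(queue: asyncio.Queue) -> None:
        while True:
            await asyncio.sleep(FLUSH_INTERVAL_S)   # one wakeup per interval
            batch = []
            while not queue.empty():
                batch.append(queue.get_nowait())
            if batch:                               # do nothing at all when idle
                print(f"one wakeup, {len(batch)} events handled together")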

We can already spy one face of this future in the nearly demonic coupling of energy consumption and private profit that provides the proof-of-work mechanism for cryptocurrencies like Bitcoin. Companies like Square have announced investments in solar energy for Bitcoin mining, hoping to deflect some of the bad press associated with this activity. But more than public relations is at stake.
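
To see why proof-of-work couples profit so tightly to energy, consider a stripped-down sketch of the mechanism (a toy difficulty setting, nothing like Bitcoin's real parameters): the only way to earn the reward is to brute-force hashes, so revenue scales with hashes tried, and hashes cost electricity.

    import hashlib

    # Toy proof-of-work: find a nonce whose SHA-256 digest begins with a
    # given number of zero hex digits. Real mining runs this same loop
    # vastly more times per block, on dedicated hardware.
    def mine(block_data: bytes, difficulty: int = 4) -> int:
        nonce = 0
        while True:
            digest = hashlib.sha256(block_data + str(nonce).encode()).hexdigest()
            if digest.startswith("0" * difficulty):
                return nonce  # the "work" is the electricity spent getting here
            nonce += 1

    print(mine(b"example block header"))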

Bitcoin asks us right now to pit the profit motive against the health of the planet. More and more computing activities will do the same in the future. Let's hope we never get to a point where the fate of the Earth hinges on the fate of the transistor.

This article appears in the August 2021 print issue as “Cloud Computing's Dark Cloud.”

The Conversation (4)
Carlos Noriega, 25 Feb 2023

As we switch from fossil fuels to electricity, that 1% will become more significant, and will compete with all the applications that will be created as a result of the energy source transfer.

Joshua Stern, 14 Aug 2021

Hold on there cowboy, yes it's an issue, but right now it's a 1% issue and a lot of people seem pretty happy about spending 1% of their cash on it. Let's look at the well-known history of many orders of magnitude of improvement in computation per dollar, or per watt, over the past decades, and remember it's not just CPUs, it's also storage and communications. Compare storage from a 25 megabyte 2314 disk drive, consuming let's call it 1 kW, to today's SSDs at a couple of watts for 500 terabytes; hey, make it 9 orders of magnitude, and that's a freakin' lot. Do I expect another 9 orders of magnitude? No, not really, but I think we *already* have headroom for the next century in storage. Ditto in communications. So, CPU and DRAM? Eh. I do expect demand for both to continue to soar, and we are at kind of a stop on energy efficiency there, but, if it's any comfort, I think it will be outside the cloud more than inside. We're wasting a ton right now on AI/ML processes I don't much respect. Wouldn't surprise me if we come to our senses on a lot of that and server power consumption goes pretty flat for a decade or two, call it 2x for planning purposes, though again clients may go up 10x. That's my two cents.
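
A quick check of the storage arithmetic above, taking the comment's rough figures at face value (about 1 kW for a 25 MB drive then, a couple of watts for 500 TB now; these are the commenter's assumptions, not measured numbers):

    # Storage energy efficiency, then vs. now, using the comment's figures.
    old_bytes, old_watts = 25e6, 1000.0   # IBM 2314 class drive: ~25 MB at ~1 kW
    new_bytes, new_watts = 500e12, 2.0    # the comment's 500 TB at a couple of watts
    improvement = (new_bytes / new_watts) / (old_bytes / old_watts)
    print(f"bytes stored per watt improved by ~{improvement:.0e}x")
    # -> ~1e10, about 10 orders of magnitude, in line with "make it 9 orders"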

dirk bruere, 02 Aug 2021

It is inevitable that eventually the overwhelming bulk of planetary energy will go into computing.
