Power mad: AI’s massive energy demand risks causing major environmental headaches


November 28 – Artificial intelligence might seem like an ethereal force, reshaping the global economy while remaining invisible to the billions of people whose lives it touches. But while AI exists in cyberspace, it also leaves a huge physical footprint in the real world.

Every time an AI application crunches numbers, analyses data or answers questions, it uses a “graphics processing unit”. These GPUs, which are usually housed in servers located in data centres, consume around four times as much power as the servers used for cloud applications.

This means data centres’ power consumption is set to increase massively. McKinsey estimates that the power needs of U.S. data centres will balloon from 17 gigawatts (GW) in 2022 to 35 GW by 2030.

“That’s going to put such a strain on resources, particularly on power infrastructure, let alone data centre capacity infrastructure,” says Dominic Ward, chief executive of data centre company Verne Global.

Data centres in some countries are already under fire for consuming more than their fair share of resources. In March this year, the irate boss of a Norwegian armaments company blamed “the storage of cat videos” for impeding plans to ramp up production of munitions for Ukraine. TikTok data centres near the company’s factory meant not enough electricity was available to allow production to expand, he said.

Data centres generally also consume large quantities of water in their cooling systems. AI will exacerbate this problem, since power-intensive GPUs have greater cooling requirements than conventional servers. Microsoft’s water usage increased by 34% in 2022, a trend that analysts have pinned to the corporation’s growing investment in generative AI.

On the positive side, the data centre industry has proved in recent years that it can improve efficiency.

“Data centres now are incredibly efficient at managing heat – they’ve got very, very efficient cooling systems,” says Andrew Jay, head of EMEA data centre solutions at CBRE, a provider of data centre services. From an efficiency perspective, he adds, “a brand-new data centre is the most fabulous place” for a server to sit.

Jay notes that servers typically have two power cords but suggests that AI equipment may be developed with only a single power feed. This could boost efficiency by enabling data centres to discard unnecessary electrical infrastructure, he says. And AI applications could themselves help to manage data centres’ power usage to reduce strain on electricity grids.

But the key factor in satisfying data centres’ power demands sustainably is the availability of renewable energy.

Certain regions therefore have a distinct advantage in attracting data centres. Verne Global operates facilities in Iceland, where the electricity grid is entirely powered by renewables, and Finland, which also has a largely low-carbon electricity network. The cold climate across the Nordic region is another major bonus, as it reduces the need for cooling systems.

“Geography is really important,” says Ward. Selecting a Nordic location, he says, is an ideal way for companies with applications that consume high power to reduce their carbon footprints.

The drawback of remote locations is that a greater distance between a data centre and its customers’ operations increases the time data takes to travel between them. The Nordics, however, have mitigated this “latency” problem through improved fibre-optic connections with leading markets.

And Ward emphasises that AI applications mostly do not need to use data centres located in urban areas. This is especially true for their “training” needs, the processes through which the application is taught how to analyse data or recognise patterns. It is only the inference functions, where a ChatGPT-style app responds in real time to a user’s questions, that rely on an ultra-low latency connection.

Ward says that data centre customers need to “think about the best place for a given application.” He acknowledges that some latency-sensitive applications might need an urban location. “But the training and the very dense workloads – they’ve got to sit in a more efficient environment. And if they don’t, we’re going to put such enormous strain on the power grids throughout not just Europe, but the planet, that we’re going to have real problems coming up over the next five to 10 years.”

The problem of data centre electricity usage is such that Microsoft is preparing to deploy nuclear energy, in the form of small modular reactors, to power some of its data centres.

“It seems like a bit of an extreme solution,” says Alex de Vries, a Dutch data scientist who recently published an academic paper that raised concerns over the environmental consequences of AI and data centres.

De Vries argues that data centre expansion will inevitably lead to increased use of fossil fuels. While noting that many data centre customers prioritise green electricity, he says “if they manage to take those renewable energy sources, it just means you have to power something else with fossil fuels.”

The solution, de Vries argues, is “common sense” in deciding whether energy-intensive AI applications are really needed.

“There’s going to be plenty of applications where AI just doesn’t make any sense at all,” he says. “This year we’ve seen so much hype and a lot of fear of missing out – everyone’s just pushing AI into their products regardless of whether it’s a good idea. That’s ultimately where the real waste is.”

