
Internet | July 29, 2024

AI is already straining electricity systems – and we’re just at the beginning

a slightly ominous background of power poles and 0s and 1s representing AI, with a yellow-toned laptop featuring the logos of Meta, Google’s Gemini, ChatGPT and Claude on the screen
AI uses heaps of electricity, and it’s going to get more expensive (Image: The Spinoff)

As well as concerns about intellectual property and ethics, generative AI is incredibly electricity intensive. What is being done about it?

Enter the prompt and the computer will make you wait a few seconds. Sentences start spitting across the screen; a selection of images appears, smooth and vibrant, perhaps uncanny. Assuming no workplace or plagiarism guidelines apply, perhaps you can use the text in an email, or send the image to your group chat.

As generative AI has become increasingly integrated into digital services over the last two years, the amount of energy it uses has become obvious. Google, which has integrated AI into some of its search tools and launched the Gemini tool, recently announced that its carbon emissions have increased by almost 50% – largely due to the energy consumption of its data centres. Similarly, Microsoft, a key backer of OpenAI, has built ChatGPT into some of its own subscription products and has seen its emissions increase by about 30%.

What we know about how much energy AI uses

Allyn Robins, a senior consultant at the Brainbox Institute, a Wellington-based digital policy think tank, points out that it’s almost impossible to know where the electricity for AI comes from. The data centres are scattered around the globe. “There are data centres in the US, Europe, India, Singapore, Malaysia is building lots too,” he says. Where the data centre almost certainly won’t be is New Zealand: none of the major AI companies have data centres here. Construction on an Amazon data centre in West Auckland has paused, and while Microsoft has announced plans to build a New Zealand server farm, it hasn’t yet been completed.

In most cases, these overseas data centres will be using the municipal grid, meaning that your AI request uses another country’s electricity system. Given that about half of the world’s electricity is produced with fossil fuels, your AI-supported speech is probably contributing to rising global emissions.

a man with light brown skin, a slightly scruffy black beard, glossy hair around his ears, clear acetate glasses, and a textured navy shirt
Allyn Robins wants people to know that ephemeral-feeling AI is actually physical infrastructure. (Photo: supplied)

Just how much energy does one AI request use? Estimates vary, but it’s well known that training AI models is resource intensive. “You basically need to run a lot of very powerful computers for a long time,” Robins says. Thousands of chips process thousands of gigabytes of information to make connections and associations that allow AI interfaces to make music, answer questions (not necessarily accurately) and create images (not necessarily within copyright law). AI companies keep their training processes secret, so estimates are based on best guesses.

Once the model has been made, the computers don’t need to run constantly, but will use energy each time they’re queried – making an image or responding to a prompt, say. One estimate, from researchers who ran different generation tasks on a variety of test models, concluded that generating one AI image uses about the same amount of electricity as charging a phone. Text-to-video AI models will presumably use heaps more. “There are so many calculations required to produce a new piece of content,” says Andrew Lensen, a senior lecturer in AI at Victoria University of Wellington.

“If you’re doing that 50 or 100 times to get the image you want, it might feel efficient to you, but you’ve used more electricity than most people appreciate,” Robins says.
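
To put very rough numbers on that, here is a back-of-the-envelope sketch, not taken from the study above: the GPU power draw, generation time and phone battery size are all illustrative assumptions, but they show why “one image per phone charge” is a plausible order of magnitude.

```python
# Back-of-the-envelope sketch only; every number below is an illustrative
# assumption, not a measured value from the study mentioned above.
GPU_POWER_WATTS = 300       # assumed draw of one data-centre GPU under load
SECONDS_PER_IMAGE = 150     # assumed time on a slower, less efficient model
PHONE_BATTERY_WH = 15       # a typical smartphone battery holds roughly 12-19 Wh

energy_wh = GPU_POWER_WATTS * SECONDS_PER_IMAGE / 3600  # watts x hours
regenerations = 50          # "doing that 50 or 100 times to get the image you want"

print(f"One image: ~{energy_wh:.0f} Wh (a phone charge is ~{PHONE_BATTERY_WH} Wh)")
print(f"Fifty attempts: ~{energy_wh * regenerations / 1000:.1f} kWh")
```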

It helps that individuals aren’t paying the bill: most people mucking around with AI services are doing so for free. There are subscriptions for some AI services, but they’re nowhere close to covering the US$600bn a year Sequoia Capital has estimated AI companies need to earn to pay for their physical data centres, digital software and labour costs. “OpenAI is losing millions of dollars per day,” Lensen says. All the other major AI companies are, too.

It doesn’t have to be this way: there are other ways to do AI. “Traditional AI, like Gmail’s spam detection, does narrower tasks,” explains Lensen. “Is this email spam?” has a yes/no answer, less complex than, say, “make me an image of the parrot police chasing an anxious octopus”. This type of machine learning, which doesn’t require as many resources, was the focus for most AI research before generative AI became popular.
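
For readers curious what that narrower kind of machine learning looks like, here is a minimal sketch of a yes/no spam classifier. The handful of subject lines is invented for illustration, and this is not the model Gmail actually uses – just the same general idea: one small model answering one narrow question.

```python
# A tiny "narrow" classifier: is this email subject line spam, yes or no?
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

subjects = [
    "WIN a FREE cruise now!!!",      # spam
    "Claim your prize money today",  # spam
    "Minutes from Tuesday's hui",    # not spam
    "Invoice attached for July",     # not spam
]
labels = [1, 1, 0, 0]  # 1 = spam, 0 = not spam

model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(subjects, labels)

# Cheap to train and cheap to run - no data centre required.
print(model.predict(["Free prize! Claim now"]))  # -> [1], i.e. spam
```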

a cool blue toned room with dark glowing servers in it
The ‘cloud’ is actually a warehouse full of silent, screenless servers – the same kind needed to process AI requests (Photo: Getty)

Generative AI models could also be made more efficient. Information sets fed into AI as training data need to be big, but not all of that data is needed to generate new pieces of data. “You could take an existing big model, prune it to chop off the parts of the model that are underutilised, and it won’t necessarily impact the final output,” Lensen says. “But maybe I’m just saying that because the smaller approaches are what academics like me and my colleagues can afford.”
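
As a rough illustration of the pruning idea Lensen describes, the sketch below uses PyTorch’s built-in pruning utilities on a toy network rather than any production AI model; real savings also depend on sparse-aware hardware or structured pruning, not just zeroing weights.

```python
# A minimal pruning sketch on a toy network (not any production model).
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10))

# Zero out the 30% of weights with the smallest magnitude in each linear
# layer - the "underutilised" parts that contribute least to the output.
for module in model:
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # make the pruning permanent

total = sum(p.numel() for p in model.parameters())
zeros = sum((p == 0).sum().item() for p in model.parameters())
# The model is now sparser; turning that into fewer calculations per query
# requires sparse-aware execution or removing whole neurons/layers.
print(f"{zeros}/{total} weights zeroed out")
```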

This could also create a paradox: if AI becomes more efficient and therefore cheaper, it might be used more – wiping out any benefit of the efficiency in the first place.

It’s also possible to make AI models small enough that they can run on people’s individual phones or computers, without needing the energy-sucking data centres. “These models are less powerful and versatile, they’re harder to create in many ways – but they are cheaper to run and maintain,” Robins says.
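
Here is a minimal sketch of what running a small model locally can look like, assuming the open-source Hugging Face `transformers` library is installed; `distilgpt2` is just an example of a small, openly available model, not one of the systems named in this article, and its output is noticeably rougher than the big commercial services.

```python
# A small open model running on a laptop CPU instead of a data centre.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="distilgpt2",  # ~82M parameters, small enough for a laptop
    device=-1,           # -1 = run on CPU, no GPU or data centre required
)

result = generator("The cheapest way to run AI is", max_new_tokens=30)
print(result[0]["generated_text"])
```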

Beyond that, Lensen says that in many cases, the hype of artificial intelligence is preventing people from asking whether any kind of AI is needed to perform a task. “ChatGPT is so impressive and groundbreaking that it is the only type of AI people think is useful – like generative AI is the solution for everything”. Yes, generative AI has the potential to be incredibly useful – and will undoubtedly continue to be used into the future – but its convenience could prevent people from developing sleeker, more efficient technical solutions to problems.

blue sky with power pylons with notes pegged on them
AI uses lots of electricity, and AI is expensive (Photo: Getty)

AI’s future and the bigger picture

If generative artificial intelligence uses a lot of electricity now, when it isn’t widely integrated into digital systems, then how much will it use when it’s involved in every single search engine, messaging app, email program and video editing tool? “It starts from a small base, using a small number of terawatt hours, and increases from there,” says Nirmal Nair, an associate professor at the University of Auckland who studies power grids and electricity markets. Data centres are already estimated to consume 2-3% of all electricity in the EU, using an estimated 45-65 TWh in 2022. That’s more electricity than all of New Zealand generated and used in the same year.

To Nair, New Zealand’s relatively clean and cheap renewable electricity could be a business opportunity. “If AI servers come to New Zealand, could it be a go-to place to invest?” he asks. Setting up more data centres will require government investment, he says, but if scaled correctly, New Zealand could increase electricity generation to meet the power demand of data centres.

Others are less enthused: Robins points out that because Aotearoa is further from most places where AI services are used, the slight delay added by undersea fibre cables makes exporting data less attractive to overseas investors. “Ultimately, we’re limited by the speed of light and the laws of physics – if you send your data round the world it will take longer to get back to you than if it was local.”

an ocean with some bobbing buoys, a boat and a line which is the undersea cable
The Hawaiki cable connects New Zealand, Australia, American Samoa, Hawaii and the US West Coast.

While concerns about AI’s electricity consumption are justified, Robins points out that both doomsaying and relentlessly promoting AI’s possibilities take the growth of generative AI for granted. “There’s such an incentive to be hyperbolic about AI in both directions, it gets you funding whether you say AI will save the world or destroy it.”

Robins has noticed that predictions about AI’s future electricity use take the industry’s own growth projections for granted – and the more the industry says it’ll grow, the more likely funders are to keep giving it money to run its expensive data centres. “You don’t trust what they say about whether they’re environmentally friendly, so why would you take their word for how they’ll grow?” he adds.

The current focus on the electricity use of AI is just the latest chapter in worries about how many resources our digital lives demand from the planet. At the height of the similarly energy-intense crypto craze, there were dozens of articles about bitcoin miners setting up shop in places with cheap electricity like Kazakhstan, causing pollution and power outages. Cloud storage, the ubiquitous service that keeps 4,000 photos of your pet in iCloud (as well as being used by your bank, streaming services and many other online systems), also needs big server banks and warehouses.

At the very least, AI as it’s being used now is adding to that electricity demand – even if, in a couple of years, that impact fades into the background as everyone gets used to generative AI in customer service chatbots, website images and surreal cat pictures added to email replies. That means that making it less power hungry, or not using AI at all, is something every electricity user will benefit from. “I’m all for cat memes, don’t get me wrong,” says Lensen. “But we need to improve efficiency.”

“It’s natural to think of AI as this ephemeral thing, untethered from the physical world,” says Robins. “But it is wire, it is chips, it is water to cool servers, it is people building data centres with their hands and others walking around them as security guards – it’s a tremendous amount of people and resources.”
