AI and the environment

There’s no doubt that Generative AI (GenAI) has been – and continues to be – revolutionary. It’s not just about making machines smarter, it’s about enhancing human potential – freeing up time for deeper thinking and unlocking new levels of creativity. It’s a complete shift in the way we work, create, and innovate.

These powerful tools can generate text, images, code and even music in a human-like way. Some models can even understand and generate nuanced, context-aware responses. They can produce high-quality work faster – boosting productivity – and they’re able to learn and be customised. It’s been a game-changer for multiple industries.

But behind the scenes, these GenAI models require a massive amount of computational power, which translates into significant energy consumption and environmental impact.

Bound for the backlash

When something is this revolutionary and hyped, a backlash is inevitable. So far, we’ve seen variations of ‘AI is going to take your job’ and ‘It’s not as good as we thought it would be.’ In recent months, though, you’ve likely seen growing concern about AI’s impact on the environment. Some content creators are even calling for people to stop using GenAI in place of Google. But is this a backlash or a legitimate concern?

So, what’s the deal?

Training these advanced AI models isn’t a quick, one-off process; it requires an enormous amount of electricity. And the demand doesn’t stop once the model is built. Running AI-powered applications for millions of users, and continuously fine-tuning them, consumes substantial energy over time. Plus, the hardware that powers GenAI models needs constant cooling, which puts pressure on water supplies and local ecosystems.

“When we think about the environmental impact of generative AI, it’s not just about plugging in a computer,” explains Elsa A. Olivetti, a professor at MIT and the lead of the Decarbonization Mission at MIT’s Climate Project. “The effects ripple out on a much broader scale.”

AI has a ripple effect, especially on the environment

Data centre demand

One major factor driving AI’s environmental footprint is the ever-increasing number of data centres. These high-tech facilities, filled with thousands of servers, store and process vast amounts of data needed to train and run AI models like ChatGPT and DALL-E.

Data centres aren’t new; they date back to the 1940s. But the explosion of GenAI has dramatically increased the demand for them. According to research, the power consumption of North American data centres doubled between 2022 and 2023, largely due to AI’s growing popularity. Globally, data centres used 460 terawatt-hours of electricity in 2022 – enough to make them the 11th largest electricity consumer in the world. We can only assume that’s increased since then. In fact, by 2026, experts predict that global data centre energy consumption could surpass 1,000 terawatt-hours. For perspective, the whole of the UK used around 263 terawatt-hours in 2023.
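To put those figures side by side, here’s a quick back-of-envelope sketch in Python, using only the numbers quoted above; the ratios it prints are simple illustrations rather than new analysis.

```python
# Back-of-envelope comparison of the electricity figures quoted above.
# All values are in terawatt-hours (TWh) and are taken from this article.

DATA_CENTRES_2022_TWH = 460    # global data centre consumption, 2022
DATA_CENTRES_2026_TWH = 1_000  # projected global consumption by 2026
UK_2023_TWH = 263              # approximate UK electricity use, 2023

ratio_2022 = DATA_CENTRES_2022_TWH / UK_2023_TWH
ratio_2026 = DATA_CENTRES_2026_TWH / UK_2023_TWH

print(f"2022: data centres used ~{ratio_2022:.1f}x the UK's annual electricity")
print(f"2026: projected at ~{ratio_2026:.1f}x the UK's annual electricity")
# Prints roughly 1.7x for 2022 and 3.8x for 2026.
```

That roughly 3.8x ratio is where the ‘almost four times the UK’ comparison further down comes from.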

The hidden cost of everyday AI

Even after an AI model is trained, it continues to use energy every time someone interacts with it. For example, asking ChatGPT a question uses about five times more electricity than a simple Google search.

“The ease of using AI means people don’t always think about its environmental impact,” says Noman Bashir, an MIT researcher studying AI’s climate effects. “Without awareness, there’s little incentive for users to limit their AI use.”

GenAI models also have shorter lifespans compared to traditional AI. Companies frequently release newer, more complex versions, requiring even more energy to train, while older models quickly become obsolete.

By 2026, data centres could use almost four times more electricity than the whole of the UK does in one year

Water worries

An often-overlooked concern is the amount of water required to keep data centres cool. For every kilowatt-hour of energy used, a data centre may need about two litres of water for cooling. This heavy water usage has direct consequences for local water supplies and biodiversity.
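For a sense of scale, here’s a minimal sketch that applies the two-litres-per-kilowatt-hour rate to a hypothetical facility drawing a constant 1 MW; the facility size is an assumption chosen purely for illustration.

```python
# Rough illustration of the cooling-water rate quoted above:
# about 2 litres of water for every kilowatt-hour of energy used.
# The 1 MW facility size is a hypothetical example, not a figure from the article.

WATER_LITRES_PER_KWH = 2
FACILITY_POWER_KW = 1_000   # hypothetical facility drawing a constant 1 MW
HOURS_PER_YEAR = 24 * 365

annual_energy_kwh = FACILITY_POWER_KW * HOURS_PER_YEAR
annual_water_litres = annual_energy_kwh * WATER_LITRES_PER_KWH

print(f"Annual energy use: {annual_energy_kwh:,} kWh")
print(f"Estimated cooling water: {annual_water_litres:,} litres per year")
# Roughly 8.76 million kWh and about 17.5 million litres of water a year.
```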

The production of AI hardware also has environmental downsides. High-powered GPUs (specialised processors used for AI) require more energy to manufacture than standard computer chips. The mining of raw materials and the emissions from transporting these components also add to AI’s environmental footprint.

Market research shows that in 2023 alone, over 3.85 million GPUs were shipped to data centres, a significant jump from previous years, with that number expected to rise.

What about the future?

It’s clear the current trajectory of AI development isn’t sustainable, but there are ways to mitigate its environmental impact. Olivetti emphasises the need for a balanced approach, one that considers both the societal benefits of AI and its environmental costs.

“We need a more systematic way of understanding AI’s trade-offs,” she says. “Because technology is advancing so fast, we haven’t yet had a chance to fully measure and assess its impact.”

At Salocin Group, we see the transformative potential of GenAI in driving innovation and efficiency every day. But before adopting it, it’s imperative to understand and address the significant environmental implications that come with its deployment.

Recent studies show that training large AI models consumes a substantial amount of energy, leading to increased carbon emissions and water usage. But it’s not just about emissions and consumption: the Financial Times reports that the energy demands of AI data centres have resulted in over $5.4 billion in public health costs in the US over the past five years, primarily due to air pollution from fossil fuel-based energy sources.

There’s a health cost too

Looking ahead, the industry is poised to face heightened scrutiny from regulators and the wider public concerning AI’s environmental footprint. Legislative actions, such as those proposed in California, aim to enforce energy usage disclosures and efficiency standards for data centres.

This trend suggests that sustainability will become a critical factor in AI development and deployment strategies.

To mitigate environmental concerns, industry must prioritise the development of energy-efficient AI models and invest in infrastructure powered by renewable energy sources. Collaborative efforts between technology companies and governments are essential to establish and adhere to environmental standards. Additionally, integrating sustainability metrics into executive performance evaluations can drive accountability and progress.

For companies aiming to adopt AI responsibly, it’s advisable to conduct comprehensive assessments of the environmental impacts of AI. Selecting AI solutions that emphasise energy efficiency and sustainability is crucial.

Engaging with stakeholders to transparently communicate environmental strategies can also enhance corporate responsibility and public trust.

If you would like to talk about the best way for you to use AI, please get in touch.
