AI has already changed how many of us work, without question. But the environmental cost is often overlooked (and it’s not insignificant). Training GPT-3 produced an estimated 626,000 pounds of CO₂ (roughly 300 London-Luxor return flights). And it’s not just the training: a single ChatGPT query uses roughly 10x the electricity of a Google search.
81% of Americans worry about AI’s side effects (job losses, climate impact). Nearly two-thirds of Gen Z want regulations on AI’s environmental footprint.
So what do we actually do about it? Whether you’re tinkering personally or rewiring your business for AI, here are some things you can do to minimize the impact:
1. Be selective
The greenest computation is the one you don’t run. Don’t use a heavyweight reasoning model for every trivial task. Skip the “AI image of me hugging my younger self.” For businesses – don’t embed an LLM where a workflow or FAQ page will do the job.
Reserve AI for where it genuinely adds value: orchestrating complex data, automating workflows, generating creative content.
2. Use the right tool for the job
AI models come in all sizes. GPT-5, Claude 4 Sonnet, and Gemini 2.5 Pro are brilliant at multi-step reasoning, but energy-hungry. Smaller models like GPT-4o-mini, Gemini 1.5 Flash, Mistral Small, and Phi-3 often deliver comparable results at a fraction of the cost and carbon.
Don’t use a sledgehammer to crack a nut. Build workflows that step down to lighter models whenever possible.
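If you build with code, here’s a minimal sketch of that step-down pattern using the OpenAI Python SDK. The routing heuristic and the model pairing (gpt-4o-mini by default, gpt-4o only when needed) are illustrative assumptions, not a prescription:

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def needs_heavy_model(prompt: str) -> bool:
    # Hypothetical heuristic: treat long or explicitly multi-step prompts as "heavy" work.
    return len(prompt) > 2000 or "step by step" in prompt.lower()

def ask(prompt: str) -> str:
    # Default to the small, cheaper model; escalate only when the heuristic says so.
    model = "gpt-4o" if needs_heavy_model(prompt) else "gpt-4o-mini"
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        max_tokens=300,  # cap the output at what you actually need
    )
    return response.choices[0].message.content

print(ask("Summarise this paragraph in two sentences: ..."))
```

The same idea works with any provider: the expensive model only runs when the cheap one genuinely can’t cope.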
3. Declutter your data
Dark data wastes energy. Around 52% of stored enterprise data is never used, yet it still consumes storage and cooling capacity, an estimated 6.4 million tons of CO₂ wasted every year.
The fix: strong data retention policies, regular audits, and a tidy-up of cloud drives and inboxes. Leaner data = cleaner AI.
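As a small, hypothetical example of what a regular audit can look like in practice: a Python script that flags files nobody has touched in a year so they can be reviewed, archived, or deleted. The path and the 365-day threshold are placeholders:

```python
import time
from pathlib import Path

STALE_AFTER_DAYS = 365          # placeholder retention threshold
ROOT = Path("/data/shared")     # placeholder: the drive you want to audit

cutoff = time.time() - STALE_AFTER_DAYS * 24 * 3600
stale = [
    p for p in ROOT.rglob("*")
    if p.is_file() and p.stat().st_mtime < cutoff
]

total_gb = sum(p.stat().st_size for p in stale) / 1e9
print(f"{len(stale)} files ({total_gb:.1f} GB) untouched for over {STALE_AFTER_DAYS} days")
for p in stale[:20]:            # print a sample for human review
    print(" ", p)
```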

4. Streamline your interactions
Every AI prompt consumes server power, and multiplied across millions of users that adds up fast. Reducing consumption doesn’t need to be complicated: ask clear questions (fewer retries), batch related queries, and request only the output length you actually need.
Save and reuse AI outputs instead of regenerating from scratch. For enterprises, this is where AI agents shine: instead of recreating content each time, agents adapt and reapply templates (reports, proposals, FAQs, marketing copy). This keeps outputs consistent, compliant, and efficient whilst cutting the carbon footprint and making teams faster.
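To make the save-and-reuse habit concrete, here’s a minimal caching sketch: identical prompts are answered from disk instead of being regenerated. The cache directory is an assumption, and `generate` stands in for whatever model call you already have (the `ask()` helper from section 2 would do):

```python
import hashlib
import json
from pathlib import Path

CACHE_DIR = Path(".ai_cache")   # assumed location for saved outputs
CACHE_DIR.mkdir(exist_ok=True)

def cached_generate(prompt: str, generate) -> str:
    """Return a saved answer if we have seen this prompt before; otherwise call the model once."""
    key = hashlib.sha256(prompt.encode("utf-8")).hexdigest()
    cache_file = CACHE_DIR / f"{key}.json"
    if cache_file.exists():
        return json.loads(cache_file.read_text())["output"]
    output = generate(prompt)   # only hit the model on a cache miss
    cache_file.write_text(json.dumps({"prompt": prompt, "output": output}))
    return output
```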
5. Choose greener infrastructure and providers
Microsoft and other tech giants are investing heavily here. Microsoft has committed to running on 100% renewable energy by 2025 and to being carbon negative by 2030. Choose providers that prioritise green data centres, and your AI workloads inherit those benefits.
If you’re running your own models, schedule training and heavy inference when renewable energy supply is high. Design pipelines with efficiency in mind from the start.
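For UK workloads, the National Grid ESO Carbon Intensity API (api.carbonintensity.org.uk) is one free signal you can gate heavy jobs on; similar services exist elsewhere. The threshold and polling interval in this sketch are assumptions, not recommendations:

```python
import time
import requests

THRESHOLD = 200          # gCO2/kWh; an assumed cut-off for "clean enough"
CHECK_EVERY = 30 * 60    # re-check the grid every 30 minutes

def current_intensity() -> int:
    resp = requests.get("https://api.carbonintensity.org.uk/intensity", timeout=10)
    resp.raise_for_status()
    return resp.json()["data"][0]["intensity"]["forecast"]

def run_when_grid_is_clean(job):
    # Wait until forecast carbon intensity drops below the threshold, then run the job.
    while (intensity := current_intensity()) > THRESHOLD:
        print(f"Grid at {intensity} gCO2/kWh, waiting...")
        time.sleep(CHECK_EVERY)
    job()   # e.g. kick off a fine-tuning run or a batch inference pass
```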
6. Design responsibly
Not every problem needs an LLM (sometimes AI looks like a solution in search of a problem!). LLMs burn compute, add latency, and don’t always improve UX.
Ask: does this really need an LLM? Often the smartest, greenest choice is workflow automation, decision trees, or templated responses. If you do need AI, default to lightweight models before escalating.
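As a toy illustration of templated-responses-first, the sketch below checks an FAQ with simple fuzzy matching and only falls back to a model when nothing matches. The FAQ entries and the 0.6 similarity cut-off are made up for the example:

```python
from difflib import SequenceMatcher

FAQ = {
    "what are your opening hours": "We're open 9am to 5pm, Monday to Friday.",
    "how do i reset my password": "Use the 'Forgot password' link on the login page.",
}

def answer(question: str, llm_fallback) -> str:
    q = question.lower().strip("?! .")
    best = max(FAQ, key=lambda known: SequenceMatcher(None, q, known).ratio())
    if SequenceMatcher(None, q, best).ratio() > 0.6:
        return FAQ[best]            # answered from the template, no model call
    return llm_fallback(question)   # escalate only when the FAQ can't help
```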

A final thought
The future’s promising. Small Language Models (SLMs) are gaining traction: faster, more focused, and significantly more energy-efficient. If adoption grows, AI can become both more sustainable and more accessible.
Mindful AI isn’t about using less – it’s about using better. For individuals, that’s good habits. For enterprises, it’s optimising, automating, then AI-ing (as we are doing with our Reimagine & Rewire programme internally).
Done right, you get productivity and sustainability at once.