PSF Insights: The AI Energy Dilemma

By Tess Galloway, Senior Analyst

Introduction

When an old computer is on for too long, it heats up and whirs loudly - we can feel and hear the energy it consumes. With AI, everything feels instant. There’s nothing tangible to contextualise the physical toll of our endless questions and searches.

Over the past two years, artificial intelligence has slotted itself into our lives and become a necessity. Yet, as we employ it more mindlessly, its environmental impact remains largely overlooked.

This past month, fresh competition in the AI space has ignited concerns about sustainability. Chinese company DeepSeek entered the market, challenging the dominance of US-based large language models with a bold claim: greater efficiency and a reduced environmental footprint. AI systems consume vast amounts of energy and water, but DeepSeek promises a development model that requires significantly less of both. Such competition has opened the floor to criticism, bringing the environmental impact of AI into the limelight.

AI Hype

The launch of ChatGPT by OpenAI in 2022 marked the start of a new AI-obsessed era, garnering 100 million users in two months. ChatGPT’s ability to generate human-like text whilst answering questions drew global attention from consumers and investors alike. The stock market’s performance in recent years has been driven largely by the “Magnificent Seven” (Alphabet, Amazon, Apple, Meta, Microsoft, Nvidia, and Tesla), whose success is closely tied to the growth of AI.

AI-related investments have driven significant market growth, but it remains uncertain whether these companies will ultimately capture lasting profits from AI or whether the hype will outpace real financial returns. However, 2025 has seen more decisive government action. Donald Trump’s $500 billion AI infrastructure initiative and the recent AI Action Summit in Paris demonstrate large-scale global backing. If we are getting serious about AI, it is crucial to understand its real-world applications and implications.

Do We Understand AI?

Despite the hype, AI adoption remains uncertain for users and businesses. In offices, for example, people use AI in secret, fearing it will put their jobs at risk or prompt a reshuffling of roles. Meanwhile, only 5% of US businesses report using AI in their products, while 75% of AI revenue comes from individual users rather than corporate subscriptions. This lack of widespread, open adoption contributes to ongoing scepticism about AI’s real-world impact.

The Financial Times raises an interesting point: we tend to think of the internet and AI as a “disembodied thing (like a cloud)”. The massive computations powering AI don’t happen on our personal devices; they rely on vast, energy-intensive data centres that are far out of sight and out of mind for most. In reality, AI is having very real and damaging effects on our planet, and the “cloud” narrative obscures the unglamorous physical infrastructure that makes this “thing” work.

The Environmental Impact

Generative AI such as ChatGPT works much harder per query than a conventional search engine like Google. Dr. Sasha Luccioni, a machine learning specialist, explains that “every time you query the model, the whole thing gets activated, so it’s wildly inefficient from a computational perspective”. LLMs are trained on massive amounts of text, allowing them to generate responses to almost any query. “When you use Generative AI, it’s generating content from scratch,” Dr. Luccioni explains. That takes a lot of computing power. It is thought that if Google were to adopt an LLM-powered search, its energy consumption could increase tenfold.
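
To make the scale of that tenfold figure concrete, a rough back-of-envelope sketch is below. The per-query energy values and daily query volume are illustrative assumptions, not figures from this article; the only thing taken from the estimate above is the roughly tenfold ratio between a conventional search and an LLM-generated answer.

```python
# Back-of-envelope sketch: illustrative numbers only, not measured figures.
SEARCH_WH_PER_QUERY = 0.3   # assumed energy per conventional search (Wh)
LLM_WH_PER_QUERY = 3.0      # assumed energy per LLM-generated answer (Wh)
QUERIES_PER_DAY = 8.5e9     # assumed daily search volume (illustrative)

def daily_energy_mwh(wh_per_query: float, queries: float) -> float:
    """Convert per-query watt-hours into total megawatt-hours per day."""
    return wh_per_query * queries / 1e6

search_mwh = daily_energy_mwh(SEARCH_WH_PER_QUERY, QUERIES_PER_DAY)
llm_mwh = daily_energy_mwh(LLM_WH_PER_QUERY, QUERIES_PER_DAY)

print(f"Conventional search: ~{search_mwh:,.0f} MWh per day")
print(f"LLM-powered search:  ~{llm_mwh:,.0f} MWh per day "
      f"({llm_mwh / search_mwh:.0f}x more)")
```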

In the US and some European countries, AI and data centres consume around 3% to 4% of electricity demand, higher than the global average. Given our increasingly digital world, this level of consumption may seem inevitable, even relatively modest. The main concern is how it could spiral: datasets, models, and the energy needed to train and run them are all growing in step.

According to OpenAI researchers, the amount of computing power required to train AI has doubled every 3.4 months since 2012. Training GPT-4 alone consumed enough electricity to power 5,000 American homes for a year, 50 times more than its predecessor, GPT-3. As AI proliferates across industries, this energy demand will only grow. And when AI is used to optimise energy-intensive processes such as manufacturing, the experimentation and extra data that requires will push demand up further still.
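
To see what a 3.4-month doubling time compounds to, a quick calculation helps (the doubling time is the figure cited above; the rest is straightforward arithmetic):

```python
# Compound growth implied by a 3.4-month doubling time in training compute.
DOUBLING_MONTHS = 3.4

def growth_factor(months: float) -> float:
    """Factor by which compute grows over the given number of months."""
    return 2 ** (months / DOUBLING_MONTHS)

print(f"Growth over one year:   ~{growth_factor(12):.0f}x")    # roughly 12x
print(f"Growth over two years:  ~{growth_factor(24):.0f}x")    # roughly 130x
print(f"Growth over five years: ~{growth_factor(60):,.0f}x")   # roughly 200,000x
```

At that pace, training compute grows by roughly an order of magnitude every year, which is why even substantial efficiency gains can be absorbed quickly.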

DeepSeek Enters the Market

Markets were spooked on January 27th when DeepSeek released its R1 model, which was developed and trained at a fraction of the usual cost of an LLM. DeepSeek quickly overtook ChatGPT as the most downloaded app, and chip-making giant Nvidia, then the world’s most valuable company, lost a record-breaking $600bn of its market value in a single day, perhaps a reflection of how much of its valuation rests on its chips powering the leading LLMs.

US export restrictions on advanced chips likely forced the Chinese company to be resourceful. By using older Nvidia chips that remain accessible in China, DeepSeek achieved dramatic cost cuts, undermining Nvidia’s market clout. Its R1 model reduced both the computation time required for training and the memory needed to store it.
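
As a rough illustration of where memory savings of this kind can come from, storing weights at lower numerical precision shrinks a model’s footprint in proportion to the bytes used per parameter. The parameter count and precision options below are assumptions chosen for illustration, not DeepSeek’s published figures:

```python
# Illustrative only: how numerical precision affects the memory needed to
# store a model's weights. The parameter count is an assumed round number,
# not a figure for any specific model.
PARAMS = 100e9  # assumed 100-billion-parameter model

def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    """Approximate memory needed just to store the weights, in gigabytes."""
    return num_params * bytes_per_param / 1e9

for label, bytes_per_param in [("FP32", 4), ("FP16", 2), ("FP8", 1)]:
    print(f"{label}: ~{weight_memory_gb(PARAMS, bytes_per_param):,.0f} GB")
```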

American companies dismiss the innovation behind DeepSeek’s work, claiming it is merely a consequence of “distilling American models”. However, the American strategy of funnelling investment into a few monopoly firms and acquiring the best, newest chips to maintain AI leadership has been called into question. The fact that DeepSeek has managed to close the gap by focusing on efficiency and cost-effectiveness threatens the anticipated monopoly-like profits investors have bet on.

A hole has also been blown in Siemens Energy’s valuation, signalling expectations that DeepSeek could shift AI’s trajectory away from energy-intensive models. A shadow of doubt now hangs over the robust growth expectations of companies that supply data centres. Terry Smith, manager of one of the UK’s biggest funds, Fundsmith Equity, warned against picking long-term winners in an evolving space, comparing Nvidia and Microsoft’s AI dominance to past tech leaders like BlackBerry and Myspace.

The Future

Despite DeepSeek’s clear success, the existing monopolies are pinning their hopes on the Jevons paradox: the idea that increased efficiency leads to greater overall resource use, not less. Satya Nadella, CEO of Microsoft, remains confident that this will ultimately benefit everyone in the AI space as AI becomes more of a commodity.

Our growing appetite for digital lives inevitably means more energy consumption, and the anticipated surge in AI usage makes minimising its environmental impact a critical priority. Tech companies such as Microsoft, Google and Apple are making visible efforts on this front, investing heavily in wind, solar, battery and nuclear power for their future energy needs. OpenAI are also big advocates for small modular reactors. Perhaps these investments will eventually cancel out the extreme energy consumption of their endeavours; perhaps this is simply a turbulent phase of innovation, with efficiency and offsetting to be worked out later.

AI is only just getting started as an investment opportunity, yet its significant impact across various sectors is already apparent. Whilst first movers do not always stay on top, the original frontrunners remain very well placed in a highly competitive marketplace, with Nvidia already recovering half of its losses. We can expect more significant changes before the technology becomes ubiquitous, and, along the way, louder calls for increased sustainability.

References

  1. C. Baraniuk, “Electricity grids creak as AI demands soar,” BBC, May 21, 2024. Available at: https://www.bbc.co.uk/news/articles/cj5ll89dy2mo (accessed Feb. 23, 2025).
  2. A. Duncan, “DeepSeek: What lies under the bonnet of the new AI chatbot?,” BBC, Feb. 1, 2025. Available at: https://www.bbc.co.uk/future/article/20250131-what-does-deepseeks-new-app-mean-for-the-future-of-ai (accessed Feb. 23, 2025).
  3. P. Jiang, C. Sonne, W. Li, F. You, and S. You, “Preventing the Immense Increase in the Life-Cycle Energy and Carbon Footprints of LLM-Powered Intelligent Chatbots,” ScienceDirect, 2024. Available at: https://www.sciencedirect.com/science/article/pii/S2095809924002315 (accessed Feb. 24, 2025).
  4. S. Kolostyak, “Fundsmith Underperforms, Smith Takes Aim at AI Hype,” Morningstar, Jan. 10, 2024. Available at: https://www.morningstar.co.uk/uk/news/244710/fundsmith-underperforms-smith-takes-aim-at-ai-hype.aspx (accessed Feb. 23, 2025).
  5. Anonymous, “Nvidia is now the world’s most valuable company,” The Economist, June 20, 2024. Available at: https://www.economist.com/business/2024/06/20/nvidia-is-now-the-worlds-most-valuable-company (accessed Feb. 23, 2025).
  6. R. Shanbhogue, “Will the bubble burst for AI in 2025, or will it start to deliver?,” The Economist, Nov. 18, 2024. Available at: https://www.economist.com/the-world-ahead/2024/11/18/will-the-bubble-burst-for-ai-in-2025-or-will-it-start-to-deliver# (accessed Feb. 24, 2025).
  7. A. Scagg, “AI’s ‘relentless thirst for power’,” Financial Times, Jan. 16, 2025. Available at: https://www.ft.com/content/852bc3e2-d7fb-467b-9651-077a7d09a0ce (accessed Feb. 23, 2025).
  8. Anonymous, “The real meaning of the DeepSeek drama,” The Economist, Jan. 29, 2025 (accessed Feb. 23, 2025).
  9. H. Ritchie, Sustainability by Numbers, Nov. 18, 2024. Available at: https://www.sustainabilitybynumbers.com/p/ai-energy-demand (accessed Feb. 24, 2025).
  10. R. Waters, “Big Tech is moving on from the DeepSeek shock,” Financial Times, Feb. 13, 2025. Available at: https://www.ft.com/content/42d4c050-adba-4262-9cdf-b407b144d0a7 (accessed Feb. 22, 2025).