How can we reduce data centres’ emissions through AI?

Companies are facing a Catch-22: they need to invest in new forms of AI while continuing to hit increasingly stringent sustainability targets, writes Maxime Vermeir, Senior Director for AI Strategy at ABBYY.

Research from Morgan Stanley estimates that as data centres expand to meet rising demand for AI and cloud infrastructure, they are likely to produce around 2.5 billion metric tonnes of carbon dioxide-equivalent emissions globally by the end of the decade. To put that scale in perspective, training a single AI model can consume as much electricity as 100 homes use in a year.
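As a rough, illustrative sanity check of that comparison (the figures below are ballpark assumptions based on published estimates for one large training run and typical household consumption, not data from the Morgan Stanley research):

```python
# Illustrative back-of-envelope estimate only; both figures are assumptions,
# not measurements from the research cited above.
TRAINING_ENERGY_MWH = 1_300   # rough published estimate for one large-model training run
HOME_ANNUAL_MWH = 10          # order-of-magnitude annual electricity use of one household

homes_equivalent = TRAINING_ENERGY_MWH / HOME_ANNUAL_MWH
print(f"One training run ~ annual electricity of about {homes_equivalent:.0f} homes")
# ~130 homes, broadly consistent with the '100 homes for one year' comparison
```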

So where does this leave business leaders trying to balance the need for innovation with their environmental commitments?

Optimise AI for renewable energy

Data centre operators aim to manage costs and ensure reliable performance by maintaining steady energy consumption. However, they rely on electricity grids and renewable sources whose output can fluctuate with the weather. Without sufficient storage, such as pumped hydro, operators may still have to fall back on fossil fuels when renewable generation falls short.

AI tools can help overcome these challenges by optimising energy use. For example, models can forecast the availability of solar energy from weather data, and predictive analytics like this could allow data centres to shift flexible workloads to coincide with peaks in renewable generation, reducing their dependence on fossil fuels.
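A minimal sketch of what such workload shifting might look like in practice, assuming an hourly solar forecast has already been produced from weather data; the forecast values, job names and energy figures below are hypothetical:

```python
# Minimal scheduling sketch: place deferrable batch jobs into the hours with the
# highest forecast renewable output. All values below are illustrative.

# Forecast share of demand covered by solar for the next 8 hours, e.g. derived
# from weather data by a predictive model (hypothetical values).
solar_forecast = {9: 0.15, 10: 0.40, 11: 0.65, 12: 0.80, 13: 0.75, 14: 0.55, 15: 0.30, 16: 0.10}

# Deferrable workloads with estimated energy draw in kWh (hypothetical).
batch_jobs = [("nightly-backup", 120), ("model-retraining", 400), ("log-compaction", 60)]

# Greedily assign the most energy-hungry jobs to the greenest hours.
green_hours = sorted(solar_forecast, key=solar_forecast.get, reverse=True)
schedule = {
    job: hour
    for (job, _), hour in zip(sorted(batch_jobs, key=lambda j: j[1], reverse=True), green_hours)
}

for job, hour in schedule.items():
    print(f"{job}: run at {hour}:00 (forecast solar share {solar_forecast[hour]:.0%})")
```

In this toy version the heaviest job lands in the hour with the highest forecast solar share; a real scheduler would also account for deadlines, capacity limits and grid carbon intensity.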

While AI undoubtedly has an environmental footprint of its own, many in the industry believe its benefits in combating the climate crisis will outweigh the risks. With the rise of energy-efficient algorithms and groundbreaking cooling systems, AI can be a powerful ally in the fight against climate change.

Utilise new AI technologies

AI is already playing a key role in redesigning sustainability programmes, and it has the potential to significantly enhance energy efficiency. By optimising operations and eliminating unnecessary processes, AI could lead to a substantial reduction in overall energy consumption over time. One way it is doing this is through digital twin technology.

There are many variations of digital twins, and depending on the type, the technology may cost anywhere from around £50,000 to over £766,000 to deploy.

A digital twin is a virtual replica of data centre infrastructure that can be used to generate real-time insights and improve efficiency. By automating tasks such as cooling adjustments, it reduces both human error and the amount of energy used.
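To make the idea concrete, here is a toy sketch rather than a production digital twin: a simplified thermal model of a single rack is used to pick the lowest cooling level that keeps the predicted inlet temperature under a target. All constants are made up for illustration.

```python
# Toy digital-twin sketch: a simplified thermal model of one rack, used to pick a
# cooling fan level just high enough to hold a target inlet temperature.
# All constants are illustrative, not real data-centre parameters.

TARGET_TEMP_C = 27.0                     # desired inlet temperature
AMBIENT_C = 22.0                         # supply air temperature
FAN_LEVELS = [0.2, 0.4, 0.6, 0.8, 1.0]   # normalised cooling capacity
FAN_POWER_KW = 5.0                       # power draw at full fan speed

def simulate_rack_temp(it_load_kw: float, fan_level: float) -> float:
    """Virtual replica: predict steady-state inlet temp for a given load and fan level."""
    cooling_kw = fan_level * 40.0            # assumed heat-removal capacity
    residual_heat = max(it_load_kw - cooling_kw, 0.0)
    return AMBIENT_C + 0.5 * residual_heat   # assumed 0.5 C rise per residual kW

def choose_fan_level(it_load_kw: float) -> float:
    """Pick the lowest fan level whose simulated temperature stays under target."""
    for level in FAN_LEVELS:
        if simulate_rack_temp(it_load_kw, level) <= TARGET_TEMP_C:
            return level
    return FAN_LEVELS[-1]

load = 28.0  # current IT load in kW (illustrative telemetry reading)
level = choose_fan_level(load)
print(f"Load {load} kW -> fan level {level:.1f}, est. cooling power {level * FAN_POWER_KW:.1f} kW")
```

The point of the twin is that the "what if" runs happen on the model, so the physical cooling plant only ever moves to the setting the simulation has already shown to be sufficient.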

Pivot to purpose-built AI

Finally, using specialised or purpose-built AI such as small language models (SLMs) can significantly reduce energy consumption. Advanced platforms such as generative AI and large language models (LLMs) have come under scrutiny for their high energy usage, which stems from the sheer scale of the models and the volume of data they must process to yield results.

Instead, enterprises have begun to pivot to purpose-built AI specialised for narrower tasks and goals. These solutions are tailored to improve accuracy in real-world scenarios. For example, ABBYY trains its machine learning and natural language processing (NLP) models to read and understand documents that run through enterprise systems, much as a human would. With pre-trained AI skills that process highly specific document types with 95% accuracy, organisations can save trees by eliminating paper while also reducing the carbon emitted through cumbersome document management processes.
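A rough illustration of why smaller, purpose-built models draw less energy per request: compute for a dense transformer scales roughly with parameter count (about 2 x parameters FLOPs per token), so a model a fraction of the size does a correspondingly small fraction of the work. The parameter counts, token count and hardware efficiency below are assumptions for illustration only, not figures for any specific product.

```python
# Rough illustration of why purpose-built small models cost less energy per request.
# Rule of thumb for dense transformers: ~2 * parameters FLOPs per processed token.
# Parameter counts, token count and hardware efficiency are all assumptions.

HARDWARE_FLOPS_PER_JOULE = 1e12   # assumed effective efficiency of the serving hardware
TOKENS_PER_REQUEST = 500          # assumed tokens processed per document/request

def joules_per_request(params: float) -> float:
    flops = 2 * params * TOKENS_PER_REQUEST
    return flops / HARDWARE_FLOPS_PER_JOULE

llm_params = 175e9   # large general-purpose model (illustrative)
slm_params = 1e9     # small purpose-built model (illustrative)

print(f"LLM: ~{joules_per_request(llm_params):.0f} J per request")
print(f"SLM: ~{joules_per_request(slm_params):.2f} J per request")
print(f"Ratio: ~{llm_params / slm_params:.0f}x less compute per request for the smaller model")
```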

The future of AI will be determined by regulation

AI regulation is still in its early stages, and some aspects are yet to be fully refined. The EU's AI Act, which aims to regulate the use of AI systems based on their risk levels, came into force in August 2024. It requires AI systems to prioritise transparency and safety, which could encourage responsible AI development and usage. However, how effective it ends up being will depend on how well it is implemented.

Currently, regulations lean towards the data security and privacy side, often neglecting the environmental impacts of AI. While security concerns are clearly vital to address, it’s important to consider the broader implications of AI on the environment as well.

To my mind, better-defined national, regional and international frameworks are needed for energy consumption, especially given the energy sector's role in the global economy and its importance to climate goals.

As more regulations are proposed and come into play, it remains important to strike the right balance between regulation on the one hand and giving businesses enough creative freedom to make advances in AI on the other.

In the absence of sustainability-specific legislation, it falls to business leaders to take the initiative and ensure transparency and accountability around their sustainability credentials when implementing AI. Companies willing to embrace this challenge can not only gain substantial economic advantages but also set a new standard for sustainable innovation.
