Image: AI concept – a digital image of a planet. Credit: Getty.

AI has a large and growing carbon footprint, but there are potential solutions on the horizon

Given the huge problem-solving potential of artificial intelligence (AI), it wouldn’t be far-fetched to think that AI could also help us tackle the climate crisis. However, when we consider the energy needs of AI models, it becomes clear that the technology is as much a part of the climate problem as a solution.

The emissions come from the infrastructure associated with AI, such as building and running the data centres that handle the large amounts of information required to sustain these systems.

But different technological approaches to how we build AI systems could help reduce its carbon footprint. Two technologies in particular hold promise for doing this: spiking neural networks and lifelong learning.

The lifetime of an AI system can be split into two phases: training and inference. During training, a relevant dataset is used to build and tune – improve – the system. In inference, the trained system generates predictions on previously unseen data.
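To make those two phases concrete, here is a minimal Python sketch using a toy model and made-up data – both are illustrative assumptions, not anything from the article. The loop adjusts the model’s parameters during training, and the fixed parameters are then used for inference on unseen inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: 100 examples with 3 features and a binary label.
X_train = rng.normal(size=(100, 3))
y_train = (X_train.sum(axis=1) > 0).astype(float)

w = np.zeros(3)  # the model's parameters, tuned during training

# --- Training phase: repeatedly adjust parameters to fit the data ---
for _ in range(500):
    preds = 1 / (1 + np.exp(-(X_train @ w)))             # sigmoid predictions
    grad = X_train.T @ (preds - y_train) / len(y_train)  # gradient of the loss
    w -= 0.1 * grad                                      # gradient-descent step

# --- Inference phase: parameters are frozen; predict on unseen data ---
X_new = rng.normal(size=(5, 3))
print(1 / (1 + np.exp(-(X_new @ w))))  # predictions for new examples
```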

For example, training an AI for use in self-driving cars would require a dataset of many different driving scenarios and the decisions taken by human drivers.

After the training phase, the AI system will predict effective manoeuvres for a self-driving car. Artificial neural networks (ANNs) are the underlying technology used in most current AI systems.

ANNs are made up of many different elements, called parameters, whose values are adjusted during the training phase. These parameters can run to more than 100 billion in total.

While large numbers of parameters improve the capabilities of ANNs, they also make training and inference resource-intensive processes. To put things in perspective, training GPT-3 (the precursor AI system to the current ChatGPT) generated 502 metric tonnes of carbon, which is equivalent to driving 112 petrol-powered cars for a year.
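For a rough sense of that scale, the back-of-the-envelope sketch below uses the widely quoted rule of thumb that training compute is roughly six floating-point operations per parameter per training token. The token count and the hypothetical accelerator speed are assumptions for illustration, not figures from the article.

```python
# Public estimates: GPT-3 has ~175 billion parameters and was trained
# on roughly 300 billion tokens. A common heuristic puts training
# compute at about 6 x parameters x tokens floating-point operations.
params = 175e9
tokens = 300e9
flops = 6 * params * tokens
print(f"training compute ~ {flops:.1e} FLOPs")  # ~3.2e+23

# At a hypothetical sustained 1e14 FLOPs per second on one accelerator:
years = flops / 1e14 / (3600 * 24 * 365)
print(f"~ {years:.0f} years on a single device")  # ~100 years
```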

GPT-3 continues to emit carbon due to inference. And since the AI boom started in the early 2010s, the energy requirements of AI systems known as large language models (LLMs) – the type of technology that’s behind ChatGPT – have risen steeply.

With the increasing ubiquity and complexity of AI models, this trend is going to continue, potentially making AI a significant contributor of CO₂ emissions. In fact, AI’s emissions may be even higher than our current estimates suggest, due to a lack of standard and accurate techniques for measuring AI-related emissions.

Spiking neural networks

The previously mentioned new technologies, spiking neural networks (SNNs) and lifelong learning (L2), have the potential to lower AI’s ever-increasing carbon footprint, with SNNs acting as an energy-efficient alternative to ANNs.

ANNs work by processing and learning patterns from data, enabling them to make predictions. They operate on decimal numbers, and to make accurate calculations – especially when multiplying numbers with decimal points together – the computer needs to be very precise. It is because of these decimal numbers that ANNs require lots of computing power, memory and time.
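A minimal sketch of the arithmetic inside a single ANN layer shows where that cost comes from; the layer size below is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(1)

# A single illustrative ANN layer: 1,000 inputs -> 1,000 outputs,
# i.e. one million decimal-valued parameters.
weights = rng.normal(size=(1000, 1000))
inputs = rng.normal(size=1000)

# One pass through this layer costs ~1,000,000 precise floating-point
# multiplications plus about as many additions. Real networks stack
# many such (much larger) layers, which is where the compute, memory
# and energy go.
outputs = weights @ inputs
```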

This means ANNs become more energy-intensive as the networks get larger and more complex. Both ANNs and SNNs are inspired by the brain, which contains billions of neurons (nerve cells) connected to each other via synapses.

Like the brain, ANNs and SNNs also have components which researchers call neurons, although these are artificial, not biological ones. The key difference between the two types of neural networks is in the way individual neurons transmit information to each other.

Neurons in the human brain communicate with each other by transmitting intermittent electrical signals called spikes. The spikes themselves do not contain information. Instead, the information lies in the timing of these spikes. This binary, all-or-none characteristic of spikes (usually represented as 0 or 1) implies that neurons are active when they spike and inactive otherwise.
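One common mathematical model of a spiking neuron is the leaky integrate-and-fire neuron, sketched below. The choice of model and its constants (leak and threshold) are illustrative assumptions on our part, not something specified in the article.

```python
import numpy as np

rng = np.random.default_rng(2)
inputs = rng.random(50)  # input drive at each of 50 time steps

potential, threshold, leak = 0.0, 1.0, 0.9
spikes = []
for drive in inputs:
    potential = leak * potential + drive  # integrate input, with leak
    if potential >= threshold:            # all-or-none: the neuron fires...
        spikes.append(1)
        potential = 0.0                   # ...and resets
    else:
        spikes.append(0)                  # otherwise it stays silent

# A binary train of 0s and 1s; the information lies in when the 1s occur.
print("".join(map(str, spikes)))
```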

This is one of the reasons why SNNs consume less energy than ANNs.

Just as Morse code uses specific sequences of dots and dashes to convey messages, SNNs use patterns or timings of spikes to process and transmit information... 
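One simple example of such a timing code is latency coding, in which stronger inputs produce earlier spikes. The mapping below is an illustrative assumption, not the scheme of any particular SNN.

```python
def spike_time(intensity: float, window_ms: float = 10.0) -> float:
    """Map an input intensity in (0, 1] to a spike time: stronger -> earlier."""
    return window_ms * (1.0 - intensity)

for value in (0.9, 0.5, 0.1):
    print(f"input {value:.1f} -> spike at t = {spike_time(value):.1f} ms")
```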

Continues…

The full article - by Dr Shirin Dora, of Loughborough University’s Department of Computer Science - can be read in full on The Conversation.

Notes for editors

Press release reference number: 24/26

Loughborough University is one of the country’s leading universities, with an international reputation for research that matters, excellence in teaching, strong links with industry, and unrivalled achievement in sport and its underpinning academic disciplines.

It has been awarded five stars in the independent QS Stars university rating scheme, named the best university in the world for sports-related subjects in the 2023 QS World University Rankings – the seventh year running – and University of the Year for Sport by The Times and Sunday Times University Guide 2022. 

Loughborough University is ranked 7th in The UK Complete University Guide 2023, 10th in the Guardian University League Table 2024 and 10th in the Times and Sunday Times Good University Guide 2024.

Loughborough University is consistently ranked in the top twenty of UK universities in the Times Higher Education’s ‘table of tables’, and in the Research Excellence Framework (REF) 2021 over 90% of its research was rated as ‘world-leading’ or ‘internationally excellent’. In recognition of its contribution to the sector, Loughborough has been awarded seven Queen's Anniversary Prizes.

The Loughborough University London campus is based on the Queen Elizabeth Olympic Park and offers postgraduate and executive-level education, as well as research and enterprise opportunities. It is home to influential thought leaders, pioneering researchers and creative innovators who provide students with the highest quality of teaching and the very latest in modern thinking.
