Advancements in artificial intelligence have continued to accelerate over the course of this year, bringing with them innovation on a scale that has never been seen before.
Now firmly in the mainstream, AI tools are expected to revolutionise all kinds of industries, from transforming retail experiences to changing the way we access medicine.
Businesses are already having to compete with some of the biggest names in the industry, including the likes of ChatGPT, and many are hard at work on their own versions of popular chatbots with similar capabilities.
But like most technological advances, the introduction of every new AI chatbot comes with a downside. In this case, it’s a considerable amount of energy consumption.
Some of the biggest players in this market, such as Microsoft Corp., Alphabet Inc.’s Google and ChatGPT maker OpenAI, use cloud computing to power their tools. The technology provided by these companies relies on thousands of chips, housed within servers in gargantuan data centres across the globe.
The power of AI chatbots lies in their underlying algorithms, known as models, which are trained by analysing data in order to learn from it and better perform tasks. But what does this mean in terms of energy usage? Is the price being paid for AI tools like these higher than we might have first realised?
Here we will explore the issue of AI energy consumption, to reveal which chatbots have the highest energy requirements, and which have a far lower impact on the environment.
Which AI chatbots are most popular around the world?
Before we discuss the energy consumption of the biggest names amongst AI chatbots, it’s worth taking a look at which of these tools is currently most popular, and how far their usage has spread across the world.
Our researchers analysed the popularity of a range of different chatbots, looking at factors such as the number of times per month, on average, people search for these chatbots as target keywords.
Analysis covered searches being undertaken all over the planet, so the results can be seen as a guide to the most popular chatbots amongst users worldwide right now. Let’s take a look at the rankings:
1. ChatGPT search volume: 16M
2. Bing search volume: 6.7M
3. Jasper search volume: 348K
4. Google Bard search volume: 201K
5. Chatsonic search volume: 158K
6. Socratic by Google search volume: 109K
7. YouChat search volume: 20K
How do the most popular chatbots of the moment rank in terms of energy consumption?
The team at TRG Datacenters delved into the data on today’s top chatbots, to find out more about how these tools use electricity, and which are the biggest consumers of energy.
Researchers found marked differences in the energy usage of different tools, with some consuming far more than others, despite relatively similar functionality.
Here are the team’s findings in detail.
Bing: Energy Consumption = 7,200 MWh
Bing was the biggest consumer of energy, out of the top chatbots we analysed for this study.
Microsoft Bing uses OpenAI’s GPT-4. GPT-4 shares GPT-3’s architecture but is estimated to have around 1 trillion parameters. That’s 5.8 times more than GPT-3’s 175 billion.
If GPT-4 has 5.8 times more parameters, it should take roughly 5.8 times more petaflop/s-days of computation than GPT-3. Therefore, it’d take 5.8 × 3,640 petaflop/s-days of computation, totalling around 21,000 petaflop/s-days.
If OpenAI used the same 10,000 Nvidia V100 GPUs in its supercomputer, each delivering 0.014 petaflop/s, then training should take around 150 days (21,000 ÷ (0.014 × 10,000) = 150).
That fits GPT-4’s reported training time of around five to six months. So, 10,000 V100s running for 150 days at full power (around 200 W each) means an energy consumption of 7,200,000,000 watt-hours, or 7,200 MWh.
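The arithmetic above can be sketched in a few lines. Note that the 5.8× parameter ratio, the 10,000-GPU count, and the ~200 W per-V100 power draw are this study’s assumptions, not official disclosures.

```python
# Back-of-envelope estimate of GPT-4/Bing training energy.
# All figures are the study's assumptions, not published numbers.

GPT3_COMPUTE = 3_640     # petaflop/s-days, OpenAI's published GPT-3 figure
PARAM_RATIO = 5.8        # assumed GPT-4 : GPT-3 parameter ratio
V100_PFLOPS = 0.014      # throughput per V100, petaflop/s
V100_WATTS = 200         # assumed full-power draw per V100, watts
NUM_GPUS = 10_000

gpt4_compute = round(GPT3_COMPUTE * PARAM_RATIO, -3)     # 21,112 -> ~21,000
training_days = gpt4_compute / (V100_PFLOPS * NUM_GPUS)  # ~150 days
energy_mwh = NUM_GPUS * V100_WATTS * training_days * 24 / 1e6
print(f"{training_days:.0f} days, {energy_mwh:,.0f} MWh")
```

Running this reproduces the figures above: roughly 150 days of training and about 7,200 MWh consumed.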
ChatGPT: Energy Consumption = 1,248 MWh
The architecture of GPT-3 (Generative Pre-trained Transformer 3), the model behind ChatGPT, is based on the Transformer architecture.
GPT-3, the language model developed by OpenAI, has 175 billion trainable parameters and consists of 96 layers. OpenAI’s documentation states that training the largest version of GPT-3 required 3640 petaflop/s-days of computation.
OpenAI used a supercomputer with over 285,000 CPU cores, 10,000 GPUs and 400 gigabits per second of network connectivity for each GPU server to train GPT-3. The supercomputer was built by Microsoft specifically for OpenAI’s training needs.
According to OpenAI, GPT-3 was trained on 10,000 V100 GPUs. Each Nvidia V100 draws around 200 W at full power and delivers 0.014 petaflop/s.
We know that it took 3,640 petaflop/s-days of computation to complete this training, so it is reasonable to assume that it would take approximately 26 days on 10,000 V100s (0.014 × 10,000 × 26 = 3,640 petaflop/s-days). So, 10,000 V100s running for 26 days at full power would mean an energy consumption of 1,248,000,000 watt-hours, or 1,248 MWh.
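Those steps can be reproduced with the same simple model. The 3,640 petaflop/s-days figure is OpenAI’s published number; the per-GPU throughput and ~200 W power draw are the assumptions used in this study.

```python
# GPT-3 training-energy estimate, using the study's assumed V100 specs.

COMPUTE = 3_640        # petaflop/s-days, OpenAI's published figure for GPT-3
V100_PFLOPS = 0.014    # throughput per V100, petaflop/s
V100_WATTS = 200       # assumed full-power draw per V100, watts
NUM_GPUS = 10_000

training_days = COMPUTE / (V100_PFLOPS * NUM_GPUS)             # ~26 days
energy_mwh = NUM_GPUS * V100_WATTS * training_days * 24 / 1e6  # ~1,248 MWh
print(f"{training_days:.0f} days, {energy_mwh:,.0f} MWh")
```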
Jasper, YouChat, Chatsonic by Writesonic: Energy Consumption is the same as GPT-3 and GPT-4
To estimate the consumption of these chatbots, researchers took into account the fact that YouChat uses GPT-3 and Jasper uses OpenAI’s GPT-3.5, so their energy consumption can be treated as the same as GPT-3’s.
Chatsonic by Writesonic is an AI platform that uses GPT-4 as its underlying technology, meaning that its energy consumption would match GPT-4’s.
Google Bard: Energy Consumption = 312 MWh
At the lower end of the scale, Google Bard was found to consume 312 MWh. Bard runs on Google’s TPUv4, which is assumed here to be twice as fast and half as energy-hungry as Nvidia’s V100.
Bard has 137 billion parameters, which is close to GPT-3’s 175 billion. If we treat the two models as equivalent, it’s likely that Bard’s training would likewise take 3,640 petaflop/s-days of computation. So, if GPT-3 took 26 days to train, TPUv4 would take 13 days.
If a V100 consumes 200 W, we can calculate that a TPUv4 would consume around 100 W. Therefore, the energy consumption of this chatbot is approximately 312,000,000 watt-hours, or 312 MWh.
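The Bard figure follows the same template, halving the V100 power draw and doubling its throughput. Both the "2× faster" and "half the power" factors are assumptions made in this study.

```python
# Bard training-energy estimate: TPUv4 assumed twice as fast and
# half as power-hungry as the V100 figures used for GPT-3.

V100_PFLOPS, V100_WATTS = 0.014, 200
TPU_PFLOPS = V100_PFLOPS * 2   # assumed 2x throughput
TPU_WATTS = V100_WATTS / 2     # assumed half the power draw
NUM_CHIPS = 10_000
COMPUTE = 3_640                # petaflop/s-days, treated as equal to GPT-3's

training_days = COMPUTE / (TPU_PFLOPS * NUM_CHIPS)             # ~13 days
energy_mwh = NUM_CHIPS * TPU_WATTS * training_days * 24 / 1e6  # ~312 MWh
print(f"{training_days:.0f} days, {energy_mwh:,.0f} MWh")
```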
Socratic by Google: Energy Consumption = 50 to 300 MWh
The lowest consumer of energy amongst the chatbots analysed in this study was Socratic by Google. This education-focussed chatbot consumes between 50 and 300 MWh and uses its own natural language model.
In terms of energy consumption, Socratic by Google is one of the most efficient tools our researchers examined – but of course it does come with limitations in terms of its capabilities.
There are now hundreds of thousands of AI chatbots in use around the world. Facebook Messenger alone is home to 300,000 active AI chatbots. And the number of chatbots available is set to soar in the near future, as businesses begin to adopt this innovative technology in their droves.
Analysts predict that the number of devices equipped with chat assistants could soon reach 8 billion globally. And of course this will have a significant impact in terms of energy consumption.
It remains to be seen how companies will balance the functionality of this transformative technology with the wider impact of its introduction. Eyes are on major players within the industry, such as Microsoft, Google, and OpenAI, to see how their teams might curb rapid rises in energy consumption as the mind-boggling capabilities of their latest chatbots develop.
Methodology: The monthly energy usage of the AI models, and the energy consumption associated with each model’s training process, were estimated by looking at the following specifications: Model Architecture, Hardware Specifications, Training Phase, Inference Phase and Power Efficiency.
Whilst the overall energy usage is an estimate, the numbers associated with these specifications were taken from multiple, accredited online sources.