The energy question at the heart of AI hype | WARC | The Feed
Amid the excitement around artificial intelligence systems, it’s important to remember that they need vast amounts of energy to run, with initial estimates suggesting the enormous scale of that consumption.
Why it matters
Nobody knows exactly how much energy, and that’s a major problem at a time when businesses and governments ought to be doing everything in their power to reach net zero.
There are echoes here of the blockchain hype of two years ago, when often gimmicky technology masked a power-hungry infrastructure that (between Bitcoin and pre-proof-of-stake Ethereum) consumed energy on the scale of a major economy.
In December, OpenAI CEO Sam Altman noted that “compute costs are eye-watering” just for the ChatGPT early release. Eye-watering costs indicate eye-watering energy usage, and this should factor not only into the thinking of tech companies but also into that of the brands and agencies considering AI.
What’s going on
Training an AI model is highly energy-intensive: according to a Bloomberg estimate, a single model’s development can use as much electricity as 100 US homes consume over the course of a year.
- For GPT-3, for instance – the language model on which ChatGPT is based – researchers at UC Berkeley and Google estimated that training used the equivalent of 120 US homes’ annual energy consumption.
- Bloomberg also notes that Google’s AI capabilities are reported to account for between 10% and 15% of the company’s total electricity consumption.
- GPT-4 is now in development and is said to contain 170 trillion parameters, compared with GPT-3’s 175 billion. Google’s PaLM language model has 540 billion. All consume vast amounts of energy.
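To see what the “120 US homes” equivalence above implies in absolute terms, a rough back-of-envelope calculation can help. The average household consumption figure below is an assumption not stated in the article (the US EIA puts average residential electricity use at roughly 10,600 kWh per year); the result is indicative only.

```python
# Back-of-envelope check of the "120 US homes" estimate cited above.
# Assumption (not from the article): average US household electricity
# use of roughly 10,600 kWh per year, per US EIA figures.
AVG_US_HOME_KWH_PER_YEAR = 10_600

homes_equivalent = 120  # UC Berkeley / Google estimate for GPT-3 training

# Implied total training energy, converted from kWh to GWh.
implied_training_kwh = homes_equivalent * AVG_US_HOME_KWH_PER_YEAR
print(f"Implied GPT-3 training energy: {implied_training_kwh / 1e6:.2f} GWh")
```

On these assumptions the implied figure is on the order of 1.3 GWh for a single training run, which gives a sense of why such costs should enter a company’s planning.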
Off-the-peg AI solutions can only take specialist companies, brands and agencies so far. In practice, these systems will become most useful when trained on both external and internal material, and the likely cost and energy implications of that training should factor into a company’s thinking.
It’s worth noting that the major companies involved – Microsoft, whose Azure cloud computing service runs OpenAI’s GPT models, and Google – are both committed to being carbon free or even carbon negative by 2030.
Sourced from Bloomberg, Microsoft, Google, WARC