Media: S&P Feature - AI and electricity demand

Our analysis and research were featured in S&P Market Intelligence. Read the full piece here.

"Regarding US power demand, it's really hard to quantify how much demand is needed for things like ChatGPT," David Groarke, managing director at consultant Indigo Advisory Group, said in an interview, referring to the AI-powered language model developed by OpenAI LLC. "In terms of macro numbers, by 2030, AI could account for 3% to 4% of global power demand. [Google LLC] said right now AI is representing 10% to 15% of their power use, or 2.3 TWh annually."

"It is useful to look at AI in two broad slots," Indigo Advisory Group's Groarke said. "Narrow AI" is a little more contained and not that energy intensive, with use cases like load forecasting and predictive maintenance. "Inference usage," like running a prompt that provides an answer, adds to power consumption.

Power demand for AI comes from training these models, which are pulling in "huge amounts of data from the web, and that seems to be doubling every year," Groarke said.

Global data volumes double every few years, but it is challenging to isolate datacenter usage from the total, he said.

"The intensity of the training of the models is using the most power," Groarke said. "We need new ways of creating those models and a lot of this power demand is predicated on how AI is adopted."

AI is not on par with power usage for cryptocurrency, "but we could see that as companies adopt large language models," he said.

Many companies are working on embedding large language models in their own networks. Within the large language model industry there is an effort to reduce complexity and increase the efficiency of hardware. "There is an awareness of not just building these models for the sake of it," Groarke said.
