A coalition of over 100 organizations has issued an open letter urging the artificial intelligence (AI) industry and regulators to ensure that the expansion of AI data centers does not lead to the depletion of natural resources or force utilities to depend on non-renewable energy sources. This plea comes amidst growing concerns about the energy consumption of AI technologies, particularly platforms like ChatGPT. However, recent analyses suggest that earlier estimates of ChatGPT's power usage may have been significantly overstated.
ChatGPT, developed by OpenAI, has been widely cited as consuming approximately 3 watt-hours of energy to answer a single query, ten times the energy of a Google search. Yet new analysis suggests that figure does not reflect reality. Epoch AI, a research organization, argues the 3 watt-hours estimate is inflated: its findings indicate that with OpenAI’s latest model, GPT-4o, ChatGPT queries actually consume an average of around 0.3 watt-hours each, less than many common household appliances.
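The two estimates can be sanity-checked against each other using only the numbers above; the sketch below derives the implied per-search figure from the "ten times a Google search" claim:

```python
# Relating the figures quoted above. If the older 3 Wh estimate was
# "ten times a Google search," that implies roughly 0.3 Wh per search,
# the same order of magnitude as Epoch AI's revised ChatGPT figure.
OLD_CHATGPT_WH = 3.0                    # widely cited older estimate
EPOCH_CHATGPT_WH = 0.3                  # Epoch AI's GPT-4o estimate
GOOGLE_SEARCH_WH = OLD_CHATGPT_WH / 10  # implied by the "ten times" claim

print(f"Implied Google search: {GOOGLE_SEARCH_WH} Wh")
print(f"Revised ChatGPT query: {EPOCH_CHATGPT_WH} Wh "
      f"(~{EPOCH_CHATGPT_WH / GOOGLE_SEARCH_WH:.0f}x a search)")
```

In other words, by the old comparison's own logic, the revised per-query figure is roughly on par with a single Google search.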
“I’ve seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn’t really accurately describe the energy that was going to AI today,” said Joshua You, a data analyst at Epoch AI.
Joshua You and his colleagues have observed a gap between public perceptions and AI's actual energy demands today. The widely cited 3 watt-hours figure stemmed from dated research and imprecise assumptions, and it did not account for the additional energy costs of features such as image generation or input processing.
“Also, some of my colleagues noticed that the most widely-reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high,” added Joshua You.
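The kind of napkin math Joshua You describes can be reconstructed in rough form. Every number below is an illustrative assumption (GPU draw, batch size, generation speed), not a figure from Epoch's analysis; the point is only that plausible modern-hardware inputs land near 0.3 Wh rather than 3 Wh:

```python
# A hypothetical sketch of a per-query energy estimate:
# energy ≈ (server power x generation time) / requests sharing the server.
# All inputs are assumptions for illustration, not Epoch's actual figures.
GPU_POWER_W = 700          # assumed draw of one modern datacenter GPU
GPUS_PER_REPLICA = 8       # assumed GPUs serving one model instance
TOKENS_PER_QUERY = 500     # assumed average response length
TOKENS_PER_SECOND = 75     # assumed per-request generation speed
CONCURRENT_REQUESTS = 32   # assumed requests batched on one replica

generation_s = TOKENS_PER_QUERY / TOKENS_PER_SECOND
replica_power_w = GPU_POWER_W * GPUS_PER_REPLICA
energy_wh = replica_power_w * generation_s / CONCURRENT_REQUESTS / 3600

print(f"~{energy_wh:.2f} Wh per query")  # ~0.32 Wh with these inputs
```

Older estimates that assumed less efficient chips or ignored request batching would push this figure up considerably, which is consistent with Epoch's critique.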
The misconceptions surrounding ChatGPT's power usage highlight the broader issue of understanding AI's energy impact. As reasoning models become more sophisticated, they require increased computing power, which in turn raises energy consumption. These models can "think" for extended periods before producing a response, a process demanding significant computational resources.
The expanding user base of ChatGPT amplifies server demands. According to a Rand report, by 2030 training a frontier AI model may require power output comparable to eight nuclear reactors (8 GW), and within two years AI data centers could need nearly all of California's 2022 power capacity (68 GW).
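For a sense of how the per-query figure scales with volume, the sketch below assumes a hypothetical one billion queries per day; that count is an illustrative order of magnitude, not a number from the article:

```python
# Scaling Epoch AI's per-query estimate to fleet level. The daily query
# count is an assumed order of magnitude for illustration only.
QUERY_WH = 0.3         # Epoch AI's per-query estimate
QUERIES_PER_DAY = 1e9  # hypothetical volume, not from the article

daily_mwh = QUERY_WH * QUERIES_PER_DAY / 1e6  # Wh -> MWh
average_draw_mw = daily_mwh / 24              # continuous-draw equivalent

print(f"~{daily_mwh:.0f} MWh/day, an average draw of ~{average_draw_mw:.0f} MW")
# ~300 MWh/day (~12 MW average) under these assumptions, still orders of
# magnitude below the multi-gigawatt projections for training and
# data-center buildout cited above.
```

This gap between today's serving costs and tomorrow's projected training and infrastructure demand is precisely the distinction You draws between AI's energy use now and in the coming years.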
“The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car,” noted Joshua You.
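To make the appliance comparison concrete, the sketch below converts 0.3 Wh into run time for a few typical devices; the wattages are common nominal values assumed for illustration:

```python
# How long common devices take to use the energy of one ChatGPT query.
# Wattages are typical nominal values assumed for illustration.
QUERY_WH = 0.3  # Epoch AI's per-query estimate

appliance_watts = {
    "LED bulb": 10,
    "laptop": 50,
    "microwave oven": 1000,
}

for name, watts in appliance_watts.items():
    seconds = QUERY_WH / watts * 3600  # Wh -> seconds at this draw
    print(f"One query ≈ a {name} running for {seconds:.0f} s")
```

With these values, a single query equates to roughly 108 seconds of an LED bulb, 22 seconds of a laptop, or about one second of a microwave oven.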
OpenAI has not published the details needed to calculate ChatGPT's energy consumption precisely, which complicates efforts to gauge its true impact; earlier estimates also assumed OpenAI ran its models on older, less efficient chips, which inflated the figures. Despite the uncertainty, OpenAI and its partners plan to invest billions of dollars in new AI data centers over the coming years, signaling the industry's commitment to growth.
For users concerned about their own AI energy footprint, Joshua You pointed to lighter-weight options: “You could try using smaller AI models like [OpenAI’s] GPT-4o-mini,” he suggested.
OpenAI's attention, along with the broader AI sector's focus, is gradually shifting toward reasoning models. These models are generally more capable, but they demand more computing, and therefore more power, to run. The energy a platform like ChatGPT consumes ultimately hinges on how it is used and which AI models serve the queries.
“The AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensely — handling much more tasks, and more complex tasks, than how people use ChatGPT today,” concluded Joshua You.