From the Wall Street Journal, “Artificial Intelligence’s ‘Insatiable’ Energy Needs Not Sustainable, Arm CEO Says” (Arm being a chip design company):

AI models such as OpenAI’s ChatGPT “are just insatiable in terms of their thirst” for electricity, Haas said in an interview. “The more information they gather, the smarter [sic] they are, but the more information they gather to get smarter, the more power it takes.” Without greater efficiency, “by the end of the decade, AI data centers could consume as much as 20% to 25% of U.S. power requirements. Today that’s probably 4% or less,” he said. “That’s hardly very sustainable, to be honest with you.”

From Forbes, “AI Power Consumption: Rapidly Becoming Mission-Critical”:

Big Tech is spending tens of billions quarterly on AI accelerators, which has led to an exponential increase in power consumption. Over the past few months, multiple forecasts and data points reveal soaring data center electricity demand, and surging power consumption. The rise of generative AI and surging GPU shipments is causing data centers to scale from tens of thousands to 100,000-plus accelerators, shifting the emphasis to power as a mission-critical problem to solve… The [International Energy Agency (IEA)] is projecting global electricity demand from AI, data centers and crypto to rise to 800 TWh in 2026 in its base case scenario, a nearly 75% increase from 460 TWh in 2022.
From the World Economic Forum:

AI requires significant computing power, and generative AI systems might already use around 33 times more energy to complete a task than task-specific software would.

As these systems gain traction and further develop, training and running the models will drive an exponential increase in the number of data centres needed globally – and associated energy use. This will put increasing pressure on already strained electrical grids.

Training generative AI, in particular, is extremely energy intensive and consumes much more electricity than traditional data-centre activities. As one AI researcher said, ‘When you deploy AI models, you have to have them always on. ChatGPT is never off.’ Overall, the computational power needed for sustaining AI’s growth is doubling roughly every 100 days.
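The figures quoted above are easy to sanity-check. A minimal back-of-the-envelope sketch (the arithmetic is ours, not the sources’; it assumes smooth exponential growth, which real deployment certainly isn’t):

```python
# IEA base case quoted by Forbes: 460 TWh (2022) -> 800 TWh (2026)
increase = (800 - 460) / 460
print(f"IEA projected increase, 2022-2026: {increase:.0%}")  # ~74%, i.e. "nearly 75%"

# WEF claim: compute needed for AI doubles roughly every 100 days.
# Compounded naively, that implies a per-year growth factor of:
annual_factor = 2 ** (365 / 100)
print(f"Doubling every 100 days is roughly a {annual_factor:.0f}x increase per year")
```

The second number is the striking one: a 100-day doubling time, if it held, would mean compute demand growing by more than an order of magnitude per year, which is why the “pressure on already strained electrical grids” line is not hyperbole.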
Translating, electric power is going to be increasingly scarce, even when (if) we start to modernize the grid. When there’s real “pressure,” and push comes to shove, where do you think the power will go? To your Grandma’s air conditioner in Phoenix, where she’s sweltering at 116°F, or to OpenAI’s data centers and training sets? Especially when “national security” is involved?