What Amazon is far less known for is running its operations on nuclear power. Yet that's exactly the arrangement its cloud subsidiary, AWS, bought into in March, purchasing a $650 million nuclear-powered data center campus adjacent to Talen Energy's nuclear power plant in Pennsylvania.
On the surface, the deal signals Amazon's ambitious expansion plans. But dig deeper, and the company's purchase of a nuclear-powered facility speaks to a broader issue that Amazon and other tech giants are grappling with: artificial intelligence's insatiable demand for energy.
In Amazon's case, AWS bought the Talen Energy campus to co-locate its rapidly expanding AI data centers next to a dedicated power source, keeping pace with the energy demands that artificial intelligence has created.
The strategy is a symptom of an energy reckoning that has been building as AI has been creeping into consumers' daily lives — powering everything from internet searches to smart devices and cars.
Companies like Google, Apple, and Tesla continue to enhance AI capabilities with new products and services. Each AI task requires vast computational power, which translates into substantial electricity consumption through energy-hungry data centers.
Estimates suggest that by 2027, global AI-related electricity consumption could rise by 64%, reaching up to 134 terawatt-hours annually, roughly the yearly electricity usage of a country like the Netherlands or Sweden.
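As a quick sanity check on those numbers, a 64% rise that lands at 134 terawatt-hours implies a present-day baseline of roughly 82 terawatt-hours. A minimal sketch of the arithmetic, assuming the growth figure applies to current AI-related consumption:

```python
# Back-solving the baseline implied by the projection: if 134 TWh
# represents a 64% increase, current consumption is 134 / 1.64.
projected_twh = 134   # projected AI-related consumption by 2027, TWh/year
growth_rate = 0.64    # projected rise of 64%

implied_baseline_twh = projected_twh / (1 + growth_rate)
print(f"Implied current AI-related consumption: {implied_baseline_twh:.0f} TWh/year")
# Prints roughly 82 TWh/year.
```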
This raises a critical question: How are Big Tech companies addressing the energy demands that their future AI innovations will require?
The rising energy consumption of AI
According to Pew Research, more than half of Americans interact with AI at least once a day.
Prominent researcher and data scientist Sasha Luccioni, who serves as the AI and climate lead at Hugging Face, a company that builds tools for AI applications, often discusses AI's energy consumption.
Luccioni explained that while training AI models is energy-intensive (training the GPT-3 model, for example, used about 1,300 megawatt-hours of electricity), it typically happens only once. The inference phase, where models generate responses, can require even more energy overall because of the sheer volume of queries.
For example, when a user asks an AI model like ChatGPT a question, the request travels to a data center, where powerful processors generate a response. Each such exchange, though quick, uses roughly 10 times more energy than a typical Google search.
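Taken together, those figures show why inference can overtake training. Here is a back-of-envelope sketch, assuming the widely cited estimate of about 0.3 watt-hours per Google search (a figure not given above) combined with the 10x multiplier and the 1,300 megawatt-hour training cost:

```python
# Rough comparison of one-time training energy vs. cumulative inference
# energy. The 0.3 Wh per Google search is an assumed, widely cited
# estimate; the 10x multiplier and 1,300 MWh come from the article.
TRAINING_MWH = 1_300                    # reported GPT-3 training energy
GOOGLE_SEARCH_WH = 0.3                  # assumed energy per Google search, Wh
AI_QUERY_WH = 10 * GOOGLE_SEARCH_WH     # ~10x a search, per the article

training_wh = TRAINING_MWH * 1_000_000  # convert MWh to Wh
breakeven_queries = training_wh / AI_QUERY_WH

print(f"Energy per AI query: {AI_QUERY_WH:.1f} Wh")
print(f"Queries needed to match training energy: {breakeven_queries:,.0f}")
# Roughly 433 million queries.
```

On those assumptions, a service fielding hundreds of millions of queries a day would match the entire one-time training bill within days, which is why inference dominates a model's lifetime energy use at scale.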