
AI will require over 17 bcf/d of natural gas-fired power globally

    Insightful latest quarterly report from natural resources specialists Goehring & Rozencwajg:

    ON LNG, AI & SHALE SUPPLY: WE EXPECT THE TURN IN US GAS IS HERE

    US domestic gas consumption for electricity is expected to rise materially in the coming years, driven by the proliferation of data centers and artificial intelligence. Over the past several months, we have read countless articles detailing the energy demand from generative AI, such as ChatGPT.

    Some of the best work comes from Rob West at Thunder Said Energy, who has quantified the potential impact. Although some of his projections are uncertain, the demand will be material. Modern artificial intelligence consists of two distinct phases: training and inference. During the training phase, vast quantities of computing power optimize trillions of parameters (or neurons) across zettabytes of textual data. This process consumes an enormous amount of energy. It is estimated that training GPT-4 alone consumed 50 GWh of electricity, equivalent to the average annual consumption of 5,000 American households. Once a model has been trained, end users query it, a process known as “inference.” Although each inference requires only a fraction of the energy needed for training, a single model might be queried billions of times. West estimates a ChatGPT “inference” requires ten times as much energy as a Google search -- 3.6 Wh compared with 0.3 Wh. Generative AI’s total energy consumption is a function of several related variables: the number of new models trained per year, the complexity of each model, the energy efficiency of new AI chipsets, and the total queries per trained model.
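    As a rough illustration of how those variables interact, the sketch below combines the per-model and per-query energy figures quoted above with assumed counts of models and queries; the model and query counts are placeholder assumptions for illustration, not West's estimates.

        # Illustrative only: the per-model and per-query energy figures are quoted in
        # the letter above; the model and query counts are placeholder assumptions.
        training_gwh_per_model = 50       # GPT-4-scale training run (quoted above)
        models_per_year = 20              # assumed number of large models trained per year
        wh_per_query = 3.6                # ChatGPT inference (quoted above)
        queries_per_year = 500e9          # assumed total queries per year

        training_twh = training_gwh_per_model * models_per_year / 1_000   # GWh -> TWh
        inference_twh = wh_per_query * queries_per_year / 1e12            # Wh -> TWh

        print(f"Training:  {training_twh:.1f} TWh per year")
        print(f"Inference: {inference_twh:.1f} TWh per year")
        # Under these assumptions inference dominates: billions of queries outweigh
        # a handful of very energy-intensive training runs.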

    Although it is beyond the scope of this essay to dissect each assumption, a few key drivers are worth discussing.

    First, many analysts expect energy efficiency to improve, mitigating energy demand growth. Unfortunately, this violates Jevons Paradox – a concept discussed in our 3Q23 letter. Jevons observed in the nineteenth century that instead of lowering demand, improved steam-engine efficiency dramatically increased English coal consumption. Although steam engines were becoming far more efficient, lower operating costs increased their proliferation, offsetting any gains and increasing overall coal demand. Jevons Paradox is even more pronounced with generative AI. Supercomputer energy efficiency is measured in gigaflops per watt. Although that efficiency has improved five-fold since 2018, the total energy required to train an AI model has increased by an incredible 5,000 times. Training GPT-4 required fifty times more energy than a 2022-vintage model. As chips become more energy efficient, model complexity grows exponentially, requiring more energy to train. Furthermore, the number of distinct models has also grown exponentially. The combination of more numerous and more complex models has dwarfed any improvement in chipset energy efficiency, a trend that we expect will continue.
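    Taken together, those two multiples imply an enormous growth in the compute behind a frontier training run. A minimal back-of-the-envelope sketch, using only the multiples quoted above:

        # If chips deliver ~5x more flops per watt while a training run still burns
        # ~5,000x more energy, the implied compute per run has grown ~25,000x.
        efficiency_gain = 5            # gigaflops-per-watt improvement since 2018 (quoted above)
        training_energy_gain = 5_000   # growth in energy per training run (quoted above)

        implied_compute_gain = efficiency_gain * training_energy_gain
        print(f"Implied growth in compute per training run: ~{implied_compute_gain:,}x")
        # Efficiency gains shrank the energy per flop, but model scale grew far
        # faster -- Jevons Paradox applied to AI.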

    Second, despite the rhetoric around “green” data centers, we expect generative AI electricity demand will fall primarily to natural gas for two reasons. First, West estimates that the cost of training an AI model is five times more sensitive to electricity utilization than to price. As a result, the inherent intermittency of wind and solar precludes them from being viable sources of electricity for AI data centers. Second, even when a PV solar or wind installation generates power, the “quality” of the electricity, measured by its harmonic distortion, is unsuitable for the sensitive hardware used to train and query AI models. As a result, we believe the widespread proliferation of AI must be met with coal, natural gas, or nuclear-based power. It is unlikely that new coal-fired power will be sanctioned in the US, and the lead time on new nuclear power plants is too long to meet demand over the next several years. Therefore, natural gas should be the primary beneficiary of the AI rollout through the decade’s end.

    A MODERN DATA CENTER IS EXPECTED TO NEED AS MUCH ENERGY AS EVERY DATA CENTER DOMINION HAS CONNECTED SINCE 2019. INTERESTINGLY, THE TWO MOST ADVANCED SMALL MODULAR REACTOR COMPANIES, TERRAPOWER AND OKLO, ARE BACKED BY BILL GATES AND SAM ALTMAN, RESPECTIVELY.

    The impact of AI’s relentless power demand is already being felt. In May 2024, Dominion announced that new data centers in Virginia, used to train and query AI models, require several gigawatts of power, equivalent to several large nuclear power plants. However, we do not expect either of these small modular reactor technologies to be rolled out until at least the end of the decade. In the interim, we believe natural gas demand will surge.

    Although estimates vary, West believes 150 GW of AI data centers will be required by 2030, consuming 1,000 TWh annually. Assuming 40% of global data center capacity is installed in the United States, US AI data centers will consume 400 TWh of electricity, requiring 7 bcf/d of natural gas. Such a buildout would represent the largest increase in gas-fired power capacity in US history.
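    For readers who want to check the arithmetic, the sketch below converts that electricity demand into gas demand. The combined-cycle heat rate and gas heat content are assumed figures for illustration, not numbers from the letter; the TWh inputs are quoted above.

        # Convert annual electricity demand (TWh) into natural gas demand (Bcf/d).
        # Heat rate and gas heat content are assumptions, not figures from the letter.
        HEAT_RATE_BTU_PER_KWH = 7_000   # assumed combined-cycle heat rate
        BTU_PER_CF = 1_037              # assumed heat content of pipeline gas

        def gas_bcf_per_day(twh_per_year: float) -> float:
            kwh = twh_per_year * 1e9                  # TWh -> kWh
            btu = kwh * HEAT_RATE_BTU_PER_KWH         # fuel burned at the plant
            bcf_per_year = btu / BTU_PER_CF / 1e9     # cubic feet -> Bcf
            return bcf_per_year / 365

        print(f"US AI load (400 TWh/yr):       ~{gas_bcf_per_day(400):.1f} Bcf/d")
        print(f"Global AI load (1,000 TWh/yr): ~{gas_bcf_per_day(1000):.1f} Bcf/d")
        # With these assumptions the US figure lands near the ~7 bcf/d quoted above,
        # and the global figure near the "over 17 bcf/d" in the post's title.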
 