My question was: how much? :) The AI response gave high and low valuations at current prices :)
Based on the current market price of U3O8, 850,000 pounds of U3O8 is worth $50,116,000. This calculation uses the most recent spot price of $58.96 per pound as of January 2025.
NB Current spot is below that today, so the AI missed a little :)
If sold under contract at the Long-Term price, 850,000 pounds of U3O8 would be worth $69,700,000. This calculation is based on the current Long-Term Price Indicator of $82.00 per pound U3O8 as of December 31, 2024.
NB slightly wrong at today's LT price
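The valuation arithmetic here is just pounds × price per pound. A quick sketch in Python, using the prices quoted in the AI answers above (not live market quotes):

```python
POUNDS = 850_000  # size of the U3O8 holding in pounds

def holding_value(pounds: float, price_per_lb: float) -> float:
    """Value of a U3O8 position in dollars at a given price per pound."""
    return pounds * price_per_lb

# Prices as quoted above: Jan 2025 spot and Dec 2024 Long-Term indicator.
spot_value = holding_value(POUNDS, 58.96)
lt_value = holding_value(POUNDS, 82.00)

print(f"At spot:      ${spot_value:,.0f}")  # $50,116,000
print(f"At long-term: ${lt_value:,.0f}")    # $69,700,000
```

Swap in today's spot or Long-Term price to update the figures the AI got slightly wrong.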
Next question: what is the expected U3O8 price in 2026? :)

Based on the available forecasts, the expected Long-Term price for uranium in a year's time (March 2026) is likely to be in the range of $80-$94 per pound U3O8.
Several analysts have provided projections for uranium prices in 2026:
UBS has lowered its uranium price forecast to $80/lb for 2026[5].
The Office of Australia's Chief Economist predicts the price will settle around $99/lb in 2026[2].
Citi forecasts uranium prices to average $110/lb in 2025, suggesting a continued bullish trend into 2026[1].
Ord Minnett believes the price can be maintained above $85/lb from 2026-2030[3].
It's important to note that these forecasts vary due to different assumptions about supply and demand dynamics. Some analysts expect supply to exceed demand until the early 2030s, while others predict a structural supply deficit.
how much is 850,000 lbs of U3O8 worth (link)
While using this AI gave me another question of course. :)
How much extra electricity was used by AI for these questions compared to a non-AI search?

To calculate the extra electricity used by AI for answering these questions compared to a standard search engine:

Assumptions: three questions were asked and answered using ChatGPT, at roughly 0.0029 kWh per ChatGPT query and 0.0003 kWh per Google search.

For ChatGPT: 3 × 0.0029 kWh = 0.0087 kWh.
For Google Search: 3 × 0.0003 kWh = 0.0009 kWh.
Extra energy used by ChatGPT: 0.0087 − 0.0009 = 0.0078 kWh.

Thus, using AI (ChatGPT) for these three questions consumed 0.0078 kWh more electricity than using a traditional search engine like Google. This is about 10 times the energy of standard searches for the same queries[1][3].
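The back-of-envelope calculation above is simple multiplication and subtraction. A minimal Python sketch, using the same assumed per-query figures (0.0029 kWh per ChatGPT query, 0.0003 kWh per Google search — estimates, not measured values):

```python
# Assumed per-query energy figures from the estimates above.
CHATGPT_KWH_PER_QUERY = 0.0029
SEARCH_KWH_PER_QUERY = 0.0003

def extra_energy_kwh(n_queries: int) -> float:
    """Extra kWh consumed answering n_queries with ChatGPT vs the same searches."""
    return n_queries * (CHATGPT_KWH_PER_QUERY - SEARCH_KWH_PER_QUERY)

print(extra_energy_kwh(3))  # ~0.0078 kWh for the three questions asked
```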
To really add a bit of fun, I noticed I could choose a new DEEP search option today. It was taking a LONG time to search for Nuclear Power News, 30 to 40 times!!!

In stark contrast to traditional search engines, AI deep research systems that utilize large language models (LLMs) like those powering ChatGPT and similar services require significantly more computational resources. These systems don't simply retrieve information but generate novel content through complex neural network computations that process and synthesize data.
According to available estimates, a single query to a ChatGPT-4 level system consumes between 0.001-0.01 kWh (3.6-36 kJ) of energy, depending on the model size and the number of tokens processed[1]. This substantial energy requirement stems from the fundamental architecture of these systems, which must perform billions of mathematical operations across neural networks containing hundreds of billions of parameters.
The energy consumption of these AI systems is significant enough that researcher Jesse Dodge estimated a single inquiry to ChatGPT utilizes energy equivalent to illuminating a light bulb for approximately 20 minutes[3]. This vivid comparison helps contextualize the substantial power draw of these sophisticated AI systems compared to simpler information retrieval operations.
A significant contributing factor to this high energy consumption is the computational intensity of "forward passes" through these massive neural networks. Each time a user submits a query, the AI model must process the input through multiple layers of computation to generate a coherent, contextually appropriate response. A single forward pass through a model like GPT-4 can consume several joules per query[5], which may seem minimal but multiplies dramatically across millions of daily interactions.
When directly comparing the energy consumption of AI deep research systems to standard search engines, the difference is striking. Based on multiple sources and analyses, AI-powered research assistants consistently require substantially more electricity than traditional search queries.
A report by Goldman Sachs cited in the search results indicates that a ChatGPT query requires nearly ten times the electricity of a Google search query[3]. This represents a significant energy premium for the enhanced capabilities of AI-powered research.
Even more dramatic estimates come from researcher de Vries, who calculated that AI-integrated search requests through NVIDIA servers would require 7-9 watt-hours (Wh) of energy per request, which is 23-30 times the energy of a normal search based on Google's reported figures[4]. While these estimates were made in 2023 and may have improved with technical advancements, they still illustrate the substantial energy differential between these technologies.
Using a mid-range estimate from the available data, we can calculate that AI deep research systems like those powering sophisticated question-answering tools consume approximately 0.005 kWh per query, compared to 0.0003 kWh for a standard search. This represents an additional 0.0047 kWh per query, or approximately 16.7 times the energy of a traditional search engine query[1].
The extra electricity consumed can be calculated as:

Energy(AI Deep Research) − Energy(Standard Search) = 0.005 kWh − 0.0003 kWh = 0.0047 kWh

This means that for every query processed, an AI deep research system consumes an additional 0.0047 kWh of electricity compared to a standard search engine. While this may seem small on an individual query basis, the cumulative impact becomes significant when scaled to millions or billions of queries.
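Using the mid-range per-query estimates above (0.005 kWh for an AI deep research query, 0.0003 kWh for a standard search — both assumptions from the cited figures), the differential and ratio work out as:

```python
# Per-query energy premium of an AI deep-research query over a standard search.
AI_KWH = 0.005       # mid-range estimate for one AI deep-research query
SEARCH_KWH = 0.0003  # typical standard search query

extra = AI_KWH - SEARCH_KWH   # additional energy per query
ratio = AI_KWH / SEARCH_KWH   # how many searches one AI query "costs"

print(f"extra per query: {extra:.4f} kWh")  # 0.0047 kWh
print(f"ratio: {ratio:.1f}x")               # 16.7x
```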
The substantially higher energy requirements of AI deep research systems have far-reaching environmental and infrastructure implications. As these technologies become more widely integrated into everyday digital interactions, their collective energy consumption grows rapidly.
The tech industry's aggressive pursuit of AI capabilities is reflected in massive infrastructure investments. Google alone allocated $12 billion in capital expenditures in a single quarter to strengthen its data center infrastructure supporting AI developments[3]. This rapid expansion is contributing to projections that data centers will consume 8% of total power in the U.S. by 2030, up from 3% in 2022[3].
Both Google and Microsoft have reported substantial increases in greenhouse gas emissions attributed to AI development and deployment. Google disclosed a 48% increase in greenhouse gas emissions compared to 2019, while Microsoft reported a 29% growth in emissions since 2020, primarily due to expanding data center infrastructure tailored for AI workloads[3]. These increases highlight the environmental cost of the AI revolution, even as these companies simultaneously pursue carbon neutrality and renewable energy goals.
The energy demands of AI systems have even affected broader energy infrastructure decisions. In some cases, the surge in energy consumption has led to delays in the decommissioning of coal plants, as power providers struggle to meet the rapidly growing demand from AI data centers[3]. This illustrates how AI energy consumption can have unintended consequences that potentially undermine broader climate goals.
Despite the current high energy costs of AI deep research systems, significant work is underway to improve their efficiency. The field of AI is evolving rapidly, with newer models capable of matching the accuracy of earlier models at a fraction of the computational cost[4].
Several approaches are being explored to reduce the energy footprint of AI systems:
Hardware optimization: Specialized chips like Google's Tensor Processing Units (TPUs) and advanced NVIDIA GPUs are being designed specifically for AI workloads, offering improved performance per watt[5].
Model efficiency: Techniques such as model distillation, quantization, and pruning can significantly reduce the computational requirements while maintaining most of the performance capabilities[5].
Energy-aware infrastructure: Companies are increasingly deploying AI workloads in regions with access to renewable energy and implementing advanced cooling systems that leverage AI for optimization, as demonstrated by Google's DeepMind-powered cooling solution that reduced cooling energy by 40%[6].
Query optimization: Encouraging more efficient user interactions and designing systems that minimize unnecessary computation can further reduce energy consumption on a per-query basis.
Recent developments suggest that the energy efficiency of AI models is improving at a faster rate than their growth in size and capabilities. This trend, if maintained, could eventually narrow the energy gap between AI deep research systems and standard search engines, though a significant differential is likely to persist due to the fundamentally different computational approaches.
Conclusion
Based on the analysis of available data, AI deep research systems currently consume approximately 10-30 times more electricity than standard search engines per query, with a mid-range estimate of approximately 16.7 times the energy of a standard search. This translates to an additional 0.0047 kWh per query compared to a traditional search engine.
While this extra energy consumption enables significantly more sophisticated processing and content generation capabilities, it also represents a substantial environmental cost that scales with increased adoption of these technologies. The energy premium of AI systems highlights the importance of continued research into energy efficiency optimizations and the deployment of renewable energy sources to power AI infrastructure.
As AI technologies continue to evolve and integrate more deeply into our information ecosystem, balancing their enhanced capabilities with responsible energy consumption will remain a critical challenge. Future developments in hardware, software, and infrastructure will play crucial roles in determining whether the energy costs of AI systems can be sufficiently reduced to mitigate their environmental impact while still delivering their transformative benefits.
- Ann: Boss acquires shares in Queensland uranium developer