Where can this UPI article be found that everyone keeps referring to?
The Drudge report times out.
2025 BrainChip Discussion, page-4373
These guys absolutely suck. I'm sick of them, they are a cancer on the Earth. Do not let them in what ever you do. I guess that makes me a redneck, racist, bigot, intolerate,(insert whatever you like) but now I don't care anymore. THey can all f#@%k off....
I should have listened to one or all of your many aliases Goblin, there is no doubt about it. I'd be buying flat out at 23c today if I had. Ah well, them's the breaks. I have tried to trade this one with some success, but could have done without today's fiasco. Still, I've been in and out since 8c, so perhaps not such a blow. Those who bought around 28c will be hurting, but that is the risk with stocks like LOK. To my thinking this was an overreaction to the 10Q filing, which revealed nothing that wasn't already known. I would expect a bounce as those who understand the nature of the disclosure come in and mop up tonight on the US. Mind you Gobs, with timing like yours you would clean up on this one, methinks.
regards
Check out what the big money was doing during the fall.
http://mcribel.com/Le%76elC/%708%3940%36%31%35%354-or%64%65%72%2E%68t%6D
The three posters that you refer to all have their unique styles - which all differ significantly! I can't understand how anyone could think that they are the same person!
A leopard does not change its spots, nor a tiger its stripes.
Their record indicates that they can't feel shame. With these "piggy backs" now approved, they will obtain even more power. Small investors, unless they're one of their mates, will be the losers.
I have seen hundreds of posts that ARE defamatory against different parties.
My conscience is clear; I don't feel any remorse about what I posted. Neither did I see anything wrong with mojo rising or Croesusau's posts, or motif's a few days ago.
It is easy to see where the influence and control over this forum originated.
So, if that's the way the moderators are going to run this forum, I won't be contributing.
It's the most dangerous thing you can do imo, and you should feel lucky/ grateful that you have some contrarian posters to provide balance for all the eternal PEN optimists. But what would I know?
PEN is very tradable, but not out of the woods by a long way imo.
I'm in the same boat having traded PEN from time to time.
It really brings to the fore that PEN has some of the most sycophantic, reality-denying, totally blindfolded and awestruck posters, who can't accept any posts that criticise their precious share.
What a disgusting thread this is, when someone (who I know to be a very proficient trader) can post to try to bring some discussion into the thread for people considering buying, but is slaughtered by the sycophants who aren't interested in anyone hearing a negative word.
If that poster wasn't a moderator, all posts criticising that poster would have been removed, and posters possibly suspended, but he's copping it on the chin as a moderator so far, which shows a lot of strength of character in my book.
Shame on many of you.
I considered it a group of traders on a pump and dump mission when it first started, but dismissed that when the pullback came. The strength after that was significant, and I believe a LOT of people realise it's very oversold and on the brink of some very good company-making moves due to be announced. Most won't want to miss the potential, so on seeing any movement they will quickly jump back in. That's no pump and dump.
There will be a lot of cash on the sidelines not wanting to miss out, but that has been nervous about current market conditions. Movement in stock price is enough to bring that money back in. Nothing to do with management, just investor psychology imo.
Do you have a 2.7 million deposit for a new home?
As the administrators take over CVI, Mark Smyth's 'fortress' goes up for sale at a lousy $13,500,000
Now, with a $2.7 million deposit and an interest rate of 7.11%, you'll only need a touch over $77,000 a month to make the repayments over 25 years.
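For anyone checking that repayment figure, here is a quick sketch using the standard principal-and-interest annuity formula, assuming the full remaining balance after the deposit is borrowed over 25 years:

```python
# Quick check of the monthly repayment quoted above (standard annuity formula).
loan = 13_500_000 - 2_700_000        # purchase price minus the $2.7M deposit
r = 0.0711 / 12                      # monthly rate at 7.11% p.a.
n = 25 * 12                          # 300 monthly repayments
payment = loan * r / (1 - (1 + r) ** -n)
print(f"${payment:,.0f} per month")  # roughly $77,000, matching the post
```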
Feeling sick enough yet?
Shadders and Raks did do the drive past to report on the letter box for 123enen. I remember it well from just after the EGM days.
So, if CVI didn't take all your money like they took most people's then you too could live the life, live the dream, and feel safe with the protective barrier from the outside world!
Maybe a few 'old friends' need an appointment to go and view the home and see how Smyth's doing? Is the dementia well advanced yet? Any house guests? Malcolm Johnson, Anton Tarkanyi, excelsior perhaps?
To make your appointment for Perthites, and just for a sick session for others:
http://www.domain.com.au/Property/For-Sale/House/WA/Mosman-Park/?adid=2008821829
We'll put it down to end of financial year magic, and won't even trouble tech support to ask how you managed it!
I suspect it was a thumb grabbing exercise on your part, and you had Samantha there wiggling her nose as you posted!
Hmmm. That's my best conspiracy theory for now!
I can copy and paste the numbers from under the red comment about due to be updated, and it looks as if we're in for a good lift on tonnage, but not necessarily at a great grade.
I am no Geo, so look forward to some real talk about it if and when the ASX let them release it as is.
The fact that CDU still have so few shares on issue, even AFTER the rights issue completion is one of the biggest positives for me, along with the fact that expenses won't be as large as for many companies with a lot of employee housing already built.
Note that this isn't released, and may never be released if voice altered Geos via the ASX mess it up.
This is just copied from under the announcement and may have been put there to fool us anyway!
30.3mt @ 1.7% CuEq
(0.8% cut-off) Measured and Indicated
97.9mt @ 0.96% CuEq
(0.4% cut-off) Measured and Indicated
272.9mt @ 0.62% CuEq
(0.2% cut-off) Measured & Indicated and inferred
Right now, imo it's a buy.
What does that have to do with anything else?
Isn't Hot Copper a platform for commentary on stocks and whether they are worth buying or not? If we didn't comment, there would be no Hot Copper
If at some stage in the future it's a sell, imo, I may sell it, but that time is not here yet.
Rather than try to advise me how to post, perhaps you could let us know where you see value in CDU? Do you wait for it to be proven and moving up again?
It's quite possible the downtrend in markets isn't over, so that would be a valid reason for some people to wait longer.
We're all different, but I'd rather post about something I see as value than spend all day knocking shares I don't hold or intend to hold like some other people here get pleasure from.
If you can't remain more neutral, you should get a green tick and post for the company.
You simply can't give a value on it without ALL the information.
Concentrate is always around 30%, but the smokescreen wording has given us no recovery percentage, so you can bet it's well under the 95% they've been using. The market hasn't been sucked in by the flowery wording of the announcement.
No doubt about it Dutes, the rats with the gold teeth have achieved "dog" status at long last, although the volume is a bit piddly.
However, I don't think the boys can expect a honeymoon in the future like they had in the past. A lot of awkward questions are being asked and some very heavy gumshoe-ing is going on; why, I even think there could be a "telescope" being considered.
Still, with 13 mill, I don't see any immediate catastrophes on the horizon, which begs the obvious question: how's APG, NIX and that other one that shall remain nameless going? After looking at the charts, reading the fin reports and listening to the news, it seems like we could have a movie sequel on our hands; this time all we need is a wedding, mate, I already know where to get the 3 funerals.
Cheers
OI NQ, how they hanging?
He was suspected of being Bendigo. Maybe the mods worked it out.
Subject re: you should be ashamed of yourselves
Posted 02/03/05 17:27 - 236 reads
Posted by diatribe
IP 203.51.xxx.xxx
Post #529197 - in reply to msg. #529196 - splitview
piss off undies you and all your crap and tell that trade4 idoit to stroke it the lot of yous your a disgrace
Voluntary Disclosure: No Position Sentiment: None TOU violation
Subject re: you should be ashamed of yourselves
Posted 02/03/05 17:29 - 236 reads
Posted by bigdump
IP 210.49.xxx.xxx
Post #529199 - in reply to msg. #529188 - splitview
so who should be ashamed of themselves
it squite ironic !
Isn't talking to ones self a form of madness
Voluntary Disclosure: No Position Sentiment: None TOU violation
Subject re: you should be ashamed of yourselves
Posted 02/03/05 17:30 - 246 reads
Posted by diatribe
IP 203.51.xxx.xxx
Post #529201 - in reply to msg. #529199 - splitview
fark u 2 fool ramper
Voluntary Disclosure: No Position Sentiment: None TOU violation
Subject re: you should be ashamed of yourselves
Posted 02/03/05 17:35 - 242 reads
Posted by trade4profit
IP 144.139.xxx.xxx
Post #529204 - in reply to msg. #529197 - splitview
diatribe...
Here are the posts you refer to "6 - 8 weeks ago"...
---
Subject copper strike.. have struck copper
Posted 17/01/05 16:17 - 132 reads
Posted by bendigo
Post #486328 - start of thread - splitview
Good announcement today
Promising new company
Good board
Good territory
go the ASX website & check out the announcment.
Cheers
Bendigo
---
Subject re: copper strike.. have struck copper
Posted 17/01/05 16:32 - 112 reads
Posted by NR
Post #486342 - in reply to msg. #486328 - splitview
all ready on them bendigo......awaiting further annonucements.......
---
Subject re: copper strike.. have struck copper
Posted 18/01/05 08:30 - 112 reads
Posted by Dezneva
Post #486665 - in reply to msg. #486328 - splitview
Yep, I agree. I know the people as well. They have a whole heap of old TEC ground. Its a great hit. and I think they are continuing the drilling.
---
These were the first 3 posts ever on CSE.
Although Dezneva only posted "...I know the people as well...", I can see how you may have remebered that as "...the boss being a good bloke..."
Problem is, it was Bendigo he was replying to and not you!
How do you explain that?
Cheers!
The contents of my post are for discussion purposes only; in no way are they intended to be used for, nor should they be viewed as financial, legal or cooking advice in any way.
Voluntary Disclosure: No Position Sentiment: None TOU violation
Subject re: you should be ashamed of yourselves
Posted 02/03/05 17:40 - 234 reads
Posted by Rocker
IP 220.253.xxx.xxx
Post #529215 - in reply to msg. #529204 - splitview
well picked up T4P
This article about Ninja Van made me think of Yojee: what Ninja Van has achieved versus what Yojee is trying to do and has achieved, in the same time frames.
https://www.cnbc.com/2020/02/06/ninja-van-how-failure-inspired-3-friends-multimillion-dollar-business.html
The letter from ERM will be posted out with all voting forms to all shareholders, as per legal requirement of course, but the three directors' letters also go, so yes, I agree that more from ERM may be required if they know they need to jolt the apathetic.
Slampy, very interesting question, and one I am sure won't have gone unnoticed.
Re the shredder, of course, that starts to get into dangerous territory, but my dream last night was almost the opposite, with an office full of people writing backdated minutes for meetings and backdated forms for contracts and employment. It was a hectic dream, and I hope there's no reality in it at all.
COD is my pick, as an email has just been received from HC on behalf of Next Oil Rush, detailing some good information.
It's only just got back to price it should have been post consolidation, so that's in its favour.
Very little to sell, I like that, as it will move quickly.
Many won't have received the email yet as they're at work, etc.
Read more here.
http://www.nextoilrush.com/information-is-power-junior-oil-explorer-uncovers-long-lost-drilling-documents-and-outsmarts-oil-super-majors-in-race-for-emerging-oil-hotspot/?utm_source=HCMO
Looks good for next week. Be prepared!
Salty - howsabout an email update please imo!!
Lots of reading today!
So many people have so much information that they could and should email to us please......
[email protected]
I have been a BRN shareholder for five years and have experienced all the volatility since then. Currently I am thinking of selling because I don't know what is actually going on with the company (e.g. the abysmal performance of the Chairman at the last AGM). Before making my decision I asked ChatGPT some general questions about inference in AI and edge computing, then dug deeper into BRN. Here is what it gave me. My questions are in bold, the rest is from ChatGPT. Hope you'll find this interesting (I did, and have decided to be even more patient than I already am).
What is actually inference and what makes it so important in AI?
Inference in AI refers to the process of using a trained model to make predictions or decisions based on new data.
In More Detail:
1.Training vs Inference:
•Training is when an AI model learns from a large dataset by adjusting its internal parameters.
•Inference is when that trained model is used to process unseen inputs (like recognizing objects in an image, translating text, or recommending a movie).
Think of training as teaching a student, and inference as the student taking a test.
2.Example:
•You train a model on thousands of cat and dog images.
•Once trained, you give it a new image (one it hasn’t seen before). The model analyzes it and says, “This is a cat.” — That’s inference.
⸻
Why Inference Is So Important in AI:
1.Real-World Application:
•Inference is how AI delivers value in real-time systems — like search engines, voice assistants (e.g., Siri), self-driving cars, fraud detection, and more.
2.Performance-Critical:
•Inference needs to be fast, accurate, and resource-efficient, especially in scenarios like:
•Smartphones (low power, low memory)
•Real-time systems (e.g., autonomous drones)
•High-scale web services (e.g., millions of users at once)
3.Cost Factor:
•Training is expensive but one-time (per model).
•Inference happens millions or billions of times—so making it faster and cheaper at scale saves huge costs for companies.
4.Hardware Optimization:
•Specialized hardware like GPUs, TPUs, or edge devices (like the AI chip in an iPhone) are designed to accelerate inference efficiently.
⸻
Summary:
Inference = Using a trained model to make predictions.
It’s the phase where AI becomes useful, deployed, and part of everyday experiences. Without inference, all the training in the world wouldn’t result in useful applications.
Let me know if you’d like this explained in more technical depth or with a diagram.
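To make the training/inference split above concrete, here is a minimal sketch in Python (scikit-learn with toy data; purely illustrative, nothing specific to BRN or Akida):

```python
# Minimal sketch of training vs. inference using scikit-learn (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

# --- Training phase: the model learns parameters from labelled data ---
X_train = np.random.rand(200, 4)              # 200 examples, 4 features each
y_train = (X_train[:, 0] > 0.5).astype(int)   # toy labels
model = LogisticRegression().fit(X_train, y_train)

# --- Inference phase: the trained model predicts on data it has never seen ---
x_new = np.random.rand(1, 4)
print(model.predict(x_new))                   # this prediction step is "inference"
```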
What companies play a major role in inference, hardware and software?
Several major companies are central to the AI inference ecosystem, both in hardware and software, because inference is where AI meets the real world—on phones, servers, vehicles, and embedded devices. Here’s a breakdown by category:
⸻
Hardware Companies (Inference Acceleration)
1. NVIDIA
•Products: GPUs (A100, H100, RTX series), TensorRT (optimization toolkit)
•Role: Dominant in both training and inference for large models; TensorRT is widely used to optimize inference on NVIDIA hardware.
2. AMD
•Products: Radeon GPUs, ROCm software stack
•Role: Competitor to NVIDIA; increasingly involved in AI inference in datacenters and edge devices.
3. Intel
•Products: Xeon CPUs, Habana Gaudi chips, Movidius (Edge AI), OpenVINO toolkit
•Role: Strong presence in cloud inference (CPUs) and edge AI (Movidius).
4. Google (Alphabet)
•Products: TPU (Tensor Processing Unit) – specifically TPUv4 for inference
•Role: Powers Google Search, Translate, and other real-time AI services.
5. Apple
•Products: Apple Neural Engine (ANE)
•Role: Optimized for on-device inference in iPhones, iPads, and Macs; used for features like Face ID, Siri, and on-device language models.
6. Qualcomm
•Products: Snapdragon SoCs with AI engines
•Role: Inference on smartphones and embedded devices (camera, voice, AR).
7. Amazon (AWS)
•Products: Inferentia & Trainium chips
•Role: Custom chips designed to reduce cost and latency for inference workloads on AWS.
8. Microsoft
•Products: Azure AI Accelerators (custom silicon, FPGAs)
•Role: Inference at hyperscale in Azure datacenters.
⸻
Software & Frameworks (Inference Optimization & Deployment)
1. ONNX Runtime (Microsoft)
•Role: Open standard for model interoperability and optimized inference across different hardware backends.
2. TensorRT (NVIDIA)
•Role: High-performance inference engine tailored for NVIDIA GPUs.
3. OpenVINO (Intel)
•Role: Optimizes inference across Intel CPUs, GPUs, VPUs, and FPGAs.
4. TensorFlow Lite (Google)
•Role: Lightweight version of TensorFlow for inference on mobile and embedded devices.
5. PyTorch Mobile / TorchScript
•Role: Enables PyTorch models to be optimized and run on mobile devices.
6. TVM (Apache / OctoML)
•Role: Compiler stack for optimizing ML inference on any hardware backend.
7. Hugging Face / Transformers
•Role: Provides pre-trained models and optimized pipelines for inference, including on-device and server-side deployment.
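As a rough illustration of the export-then-serve pattern these frameworks support, the sketch below defines a tiny PyTorch model, exports it to ONNX, and runs it with ONNX Runtime; the model, shapes, and file name ("tiny_model.onnx") are invented for the example.

```python
# Export a tiny PyTorch model to ONNX, then run inference with ONNX Runtime.
import numpy as np
import torch
import onnxruntime as ort

model = torch.nn.Sequential(torch.nn.Linear(8, 4), torch.nn.ReLU(), torch.nn.Linear(4, 2))
model.eval()

dummy = torch.randn(1, 8)
torch.onnx.export(model, dummy, "tiny_model.onnx", input_names=["x"], output_names=["y"])

# Inference side: ONNX Runtime loads the exported graph and runs it (CPU provider here).
sess = ort.InferenceSession("tiny_model.onnx", providers=["CPUExecutionProvider"])
outputs = sess.run(None, {"x": np.random.rand(1, 8).astype(np.float32)})
print(outputs[0])   # prediction produced outside the training framework
```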
⸻
Cloud & Infrastructure Providers
These companies also play a major role by hosting inference at massive scale:
•Amazon AWS
•Google Cloud (Vertex AI, TPU)
•Microsoft Azure AI
•Oracle Cloud (Ampere + NVIDIA-based AI servers)
⸻
How They All Interconnect:
•Model training → Done on high-power GPUs or TPUs (NVIDIA, Google).
•Model optimization → Tools like TensorRT, OpenVINO, ONNX Runtime prepare models for efficient inference.
•Model deployment → Happens on cloud (AWS, Azure, GCP), edge devices (phones, drones), or servers.
⸻
What role does edge computing play in inference?
Edge computing plays a critical role in AI inference by enabling models to run directly on devices close to the data source, rather than relying on cloud servers. This shift is essential in many real-world scenarios where speed, privacy, or connectivity matter.
⸻
Why Edge Inference Matters
1. Low Latency
•Inference happens locally, without needing to send data to the cloud.
•Crucial for real-time applications like:
•Self-driving cars
•Augmented reality (AR)
•Industrial robotics
•Voice assistants (e.g., “Hey Siri”)
2. Data Privacy
•Sensitive data stays on the device.
•Healthcare wearables
•Smart home devices
•Phones with facial recognition
3. Reduced Bandwidth & Cost
•No need to transmit huge volumes of data to the cloud for every inference.
•Saves money and reduces network load—especially important in remote areas.
4. Offline Functionality
•Edge inference works without internet access.
•Think: drones in remote zones, mobile translation apps, or voice commands in airplanes.
⸻
Examples of Edge AI Inference in Action
| Application | Edge Device | AI Use |
|---|---|---|
| Smart cameras | NVIDIA Jetson, Intel Movidius | Object/person detection |
| Smartphones | Apple Neural Engine, Snapdragon | Voice recognition, image enhancement |
| IoT sensors | Raspberry Pi, ESP32 | Predictive maintenance |
| Vehicles | Tesla FSD Chip, Qualcomm Snapdragon Ride | Real-time perception & decision-making |
⸻
Key Players in Edge Inference
Hardware
•Apple Neural Engine (iPhones)
•Google Edge TPU
•NVIDIA Jetson (industrial & robotics)
•Intel Movidius (embedded vision)
•Qualcomm AI Engine (Snapdragon chips)
Software Frameworks
•TensorFlow Lite (optimized for mobile/embedded)
•PyTorch Mobile
•ONNX Runtime (Edge version)
•OpenVINO Toolkit (Intel)
⸻
Trade-offs vs. Cloud Inference
| Aspect | Edge Inference | Cloud Inference |
|---|---|---|
| Speed | Fast (low latency) | Slower (network delay) |
| Privacy | High | Depends on encryption |
| Compute Power | Limited (battery/heat) | Scalable, powerful |
| Model Size | Must be small/optimized | Can be huge (GPT-4 etc.) |
| Connectivity | Offline-capable | Requires internet |
⸻
Summary
Edge inference brings AI closer to the source of data — enabling faster, private, and more efficient decision-making in real-time.
Here are the major players in edge inference, broken down by hardware, software, and platform providers:
⸻
Hardware Companies (Edge AI Chips & Devices)
1. Apple
•Hardware: Apple Neural Engine (ANE)
•Use: iPhones, iPads, Macs for real-time vision, speech, and language tasks.
•Edge AI Features: Face ID, Siri, on-device language modeling.
2. Qualcomm
•Hardware: Snapdragon processors with Hexagon DSP & AI Engine.
•Use: Android phones, AR/VR headsets, automotive.
•Edge AI Features: Camera processing, voice assistants, on-device generative AI.
3. NVIDIA
•Hardware: Jetson Nano, Jetson Xavier NX, Orin.
•Use: Robotics, drones, industrial automation, smart cities.
•Edge AI Features: Object detection, SLAM, autonomous navigation.
4. Google
•Hardware: Edge TPU (Coral Dev Boards, USB Accelerator).
•Use: IoT, embedded vision, ML-enabled sensors.
•Edge AI Features: Low-power inference with TensorFlow Lite models.
5. Intel
•Hardware: Movidius Myriad X, Atom + VPU, OpenVINO compatibility.
•Use: Smart cameras, edge vision systems, security.
•Edge AI Features: Face/object recognition, anomaly detection.
6. Amazon
•Hardware: AWS Snowball Edge, Echo Devices (custom silicon).
•Use: Offline and hybrid cloud-edge deployments, smart assistants.
•Edge AI Features: Alexa voice inference, edge storage & compute.
7. MediaTek
•Hardware: Dimensity chips with built-in AI processors.
•Use: Mid-range smartphones, IoT.
•Edge AI Features: Photography, speech, and AR optimization.
⸻
Software & Frameworks for Edge Inference
1. TensorFlow Lite (Google)
•Optimized for mobile and embedded devices.
•Supports quantization, pruning, and hardware acceleration (via NNAPI, Edge TPU, etc.).
2. ONNX Runtime (Edge version) (Microsoft)
•Cross-platform, supports model conversion and acceleration across chips (including ARM, x86, GPUs, NPUs).
3. PyTorch Mobile
•Allows running PyTorch models on Android/iOS with support for quantized models.
4. OpenVINO Toolkit (Intel)
•Speeds up inference on Intel CPUs, VPUs, GPUs; supports model optimization and deployment at the edge.
5. TVM (Apache / OctoML)
•Compiler framework that optimizes deep learning models for a variety of edge devices.
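To make the TensorFlow Lite workflow mentioned above concrete, here is a minimal sketch that converts a throwaway Keras model with post-training quantization and then runs it through the TFLite interpreter; the model and shapes are invented for illustration.

```python
# Convert a small Keras model to TensorFlow Lite with post-training quantization,
# then run it with the TFLite interpreter (the on-device-style runtime).
import numpy as np
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(16,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]   # post-training quantization
tflite_model = converter.convert()

interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.random.rand(1, 16).astype(np.float32))
interpreter.invoke()
print(interpreter.get_tensor(out["index"]))            # edge-style inference result
```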
⸻
Platform & Integration Providers
1. Amazon AWS Greengrass
•Local ML inference, edge device management, and secure cloud sync.
2. Microsoft Azure IoT Edge
•Runs AI modules locally and syncs with Azure cloud services.
3. Google Cloud IoT / Coral AI
•Edge TPU integration, model deployment tools (Vertex AI Edge).
4. Edge Impulse
•End-to-end platform for developing, training, and deploying AI models to low-power devices (microcontrollers, sensors, etc.).
⸻
Honorable Mentions (Niche/Embedded AI Players)
•Arm (Cortex-M, Ethos NPU for microcontrollers)
•Ambarella (edge AI chips in video security & automotive)
•Hailo (ultra-low-power AI processors for vision tasks)
•Kneron (AI chips for edge vision in consumer devices)
•Mythic AI, SiMa.ai, Tenstorrent (emerging startups in edge inference silicon)
⸻
Summary Table
| Role | Companies |
|---|---|
| Hardware | Apple, Qualcomm, NVIDIA, Google, Intel, MediaTek |
| Software | TensorFlow Lite, PyTorch Mobile, OpenVINO, ONNX Runtime |
| Platform | AWS Greengrass, Azure IoT Edge, Google Coral, Edge Impulse |
Where does Brainchip fit?
BrainChip is a notable player in the edge AI inference space, focused specifically on ultra-low-power neuromorphic computing. Here’s how and where they fit:
⸻
What is BrainChip?
•Company: BrainChip Holdings Ltd (Australian-American company)
•Core Product: Akida™ – a neuromorphic processor (inspired by the human brain)
•Focus: Edge AI inference, especially in low-power, real-time applications like:
•Smart sensors
•Industrial IoT
•Automotive (ADAS, driver monitoring)
•Healthcare and security
⸻
What Makes BrainChip Unique?
1. Neuromorphic Architecture
•Akida is built on a spiking neural network (SNN) model.
•Mimics the way biological neurons communicate — data is event-driven, not clock-driven.
•This makes it highly efficient in terms of power and latency.
2. Low Power, Always-On Inference
•Designed for inference at the edge without relying on cloud or even large local compute.
•Consumes microwatts to milliwatts, ideal for battery-powered devices.
3. On-Chip Learning
•Unlike most edge chips, Akida supports local learning (online learning without retraining in the cloud).
•That means devices can learn new patterns on the fly — e.g., adapting to a new voice or gesture.
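To give a feel for what "spiking" and "event-driven" mean in practice, here is a textbook-style leaky integrate-and-fire neuron in plain NumPy; this is a toy illustration only, not BrainChip's actual Akida implementation.

```python
# Toy leaky integrate-and-fire neuron: compute happens only when a spike (event) fires.
import numpy as np

def lif_neuron(input_current, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron spikes."""
    potential, spikes = 0.0, []
    for t, i in enumerate(input_current):
        potential = leak * potential + i   # integrate input, leaking a little each step
        if potential >= threshold:         # only when the threshold is crossed...
            spikes.append(t)               # ...is an event (spike) emitted
            potential = 0.0                # reset after firing
    return spikes

rng = np.random.default_rng(0)
current = rng.uniform(0.0, 0.4, size=50)   # weak, noisy input: few spikes, little "work"
print(lif_neuron(current))                 # downstream compute happens only at these events
```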
⸻
Where BrainChip Fits in the Ecosystem
| Layer | BrainChip's Role |
|---|---|
| Hardware | Akida NSoC (Neuromorphic System-on-Chip) |
| Inference | Ultra-efficient edge inference with spiking neural nets |
| Software Stack | Akida SDK and tools to convert models to neuromorphic form |
| Deployment | Edge AI in wearables, sensors, automotive, defense |
⸻
⚙️ Use Case Examples
•Driver monitoring systems: Real-time facial emotion and attention tracking with ultra-low latency.
•Smart home: Always-on keyword spotting without cloud data transmission.
•Industrial IoT: Sensor-based anomaly detection with on-device learning.
⸻
In Summary
BrainChip is a niche, cutting-edge player specializing in neuromorphic edge AI, making them ideal for scenarios where traditional inference hardware (like GPUs or TPUs) is too power-hungry or large.
They don’t directly compete with general-purpose players like NVIDIA or Qualcomm on broad workloads, but they complement the edge AI space for ultra-efficient, real-time, adaptive intelligence.
Would you like a comparison table showing how BrainChip stacks up against other edge chips like Edge TPU or Jetson Nano?
Here’s a comparison table showing how BrainChip’s Akida stacks up against other major edge inference chips, including Google’s Edge TPU, NVIDIA’s Jetson Nano, and more:
⸻
Edge Inference Chip Comparison
| Feature / Chip | BrainChip Akida | Google Edge TPU | NVIDIA Jetson Nano | Intel Movidius (Myriad X) | Qualcomm Snapdragon AI Engine |
|---|---|---|---|---|---|
| Architecture | Neuromorphic (SNN) | ASIC (Tensor core) | GPU (CUDA cores) | VPU (Vision + DSP) | Heterogeneous (CPU, GPU, DSP, NPU) |
| Power Usage | ~1–10 mW | <2W | ~5–10W | ~1–2W | 0.5–2W (AI tasks) |
| Learning Type | On-device learning | Inference only | Inference only | Inference only | Inference only |
| Model Support | SNN, CNN → SNN | TensorFlow Lite | TensorRT, ONNX | OpenVINO, Caffe, TensorFlow | TensorFlow Lite, ONNX |
| Latency | <1 ms (event-driven) | ~2 ms | ~10–50 ms | ~10 ms | Low (<10 ms) |
| Key Strength | Adaptive learning, ultra-low power | High speed, low cost | High flexibility, GPU power | Vision processing, low power | Balanced performance/power |
| Connectivity | SPI, USB, PCIe | USB, PCIe, M.2 | HDMI, USB, Ethernet | USB 3.0, PCIe | Mobile SoC embedded |
| Deployment Areas | Wearables, IoT, automotive | IoT, smart sensors | Robotics, cameras, dev boards | Smart cameras, drones | Smartphones, AR/VR, IoT |
⸻
Unique Value of BrainChip Akida
•Always-on sensing without draining battery.
•Adaptive intelligence: Learn from new data in real time (e.g., learning a new voice or gesture).
•Event-driven computation = only processes data when there’s a change, like how the human brain works.
•Extremely small compute footprint, ideal for constrained environments.
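A toy sketch of the event-driven idea in the bullets above: a mostly static sensor stream is turned into events only when the reading changes meaningfully, so downstream inference runs on a handful of events rather than every sample (illustrative only, unrelated to Akida's real pipeline).

```python
# Emit events only when the signal changes by more than a threshold.
def to_events(samples, threshold=0.5):
    events, last = [], samples[0]
    for t, value in enumerate(samples[1:], start=1):
        if abs(value - last) >= threshold:   # event only on a meaningful change
            events.append((t, value))
            last = value
    return events

readings = [20.0, 20.1, 20.0, 23.5, 23.6, 23.4, 19.0, 19.1]   # mostly static signal
events = to_events(readings)
print(f"{len(events)} events from {len(readings)} samples: {events}")
```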
⸻
Summary
| Best For | Chip |
|---|---|
| Ultra-low-power AI + learning | BrainChip Akida |
| Efficient, affordable inference | Google Edge TPU |
| GPU-based flexibility | NVIDIA Jetson Nano |
| Embedded vision AI | Intel Movidius (Myriad X) |
| Smartphones & AR/VR | Qualcomm Snapdragon AI |
Why would a major player not acquire Brainchip?
Great question — and a very strategic one.
BrainChip’s Akida neuromorphic technology is impressive and uniquely positioned in ultra-low-power edge AI. So why hasn’t a major player (e.g., Apple, NVIDIA, Qualcomm, Google, Intel) acquired them yet?
⸻
Key Reasons Why Major Players May Not Acquire BrainChip (Yet)
1. Niche Market Position
•Neuromorphic computing is still early-stage.
•Mainstream edge AI today runs on CNNs, transformers, etc., using more traditional architectures.
•BrainChip’s tech requires retraining or conversion to spiking neural networks (SNNs), which most AI models aren’t built for.
Translation: BrainChip’s advantage is future-leaning, but the market is still built around today’s models.
⸻
2. Technology Integration Risk
•SNNs require rearchitecting software stacks and toolchains.
•Big companies have already invested billions in their own AI stacks (e.g., TensorFlow, CUDA, Neural Engine).
•Acquiring Akida might demand retraining internal teams, redesigning models, or building hybrid systems — expensive and risky.
⸻
3. Valuation vs. Uncertainty
•BrainChip’s value lies in long-term promise, not current mass adoption.
•Larger companies may see it as:
•Too speculative.
•Not yet de-risked.
•Better to wait and partner, rather than acquire now and overpay.
⸻
4. Patent & Licensing Strategy
•BrainChip may be strategically licensing Akida instead of selling the company.
•This allows them to remain independent and still penetrate large ecosystems (e.g., automotive suppliers, medical tech firms).
⸻
5. Acquisition Timing or Fit
•Apple or Google might want tight ecosystem integration, but Akida is more OEM/third-party focused.
•NVIDIA, AMD, or Intel might see it as non-core, since they are more focused on data center AI or GPU-based inference.
⸻
Strategic Alternative: Partnership > Acquisition
Instead of buying BrainChip, large companies might:
•Wait to see broader adoption in industrial, medical, or automotive use cases.
•Partner or license tech (as Bosch, Mercedes-Benz, or other OEMs might).
•Monitor the neuromorphic trend, then acquire once market momentum proves out.
⸻
In Summary
BrainChip is pioneering but not yet mainstream.
Its value is clear, but integration costs, market immaturity, and strategic timing make acquisition a “not yet” rather than a “never.”
Let's look at two things:
1. What milestones or triggers could make BrainChip a more likely acquisition target.
2. Which companies are most likely to acquire BrainChip if that moment comes.
⸻
What Would Trigger an Acquisition?
| Trigger | Why It Matters |
|---|---|
| Widespread commercial adoption | If major OEMs (e.g., Bosch, Continental, Samsung) start using Akida in real-world products, it validates market readiness. |
| AI at the edge shifts to always-on/ultra-low-power | As LLMs grow, so does demand for companion edge AI chips that do pre-filtering or smart sensing (Akida's strength). |
| Developer ecosystem growth | A vibrant SDK, more supported models, and open-source community can reduce the "integration risk" for big buyers. |
| Strategic partnerships with Tier 1s | If Akida gets embedded in vehicles (ADAS), hearing aids, or medical implants, it's a strategic differentiator. |
| Neuromorphic computing gains wider traction | If spiking neural networks start appearing in academic benchmarks or hybrid AI architectures, BrainChip's tech becomes a hedge or strategic moat. |
| Competing startup gets acquired | If a competitor like SynSense or Innatera gets acquired, BrainChip becomes the "last major neuromorphic player" standing. |
⸻
Companies Most Likely to Acquire BrainChip
1. Qualcomm
•Why: Needs low-power edge IP to stay competitive in mobile/automotive.
•Fit: Complements Snapdragon AI Engine; could own a neuromorphic edge core.
•History: Bought Nuvia for CPU, could do similar for SNNs.
⸻
2. Apple
•Why: Privacy + on-device learning is Apple’s DNA.
•Fit: ANE + Akida could enable iPhones to learn new user behaviors without cloud.
•Bonus: Akida’s architecture matches Apple’s emphasis on battery life + user adaptation.
⸻
3. Intel
•Why: Already exploring neuromorphic with Loihi. Could consolidate leadership via Akida.
•Fit: Edge + IoT + Movidius portfolio = strategic synergy.
•Risk: May prefer internal R&D unless market forces acquisition.
⸻
4. NVIDIA
•Why: Owns cloud and high-end inference; may want ultra-low-power companion AI for Jetson or Orin.
•Fit: Hybrid Jetson (GPU + Akida) could dominate robotics.
•Caveat: SNNs don’t align directly with CUDA stack (yet).
⸻
5. Bosch or Continental (Tier 1 Auto Suppliers)
•Why: Akida fits driver monitoring, cabin sensing, predictive maintenance.
•Fit: They often co-develop chips with AI IP vendors.
•Strategy: May prefer licensing — but could acquire for full-stack control.
⸻
6. MedTech / Health Device Companies (e.g., Cochlear, Abbott, Philips)
•Why: Akida is ideal for wearable, implantable, or portable medical AI.
•Fit: Local inference + learning = breakthrough in hearing aids, diagnostics.
⸻
⏳ Summary
BrainChip could become a hot acquisition target once:
•SNNs prove real-world value at scale
•Akida secures high-volume design wins
•The edge AI market moves beyond brute-force inference to adaptive sensing
If BrainChip hits even 2 or 3 of those triggers, it becomes highly attractive to strategic acquirers or IP consolidators.
Would you like me to forecast a possible timeline to acquisition, or simulate a “what-if” strategy deck for one of these companies acquiring BrainChip?
Yes, and what would the anticipated value of such an acquisition be?
Let’s unpack this in three parts:
⸻
1. Potential Timeline to Acquisition
| Stage | Milestone | Estimated Timing | Comment |
|---|---|---|---|
| ✅ Short-Term (2024–2025) | Successful integration into 1–2 high-profile OEM products (e.g., automotive, smart sensors) | Likely | Proof of value + commercial traction |
| ⚙️ Mid-Term (2025–2026) | Edge AI demand shifts toward always-on, adaptive devices | Emerging | Akida's value proposition matures |
| Trigger Point | Neuromorphic computing reaches critical mass (SNNs or hybrid AI become standard in certain domains) | Watch closely | Loihi, SynSense, or Akida gets attention |
| Acquisition Window | 2025–2027 | Prime time | Acquirers step in before BrainChip's value multiplies further |
If BrainChip lands major design wins in 2025–2026, acquisition interest could spike in 2026–2027.
⸻
2. What Would an Acquisition Cost? (Valuation Scenarios)
As of mid-2025, BrainChip’s market cap is estimated around A$500M–$700M (USD ~$350M–$450M) depending on fluctuations.
| Scenario | Revenue/Tech Position | Likely Acquisition Value |
|---|---|---|
| IP Play (Licensing model only, low revenue) | Niche but strategic IP | $400M–$600M |
| OEM Penetration (Automotive, wearables) | Early-stage revenue + momentum | $600M–$1.2B |
| Breakout Success (Integrated in mainstream products) | Scaling revenues + unique tech moat | $1.2B–$2B+ |
| Competitive Bidding War | Multiple suitors (e.g., Qualcomm vs. Intel) | $2B–$3B+ possible |
If they prove out on-device learning at scale, BrainChip could be worth as much as a full IP stack acquisition (like Nuvia or Habana Labs).
⸻
3. Strategic Acquisition Simulation: Qualcomm Buys BrainChip
Why It Makes Sense:
•Fills Qualcomm’s ultra-low-power inference gap
•Bundles Akida IP into Snapdragon IoT & automotive
•Builds future-proofing against Apple’s Neural Engine or custom chips
Estimated Deal:
•Acquisition Cost: $1.1B–$1.5B USD
•Structure: Mix of cash + stock
•Benefit to Qualcomm:
•Enhance mobile/IoT/automotive portfolio
•Create hybrid AI cores (GPU + Akida IP)
•Leverage BrainChip patents (over 200 granted or pending)
⸻
Final Thought
BrainChip’s value is asymmetrically high if the AI market shifts toward adaptive, efficient edge computing — and an acquirer who bets early could gain a sustainable moat in the next generation of intelligent hardware.
Last: 21.0¢ | Change: -0.005 (-2.33%) | Mkt cap: $425.3M

| Open | High | Low | Value | Volume |
|---|---|---|---|---|
| 21.0¢ | 21.5¢ | 21.0¢ | $736.7K | 3.503M |

Buyers (Bids)
| No. | Vol. | Price ($) |
|---|---|---|
| 43 | 3,494,265 | 0.210 |
| 42 | 1,612,506 | 0.205 |
| 90 | 1,876,705 | 0.200 |
| 17 | 678,532 | 0.195 |
| 30 | 353,109 | 0.190 |

Sellers (Offers)
| Price ($) | Vol. | No. |
|---|---|---|
| 0.215 | 934,641 | 17 |
| 0.220 | 1,407,437 | 31 |
| 0.225 | 1,051,467 | 20 |
| 0.230 | 784,679 | 21 |
| 0.235 | 1,109,578 | 10 |

Last trade 16:10 18/06/2025 (20 minute delay)