BRN 2.56% 20.0¢ brainchip holdings ltd

2024 BrainChip Discussion, page-6968

    Extending the IoT to Mars
    • May 30, 2024
    • Steve Rogerson
    • Eseye

    Credit: Nasa/JPL/MSSS
    How far can the IoT go? Further than you think, as Steve Rogerson discovered at this week’s Hardware Pioneers Max show in London.

    We will never see a little green man using the self-checkout till at the Mars branch of Walmart to buy a bottle of Romulan ale. Nor will a woman from Venus see how she looks in that polka-dot dress using a magic mirror on the Moon.

    But that does not mean the IoT will not stretch beyond the upper reaches of Earth’s atmosphere; it already has.

    These seeds were first sown when the benefits of connecting armies of sensors to the internet first dawned, though few then considered the lurking problem of handling such vast amounts of data. Yes, the data are useful, but only if something useful can be done with them.

    Given much of this involved monitoring situations remotely, an added problem became latency. Some benefits could only be harvested if the response to the data was immediate, or close to it.

    Not long passed before it became obvious that for the IoT to succeed, data had to be processed and acted on at the edge. That meant giving some autonomy to these systems. At its simplest, it saw thermostats in factories, offices and homes turning the heating on if it got too cold or increasing ventilation if it got too hot.
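    The thermostat case above is the simplest form of edge autonomy: act locally on a reading instead of waiting for a round trip to a server. As a toy illustration (setpoint values are my own, not from the article):

    ```python
    # Toy sketch of the simplest edge autonomy described above: a thermostat
    # that decides locally, with no round trip to the cloud. The setpoints
    # here are illustrative, not from any real product.

    def thermostat(temp_c, low=18.0, high=26.0):
        """Return the action a simple edge thermostat would take."""
        if temp_c < low:
            return "heating_on"
        if temp_c > high:
            return "ventilation_up"
        return "idle"

    print(thermostat(15.0))  # heating_on
    print(thermostat(30.0))  # ventilation_up
    print(thermostat(21.0))  # idle
    ```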

    Easy so far. But we needed more, especially if we were going to have more automated factories, robot deliveries and self-driving cars. The job of the edge processor was becoming harder. Higher intelligence was needed and, thankfully, it has arrived and is getting better all the time.

    Developments in artificial intelligence (AI) have blossomed in recent years, bringing impressive smartness to edge devices.


    Marco Parmegiani from Eseye.
    “IoT starts and ends with the devices,” Marco Parmegiani, architect director at Eseye (www.eseye.com), told visitors to this week’s Hardware Pioneers Max (HPM) show in London. “This year, we are seeing the rise of the intelligent edge.”

    He said things had to happen at the edge. For example, there are devices that will monitor for a water leak. If all they do is send or sound an alarm, it could all be too late by the time you get home to fix it. But add a bit of intelligence and it will work out if the water needs turning off and do it itself. In fleet management, devices can intelligently know whether to send data by wifi, cellular or satellite depending on which is available and which is cheapest at the time.

    “The intelligence is being pushed down to the device,” said Marco. “IoT devices are becoming cleverer. You can now put a lot more processing power into the device and make the decision about whether and when to send data.”

    It did not take long before these advances caught the attention of people with more off-world problems. Space bodies such as Nasa and the European Space Agency (ESA) have long battled with latencies – space is big, as Douglas Adams pointed out in The Hitchhiker’s Guide to the Galaxy – that go far beyond what is experienced on Earth. Remotely controlling a rover on Mars is just not practical in real time; by the time the engineer on Earth has pressed the stop button, the rover will have its face full of red rock. AI is becoming the answer.


    Alf Kuchenbuch from Brainchip.
    This was explained by Alf Kuchenbuch, a vice president at Australian technology company Brainchip (brainchip.com), who told HPM delegates how excited he was that his company’s chips were now doing real edge processing in space.

    “Nasa and the ESA are picking up on AI,” he said. “They want to see AI in space. They are nervous, but they are acting with urgency.”

    Earlier this month, he attended a workshop in the Netherlands organised by the ESA where he said the general view was that everything that happened on Earth would happen in space in five years’ time.

    “Some find that shocking, but it is an inevitable truth,” he said. “Nasa is picking up on this too.”

    But he said even satellites in low Earth orbit sometimes hit latency problems. There are also bandwidth difficulties. Satellites sending constant images of the Earth’s surface use a lot of bandwidth, but many of those images are useless because of cloud cover. Applying AI to the images on the satellite can pick out those that show more than just the tops of clouds, and sometimes stitch images together, drastically reducing the amount of data that needs to be sent. And if the satellites are being used, say, to track ships, they don’t need to keep sending pictures of the ship, just its coordinates.
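    The two savings described here can be sketched in a few lines: drop frames that are mostly cloud, and reduce a ship-tracking frame to a tiny coordinate message. This is purely illustrative, assuming made-up field names, not BrainChip's or the ESA's actual pipeline:

    ```python
    # Illustrative sketch of the on-board bandwidth savings described above.
    # Field names and the cloud threshold are assumptions for the example.

    def frames_to_downlink(frames, cloud_threshold=0.8):
        """Keep only frames whose estimated cloud fraction is low enough."""
        return [f for f in frames if f["cloud_fraction"] < cloud_threshold]

    def ship_report(frame):
        """Reduce a ship-tracking frame to a small coordinate message."""
        return {"lat": frame["lat"], "lon": frame["lon"]}

    frames = [
        {"id": 1, "cloud_fraction": 0.95, "lat": 51.5, "lon": 1.3},
        {"id": 2, "cloud_fraction": 0.10, "lat": 51.6, "lon": 1.4},
    ]
    useful = frames_to_downlink(frames)
    print([f["id"] for f in useful])  # [2]
    print(ship_report(useful[0]))     # {'lat': 51.6, 'lon': 1.4}
    ```

    A full image might be megabytes; the coordinate message is a few dozen bytes, which is the whole point of doing the filtering on the satellite.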

    Taking a leaf from autonomous vehicles on Earth, similar technology can be used for performing docking manoeuvres in space and, as mentioned, controlling ground vehicles on the Moon or Mars. Another application is debris removal. There is a lot of junk circling the Earth and there are plans to remove it by slowing it down so it falls towards Earth and burns up.

    “These applications are why AI in space is so necessary,” said Alf.

    Brainchip is using neuromorphic AI on its chips, which Alf said had a big advantage in that it worked in a similar way to a brain, only processing information when an event happened, lowering the power requirements. The firm’s Akida chip is on SpaceX’s Transporter 10 mission, launched in March.
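    The event-driven idea behind neuromorphic designs can be illustrated simply: compute only when an input changes enough to count as an event, so a static scene costs almost nothing. A minimal sketch of that principle (this is not the Akida programming model; the threshold and signal are invented for the example):

    ```python
    # Minimal sketch of event-driven processing, the principle behind
    # neuromorphic chips described above: emit work only when the input
    # changes past a threshold. Illustrative only; not Akida's actual API.

    def event_stream(samples, threshold=5):
        """Yield (index, value) only when the signal moves past the threshold."""
        last = None
        for i, value in enumerate(samples):
            if last is None or abs(value - last) >= threshold:
                last = value
                yield i, value

    # A mostly-static signal with one jump produces only two events,
    # so downstream processing (and power draw) scales with activity,
    # not with the raw sample rate.
    signal = [100, 101, 100, 102, 150, 151, 150]
    print(list(event_stream(signal)))  # [(0, 100), (4, 150)]
    ```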

    “We are waiting for them to turn it on and for it to start doing its work,” he said. He wouldn’t say what that work was, just that: “It is secret.”

    Brainchip is also working with Frontgrade Gaisler (www.gaisler.com), a provider of space-grade systems-on-chip, to explore integrating Akida into fault-tolerant, radiation-hardened microprocessors to make space-grade SoCs incorporating AI.

    “If this works out, our chip will be going on the Moon landing, or even to Mars,” he said. “Akida is not a dream. It is here today, and it is up there today.”
 