BRN 2.44% 20.0¢ brainchip holdings ltd

2021 BRN Discussion, page-16363

    Attempting to join a few dots. Apologies in advance for the huge post.

    TLDR: I suspect Ford are ordering [a minimum of] 100,000 Akida chips, after which they will switch over to using IP.
    I also suspect Ford (and possibly GM) may be integrating the Xperi neuromorphic eye-tracking system, which uses Prophesee's neuromorphic sensor, into their vehicles.
    However, it was previously thought Akida was being used for drowsiness detection, and Xperi already does this. That makes me think Xperi may be one of our EAP customers, or, more likely, that Ford is exclusively using Akida to process their neuromorphic sensor data. Given Xperi's strong ties to Ford through an ex-Ford engineer, Ford are probably also using Xperi sensor data for other custom Ford applications. This anticipated relationship also suggests Xperi may be a future customer, though whether they would purchase IP themselves or through a third party like Renesas isn't clear.

    Long version:
    Ford are working with Brainchip to trial ADAS and autonomous vehicle solutions. It is a proof of concept in which Ford are developing their own module, which is unusual for an automotive company. From the March 2020 Market and Product Development update:
    https://www.finnewsnetwork.com.au/archives/finance_news_network275603.html
    Tier one automobile manufacturers: we've got Detroit, we can't name names, but we've got Detroit where we are close to a proof of concept agreement. That is the first step. We'll get a little bit of money, but more importantly it's validation that tier one automobile manufacturers are interested in developing a solution which includes an Akida [note: transcript interpreted this wrong] device in a module or some part of your infrastructure in the automobile. Automotive module suppliers are probably the most target-rich environment for us. Most automobile manufacturers don't develop their own modules; it's pushed down to tier one module suppliers. We have a proof of concept; it's been frustrating.

    A little later, in May 2020, Ford officially announced they are working with Brainchip on driver behaviour assessments. Note that the ongoing expense payments received periodically also make me think we would be overdue for additional Ford payments in the 4C:
    https://brainchipinc.com/brainchip-ai-player-on-roll/

    Agreement with Ford Motor Company for the Evaluation of AkidaTM Neural Processor

    On 24 May 2020, the Company signed a joint agreement with Detroit-based Company for evaluation of the Akida neural network System-on-Chip for Advanced Driver Assistance Systems & Autonomous Vehicle applications.
    The evaluation agreement was binding on execution which was signed into with Ford Motor Company and is not the subject of a fixed term. The deal is based upon a partnership to assess Akida as it relates to the automotive industry & payments under the agreement proposed to cover related expenses and received periodically during the evaluation process.

    Akida NSoC has an advanced and highly efficient neuromorphic nature, and the partners in the collaboration have also realised that these features provide a broad range of potential solutions to complex problems such as driver behaviour assessments and real-time object detection.

    Additional comments about Blue Cruise are in another article below. Ford are deploying Blue Cruise this year, their driver monitoring solution for hands-free driving. It uses an infrared camera and optical sensors (note the use of "optical sensors" and not "cameras"; this may refer to an event-based camera sensor). These cannot be retrofitted at a later date. Now why can they not be retrofitted at a later date? My guess: it's a one-time order of 100,000 [or multiples of 100,000] modules or chips, after which they intend to move to an IP agreement, so the chips can no longer be fitted at a later date. This is where I think Akida fits in.

    https://arstechnica.com/cars/2021/04/ford-will-roll-out-bluecruise-hands-free-driving-software-in-q3-2021/

    And, like Super Cruise, it uses an eye-tracking infrared camera to make sure that the driver's eyes are on the road ahead while engaged. To prevent mode confusion, Ford says that the main instrument display will change unambiguously so drivers know whether BlueCruise is active.

    When the OTA update rolls out, it will be available to F-150s fitted with the $995 Co-Pilot 360 Active 2.0 pack (standard on Lariat, King Ranch, and Platinum trims) and Mustang Mach-Es fitted with the $2,600 Comfort and Technology Package (standard on the CA Route 1, Premium, and First Edition trims).

    However, the hardware, which includes the infrared driver-monitoring system as well as some radar and optical sensors, has to be fitted by the factory. It cannot be retrofitted at a later date. Ford says this should number about 100,000 vehicles by the end of 2021. (From next year, the software will be installed by the factory.)


    To add further to this, Ford has been hit hard by the chip shortage: they've had tens of thousands of cars sitting in their yards waiting for chips. But consider that Akida chips were initially planned to be in Brainchip's hands around April this year, and because of extra design tweaks this has been pushed out to August. So Ford may also be using the chip shortage as cover for why they have so many cars waiting for chips, when in fact they are waiting for Akida chips to come out. The below article is from May 2021.

    https://www.motorbiscuit.com/why-are-22000-new-ford-trucks-sitting-in-a-huge-lot-in-kentucky/
    Yes, the chip shortage is why there are over 20,000 Super Duty pickups and counting just sitting there. They’re waiting for microchips. That’s how bad the chip shortage has gotten.

    The Ford Kentucky Truck Plant in Louisville is home to Ford’s Super Duty assembly plant. If you know Ford’s inner workings then you already know that. Or, if you receive Ford’s first-quarter earnings report you also have an idea, according to the Detroit Free Press.

    That’s because John Lawler, Ford’s chief financial officer, told analysts about the hiccup. He also alluded to the fact that the 22,000 number would in all likelihood get larger. “The semiconductor shortage and the impact to production will get worse before it gets better,” Ford CEO Jim Farley said after earnings were posted.



    The below article talks about Blue Cruise but also the difference between DMS and OMS.
    https://www.eetimes.com/dms-evolves-from-nanny-tech-to-blue-cruise-safety/
    DMS vs. OMS
    The technical differences between DMS and OMS are huge.
    DMS is all about wanting “an epic amount of information about the driver… It probably demands 60 frames per second vision analytics,” explained Colin Barnden, lead analyst, Semicast Research. DMS monitors not just head pose, but also eye gaze. Driver monitoring measures levels of attention, distraction, cognition to understand the drivers’ attention, state and engagement level in the task of driving.
    Occupant monitoring is different, he added. “You are looking at, maybe five to 20 frames per second,” as OMS looks at the wider cabin and positions of the occupants, Barnden explained.

    Owl vs. lizard
    The technology has evolved over time from an indirect DMS to watch the steering wheel and its rotation. Now the system primarily relies on head pose as the key to reading driver attention. Today, the latest DMS looks at driver attention based on “human factors’ studies, and actually studies of animals,” explained DiFiore.

    Comparing an owl with a lizard, he said, “The owl focuses on objects by turning its head. It never moves its eyes.” In contrast, the lizard keeps its head steady while its eyes dart back and forth, as it focuses on different objects. As humans, he concluded, “We have a combination of lizard and owl behavior. When we focus on objects, it’s normally a combination of our eyes moving and our head moving, depending on the task and the range.”

    Thus, the latest DMS seeks to know what’s going on in a driver’s head by collecting an awful lot of information including eye-gaze patterns and eyelid behaviors.

    Veoneer’s Chung said, “I agree with Nick [DiFiore] that the eye gaze is the future and the highest performance.” But given a growing number of carmakers looking to implement DMS, she noted, “maybe it is just a head pose system that’s acceptable in the short term DMS for mass market.”

    Reactive vs. Predictive
    As Chung noted, one of the big benefits of DMS is its ability to predict and react by understanding when drivers get distracted or drowsy.

    What the above article indicates is that DMS requires processing a lot of data. Drivers use different methods for driving and focusing, which can be a combination of eye gaze and head turning. Therefore, what you need is a system which learns the driver's driving style. If a system does this effectively, it could focus only on learned driver behaviours and target those responses, making it smarter and more accurate.
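    As a back-of-the-envelope illustration of that "learned driving style" idea, here is a toy sketch (entirely my own; every name, weight and threshold is hypothetical, not from Brainchip, Ford or Xperi) of a model that learns whether a driver is more "owl" (head-mover) or "lizard" (eye-mover) and weights the two attention cues accordingly:

```python
# Hypothetical sketch only: per-driver adaptation of attention scoring.
from dataclasses import dataclass

@dataclass
class GazeSample:
    head_yaw_deg: float  # head rotation away from road centre
    eye_yaw_deg: float   # eye-gaze offset relative to the head

class DriverProfile:
    """Learns how much a driver moves head vs. eyes ('owl' vs. 'lizard')."""
    def __init__(self):
        self.head_share = 0.5  # prior: equal mix of head and eye movement

    def update(self, s: GazeSample, rate: float = 0.05):
        # Fraction of this glance that was head movement; nudge the
        # running estimate towards it (simple exponential average).
        total = abs(s.head_yaw_deg) + abs(s.eye_yaw_deg) + 1e-9
        observed = abs(s.head_yaw_deg) / total
        self.head_share += rate * (observed - self.head_share)

    def attention_on_road(self, s: GazeSample, limit_deg: float = 20.0) -> bool:
        # Weight the two cues by the driver's learned style.
        combined = (self.head_share * abs(s.head_yaw_deg)
                    + (1 - self.head_share) * abs(s.eye_yaw_deg))
        return combined < limit_deg

profile = DriverProfile()
# A "lizard" driver: small head movements, large eye movements.
for _ in range(100):
    profile.update(GazeSample(head_yaw_deg=2.0, eye_yaw_deg=18.0))
print(round(profile.head_share, 2))  # → 0.1 (learned: mostly an eye-mover)
```

    The point of the sketch is only that a per-driver profile lets the system interpret the same raw angles differently for different drivers, which is the kind of on-device learning a neuromorphic processor is pitched at.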

    Regarding Ford's DMS: Xperi first announced their driver monitoring solution in early May last year, about three weeks before Ford's partnership with Brainchip was announced. I think the timing may be relevant. Ford would have known about the neuromorphic vision sensor and may have put two and two together and decided to integrate Akida:
    https://www.xperi.com/news/xperi-wins-key-oem-vehicle-design-ins-for-in-cabin-monitoring-systems/

    MAY 6, 2020

    XPERI WINS KEY OEM VEHICLE DESIGN-INS FOR IN-CABIN MONITORING SYSTEMS

    Today we’re excited to announce that our in-cabin monitoring solution has been selected by four automotive brands as design-ins for their occupancy monitoring and driver monitoring systems in a range of new vehicles. This brings to 20 the number of new vehicle design-ins we have been awarded by automakers for the in-cabin systems.

    “We are proud to partner with industry-leading car and truck companies to advance occupancy and driver monitoring technologies that are redefining the in-cabin experience, which is of critical importance as vehicles become more autonomous,” said Jeff Jury, general manager automotive at Xperi. “Our detection technology enables OEMs to offer a safer, more personalized and convenient experience for drivers and occupants, today and into the future. Our systems have the capability to be effective even if drivers or occupants are wearing face masks as so many are today.”
    Our driver monitoring solution is currently in production and shipping in five light commercial vehicles from three different OEM brands, while 15 new vehicle design-ins, awarded by global OEMs for occupancy monitoring and driver monitoring technologies, are estimated to go into mass production starting in 2021, and expected to continue over a multi-year production life.
    Our in-cabin monitoring technologies support a safer driving experience and may ultimately help prevent traffic accidents, by providing driver and occupant state analytics through edge computing and sophisticated neural networks. In addition to detecting all human occupants of a vehicle, our solution detects pets and relevant objects, such as child seats. For each human occupant, the technology provides advanced analytics such as passenger authentication, age group, emotional state, and body pose.


    Then in April 2021, Xperi gave further details about their world-first in-cabin monitoring technology, which uses neuromorphic vision. In this article they say they are working with Prophesee, a producer of neuromorphic vision sensors. Their system also used "near infrared" spectrums in the training data, so it stands to reason it may also require an infrared camera to operate.

    https://www.businesswire.com/news/home/20210421005249/en/Xperi-Develops-World-first-In-cabin-Monitoring-Technologies-on-Neuromorphic-Vision-Systems

    CALABASAS, Calif.--(BUSINESS WIRE)--DTS®, a global leader in next-generation audio, imaging and sensing technology, and a wholly owned subsidiary of Xperi Holding Corporation (NASDAQ: XPER) (“Xperi”), today announced a world-first neuromorphic driver monitoring solution (DMS), powered by the Prophesee Metavision® Event-Based Vision sensor.
    Using the raw feed from the Metavision® Event-Based Vision sensor, the DTS AutoSense* team developed driver-centric monitoring features such as gaze tracking, head pose, identification, and eyelid opening.

    With more than 20 years of world-leading experience in neural solutions and computer vision, and with billions of products powered by their solutions, the Xperi team maps and adapts in-cabin monitoring (ICM) technologies to existing and future sensor centric trends.

    “Intelligent sensing is required by the next-generation in-cabin experience. The critical path to this is a sensor fusion-focused advanced research strategy,” said Petronel Bigioi, chief technology officer, Xperi. “DTS AutoSense’s advanced research team is centered on state-of-the-art sensing solutions and neuromorphic sensing is one of the technologies that can shape the future of the industry.”

    The neuromorphic sensors capture information at an equivalent frame rate of 10,000 frames per second without requiring active illumination. This enables better low light performance for driver monitoring features as well as never-before-seen capabilities such as saccadic eye movement or micro-expressions, crucial next steps in the next-generation in-cabin experiences.

    Xperi relies on an end-to-end data generation and training system built to address the particular needs of the sensor. The training data set was generated based on Xperi’s extensive computer vision infrastructure, reusing ground-truth from the visible and near infrared spectrums, synthesizing a completely novel approach in neuromorphic based sensing.

    “DTS AutoSense’s innovative AI-enabled approach underscores the ability of Prophesee’s sensor solution to provide OEMs with an effective safety enhancement for vehicles of all types,” said Marc Rousseau, VP Products and Business Development at Prophesee. “We are delighted to collaborate on this first-of-its-kind technology with Xperi and its forward-thinking, high caliber team.”

    *DTS AutoSense comprises a Driver Monitoring Solution (DMS) and an Occupancy Monitoring Solution (OMS), the first to be designed into passenger vehicles projected to be on the road in 2021. Working together they provide actionable insights into activity inside the vehicle, including the driver, passengers, pets and objects, to create a better, safer experience.
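    To put rough numbers on why an event-based sensor is attractive for this, here is a quick back-of-envelope comparison (my own illustrative figures, not Prophesee's or Xperi's specs). Event cameras report only per-pixel brightness changes as (x, y, timestamp, polarity) events, so despite microsecond time resolution the data volume can be far below a conventional 60 fps stream:

```python
# Illustrative only: toy bandwidth comparison, frame camera vs. event camera.
WIDTH, HEIGHT, FPS = 640, 480, 60

def frame_bandwidth_bytes_per_s():
    # Conventional camera: every pixel, every frame (8-bit mono assumed).
    return WIDTH * HEIGHT * FPS

def event_bandwidth_bytes_per_s(events_per_s, bytes_per_event=8):
    # Event camera: only pixels whose brightness changed fire an event.
    return events_per_s * bytes_per_event

frames = frame_bandwidth_bytes_per_s()          # ~18.4 MB/s
events = event_bandwidth_bytes_per_s(200_000)   # ~1.6 MB/s at 200k events/s
print(frames // events)  # → 11
```

    Even with these made-up event rates the sparse stream is an order of magnitude smaller, which is exactly the kind of sparse, asynchronous data a spiking processor like Akida is designed to consume directly.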



    Guilian Gao is a Distinguished Engineer at Xperi. He previously worked at Ford Motor Co and won their highest technical award many years back.
    https://www.linkedin.com/in/guiliangao
    https://hotcopper.com.au/data/attachments/3386/3386585-8ac9b9eac85cf474e233d2689f8e2f85.jpg
    https://hotcopper.com.au/data/attachments/3386/3386635-b9f470993f1c924a9f3c62224582f541.jpg
    Xperi already sell products to Ford:
    https://s21.q4cdn.com/588597904/files/doc_financials/2020/ar/409ddc1a-d0c8-4f5a-84d9-5e5de77556ce.pdf
    Our HD Radio technology is incorporated into a number of our automotive partners’ products, including vehicles from Acura, Audi, BMW, Ford, GM, Honda, Hyundai, Tesla, and Toyota, among many others.

    However, Xperi already does drowsiness detection:
    https://www.radioworld.com/tech-and-gear/digital-radio/xperi-highlights-its-in-cabin-monitoring-efforts
    “Xperi’s driver monitoring solution is able to detect a distracted or drowsy driver and keep him safe behind the wheel using visual cues (face detection and tracking, head position, eye gaze, eyelid opening, etc),” wrote author Mara Anton. “Xperi’s in-cabin solutions can be tailored by car manufacturers to fit their design needs and creative use-cases. A car that turns into a moving cinema? One that acts like a portable office? Or a vehicle that’s a therapist on the go? It’s all up to the manufacturer’s path to fully-autonomous vehicles and your imagination, of course!”

    Given Ford's strong links to an engineer at Xperi, it is possible they are using the Xperi neuromorphic vision sensor data for their DMS system. Ford may be working with both Brainchip and Xperi to do more with their neuromorphic data and use it for more things like learning driver behaviour with Akida to provide a customised experience.

    Interestingly GM Super Cruise is also using an eye-tracking infrared camera. I suspect (but less strongly than Ford) they may also be one of the other automotive companies using Xperi's neuromorphic vision camera system.

    As the EE Times article quoted above indicates, DMS requires processing a lot of data at a high frame rate (around 60 fps), so it needs to be processed very quickly. Neuromorphic vision sensors are well suited to this given their ultra-low power and very high capture rate, and the most efficient way to process that data would be with a neuromorphic processor.

    Brainchip also list driver alertness (drowsiness detection) on their website under Automotive sensors:
    https://brainchipinc.com/automotive/
    The Akida AKD1000 Edge AI processor can also monitor the passenger compartment, checking driver alertness with the aim to reduce human error. Privacy is assured without the need to upload images to the internet.
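    For a sense of what a drowsiness check like this looks like at the signal level, here is a toy sketch of PERCLOS, a widely used research measure of eyelid closure over a time window. How Akida or Xperi actually implement drowsiness detection isn't public; the thresholds below are illustrative only:

```python
# Toy PERCLOS sketch: fraction of frames where the eyes are nearly closed.
def perclos(eyelid_openness, closed_threshold=0.2):
    """eyelid_openness: per-frame values in [0, 1], where 1 = fully open."""
    closed = sum(1 for o in eyelid_openness if o < closed_threshold)
    return closed / len(eyelid_openness)

def is_drowsy(eyelid_openness, limit=0.15):
    # Common research heuristic: PERCLOS above roughly 15% suggests drowsiness.
    return perclos(eyelid_openness) > limit

alert  = [0.9, 0.8, 0.1, 0.9, 0.85, 0.9, 0.95, 0.9, 0.8, 0.9]  # one quick blink
drowsy = [0.9, 0.1, 0.1, 0.15, 0.8, 0.1, 0.05, 0.1, 0.9, 0.1]  # long closures
print(is_drowsy(alert), is_drowsy(drowsy))  # → False True
```

    The per-frame eyelid-opening signal is exactly one of the features Xperi says it extracts from the Prophesee sensor feed, so a downstream processor only needs to run this kind of cheap aggregate on the edge.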

    Previously it was discussed on here that Ford may be using Akida for drowsiness detection. However, if Xperi is already capable of this, why does Ford need Akida? Either Xperi is using Akida, or, more likely, Ford is exclusively processing Xperi neuromorphic data with Akida. Ford may also be using Xperi for additional custom solutions. This suggests Xperi may be a future customer, though whether they would purchase IP themselves or through a third party like Renesas isn't clear.

    Pure speculation, DYOR
 