    Neuromorphic Vision Sensors Bring Autonomy Closer to Reality
    Article by Anne-Françoise Pelé

    Why do we call an event-based vision sensor “neuromorphic”? Because each pixel is a neuron, and it makes sense to have the artificial intelligence next to the pixel...

    Neuromorphic vision sensors are bio-inspired cameras that capture the vitality of a scene, mitigating data redundancy and latency. Event-based, these sensors bring autonomy closer to reality and find utility in high-speed, vision-based applications in areas such as industrial automation, consumer electronics and autonomous vehicles.

    “Why do we say that an event-based vision sensor is neuromorphic? Because each pixel is a neuron, and it totally makes sense to have the artificial intelligence next to the pixel,” Pierre Cambou, principal analyst at Yole Développement (Lyon, France), told EE Times.

    Dormant for years, the neuromorphic vision sensor industry has been staging a comeback in recent months. Last November, Samsung filed a trademark application for its Dynamic Vision Sensor technology aimed at mobile and tablet applications. “This was a surprise,” Cambou said, “since Samsung had originally marketed DVS mainly for automotive advanced driver assistance systems.”

    In December, Sony quietly acquired Zurich-based Insightness, whose vision sensors allow motion detection within milliseconds even if the sensor itself is moving. And in February, shortly after raising an additional $28 million, Paris-based Prophesee reported a new stacked event-based vision sensor, jointly developed with Sony, at the International Solid-State Circuits Conference.

    Neuromorphic sensing originates from the development of a “silicon retina” by Misha Mahowald at the Institute of Neuroinformatics and ETH Zurich in 1991. Mimicking the human retina, Mahowald explained, “this silicon retina reduces the bandwidth by subtracting average intensity levels from the image and reporting only spatial and temporal changes.” This inspiration drives the concept behind the Dynamic Vision Sensor (DVS) and has led to the creation of a myriad of startups in recent years. The Swiss firm iniVation is among them.


    iniVation’s DAVIS346 DVS
    Founded by pioneers of event-based vision in 2015, the Zurich-based company has developed a dynamic vision platform that combines hardware and software for high-performance machine vision systems. Its neuromorphic DVS chip, dubbed DAVIS346, emulates the properties of the human retina. Only local pixel-level changes are transmitted as they occur, resulting in a stream of events at microsecond time resolution that carries information equivalent to a conventional vision sensor’s output, but with far less data. Power (up to 90 percent less), data storage and computational requirements are significantly reduced, while sensor dynamic range (above 120 dB) is increased thanks to local processing, the company claimed.
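    To make the pixel-level principle concrete, here is a minimal software sketch of how a DVS-style array can be simulated from ordinary frames: each pixel remembers the log intensity recorded at its last event and fires an ON or OFF event only when the change exceeds a contrast threshold. This illustrates the general DVS idea, not iniVation’s implementation; the function name, threshold and synthetic frames are assumptions for the example.

```python
import numpy as np

def dvs_events(ref_log_intensity, frame, timestamp_us, threshold=0.2):
    """Emit DVS-style events from a new intensity frame.

    Each pixel keeps the log intensity recorded at its last event; when the
    current log intensity differs from that reference by more than `threshold`,
    the pixel fires an ON (+1) or OFF (-1) event. Silent pixels transmit
    nothing, which is where the bandwidth and power savings come from.
    """
    log_i = np.log(frame.astype(np.float32) + 1e-3)
    diff = log_i - ref_log_intensity
    ys, xs = np.nonzero(np.abs(diff) >= threshold)
    polarities = np.sign(diff[ys, xs]).astype(np.int8)
    ref_log_intensity[ys, xs] = log_i[ys, xs]   # update the reference only where a pixel fired
    # Each event is (x, y, timestamp in microseconds, polarity).
    return [(int(x), int(y), timestamp_us, int(p))
            for x, y, p in zip(xs, ys, polarities)]

# A 346x260 array (the DAVIS346 resolution) watching a mostly static scene:
rng = np.random.default_rng(0)
frame0 = rng.integers(50, 200, size=(260, 346)).astype(np.float32)
reference = np.log(frame0 + 1e-3)
frame1 = frame0.copy()
frame1[100:120, 150:200] *= 1.5            # only a small patch brightens
events = dvs_events(reference, frame1, timestamp_us=1000)
print(f"{len(events)} events from {frame1.size} pixels")   # only the changed pixels fire
```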

    With a network of 300 customers, iniVation has collaborated on IBM’s TrueNorth brain-inspired chip with researchers at the University of Pennsylvania, the University of Zurich and the U.S. Defense Advanced Research Projects Agency, work that focused on autonomous drone flights. The company has also taken part in a European Union initiative focused on a smart sustainable city project.

    Envisioning smart factories
    A silent revolution is occurring in factories. Autonomy and automation go hand in hand, and behind many advances in manufacturing automation is machine vision. Unlike simple sensors, machine vision sensors generate large amounts of data to identify defective systems, understand their deficiencies and enable rapid intervention. The results are cost savings and productivity gains.

    iniVation claimed its Dynamic Vision platform, suited to industrial vision, enables high-speed 3D infrastructure scanning for predictive maintenance, high-speed production inspection, particle analysis, microscopy for fluorescent imaging and human motion analysis. In other words, it performs mundane or complex repetitive tasks at high speed with high accuracy and consistency.

    “It has taken a while for us to come up with a good strategy,” iniVation’s CEO Kynan Eng said in an interview. While other companies perform high-speed counting, Eng said “it is no big deal counting objects at high speed” since conventional cameras can get “a thousand frames per second, even more.” If applications don’t need to respond immediately, then “there is no point using our sensors.”

    What is critical is latency rather than data throughput, and “our sensor deals with fast reaction time.” For instance, “If you have a robot moving along, doing something, it needs to adjust its path in real time. The faster it can adjust, the faster it can move and detect its own errors.”
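    A quick back-of-envelope illustration of why latency rather than throughput is the point: the distance a robot travels before it can react is simply its speed multiplied by the sensing latency. The speed and latency values below are illustrative assumptions, not figures from Eng.

```python
# How far a robot moves before it can correct its path: distance = speed * latency.
# The speed and latency values are illustrative assumptions only.
speed_m_per_s = 1.0                      # a robot arm or mobile base moving at 1 m/s
frame_camera_latency_s = 10e-3           # ~10 ms for a conventional frame pipeline
event_camera_latency_s = 100e-6          # ~100 microseconds for an event-based pipeline

for name, latency_s in [("frame camera", frame_camera_latency_s),
                        ("event camera", event_camera_latency_s)]:
    blind_travel_mm = speed_m_per_s * latency_s * 1000
    print(f"{name}: {blind_travel_mm:.2f} mm travelled before a correction is possible")
```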

    “I would [categorize] industrial vision as a relatively low risk, but low volume market,” said Eng. Hence, there has been little interest from venture funds. With an eye toward organic growth, iniVation is thinking in terms of economies of scale. Through its 2019 partnership with Samsung, iniVation shifted from manufacturing and silicon sales to selling cameras to the machine vision industry. “You can sell the $100 silicon, or you can package it in a camera and sell a $1,000 camera,” noted the Yole analyst Cambou.

    By moving towards the system, iniVation is moving up the value chain.

    “We recognized that it did not make sense for us to become a chip company,” Eng said. “We could raise a billion, and it would still not be enough to make the chip ourselves. People were asking us why our cameras were expensive and how we could make them cheap.” Partnering with Samsung “makes that question go away.”

    The need for quality has boosted machine vision in the food, packaging, consumer electronics, aerospace and automotive industries. Eng said iniVation’s goal is to access higher-volume markets.

    Picturing the future of mobile
    Event-based cameras only transmit changes of intensity. They do not suffer from motion blur and have a latency on the order of microseconds. Add to that a very high dynamic range and very low power consumption, and the cameras become suitable for virtual and augmented reality applications. “This is a potentially large market, but it is not completely clear when it will become really huge,” said Eng. “Right now, it is a niche.”


    iniVation’s CEO Kynan Eng
    Event-based cameras are also showing up in mobile devices. “The trick is to convince the handset makers to put yet another sensor on the back of their phones,” said Eng. Originally, iniVation and other players produced cameras that only used DVS pixels. “These are good for dealing with high speed changes, but in many cases, people just want to take pictures of themselves and their food,” Eng added. “What we did some years ago was to develop a sensor with our pixels and normal pixels. You can take normal pictures, do normal processing, and, for particular use cases, you can use our pixels.”

    Hence, handset makers don’t have to make an either-or decision, and the purchase price remains the same.
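    Eng’s description of combining “our pixels and normal pixels” on one chip corresponds to a hybrid readout that delivers both conventional frames and an asynchronous event stream. The sketch below shows one way a host application might consume such an output; the data structures, field names and numbers are illustrative assumptions, not iniVation’s or Samsung’s actual interface.

```python
from dataclasses import dataclass, field
from typing import List, Optional
import numpy as np

@dataclass
class Event:
    x: int
    y: int
    t_us: int          # microsecond timestamp
    polarity: int      # +1 brighter, -1 darker

@dataclass
class HybridSample:
    frame: Optional[np.ndarray]                          # conventional image, at frame rate
    events: List[Event] = field(default_factory=list)    # asynchronous events since the last frame

def handle(sample: HybridSample) -> None:
    # Frame path: ordinary pictures and classic computer vision, limited by the frame period.
    if sample.frame is not None:
        print(f"frame {sample.frame.shape}, mean brightness {sample.frame.mean():.1f}")
    # Event path: react between frames, with latency set by the event timestamps rather than
    # the frame period (the "particular use cases" Eng mentions, such as fast motion).
    if sample.events:
        span_us = sample.events[-1].t_us - sample.events[0].t_us
        print(f"{len(sample.events)} events spanning {span_us} microseconds")

# Usage with synthetic data: one 33 ms frame plus a short burst of fast events.
frame = np.full((260, 346), 128, dtype=np.uint8)
burst = [Event(x=170, y=110, t_us=1_000 + 50 * i, polarity=1) for i in range(200)]
handle(HybridSample(frame=frame, events=burst))
```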

    “A window for mobile will open in 2021 or 2022,” said Cambou. “Today, we have five cameras on the back of a Huawei phone.” Moving forward, he continued, “I don’t see anything else than an always-on neuromorphic camera. Some people talk about multispectral, but I am more thinking about always-on awareness.” An event-based camera could enable touchless interactions such as locking and unlocking phones.

    As with Prophesee’s collaboration with Sony, partnering with Samsung gives iniVation a leg up in the smartphone market. As always, Cambou said, it is a question of implementation. “The good idea of always-on awareness has been here for years, but now the question is how you implement it. It depends on the applications you serve and what improvements you offer in terms of customer experience.”

    Making cars see
    Event-based cameras are power-efficient because pixel activity is sparse; almost no energy is needed for “silent” pixels. That’s a selling point as autonomous vehicles transition from internal combustion to electric engines. For car companies, “power consumption is much more important than what I thought initially,” said Eng. “In their current planning for electric cars, if a car uses a 4 kW total power budget at constant speed, half of that is for moving the car and the other half is for the computing. Every watt you can save on the compute, you can add to the range of the car or have a smaller battery.”
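    Taking Eng’s figures at face value (a 4 kW budget at constant speed, split evenly between propulsion and computing), a rough calculation shows how directly compute savings convert into range. The battery capacity and cruising speed below are illustrative assumptions, not numbers from the article.

```python
# Back-of-envelope range arithmetic from Eng's 4 kW figure: 2 kW to move the car,
# 2 kW for computing. Battery capacity and speed are illustrative assumptions.
battery_kwh = 60.0        # assumed usable battery energy
speed_kmh = 60.0          # assumed constant cruising speed
propulsion_kw = 2.0
compute_kw = 2.0

def range_km(compute_load_kw: float) -> float:
    hours = battery_kwh / (propulsion_kw + compute_load_kw)
    return hours * speed_kmh

baseline = range_km(compute_kw)          # 60 kWh / 4 kW * 60 km/h = 900 km
leaner = range_km(compute_kw - 0.5)      # shave 500 W off the compute budget
print(f"baseline range: {baseline:.0f} km")
print(f"with 500 W less compute: {leaner:.0f} km ({leaner - baseline:+.0f} km)")
```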

    The company’s Dynamic Vision platform enables vehicle odometry, high-speed simultaneous localization and mapping (SLAM) in tough lighting conditions and automated driver assistance. “The promise of the autonomous car will increase as the processing technology develops,” said Eng. “We will have a hybrid sensor, which will have both the frames and the events so the car companies can continue using what they have spent billions developing.”


    Yole analyst Pierre Cambou
    Sensors are key to unlocking autonomous vehicles. They also generate a ton of data, and systems “are heavily limited by the processing power,” said Cambou. The addition of more cameras—and, with them, more data—means “the computing power explodes.” One solution is improving data quality. “If you really want to solve autonomy, you will need more diversity quickly,” the analyst said. “You will use lidars, thermal cameras, and hyperspectral cameras. I think car companies should also consider event-based cameras.”

    The potential of neuromorphic engineering remains largely untapped. Neuromorphic semiconductors, sensing and computing will become a $7.1 billion market by 2029, according to Yole. If all technical questions are solved within the next four to five years, the neuromorphic computing market could grow from $69 million in 2024 to $5 billion in 2029 and $21.3 billion in 2034. And the neuromorphic sensing market could rise from $34 million in 2024 to $2 billion in 2029 and $4.7 billion in 2034.
 