PAPER • THE FOLLOWING ARTICLE IS OPEN ACCESS
Edge neuro-statistical learning for event-based visual motion detection and tracking in roadside safety systems
Cristian Axenie*, Ertan Halilov, Julian Main and David Weiss
Published 22 April 2025 • © 2025 The Author(s). Published by IOP Publishing Ltd
Neuromorphic Computing and Engineering, Volume 5, Number 2. Focus on Event-based Neuromorphic Systems for Dynamic Signal Processing.
Citation: Cristian Axenie et al 2025 Neuromorph. Comput. Eng. 5 024003. DOI 10.1088/2634-4386/adcbcb

In practice, our proposed solution for pedestrian detection is a real-time edge processing system running entirely on an embedded neuromorphic computing platform that accelerates the detection and tracking of pedestrians and bicyclists. For more information about the project and our solution design, architecture, and deployment performance, you can access the TinyML Vision Zero San Jose Competition Final at www.youtube.com/watch?v=ZhBCtfalcOk&t=2872s&ab_channel=tinyML.

The societal impact of the project is very high, as it contributes to feasible, rapid, and affordable improvements in roadside safety in large urban areas. With the planned immediate deployment of the technology, which leverages and improves existing infrastructure (i.e. gantry mounts), our solution also offers high economic benefits. The three main socially relevant aspects are: (1) detection and identification of abnormal pedestrian and bicyclist events (i.e. changes in motion direction and velocity) along public roads in real time, regardless of weather conditions (e.g. clouds, Sun, rain) or time of day (e.g. dusk, dawn, or night); (2) already known and monitored 'hot spots' can now be augmented by the solution, which preserves the privacy of the detected and tracked individuals yet reliably detects their abnormal motions; (3) only salient features of the public road visual scene are extracted while preserving privacy (i.e. no identity reconstruction from DVS events), so that only relevant cues are used in deciding whether the scene is normal or abnormal.

Using the BrainChip Akida neuromorphic platform to process event-based vision data on-site reduces system costs by a factor of 20 compared to conventional centralized GPU processing. For a quantitative comparison, in large quantities the event-based camera costs $50 and the custom neuromorphic board developed by BrainChip with a Raspberry Pi costs $300, compared to the GPU cost plus a price of $1000 per COTS camera, which typically focuses on wider operating temperatures, high resolution, autofocus, vibration correction, improved optics, and dynamic range, with or without additional light sources. A BrainChip Akida system with a DVS event camera consumes at most 7.6 W when processing camera data on-site. Conventional real-time vision processing systems (sometimes also connected to centralized data centers) require specialized GPUs operating at around 400 W per COTS camera. Hence, the neuromorphic design enables at least a 100-fold power reduction; across 100 million expected devices, this continuously saves about 40 GW. This quantitative estimate assumes 100 million units × (400 W − 4 W) = 39.6 GW. In addition, on-site processing reduces transmitted data from about 30 MB s−1 in conventional systems to 300 B s−1, a 10 000-fold reduction. For this calculation, we assume 25 bytes per object (8 for the timestamp, 16 for camera coordinates, 1 for the class) and 10–12 objects detected per second: 25 × 12 = 300 B. Event cameras detect changes around 33 times faster than conventional cameras and can be used at night and in poor weather conditions due to their wide dynamic range of up to 120 dB.
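For anyone who wants to sanity-check the savings figures quoted above, here is a minimal back-of-the-envelope script (my own sketch in Python, not from the paper) using the paper's stated assumptions: 400 W per GPU-based camera, the ~4 W per-unit figure used in the savings estimate, 100 million devices, and 25 bytes per detected object at about 12 objects per second.

```python
# Back-of-the-envelope check of the savings figures quoted in the paper.
# All inputs are the paper's stated assumptions, not independent measurements.

GPU_POWER_W = 400.0      # conventional GPU processing per COTS camera
EDGE_POWER_W = 4.0       # per-unit figure used in the paper's savings estimate
UNITS = 100_000_000      # 100 million expected devices

power_saved_gw = UNITS * (GPU_POWER_W - EDGE_POWER_W) / 1e9
print(f"Continuous power saving: {power_saved_gw:.1f} GW")  # ~39.6 GW

BYTES_PER_OBJECT = 8 + 16 + 1   # timestamp + camera coordinates + class
OBJECTS_PER_SECOND = 12
edge_bandwidth_bps = BYTES_PER_OBJECT * OBJECTS_PER_SECOND
print(f"Edge uplink: {edge_bandwidth_bps} B/s")             # 300 B/s
```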
In this paper, we describe an award-winning solution for neuromorphic edge processing for roadside safety monitoring. Our solution is a practical deployment of neuromorphic technology in which event streams from a neuromorphic camera are processed simultaneously by an SNN, used for detection, and an event-based EM, used for tracking pedestrians and bicyclists. By exploiting the underlying spatiotemporal patterns of the event-based data in the perceived roadside scene, the system learns a representation of the sparse input data context based on the imposed physics of motion. Our analysis, experiments, and preliminary deployment demonstrate that our solution is a promising candidate for roadside real-time neuromorphic edge detection and tracking, delivering robust and accurate performance, a very good energy footprint, and support for city-level scaling. We are currently working together with the city of Treuchtlingen in Germany to mount the system on gantries at dangerous intersections of the town.
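To make the dual-path design a little more concrete, below is a purely illustrative Python sketch of how an event stream could be fanned out to a detection model and an event-driven tracker in parallel. All names (EventBatch, spiking_detector, em_tracker, process_stream) are my own placeholders; they do not reproduce the paper's SNN, its EM tracker, or anything running on Akida hardware.

```python
# Illustrative sketch only: fan a DVS event stream out to a detector and a tracker.
# In the real system the detection path is an SNN on BrainChip Akida and the
# tracking path is the paper's event-based EM; both are stubbed out here.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class EventBatch:
    # (x, y, timestamp_us, polarity) tuples from a DVS event camera
    events: List[Tuple[int, int, int, int]]

def spiking_detector(batch: EventBatch) -> List[str]:
    """Placeholder for the SNN that detects pedestrians/bicyclists."""
    return ["pedestrian"] if batch.events else []

def em_tracker(batch: EventBatch,
               tracks: List[Tuple[float, float]]) -> List[Tuple[float, float]]:
    """Placeholder tracker: append the centroid of the current event batch."""
    if not batch.events:
        return tracks
    xs = [e[0] for e in batch.events]
    ys = [e[1] for e in batch.events]
    return tracks + [(sum(xs) / len(xs), sum(ys) / len(ys))]

def process_stream(batches: List[EventBatch]) -> None:
    tracks: List[Tuple[float, float]] = []
    for batch in batches:
        detections = spiking_detector(batch)   # detection path
        tracks = em_tracker(batch, tracks)     # tracking path
        print(detections, tracks[-1] if tracks else None)

process_stream([EventBatch(events=[(10, 20, 1000, 1), (12, 22, 1010, 1)])])
```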
The lead author's biography:
Dr. Axenie is Professor of Artificial Intelligence and Research Group Leader of the SPICES Lab at the Technische Hochschule Nürnberg Georg Simon Ohm. After earning a Dr. Eng. Sc. in Neuroscience and Robotics from TUM in 2016, Dr. Axenie spent one more year with the TUM Center of Competence Neuroengineering before joining the Huawei Research Center in Munich. Between 2017 and 2023, Dr. Axenie was a Staff AI and ML Research Engineer at the Huawei Research Center and, at the same time, head of the Audi Konfuzius-Institut Ingolstadt Laboratory at Technische Hochschule Ingolstadt. Dr. Axenie is a seasoned researcher with more than 15 years of academic research and more than 7 years of industrial research experience. He earned an M.Sc. in Robotics and Intelligent Control and a B.Sc. in Automation and Industrial Informatics. His research has been disseminated in more than 50 peer-reviewed publications and more than 10 patents.
These researchers presented their solution at the TinyML Vision Zero San Jose Competition Final, linked above.
My opinion only DYOR
Fact Finder