End-to-End Edge Neuromorphic Object Detection System
D. A. Silva¹, A. Shymyrbay¹, K. Smagulova¹, A. Elsheikh², M. E. Fouda³,† and A. M. Eltawil¹
1. Department of ECE, CEMSE Division, King Abdullah University of Science and Technology, Thuwal 23955, Saudi Arabia
2. Department of Mathematics and Engineering Physics, Faculty of Engineering, Cairo University, Giza 12613, Egypt
3. Rain Neuromorphics, Inc., San Francisco, CA, 94110, USA
Abstract—Neuromorphic accelerators are emerging as a potential solution to the growing power demands of Artificial Intelligence (AI) applications. Spiking Neural Networks (SNNs), which are bio-inspired architectures, are being considered as a way to address this issue. Neuromorphic cameras, which operate on a similar principle, have also been developed, offering low power consumption, microsecond latency, and robustness in various lighting conditions. This work presents a full neuromorphic system for Computer Vision, from the camera to the processing hardware, with a focus on object detection. The system was evaluated on a compiled real-world dataset and a new synthetic dataset generated from existing videos, and it demonstrated good performance in both cases. The system was able to make accurate predictions while consuming 66 mW, with a sparsity of 83% and a time response of 138 ms.
Index Terms—SNN, DVS, Neuromorphic, Akida, YOLO
In this work, we implemented an end-to-end fully neuromorphic object detector using an event-based DVXplorer Lite sensor by iniVation [14] and the event-based Akida processor by Brainchip [15]. The detector is based on an AkidaNet [16] backbone, designed by Brainchip and optimized for the Akida chip, and a YOLOv2 [17] detection head, whose structure is compatible with the Akida constraints. The main purpose of this work is to show the potential of a fully spiking system by creating an object detector with a small memory footprint, low power consumption, and real-time response. The contributions of this work are as follows: (1) implementation of the first end-to-end Akida-based fully spiking system for object detection in the literature; to the best of our knowledge, the only prior work implementing object detection on an actual spiking platform was deployed on a SpiNNaker 3 [12]; (2) generation of a new synthetic event-based dataset based on recordings of traffic junctions; and (3) evaluation of the system's performance, power consumption, and time response, showing that it can perform real-time detections with low power consumption.
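To make the camera-to-chip pipeline concrete, here is a minimal sketch (not taken from the paper) of how DVXplorer events might be accumulated into fixed-time-window frames before being passed to the AkidaNet/YOLOv2 detector. The sensor resolution, the two-channel ON/OFF encoding, the 10 ms window, and the `events_to_frame` helper are all illustrative assumptions; the excerpt does not specify the actual preprocessing used.

```python
import numpy as np

def events_to_frame(events, height=240, width=320, window_us=10_000):
    """Accumulate DVS events into a 2-channel (OFF/ON) count frame.

    `events` is assumed to be an (N, 4) array of
    (timestamp_us, x, y, polarity) rows; the resolution and window
    length are illustrative values, not figures from the paper.
    """
    frame = np.zeros((height, width, 2), dtype=np.uint8)
    if events.shape[0] == 0:
        return frame
    t0 = events[0, 0]
    window = events[events[:, 0] - t0 < window_us]
    for _, x, y, polarity in window:
        channel = 1 if polarity > 0 else 0
        # Saturating per-pixel, per-polarity event count.
        frame[int(y), int(x), channel] = min(frame[int(y), int(x), channel] + 1, 255)
    return frame

# The resulting frame would then be fed to the quantized AkidaNet +
# YOLOv2 model mapped onto the Akida chip, e.g. detections = detector(frame).
```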
For object detection with SNNs, there is a lack of reported power and processing-time measurements on actual spiking setups; the only results published to date come from Attention RPN-SNN [12], in which a SpiNNaker 3 neuromorphic chip was deployed for detection tasks, reporting a total processing time of 35.3 s and a power consumption of 600 mW.
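As a rough back-of-the-envelope comparison (not part of the original text), assuming energy per detection can be approximated as reported power multiplied by processing time:

```python
# Figures quoted above: this work (Akida AKD1000) vs. Attention RPN-SNN
# on SpiNNaker 3 [12].
akida_power_w, akida_latency_s = 0.066, 0.13888
spinn_power_w, spinn_latency_s = 0.600, 35.3

akida_energy = akida_power_w * akida_latency_s   # ~9.2 mJ per detection
spinn_energy = spinn_power_w * spinn_latency_s   # ~21.2 J per detection

print(f"Throughput: {1 / akida_latency_s:.1f} vs {1 / spinn_latency_s:.3f} detections/s")
print(f"Energy per detection: {akida_energy * 1e3:.1f} mJ vs {spinn_energy:.1f} J")
```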
VI. CONCLUSION AND FUTURE WORK
This work presented a low-power, real-time fully spiking neuromorphic system for object detection based on iniVation's DVXplorer Lite event-based camera and Brainchip's Akida AKD1000 spiking platform. The system was evaluated on three different datasets comprising real-world and synthetic samples. The final mapped model achieved an mAP of 28.58 on the GEN1 dataset, equivalent to 54% of a more complex state-of-the-art model, and reached 89% of the best-reported detection performance on the single-class PEDRo dataset with 17x fewer parameters. A power consumption of 66 mW and a latency of 138.88 ms were measured, making the system suitable for real-time edge applications.
In future work, additional models are expected to be adapted to the Akida platform, including more recent releases of the YOLO family. We also plan to evaluate those models in live real-world scenarios rather than on recordings, and to acquire more data so the setup can be assessed under different challenging conditions.
VII. ACKNOWLEDGEMENT
This work has been partially supported by the King Abdullah University of Science and Technology CRG program under grant number URF/1/4704-01-01.
We would also like to thank Edge Impulse and Brainchip for providing the software tools and hardware platform used during this work.
According to his LinkedIn, Mr. Fouda has been employed at Rain AI for 2 years and 5 months and currently holds the position of Applied Research Lead.
I found all of the above very interesting, particularly given that some might wonder why, with Mr. Fouda's connections at Rain AI, they chose to use AKIDA technology.
My opinion only, DYOR.
Fact Finder