Bringing Touch to the Edge: A Neuromorphic Processing Approach For Event-Based Tactile Systems
Harshil Patel
Brainchip Research Institute
Perth, Australia
Email: [email protected]
Anup Vanarse
Brainchip Research Institute
Perth, Australia
Email: [email protected]
Kristofor D. Carlson
Brainchip Inc.
Laguna Hills, USA
Email: [email protected]
Adam Osseiran
Brainchip Research Institute
University of Western Australia
Perth, Australia
Email: [email protected]
Abstract— The rise of neuromorphic applications has
highlighted the remarkable potential of biologically-inspired
systems. Despite significant advancements in audio and visual
technologies, research directed towards tactile sensing has not
been as extensive. We propose a neuromorphic tactile system
for sensing and processing that presents promising results for
edge devices and applications. In this study, a neuromorphic
tactile sensor, two data encoding techniques, and a two-layer
spiking neural network (SNN) deployed on the AKD1000
Akida Neuromorphic System on Chip (NSoC) were used to
demonstrate the system's capabilities. Results from
experiments on the ST-MNIST dataset showed high accuracy,
with the complement-coded variant achieving 93.1%,
outperforming previous state-of-the-art models for this dataset.
Additionally, an exploratory study showed that early
classification was possible, with most samples requiring only
38% of the available events to classify correctly, reducing the
amount of data that needs to be processed. The low power
consumption and high throughput of both SNN models, with
an average dynamic power consumption of 6.37 mW and 7.76
mW and an average throughput of 586 and 589 frames-per-
second respectively, make the proposed system suitable for
edge devices with limited power and processing resources.
Overall, the proposed tactile sensing system presents a
promising solution for edge applications that require high
accuracy, low power consumption, and high throughput. …
This study suggests that the proposed early-classification
pipeline holds the potential to significantly reduce
classification latency for real-time systems, thus avoiding the
trade-off between sample capture time and accuracy in time-
critical environments. For instance, in applications such as
autonomous vehicle control, faster inferences could be
critical, and making inferences during data capture could
provide intermediary results before a final classification is
made with the complete data sample. By reducing the latency
associated with the data collection window, this system could
enable real-world systems to make classifications with even
incomplete data before making a final high-confidence
classification.
V. CONCLUSION
The proposed neuromorphic system for tactile sensing
and processing presents promising results, offering a solution
for edge devices. In this study, two data encoding techniques
were used in combination with a two-layer SNN deployed on
the Akida NSoC. Results from experiments on the ST-
MNIST dataset showed high accuracy, with the complement
coded variant achieving 93.1%, outperforming previous
state-of-the-art models for this dataset. Additionally, it was
found that early classification was possible, as most samples
could be correctly classified with just 38% of the available
events, thus enabling real-time systems to reduce latency
induced by data collection. The low power consumption and
high throughput of both SNN models, with average dynamic
power consumption of 6.37 mW and 7.76 mW, and average
throughput of 586 and 589 frames-per-second respectively,
make this system suitable for edge devices with limited
power and processing resources. In conclusion, the proposed
tactile sensing system presents a promising solution for edge
applications, with high accuracy, low power consumption,
and high throughput.
https://ieeexplore.ieee.org/abstract/document/10168592/
I have highlighted in red a most significant feature of AKIDA technology.
This characteristic is very human. When driving, we all see things ahead of us and, without waiting to be completely sure that it is a person in an Italian suit complemented by a Gucci tie, we decide that it is a human who could present a hazard, and we react by reducing speed, hovering over the brake pedal, changing lanes and so on.
In other words, we do not require 100% of the data to start activating our response mechanism. For us as human drivers, even the shadow of a person starting to appear can be enough data.
Unlike von Neumann compute, which needs a whole image of the object before it can process it, AKIDA can, just like a human, begin processing the data coming from the camera as it arrives and reach correct classifications with only 38% of the incoming data.
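For anyone who wants to see what "making inferences during data capture" actually looks like, here is a rough sketch of an early-classification loop. This is my own illustration with made-up names, frame sizes and thresholds, not code from the paper: the idea is simply to re-run the model on the events accumulated so far and commit to a label once the prediction is confident enough.

```python
import numpy as np

def early_classify(event_stream, model, confidence=0.9, step=50):
    """Commit to a label as soon as the prediction is confident enough,
    instead of waiting for the full capture window.

    `event_stream` yields (x, y, polarity, timestamp) tuples and `model`
    is any callable returning class probabilities for an accumulated
    frame -- names and thresholds here are illustrative only.
    """
    frame = np.zeros((10, 10), dtype=np.float32)   # accumulated taxel activity
    n = 0
    for n, (x, y, polarity, t) in enumerate(event_stream, start=1):
        frame[y, x] += 1.0
        if n % step == 0:                          # re-run inference periodically
            probs = model(frame)
            label = int(np.argmax(probs))
            if probs[label] >= confidence:
                return label, n                    # early decision + events used
    probs = model(frame)                           # fall back to the full sample
    return int(np.argmax(probs)), n
```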
We all know that driving safely is about having sufficient time to react to whatever occurs. A ton of metal does not stop instantly; it always takes some time, no matter who or what is at the wheel, and the more time who or what has, the better the outcome.
The revolutionary nature of AKIDA technology continues to confound, and that is even before the original purpose of this paper comes into play: proving out a method for AKIDA technology to bring touch to robotics in the same way that human skin does.
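And for those wondering what the "complement coded" input mentioned in the abstract might look like, the usual trick is to pair every normalised feature x with its complement 1 - x. Here is a minimal sketch of that idea; the frame shape and normalisation are my assumptions, not the authors' actual preprocessing.

```python
import numpy as np

# Illustration only: the paper's exact preprocessing is not spelled out here,
# so the normalisation and frame shape below are assumptions.
def complement_code(frame):
    """Pair each normalised feature x with its complement (1 - x)."""
    x = frame.astype(np.float32)
    if x.max() > 0:
        x = x / x.max()                            # normalise to [0, 1]
    return np.concatenate([x, 1.0 - x], axis=-1)   # doubles the feature width

# Example: a 10x10 grid of per-taxel event counts from a tactile sensor
counts = np.random.randint(0, 5, size=(10, 10))
encoded = complement_code(counts)
print(encoded.shape)                               # (10, 20)
```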
My opinion only DYOR
Fact Finder