
2020 BRN Discussion, page-19066

    This is from the original research paper in which IBM created its data set for hand gesture recognition using a DVS camera, which we are now all familiar with, so I have only extracted the title, authors, Abstract and Acknowledgments. The critical part is that DARPA sponsored this research under contract.

    Following this extract I have included an extract from a DARPA article setting out its mission, which importantly makes clear in the final paragraph that, just like NASA, it is driven by SWaP (size, weight and power) requirements.

    I will leave the wild speculation to others, but who is the leading technology company with regard to size, weight and power, whose revolutionary, patent-protected, first-of-its-kind spiking convolutional neural network chip is currently being hardened by Vorago for what Vorago describes as the extreme conditions encountered in space and military applications?

    A Low Power, Fully Event-Based Gesture Recognition System


    Arnon Amir, Brian Taba, David Berg, Timothy Melano, Jeffrey McKinstry, Carmelo Di Nolfo, Tapan Nayak, Alexander Andreopoulos, Guillaume Garreau, Marcela Mendoza† , Jeff Kusnitz, Michael Debole, Steve Esser, Tobi Delbruck‡ , Myron Flickner, and Dharmendra Modha

    IBM Research {arnon, dmodha}@us.ibm.com †UC San Diego ‡UZH-ETH Zurich & iniLabs GmbH [email protected]

    Abstract
    We present the first gesture recognition system implemented end-to-end on event-based hardware, using a TrueNorth neurosynaptic processor to recognize hand gestures in real-time at low power from events streamed live by a Dynamic Vision Sensor (DVS). The biologically inspired DVS transmits data only when a pixel detects a change, unlike traditional frame-based cameras which sample every pixel at a fixed frame rate. This sparse, asynchronous data representation lets event-based cameras operate at much lower power than frame-based cameras. However, much of the energy efficiency is lost if, as in previous work, the event stream is interpreted by conventional synchronous processors. Here, for the first time, we process a live DVS event stream using TrueNorth, a natively event-based processor with 1 million spiking neurons. Configured here as a convolutional neural network (CNN), the TrueNorth chip identifies the onset of a gesture with a latency of 105 ms while consuming less than 200 mW. The CNN achieves 96.5% out-of-sample accuracy on a newly collected DVS dataset (DvsGesture) comprising 11 hand gesture categories from 29 subjects under 3 illumination conditions…

    7. Acknowledgments
    The authors thank everyone who participated in the DvsGesture dataset collection. This research was sponsored by DARPA under contract No. HR0011-09-C-0002. The views and conclusions contained herein are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of DARPA or the U.S. Government.
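
    To make the abstract above a little more concrete, here is a rough, hypothetical sketch in Python/NumPy of the kind of event-to-frame binning that is commonly used to feed DVS data into a CNN. This is not IBM's code and involves no TrueNorth hardware; the 128x128 sensor size, 10 ms bin width and the randomly generated events are all assumptions purely for illustration.

    # Hypothetical illustration only -- not IBM's pipeline, just a toy example
    # of how sparse DVS events differ from dense frames and how they can be
    # binned into 2-channel frames (ON/OFF polarity) as CNN input.
    import numpy as np

    # A DVS event is (timestamp_us, x, y, polarity); a pixel only fires on change.
    rng = np.random.default_rng(0)
    num_events = 5000
    events = np.zeros(num_events, dtype=[("t", np.int64), ("x", np.int16),
                                         ("y", np.int16), ("p", np.int8)])
    events["t"] = np.sort(rng.integers(0, 1_000_000, num_events))  # 1 s of data
    events["x"] = rng.integers(0, 128, num_events)  # assumed 128x128 sensor
    events["y"] = rng.integers(0, 128, num_events)
    events["p"] = rng.choice([0, 1], num_events)    # OFF/ON polarity

    def events_to_frames(ev, sensor=(128, 128), bin_us=10_000):
        """Accumulate asynchronous events into fixed-duration two-channel
        frames (one channel per polarity), a common way to feed a CNN."""
        num_bins = int(ev["t"].max() // bin_us) + 1
        frames = np.zeros((num_bins, 2, *sensor), dtype=np.float32)
        bins = ev["t"] // bin_us
        np.add.at(frames, (bins, ev["p"], ev["y"], ev["x"]), 1.0)
        return frames

    frames = events_to_frames(events)
    sparsity = 1.0 - np.count_nonzero(frames) / frames.size
    print(f"{len(frames)} frames, {sparsity:.1%} of frame pixels received no events")
    # That sparsity is what event-driven hardware such as TrueNorth exploits:
    # unchanged pixels generate no events, so no work (or power) is spent on them.

    With these toy numbers almost every cell stays empty, which is the intuition behind the low-power figures quoted in the abstract, though the real DvsGesture recordings and the TrueNorth mapping are of course far more involved.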
    --------------------------------------------------------------------------------

    https://www.darpa.mil/program/microbrain

    Defense Advanced Research Projects Agency, Our Research: µBRAIN

    Dr. Jiangying Zhou

    The past decade has seen explosive growth in development and training of artificial intelligence (AI) systems. However, as AI has taken on progressively more complex problems, the amount of computation required to train the largest AI systems has been increasing ten-fold annually. While AI advances are beginning to have a deep impact in digital computing processes, trade-offs between computational capability, resources and size, weight, and power consumption (SWaP) will become increasingly critical in the near future.

    Current neuromorphic/neural architectures rely on the digital computing architectures that attempt to mimic the way nature computes, but not the way it functions. Actual physical interactions and mechanisms that could enable improved engineered function as observed in bio-systems, such as miniature insects, remain to be fully described.

    µBRAIN will explore innovative basic research concepts aimed at understanding highly integrated sensory and nervous systems in miniature insects and developing prototype computational models that could be mapped onto suitable hardware to emulate their impressive function. Nature has forced on these small insects drastic miniaturization and energy efficiency, some having only a few hundred neurons in a compact form-factor, while maintaining basic functionality.

    This research could lead to capability of inference, prediction, generalization, and abstraction of problems in systematic or entirely new ways in order to find solutions to compelling problems. The primary goal is to understand the computational principles, architecture, and neuronal details of small bio-systems driven by extreme SWaP needs in nature. By doing so, DARPA aims to identify new computing paradigms that would enable improved AI with considerably reduced training times and power consumption.

    My opinion only DYOR.

 