BRN (ASX) 25.5¢ (−5.56%) — BrainChip Holdings Ltd

I Want To Believe, page-224


    Below are article extracts that mention Nicolas Oros, Senior Research Scientist at BrainChip...

    https://hotcopper.com.au/data/attachments/1417/1417193-5c084c72ae9089ce6e63929da771c135.jpg
    https://www.linkedin.com/in/nicolas-oros-3779353a/


    He is a named inventor on two pending patent applications and on the patent recently granted to BrainChip:
    https://patents.google.com/patent/US20170229117A1
    https://patents.google.com/patent/US20170236051A1
    https://patents.google.com/patent/US20170236027A1


    Neurorobotics - A Thriving Community and a Promising Pathway Toward Intelligent Cognitive Robots
    Jeffrey L. Krichmar

    "Also, important around this time was the reemergence of neuromorphic engineering (Krichmar et al., 2015). Similar to the goal of neurorobotics, neuromorphic engineering was using inspiration from the brain to build devices, in this case computer architectures and sensors. Because these computers were specifically designed for spiking neural networks, algorithms that controlled neurorobots were ideal for these platforms. Our group demonstrated that a large-scale spiking neural network model of the dorsal visual stream could lead to effective obstacle avoidance and tracking on a robot (Beyeler et al., 2015). Working with IBM’s low-power TrueNorth (TN) neuromorphic chip (Esser et al., 2016), we demonstrated that a convolutional neural network could be trained to self-drive a robot on a mountain trail (Hwu et al., 2017). The robot and TN chip were all powered by a single hobby level nickel metal hydride battery (Figure 7)6. The circuit diagram and pipeline shown in Figure 7 can generalize to other hardware and neurorobot applications.



    Figure 7. A self-driving robot using deep convolutional neural networks on IBM’s TrueNorth (TN) neuromorphic hardware. (A) Photograph was taken in Telluride, Colorado where the robot autonomously traversed mountain trails. From left to right are Rodrigo Alvarez-Icaza (IBM), Jacob Isbell (University of Maryland), Tiffany Hwu (University of California, Irvine), Will Browne (Victoria University of Wellington), Andrew Cassidy (IBM), and Jeff Krichmar (University of California, Irvine). Missing from the photograph is Nicolas Oros (BrainChip). (B) On the left, the connectivity on the IBM TN neuromorphic chip. On the right, an image of IBM TN NS1e board used in the experiments. (C) Data pipeline for running the self-driving robot. Training was done separately with the Eedn MatConvNet package using Titan X GPUs. During testing, a Wi-Fi connection between the Android Galaxy S5 and IBM NS1e transmitted spiking data back and forth, using the TN Runtime API. Figure adapted from Hwu et al. (2017) with permission.


    Because of their low-power, event-driven architectures, recent developments in neuromorphic engineering hold great promise for neurorobot applications. In addition to our work on IBM’s chip, SpiNNaker has been used in a robot obstacle avoidance and random exploration task (Stewart et al., 2016). New chips are being developed, such as Intel’s Loihi that will support embedded neuromorphic applications (Davies et al., 2018). In addition to running neural networks on specialized hardware, very low power neuromorphic vision and auditory sensors are being developed (Liu and Delbruck, 2010; Stewart et al., 2016). Similar to biology, these sensors only respond to change or salient events, and when they do respond, it is with a train of spikes. This allows seamless integration of these sensors with spiking neural networks, and their event-driven nature leads to power efficiency that’s ideal for embedded systems (i.e., robots!)."
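    The event-driven behaviour the quote describes — sensors that stay silent until something changes, then emit spikes — can be sketched in a few lines. This is purely an illustrative toy (the function name `dvs_events`, the log-intensity comparison, and the threshold value are my own assumptions, not part of the paper or any BrainChip/IBM API); real event cameras operate asynchronously per pixel in hardware:

    ```python
    import numpy as np

    def dvs_events(prev_frame, frame, threshold=0.15):
        """Emit DVS-style events only where log-intensity change exceeds a threshold.

        Returns (rows, cols, polarities); polarity is +1 for brightening,
        -1 for dimming. A static scene produces no events at all, which is
        where the power saving of event-driven sensing comes from.
        """
        diff = np.log1p(frame.astype(float)) - np.log1p(prev_frame.astype(float))
        rows, cols = np.nonzero(np.abs(diff) > threshold)
        polarities = np.sign(diff[rows, cols]).astype(int)
        return rows, cols, polarities

    # Only the single changed pixel fires; the rest of the frame is silent.
    prev = np.zeros((4, 4), dtype=np.uint8)
    cur = prev.copy()
    cur[2, 3] = 200
    rows, cols, pols = dvs_events(prev, cur)
    ```

    Because the output is a sparse list of spike events rather than dense frames, it maps naturally onto spiking-network hardware of the kind discussed above.
    
    
    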

    "Another reason to be optimistic about the future of this field is that now anyone can be a Neuroroboticist. Although we occasionally need to make custom robots for a particular task, most of today’s robots can be constructed from kits, off-the-shelf parts and 3D printing for a fraction of the cost when I first entered this field. For example, Nicolas Oros, who was a postdoctoral scholar in our lab, constructed a low cost, yet highly capable robot with hobby-grade platforms and Android smartphones as the computing and sensing engine (Oros and Krichmar, 2013). We have used this Android based robot idea for a wide range of research and student projects. Similar to the days of Radio Shack, there is now an online hobbyist community that makes it easy to obtain all the components necessary to build sophisticated robots. Also, open source software has made it easy to get started on programming neural networks, controlling physical robots (e.g., Robotic Operating System7), and creating environments for virtual robots8. These advances make it easy for any researcher, student, or hobbyist to get started on a neurorobotics project."

    The full article can be found here:
    https://www.researchgate.net/publication/326418438_Neurorobotics-A_Thriving_Community_and_a_Promising_Pathway_Toward_Intelligent_Cognitive_Robots
    And here:
    https://www.frontiersin.org/articles/10.3389/fnbot.2018.00042/full

 