
Sunrise and Landrush

    So as a few of you know, I've been developing with the Akida Development Environment for a while now. I have put a few of the basic projects I built in the process of learning onto GitHub. Building a few of the demos has given me a bit of insight, and I thought I might share it with the other long term holders who are generally interested in this stuff.


    One of the things I built was a crude imitation of the Tiger one-shot learning demo that Brainchip has presented to various audiences, e.g. teach Akida a tiger, then show it a picture of a tiger. After building a version of this demo and playing with it using a cheap webcam, pointing it at various things and teaching it different objects, I realised just how revolutionary this technology is and how easy it will be to add personalisation to different products. This is a really fun demo to play with because it is so basic yet so powerful.
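    To give an idea of what one-shot learning boils down to, here is a pure-Python sketch: store a single feature vector per class and classify by nearest prototype. To be clear, this is NOT the Akida API and not how the chip does it internally (Akida learns on-chip in its spiking layers via the akida package); the class names and vectors below are made up purely for illustration.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

class OneShotClassifier:
    """Keeps one prototype feature vector per class and classifies a
    new sample by its most similar prototype. Concept sketch only."""

    def __init__(self):
        self.prototypes = {}  # label -> feature vector

    def learn(self, label, features):
        # "One-shot": a single example becomes the class prototype.
        self.prototypes[label] = features

    def classify(self, features):
        if not self.prototypes:
            return None
        return max(self.prototypes,
                   key=lambda lbl: cosine(self.prototypes[lbl], features))

clf = OneShotClassifier()
clf.learn("tiger", [0.9, 0.1, 0.2])   # one example frame of a tiger
clf.learn("mug",   [0.1, 0.8, 0.7])   # one example frame of a mug
print(clf.classify([0.85, 0.15, 0.25]))  # lands nearest the tiger prototype
```

    The point of the sketch is that one example per class is enough to start classifying, which is why teaching the demo something new through a webcam feels instant.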


    You can download this repository here:


    https://github.com/stdp/akida-camera


    Learning how to use Akida with a Camera Feed


    This is an extremely basic way to utilise the Akida on-chip learning functionality. The demo will let you learn new classes of objects to recognise in the camera feed.


    Running and using the example

    1. Run python3 akida_camera.py

    2. Press 1 to 0 on your keyboard to learn a new class

    3. Press s to save the newly learnt classes into your model for later use
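    The control flow behind those three steps can be sketched as a simple key dispatch loop. This is a hypothetical standalone version for illustration: the real demo reads keypresses from its camera window, whereas here the keys are simulated so the loop logic stands on its own.

```python
# Keys "1".."0" map to ten class slots, "s" saves what has been
# learnt, "q" quits. Sketch of the demo's loop, not its actual code.
LEARN_KEYS = {str(d): slot for slot, d in enumerate([1, 2, 3, 4, 5, 6, 7, 8, 9, 0])}

def handle_key(key, learned, save_fn):
    """Dispatch one keypress; returns False when the loop should exit."""
    if key in LEARN_KEYS:
        learned.append(LEARN_KEYS[key])   # learn current frame as this class slot
    elif key == "s":
        save_fn(learned)                  # persist the newly learnt classes
    elif key == "q":
        return False
    return True

saved = []
learned = []
for key in ["1", "2", "s", "q"]:          # simulated key presses
    if not handle_key(key, learned, saved.extend):
        break
print(learned, saved)
```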


    Understanding how useful instant personalisation will be, and how many reasons people have to want it in a product, made it clear to me why companies want strict NDAs. We are currently in the Sunrise phase of this new technology. Those with the capability to build with this technology will get their ideas to market first, ahead of the competition. Brainchip is bringing a product to market that competing products cannot match. Those who build products with this functionality will be capitalising on capabilities that have never existed before.


    The next phase will be when the Raspberry Pi modules are made available. This will be the Landrush phase. Those with the knowledge will have a chance to stake their claim before that knowledge becomes mainstream. This will be the start of Akida saturating the market.


    all imo


    Sunrise and Landrush, e.g. https://help.iwantmyname.com/hc/en-gb/articles/360015456137-What-are-sunrise-and-landrush-periods-



    There are a few other projects to try out and build with:


    https://github.com/stdp/akida-yolo


    Learning how to use Akida with a Camera Feed running YOLO


    A basic example of using Akida with a camera feed running YOLO to identify people and cars within frame.
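    Restricting a YOLO feed to people and cars is essentially a post-processing filter over the detector's output. A minimal sketch of that step, assuming a simple (label, confidence, bbox) tuple per detection; the actual repo's detection format and threshold may differ.

```python
# Keep only "person" and "car" boxes above a confidence threshold.
# The detection tuples below are invented for illustration.
TARGET_CLASSES = {"person", "car"}
CONF_THRESHOLD = 0.5

def filter_detections(detections):
    """Drop detections outside the target classes or below threshold."""
    return [d for d in detections
            if d[0] in TARGET_CLASSES and d[1] >= CONF_THRESHOLD]

raw = [
    ("person", 0.92, (10, 20, 60, 180)),
    ("dog",    0.88, (200, 40, 80, 60)),   # not a target class
    ("car",    0.31, (5, 5, 50, 30)),      # below threshold
    ("car",    0.77, (300, 100, 120, 80)),
]
print(filter_detections(raw))
```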


    This repo has a branch using the face detection YOLO: https://github.com/stdp/akida-yolo/tree/feature/yolo-face



    I built a basic Not Hotdog app, as seen in Silicon Valley. World's first neuromorphic Not Hotdog?


    https://github.com/stdp/akida-not-hotdog


    Not Hotdog powered by Akida One-Shot Learning


    A classy example of using Akida's One-Shot Learning to determine whether the object is a hotdog or not a hotdog. This is an extremely loose example that takes only two images: one hotdog and one not hotdog. There is a training dataset out there specifically for the Not Hotdog app, but I just wanted to experiment with a single image of a hotdog.
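    With only two reference images, the decision reduces to asking which stored example the new frame is closer to. A pure-Python sketch of that comparison; the feature vectors are made up, and in the real demo they would come from the Akida model rather than this code.

```python
def sq_dist(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Hypothetical feature vectors for the two reference images.
hotdog_ref     = [0.8, 0.2, 0.1]
not_hotdog_ref = [0.1, 0.7, 0.9]

def is_hotdog(features):
    """Closer to the hotdog reference than the not-hotdog one?"""
    return sq_dist(features, hotdog_ref) < sq_dist(features, not_hotdog_ref)

print(is_hotdog([0.75, 0.25, 0.2]))   # True
print(is_hotdog([0.2, 0.6, 0.8]))     # False
```

    A real app would also want a similarity threshold so an object unlike either reference isn't forced into one of the two classes, which is part of why this example is "extremely loose".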



    And a basic example that is easy to follow as an introduction to the Akida Development Environment:


    https://github.com/stdp/learning-akida


    Learning how to use the Akida Execution Engine


    This is an extremely basic introduction to utilising the Akida Execution Engine. Step through this guide to learn the different stages of creating an Akida model and then running inference over a few images. The source images are scaled and distorted because this was a quick demo.
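    The stages the guide walks through (load an image, squeeze it to the model's input size, run inference, read off a class) can be sketched end to end like this. The predict() stub below stands in for the real Akida Execution Engine call, and the crude resampling shows where the scaling/distortion mentioned above comes from; everything here is illustrative, not the repo's actual code.

```python
def preprocess(pixels, size):
    """Crudely resample a 1-D pixel list down to `size` samples --
    the quick-and-dirty scaling that distorts the source images."""
    step = len(pixels) / size
    return [pixels[int(i * step)] for i in range(size)]

def predict(inputs):
    """Stub model: class 1 if mean intensity is high, else class 0.
    In the real guide this is an inference call on the Akida model."""
    return 1 if sum(inputs) / len(inputs) > 127 else 0

image = list(range(0, 256, 2))      # fake 128-pixel greyscale "image"
inputs = preprocess(image, 16)      # stage 1: shrink to model input size
print(len(inputs), predict(inputs)) # stage 2: run inference, get a class
```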


    ---


    Let me know if you build anything using these projects. Hopefully these demos will help a few of you learn how to use Akida. I will be updating them to run specifically on a Raspberry Pi, using a Pi camera, when the hardware becomes available. Long term holders, if you need some advice on getting these demos running, feel free to ask.

 