WBT (Weebit Nano Ltd) $2.27 (-0.44%)

Wondering what a spiking neural network is?

    I've had a look around for a good description explaining the science/technology of spiking neural networks. But it turns out most people who understand it are no good at explaining it. The best video on it I've seen is Vsauce's "The Stilwell Brain" - though it's only available to YouTube Premium subscribers. That video simulates a spiking neural network using humans; it's pretty cool. They basically use people to reproduce a simple version of the below:



    What you're seeing here is an input of pixels arranged in the shape of a numeric digit on a screen at the front (that's basically working like your retina). Behind that, layers of simulated neurons are "excited" by the information on that screen. Suppose each neuron in the first layer watches a grid of 9 retinal "cells" (pixels). If only one of those 9 cells is white, the neuron will only be a little bit excited. But once a certain threshold is reached - perhaps 4 cells being white - the neuron is excited enough to "spike" and fire a signal onward. Each first-layer neuron is in turn connected to multiple neurons in the next layer, and those will only "spike" if enough first-layer neurons send them a signal. In the video you can see this as a progressive cascade of activation from the front screen to the back output. In contrast, the other types of network in the video are basically in constant activation front-to-back.
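    To make that cascade concrete, here's a toy sketch in Python (my own illustration - nothing from the video or from Weebit, and the grid, thresholds and wiring are all made up) of threshold neurons fed by a 9-pixel "retina":

```python
# A tiny two-layer spiking cascade: pixels -> first-layer neurons -> output.
RETINA = [
    1, 1, 1,
    0, 1, 0,
    0, 1, 0,
]  # a crude "T" shape; 1 = white pixel

def spikes(inputs, threshold=4):
    """A neuron 'spikes' (returns 1) only once enough inputs are active."""
    return 1 if sum(inputs) >= threshold else 0

# First layer: for simplicity, each of 3 neurons watches the whole 9-pixel grid.
layer1 = [spikes(RETINA) for _ in range(3)]

# Output neuron: fires only if at least 2 first-layer neurons spiked.
output = spikes(layer1, threshold=2)
print(layer1, output)  # [1, 1, 1] 1
```

    The point is just the cascade: nothing downstream activates until enough upstream inputs cross a threshold, which is what distinguishes this from a conventional network that's active front-to-back all the time.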

    In software, simulating this process requires bits in memory to track every "level" of excitation that can apply to each neuron. If a neuron will spike when it has received three or four inputs, it needs at least two bits of memory to track its state (probably more - I don't know whether these simulated networks model adaptation effects like real neurons, but if they do, a lot more information needs to be tracked than just the stimulation needed for initial spiking). As a simulated neural network grows in complexity, the memory required grows with the number of neurons and connections. The CPU also has to read and update all of that state to work out what each neuron should do next - so the simulation is both memory-intensive and CPU-intensive.
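    A quick sketch of that bookkeeping (again my own illustration, under the simplest possible assumptions - no adaptation, no leak): the memory just stores a count per neuron, and the CPU does all the work of updating it on every input.

```python
import math

def bits_to_track(threshold):
    """Bits needed to count 0..threshold-1 accumulated inputs between spikes."""
    return max(1, math.ceil(math.log2(threshold)))

class SimNeuron:
    """CPU-side simulation: RAM stores the count, the CPU does the updating."""
    def __init__(self, threshold):
        self.threshold = threshold
        self.level = 0  # excitation level, held in ordinary memory

    def receive(self):
        self.level += 1              # CPU reads, increments, writes back
        if self.level >= self.threshold:
            self.level = 0           # reset after firing
            return True              # spike
        return False

print(bits_to_track(4))                     # 2 bits, as described above
n = SimNeuron(threshold=4)
print([n.receive() for _ in range(4)])      # [False, False, False, True]
```

    Every simulated input means a read-modify-write round trip between CPU and memory, for every neuron - which is where the memory- and CPU-intensity comes from.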

    Where Weebit comes in is that individual Weebit memory cells ("bits") don't have to change state from 0 to 1 with a single input; a cell can instead behave like a brain neuron, with successive inputs bringing it closer to the threshold that causes the bit to flip from 0 to 1. Thus the behaviour of the neuron can be modelled directly by the memory bits themselves, instead of the CPU keeping track of it while the memory just stores each neuron's current state.
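    Modelled in code, the idea as I read it looks something like this - a hedged sketch only, since I'm guessing at the behaviour (the class, parameter names and threshold are my own invention, not anything Weebit has published). The cell itself accumulates successive pulses and flips at a threshold, so no per-input CPU bookkeeping is needed:

```python
class ThresholdCell:
    """A memory cell that flips 0 -> 1 only after enough input pulses,
    standing in for the neuron-like behaviour of a Weebit cell."""
    def __init__(self, threshold=4):
        self.threshold = threshold
        self._charge = 0   # stands in for the cell's internal physical state
        self.bit = 0       # the externally visible stored bit

    def pulse(self):
        """Apply one input pulse; the cell flips once enough have arrived."""
        if self.bit == 0:
            self._charge += 1
            if self._charge >= self.threshold:
                self.bit = 1
        return self.bit

cell = ThresholdCell(threshold=4)
print([cell.pulse() for _ in range(5)])  # [0, 0, 0, 1, 1]
```

    In the real device the "charge" would be a physical property of the cell rather than a counter in RAM - the software just sends pulses and reads the bit, which is the whole attraction.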

    There are lots of details I know nothing about - like how the connections of a Weebit neural network are configured (i.e. do they have to be manufactured in a specific configuration, or does software somehow simulate the connections between virtual neurons?), or whether the activation thresholds can be tweaked for different purposes.

    Anyway, I look forward to finding out more when the demonstration is public.
    Last edited by Interloping: 25/07/19
 