
TATA SNN patent, page-13


    As with other competitor/partner patent descriptions/claims, there is no description of the basic NPU (neural processing unit). I'm not sure if I've mentioned NPUs before, but Akida's NPU, the fundamental building block of Akida's CSNN, is described in:

    WO2020092691A1: AN IMPROVED SPIKING NEURAL NETWORK
    https://hotcopper.com.au/data/attachments/3833/3833972-f85b3be7d697e710d844e9aa32def73e.jpg


    The Tata patent relates to the use of a Doppler radar detector to gather data from the moving object. I haven't read the Tata patent in detail, but I've gleaned the following:

    Tata CSNN Radar, US20210365778

    https://hotcopper.com.au/data/attachments/3834/3834000-02fae4d48d67109ef5c7820c3dec0a0c.jpg


    https://hotcopper.com.au/data/attachments/3834/3834003-1d6fe2662a1e8ed2b2b4bcfdd88ae250.jpg


    [0011] FIG. 3 is a functional block diagram of the SNN model for real-time radar-based action recognition in accordance with some embodiments of the present disclosure.


    I think this is the core of Tata's invention: detecting very small variations in the frequency of the reflected radar pulses (micro-Doppler) due to movements of the object. They utilize a database of micro-Doppler signatures corresponding to particular movements.


    [0027] In an embodiment, the system 102 learns and identifies target's actions using the concept of neuromorphic computing applied over radar data (obtained from radar sensors monitoring the target). In an embodiment, the radar data facilitates in creation of micro-Doppler data which is then utilized for action recognition. The micro-Doppler data includes positive and negative doppler frequencies that are observed when different parts of the target move towards or away from the radar sensor. Together, all these frequencies constitute the micro-Doppler signatures for a particular action. Since different target parts move at different frequencies for multiple actions, their micro-Doppler signatures are different in time frequency domain. The system 102 employs a convolution-based spiking neural network (CSNN) that is capable of learning both spatial and temporal features of the actions.
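
    To put a rough number on the micro-Doppler idea, here's a back-of-the-envelope sketch (my own, not from the patent). For a monostatic radar the Doppler shift is approximately f_d = 2v/λ, so body parts moving towards the sensor give positive shifts and parts moving away give negative ones. The 24 GHz carrier and the part velocities below are assumed purely for illustration:

```python
# Back-of-the-envelope micro-Doppler illustration (my own, not from the patent).
# Monostatic Doppler shift: f_d ~ 2 * v / wavelength. Positive radial velocity
# (towards the radar) gives a positive shift, movement away gives a negative one.

C = 3.0e8            # speed of light, m/s
F_CARRIER = 24.0e9   # assumed 24 GHz radar carrier (the patent doesn't fix one)
WAVELENGTH = C / F_CARRIER

def doppler_shift(radial_velocity_mps: float) -> float:
    """Doppler shift in Hz for a scatterer moving at the given radial velocity."""
    return 2.0 * radial_velocity_mps / WAVELENGTH

# Hypothetical velocities during a 'bow': torso leans towards the radar while
# the arms swing back slightly; the still legs contribute zero Doppler.
for part, v in [("torso", +0.8), ("left arm", -0.4), ("right arm", -0.3), ("legs", 0.0)]:
    print(f"{part:>9}: {doppler_shift(v):+8.1f} Hz")
```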

    [0034] Referring to FIG. 3, a block diagram depicting architecture of the disclosed SNN model is illustrated in accordance with an example embodiment. As illustrated in FIG. 3, the SNN model 300 is shown to include a data pre-processing layer 310, a plurality of CSNN layers (such as CSNN layers 322, 324, 326) and a classifier layer 350. The data preprocessing layer 310 performs compression and encoding on the radar data in order to make the computation faster. The plurality of CSNN layers such as CSNN layers 322, 324, 326 contains multiple spiking layers to extract the spatial features from the input radar data (or spiking data). A special technique is detailed below that is used to capture the temporal signature of action while the radar data is being processed in the CSNN layer. The spatial feature extraction is hierarchical in nature, with initial layers (for example, the CSNN layer 322) capturing low level features like edges with complexity keep on increasing till the last/final layer (for example, the CSNN layer 326). In other words, convoluting over the plurality of CSNN layers increases complexity in the first set of spatial features from an initial CSNN layer to a last CSNN layer of the plurality of CSNN layers. The convolutional features of a CSNN layer along with its temporal spiking signature become an enriched feature set and is then passed to the classifier layer 350 for finally recognizing the actions. The details of each of the aforementioned components is described further below with reference to steps 204-210 (FIG. 2).
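
    Reading [0034], the data flow is: pre-process/encode the radar frames into spikes, push them through a stack of CSNN layers that extract increasingly complex spatial features plus a temporal spiking signature, then hand the enriched feature set to a classifier. A minimal structural sketch of that flow is below; every name and the stand-in encoding/feature logic are my own assumptions, not Tata's implementation:

```python
# Minimal structural sketch of the flow in [0034]; every name and the stand-in
# encoding/feature logic here are my own assumptions, not Tata's implementation.
import numpy as np

class DataPreprocessingLayer:
    def __call__(self, radar_frames: np.ndarray) -> np.ndarray:
        # Compress/encode radar data into a binary spike matrix
        # (simple thresholding used as a placeholder for the patent's encoding).
        return (radar_frames > radar_frames.mean()).astype(np.uint8)

class CSNNLayer:
    def __call__(self, spikes: np.ndarray):
        # Placeholder "convolution over spikes": a real layer would extract
        # progressively more complex spatial features (edges first, then shapes).
        spatial = spikes
        temporal_signature = spikes.sum(axis=-1)   # crude per-row spike count
        return spatial, temporal_signature

class ClassifierLayer:
    def __call__(self, features: np.ndarray) -> int:
        return int(features.sum()) % 3             # pretend there are 3 actions

def recognise_action(radar_frames: np.ndarray) -> int:
    spikes = DataPreprocessingLayer()(radar_frames)
    temporal_signatures = []
    for layer in (CSNNLayer(), CSNNLayer(), CSNNLayer()):   # e.g. layers 322/324/326
        spikes, temporal = layer(spikes)
        temporal_signatures.append(temporal)
    # Convolutional features plus the temporal spiking signatures form the
    # enriched feature set handed to the classifier (per [0034]).
    enriched = np.concatenate([spikes.ravel()] + [t.ravel() for t in temporal_signatures])
    return ClassifierLayer()(enriched)

print(recognise_action(np.random.rand(64, 128)))
```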

    [0039] A narrow time window results in better resolution in time-axis, a poor one in frequency domain and vice versa. Thus, a unique trade-off point has to be achieved between time frequency resolution as both the information are important for the analysis of time-frequency plots. FIG. 4 shows a spectrogram for a bow action performed by a target (human). Zero Doppler frequency is observed when the person is still.
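
    The window trade-off in [0039] is the usual STFT resolution trade-off. A quick sketch with scipy, where the sample rate, window lengths and toy signal are all assumed for illustration:

```python
# Sketch of the time/frequency trade-off in [0039] using scipy's STFT.
# Sample rate, window lengths and the toy signal are assumed for illustration.
import numpy as np
from scipy.signal import spectrogram

fs = 1000.0                                     # assumed Doppler-channel sample rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)

# Toy micro-Doppler return: a steady +50 Hz "torso" line plus a limb-like
# component whose instantaneous frequency swings between 50 Hz and 250 Hz.
inst_freq = 150 + 100 * np.sin(2 * np.pi * 1.0 * t)
limb_phase = 2 * np.pi * np.cumsum(inst_freq) / fs
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(limb_phase)

for nperseg in (64, 512):
    f, tt, Sxx = spectrogram(x, fs=fs, nperseg=nperseg, noverlap=nperseg // 2)
    print(f"window={nperseg:4d} samples -> "
          f"frequency bins ~{fs / nperseg:5.2f} Hz apart, "
          f"time step ~{(nperseg // 2) / fs * 1e3:5.1f} ms")

# The short window tracks the limb's frequency swings in time but smears them in
# frequency; the long window resolves the two components cleanly but blurs timing.
```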

    [0040] Positive and negative Doppler frequencies are observed when different body parts move towards or away from the radar sensor. Together, all these frequencies constitute the micro-Doppler signatures for a particular action. Since different body parts move at different frequencies for multiple actions, their micro-Doppler signatures are different in time frequency domain. Hereinafter, the micro-Doppler signatures or frequency signals obtained upon being reflected from the target upon motion of the target with respect to the one or more radar sensors may be referred to as training data. The data obtained from radar sensors during the training stage may be referred to as `training data`. The data obtained from the radar sensors during the testing and/or validation phase may be referred to as `radar data`. The training data obtained from the radar sensors may be utilized for training the SNN model. The training of the SNN model is explained by taking an analogy of human brain.


    At [0043] Tata refers to Leaky Integrate and Fire, whereas Akida uses Integrate and Fire (Akida doesn't leak).
    This describes the Akida NPU action:

    [0044] ... Excitatory pre-synaptic neurons increase the membrane potential, whereas, inhibitory pre-synaptic neurons tend to decrease it. As mentioned before, a spike is generated when the membrane potential breaches a threshold (V.sub.thresh). A spike in the presynaptic neuron increases the conductance of the synapse in magnitude. The dynamics of excitatory and inhibitory conductance are modelled as per equations (4) and (5) respectively:
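
    For anyone who wants to see what [0044] is getting at, here's a minimal conductance-based leaky integrate-and-fire sketch. The patent's equations (4) and (5) aren't reproduced above, so the exponential conductance decay below is just the standard textbook form, and every constant is an assumed illustrative value (and remember, Akida's neurons don't leak):

```python
# Hedged sketch of a conductance-based leaky integrate-and-fire neuron in the
# spirit of [0044]. The patent's equations (4) and (5) are not reproduced here,
# so the exponential conductance decay is the standard textbook form and all
# constants are assumed illustrative values.
import numpy as np

DT = 1e-3                                  # integration step, s
TAU_M, TAU_E, TAU_I = 20e-3, 5e-3, 10e-3   # membrane and conductance time constants
V_REST, V_THRESH, V_RESET = -65e-3, -52e-3, -65e-3
E_EXC, E_INH = 0.0, -80e-3                 # excitatory / inhibitory reversal potentials

def simulate(exc_spikes, inh_spikes, w_exc=0.6, w_inh=0.4):
    """Integrate the membrane potential over boolean pre-synaptic spike trains."""
    v, g_e, g_i = V_REST, 0.0, 0.0
    out_spikes = []
    for exc, inh in zip(exc_spikes, inh_spikes):
        # A pre-synaptic spike steps the corresponding conductance up in magnitude.
        g_e += w_exc * exc
        g_i += w_inh * inh
        # Membrane update: excitatory conductance pulls V up, inhibitory pulls it down.
        dv = (-(v - V_REST) + g_e * (E_EXC - v) + g_i * (E_INH - v)) / TAU_M
        v += dv * DT
        # Conductances decay exponentially between spikes.
        g_e -= g_e * DT / TAU_E
        g_i -= g_i * DT / TAU_I
        fired = v >= V_THRESH               # spike when V breaches the threshold
        out_spikes.append(fired)
        if fired:
            v = V_RESET                     # reset after the output spike
    return np.array(out_spikes)

rng = np.random.default_rng(0)
spikes = simulate(rng.random(500) < 0.15, rng.random(500) < 0.05)
print("output spikes:", int(spikes.sum()))
```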

    Claim 1:
    1. A processor implemented method comprising:
    employing, via one or more hardware processors, a spiking neural network (SNN) model for recognition of an action performed by a target, the SNN model comprising a data pre-processing layer, a plurality of Convolutional Spiking neural network (CSNN) layers and a classifier layer, wherein the SNN model for action recognition comprising:
    receiving, by the data preprocessing layer, a radar data acquired by one or more radar sensors, wherein the radar data indicative of one or more actions performed by the target, wherein the radar data comprises a plurality of Doppler frequencies reflected from the target upon motion of the target with respect to the one or more radar sensors;
    determining, by the data preprocessing layer, a first binarized matrix associated with the radar data;
    extracting, by the plurality of CSNN layers pre-trained on a training data, a set of features associated with the one or more actions of the target based on the first binarized matrix, the set of features comprising a first set of spatial features and a first set of temporal features; and
    identifying, by the classifier layer, a type of the action from amongst the one or more actions performed by the target based on the set of features.

    [0023] ... the radar data includes micro-Doppler data. The disclosed CSNN model is capable of learning spatial as well as temporal data from the radar data. Further, the use of neuromorphic and SNN concepts makes the disclosed model deployable over evolving neuromorphic edge devices thereby making the entire approach more efficient in terms of data, computation and energy consumption.

    The Akida patent has a 2018 priority date, whereas Tata's priority is 2019. Hence, in the case of any "overlap", Akida should trump Tata.

    Inasmuch as it relies on a CSNN, if their CSNN is the same as that claimed in BrainChip's patent, they would need a licence from BrainChip, the granting of which would not, I expect, be unreasonably withheld.
 