2021 BRN Discussion, page-16665

    While looking for an answer to your rhetorical question about "compute-in-memory" owners, I stumbled across this Qualcomm patent, which gives a useful, if tautological, pictorial illustration of CNNs. Love the smell of red herrings in the morning!

    In particular, it illustrates the difference between a fully connected NN (Fig 2A) and a locally connected NN (Fig 2B), of which a CNN (Fig 2C) is a special case, having a geometric association between locally associated blocks of pixels.

    Fig 2D illustrates supervised training in a CNN.

    https://worldwide.espacenet.com/patent/search/family/072744853/publication/US2021089865A1?q=US2021089865A1

    US2021089865A1 PARALLEL PROCESSING OF A CONVOLUTIONAL LAYER OF A NEURAL NETWORK WITH COMPUTE-IN-MEMORY ARRAY

    https://hotcopper.com.au/data/attachments/3397/3397703-f36d9a62148cfd4a8daf29b88fde3c29.jpg




    https://hotcopper.com.au/data/attachments/3397/3397717-1b740d6901ef2861dfad55a46b83c004.jpg


    [0036] The connections between layers of a neural network may be fully connected or locally connected. FIG. 2A illustrates an example of a fully connected neural network 202. In a fully connected neural network 202, a neuron in a first layer may communicate its output to every neuron in a second layer, so that each neuron in the second layer will receive input from every neuron in the first layer. FIG. 2B illustrates an example of a locally connected neural network 204. In a locally connected neural network 204, a neuron in a first layer may be connected to a limited number of neurons in the second layer. More generally, a locally connected layer of the locally connected neural network 204 may be configured so that each neuron in a layer will have the same or a similar connectivity pattern, but with connection strengths that may have different values (e.g., 210, 212, 214, and 216). The locally connected connectivity pattern may give rise to spatially distinct receptive fields in a higher layer, because the higher layer neurons in a given region may receive inputs that are tuned through training to the properties of a restricted portion of the total input to the network.
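
    (Not the patent's code, just my own quick NumPy sketch of what [0036] is describing: in the fully connected case every second-layer neuron is wired to all first-layer neurons, while in the locally connected case each neuron only sees a small window and has its own weights for that window. Layer sizes are made up.)

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.standard_normal(16)                  # 16 neurons in the first layer

    # Fully connected (Fig 2A): every second-layer neuron receives all 16 inputs.
    W_full = rng.standard_normal((8, 16))        # 8 output neurons x 16 inputs
    y_full = W_full @ x

    # Locally connected (Fig 2B): each second-layer neuron sees only a 4-wide
    # window of the input, and each window position has its OWN weights.
    window = 4
    n_out = len(x) - window + 1                  # 13 output neurons
    W_local = rng.standard_normal((n_out, window))
    y_local = np.array([W_local[i] @ x[i:i + window] for i in range(n_out)])

    print(y_full.shape, y_local.shape)           # (8,) (13,)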

    [0037] One example of a locally connected neural network is a convolutional neural network. FIG. 2C illustrates an example of a convolutional neural network 206. The convolutional neural network 206 may be configured such that the connection strengths associated with the inputs for each neuron in the second layer are shared (e.g., 208). Convolutional neural networks may be well suited to problems in which the spatial location of inputs is meaningful.
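
    (Continuing my own sketch, the weight sharing in [0037] is the only change from the locally connected case above: every window position reuses the same kernel instead of having its own weights.)

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.standard_normal(16)
    kernel = rng.standard_normal(4)              # one shared 4-tap kernel (cf. 208)

    n_out = len(x) - len(kernel) + 1
    y_conv = np.array([kernel @ x[i:i + len(kernel)] for i in range(n_out)])

    # Same result via NumPy's sliding correlation (NN "convolutions" don't flip the kernel).
    assert np.allclose(y_conv, np.correlate(x, kernel, mode="valid"))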

    [0038] One type of convolutional neural network is a deep convolutional network (DCN). FIG. 2D illustrates a detailed example of a DCN 200 designed to recognize visual features from an image 226 input from an image capturing device 230, such as a car-mounted camera. The DCN 200 of the current example may be trained to identify traffic signs and a number provided on the traffic sign. Of course, the DCN 200 may be trained for other tasks, such as identifying lane markings or identifying traffic lights.

    [0039] The DCN 200 may be trained with supervised learning. During training, the DCN 200 may be presented with an image, such as the image 226 of a speed limit sign, and a forward pass may then be computed to produce an output 222. The DCN 200 may include a feature extraction section and a classification section. Upon receiving the image 226, a convolutional layer 232 may apply convolutional kernels (not shown) to the image 226 to generate a first set of feature maps 218. As an example, the convolutional kernel for the convolutional layer 232 may be a 5×5 kernel that generates 28×28 feature maps. In the present example, because four different feature maps are generated in the first set of feature maps 218, four different convolutional kernels were applied to the image 226 at the convolutional layer 232. The convolutional kernels may also be referred to as filters or convolutional filters.
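
    (Back-of-envelope on the sizes in [0039]: the excerpt doesn't say how big the input image is, but a 32x32 input is assumed below, because a 5x5 kernel slid with stride 1 and no padding gives 32 - 5 + 1 = 28, i.e. the 28x28 feature maps mentioned, and four kernels give four maps.)

    import numpy as np

    image = np.random.default_rng(2).standard_normal((32, 32))     # assumed 32x32 input
    kernels = np.random.default_rng(3).standard_normal((4, 5, 5))  # four 5x5 filters

    def conv2d_valid(img, k):
        kh, kw = k.shape
        h, w = img.shape[0] - kh + 1, img.shape[1] - kw + 1
        out = np.empty((h, w))
        for i in range(h):
            for j in range(w):
                # cross-correlation, as used in NN convolutional layers
                out[i, j] = np.sum(img[i:i + kh, j:j + kw] * k)
        return out

    feature_maps = np.stack([conv2d_valid(image, k) for k in kernels])
    print(feature_maps.shape)    # (4, 28, 28): four kernels -> four 28x28 maps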

    ##################################################
    BTW, there are over 70 patents which answer to "compute-in-memory", and about 46 which answer to both "compute-in-memory" and "neural"; a lot of them are from Intel, IBM and Qualcomm.

 