Reading various posts over the last several days suggests that some less well informed posters are claiming that the AKIDA technology is a 'scam' of some sort. Well, I take my hat off to the Brainchip team, because they have managed to fool a whole range of highly credentialed semiconductor leaders, including:

- ARM - It has publicly announced in presentations and on its website that Brainchip's AKIDA technology is fully compatible with their semiconductor range.
- MICROCHIP - It has publicly stated in a podcast that they are taking AKIDA technology to their customers.
- EUROPEAN SPACE AGENCY - It has funded the EDGX-1 Brain, which uses AKIDA technology.
- AIRBUS - It has announced that it has partnered with Brainchip and others.
- MERCEDES BENZ - On the release of the EQXX, the most efficient EV ever produced, it stated that AKIDA technology was 5 to 10 times more efficient than competitor technology.

And the list goes on and on with Intel, Socionext, Global Foundries, Renesas, Megachips and others, but perhaps the stake through the heart of ignorance was driven home in the peer reviewed paper by six senior researchers at Ericsson, who produced a prototype Zero-Energy AKIDA technology camera. I have extracted the relevant part of the paper, which ranges widely across the future requirements of 6G and Zero-Energy and which I would commend to you; the link to the full paper is included at the end:

“Towards 6G Zero-Energy Internet of Things: Standards, Trends, and Recent Results

Talha Khan, Sandeep Narayanan Kadan Veedu, András Rácz, Mehrnaz Afshang, Andreas Höglund, Johan Bergman - Ericsson
{talha.khan, sandeep.narayanan.kadan.veedu, andras.racz, mehrnaz.afshang, andreas.hoglund, johan.bergman}@ericsson.com

…To demonstrate the feasibility of low-energy AI and low-energy communication in a ZE-IoT device, we have built an example use case, as illustrated in Figure 4. The ZE-IoT device consists of a low-power camera, a neuromorphic AI chip (the Akida neural chip from BrainChip), a low-power radio and a solar panel. The application assumes that the camera takes a picture (e.g., when triggered by a motion sensor), runs a neural network to create the neural embedding of the image (i.e., extracts the neural features from the image), and sends the neural embedding vector via a custom radio stack tailored for AI data that implements approximate and intermittent communication.

The use case specific AI logic is hosted in the network, which implements the final layers of image recognition. It can be customized for object, face or gesture recognition. Consequently, the AI logic in the sensor device can be use case agnostic, allowing considerable flexibility to introduce new use cases by adding new AI logic on the network side.

Figure 4: AI-enabled ZE-IoT prototype use case.

The radio link includes a custom data encoding: instead of using binary encoding for the vector elements and sending the digital data with error correction encoding, we first create pseudo-random linear projections of the embedding vector from the N-dimensional space to a single dimension and send these projected values on the radio link as digitally encoded and modulated data or with a quasi-analog modulation. Formally,

\[ y_l = A_l x \quad (1) \]

where \( y_l \) is the raw vector containing the \( l \) projections, \( x \) is the embedding vector to be transmitted and \( A_l \) contains the first \( l \) linear projection codewords. Note that \( A_l \) is not transmitted, as it can be regenerated at the receiver from the same seed as used in the transmitter. The index \( l \) continuously increases with every transmission. The receiver attempts to obtain \( x \) by solving Eq. (1) based on the received \( y_l \) and the known \( A_l \).

The proposed encoding enables approximate communication, as each transmission from the ZE-IoT sensor includes information about the entire embedding vector, which can be reconstructed at the receiver more accurately with the reception of every new transmission. Even if the ZE-IoT device has energy for only a single transmission, it can still contribute to the reception of the embedding vector.
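[My own aside, not part of the Ericsson paper: to make the projection encoding concrete, here is a minimal Python sketch of the idea. The embedding dimension, the shared seed and the use of a plain least-squares solver are my assumptions for illustration only.]

```python
import numpy as np

SEED = 42   # shared seed: the receiver regenerates the codewords from this,
            # so the projection matrix A_l itself is never transmitted
N = 64      # dimension of the neural embedding vector (illustrative choice)

# --- ZE-IoT device (transmitter) ---
x = np.random.default_rng(7).standard_normal(N)   # stand-in embedding vector

def codewords(l):
    """First l pseudo-random linear projection codewords (rows of A_l)."""
    return np.random.default_rng(SEED).standard_normal((l, N))

def transmit(l):
    """The l-th transmission carries the single projected value a_l . x."""
    return codewords(l)[-1] @ x

# --- Network side (receiver) ---
# After l receptions, solve y_l = A_l x by least squares. Until l >= N the
# system is under-determined, so the estimate is approximate and sharpens
# with every new transmission.
y = []
for l in range(1, N + 1):
    y.append(transmit(l))
    x_hat, *_ = np.linalg.lstsq(codewords(l), np.array(y), rcond=None)
    if l in (8, 32, 64):
        err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
        print(f"after {l:2d} transmissions: relative error = {err:.3f}")
```

[Because the receiver rebuilds \( A_l \) from the shared seed, each radio frame only needs to carry one projected value, and even a single received projection improves the minimum-norm estimate of the whole embedding. The paper continues below.]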
In Figure 5, we plot the energy profile of the ZE-IoT prototype device to illustrate the stored and spent energy versus time. When the harvested energy reaches the amount required for the next processing stage (e.g., camera capture, AI inference or radio transmission), the task execution depletes the stored energy, which is again collected by the solar harvester.

Figure 5: Energy collected and spent.

The average values for energy consumption and harvesting are as follows:

(1) the AI inference on the Akida chip consumes ~8 mJ per image inference with 480k neural network parameters and 4-bit quantization of the weights,
(2) one radio transmission of a 20-byte frame consumes ~5 mJ and requires ~4 ms transmission time, and
(3) the solar harvester produces ~1.5 mW, or equivalently 4.5 mJ per 3 s, under typical indoor lighting.

V. CONCLUSIONS

In this article, we have discussed various aspects of the 6G cellular ZE-IoT technology. We have comprehensively reviewed the standardization efforts for ZE-IoT, including use cases, device categories, design targets and the future roadmap. We have also identified the role of emerging technology trends such as digital twins, AI and smart textiles in facilitating the mass adoption of ZE-IoT, and vice versa. Moreover, we have provided novel research results to address some of the challenges facing ZE-IoT. For instance, to demonstrate the feasibility of AI-enabled ZE-IoT, we have developed a prototype of a solar-powered AI-enabled ZE-IoT camera device with neuromorphic computing. In addition, we have emphasized the need for an OFDM-compatible physical layer design for ZE-IoT. To this end, we have devised techniques for OFDM-compatible backscatter communication for passive ZE-IoT devices and for OFDM-compatible OOK for active ZE-IoT devices.

There are several opportunities for future research. One promising direction is to design OFDM-compatible communication techniques for ZE-IoT. Another possibility is to drive empirical research at the intersection of ZE-IoT connectivity and disruptive technology trends such as AI. The conditions are ripe for catering to the ZE-IoT use cases within a cellular ecosystem.”

Posters can scream and shout about whatever they like, but only a complete fool would doubt that AKIDA patent protected technology is real and completely proven.

My opinion only DYOR

Fact Finder
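PS: For anyone who wants to sanity-check the duty cycle implied by the paper's numbers (~8 mJ per inference, ~5 mJ per transmission, ~1.5 mW harvested), here is a toy Python simulation of the harvest-then-execute cycle. The camera-capture cost and the stage order are my assumptions, as the extract does not give them.

```python
# Toy simulation of the harvest-then-execute cycle, using the averages
# reported in the paper; the camera-capture cost is assumed, not reported.
HARVEST_MW = 1.5                 # ~1.5 mW under typical indoor lighting
STAGES = [                       # (stage name, energy cost in mJ)
    ("camera capture", 2.0),     # assumption: no figure given in the extract
    ("AI inference",   8.0),     # ~8 mJ/image on the Akida chip
    ("radio TX",       5.0),     # ~5 mJ per 20-byte frame
]

stored_mj, t_s = 0.0, 0.0
for name, cost_mj in STAGES:
    # Wait until the solar harvester has stored enough for the next stage...
    wait_s = max(0.0, (cost_mj - stored_mj) / HARVEST_MW)  # mJ / mW = s
    t_s += wait_s
    stored_mj += HARVEST_MW * wait_s
    # ...then the task execution depletes the stored energy (cf. Figure 5).
    stored_mj -= cost_mj
    print(f"t = {t_s:5.2f} s  ran {name:14s} ({cost_mj} mJ)")
```

On these assumptions the device settles into roughly one captured, inferred and transmitted image every ten seconds under indoor lighting, which squares with the sawtooth energy profile the paper describes in Figure 5.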