
2020 BRN Discussion, page-21007

    Hi Branded
    The following article gives a good overview of what may happen. However, if you like, you could just skip to the conclusion and read the words I have highlighted.

    One other thing to do is have a look at an earlier post from uiux where he put up the chart setting out the product development plan for AKIDA 1000, 2000, 3000 and 4000. Take note of the fact that, according to Peter van der Made, while the AKIDA IP will become more powerful through each development stage, the power consumption becomes less with each step up.

    AKIDA 1.0 IP is rated at 10 µW to 200 mW, and by the time we reach AKIDA 4.0 IP this has reduced to 4 µW to 50 mW. So if the players referred to in this article are looking to replace ARM IP, Brainchip would seem to be ideally situated to take advantage, particularly with the two top ARM sales people on board to get their foot in the door.
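    Taking the quoted figures at face value, the stated reduction from AKIDA 1.0 to AKIDA 4.0 works out as below (a quick back-of-the-envelope calculation on the numbers above, not an official BrainChip comparison):

    ```python
    # Power ranges quoted above for the AKIDA IP roadmap, in watts.
    akida_1 = (10e-6, 200e-3)  # AKIDA 1.0 IP: 10 µW to 200 mW
    akida_4 = (4e-6, 50e-3)    # AKIDA 4.0 IP: 4 µW to 50 mW

    # Reduction factor at each end of the range.
    low_end_factor = akida_1[0] / akida_4[0]   # 10 µW / 4 µW
    high_end_factor = akida_1[1] / akida_4[1]  # 200 mW / 50 mW

    print(f"Low end: {low_end_factor:.1f}x less power")   # 2.5x
    print(f"High end: {high_end_factor:.1f}x less power") # 4.0x
    ```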

    My opinion only DYOR.

    SEPTEMBER 21, 2020

    Nvidia has acquired Arm. What does this mean for the future of AI, edge computing, and the people who write software for these chips?


    Nahla Davies

    In a move that has significant implications for the tech industry, U.S.-based graphics chip maker Nvidia announced last week that it would purchase U.K.-based Arm Holdings from Japanese investment firm Softbank for $40 billion. For anyone programming for AI, data processing, or embedded systems, this could mean your data-intensive applications will soon be running on Arm-designed chips with native Nvidia GPU support.

    Arm Holdings is the enterprise behind Arm processors, the smart sensor chips which are used to power over 90% of the world’s smartphones and everything from autonomous vehicles to toasters to washing machines. While the company has no manufacturing capabilities, it describes itself as the “Switzerland” of technology: it licenses its chip designs to any company who wants them, and allows others to do the actual manufacturing.

    Under the terms of the new deal, Arm will continue to be headquartered in Cambridge as a UK-based company. Nvidia has announced that it will be opening an AI research center in Cambridge as well, near the Arm headquarters. The research center will serve as a central hub for collaboration between AI scientists and researchers from across the globe.

    So, what does this mean for our readers, the folks writing code every day? This acquisition could have major implications for developers working on embedded systems. It may be beneficial to start learning about platforms like CUDA (Compute Unified Device Architecture) and its SDK. Your ability to process large amounts of data in the cloud may speed up, while at the same time, fitting powerful machine learning algorithms onto devices may require less and less memory. Read on to learn the backstory behind this deal and what it will mean for the world of computing and programming.

    From graphics to AI

    While Nvidia is best known for graphics cards that enable modern video games, the last few years have seen a wealth of new applications for their technology: AI, data processing, and cryptocurrency mining have all turned to Nvidia GPUs. The demand from new areas exploded so fast that, in 2018, a run on Nvidia cards by bitcoin miners led to a global shortage. While the headiest days of the cryptocurrency boom may have passed, the applications for large-scale parallel data processing continue to proliferate.

    Where CPUs are designed for complex logic processes, GPUs are optimized to process many floating-point computations in parallel. 3D rendering requires a massive amount of arithmetic calculations as vertices rotate. For machine learning and other large data processing operations, the GPU’s focus on parallel arithmetic is a match made in heaven.
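    To make the rendering example concrete, here is a minimal Python sketch of the per-vertex arithmetic involved when a 3D model rotates. Every vertex is transformed independently by the same rotation, which is exactly the shape of workload a GPU can spread across thousands of parallel cores (a pure-Python serial loop is shown only for illustration):

    ```python
    import math

    def rotate_y(vertex, angle):
        """Rotate a single (x, y, z) vertex around the y-axis by `angle` radians."""
        x, y, z = vertex
        c, s = math.cos(angle), math.sin(angle)
        return (c * x + s * z, y, -s * x + c * z)

    # A toy "model": four vertices of a square in the xz-plane.
    vertices = [(1, 0, 1), (1, 0, -1), (-1, 0, -1), (-1, 0, 1)]

    # Each vertex is independent of the others, so on a GPU these
    # transforms would run in parallel rather than one after another.
    rotated = [rotate_y(v, math.pi / 2) for v in vertices]
    ```

    A real renderer does this for millions of vertices per frame, which is why the embarrassingly parallel structure matters.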

    Nvidia already offers products that marry their GPUs with Arm chip designs to create data processing workhorses. In late 2019, the company created an Arm-based, GPU-accelerated server designed to process information very quickly. Purchasing Arm may be a way to double down on the bet that GPUs will become a popular data processing tool.

    As a result of the acquisition, Nvidia will now be placed at the forefront of Arm’s IoT ecosystem and cloud-based AI edge computing. Edge computing refers to an approach where information is stored and processed locally rather than in a central data warehouse many miles away. Intel made a similar move when it acquired Movidius.

    If edge computing matures the way these chip companies hope, companies can save money by gathering information locally and acting upon critical data immediately. For any applications that are latency-sensitive, such as autonomous vehicles, even a millisecond delay in data processing is unacceptable.

    With the addition of Arm, it’s possible Nvidia will become a dominant force in everything from microprocessors to tablets, mobile phones to street lamps, washing machines to autonomous vehicles.

    What changes in the computing landscape made this possible?

    Cloud computing reshaped software architecture in the past few years, allowing distributed software to automatically scale its computing resources based on its immediate needs. For example, 86% of all enterprises are expected to become dependent upon SaaS (one of the three primary cloud computing categories, along with IaaS and PaaS) for most or all of their software needs by 2022.

    At the same time, Moore’s law has held steady and semiconductors have continued to shrink (possibly holding their size in 2021), which has allowed more and more devices to include computing power. Combine that with easy access to scalable computing power, and you have a world where everything is a computer and needs specialized hardware.

    As more and more sophisticated applications were developed on these devices, the idea of edge computing arose. Instead of sending data off-device for processing, the data is now processed on-device to get around network latency issues or the absence of networks, both of which happen to be prevalent in those countries developing fastest.

    As mentioned earlier, the GPUs that Nvidia specializes in are very good at the parallel manipulation and processing of data. One of the most noteworthy advances in the computing industry overall has been the proliferation of GPUs into numerous different solutions (such as 3D mapping, image processing, and deep machine learning), to the point that more traditional CPU power has not been able to keep up.

    What could change in the developer landscape

    For developers, this may mean that new realms of data processing speed open up. These data powerhouses could be added to cloud offerings as an add-on, making this a seamless benefit to complex ETL pipelines. For integrated chipsets on smaller devices, it could be that graphics and data processing limits grow, allowing mobile apps with improved graphics and IoT devices with more sophisticated AI.

    Speaking of AI, neural nets may become simultaneously more complex and smaller. Specialized AI hardware could be developed to support consumer applications. Neural net software currently used in powerful data processing and forecasting applications could find new use cases, or new applications could arise that provide small-scale AI benefits to a wide range of people.

    But the biggest effect may be that more developers need to know about the CUDA (Compute Unified Device Architecture) framework and SDK. Similar to SIMD intrinsics, this SDK allows programs to directly access the GPU’s parallel processing. And if Nvidia manages to unify the physical memory of the CPU and GPU, it could open up numerous new avenues for optimization and advancement.
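    For readers who have not met CUDA’s programming model, the core idea is that you write one small function (a “kernel”) and the GPU runs a copy of it for every element of your data at once. Below is a hedged, pure-Python simulation of that model; no real CUDA calls are made, and the `saxpy_kernel` and `launch` names are purely illustrative. In real CUDA C the same kernel body would be launched across thousands of hardware threads:

    ```python
    def saxpy_kernel(thread_id, a, x, y, out):
        """One 'thread' of work: out[i] = a * x[i] + y[i] for a single index.

        In CUDA C the index would come from blockIdx/threadIdx; here we
        pass thread_id in explicitly to mimic the data-parallel style.
        """
        out[thread_id] = a * x[thread_id] + y[thread_id]

    def launch(kernel, n, *args):
        """Simulate launching n parallel GPU threads with a serial loop."""
        for thread_id in range(n):
            kernel(thread_id, *args)

    n = 4
    x = [1.0, 2.0, 3.0, 4.0]
    y = [10.0, 20.0, 30.0, 40.0]
    out = [0.0] * n
    launch(saxpy_kernel, n, 2.0, x, y, out)
    print(out)  # [12.0, 24.0, 36.0, 48.0]
    ```

    Because each `thread_id` touches only its own slot of `out`, no coordination between threads is needed, which is what makes this style of code so cheap to parallelize on a GPU.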

    In the end, though, many of the details of this may be abstracted away from anyone writing code by libraries and high-level programming languages. The only coders sure to be affected are those working directly with embedded systems.

    Conclusion

    Nvidia’s acquisition of Arm is likely to have a lasting effect on the tech industry as a whole. Not only has Nvidia become a much bigger player in IoT and cloud-based edge computing (to the point that it could be argued they will become the single most influential player), but major corporations that Arm licenses to, such as Apple, Intel, and Samsung, may look to shift to alternative sources for their chip and microprocessor designs.

    The biggest benefit that is likely to come out of all of this is the fact that major corporations and smaller startups alike have begun developing AI-based microprocessors that can handle complex neural networks. This means we are likely to see the continued innovation of microprocessors on a scale that we have never seen before.




 