BRN 3.70% 28.0¢ brainchip holdings ltd


    Software-defined vehicles drive next-gen connected EV development

    Software-defined vehicles, software-based simulation and neural processors in EVs and connected cars, with a look at developments from General Motors, Mercedes-Benz and Blackberry.



    The electric vehicle (EV) has clearly become a key topic of discussion, with range probably the thing most consumers are worried about. To address the range concern, two stories emerged this week – one was Mercedes-Benz achieving a 1,000 km range with its VISION EQXX prototype, albeit a concept car, and the other was General Motors announcing during a CES 2022 keynote its new Chevrolet Silverado EV with a 400-mile (640 km) range.

    In briefings with companies, I often hear them talk about the software-defined car and the extensive use of software simulation (or we could also call it a digital twin). In the case of both the VISION EQXX and the Silverado EV, software plays a key part. I also spoke to BlackBerry about its IVY platform and how it is laying the groundwork for software-defined vehicles.

    Mercedes-Benz: from white paper to vehicle in 18 months

    Let’s start with Mercedes-Benz. The VISION EQXX is a research concept EV which, through digital simulations in real-life traffic conditions, was shown to exceed 1,000 km on a single charge. Developed from white paper to completed vehicle in just 18 months through collaboration with startups and institutions from around the world, the software-defined electric car relied heavily on software-in-the-loop (SiL) systems. This kept the commissioning phases with the real hardware extremely short and enabled Mercedes-Benz to run large-scale tests early in the project.
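    The software-in-the-loop idea described above can be sketched in a few lines: the control code that would eventually run on the vehicle is exercised against a simulated plant, so large-scale tests can start before any hardware exists. This is a minimal illustrative sketch in Python – the controller logic, power limits and battery figures are invented for the example, not Mercedes-Benz’s actual tooling.

```python
# Minimal software-in-the-loop (SiL) sketch: controller code that would run
# on the vehicle ECU is tested against a simulated battery/drive model.
# All names and numbers are illustrative assumptions.

def controller_power_request(speed_kph: float, accel_pedal: float) -> float:
    """Toy drive controller: pedal position maps to a power request (kW),
    clipped by a simple speed-dependent limit."""
    max_power_kw = 150.0 - 0.5 * speed_kph   # derate at high speed
    return min(accel_pedal * 150.0, max(max_power_kw, 0.0))

def simulate(battery_kwh: float, seconds: int) -> float:
    """Plant model: integrate the controller's power draw over a drive
    cycle and return the remaining battery energy (kWh)."""
    energy = battery_kwh
    for t in range(seconds):
        speed = min(t * 0.5, 120.0)            # simple ramp to 120 km/h
        power_kw = controller_power_request(speed, accel_pedal=0.3)
        energy -= power_kw / 3600.0            # kW over 1 s -> kWh
    return energy

remaining = simulate(battery_kwh=100.0, seconds=600)
print(f"Battery remaining after cycle: {remaining:.1f} kWh")
```

    Because the “hardware” is just a model, a test like this can run thousands of times per hour in CI, which is the point of the SiL approach.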

    An efficiency assistant in the VISION EQXX works together with the driver, curating information to support an efficient driving style. (Image: Mercedes-Benz)

    Using this approach, the team was able to install the drive unit, flash the software and get the wheels turning on the VISION EQXX within just two hours. Mercedes-Benz said this nimble, efficient and responsive teamwork was made possible by a combination of a motorsport mindset and intelligent use of the comprehensive testing options, and the digital development approach meant many of the innovations in the vehicle could be quickly adapted for production applications.


    Advanced software and digital processes have been key to the development process, and a masterclass in software management, according to Mercedes-Benz. The team made extensive use of open-source technology, augmented by elements created in-house. Agile working practices and monthly release planning ensured a continuous flow of end-to-end functions and early integration of solutions.

    The scale of the digital development work involved in designing and engineering the VISION EQXX was “truly ground-breaking” according to the company. Highly advanced digital tools such as augmented and virtual reality dispensed with the need for time-consuming physical mock-ups. They also facilitated simultaneous development work by remote teams working in different parts of the world – from Stuttgart (Germany) to Bangalore (India) and from Brixworth (UK) to Sunnyvale (California). This massive uplift in digital power slashed the time spent in the wind tunnel from more than 100 hours to just 46. It also meant more than 300,000 kilometers of test driving were covered virtually.

    The carmaker said its technology development program offers a completely realistic way forward for electric vehicle technology and automotive engineering, with many of the resulting features and developments already being integrated into production, including the next generation of the MMA – the Mercedes-Benz modular architecture for compact and medium-sized cars.

    The VISION EQXX’s 1,000 km range is clearly the result of huge gains in energy efficiency, from the electric drivetrain to the use of lightweight engineering and sustainable materials, together with intelligent energy management. The claim is that 95% of the energy from the battery ends up at the wheels.
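    To see how the 95% battery-to-wheel figure feeds into range, here is a back-of-envelope calculation. Only the 95% efficiency comes from the article; the pack size and the per-kilometer energy demand at the wheels are assumed values chosen to make the arithmetic concrete.

```python
# Back-of-envelope range arithmetic. The 95% battery-to-wheel efficiency
# is from the article; the pack size and wheel demand are assumptions.

battery_kwh = 100.0            # assumed usable pack energy
drivetrain_eff = 0.95          # article: 95% of battery energy reaches the wheels
wheel_demand_wh_per_km = 95.0  # assumed energy needed at the wheels per km

energy_at_wheels_kwh = battery_kwh * drivetrain_eff
range_km = energy_at_wheels_kwh * 1000.0 / wheel_demand_wh_per_km
print(f"Estimated range: {range_km:.0f} km")  # 1000 km with these inputs
```

    The same arithmetic shows why drivetrain losses matter so much: at a more typical 85% efficiency, the identical pack and wheel demand would yield roughly 100 km less range.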

    Neuromorphic computing for infotainment

    This efficiency is not just being applied to enhancing range, though. Mercedes-Benz also points out that its infotainment system uses neuromorphic computing to enable the car to “take its cue from the way nature thinks”.

    Neuromorphic computing systems have the potential to radically reduce the energy needed to run the latest AI technologies in vehicles. (Image: Mercedes-Benz)

    The hardware runs spiking neural networks, in which data is coded as discrete spikes and energy is consumed only when a spike occurs, reducing energy consumption by orders of magnitude. To deliver this, the carmaker worked with BrainChip, developing the systems based on its Akida processor. In the VISION EQXX, this technology performs “Hey Mercedes” hot-word detection five to ten times more efficiently than conventional voice control. Mercedes-Benz said that although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years. When applied at scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.
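    The energy argument for spiking networks – compute only when a spike occurs – can be illustrated with a toy event-driven operation count. This is not BrainChip’s Akida or any real SNN framework, just a sketch of why sparse spike activity means far fewer operations than a dense layer that touches every input at every step.

```python
# Sketch of the event-driven idea behind spiking neural networks: work
# (and therefore energy) is spent only when a spike occurs, versus a
# dense network that multiplies every input every step. Toy model only.

import random

random.seed(0)
steps, n_inputs = 1000, 64
spike_prob = 0.02  # sparse activity: ~2% of inputs spike per step

dense_ops = 0
event_ops = 0
for _ in range(steps):
    spikes = [random.random() < spike_prob for _ in range(n_inputs)]
    dense_ops += n_inputs        # dense layer touches every input, always
    event_ops += sum(spikes)     # event-driven: only active inputs cost work

print(f"dense ops: {dense_ops}")
print(f"event ops: {event_ops} (~{event_ops / dense_ops:.1%} of dense)")
```

    With activity this sparse, the event-driven count lands at a few percent of the dense count, which is the intuition behind the “orders of magnitude” energy claim for always-on tasks like hot-word detection.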

    The VISION EQXX user interface also demonstrates the software-driven future of car UI/UX (user interface/user experience). The car features a completely seamless display spanning 47.5 inches from one A-pillar to the other. With an 8K (7680×660 pixels) resolution, the thin, lightweight mini-LED display acts as a portal connecting the driver and occupants with the car and the world outside. The Mercedes-Benz team worked with navigation experts NAVIS Automotive Systems to develop the first real-time 3D navigation system on a screen of this size. It performs seamless zoom and scroll functions from satellite view down to a height of 10 meters in the 3D city representation.

    The further development of the “Hey Mercedes” voice assistant is emotional and expressive as a result of a collaboration between Mercedes-Benz engineers and the voice synthesis experts from Sonantic. With the help of machine learning, the team have given “Hey Mercedes” its own distinctive character and personality. Mercedes-Benz stated: “As well as sounding impressively real, the emotional expression places the conversation between driver and car on a whole new level that is more natural and intuitive, underscoring the progressive feel of the modern luxury conveyed by the UI/UX in the VISION EQXX.”

    The user interface and user experience inside the VISION EQXX provides a vision of a highly responsive, intelligent and software-driven future. It is Mercedes-Benz’ first ever completely seamless display, spanning 47.5 inches from one A-pillar to the other. (Image: Mercedes-Benz)

    The one-piece display is also highly energy efficient. Its mini-LED backlight consists of more than 3000 local dimming zones, meaning it consumes power only as and when needed in specific parts of the screen. The 3D navigation screen adapts to the type of content being shown. For instance, if you’re driving in an urban area, abstract visualization of the surrounding buildings helps provide orientation amid densely packed streets. However, if you are traveling on the motorway or open road, the level of detail diminishes to provide a clearer overview of the journey. This has the added efficiency benefit of reducing the energy consumption of the display.
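    The local-dimming claim – power drawn only where the screen is bright – can be modeled roughly as per-zone power proportional to zone brightness. The zone count below matches the article’s “more than 3000”; the per-zone wattage and the example brightness pattern are assumptions for illustration.

```python
# Toy model of local-dimming backlight power: each zone draws power in
# proportion to the brightness it must deliver, so dark regions cost
# almost nothing. Zone count from the article; wattage is an assumption.

def backlight_power(zone_brightness, watts_per_zone_full=0.05):
    """zone_brightness: list of 0..1 levels, one per dimming zone."""
    return sum(b * watts_per_zone_full for b in zone_brightness)

zones = 3000
# Navigation view at night: 10% of zones fully lit, the rest nearly dark.
brightness = [1.0] * (zones // 10) + [0.02] * (zones - zones // 10)

print(f"Full-screen white: {backlight_power([1.0] * zones):.1f} W")
print(f"Sparse night view: {backlight_power(brightness):.1f} W")
```

    The content-adaptive map rendering described above feeds directly into this model: reducing on-screen detail lowers average zone brightness, which lowers backlight power.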

    As well as providing seamless navigation, the intelligence in the VISION EQXX can mine for data based on the car’s route. There is also a system to help you drive more efficiently. From energy flow to terrain, battery status and even the direction and intensity of the wind and sun, the efficiency assistant curates all the available information and suggests the most efficient driving style.

    This is supposed to enhance the driver’s own senses by providing input on external conditions that the driver is unable to feel directly – in the way that, for instance, a cyclist can feel the force of the wind, or the extra effort involved to pedal uphill. This sensorial support is further augmented by the ability of the car to use the map data to “see into the future”, anticipating what lies ahead to help the driver take advantage of it in a way that maximizes efficiency.

    A series of screens can also provide more detailed information, with things like the influence of current acceleration, gradient, wind and rolling resistance on energy consumption shown in real time. The simplicity of the interface is a further development of the “Zero Layer” concept first used in the EQS, which eases driver-vehicle interaction by dispensing with submenus.
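    The quantities these screens visualize – acceleration, gradient, wind and rolling resistance – map onto the standard road-load equation for the power a vehicle must deliver. The formula is textbook vehicle dynamics; the vehicle parameters below are illustrative assumptions, not EQXX specifications.

```python
# Road-load power: demanded power is the sum of rolling-resistance,
# aerodynamic, gradient and acceleration forces times vehicle speed.
# Mass, rolling coefficient and drag area are assumed example values.

import math

def road_load_power_kw(v_mps, accel_mps2, grade, headwind_mps,
                       mass_kg=1750.0, crr=0.007, cd_a=0.45, rho=1.2):
    v_air = v_mps + headwind_mps                         # airspeed seen by body
    f_roll = crr * mass_kg * 9.81 * math.cos(math.atan(grade))
    f_aero = 0.5 * rho * cd_a * v_air * abs(v_air)
    f_grade = mass_kg * 9.81 * math.sin(math.atan(grade))
    f_accel = mass_kg * accel_mps2
    return (f_roll + f_aero + f_grade + f_accel) * v_mps / 1000.0

# Steady 100 km/h on a flat road, no wind:
p = road_load_power_kw(v_mps=100 / 3.6, accel_mps2=0.0,
                       grade=0.0, headwind_mps=0.0)
print(f"Demanded power: {p:.1f} kW")
```

    An efficiency assistant like the one described above is, in effect, evaluating terms of this equation continuously and suggesting the driving style that minimizes their sum over the planned route.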


 