Software-defined vehicles drive next-gen connected EV development

Article By : Nitin Dahad

Software-defined vehicles, software-based simulation and neural processors in EVs and connected cars, with a look at developments from GM, Mercedes-Benz and BlackBerry.

The electric vehicle (EV) has clearly become a key topic of discussion, with range probably the thing most consumers worry about. Two stories addressing that concern emerged this week – one was Mercedes-Benz achieving a 1,000 km range with its VISION EQXX prototype, albeit a concept car, and the other was General Motors announcing, during a CES 2022 keynote, its new Chevrolet Silverado EV with a 400-mile (640 km) range.

In briefings with companies, I often hear them talk about the software-defined car and the extensive use of software simulation (or what we could also call a digital twin). In the case of both the VISION EQXX and the Silverado EV, software plays a key part. I also spoke to BlackBerry about its IVY platform and how it is laying the groundwork for software-defined vehicles.

Mercedes-Benz: from white paper to vehicle in 18 months

Let’s start with Mercedes-Benz. The VISION EQXX is a research prototype EV which, in digital simulations of real-life traffic conditions, was shown to exceed 1,000 km on a single charge. Developed from white paper to completed vehicle in just 18 months through collaboration with startups and institutions from around the world, the software-defined electric car relied heavily on software-in-the-loop (SiL) systems. This kept the commissioning phases with the real hardware extremely short and enabled Mercedes-Benz to run large-scale tests early in the project.
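The idea behind software-in-the-loop can be sketched in a few lines: the control software under test runs unchanged, but a simulated plant model stands in for the real hardware, so large-scale testing can start before any drive unit exists. The controller, plant model and constants below are invented for illustration, not Mercedes-Benz’s actual toolchain.

```python
# Hypothetical SiL setup: a simulated drive unit (the "plant") replaces
# the real hardware, so the control software can be exercised at scale
# long before a physical vehicle is available.

class SimulatedDriveUnit:
    """Stand-in for the physical drive unit."""
    def __init__(self):
        self.speed_kmh = 0.0

    def apply_torque(self, torque_nm, dt_s):
        # Crude first-order model: acceleration proportional to torque,
        # minus a drag term that grows with speed.
        accel = 0.05 * torque_nm - 0.01 * self.speed_kmh
        self.speed_kmh = max(0.0, self.speed_kmh + accel * dt_s)

def cruise_controller(target_kmh, current_kmh, gain=2.0):
    """The software under test: a simple proportional speed controller."""
    return gain * (target_kmh - current_kmh)

def run_sil_test(target_kmh=100.0, steps=2000, dt_s=0.1):
    plant = SimulatedDriveUnit()
    for _ in range(steps):
        torque = cruise_controller(target_kmh, plant.speed_kmh)
        plant.apply_torque(torque, dt_s)
    return plant.speed_kmh

print(f"speed after SiL run: {run_sil_test():.1f} km/h")
```

Running this reveals a bug worth catching before hardware commissioning: a purely proportional controller settles about 9 km/h below the target – exactly the kind of defect SiL testing is meant to surface cheaply.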

Mercedes-Benz VISION EQXX
An efficiency assistant in the VISION EQXX works together with the driver, curating information to support an efficient driving style. (Image: Mercedes-Benz)

Using this approach, the team was able to install the drive unit, flash the software and get the wheels turning on the VISION EQXX within just two hours. Mercedes-Benz said this nimble, efficient and responsive teamwork was made possible by a combination of a motorsport mindset and intelligent use of the comprehensive testing options, and the digital development approach meant many of the innovations in the vehicle could be quickly adapted for production applications.

Advanced software and digital processes were key to development – a masterclass in software management, according to Mercedes-Benz. The team made extensive use of open-source technology, augmented by elements created in-house. Agile working practices and monthly release planning ensured a continuous flow of end-to-end functions and early integration of solutions.

The scale of the digital development work involved in designing and engineering the VISION EQXX was “truly ground-breaking” according to the company. Highly advanced digital tools such as augmented and virtual reality dispensed with the need for time-consuming physical mock-ups. These tools also facilitated simultaneous development by remote teams working in different parts of the world – from Stuttgart (Germany) to Bangalore (India) and from Brixworth (UK) to Sunnyvale (California). This massive uplift in digital power slashed the time spent in the wind tunnel from more than 100 hours to just 46. It also meant more than 300,000 kilometers of test driving were covered virtually.

The carmaker said its technology development program offers a completely realistic way forward for electric vehicle technology and automotive engineering, with many of the resulting features and developments already being integrated into production, including the next generation of the MMA – the Mercedes-Benz modular architecture for compact and medium-sized cars.

The development of the VISION EQXX and its 1,000 km range is clearly the result of huge gains in energy efficiency, from the electric drivetrain to lightweight engineering and sustainable materials, as well as intelligent energy management. The claim is that 95% of the energy from the battery ends up at the wheels.

Neuromorphic computing for infotainment

This efficiency is not just being applied to enhancing range though. Mercedes-Benz also points out that its infotainment system uses neuromorphic computing to enable the car to “take its cue from the way nature thinks”.

Mercedes-Benz VISION EQXX
Neuromorphic computing systems have the potential to radically reduce the energy needed to run the latest AI technologies in vehicles. (Image: Mercedes-Benz)

The hardware runs spiking neural networks, in which data is coded in discrete spikes and energy is only consumed when a spike occurs, reducing energy consumption by orders of magnitude. To deliver this, the carmaker worked with BrainChip, developing the systems based on its Akida processor. In the VISION EQXX, this technology makes the “Hey Mercedes” hot-word detection five to ten times more efficient than conventional voice control. Mercedes-Benz said that although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years. When applied at scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.
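The event-driven principle behind this efficiency can be illustrated with a single leaky integrate-and-fire neuron – a toy sketch of the general spiking idea, not BrainChip Akida’s actual implementation. Input charges a membrane potential that leaks over time; only when it crosses a threshold does the neuron fire, and downstream work (and energy use) happens only on those rare spike events.

```python
# Toy leaky integrate-and-fire (LIF) neuron, illustrating why spiking
# networks save energy: downstream processing occurs only on the few
# time steps that actually produce a spike.

def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron spikes."""
    potential = 0.0
    spikes = []
    for t, current in enumerate(inputs):
        potential = potential * leak + current   # leaky integration
        if potential >= threshold:               # fire and reset
            spikes.append(t)
            potential = 0.0
    return spikes

# Mostly-quiet input (background noise) with a brief burst in the middle:
inputs = [0.05] * 50 + [0.6] * 3 + [0.05] * 50
spike_times = lif_neuron(inputs)
print(f"{len(spike_times)} spikes over {len(inputs)} time steps")
```

Over 103 time steps the neuron fires only twice, during the burst – a conventional (non-spiking) network would instead perform a full computation at every step, which is the gap the five-to-ten-times efficiency claim for hot-word detection points at.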

The VISION EQXX user interface also demonstrates the software-driven future of car UI/UX (user interface/user experience). The car features a completely seamless display spanning 47.5 inches from one A-pillar to the other. With an 8K (7680×660 pixels) resolution, the thin, lightweight mini-LED display acts as a portal connecting the driver and occupants with the car and the world outside. The Mercedes-Benz team worked with navigation experts NAVIS Automotive Systems to develop the first real-time 3D navigation system on a screen of this size. It performs seamless zoom and scroll functions from satellite view down to a height of 10 meters in the 3D city representation.

The further development of the “Hey Mercedes” voice assistant is emotional and expressive as a result of a collaboration between Mercedes-Benz engineers and the voice synthesis experts from Sonantic. With the help of machine learning, the team has given “Hey Mercedes” its own distinctive character and personality. Mercedes-Benz stated: “As well as sounding impressively real, the emotional expression places the conversation between driver and car on a whole new level that is more natural and intuitive, underscoring the progressive feel of the modern luxury conveyed by the UI/UX in the VISION EQXX.”

Mercedes-Benz VISION EQXX
The user interface and user experience inside the VISION EQXX provides a vision of a highly responsive, intelligent and software-driven future. It is Mercedes-Benz’ first ever completely seamless display spanning 47.5 inches from one A-pillar to the other. (Image: Mercedes-Benz)

The one-piece display is also highly energy efficient. Its mini-LED backlight consists of more than 3000 local dimming zones, meaning it consumes power only as and when needed in specific parts of the screen. The 3D navigation screen adapts to the type of content being shown. For instance, if you’re driving in an urban area, abstract visualization of the surrounding buildings helps provide orientation amid densely packed streets. However, if you are traveling on the motorway or open road, the level of detail diminishes to provide a clearer overview of the journey. This has the added efficiency benefit of reducing the energy consumption of the display.
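The power saving from local dimming is easy to see in miniature: drive each backlight zone only as bright as the brightest pixel it covers, rather than running one global backlight at full power. The frame data, zone size and power model below are invented purely for illustration – real mini-LED controllers are far more sophisticated.

```python
# Hypothetical local-dimming sketch: relative backlight power when each
# zone is dimmed to its brightest covered pixel (1.0 == a single global
# backlight running at full power).

def backlight_power(frame, zone_size):
    """frame: 2D list of pixel brightness values in [0.0, 1.0]."""
    rows, cols = len(frame), len(frame[0])
    dimmed, zones = 0.0, 0
    for r in range(0, rows, zone_size):
        for c in range(0, cols, zone_size):
            zone = [frame[i][j]
                    for i in range(r, min(r + zone_size, rows))
                    for j in range(c, min(c + zone_size, cols))]
            dimmed += max(zone)   # zone runs only as bright as needed
            zones += 1
    return dimmed / zones

# A mostly-dark navigation frame with one small bright region:
frame = [[0.1] * 8 for _ in range(8)]
frame[0][0] = frame[0][1] = 1.0
print(f"relative power: {backlight_power(frame, zone_size=4):.2f}")
```

For this mostly-dark frame the dimmed backlight draws roughly a third of full power; with more than 3,000 zones, as in the VISION EQXX display, the granularity – and thus the saving on dark navigation scenes – is far finer.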

As well as providing seamless navigation, the intelligence in the VISION EQXX can mine for data based on the car’s route. There is also a system to help you drive more efficiently. From energy flow to terrain, battery status and even the direction and intensity of the wind and sun, the efficiency assistant curates all the available information and suggests the most efficient driving style.

This is supposed to enhance the driver’s own senses by providing input on external conditions that the driver is unable to feel directly – in the way that, for instance, a cyclist can feel the force of the wind, or the extra effort involved to pedal uphill. This sensorial support is further augmented by the ability of the car to use the map data to “see into the future”, anticipating what lies ahead to help the driver take advantage of it in a way that maximizes efficiency.

A series of screens can also provide more detailed information, with things like the influence of current acceleration, gradient, wind and rolling resistance on energy consumption shown in real time. The simplicity of the interface is a further development of the “Zero Layer” concept first used in the EQS, which eases driver-vehicle interaction by dispensing with submenus.

GM Chevrolet Silverado: the Ultium effect

In her CES 2022 keynote, Mary Barra, General Motors chair and CEO, enthused about the potential for mass adoption of EVs, saying the industry was at a point of inflection. As part of that she unveiled the 2024 Chevrolet Silverado EV, which will offer an expected 400 miles on a full charge.

2024 Silverado EV RST
General Motors 2024 Silverado EV RST. (Image: General Motors)

This and many of its other new EVs are based on the company’s Ultium platform, and the first application of its Ultifi Linux-based software platform. Developed in-house at GM, the latter separates the vehicle’s software from the hardware to enable rapid and frequent software updates, as well as seamless delivery of software-defined features, apps and services to customers over the air. It offers the potential for more cloud-based services, faster software development and new opportunities to increase customer loyalty.

Ultifi’s functionality builds upon GM’s vehicle intelligence platform (VIP), the company’s advanced electrical architecture. VIP-enabled vehicles provide over-the-air capability, plenty of data bandwidth, robust cybersecurity and fast processing power. On top of this foundation, GM engineers will separate key software into a new centralized layer that acts as a powerful hub for vehicle systems. The Ultifi platform will then enable accelerated development and deployment of software and applications over the air to millions of customers, without affecting basic hardware controls.
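The layering GM describes – apps talking to a stable centralized software hub rather than to hardware directly – can be sketched as three classes. This is an illustration of the general pattern only; the class names, API and clamping logic are invented and are not GM’s actual Ultifi code.

```python
# Illustrative layering sketch: the app layer can be swapped over the air
# because it only ever talks to a stable services API, never to the
# hardware control code beneath it.

class HardwareControls:
    """Low-level layer: ships with the vehicle, not updated OTA."""
    def set_cabin_temp(self, celsius):
        return f"HVAC set to {celsius}C"

class VehicleServices:
    """Centralized middle layer exposing a stable API to apps."""
    def __init__(self, hw):
        self._hw = hw

    def request_cabin_temp(self, celsius):
        # Clamp requests so no app can command an unsafe value.
        return self._hw.set_cabin_temp(max(16, min(28, celsius)))

class ComfortAppV2:
    """App layer: this is what an over-the-air update would replace."""
    def precondition(self, services):
        return services.request_cabin_temp(21)

services = VehicleServices(HardwareControls())
print(ComfortAppV2().precondition(services))
```

Because the app never touches `HardwareControls` directly, shipping a `ComfortAppV3` over the air requires no change to – and no revalidation of – the basic hardware controls, which is the separation the Ultifi description emphasizes.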

BlackBerry IVY: enabling a vehicle’s digital fingerprint

Meanwhile, BlackBerry used CES 2022 to talk about BlackBerry IVY, its intelligent vehicle data platform co-developed with Amazon Web Services (AWS). Making its debut at the show was the first demonstration of BlackBerry IVY on physical hardware. The aim was to show how the ‘AI-based’ decisions made by BlackBerry IVY would shape the in-car experience, and how that would look to drivers and passengers through a physical car dashboard.

Incorporating technology from BlackBerry IVY ecosystem partners including HERE Technologies, Car IQ and Electra Vehicles, the demo was designed to show use cases that can be enabled via the platform in the form of enhanced predictions, intelligent recommendations and in-car payment capabilities that utilize in-vehicle data from multiple sensors. Using Electra’s EVE-Ai 360 adaptive controls for battery pack optimization, it provides an accurate battery state of charge and range prediction based on driver detection and personalization, while actively working to extend range and preserve battery lifetime.
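A personalized range prediction of the kind described can be reduced to a simple idea: divide the usable energy in the pack by a consumption rate learned from that particular driver’s recent history. The function, numbers and averaging scheme below are a made-up sketch, not Electra’s EVE-Ai algorithm.

```python
# Hypothetical driver-personalized range estimate: usable energy divided
# by the driver's recent average consumption. Real systems fold in far
# more (temperature, terrain, battery health, learned driving patterns).

def predict_range_km(state_of_charge, pack_kwh, recent_kwh_per_100km):
    """Estimate remaining range from SoC and recent consumption history."""
    usable_kwh = state_of_charge * pack_kwh
    avg = sum(recent_kwh_per_100km) / len(recent_kwh_per_100km)
    return usable_kwh / avg * 100

# Same battery at 80% charge, two driving styles:
gentle = predict_range_km(0.8, 100, [14.0, 15.0, 16.0])
spirited = predict_range_km(0.8, 100, [22.0, 24.0, 26.0])
print(f"gentle: {gentle:.0f} km, spirited: {spirited:.0f} km")
```

The two predictions differ by some 200 km for the same state of charge, which is why driver detection and personalization matter: a range figure tied to the actual driver is far more trustworthy than one based on a fleet-wide average.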

The BlackBerry IVY digital ecosystem. (Image: BlackBerry)

The system also ingests preloaded HERE Technologies’ data to provide tailored guidance, pricing and availability of vehicle charge stop locations, more efficient routes with carbon footprint estimates, and individual driver customizations. A third integration via Car IQ creates a “digital fingerprint” for the vehicle, allowing it to securely connect to a bank, card payment networks and in-vehicle marketplaces that allow the car to validate and autonomously pay for a wide range of frequently used services, including EV charging, tolls, parking, insurance, maintenance, and other payment and wallet capabilities.

The multi-OS, distributed architecture demonstration features BlackBerry IVY running on both Linux and QNX across gateway and digital instrument cluster domains, in collaboration with KPIT Technologies.

BlackBerry said IVY allows automakers to develop a better understanding of how vehicles are used and can be improved, while allowing software developers to create enhanced services and experiences for drivers. The company said, “Being able to see in real-time how the platform enables all this is incredibly powerful and a great catalyst to help those from across the transportation industry re-imagine all that is possible when you have the right tool.”

This article was originally published on Embedded.

Nitin Dahad is a correspondent for EE Times, EE Times Europe and also Editor-in-Chief of embedded.com. With 35 years in the electronics industry, he’s had many different roles: from engineer to journalist, and from entrepreneur to startup mentor and government advisor. He was part of the startup team that launched 32-bit microprocessor company ARC International in the US in the late 1990s and took it public, and co-founder of The Chilli, which influenced much of the tech startup scene in the early 2000s. He’s also worked with many of the big names—including National Semiconductor, GEC Plessey Semiconductors, Dialog Semiconductor and Marconi Instruments.

 
