RTI offers a way to leap to autonomous vehicles

Article by Brian Santo

Autonomous vehicles will have to analyze the data from multiple sensor systems in a split second, requiring expertise in real-time systems that is generally lacking. RTI has tools it says will make the job a whole lot easier and quicker.

Over the years, motor vehicles have accreted one electronic subsystem after another, most of them operating independently from the others. As auto manufacturers develop autonomous vehicles (AVs), they are going to need those electronics subsystems to work in concert. Unifying them is only part of the challenge, however. Whatever internal communications system is adopted to link them together must do nothing to impede the ability of the vehicle to use sensor data in real time. Communications must be secure. Ideally the tools to accomplish that should also be easy for designers to work with.

Real Time Innovations (RTI) just introduced what it believes is the only commercial solution that will accomplish all of that. The most recent version of RTI’s Connext is a set of tools that engineers can use to design onboard communications systems for vehicles. Connext 6 relies on the Data Distribution Service (DDS), a real-time middleware standard that RTI helped develop at the turn of the century. The associated tools in Connext 6 work at the data level, allowing designers to determine how AV electronic subsystems behave without having to know their inner workings.

Many of the automakers most enthusiastic about getting AVs to market sooner rather than later have been developing their own on-board communications systems. RTI believes that doing all that work in-house is a tough row to hoe, however, because a good solution requires a background with real-time systems. Vehicle makers tend to lack that expertise, said Bob Leigh, RTI’s senior market development director of autonomous systems.

AVs will have cameras, lidar, radar, other sensors, and GPS, and the vehicle will have to fuse that data in a way that makes sense, for example determining whether the multiple sensor systems are detecting the same car, and if not, how many they’re seeing. RTI refers to combining that data as “sensor fusion,” and in its estimation, sensor fusion is where the company can make the strongest case for its approach.
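As a toy illustration of the association problem at the heart of sensor fusion – deciding whether two sensors are detecting the same car or two different ones – here is a minimal sketch using greedy nearest-neighbor gating. The function names, the gating threshold, and the averaging rule are all illustrative assumptions, not RTI's algorithm:

```python
import math

# Hypothetical detections from two sensors, each an (x, y) position in
# meters relative to the vehicle. GATE_M is an illustrative threshold:
# detections closer together than this are treated as one object.
GATE_M = 2.0

def fuse(radar_hits, camera_hits, gate=GATE_M):
    """Greedy nearest-neighbor association: pair each radar hit with the
    closest unmatched camera hit inside the gate; anything left over is
    counted as a separate object."""
    unmatched = list(camera_hits)
    objects = []
    for r in radar_hits:
        best = min(unmatched, key=lambda c: math.dist(r, c), default=None)
        if best is not None and math.dist(r, best) < gate:
            unmatched.remove(best)
            # Fuse the matched pair into one averaged track
            objects.append(((r[0] + best[0]) / 2, (r[1] + best[1]) / 2))
        else:
            objects.append(r)
    return objects + unmatched

# One car seen by both radar and camera, plus one camera-only pedestrian:
tracks = fuse([(10.0, 0.0)], [(10.4, 0.3), (4.0, -2.0)])
print(len(tracks))  # -> 2 distinct objects, not 3
```

Real fusion stacks use far more sophisticated association (Kalman filtering, probabilistic data association), but the counting question is the same one Leigh raises.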

“We have data coming from different sensors, it’s coming at different data rates, it has different volumes, and then we add in command-and-control, and things like status,” Leigh said. “We have high-resolution data taking a lot of bandwidth, data that needs very low latency, and on the other end we have commands — when you send one, it has to be recognized, it has to be received and it needs acknowledgement.”

Getting all of that data where it needs to go, in real-time, is complex, he explained. “Sometimes you have algorithms that can accept data on a best-effort basis. If you drop a little data, it doesn’t matter, but other systems might use exactly the same data – that might be a control algorithm and it needs every bit of the data. You have these different permutations and combinations of what kind of data you have and how it is used.”

“In traditional embedded or networking systems you’d use multiple protocols,” Leigh continued. “What that means is a whole lot more overhead in developing your software. It’s also much more brittle; there might be interference between protocols. And you’d perhaps be using a ton more bandwidth than you need in an optimized system.”
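The best-effort-versus-reliable distinction Leigh describes can be modeled in a few lines. The sketch below is a toy in-process stand-in, loosely inspired by the DDS reliability and history QoS concepts; the class and method names are illustrative, not RTI Connext's API:

```python
from collections import deque

class Topic:
    """Toy model of a databus topic with per-topic delivery semantics."""
    def __init__(self, name, reliable, history_depth=8):
        self.name = name
        self.reliable = reliable
        # Best-effort: a bounded queue silently drops the oldest samples
        # on overflow. Reliable: every sample is kept until a reader takes it.
        self.queue = deque(maxlen=None if reliable else history_depth)

    def publish(self, sample):
        self.queue.append(sample)

    def take_all(self):
        samples = list(self.queue)
        self.queue.clear()
        return samples

# High-rate camera frames tolerate loss; brake commands do not.
camera = Topic("CameraFrames", reliable=False, history_depth=2)
brakes = Topic("BrakeCommand", reliable=True)

for i in range(5):
    camera.publish(f"frame-{i}")
    brakes.publish(f"cmd-{i}")

print(camera.take_all())  # only the newest frames survive
print(brakes.take_all())  # every command is delivered
```

The point of a DDS-style databus is that both behaviors ride the same transport and the same API; the difference is declared per topic rather than coded into separate protocols.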

Working with DDS and Connext, AV designers have a databus created specifically to move data, and a single applications programming interface (API) to access it, regardless of which electronics subsystem the data is coming from. DDS doesn’t examine packets and figure out what to do with them; in a DDS system, the data bus is purely a conduit.

“If I’m a developer and I need that data, I can get that data. I don’t have to worry about streams or bandwidth, I just specify how I want that data to behave,” Leigh said.
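“Specifying how the data should behave” maps to a real DDS mechanism: readers request a quality of service and writers offer one, and the middleware only connects pairs whose QoS is compatible. A minimal sketch of that requested-versus-offered check, with illustrative constant names:

```python
# DDS-style "requested vs. offered" QoS matching for reliability, ordered
# by strength. The constant and function names are illustrative.
BEST_EFFORT, RELIABLE = 0, 1

def qos_compatible(offered, requested):
    """A writer and reader match only if the writer offers at least the
    reliability the reader requests."""
    return offered >= requested

# A control algorithm that needs every sample can only match a reliable
# writer; a visualization dashboard accepts whatever arrives.
assert qos_compatible(offered=RELIABLE, requested=BEST_EFFORT)
assert qos_compatible(offered=RELIABLE, requested=RELIABLE)
assert not qos_compatible(offered=BEST_EFFORT, requested=RELIABLE)
```

The developer declares what the data should do; the middleware decides whether the declaration can be satisfied, instead of the application juggling streams and bandwidth itself.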

The data communications and subsequent processing need to happen in as close to real time as possible, because in AV systems timing is safety-critical.

“AI and autonomous vehicles have to make decisions in microseconds so that safety is assured, so that lives are saved,” Leigh said.

There are five levels of vehicular autonomy (six, if you count “none”). Manufacturers are commercializing vehicles at Level 3. It is likely to take several more years before Level 4 vehicles are ready for the road. Source: The Society of Automotive Engineers

The extent of vehicle autonomy is defined in a five-level hierarchy. Levels 1, 2, and 3 describe increasing amounts of driver-assist technology. What Levels 1 to 3 have in common is that a human driver retains ultimate control of the vehicle at all times. Level 4 is self-driving within a defined operating domain, with a human who may need to take the wheel outside it. Level 5 is self-driving with no human involvement at all. The steps from Level 1 to Level 3 were (relatively) short technological hops, but the transition from Level 3 to Level 4, while conceptually simple, represents an enormous technological leap. And while vehicles have been demonstrated with Level 4 and 5 capability, the technology is nowhere near advanced enough to be commercialized.

The industry is investing a lot in Level 4, Leigh said. There are no Level 4 vehicles commercialized today, “but the feeling is that in five years, maybe sooner, we might see something like robo-taxis at Level 4.”

“There is a step change in complexity going from Level 3 to Level 4. When you need the system to behave correctly more than 99.999 percent of the time – more than five nines – it has to handle every situation you can conceive of, and some you can’t conceive of. There’s a lot more software in there, a lot more engineering, a lot more testing. This is where our software comes into play.”
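The jump Leigh describes can be put in numbers. Each additional nine cuts the allowed failure fraction by a factor of ten, and the arithmetic below (illustrative, not RTI's figures) shows why five nines is a different world from two:

```python
def failures_per_million(nines):
    """Allowed failures per one million decisions at a given number of
    nines of correct behavior (exact for nines <= 6)."""
    return 1_000_000 // 10 ** nines

print(failures_per_million(2))  # 99%     -> 10000 failures per million
print(failures_per_million(5))  # 99.999% ->    10 failures per million
```

A thousand-fold tightening of the failure budget is what drives the extra software, engineering, and testing Leigh mentions.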

RTI also suggests that because its tools and databus approach are standards-based, they could become a common basis for AV certification.

The auto industry is pursuing programs that seem as if they might be solutions to the problem, but they address tangential concerns, RTI said. One such is AUTOSAR, an auto industry consortium that devises standards for application development for the microcontroller-based electronics systems used in vehicles. What AUTOSAR doesn’t do is address how those electronics systems will work together in a vehicle on the road.

Some auto manufacturers are also interested in adapting the Robot Operating System (ROS) for use in AVs. Despite the name, ROS is not an operating system. Similar to AUTOSAR, it is a standards-based approach to writing applications, but for robotics systems.

RTI says Connext 6 supports use of the DDS standard in both the AUTOSAR Adaptive Platform and ROS 2. The Connext databus integrates AUTOSAR Adaptive, ROS 2, and native DDS components for optimized end-to-end data sharing with little or no custom integration required, the company claims.

The company also says that Connext 6 supports all of the operating systems and processor families commonly used by OEMs and their suppliers.

The company claims it is involved in eight electric vehicle startups, eight passenger vehicle projects, and seven projects having to do with trucks, mining vehicles, and forklifts. Of course, the company is constrained from identifying most of its customers, though on its web site it lists Audi, Volkswagen, and NextDroid. That last company is a startup that has yet to announce exactly what it is doing; its web site mentions driverless cars and shows a picture of an underwater vehicle that is almost certainly a drone given its shape and size.

EDN editor-in-chief Brian Santo has been writing about science and technology for over 30 years, covering cable networks, broadband, wireless, the Internet of things, T&M, semiconductors, consumer electronics, and more.
