Sensor fusion rising: Mobileye/ST tout autonomous car platform

Article By : Junko Yoshida

Mobileye and STMicroelectronics are co-developing the next generation of Mobileye's SoC to act as the central computer performing sensor fusion for Fully Autonomous Driving vehicles starting in 2020.

Right after NXP unveiled Bluebox, an autonomous car engine, Mobileye and STMicroelectronics were quick to reveal a new generation of their vision SoC, EyeQ5. They are touting it as a "sensor fusion central computer for autonomous vehicles."

Unlike Bluebox, however, EyeQ5 is an SoC that won't be ready for another two years, according to ST/Mobileye. It will be manufactured using a 10nm-or-below FinFET technology node.

Two diverging approaches

NXP and ST/Mobileye, two competing teams, are taking different approaches to seal deals with OEMs in the autonomous vehicles platform battle.
On one hand, NXP is promoting not only the Bluebox engine (consisting of a number cruncher and a safety/vision controller), but also a comprehensive autonomous vehicles platform with sensor fusion capabilities and decision-making functions.

NXP is leveraging its strong position in the ADAS processor market. Matt Johnson, NXP's VP and GM for automotive MCUs and processors, said his company has shipped more than 30 million ADAS processors worldwide, with eight of the world's 10 largest carmakers using its processors.
On the other hand, the ST/Mobileye team is angling to enter the sensor fusion market for the first time. Mobileye has long appeared convinced that vision alone is enough to enable autonomous driving.

Egil Juliussen, director of research, infotainment & ADAS at IHS Automotive, said, "I see that they are changing their tune a little [in the EyeQ5 announcement]. I suspect, under pressure from car OEMs, Mobileye is now adding other sensory data to do sensor fusion on the chip."

Clearly, the ST/Mobileye team hopes to take advantage of the EyeQ chips' dominant share of the automotive vision SoC market.

Earlier this year, Mobileye co-founder and CTO Amnon Shashua said one-third of the global car industry is already using EyeQ chips. He told the audience at a Mobileye press conference that Toyota and Daimler are the only two automakers not using Mobileye's vision chips.

Pre-emptive strike or Hail Mary?

The ST/Mobileye's EyeQ5 announcement is seen by many in the automotive industry as a pre-emptive strike against NXP's Bluebox.
IHS's Juliussen said, "It looks like they're telegraphing to automotive OEMs and tier ones that they should wait until EyeQ5 comes out." Jim McGregor, principal analyst at TIRIAS Research, went a step further. "I'd call the ST/Mobileye move more like a Hail Mary."
Asked to compare various autonomous vehicles platforms pitched on the market today, McGregor said, "I see that Nvidia's Drive PX2 is the high-end supercomputer capable of local deep learning. The BlueBox is a practical control and sensor fusion platform that is automotive compliant and fits the needs of current platforms, which NXP will evolve over time." Meanwhile, "The EyeQ5 is somewhere in between."

McGregor added, "Note that the cost probably falls within those lines as well. Each solution has its strengths. Nvidia has the high-end performance and deep learning expertise; Mobileye has the database for an all-in-one solution; and NXP has a full sensor fusion platform that is integrated with all the vehicle systems. All of these platforms will evolve over the next few years, and there is room for all three vendors."
In short, the autonomous vehicles platform battle has only begun. "It will be interesting to see who wins and where," said McGregor.
By all accounts, though, EyeQ5 comes with an impressive list of technical features and performance.

Setting aside the fact that the chip is still two years down the road, Juliussen noted, "This is a very powerful, impressive chip." He added, "Mobileye has good credibility. I'm sure the company will deliver."

So what's inside the chip?
Figure 1: While NXP is promoting the Bluebox engine, ST and Mobileye are aiming to enter the sensor fusion market; Mobileye has long been convinced that vision alone is enough to enable autonomous driving. (Source: Mobileye/STMicroelectronics)

The EyeQ5 will contain eight multithreaded CPU cores coupled with 18 cores of Mobileye's next-generation, well-proven vision processors, explained Marco Monti, ST's EVP, Automotive and Discrete Group. In contrast, EyeQ4, the previous-generation vision SoC, had four CPU cores and six Vector Microcode Processors.

Why go FinFET?

Monti explained that this level of complexity has prompted ST to use the most advanced technology node available, 10nm or below FinFET. This is a huge shift from previous generations of EyeQ chips, which are manufactured using ST's FD-SOI process technology. Asked about a foundry for EyeQ5 chips, ST declined to disclose either the exact node or the foundry partner.

Undoubtedly, sensor fusion "is a big goal of EyeQ5," said Monti. The chip is able to drive 20 external sensors (camera, radar or lidar), he added, compared with the eight cameras of EyeQ4.

All of the sensory data will be managed by the EyeQ5 system. "The architecture defined for EyeQ5 will make it possible to integrate signals coming from different sources independent of the origin of the sources (camera, radar or lidar in any combination)," Monti explained.
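Monti's source-agnostic architecture can be illustrated with a minimal fusion sketch: once each sensor's reading is normalized into a common measurement type, the fuser treats camera, radar and lidar inputs identically. The names and the inverse-variance weighting below are illustrative assumptions, not Mobileye's actual design.

```python
from dataclasses import dataclass

@dataclass
class Measurement:
    sensor: str        # "camera", "radar" or "lidar" -- origin is irrelevant to the fuser
    position: float    # 1-D range estimate to a target, in metres
    variance: float    # sensor-specific measurement uncertainty

def fuse(measurements):
    """Inverse-variance weighted fusion: every input is combined the same way,
    regardless of which sensor produced it."""
    weights = [1.0 / m.variance for m in measurements]
    total = sum(weights)
    position = sum(w * m.position for w, m in zip(weights, measurements)) / total
    return position, 1.0 / total   # fused estimate and its (reduced) variance

# Any combination of sources can be fed in, matching the "in any combination" claim.
readings = [
    Measurement("camera", 49.0, 4.0),
    Measurement("radar", 50.5, 1.0),
    Measurement("lidar", 50.1, 0.25),
]
pos, var = fuse(readings)
```

Note how the fused variance is smaller than that of the best single sensor, which is the basic argument for fusing heterogeneous inputs rather than relying on one modality.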
The ST/Mobileye team is calling the performance of EyeQ5 "unprecedented."
EyeQ5 performs 12 Tera operations per second (80 times more than an available reference graphics accelerator) and it will keep power under 5W, according to Monti. In comparison, EyeQ4 is capable of 2.2 Tera operations per second.
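The figures quoted above imply roughly a 5.5x generational jump and 2.4 tera-operations per watt, a quick check of which is:

```python
# Performance figures as quoted in the article.
eyeq5_tops, eyeq4_tops = 12.0, 2.2   # tera-operations per second
eyeq5_power_w = 5.0                  # stated power envelope, in watts

gen_speedup = eyeq5_tops / eyeq4_tops     # ~5.5x over EyeQ4
efficiency = eyeq5_tops / eyeq5_power_w   # 2.4 TOPS per watt
```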

Juliussen cited other notable features in the EyeQ5, including built-in security defences based on the integrated Hardware Security Module. This enables system integrators to support over-the-air software updates, secure in-vehicle communication and more. "The root of trust is created based on a secure boot from an encrypted storage device," according to the ST/Mobileye announcement.

The launch of EyeQ5 does not mean that other EyeQ chips will go away, according to Monti.

"The EyeQ5 will target high-end applications for autonomous driving and will co-exist with the EyeQ4," he said. The EyeQ4 will "continue to target mid-range applications for assisted driving (autonomous driving level 2) that will remain in the market."
In the ST-Mobileye partnership, Mobileye is responsible for delivering the algorithms and software to the carmakers. ST handles the IC design and engineering, automotive qualification and mass production. This is no different from the past, said Monti. "Due to the criticality of the target application (full level 4 autonomous driving), ST is responsible to provide a full ASIL-D IC to match the most stringent automotive quality requirements," he added.

NXP opts for open platform

Meanwhile, NXP's Johnson explained that the company's autonomous vehicles platform is designed to work with all the NXP silicon that processes sensory data. Rather than providing OEMs and tier ones with isolated sensor solutions, "We tied them all together. We are showing them how they all work together on this platform." He said, "It would be a little hypocritical of us, I think, if we just give them sensor chips and tell them you figure it out."

Asked to compare EyeQ5 with Bluebox, NXP's automotive team observed that "Bluebox is part of a larger platform that includes V2X, lidar/radar/vision and others. It's not a point solution." In short, what NXP is demonstrating this week at NXP FTF is "a full platform that is here and now." Bluebox does include ASIL safety and security features. "Bluebox is more than a sensor fusion solution because it also includes the 'decision engine' piece," an NXP spokesman added.
Most important, NXP's platform, not Bluebox but its platform, "can support non-NXP sensors," he added. "Ours is an open platform."
