Nvidia rocks CES with auto-pilot, co-pilot scenarios

Article By : Junko Yoshida

Nvidia comes to CES to demonstrate how far the graphics chip company has come in advancing the development of AI-enabled autonomous cars.

It's hard to imagine a chip company meeting with much enthusiasm from attendees at one of the world's biggest trade shows. Except for Nvidia, that is.

Last Wednesday night, Nvidia co-founder and CEO Jen-Hsun Huang, in his familiar black leather jacket, rocked a whooping and cheering crowd of 2,500 during his opening keynote speech at the Consumer Electronics Show. As Huang talked about revolutionising gaming, entertainment and the transportation of tomorrow with the power of the company’s GPU and AI technologies, the audience was ready and eager to believe his every word.

Huang’s star power aside, Nvidia came to CES to demonstrate how far the graphics chip company has come in advancing the development of AI-enabled autonomous cars.

Intel is trying to catch up by rolling out its own development platforms and its alliance with Mobileye and BMW. But the processor giant's automotive future still depends on executing and integrating the many automotive technologies Intel has amassed over the last 12 months.

During Huang's keynote, Nvidia showed a glimpse of Xavier (originally unveiled last fall), described by the company as an “AI car supercomputer.”

__Figure 1:__ *Nvidia CEO Jen-Hsun Huang during his CES keynote speech.*

While showing off a small board a little larger than the palm of his hand, Huang said that Xavier can process 30 trillion operations per second at 30 watts. It integrates an 8-core custom ARM64 CPU and a 512-core Volta GPU. The Nvidia CEO also noted during his speech that the Xavier chip itself is ASIL C, but its module will be designed for ASIL D safety functionality.

Built on top of the hardware is a new OS called DriveWorks, explained Huang. The OS helps fuse data coming from different sensors with location information from HD mapping.

Nvidia showed a video clip of its BB8 autonomous vehicle zipping around streets in California.

Auto-Pilot and Co-Pilot

The car, Nvidia explained, understands and interacts with the driver in natural spoken language. It negotiates stop lights, stop signs and intersections, while making its way to the freeway before passing control back to the driver when requested.

Layers on top of the DriveWorks OS include Auto-Pilot DNN and Co-Pilot DNN. While discussing the BB8 autonomous car’s perception capability, Huang said, “We are using two types of AI. One is AI driving you as auto-pilot, and another is AI looking out for you as a co-pilot.”

In addition to Xavier, Nvidia announced a number of newly formed partnerships, including deals with Audi, Bosch and ZF. Huang also laid out the company's HD mapping strategy involving China's Baidu, TomTom, Japan's Zenrin and HERE.

Nvidia and Audi have been working together for a decade. Huang reminded the audience that Audi was the first car company to come to CES seven years ago, when it announced that it was using Nvidia technology. To celebrate their long-standing partnership, Huang brought Audi of America president Scott Keogh to the stage.

__Figure 2:__ *Audi of America president Scott Keogh (left) and Nvidia’s CEO Huang.*

Under the new partnership, the two companies’ focus will be on putting “advanced AI cars on the road starting in 2020,” said Huang.

The first phase of the collaboration will focus on Nvidia’s Drive PX AI platform for self-driving cars. Future Audi car models will use deep learning to tackle the complexities of driving.

Audi’s Keogh, on stage, talked up the autonomous car the company has brought to Las Vegas. “When you test drive our Q7 in the Gold Lot, it’s driving by itself,” he said. The Q7 learned how to drive in just three days through AI, he added.

Audi believes this collaboration will deliver Level 4 automation in just three years.

This article first appeared on EE Times U.S.
