Smart sensors to improve ADAS designs

Article By : Robert Schweiger

By preprocessing sensor data, smart sensors alleviate the power and data rate drain that occurs when all sensor data must be processed by a single, centralized computer.

Sensor systems are becoming ubiquitous in cars, with level three autonomous vehicles already in production and level four vehicles right around the corner. In addition, starting in 2018, all new cars sold in the United States, regardless of their level of autonomy, must have rear-view cameras installed. The demand for sensors is skyrocketing and will continue to do so.

With this demand comes the need for more sophisticated advanced driver-assistance systems (ADAS), with sensors, processors, and central sensor fusion units to interpret the vast amount of sensor data now being collected.


Figure 1: ADAS applications must sense, analyze, and act upon sensor data. (Source: Cadence)

Figure 1 shows the different kinds of sensors required for level three to level five automated driving. This sensor data is fused in a central sensor fusion platform that also combines V2X (vehicle-to-everything) communication, GPS, and digital maps for localization. After the sensor data is compiled, the system must decide how to react to it: by braking, steering, or accelerating. And all of this processing must, of course, happen in real time, as close to instantaneous as possible.

System Configuration

There are two schools of thought on how this system should be configured: as a raw data sensor fusion platform or as a hybrid sensor fusion platform.

The raw data sensor fusion platform places a powerful computer at the center, where all the raw data from the sensors is processed. This requires high-speed data links to carry the data from the sensors to the central sensor fusion CPU or GPU. Depending on the car’s sensor configuration, the performance of the central sensor fusion computer could easily exceed 100 TFLOPS to enable L3+ automated driving; the NVIDIA DRIVE PX Pegasus, for example, provides 320 TOPS. The centralized computer interprets the data and decides how to react. A drawback of this approach is that the sensor fusion computer consumes a great deal of power for this data processing and interpretation. Systems that follow this concept draw as much as 500W, and sometimes more, thus requiring an active cooling system such as water cooling.

The hybrid sensor fusion platform allows some of the processing to occur in the sensors themselves, making them “smart” sensors. Pre-processing in those sensors for object detection or classification, for example, allows the central sensor fusion platform to be more lightweight. It also allows redundant AI processes to run at the smart sensor and at the central computing unit so that their conclusions can be cross-checked. Another advantage of transmitting only object-level data to the central fusion unit is that pre-processed data requires a significantly lower data rate at the sensor interface. Smart sensor processing may require a compute performance of 1-4 TMACs, depending on the software algorithms being used.
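A quick back-of-the-envelope comparison makes the bandwidth argument concrete. The raw link rates below are the figures cited in this article; the object-list parameters (update rate, object count, bytes per object) are illustrative assumptions, not values from any specific system.

```python
# Rough bandwidth comparison: raw sensor data vs. pre-processed object lists.
# Raw rates are the figures cited in this article; the object-list
# parameters below are assumptions chosen only for illustration.
RAW_RATES_GBPS = {
    "HD camera (raw video)": 24,        # up to 24 Gb/s
    "FMCW MIMO imaging radar": 28,      # ~28 Gb/s
    "digitally modulated radar": 120,   # up to 120 Gb/s
}

UPDATES_PER_SECOND = 30   # assumed sensor update rate
OBJECTS_PER_UPDATE = 32   # assumed worst-case object count
BYTES_PER_OBJECT = 16     # assumed: class, position, velocity, confidence

object_rate_kbps = UPDATES_PER_SECOND * OBJECTS_PER_UPDATE * BYTES_PER_OBJECT * 8 / 1e3

for sensor, raw_gbps in RAW_RATES_GBPS.items():
    reduction = raw_gbps * 1e6 / object_rate_kbps
    print(f"{sensor}: {raw_gbps} Gb/s raw vs. "
          f"~{object_rate_kbps:.0f} kb/s object-level ({reduction:,.0f}x lower)")
```

Even with generous assumptions for the object list, the gap is five to six orders of magnitude, which is why object-level data fits on modest automotive links while raw data demands multi-gigabit interfaces.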

Sensor Specifics

As shown in Figure 1, some of the sensors used in automobile ADAS include cameras, radar, and lidar. Each sensor system has its advantages and disadvantages. Using multiple sensor technologies improves the safety level of the car and at the same time can relax the safety requirement for each individual sensor.

Cameras are the only sensors that actually “see”. They can recognize texture, detect traffic signs, perform object detection, and inexpensively build a 3D map of the area surrounding the car. For higher levels of automated driving at speeds up to 130km/h, HD cameras with higher resolution and higher frame rates are required to recognize, for example, lane markers at distances up to 500m in daylight.

A significant downside of camera sensors is that poor visibility conditions such as bad weather, low light, and glare reduce the camera’s efficacy. Also, to avoid degrading the original image, typically little or no video compression can be applied. Cameras in safety-critical ADAS applications therefore require a high-speed interface, up to 24Gb/second, to transmit raw data to the central sensor fusion unit. As an alternative, it can be very effective to pre-process the data, turning the camera into a “smart” camera that deploys on-device neural networks for object identification and classification. Instead of the raw data, only object-level data is transmitted to the central sensor fusion unit and processed there. Pre-processing the data on the camera with a DSP, such as the Cadence® Tensilica® Vision Q6 DSP for embedded vision and AI, can alleviate this problem.

To further improve the safety level of the camera system, redundant flows transmitting both raw data and object-level data to the sensor fusion unit can be implemented. This way, object detection can be performed at the camera and again at the sensor fusion unit, using different algorithms, for example.
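One way to exploit that redundancy is to match the object list coming from the smart camera against the detections computed independently at the fusion unit, for example by bounding-box overlap. The sketch below uses hypothetical data structures and an arbitrary overlap threshold purely to illustrate the idea; it does not describe any particular product’s implementation.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Hypothetical object-level record a smart camera might transmit."""
    label: str
    x: float          # bounding box in normalized image coordinates
    y: float
    w: float
    h: float
    confidence: float

def iou(a: Detection, b: Detection) -> float:
    """Intersection-over-union of two bounding boxes."""
    x1, y1 = max(a.x, b.x), max(a.y, b.y)
    x2, y2 = min(a.x + a.w, b.x + b.w), min(a.y + a.h, b.y + b.h)
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = a.w * a.h + b.w * b.h - inter
    return inter / union if union > 0 else 0.0

def cross_check(camera: list[Detection], fusion: list[Detection],
                min_iou: float = 0.5) -> list[tuple[Detection, Detection]]:
    """Pair up detections that both algorithms agree on; unmatched
    detections would be flagged for further scrutiny."""
    return [(c, f) for c in camera for f in fusion
            if c.label == f.label and iou(c, f) >= min_iou]
```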

Radar is very robust in every weather condition and in difficult light conditions, and it has good range, but it has some disadvantages in terms of angular and range resolution. With an angular resolution of 1.2° and a range accuracy of about 10cm, there is little room for error: 10cm can be the difference between a near miss and a catastrophe.

A lot of recent innovation has gone into improving resolution through the development of 77-81GHz imaging radar systems. However, the higher the resolution of the radar sensor, the higher the data rate that must be transmitted. Moving from frequency-modulated MIMO radars (28Gb/second) to digitally modulated radars offers even higher resolution, but the raw data rate could reach as much as 120Gb/second. That would require an automotive interface far beyond the data rate of a MIPI CSI-2 interface, for example. The solution is to turn the radar sensor into a “smart” radar sensor with on-device signal processing for range and velocity FFTs, digital beamforming, angular estimation, and so on, reducing the data rate for the object list to around 100kb/second, which can easily be transmitted via the controller area network (CAN) interface. In addition, with the higher radar resolution, neural networks can be applied for on-device object detection and classification.
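The on-device signal processing mentioned above typically begins with range and velocity (Doppler) FFTs over the raw FMCW chirp data. The NumPy sketch below uses made-up cube dimensions and random samples to show that first stage; a real radar DSP pipeline would add calibration, CFAR detection, digital beamforming, and angle estimation on top.

```python
import numpy as np

# Hypothetical FMCW radar data cube: chirps x samples-per-chirp x RX antennas.
# The dimensions and random samples are placeholders for real ADC data.
num_chirps, num_samples, num_rx = 128, 256, 4
cube = (np.random.randn(num_chirps, num_samples, num_rx)
        + 1j * np.random.randn(num_chirps, num_samples, num_rx)).astype(np.complex64)

# Range FFT: resolve distance along each chirp (fast time), with windowing.
range_fft = np.fft.fft(cube * np.hanning(num_samples)[None, :, None], axis=1)

# Doppler FFT: resolve radial velocity across chirps (slow time).
doppler_fft = np.fft.fftshift(
    np.fft.fft(range_fft * np.hanning(num_chirps)[:, None, None], axis=0), axes=0)

# Non-coherent integration across RX antennas yields a range-Doppler map;
# a crude global threshold stands in for a proper CFAR detector here.
rd_map = np.abs(doppler_fft).sum(axis=2)
cells = np.argwhere(rd_map > rd_map.mean() + 4 * rd_map.std())
print(f"{len(cells)} range-Doppler cells above threshold")
```

The point is that only the handful of detected cells, and ultimately the resulting object list, needs to leave the sensor rather than the full data cube.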

IMEC has just announced the world’s first CMOS 140GHz radar-on-chip system with integrated antennas in standard 28nm technology, which will further improve the performance and reduce the form factor of the radar system.

That said, applying all these techniques causes a huge increase in processing requirements, which must be met within a power budget of less than 6W at the radar sensor. This can only be achieved by leveraging very efficient low-power DSPs (like Tensilica ConnX DSPs) instead of using power-hungry CPUs/GPUs in the central computing unit for raw sensor data processing.

Lidar offers the best of both worlds: it can capture an effective 3D map of the area surrounding the car with superior resolution. The angular resolution of a lidar system is 0.1°, and the range accuracy is better than radar’s, at about 5cm or less. There are two popular concepts for solid-state lidar technology: flash lidar sensors and pulsed time-of-flight lidar sensors using micro-electro-mechanical systems (MEMS). Current lidar systems require a relatively low transfer data rate (hundreds of Mb/second) compared to high-resolution imaging radar systems. However, future solid-state lidar systems will produce a much higher raw data rate of more than 1Gb/second, which may call for on-device processing.
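For a rough sense of where those lidar figures come from, the arithmetic below estimates the raw point-cloud data rate from assumed parameters; the channel count, point rate, and bytes per point are illustrative, not the specification of any particular sensor.

```python
# Back-of-the-envelope lidar raw data rate from assumed parameters.
POINTS_PER_SECOND = 2_400_000  # assumed: e.g. 128 channels x ~18,750 points/s each
BYTES_PER_POINT = 12           # assumed: range, azimuth, elevation, intensity, timestamp

raw_rate_mbps = POINTS_PER_SECOND * BYTES_PER_POINT * 8 / 1e6
print(f"~{raw_rate_mbps:.0f} Mb/s of raw point-cloud data")  # roughly 230 Mb/s

# Several times this point rate, as promised by higher-resolution solid-state
# designs, pushes the raw rate past 1 Gb/s, which is when on-device processing
# starts to pay off.
```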

The drawback is that lidar is a very expensive technology, currently requiring a bulky mechanically scanning sensor mounted on top of the automobile, and it has its challenges with range and with bad weather conditions such as fog. However, different types of solid-state lidar sensors are currently in development. These will bring the cost down significantly, so the four solid-state lidar sensors required per automobile for 360° coverage will not be cost-prohibitive.

The raw sensor data rates in Table 1 depend on the architecture, methods, and resolution of the sensor; they are intended only as guidance for current and upcoming higher-resolution sensors.


Table 1: Transfer data rate required for each sensor.

Ultrasonic sensors measure the distance to nearby objects and can warn the driver of surrounding obstacles. They are mainly used for park assist, blind-spot detection, maneuvering in tight spaces, and other applications. Cars will be equipped with up to 12 ultrasonic sensors to fully capture the vehicle’s surroundings. Their processing requirements and data rates, however, are rather low.

Conclusion

Regardless of whether a hybrid sensor fusion or raw data sensor fusion architecture is used, offloading the number-crunching tasks from the CPU or GPU is critical in ADAS applications. Pre-processing sensor data with DSPs, thus turning the sensor (camera, radar, or lidar) into a smart sensor, enables the same functionality while alleviating the power and data rate drain that occurs when all the data processing must be handled by a single, centralized computer. Especially in electric vehicles, achieving low power consumption is of the utmost importance to enable automated driving.

— Robert Schweiger is director of automotive solutions at Cadence.
