Sensor electronics, combined with better and more refined software algorithms, will ultimately enable a safe, fully autonomous vehicle within the next ten years.
Vision image sensors
Although 2D LIDAR is often integrated with a vision sensor, this section looks at the vision sensor on its own.
Mobileye's technology portfolio is extensive.
Its Mobileye Shield+TM is an advanced collision avoidance system, and Mobileye vision technology is also used by Tesla Motors. The system supports the targets of the Vision Zero Initiative, a multi-national road traffic safety project begun in Sweden in October 1997, whose goal is a highway system with no fatalities or serious injuries in road traffic. The system alerts drivers only when a collision with vulnerable road users is imminent, not with inanimate objects.
This collision avoidance system provides a range of lifesaving driver-alert features.
Their Mobileye 5 system includes Bluetooth and delivers real-time audio-visual warnings to the driver in critical situations through their smartphone application.
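At the heart of any forward-collision warning is a time-to-collision (TTC) estimate: range to the object divided by closing speed. The sketch below is illustrative only, not Mobileye's actual algorithm; the 2.7 s warning threshold is an assumed, typical value.

```python
def time_to_collision(range_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact; returns infinity when the gap is not closing."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return range_m / closing_speed_mps

def should_warn(range_m: float, closing_speed_mps: float,
                ttc_threshold_s: float = 2.7) -> bool:
    """Raise an audio-visual alert when TTC drops below the threshold."""
    return time_to_collision(range_m, closing_speed_mps) < ttc_threshold_s
```

For example, a pedestrian 20 m ahead with a 10 m/s closing speed gives a TTC of 2 s and triggers a warning, while the same pedestrian at 50 m does not.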
Georgia Institute of Technology
Georgia Tech has proposed an innovative 3D system-package architecture that accounts for electrical, mechanical and thermal design, as well as new digital, RF, sensor, millimetre-wave radar and power technologies. The proposal combines large panel-based, ultra-thin glass packaging with innovations in electrical, thermal and mechanical design, materials, processes, wiring lithography, fine-pitch and high-throughput assembly, and highly conductive through-copper vias for signal, power and heat transfer.
The development of collision-avoidance sensors to enhance autonomous vehicle technology will require more advances for a new era in connectivity to infrastructure, as shown in Figure 5.
__Figure 5:__ *Wireless connectivity to surrounding infrastructure is one key element needed to bring autonomous vehicle technology to a safe level. (Source: Reference 5)*
The latest advances in vehicle sensors and communication technologies have improved driver visibility and awareness around the vehicle, and have enabled features such as park assistance, adaptive cruise control, lane-keep assistance, traffic-sign recognition and pedestrian detection, as shown in Figure 6.
__Figure 6:__ *ADAS enhancements from new sensor technologies. (Source: Reference 5)*
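Adaptive cruise control, one of the features above, reduces to a spacing policy: hold a time-headway gap to the lead vehicle while never exceeding the driver's set speed. The following is a toy sketch under assumed gains, headway and comfort limits, not any production controller.

```python
def acc_command(ego_speed: float, lead_speed: float, gap_m: float,
                set_speed: float, time_headway_s: float = 1.8,
                kp_gap: float = 0.2, kp_speed: float = 0.5) -> float:
    """Return an acceleration command in m/s^2, clipped to comfort limits."""
    # Desired spacing grows with speed, plus a small standstill margin
    desired_gap = time_headway_s * ego_speed + 2.0
    # Blend a gap-keeping term with a speed-matching term (simple linear policy)
    accel = kp_gap * (gap_m - desired_gap) + kp_speed * (lead_speed - ego_speed)
    # Never accelerate beyond the driver's set speed
    if ego_speed >= set_speed and accel > 0.0:
        accel = 0.0
    # Clip to assumed comfort limits: -3 m/s^2 braking, +1.5 m/s^2 acceleration
    return max(-3.0, min(1.5, accel))
```

With a large gap and a slow ego vehicle the controller commands its maximum comfortable acceleration; with a short gap and a slower lead vehicle it commands maximum comfortable braking.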
Automotive electronic systems need to perform dozens of functions, including wireless communications; wireless sensing; stereo cameras; mm-wave electronics; high-bandwidth electronics or photonics for data processing, with security, for autonomous driving; and high-power, high-temperature electronics for all-electric vehicles. Integrating all of these into one or more packages goes beyond Moore's Law, with its on-chip transistor integration, and beyond More than Moore (MtM), with its stacked-package integration or system-in-package (SiP). This is System Moore (SM): complete system integration, leading to an "A-SOP" (Automotive System-on-Package) with a huge and growing market. See Figure 7.
__Figure 7:__ *Georgia Tech proposes a consortium for the automotive electronics industry. (Source: Reference 5)*
Viewing humans around the vehicle
__Figure 8:__ *The complex roles of humans need to be examined by developers of highly automated and self-driving vehicles. Autonomous vehicles must observe, understand, model, infer, and predict the behaviour of occupants inside the vehicle cabin, pedestrians around the vehicle, and other humans in the vicinity of the vehicle. (Source: Reference 6)*
Approaching a scene
Imaging research has progressed greatly in studies of how a vehicle should approach and negotiate the space around a scene. These studies and experiments will help prevent autonomous vehicle accidents and loss of life as sensor systems evolve along with algorithm maturity.
Much remains to be learned about techniques that capture context in a holistic manner; handle unusual and unpredictable scenarios and objects; perform fine-grained analysis of short-term and long-term activity of observed agents (both humans and vehicles); and master the art and science of predicting activity events and making decisions while surrounded by highly unpredictable humans and vehicles.
__Figure 9:__ *Here is a sample image-to-control policy pipeline (mediated-semantic perception) with deep neural networks (DNN), where initial prediction of semantic scene elements is followed by a control policy algorithm. (Source: Reference 6)*
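The mediated-semantic approach of Figure 9 separates perception from control: a DNN first predicts labelled scene elements, and a control policy then maps those elements to driving commands. The sketch below shows only the pipeline's structure; the canned detections, labels and thresholds are illustrative stand-ins for a real perception network and policy.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SceneElement:
    label: str         # e.g. "pedestrian", "vehicle", "lane_marking"
    distance_m: float  # estimated range from the ego vehicle
    bearing_deg: float # estimated direction relative to heading

def perceive(image) -> List[SceneElement]:
    """Stage 1 (stand-in for a DNN): predict semantic scene elements.
    A real system would run the image through a trained network; here a
    placeholder returns canned detections for illustration."""
    return [SceneElement("pedestrian", 12.0, -5.0),
            SceneElement("vehicle", 40.0, 0.0)]

def control_policy(elements: List[SceneElement]) -> dict:
    """Stage 2: map the semantic scene to throttle/brake/steer commands."""
    nearest: Optional[SceneElement] = min(
        elements, key=lambda e: e.distance_m, default=None)
    if nearest and nearest.label == "pedestrian" and nearest.distance_m < 15.0:
        return {"throttle": 0.0, "brake": 1.0, "steer_deg": 0.0}
    return {"throttle": 0.3, "brake": 0.0, "steer_deg": 0.0}

# The mediated pipeline: image -> semantic elements -> control command
command = control_policy(perceive(image=None))
```

The design choice illustrated here is the interface itself: because the policy consumes labelled elements rather than raw pixels, each stage can be trained, tested and debugged independently.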
Sound detection and analysis
Major advances are being made in microphone development and sound quality. In my opinion, another sensor source in autonomous vehicles must be sound. This aural input, alongside all the visual sensors, will add another layer of information to the processors and algorithms, providing critical decision-making capability that helps ensure the safety of autonomous vehicles on the road.
Humans react to car horns (should autonomous vehicles include them as an added layer of protection, to alert other drivers, pedestrians and vehicles?) and to sirens from ambulances, fire trucks and police vehicles. I believe we need to add hearing to the autonomous vehicle to complete the emulation of a human driver.
Microphones are advancing rapidly in sound quality, robust MEMS designs that can survive the harsh automotive environment, noise reduction, low-power operation and dynamic range.
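As a first illustration of what such hearing could look like, a crude siren detector can check whether the dominant spectral component of an audio frame falls within a typical siren band. This is a minimal NumPy sketch, not a production detector; the 500-1800 Hz band and the 960 Hz test tone are illustrative assumptions.

```python
import numpy as np

def dominant_frequency(samples: np.ndarray, sample_rate: int) -> float:
    """Return the strongest frequency component (Hz) of a mono audio frame."""
    # Hann window reduces spectral leakage before the FFT
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return float(freqs[np.argmax(spectrum)])

def looks_like_siren(samples: np.ndarray, sample_rate: int,
                     band_hz=(500.0, 1800.0)) -> bool:
    """Flag frames whose dominant tone sits in an assumed siren band."""
    f = dominant_frequency(samples, sample_rate)
    return band_hz[0] <= f <= band_hz[1]

# Quick check with a synthesized 960 Hz tone (a common siren pitch)
sr = 16000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 960.0 * t)
```

A real system would also track the characteristic up-and-down frequency sweep of a siren over successive frames, rather than classifying a single frame in isolation.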
Suffice it to say that we have not yet reached the point of approving road use of a fully autonomous vehicle with no human intervention. Much work remains, but the science and engineering of sensors and maturing algorithms are rapidly developing to predict the random behaviour of humans and vehicles and to react quickly enough to avoid damage to vehicles and property, and above all, to protect precious human life.