Smart technology must be designed and employed in as safe a manner as possible.
In March, a woman crossing Mill Avenue in Tempe, Arizona, where I live, was struck and killed by a self-driving Uber test vehicle. As a tech journalist, I am fairly aware of what goes on around me, but I never think about autonomous vehicles roaming Arizona's streets when I cross one here, and neither did that woman. Maybe the following idea, along with vehicle-to-everything (V2X) technology, could have helped.
Pedestrian path prediction with a dynamic Bayesian network, one that integrates cues such as pedestrian head orientation, situational awareness, situation criticality, and spatial layout, combined with V2X technology, might have saved a life in Arizona. V2X can help identify pedestrian movement outside a driver's line of sight, such as in an area with many parked cars, SUVs, and trucks.
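To make the idea concrete, here is a hypothetical toy, not the published model: one step of such a filter can be sketched as a Bayesian update of a "crossing intent" belief from a head-orientation cue, with that belief then weighting a short-horizon position prediction. All probabilities below are illustrative assumptions.

```python
def update_intent(prior_cross, head_toward_road,
                  p_head_given_cross=0.8, p_head_given_stop=0.3):
    """One Bayes update of P(pedestrian will cross) from a single
    head-orientation observation (illustrative likelihoods)."""
    if head_toward_road:
        lik_cross, lik_stop = p_head_given_cross, p_head_given_stop
    else:
        lik_cross, lik_stop = 1.0 - p_head_given_cross, 1.0 - p_head_given_stop
    num = lik_cross * prior_cross
    return num / (num + lik_stop * (1.0 - prior_cross))

def predict_position(x, v, p_cross, horizon_s):
    """Expected lateral position after horizon_s seconds: the pedestrian
    advances at walking speed v only in the 'crossing' hypothesis."""
    return x + p_cross * v * horizon_s

# A head turned toward the roadway raises the crossing belief from 0.5,
# which in turn pushes the predicted position further into the lane.
belief = update_intent(0.5, head_toward_road=True)
x_future = predict_position(x=0.0, v=1.4, p_cross=belief, horizon_s=1.0)
```

A real dynamic Bayesian network would carry this belief across time steps and fuse several cues (criticality, spatial layout, V2X reports) rather than a single observation, but the update-then-predict cycle is the same.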
As a technologist and electronics engineer by trade, I want to see smart technology succeed in every aspect of our lives where it makes sense for the betterment of society. The caveat is that the technology must be employed as safely as we can make it.
Approximately 1.25 million people die each year as a result of road traffic crashes, according to the World Health Organization (WHO) (Reference 2). There is no doubt that people driving on today's roads are more distracted than ever (by smartphones, tablets, GPS-guided map displays on our dashboards, and so on). This is dangerous and deadly. Autonomous vehicles can eliminate distracted driving by a human driver, but it is humans, and only humans, who will ultimately create and control the future autonomous vehicle through software, processing, and sensors. We're engineers; we can do this safely.
Testing in a smart-city environment is a far more controlled, and therefore safer, way to perform autonomous vehicle testing. Let's look at some possibilities that we can expand upon and develop as safety-system processing, software, and sensor technologies mature.
ISO 26262 is an international standard for the functional safety of electrical and/or electronic systems in production automobiles. Its goal is to provide an automotive safety lifecycle (management, development, production, operation, service, and decommissioning) and to support tailoring the necessary activities during those lifecycle phases. ISO 26262 covers the functional safety aspects of the entire development process, including requirements specification, design, implementation, integration, verification, validation, and configuration. It also provides requirements for validation and confirmation measures to ensure that a sufficient and acceptable level of safety is achieved.
The standard covers not only hardware but firmware, software, and tools as well. It also provides an automotive-specific, risk-based method to determine risk classes: automotive safety integrity levels (ASILs), with "A" being the least stringent level and "D" the most stringent. National Instruments has an excellent white paper on this.
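For illustration, ISO 26262-3 assigns an ASIL from three hazard parameters: severity (S1-S3), probability of exposure (E1-E4), and controllability (C1-C3). The standard's risk-graph table happens to be reproducible by thresholding the sum of the three indices; the sketch below uses that compact encoding, but treat it as an illustration, not a substitute for the standard itself.

```python
def determine_asil(s, e, c):
    """Map severity s (1..3), exposure e (1..4), and controllability c (1..3)
    to 'QM' or 'ASIL A'..'ASIL D', matching the ISO 26262-3 risk graph.
    The table is equivalent to thresholding the sum s + e + c."""
    assert 1 <= s <= 3 and 1 <= e <= 4 and 1 <= c <= 3
    return {7: "ASIL A", 8: "ASIL B", 9: "ASIL C", 10: "ASIL D"}.get(
        s + e + c, "QM")  # sums of 6 or below need only quality management

# Worst case: life-threatening, high exposure, uncontrollable.
print(determine_asil(3, 4, 3))  # ASIL D
```

Note how the same severity can land anywhere from QM to ASIL D depending on how often the situation occurs and how controllable it is; that is the point of the risk-based classification.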
Inside and around the vehicle: Humans
We must first look at what humans are doing in and around the vehicle, both with the new driver-assistance features in intelligent vehicles and in fully autonomous vehicles (Reference 3). Designers must understand, model, and ultimately predict human behavior and movement in and around the vehicle, as well as in surrounding vehicles.
Inside the vehicle
Regarding humans inside the highly automated or self-driving vehicle, there may be instances in which the driver is distracted or fatigued; if there are passengers, the system must assess whether they are ready to take over driving in an emergency. Multiple cameras inside the vehicle, with different viewing angles, are needed.
In-vehicle suffocation of humans and animals is a subset of this topic area (Reference 4). Temperature inside the car may be a factor, and passive infrared and sound sensors may be deployed as well. A microwave sensor might even be deployed to sense breathing.
Around the vehicle
Smart systems, various cameras, LIDAR, radar, and maybe even AI would be needed here along with V2X capability (more on that later).
Sensing here may use different techniques from those in the previous section. Vision sensors will need color, thermal, and range capability as well.
Humans in surrounding vehicles
Vision-based algorithms, given the right cameras, sensors, and software, can be applied to understand human behavior and intent, predict maneuvers, and recognize skill, style, and attention even in surrounding vehicles. The Kanade–Lucas–Tomasi (KLT) tracker can be used, especially for face detection and tracking.
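The core of KLT is a Lucas–Kanade least-squares fit of image motion to spatial gradients. The toy below estimates a pure translation over a single window in pure Python; a production tracker adds feature selection, image pyramids, and iterative refinement (in practice, OpenCV's goodFeaturesToTrack plus calcOpticalFlowPyrLK is the usual route). This is a sketch of the principle, not a usable tracker.

```python
def lucas_kanade_shift(img0, img1):
    """Estimate the (dx, dy) translation between two same-size grayscale
    frames (lists of rows) by solving the Lucas-Kanade normal equations
    over one window covering the whole image interior."""
    h, w = len(img0), len(img0[0])
    sxx = sxy = syy = sxt = syt = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            ix = (img0[y][x + 1] - img0[y][x - 1]) / 2.0  # spatial gradients
            iy = (img0[y + 1][x] - img0[y - 1][x]) / 2.0
            it = img1[y][x] - img0[y][x]                  # temporal gradient
            sxx += ix * ix; sxy += ix * iy; syy += iy * iy
            sxt += ix * it; syt += iy * it
    # Solve [sxx sxy; sxy syy] [dx; dy] = -[sxt; syt] by Cramer's rule.
    det = sxx * syy - sxy * sxy  # near zero => aperture problem, no texture
    dx = (-sxt * syy + syt * sxy) / det
    dy = (-syt * sxx + sxt * sxy) / det
    return dx, dy
```

On a textured synthetic frame shifted by half a pixel, the estimate lands close to the true motion; a degenerate image (e.g., a pure linear ramp) makes the system singular, which is exactly why the Tomasi feature-selection step picks corner-like windows to track.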
[Continue reading on EDN US: Vehicle sensor technology]
Steve Taranovich is a senior technical editor at EDN with 45 years of experience in the electronics industry.