Technologies Combine to Combat Distracted Driving

October 20, 2020

We have all heard how texting and other distractions while driving can lead to accidents. Distracted driving claimed the lives of 2,841 people in the United States in 2018 alone, according to the National Highway Traffic Safety Administration. It’s important for drivers to change their behavior – and put down their phones – but as we observe National Distracted Driving Awareness Month, let’s take a look at how emerging technologies can help as well.

There are two primary ways a vehicle can reduce the risks associated with distracted driving. The first is for the vehicle to maintain its own awareness of the environment around it and take emergency action if necessary. The second is to monitor the driver's state to ensure the driver remains attentive.

Vehicles gather data about their environment through a variety of sensors, including radar and cameras, with ultrasonic sensors playing a role at short distances and low speeds and lidar used in autonomous driving. Importantly, the data from these inputs have to be merged intelligently to build the clearest possible picture of what is around the vehicle, a process called sensor fusion.

Sensor fusion delivers the most accurate environmental model because it draws from the relative strengths of the different sensors. Radars are excellent for detecting how far away an object is and how fast it is moving, and they can operate in all weather conditions. Cameras are well suited for object classification – they can read street signs, and they can tell whether an object is another vehicle, a pedestrian, a bicycle or even a dog.
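
To make the fusion step concrete, here is a minimal sketch in Python. The detection attributes, class names and bearing-matching threshold are all illustrative rather than drawn from any production perception stack; real systems associate detections probabilistically across many frames.

```python
from dataclasses import dataclass

@dataclass
class RadarDetection:
    range_m: float        # distance to object (radar's strength)
    velocity_mps: float   # closing speed (radar's strength)
    azimuth_deg: float    # bearing to object

@dataclass
class CameraDetection:
    label: str            # object class (camera's strength), e.g. "pedestrian"
    azimuth_deg: float    # bearing estimated from the image
    confidence: float     # classifier confidence, 0..1

@dataclass
class FusedObject:
    label: str
    range_m: float
    velocity_mps: float
    azimuth_deg: float

def fuse(radar: list[RadarDetection],
         camera: list[CameraDetection],
         max_bearing_gap_deg: float = 2.0) -> list[FusedObject]:
    """Associate radar and camera detections by bearing, then combine
    each sensor's most reliable attributes into a single object."""
    fused = []
    for r in radar:
        # Find the camera detection whose bearing best matches this radar hit.
        best = min(camera, key=lambda c: abs(c.azimuth_deg - r.azimuth_deg),
                   default=None)
        if best and abs(best.azimuth_deg - r.azimuth_deg) <= max_bearing_gap_deg:
            fused.append(FusedObject(best.label, r.range_m,
                                     r.velocity_mps, r.azimuth_deg))
        else:
            fused.append(FusedObject("unknown", r.range_m,
                                     r.velocity_mps, r.azimuth_deg))
    return fused
```

Nearest-neighbor association by bearing is the simplest possible scheme, but it captures the point: range and speed come from the radar, while the object's class comes from the camera.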

We can also apply machine learning to make that picture more complete. For example, machine learning can help improve radar's edge detection. Developers can present many examples of objects in a particular category to a machine-learning system, and it can learn how radar signals are scattered by complex objects with many reflection points. It can take advantage of contextual information. And it can even learn from simultaneous data provided by cameras or high-definition (HD) maps to classify objects based on radar signals alone.
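
As an illustration of the training idea, the toy example below fits an off-the-shelf classifier to hand-picked radar features. The feature set, values and labels are all hypothetical; in practice, training uses large fleets of labeled examples, often labeled automatically by the accompanying cameras or HD maps.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical per-object radar features:
# [num_reflection_points, spatial_spread_m,
#  mean_radar_cross_section_dbsm, doppler_spread_mps]
X_train = np.array([
    [25, 4.5, 10.0, 3.0],   # passenger car: many points, large spread
    [3,  0.4, -8.0, 1.5],   # pedestrian: few, weak reflections
    [6,  1.8, -2.0, 2.5],   # bicycle
    # ... in practice, thousands of labeled examples per category
])
y_train = ["car", "pedestrian", "bicycle"]

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Classify a new cluster of radar reflections.
new_object = np.array([[4, 0.5, -7.0, 1.8]])
print(model.predict(new_object))  # likely "pedestrian"
```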

A robust environmental model allows the vehicle to make intelligent decisions to prevent accidents. Machine learning improves the system’s ability to identify vulnerable road users, such as bicyclists and motorcyclists, reducing misses by 70% compared to classical radar signal processing. It allows the system to identify pedestrians in a cluttered urban environment and can even spot them behind a parked car or other obstruction that may hide them from view. It can help provide accurate object detection and tracking, reducing position error and object-heading error by more than 50%, which means the vehicle is better able to tell when another vehicle is stopped in its lane.
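
Tracking improvements of this kind ultimately come from filtering: repeatedly predicting where an object should be and correcting that prediction with each new measurement. The sketch below is a bare-bones one-dimensional Kalman filter with assumed noise values, showing how noisy range readings settle into a stable position and near-zero velocity, the signature of a stopped vehicle. Production trackers work in more dimensions and fuse several sensors, but the principle is the same.

```python
import numpy as np

# Minimal constant-velocity Kalman filter for one tracked object (1-D).
# State: [position, velocity]; the radar measures position each cycle.
dt = 0.05                                 # 20 Hz sensor cycle
F = np.array([[1.0, dt], [0.0, 1.0]])     # state transition
H = np.array([[1.0, 0.0]])                # we observe position only
Q = np.diag([0.01, 0.1])                  # process noise (assumed)
R = np.array([[0.5]])                     # measurement noise (assumed)

measurements = [10.4, 9.7, 10.2, 9.9, 10.1]   # noisy range readings (m)
x = np.array([[measurements[0]], [0.0]])      # initialize from first hit
P = np.eye(2)                                 # initial uncertainty

def update(z: float):
    """One predict/correct cycle; smooths noisy position measurements."""
    global x, P
    # Predict
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Correct
    y = np.array([[z]]) - H @ x_pred      # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x_pred + K @ y
    P = (np.eye(2) - K @ H) @ P_pred
    return x[0, 0], x[1, 0]               # filtered position, velocity

for z in measurements[1:]:
    pos, vel = update(z)
# Position settles near 10 m with velocity near 0: a stopped object.
print(f"position={pos:.2f} m, velocity={vel:.2f} m/s")
```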

With this information, the planning and policy systems involved in vehicle automation can know when to engage the brakes or steering to avoid a collision if the driver does not react. At the same time, the same kinds of sensors that provide better situational awareness outside the vehicle can also help the system understand what's going on inside it.
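
The decision logic itself can be summarized with a time-to-collision (TTC) calculation. The thresholds in the sketch below are purely illustrative; real systems tune them per vehicle and per regulatory test.

```python
def plan_emergency_action(distance_m: float,
                          closing_speed_mps: float,
                          driver_braking: bool,
                          warn_ttc_s: float = 2.5,
                          brake_ttc_s: float = 1.2) -> str:
    """Escalate from warning to automatic braking as time-to-collision
    shrinks and the driver has not reacted. Thresholds are illustrative,
    not taken from any production system."""
    if closing_speed_mps <= 0:
        return "no_action"                # not closing on the object
    ttc = distance_m / closing_speed_mps
    if ttc < brake_ttc_s and not driver_braking:
        return "autonomous_emergency_brake"
    if ttc < warn_ttc_s:
        return "forward_collision_warning"
    return "no_action"

# Stopped vehicle 15 m ahead, closing at ~50 km/h, driver not braking.
print(plan_emergency_action(distance_m=15.0, closing_speed_mps=13.9,
                            driver_braking=False))
# -> "autonomous_emergency_brake" (TTC ~1.08 s)
```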

The most common example of this is driver state sensing, a critical element in enabling advanced levels of automation and in meeting future European New Car Assessment Programme (Euro NCAP) protocols and EU General Safety Regulation requirements designed to prevent distracted driving. A camera pointed inward at the cabin allows vehicle software to ensure that the driver is alert and engaged. Are the driver's eyes on the road, or are they looking down at a phone, the radio or something else? Is the driver's face showing signs of drowsiness, inattentiveness or inebriation?
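
Conceptually, driver state sensing reduces those questions to a classification over gaze and eye metrics. The sketch below invents simple metrics and thresholds to show the shape of such a check; a real system derives these from a trained vision pipeline, not hand-set numbers.

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    gaze_pitch_deg: float    # negative = looking down
    gaze_yaw_deg: float      # degrees away from straight ahead
    eye_closure: float       # 0 = fully open, 1 = fully closed
    seconds_off_road: float  # continuous time gaze has been off the road

def assess_driver(state: DriverState) -> str:
    """Toy attention check on top of hypothetical gaze/eye metrics
    that an interior camera pipeline might produce."""
    if state.eye_closure > 0.7:
        return "drowsy"
    if state.gaze_pitch_deg < -20 and state.seconds_off_road > 2.0:
        return "distracted"   # e.g., looking down at a phone
    if abs(state.gaze_yaw_deg) > 45 and state.seconds_off_road > 2.0:
        return "distracted"
    return "attentive"

# Eyes open but looking down for more than three seconds.
print(assess_driver(DriverState(-30.0, 5.0, 0.1, 3.2)))  # "distracted"
```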

Going further, fusing interior and exterior sensors makes much more proactive safety features possible. Consider a scenario where you are stopped at an intersection waiting to make a right-hand turn. You're looking to your left, watching for a gap in the traffic so you can quickly make the turn. Meanwhile, a pedestrian starts to cross in front of you from the right.

The interior sensor sees where you are looking, but one of the exterior radars sees the pedestrian. The system knows you haven’t looked over to see the pedestrian, so it warns you that the person is there — before it becomes an emergency.
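
A simplified version of that interior-exterior handshake might look like the following, with all angles and thresholds invented for illustration:

```python
def should_warn(driver_gaze_yaw_deg: float,
                pedestrian_bearing_deg: float,
                pedestrian_ttc_s: float,
                field_of_view_deg: float = 30.0,
                warn_ttc_s: float = 4.0) -> bool:
    """Warn early when a detected pedestrian falls outside the region
    the driver is currently looking at. Angles are measured from
    straight ahead; all values here are illustrative."""
    gaze_gap = abs(pedestrian_bearing_deg - driver_gaze_yaw_deg)
    driver_sees_pedestrian = gaze_gap <= field_of_view_deg / 2
    return pedestrian_ttc_s < warn_ttc_s and not driver_sees_pedestrian

# Driver looking hard left (-70 deg) while a pedestrian crosses from
# the right (+40 deg): warn well before it becomes an emergency.
print(should_warn(driver_gaze_yaw_deg=-70.0,
                  pedestrian_bearing_deg=40.0,
                  pedestrian_ttc_s=3.0))   # True
```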

Addressing this kind of distracted driving is possible today. As the industry continues to innovate, expect more features that enable vehicles to anticipate and correct for inevitable distractions.