TRANSPORTATION SAFETY
Not-So-Safe Automated Driving: Safety Risks During Drivers’ Takeover
In a recent study on automated driving by TU Dresden and DEKRA, traffic and engineering psychologists analyzed reactions to possible malfunctions of this future human-machine interface. The study of various takeover scenarios shows that people are only partially able to take over the wheel quickly and safely in the event of technical malfunctions.
Checking your emails, reading the news or watching a movie while driving: with automated driving, drivers are allowed to perform such secondary activities under certain circumstances. At the same time, however, they must remain alert so that they can quickly take back control of the vehicle in critical situations. In December 2021, the first automated vehicle system (Level 3) in Europe was officially approved by the German Federal Motor Transport Authority.
But what if, in a critical situation, the vehicle fails to prompt the driver to take over control? Scientists at TU Dresden and DEKRA have addressed this question in a recent study. Thirty-six subjects took part in the field study at the DEKRA Lausitzring. Since every system is subject to errors, it must be assumed that not every takeover situation in automated driving will be correctly recognized and signaled.
With this in mind, four different takeover situations were examined during the test drives: In one condition, a takeover warning was given even though there was no critical situation (a so-called false alarm). In three other conditions, the takeover warning was not given despite a critical situation. These critical situations involved driving over a stop line at a stop sign, slowly drifting into the opposite lane, and performing a sudden evasive maneuver to avoid an erroneously detected obstacle. All four takeover scenarios were triggered after the test subject had already driven around the circuit several times without encountering any unusual events.
Subjects in the control group were instructed to passively monitor the automated driving and intervene when they thought it necessary. Participants in the experimental group had to complete a visually demanding secondary task on a tablet installed in the vehicle. A takeover was categorized as successful if the driver performed the correct takeover action before reaching the potential collision point.
Overall, responses to the false alarm proved to be of little concern: both the control and experimental groups successfully took over vehicle control. In contrast, difficulties were evident when the automated system failed to warn in a critical situation. Here, the proportion of successful takeovers in the experimental group dropped to roughly half of that in the control group. Engagement with the secondary activity thus reduced the likelihood of a successful takeover when the vehicle gave no warning. Of particular importance is the finding that even individuals who were not engaged in a secondary task had, in some cases, considerable difficulty taking over control of the vehicle.
Depending on the critical situation, between 58 and 89 percent of takeovers in the experimental group were unsuccessful in the absence of a takeover warning. In the control group, the values were between 24 and 61 percent.
For Sebastian Pannasch, professor of engineering psychology at TU Dresden, the results of the study are worrying: “We will be exposed to considerable risks in future automated driving. Automated vehicles will not be able to recognize and report all critical situations. Our results illustrate that even if we monitor vehicles while driving, correct takeover is not guaranteed in a critical situation. Based on current in-vehicle behavior, we can assume that we will definitely engage in secondary tasks during automated driving. As the study results show, this significantly increases the risk that we will not be able to react appropriately in critical situations without warning.”
In the view of DEKRA experts and TU Dresden scientists, there has been a real gap in research to date, particularly with regard to missing takeover prompts: less than ten percent of the papers published so far deal with so-called “disengagement situations”, i.e. system failures caused by an error.
Prof. Pannasch views the current development, which is primarily driven by technology, with great concern: “Not everything that is technically feasible should necessarily be implemented. Against the backdrop of the current findings, the promise of increased safety that is often made in connection with automated driving remains extremely questionable.” The next study on automated driving is already being planned and will examine the factor of trust in technology.
Video: Good Question About the Traffic of the Future
Five experts from TU Dresden provide further information on the current technical challenges of autonomous driving in the video “When will autonomous driving arrive? A good question about the traffic of the future”: https://youtu.be/Qx-2xg-YzxM.