MIT study finds Tesla drivers become inattentive when Autopilot is activated – TechCrunch


By the end of this week, potentially thousands of Tesla owners will be testing the automaker’s latest version of its “Full Self-Driving” beta software, version 10.0.1, on public roads, even as regulators and federal officials investigate the safety of the system following a series of high-profile crashes.

A new Massachusetts Institute of Technology study lends credence to the idea that the FSD system, which despite its name is not an autonomous system but an advanced driver assistance system (ADAS), may not actually be that safe. Researchers studying gaze data from 290 human-initiated Autopilot disengagement epochs found that drivers can become inattentive when using partially automated driving systems.

“Visual behavior patterns change before and after [Autopilot] disconnection,” the study reads. “Before the disengagement, drivers looked less at the road and concentrated more on non-driving-related areas compared to after the transition to manual driving. The higher proportion of off-road glances before disconnection to manual driving was not compensated for by longer glances ahead.”

Tesla CEO Elon Musk has said that not everyone who has paid for the FSD software will get access to the beta version, which promises more automated driving features. First, Tesla will use telemetry data to capture personal driving metrics over a seven-day period to ensure that drivers remain attentive enough. The data may also be used to implement a new safety rating page that tracks the owner’s vehicle, which is tied to their insurance.

The MIT study provides evidence that drivers may not be using Tesla’s Autopilot (AP) as recommended. Because AP includes safety features like Traffic-Aware Cruise Control and Autosteer, drivers become less attentive and take their hands off the wheel more often. The researchers found that this kind of behavior may be the result of a misunderstanding of what AP features can do and what their limitations are, a misunderstanding that is reinforced when the system performs well. Drivers whose tasks are automated for them can naturally become bored after attempting to sustain visual and physical alertness, which researchers say only breeds further inattention.

The report, titled “A Model for Naturalistic Glance Behavior Around Tesla Autopilot Disengagements,” followed Tesla Model S and Model X owners through their daily routines for periods of a year or more throughout the greater Boston metropolitan area. The vehicles were equipped with the Real-time Intelligent Driving Environment Recording data acquisition system, which continuously collects data from the CAN bus, a GPS and three 720p video cameras. These sensors capture vehicle kinematics, driver interaction with the vehicle’s controls, mileage, location, and video of the driver’s posture and face as well as the view in front of the vehicle. MIT collected nearly 500,000 miles’ worth of data.
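For a rough sense of how disengagement “epochs” might be carved out of such a continuous sensor stream, here is a minimal Python sketch. It is purely illustrative: the field names, the labeled glance regions, and the 10-second window are assumptions for the example, not details taken from the paper or from Tesla’s CAN bus.

```python
# Hypothetical sketch: extracting a window of multi-sensor data around each
# human-initiated Autopilot disengagement ("epoch"). All field names and the
# window length are assumptions, not details from the MIT study.
from dataclasses import dataclass
from typing import List


@dataclass
class Sample:
    t: float          # seconds since trip start
    speed_mps: float  # vehicle speed read from the CAN bus
    lat: float        # GPS latitude
    lon: float        # GPS longitude
    gaze: str         # glance region labeled from the driver-facing camera,
                      # e.g. "road", "instrument_cluster", "phone"


@dataclass
class Epoch:
    disengage_t: float
    before: List[Sample]
    after: List[Sample]


def build_epochs(samples: List[Sample],
                 disengage_times: List[float],
                 window_s: float = 10.0) -> List[Epoch]:
    """Split a trip's sensor stream into before/after windows around each
    Autopilot disengagement timestamp."""
    epochs = []
    for t0 in disengage_times:
        before = [s for s in samples if t0 - window_s <= s.t < t0]
        after = [s for s in samples if t0 <= s.t < t0 + window_s]
        epochs.append(Epoch(disengage_t=t0, before=before, after=after))
    return epochs
```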

The goal of this study is not to embarrass Tesla, but to advocate for driver attention management systems that can give feedback to drivers in real time or tailor automation functionality to suit the driver’s level of attention. Autopilot currently uses a hands-on-wheel detection system to monitor driver engagement, but does not monitor driver attention through eye or head tracking.
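As a hedged illustration of what such real-time feedback could look like, the toy monitor below keeps a rolling window of glance labels from a hypothetical driver-facing camera and escalates when too much of that window is off-road. The window size, the threshold, and the region names are invented for the example, not drawn from the study or from any production system.

```python
# Toy real-time attention monitor (illustrative only). The 30-sample window
# and 0.4 off-road threshold are made up for this sketch.
from collections import deque
from typing import Deque


class AttentionMonitor:
    def __init__(self, window: int = 30, threshold: float = 0.4) -> None:
        self.glances: Deque[str] = deque(maxlen=window)
        self.threshold = threshold

    def update(self, glance_region: str) -> str:
        """Record the latest glance label and return "alert" when the share
        of off-road glances in the rolling window exceeds the threshold."""
        self.glances.append(glance_region)
        off_road = sum(1 for g in self.glances if g != "road") / len(self.glances)
        return "alert" if off_road > self.threshold else "ok"


# Example: feed simulated glance labels into the monitor.
monitor = AttentionMonitor()
status = "ok"
for region in ["road"] * 10 + ["phone"] * 12:
    status = monitor.update(region)
print(status)  # "alert" once off-road glances dominate the window
```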

The researchers behind the study have developed a model of gaze behavior “based on naturalistic data, that can help understand the characteristics of changes in driver attention under automation and support the development of solutions to ensure that drivers remain sufficiently engaged in driving tasks.” This would not only help driver monitoring systems address “atypical” glances, but could also be used as a benchmark to study the safety effects of automation on a driver’s behavior.
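The core quantity behind such a benchmark, per the finding quoted above, is the proportion of glance time spent off the road before versus after the transition to manual driving. Below is a self-contained sketch of that comparison; the glance labels are invented stand-ins for camera-derived annotations, not data from the study.

```python
# Illustrative sketch: compare the share of off-road glances before and after
# an Autopilot disengagement. Labels and values are made up for the example.
from typing import List


def off_road_proportion(glances: List[str]) -> float:
    """Fraction of glance samples not directed at the forward roadway."""
    if not glances:
        return 0.0
    return sum(1 for g in glances if g != "road") / len(glances)


before = ["road", "phone", "instrument_cluster", "phone", "road"]
after = ["road", "road", "road", "left_mirror", "road"]

shift = off_road_proportion(before) - off_road_proportion(after)
print(f"off-road proportion before: {off_road_proportion(before):.2f}")
print(f"off-road proportion after:  {off_road_proportion(after):.2f}")
print(f"shift (positive = less attentive before disengagement): {shift:.2f}")
```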

Companies like Seeing Machines and Smart Eye are already working with automakers like General Motors, Mercedes-Benz and, reportedly, Ford to bring camera-based driver monitoring systems to cars with ADAS, and also to address problems caused by drunk or impaired driving. The technology exists. The question is, will Tesla use it?

