Tesla Is Urging Drowsy Drivers to Use 'Full Self-Driving'. That Could Go Very Wrong

Since Tesla launched its Full Self-Driving (FSD) feature in beta in 2020, the company's owner's manual has been clear: Contrary to the name, cars using the feature can't drive themselves.
Tesla's driver assistance system is built to handle plenty of road situations—stopping at stop lights, changing lanes, steering, braking, turning. Still, “Full Self-Driving (Supervised) requires you to pay attention to the road and be ready to take over at all times,” the manual states. “Failure to follow these instructions could cause damage, serious injury or death.”
Now, however, new in-car messaging urges drivers who are drifting between lanes or feeling drowsy to turn on FSD—a mixed message that experts say could encourage drivers to use the feature in an unsafe way. “Lane drift detected. Let FSD assist so you can stay focused,” reads the first message, which was included in a software update and spotted earlier this month by a hacker who tracks Tesla development.
“Drowsiness detected. Stay focused with FSD,” reads the other message. Online, drivers have since posted that they've seen a similar message on their in-car screens. Tesla did not respond to a request for comment about the messaging, and WIRED has not been able to independently confirm that it appears on Tesla in-car screens.
The problem, researchers say, is that moments of driver inattention are exactly when safety-minded driver assistance features should demand drivers get ultra-focused on the road—not suggest they depend on a developing system to compensate for their distraction or fatigue. At worst, such a prompt could lead to a crash.
“This messaging puts the drivers in a very difficult situation,” says Alexandra Mueller, a senior research scientist at the Insurance Institute for Highway Safety who studies driver assistance technologies. She believes that “Tesla is basically giving a series of conflicting instructions.”
A large body of research examines how humans interact with computer systems built to help them accomplish tasks. It generally finds the same thing: People make terrible passive supervisors of systems that work well most of the time but aren't perfect. Humans need something to keep them engaged.