U.S. Sen. Edward Markey is calling on Tesla to rebrand its Autopilot feature, warning that the name encourages drivers to rely too heavily on the technology.
Markey, a Massachusetts Democrat, urged the electric car maker Friday to change how it markets its driver assistance system, which helps with steering and cruise control, but does not make vehicles fully autonomous.
The system triggers a series of safety alerts if drivers take their hands off the wheel for more than 30 seconds. But as NBC10 Boston Investigator Ryan Kath reported, drivers have discovered ways to trick the car's sensors, allowing them to cruise with driver assistance engaged.
Videos circulating online show a range of methods, from resting a hand at the six o'clock position of the steering wheel to tying a weight around it, wedging in a water bottle — even jamming an orange in place.
Markey, a member of the Senate Committee on Commerce, Science and Transportation, recently met with company representatives in the wake of the NBC10 Boston report, which exposed how drivers are skirting one of Tesla’s key safety features.
After reviewing the issue, Markey said he believes the Autopilot name is “inherently misleading” and “promotes confusion” about the capabilities of Tesla’s driver assistance system.
“It gives the misimpression to drivers that they are safe,” Markey told NBC10 Sunday, “even though they are not controlling the vehicle -- that they can be complacent behind the wheel when in fact, they have to pay super close attention in order to make sure they don’t have an accident.”
Markey called on Tesla to change how it markets the feature, and to build additional tools to monitor drivers and ensure they remain engaged behind the wheel.
Tesla did not respond to a request for comment Monday from NBC10 Boston on Markey’s latest critique. In a December letter addressed to the lawmaker, the company said it constantly upgrades the software in its vehicles and had recently added features that help cars recognize stop signs and traffic lights.
The carmaker also said its data shows drivers using Autopilot get into fewer accidents than customers who don’t have the technology engaged.
“Tesla takes the risk of improper use or abuse of Autopilot very seriously,” wrote Alexandra N. Veitch, Tesla’s senior director of government relations and policy.
Autopilot has drawn scrutiny after several high-profile crashes involving Teslas, including one that killed a Florida man when his car collided with a semi-trailer while in Autopilot mode.
In September 2019, video of a Tesla owner seemingly asleep at the wheel while cruising down the Massachusetts Turnpike quickly went viral, inviting further questions about the technology.
Another Tesla owner in Newburyport told NBC10 he believes his car traveled for miles while he was asleep at the wheel with Autopilot engaged. The man, who asked not to be identified for fear of legal repercussions, said he was coming back from western Massachusetts on Route 2 last year when he accidentally dozed off.
When he woke up, he had traveled well past his exit for Interstate 495.
“I was ashamed of myself,” the driver told NBC10.
He also demonstrated the trick he used to bypass Tesla’s warning system, which normally alerts a driver to keep their hands on the wheel and stay engaged.
After seeing the story, Markey called it "outrageous" that drivers could easily disengage a key safety feature.
"Driver assistance is turning into driver replacement, and we aren't ready for that on the roads," he said.
In a November 2019 letter to the company, Markey posed a series of questions about its safety protocols, asking whether it has exhaustively tested potential methods for evading Autopilot's safety features, whether it tracks and responds to online videos that show how to disengage safety alerts, and what actions it will take to upgrade Autopilot's "now-known flaws."
Markey also raised the issue during a transportation safety meeting in Washington, D.C.
In its Dec. 20 response, the company wrote that it uses driver data to regularly improve its safety features and "ensure appropriate driver engagement" while Autopilot is active. Those measures include calibrating the steering wheel's torque monitoring to require human interaction and to disengage the system when too little torque is applied.
"In practical terms in most situations, this means that a limp hand on the wheel from a sleepy driver will not work, nor will the coarse hand pressure of a person with impaired motor controls, such as a drunk driver," the letter reads.
The company also pointed to its efforts to remove from the market third-party products designed to bypass Autopilot's safety alerts, and noted that any vehicle can be used in a manner that is dangerous and violates the manufacturer's instructions.
“No driver monitoring technologies available on the market today, including driver-facing camera monitoring, are immune from misuse," the letter reads. "These driver misuses are not flaws attributable to the auto manufacturer, just as abuses of Autopilot technology is not a flaw attributable to Tesla."