
Tesla's Autopilot can 'easily' be used to drive without anyone behind wheel, Consumer Reports warns

Tesla's Autopilot system can "easily" be used to drive the automaker's vehicles without anyone behind the wheel, Consumer Reports said in a new demonstration.

The magazine conducted the demonstration on a test track after a widely publicized Tesla Model S crash in Texas on Saturday, in which two people were killed and the wreck sparked an hours-long blaze. Local authorities said it appeared no one was in the driver's seat.

The National Transportation Safety Board and National Highway Traffic Safety Administration have opened investigations into the incident.

Tesla's Autopilot system enables automatic steering, accelerating and braking on roads with lanes, but it does not work in all situations. Tesla has said that drivers are supposed to keep their hands on the wheel at all times, ready to take over when the system is not able to perform.

"In our test, the system not only failed to make sure the driver was paying attention – it couldn't even tell if there was a driver there at all," Jake Fisher, senior director of auto testing at Consumer Reports, said in a statement.



It's not clear whether Autopilot was engaged in the latest Tesla crash.

Tesla representatives have not responded to multiple requests seeking comment.

On Monday, after reports about the crash circulated, Tesla CEO Elon Musk said on Twitter that "data logs recovered so far show Autopilot was not enabled." That could not be independently verified. Tesla vehicles are not capable of fully driving themselves.

"Moreover, standard Autopilot would require lane lines to turn on, which this street did not have," he said.

But a study released in October by Duke University autonomous vehicle experts Benjamin Bauchwitz and M.L. Cummings found that in almost one-third of automated driving tests, Tesla "vehicles drove autonomously for nearly 30 seconds on extreme curves that lacked even a single lane marking."

"The potential for safety-critical events under these circumstances is enormous," the Duke researchers reported.

Tesla tested on track

In the Consumer Reports study – which occurred on a half-mile track with lanes – vehicle interface testing program manager Kelly Funkhouser sat in the rear seat and Fisher sat in the driver seat on top of a buckled seat belt because Autopilot will disengage if the belt is unbuckled while the car is moving.

After beginning the drive, Fisher engaged Autopilot and set the speed dial to zero, bringing the vehicle to a stop. He "next placed a small, weighted chain on the steering wheel, to simulate the weight of a driver’s hand, and slid over into the front passenger seat without opening any of the vehicle’s doors, because that would disengage Autopilot," according to Consumer Reports.

"Using the same steering wheel dial, which controls multiple functions in addition to Autopilot’s speed, Fisher reached over and was able to accelerate the vehicle from a full stop. He stopped the vehicle by dialing the speed back down to zero."

Fisher said: "The car drove up and down the half-mile lane of our track, repeatedly, never noting that no one was in the driver’s seat, never noting that there was no one touching the steering wheel, never noting there was no weight on the seat. It was a bit frightening when we realized how easy it was to defeat the safeguards, which we proved were clearly insufficient.”

He urged Tesla owners not to try this.

“Let me be clear: Anyone who uses Autopilot on the road without someone in the driver seat is putting themselves and others in imminent danger,” he said.

Consumer Reports suggests fixes

Fisher urged Tesla to at least use weight-monitoring seat sensors to deactivate Autopilot if no one is sitting in the driver's seat. The automaker could also adopt eye-tracking technology to ensure drivers are keeping their eyes on the road, he said. General Motors' partially autonomous Super Cruise system, offered on Cadillac models, uses such a system.

In the real world, some drivers have found ways around Autopilot's restrictions, including the use of "Autopilot Buddy," a now-illegal aftermarket device that tricked the vehicle into thinking the driver's hands were on the wheel. NHTSA issued a cease-and-desist order to that device's manufacturer in June 2018.

Although there is no evidence yet that Autopilot was activated during the crash, the incident has sparked a fresh round of criticism from safety advocates who say that Tesla inspires too much confidence in its vehicles' autonomous capability.

Autopilot name questioned

"Autopilot is an intentionally deceptive name being used for a set of features that are essentially an advanced cruise control system," said Jason Levine, director of the Washington, D.C.-based nonprofit Center for Auto Safety, in an email interview. "There really is no longer a question that Tesla’s marketing is leading consumers to foreseeably misuse the technology in a dangerous way."

Musk has repeatedly defended Autopilot, saying that it is much safer than an unassisted human driver.

On Saturday, before news of the crash broke, he cited a report claiming that cars with Autopilot engaged "are now approaching 10 times lower chance of accident than average vehicle."

Morgan Stanley auto analyst Adam Jonas estimated Wednesday that self-driving cars will need to be anywhere from 1,000 to 10,000 times safer than human drivers to gain acceptance from regulators and consumers.

You can follow USA TODAY reporter Nathan Bomey on Twitter @NathanBomey and subscribe to our free Daily Money newsletter here for personal finance tips and business news every Monday through Friday morning.

This article originally appeared on USA TODAY: Tesla Autopilot can 'easily' run without driver, Consumer Reports says