By John Antczak and Tom Krisher
Associated Press
LOS ANGELES — Federal safety regulators are sending a team to California to investigate a fatal freeway crash involving a Tesla, just after authorities near Oakland arrested a man in another Tesla rolling down a freeway with no one behind the steering wheel.
Experts say both cases raise pressure on the National Highway Traffic Safety Administration to take action on Tesla’s partially automated driving system called Autopilot, which has been involved in multiple crashes that have resulted in at least three U.S. deaths.
The probe of the May 5 crash in Fontana, California, east of Los Angeles, is the 29th case involving a Tesla that the agency has responded to.
“The details of whether the Tesla was in autonomous mode are still under investigation,” Officer Stephen Rawls, a spokesperson for the California Highway Patrol, said in an email Wednesday.
The Tesla driver, a 35-year-old man whose name has not been released, was killed and another man was seriously injured when the electric car struck an overturned semi on a freeway. The injured man, a 30-year-old passing motorist, was struck by the Tesla as he was helping the semi’s driver out of the wreck.
“We have launched a Special Crash Investigation for this crash. NHTSA remains vigilant in overseeing the safety of all motor vehicles and equipment, including automated technologies,” the agency said in a statement Wednesday.
The investigation comes just after the California Highway Patrol arrested another man who authorities say was in the back seat of a Tesla that was traveling down Interstate 80 with no one behind the wheel.
Param Sharma, 25, is accused of reckless driving and disobeying a peace officer, the CHP said in a statement Tuesday.
The statement did not say if officials have determined whether the Tesla was operating on Autopilot, which can keep a car centered in its lane and a safe distance behind vehicles in front of it.
But it’s likely that either Autopilot or “Full Self-Driving” was in operation for the driver to be in the back seat. Tesla is allowing a limited number of owners to test its self-driving system.
Tesla, which has disbanded its public relations department, did not respond to messages seeking comment Wednesday.
The Fontana investigation, along with probes of two crashes in Michigan earlier this year, shows that NHTSA is taking a closer look at the Tesla systems. Experts say the agency needs to rein in such systems because drivers tend to trust them too much even though the vehicles cannot drive themselves.
“I think they very likely are getting serious about this, and we may actually start to see some action in the not-too-distant future,” said Sam Abuelsamid, a principal mobility analyst for Guidehouse Insights who follows automated driving systems.
“I definitely think that the increasing number of incidents is adding more fuel to the fire for NHTSA to do more,” said Missy Cummings, an electrical and computer engineering professor at Duke University who studies automated vehicles. “I do think they are going to be stronger about this.”
Tesla says on its website and in owner’s manuals that for both driver-assist systems, drivers must be ready to intervene at any time. But drivers have repeatedly zoned out with Autopilot in use, resulting in crashes in which neither the system nor the driver stopped for obstacles in the road.
The federal agency could declare Autopilot defective and require it to be recalled, or it could force Tesla to restrict Autopilot’s use to limited-access freeways. It could also make the company install a stronger system to ensure drivers are paying attention.
The auto industry, except for Tesla, already does a good job of limiting where such systems can operate, and is moving to self-regulate, Cummings said. Tesla seems to be heading that way. It’s now installing driver-facing cameras on recent models, she said.
Tesla’s current driver-monitoring system checks that drivers are paying attention by detecting force from their hands on the steering wheel.
The system will issue warnings and eventually shut the car down if it doesn’t detect hands. But critics have said Tesla’s system is easy to fool and can take as long as a minute to shut down. Consumer Reports said in April that it was able to trick a Tesla into driving in Autopilot mode with no one at the wheel.
In March, a Tesla official also told California regulators that “Full Self-Driving” was a driver-assist system that requires monitoring by humans. In notes released by the state’s Department of Motor Vehicles, the company couldn’t say whether its technology would improve to fully self-driving by the end of the year, contrary to statements made by company CEO Elon Musk.
In the back-seat driving case, authorities got multiple 911 calls Monday evening that a person was in the back of the Tesla Model 3 while the vehicle traveled on Interstate 80 across the San Francisco-Oakland Bay Bridge.
A motorcycle officer spotted the Tesla, confirmed the solo occupant was in the back seat, took action to stop the car and saw the occupant move to the driver’s seat before the car stopped, said the statement from the California Highway Patrol.
Authorities said they cited Sharma on April 27 for similar behavior.
In an interview with The Associated Press Wednesday, Sharma said he did nothing wrong, and he’ll keep riding in the back seat with no one behind the steering wheel.
Musk, he said, wants him to keep doing it. “It was actually designed to be ridden in the back seat,” Sharma said. “I feel safer in the back seat than I do in the driver’s seat, and I feel safer with my car on Autopilot. I trust my car’s Autopilot more than I trust everyone on the road.”
He believes his Model 3 can drive itself, and doesn’t understand why he had to spend a night in jail.
“The way where we stand right now, I can launch a self-driving Tesla from Emeryville all the way to downtown San Francisco from the back seat,” he said, adding that he has gone about 40,000 miles in Tesla vehicles without being in the driver’s seat.
Sharma’s comments suggest he is among a number of Tesla drivers who rely too much on the company’s driving systems, Duke’s Cummings said.
“It’s showing people the thought process behind people who have way too much trust in a very unproven technology,” she said.