NYT
Benjamin Maldonado and his teenage son were driving back from a soccer tournament on a California freeway in August 2019 when a truck in front of them slowed. Maldonado flicked his turn signal and moved right. Within seconds, his Ford Explorer pickup was hit by a Tesla Model 3 that was traveling about 60 mph on Autopilot.
A 6-second video captured by the Tesla and data it recorded show that neither Autopilot — Tesla’s much-vaunted system that can steer, brake and accelerate a car on its own — nor the driver slowed the vehicle until a fraction of a second before the crash. Jovani Maldonado, 15, who had been in the front passenger seat and was not wearing his seat belt, was thrown from the Ford and died, according to a police report.
The accident, which took place 4 miles from Tesla’s main car factory, is now the subject of a lawsuit against the company. It is one of a growing number of crashes involving Autopilot that have fueled concerns about the technology’s shortcomings, and could call into question the development of similar systems used by rival carmakers. And as cars take on more tasks previously done by humans, the development of these systems could have major ramifications — not just for the drivers of those cars but for other motorists, pedestrians and cyclists.
Tesla, founded in 2003, and its chief executive, Elon Musk, have been bold in challenging the auto industry, attracting devoted fans and customers and creating a new standard for electric vehicles that other established carmakers are reckoning with. The company is worth more than several large automakers combined.
But the accidents involving Autopilot could threaten Tesla’s standing and force regulators to take action against the company. The National Highway Traffic Safety Administration has about two dozen active investigations into crashes involving Autopilot.
At least three Tesla drivers have died since 2016 in crashes in which Autopilot was engaged and failed to detect obstacles in the road. In two instances, the system did not brake for tractor-trailers crossing highways. In the third, it failed to recognize a concrete barrier. In June, the federal traffic safety agency released a list showing that at least 10 people have been killed in eight accidents involving Autopilot since 2016. That list does not include the crash that killed Jovani Maldonado.
Tesla’s credibility has taken a hit, and some experts on autonomous driving say that it is hard not to question other claims made by Musk and the company. He has, for example, said several times that Tesla was close to perfecting Full Self Driving, a technology that would allow cars to drive autonomously in most circumstances — something other auto and technology companies have said is years away.
Musk and Tesla did not respond to several requests for comment.
Autopilot is not an autonomous driving system. Rather, it is a suite of software, cameras and sensors intended to assist drivers and prevent accidents by taking over many aspects of driving a car — even the changing of lanes. Tesla executives have claimed that handing off these functions to computers will make driving safer because human drivers are prone to mistakes and distractions, and cause most of the roughly 40,000 traffic fatalities that occur each year in the United States.
“Computers don’t check their Instagram” while driving, Tesla’s director of artificial intelligence, Andrej Karpathy, said last month in an online workshop on autonomous driving.
While Autopilot is in control, drivers can relax, but they are not supposed to tune out. Instead, they’re supposed to keep their hands on the steering wheel and eyes on the road, ready to take over in case the system becomes confused or fails to recognize objects or dangerous traffic scenarios.
But with little to do other than look straight ahead, some drivers seem unable to resist the temptation to let their attention wander while Autopilot is on. Videos have been posted on Twitter and elsewhere showing drivers reading or sleeping while at the wheel of Teslas. The company has often faulted drivers of its cars, blaming them in some cases for failing to keep their hands on the steering wheel and eyes on the road while using Autopilot.