A self-driving Tesla carrying a passenger for Uber rammed into an SUV at an intersection in suburban Las Vegas in April, an accident that sparked new concerns that a growing stable of self-styled “robotaxis” is exploiting a regulatory gray area in U.S. cities, putting lives at risk.
Tesla CEO Elon Musk aims to show off plans for a robotaxi, or self-driving car used for ride-hailing services, on Oct. 10, and he has long contemplated a Tesla-run taxi network of autonomous vehicles owned by individuals.
Do-it-yourself versions, however, are already proliferating, according to 11 ride-hail drivers who use Tesla’s Full Self-Driving (FSD) software. Many say the software, which costs $99 per month, has limitations, but they use it because it reduces the stress of driving and therefore allows them to work longer hours and earn more money.
Reuters is first to report the Las Vegas accident, a related inquiry by federal safety officials, and the broad use of Tesla autonomous software by ride-hail drivers.
While test versions of self-driving cabs with human backup drivers from robotaxi operators such as Alphabet’s Waymo and General Motors’ Cruise are heavily regulated, state and federal authorities say Tesla drivers alone are responsible for their vehicles, whether or not they use driver-assist software. Waymo and Cruise use test versions of software categorized as fully autonomous, while Tesla FSD is categorized at a level requiring driver oversight.
The other driver in the April 10 Las Vegas accident, who was taken to the hospital, was faulted for failing to yield the right of way, according to the police report. The Las Vegas Tesla driver, Justin Yoon, said on YouTube the Tesla software failed to slow his vehicle even after the SUV emerged from a blind spot created by another vehicle.
Yoon, who posts YouTube videos under the banner “Project Robotaxi,” was in the driver’s seat of his Tesla, hands off the wheel, when it entered the intersection in a suburban part of Las Vegas, according to footage from inside the car. The Tesla, with FSD engaged, was traveling at 46 mph (74 kph) and did not initially register a sport-utility vehicle crossing the road in front of Yoon. At the last moment, Yoon took control and turned the car, deflecting the hit, the footage shows.
“It’s not perfect, it’ll make mistakes, it will probably continue to make mistakes,” Yoon said in a post-crash video. Yoon and his passenger suffered minor injuries and the car was totaled, he said.
Yoon discussed using FSD with Reuters before he publicly posted videos of the accident but did not respond to requests for comment afterward.
Tesla did not respond to requests for comment. Reuters was unable to reach the Uber passenger and other driver for comment.
Ride-hailing companies Uber and Lyft responded to questions about FSD by saying drivers are responsible for safety.
Uber, which said it was in touch with the driver and passenger in the Las Vegas accident, cited its community guidelines: “Drivers are expected to maintain an environment that makes riders feel safe, even if driving practices don’t violate the law.”
Uber also cited instructions from Tesla that alert drivers using FSD to keep their hands on the wheel and be ready to take over at any moment.
Lyft said: “Drivers agree that they will not engage in reckless behavior.”
Grand ambitions
Musk has grand plans for self-driving software based on the FSD product. The technology will serve as the foundation of the robotaxi product software, and Musk envisions creating a Tesla-run autonomous ride service using vehicles owned by his customers when they are not otherwise in use.
But the drivers who spoke to Reuters also described critical shortcomings with the technology, including sudden unexplained acceleration and braking. Some have quit using it in complex situations such as airport pickups, navigating parking lots and construction zones.
“I do use it, but I’m not completely comfortable with it,” said Sergio Avedian, a ride-hail driver in Los Angeles and a senior contributor on “The Rideshare Guy” YouTube channel, an online community of ride-hailing drivers with nearly 200,000 subscribers. Avedian avoids using FSD while carrying passengers. Based on his conversations with fellow drivers on the channel, however, he estimates that 30% to 40% of Tesla ride-hail drivers across the U.S. use FSD regularly.

FSD is categorized by the federal government as a type of partial automation that requires the driver to be fully engaged and attentive while the system performs steering, acceleration and braking. It has come under increased regulatory and legal scrutiny after at least two fatal accidents involving the technology. But using it for ride-hail is not against the law.
“Ride-share services allow for the use of these partial automation systems in commercial settings, and that is something that should be facing significant scrutiny,” Guidehouse Insights analyst Jake Foose said.
The U.S. National Highway Traffic Safety Administration said it was aware of Yoon’s crash and had reached out to Tesla for additional information, but did not respond to specific questions on additional regulations or guidelines.
Authorities in California, Nevada and Arizona, which oversee operations of ride-hail companies and robotaxi companies, said they do not regulate the practice, as FSD and other such systems fall outside the purview of robotaxi or AV regulation. They did not comment on the crash.
Uber recently enabled its software to send passenger destination details to Tesla’s dashboard navigation system – a move that helps FSD users, wrote Omar Qazi, an X user with 515,000 followers who posts using the handle @WholeMarsBlog and often gets public replies from Musk on the platform.
“This will make it even easier to do Uber rides on FSD,” Qazi said in an X post.
Tesla, Uber and Lyft have no way to tell whether a driver is both working for a ride-hailing company and using FSD, industry experts said.
While almost all major automakers offer a version of partial automation technology, most systems are limited in their capabilities and restricted to highway use. Tesla, by contrast, says FSD helps the vehicle drive itself almost anywhere with active driver supervision but minimal intervention.
“I’m glad that Tesla is doing it and able to pull it off,” said David Kidd, a senior research scientist at the Insurance Institute for Highway Safety. “But from a safety standpoint, it raised a lot of hairs.”
Instead of new regulations, Kidd said NHTSA should consider providing basic, nonbinding guidelines to prevent misuse of such technologies.
Any federal oversight would require a formal investigation into how ride-hail drivers use all driver-assistance technology, not just FSD, said Missy Cummings, director of the George Mason University Autonomy and Robotics center and a former adviser to NHTSA.
“If Uber and Lyft were smart, they’d get ahead of it and they would ban that,” she said.
Meanwhile, ride-hail drivers want more from Tesla. Kaz Barnes, who has made more than 2,000 trips using FSD with passengers since 2022, told Reuters he was looking forward to the day when he could get out of the car and let Musk’s network send it to work.
“You would just kind of take off the training wheels,” he said. “I hope to be able to do that with this car one day.”