Driverless vehicles cannot do without humans.

Regulators need to look much harder at the humans working in the shadows.
"There's no one in the truck, but we're not magicians," Sterling Anderson, co-founder of the self-driving truck company Aurora, said in a podcast interview last year. Anderson was referring to the company's plans to launch a commercial delivery service using driverless trucks between Dallas and Houston. What he meant was this: Our technology isn't a trick. Unlike a magic show, there won't be a human hiding behind the curtain.
In May of this year, Aurora announced that its commercial self-driving truck service had officially launched. But a few weeks later, the company made another announcement: "Our partner PACCAR, a truck manufacturer, has requested that a person sit in the driver's seat because their base vehicle contains certain prototype parts. After much consideration, we have honored their request, and the observer who previously sat in the back seat on some of our trips has moved to the front seat."
Aurora insisted this wasn't necessary for the truck to operate safely and that the person on board wouldn't be driving the vehicle. Still, it was clearly a setback to its ambition of having "no one in the truck." Beyond the risk of looking like a magic act, the investment in driverless trucks hardly seems worthwhile if you still have to pay someone to sit in them.
Aurora isn't the only autonomous vehicle company that hasn't been able to let go of humans. When Tesla launched its robotaxi service last month in Austin, Texas, the cars carried human "safety monitors" in the passenger seats. Even the most established self-driving taxi services, where there's no one inside the car, still have humans behind the scenes. In China, Baidu's robotaxis launched with remote human operators who could take control of the cars if necessary. Waymo, by contrast, doesn't have remote drivers, but it does have "human control agents": a Waymo that runs into trouble retains control of itself but can ask these humans for advice.
If humans are such bad drivers, as many self-driving car companies claim, why can't supposedly superior machines manage without them? Because machines and humans have distinct advantages. Machines don't get tired, bored, drunk, or distracted, but they struggle with real-world "edge cases" that require contextual knowledge and intuition, such as how best to navigate a traffic jam or what a construction worker waving his arms means. Furthermore, any safe system needs a backup in case of technical failure.
In that sense, autonomous vehicle companies should be applauded for keeping humans involved. But that makes it all the more important to know what functions these people actually perform.
Systems that rely on a combination of machines and humans can suffer all sorts of problems. People asked to oversee safety may fall victim to "automation complacency," the human tendency to lose focus while monitoring autonomous systems. Remote operators may encounter technical issues, such as connection problems. Then there are liability issues: if a person at a service center somewhere gives an autonomous vehicle bad advice that leads to a crash, who is to blame? The tech company? The autonomous vehicle company? The person? And what if that person isn't in the same state, or even the same country?
Bryant Walker Smith, an associate law professor at the University of South Carolina, told me that the onus should fall on self-driving companies: "They should explain exactly what they're doing and why they think it's safe. The authorities should be inquiring about all these details."
So far, however, many of these human functions have remained obscure. When I sent a list of basic questions to Waymo, Tesla, and Aurora, only Waymo responded. The company declined to say how many people worked as its human control agents, but did say they were employees of the information technology company Cognizant, were required to have a driver's license, and were "located in Arizona, Michigan, and one location overseas." When I asked about liability, the company responded that "if a Waymo vehicle were involved in a collision that caused property damage or injury, Waymo would assume the liability imposed by law."
The fact that self-driving cars still require human support behind the scenes shouldn't be considered a problem, but that support shouldn't be hidden either. It's time for the authorities to act.
© The Financial Times Limited 2025. All rights reserved. FT and Financial Times are registered trademarks of Financial Times Limited. Redistribution, copying, or modification is prohibited. EXPANSIÓN is solely responsible for this translation, and Financial Times Limited is not responsible for its accuracy.