Uber’s Next Autonomous Vehicle Bet
Uber’s long-term autonomous vehicle strategy is taking shape around a role it believes few companies can match: data collection at global scale. The company says it eventually wants to equip human drivers’ cars with sensors so those vehicles can gather real-world driving data for autonomous vehicle developers and potentially for other AI systems trained on physical-world scenarios.
That vision was laid out by Uber chief technology officer Praveen Neppalli Naga, who described the concept as the direction the company wants to move in after first learning more through a smaller internal effort. In the near term, Uber is using a dedicated fleet of sensor-equipped cars through a program announced in late January called AV Labs. But the strategic ambition is much larger. Uber’s driver network numbers in the millions globally, creating the possibility of a distributed sensor platform that could far exceed the reach of any single self-driving company’s in-house fleet.
Why Uber Thinks the Opportunity Is in Data
The company’s thesis is direct: the limiting factor for autonomous vehicle development is not only the underlying software and hardware stack, but access to broad, scenario-rich data. Naga argued that the bottleneck is data collection itself. In his view, AV developers need targeted examples from specific streets, times of day, and driving conditions, yet often lack the capital required to deploy enough vehicles to collect that information efficiently.
If Uber can solve that problem, it could become infrastructure for the AV sector rather than merely a distribution partner. That is a meaningful shift. Uber previously stepped back from building its own self-driving system, and the rise of robotaxi programs has long raised questions about whether platforms without proprietary AV stacks would eventually lose leverage. This plan suggests Uber sees another path: owning access to the trip network, the operational demand signal, and potentially the data layer that helps AV companies improve their models.
From Ride Platform to “AV Cloud”
Uber says it is building what Naga called an “AV cloud,” a library of labeled sensor data that partner companies can query and use for model training. The company already works with 25 AV partners, including London-based Wayve. That existing partnership structure gives Uber a ready base of customers and collaborators for any broader data service it builds.
The plan goes beyond passive storage. Uber says partners could also use the system to run trained models in “shadow mode” against real Uber trips. In that setup, the AV system is not actually driving the vehicle, but it can be evaluated as if it were, allowing companies to compare model behavior against live trip conditions without deploying a self-driving car on the road.
That matters because it turns Uber’s operating network into a test environment as well as a distribution channel. For AV developers, the ability to gather data and then evaluate models against real trip patterns could reduce part of the gap between training and deployment.
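To make the shadow-mode idea concrete, here is a minimal sketch of how such an evaluation could work in principle: a candidate driving policy is replayed against logged trip data and scored on how often it agrees with what the human driver actually did, without ever controlling a vehicle. Uber has not published any such interface; every name, field, and threshold below is hypothetical.

```python
# Hypothetical sketch of "shadow mode" evaluation: score a candidate
# driving policy against logged trips without actuating anything.
from dataclasses import dataclass

@dataclass
class Frame:
    speed: float         # ego speed (m/s) at this moment of the trip
    lead_gap: float      # distance to the vehicle ahead (m)
    human_action: str    # what the human driver actually did

def candidate_model(frame: Frame) -> str:
    """Stand-in for a partner's trained policy: propose an action."""
    if frame.lead_gap < 10.0:   # hypothetical braking threshold
        return "brake"
    return "maintain"

def shadow_evaluate(trip: list[Frame]) -> float:
    """Replay a logged trip and return the fraction of frames where the
    model's proposed action matches the human driver's real action."""
    if not trip:
        return 0.0
    agree = sum(1 for f in trip if candidate_model(f) == f.human_action)
    return agree / len(trip)

trip = [
    Frame(speed=13.4, lead_gap=45.0, human_action="maintain"),
    Frame(speed=13.4, lead_gap=8.0, human_action="brake"),
    Frame(speed=13.4, lead_gap=30.0, human_action="maintain"),
]
print(shadow_evaluate(trip))  # prints 1.0: full agreement on this trip
```

The key property the article describes is visible in the sketch: the model only produces proposals that are compared after the fact, so it can be evaluated against live trip conditions with zero safety exposure.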
The Regulatory and Technical Friction
Uber is not claiming this can happen immediately. Naga said the company first needs a better understanding of sensor kits and how they work in practice. He also flagged regulatory uncertainty, saying the company would need clarity across states on what the sensors mean and what data sharing would entail.
Those constraints are substantial. A large-scale network of sensor-equipped privately operated vehicles would raise questions about installation, maintenance, privacy, consent, and state-by-state compliance. Uber’s own description acknowledges that the project is contingent on resolving those issues. For now, AV Labs remains a smaller and more controlled version of the concept, using cars Uber operates itself rather than converting the broader driver base into mobile data collectors.
Why This Could Reshape Uber’s Position
Uber’s pitch is notable because it reframes the company’s relationship to autonomy. Instead of competing head-on as a car builder or autonomy developer, Uber is positioning itself as an intermediary with unique physical reach. Millions of drivers create route diversity, temporal coverage, and geographic spread that are difficult for a single AV company to replicate. If only a fraction of those vehicles were equipped with sensors, the scale could be enormous.
That does not guarantee success. The plan depends on regulation, economics, and partner adoption. Still, it reveals how Uber intends to stay strategically relevant as self-driving technology matures. Data, labeling, trip distribution, and simulated evaluation are all assets that become more valuable when many AV players need them at once.
A Broader AI Infrastructure Play
Uber’s framing also leaves room for an even broader interpretation. The company says the data could potentially be useful not just for self-driving companies, but for other AI systems trained on physical-world scenarios. That suggests the company sees value in curated real-world sensor data beyond the robotaxi market alone.
For now, the immediate story is clear. Uber wants to evolve from a platform that connects riders and drivers into a platform that also supplies data infrastructure for autonomous mobility. The company is starting small through AV Labs, but the long-term ambition is unmistakable: transform pieces of its global driver network into a sensor grid and become a foundational layer for the AV ecosystem.
This article is based on reporting by TechCrunch and was originally published on techcrunch.com.