AI that drives change: Wayve rewrites self-driving playbook with deep learning in Azure
LONDON – On a more than typically busy Thursday morning in Soho, the gray December sky spat rain. Traffic was stop-and-go, but mainly stop. Even the sidewalks were congested.
Finally, alongside the imposing British Museum, the flow of cars and trucks regained some momentum. Inside a four-door EV sedan that was driving itself, a safety operator sat passively but alertly behind the wheel, hands resting palm up on his thighs. The car glided forward without any assistance from him, en route to Trafalgar Square.
A few moments later, a harried man stepped into our path from behind a parked car. The AI-guided sedan braked firmly to a halt, giving the four passengers inside a gentle shake; the heedless pedestrian crossed the street without looking back. The safety operator had not touched the pedals; this car was acting independently.
Self-driving cars, powered by AI, roll through the streets of a number of big cities these days, but the company behind our ride, Wayve, took a different route when it was founded in Cambridge, U.K., in 2017.
In essence, Wayve has built an AI-powered driver that could potentially be installed in any new car, no matter the make or model, and drive it in any country or city with just a couple of weeks of fine-tuning. This approach relies on a form of AI model known as a neural network, inspired by the human brain. Wayve’s AI Driver mainly uses cameras to safely navigate from point to point.
“We’re really approaching autonomous driving as an AI problem and building a data-driven stack with end-to-end deep learning.”
Alex Kendall, co-founder and CEO of Wayve, at its workshop in the King’s Cross neighborhood of London. Photo by Chris Welsch for Microsoft.
To achieve its goals, Wayve is harnessing the power of Microsoft Azure. In particular, it is using Azure Storage, Azure Databricks and Azure AI infrastructure with Azure Kubernetes Service to connect thousands of graphics processing units into a flexible supercomputer that can train and validate the AI model for autonomous driving.
A car equipped with Wayve technology has a powerful central computer in the trunk that is pre-loaded with the Wayve AI programming. Through the car’s cameras, the AI model can read road signs and traffic lights, and even in a busy city like London, perceive its environment and act accordingly. There are now Wayve-equipped vehicles operating in cities in the U.K., the United States, Germany and Japan.
“When we started, we were building a very contrarian approach,” said Wayve co-founder and Chief Executive Alex Kendall. “It’s still with us today. We’re really approaching autonomous driving as an AI problem and building a data-driven stack with end-to-end deep learning.”
A flexible and scalable strategy
This strategy contrasts with that of competitors in the field, who started with a hand-engineered, rules-based approach that tackled driving as separate sets of problems and integrated a complex array of sensors and computers into the vehicle.
Wayve wanted a more generalized and flexible approach that could be scaled up quickly and deployed by different carmakers. It used deep learning to construct a neural network – a computer algorithm inspired by our understanding of how the human brain works. The network is made up of layers of interconnected nodes and learns patterns from data such as video, other forms of sensor data and even simulated environments (somewhat like a video game).
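For readers curious what “layers of interconnected nodes that learn patterns from data” means in practice, here is a minimal, purely illustrative sketch of a two-layer neural network trained with gradient descent. It learns the classic XOR pattern, a task that a single layer cannot solve; the architecture, data and learning rate here are toy assumptions and bear no relation to Wayve’s actual models, which are vastly larger and train on video and sensor data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training data: the XOR pattern, a classic example that needs a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two layers of weights connect input nodes -> hidden nodes -> output node.
W1 = rng.normal(0, 1, (2, 8))
b1 = np.zeros((1, 8))
W2 = rng.normal(0, 1, (8, 1))
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    # Forward pass: each layer transforms the previous layer's activations.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: nudge every connection to reduce the prediction error.
    err = out - y
    grad_out = err * out * (1 - out)
    grad_h = (grad_out @ W2.T) * h * (1 - h)
    W2 -= 1.0 * h.T @ grad_out
    b2 -= 1.0 * grad_out.sum(axis=0, keepdims=True)
    W1 -= 1.0 * X.T @ grad_h
    b1 -= 1.0 * grad_h.sum(axis=0, keepdims=True)

# After training, the network reproduces the XOR pattern.
pred = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print(np.round(pred).ravel())
```

The same principle – adjusting millions of connection weights so the network’s outputs match training data – underlies end-to-end driving models, just at enormously larger scale.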
“We’re not aiming to build the full vertical stack,” Kendall said. “We’re not building our own cars. We’re not building our own cloud infrastructure. We’re not building our own mobility network.
“Our expertise is in the AI Driver, and we aim to partner with the biggest and best, whether it’s a car company or a mobility platform like Uber, or of course with Microsoft and the Azure infrastructure that underpins everything we do.”
“What I’m so grateful for is that Microsoft made a bet on Wayve,” he said, “backed us as a partner quite early on when we were going up against all of the other self-driving giants.”
Wayve has raised $1.3 billion since its inception.
Microsoft is among its believers. In October 2025, Wayve and Microsoft agreed to a new deal that significantly expands Wayve’s use of Azure services. The two organizations have also signed a Strategic Framework Agreement, meaning they will continue to work together in a variety of ways, including extending the technology being developed to other car and vehicle makers and collaborating on marketing and sales. Other companies are also making plans with Wayve.
In a collaboration with Uber announced in June, the two companies plan to start a limited trial of passenger service in London with Wayve-equipped cars this year. Wayve has also announced a deal with Nissan, which will begin mass production of Wayve-equipped cars in fiscal year 2027.
“We were able to take a new vehicle from Nissan in Japan, a country where we had never driven,” Kendall said. “And in just four months, we were able to take this new vehicle and show that our system could drive autonomously all throughout Tokyo.”
Alex Persin is Wayve’s principal engineer. He leads the company’s “pre-training” team, developing the model that is the AI Driver.
“The analogy we like to use is that when a human learns to drive, they have 16 or 17 years of learning spatial awareness and hand-eye coordination and things like that,” he said. “And then they have maybe 40 hours of driving lessons where they learn the rules of the road and how to handle a car. Pre-training is that first 16 years.”
Working with Microsoft on something new
Using video and other data gathered from its fleet of test cars, as well as simulated data (think video games) and other kinds of data, Wayve’s engineers are teaching the AI model how to navigate safely through dynamic environments.
“The model is learning how objects move in space, how the views from the different cameras relate to each other, how they relate to the actions and how things like speed affect what the world will look like in the future,” Persin said.
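One of the relationships Persin describes, how an object’s speed determines what the world will look like in future frames, can be illustrated with a toy kinematics example. The numbers and frame interval below are hypothetical; a real driving model learns such relationships implicitly from video rather than from explicit formulas.

```python
# Toy illustration: an object's speed determines where it will appear
# in future camera frames. (Hypothetical numbers; real models learn
# this relationship implicitly from large amounts of video data.)

frame_interval_s = 0.1  # assumed time between camera frames, in seconds

# Observed distance of a pedestrian ahead of the car in two frames (metres).
pos_t0 = 12.0
pos_t1 = 11.2

# Estimate speed from the two observations...
speed = (pos_t1 - pos_t0) / frame_interval_s  # negative: closing distance

# ...and extrapolate where the pedestrian will be in the next frame.
pos_t2_predicted = pos_t1 + speed * frame_interval_s

print(speed, pos_t2_predicted)
```

A neural network never sees this formula; it is rewarded, during training, for making predictions consistent with it across millions of real examples.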
He added that the data-hungry system for training Wayve’s AI model relies on Microsoft’s large-scale capacities. He cited Azure Blob Storage (blob is short for binary large object; in Wayve’s case, petabytes of video and other data) and Azure Kubernetes Service (AKS) as essential to supporting training and meeting the computational demands of running the model.
Persin reflected on how Wayve and Microsoft have collaborated on the tools that helped Wayve create something truly new.
“One concrete example is that AKS used to support only 1,000 nodes,” said Persin. A node is typically a single server that may contain several GPUs, and a cluster is a group of nodes. “We wanted single clusters bigger than that, and now the service supports 5,000 nodes, which has meant that we didn’t have to go and run our own Kubernetes service ourselves … so that has accelerated our own development.”
Marta Wolinska, a machine learning engineer, works on Wayve’s driving performance team. Her work involves adapting the model to different types of vehicles with different camera setups and other kinds of sensors, such as radar and lidar (light detection and ranging).
She points out that new cars already incorporate many AI features, like lane detection and some degree of assisted driving, but that Wayve’s technology takes things to a different level.
She said what has impressed her and other Wayve engineers and computer scientists is how well the model reacts to real-world situations it might not have encountered in training.
“Like slowing down for geese crossing the road or squirrels, that kind of thing,” she said. “It’s really those long-tail scenarios that we generalize to really well.”
The benefits of self-driving cars
During our Wayve-equipped car’s journey from Wayve’s London headquarters, near King’s Cross, to Trafalgar Square, we got a good sampling of the complexity of the British capital’s traffic.
Takeoffs were smooth; so were the frequent stops. No geese or squirrels were encountered, but the car did clearly see, and stop for, another careless pedestrian, this one crossing after the light had turned. It delivered its four passengers to Trafalgar Square and back without incident. The safety operator never needed to intervene along the route he had planned.
Wayve CEO Kendall is enthusiastic about the impact Wayve and its competitors might have in London and elsewhere.
“I think Londoners are going to be delighted by self-driving car services because the benefits they bring are enormous,” he said.
He said driverless cars could also change the urban environment by reducing the need for parking spaces: because self-driving cars could be shared or hired, they would spend more time in use and less time parked. Kendall said the technology Wayve is developing is ultimately part of a larger trend, “embodied AI,” that has not gotten as much attention as large language models like Copilot.
“I think over the next decade we’re going to see the rise of embodied AI bringing AI into the physical world,” he said. “What this gives us the opportunity to do is, of course, the enormous part of our lives that involves physical interactions, whether this is self-driving cars, logistics, health care, robotics, manufacturing, domestic robotics. All of these applications in the physical world can benefit from AI as well.”