
Waabi’s GenAI promises to do much more than power autonomous trucks

For the last two decades, Raquel Urtasun, founder and CEO of autonomous trucking startup Waabi, has been developing artificial intelligence systems that can reason the way a human would.

The AI pioneer previously served as chief scientist at Uber ATG before founding Waabi in 2021. Waabi launched with an “AI-first approach” to accelerate the commercial deployment of autonomous vehicles, starting with long-haul trucks.

“If you can build systems that can actually do that, suddenly you’ll need a lot less data,” Urtasun told TechCrunch. “It takes a lot less compute. If you are able to reason efficiently, you don’t need to have fleets of vehicles deployed everywhere in the world.”

Building an AI-powered AV stack that perceives the world the way a human does and reacts in real time is something Tesla has been trying to do with its vision-first approach to autonomous driving. The difference, aside from Waabi’s willingness to use lidar sensors, is that Tesla’s Full Self-Driving system relies on “imitation learning” to learn how to drive. That requires Tesla to collect and analyze millions of videos of real-world driving situations, which it uses to train its AI model.
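To make the contrast concrete, here is a minimal sketch of what imitation learning boils down to in code: supervised learning that maps logged observations to the human driver’s recorded actions. The data, model, and names below are toy stand-ins for illustration, not Tesla’s (or anyone’s) actual pipeline.

```python
# Minimal behavior-cloning sketch: fit a model that maps recorded
# observations to the human driver's recorded actions. Toy numpy linear
# model on random stand-in data; purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for logged driving data: each row is a feature vector derived
# from camera frames, each label is the steering command the human applied.
observations = rng.normal(size=(10_000, 8))
human_actions = observations @ rng.normal(size=8) + 0.1 * rng.normal(size=10_000)

# "Learning to drive" by imitation = supervised regression on those logs.
weights, *_ = np.linalg.lstsq(observations, human_actions, rcond=None)

def drive(observation: np.ndarray) -> float:
    """Predict a steering command by imitating the humans in the dataset."""
    return float(observation @ weights)

print(drive(observations[0]), human_actions[0])
```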

The Waabi Driver, on the other hand, has done most of its training, testing and validation in a closed-loop simulator called Waabi World, which automatically builds digital twins of the world from data; performs sensor simulation in real time; creates scenarios to stress-test the Waabi Driver; and teaches the driver to learn from its mistakes without human intervention.
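As a rough illustration of what a closed-loop setup looks like in principle, the sketch below wires together a toy simulator, a scenario generator, and a driving policy that updates itself when it fails. All of the names and the learning rule are hypothetical simplifications, not Waabi World’s actual interfaces.

```python
# Toy closed-loop training loop: generate a stress-test scenario, roll the
# driver through it, and feed its failures back into the policy without a
# human in the loop. Illustrative only.
import random


class SimWorld:
    """Toy stand-in for a digital-twin simulator with sensor simulation."""

    def reset(self, scenario):
        self.position, self.hazard = 0.0, scenario["hazard_at"]
        return {"distance_to_hazard": self.hazard - self.position}

    def step(self, action):
        self.position += action  # action = distance to advance this tick
        crashed = self.position >= self.hazard
        return {"distance_to_hazard": self.hazard - self.position}, crashed


class DriverPolicy:
    """Toy policy: drive fast unless the hazard is close."""

    def __init__(self):
        self.caution_distance = 1.0  # "learned" parameter

    def act(self, obs):
        return 0.1 if obs["distance_to_hazard"] < self.caution_distance else 1.0

    def learn_from_failure(self):
        # Crude stand-in for learning: become more cautious after a crash.
        self.caution_distance += 0.5


def stress_test_scenario():
    """Stand-in for automatic scenario generation."""
    return {"hazard_at": random.uniform(2.0, 10.0)}


policy = DriverPolicy()
for episode in range(100):
    world = SimWorld()
    obs = world.reset(stress_test_scenario())
    for _ in range(50):
        obs, crashed = world.step(policy.act(obs))
        if crashed:
            policy.learn_from_failure()  # closed loop: failures update the policy
            break
```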

In just four years, that simulator has helped Waabi launch commercial pilots (with a human driver in the front seat) in Texas, many of which run through a partnership with Uber Freight. Waabi World is also central to the startup hitting its planned fully autonomous commercial launch in 2025.

But Waabi’s long-term mission goes far beyond trucks.

“This technology is extremely, extremely powerful,” said Urtasun, who spoke to TechCrunch via video interview with a whiteboard full of hieroglyph-like formulas behind her. “It has an amazing ability to generalize, it is very flexible and it develops very quickly. And it’s something we can expand to do much more than transport trucks in the future… This could be robotaxis. It could be humanoids or warehouse robotics. This technology can solve any of those use cases.”

The promise of Waabi’s technology, which will first be put to work scaling autonomous trucking, has allowed the startup to close a $200 million Series B round led by existing investors Uber and Khosla Ventures. Other strategic investors include Nvidia, Volvo Group Venture Capital, Porsche Automobil Holding SE, Scania Invest and Ingka Investments. The round brings Waabi’s total funding to $283.5 million.

The size of the round and the strength of its participants are particularly noteworthy given the blows the AV industry has taken in recent years. In the trucking sector alone, Embark Trucks shut down, Waymo decided to pause its self-driving freight business, and TuSimple wound down its U.S. operations. Meanwhile, in the robotaxi space, Argo AI faced its own shutdown, Cruise lost its permits to operate in California following a major safety incident, Motional cut nearly half of its workforce, and regulators are actively investigating Waymo and Zoox.

“The strongest companies are built by raising funds during times that are really difficult, and the AV industry in particular has suffered a lot of setbacks,” Urtasun said.

That said, AI-focused players in this second wave of autonomous vehicle startups have notched impressive capital raises this year. UK-based Wayve, which is also developing a self-learning rather than rules-based autonomous driving system, closed a $1.05 billion Series C led by SoftBank Group in May. And Applied Intuition raised a $250 million round in March at a $6 billion valuation to bring AI to automotive, defense, construction and agriculture.

“In the context of AV 1.0, it is very clear today that it is capital intensive and progress is very slow,” Urtasun said, noting that the robotics and autonomous driving industry has been held back by complex and brittle AI systems. “And investors, I would say, are not very enthusiastic about that approach.”

However, what excites investors today is the promise of generative AI, a term that wasn’t exactly in vogue when Waabi launched, but which nevertheless describes the system Urtasun and her team created. Urtasun says Waabi is building a next generation of generative AI, one that can be deployed in the physical world. And unlike today’s popular language-based genAI models, like OpenAI’s ChatGPT, Waabi has figured out how to build such systems without relying on huge datasets, large language models, and all the compute that comes with them.

The Waabi Driver, says Urtasun, has the remarkable ability to generalize. So instead of trying to train a system on every possible data point that has ever existed or could exist, the system can learn from a few examples and handle the unknown safely.

“That was in the design. We build these systems that can perceive the world, create abstractions of the world, and then take those abstractions and reason about, ‘What might happen if I do this?'” Urtasun said.
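The quote describes a perceive-abstract-reason loop: compress raw sensor data into an abstract state, then evaluate candidate actions by asking what each would lead to. Below is a deliberately tiny sketch of that idea; the state, world model, and scoring function are invented for illustration and are not drawn from Waabi’s system.

```python
# Toy "what might happen if I do this?" loop: roll an abstract state forward
# under each candidate action and pick the one with the best predicted outcome.
from dataclasses import dataclass


@dataclass
class AbstractState:
    gap_to_obstacle_m: float  # abstraction of raw sensor data
    speed_mps: float


def predict_next(state: AbstractState, accel_mps2: float, dt: float = 0.5) -> AbstractState:
    """Toy world model: advance the abstract state under a candidate action."""
    new_speed = max(0.0, state.speed_mps + accel_mps2 * dt)
    return AbstractState(
        gap_to_obstacle_m=state.gap_to_obstacle_m - new_speed * dt,
        speed_mps=new_speed,
    )


def score(state: AbstractState) -> float:
    """Prefer futures that keep a larger gap; heavily penalize predicted collisions."""
    return -1000.0 if state.gap_to_obstacle_m <= 0 else state.gap_to_obstacle_m


def choose_action(state: AbstractState, candidates=(-3.0, 0.0, 1.0)) -> float:
    # Simulate each candidate acceleration and keep the best-scoring future.
    return max(candidates, key=lambda a: score(predict_next(state, a)))


print(choose_action(AbstractState(gap_to_obstacle_m=20.0, speed_mps=10.0)))  # prints -3.0 (brake)
```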

This more human, reasoning-based approach is much more scalable and more capital efficient, Urtasun says. It is also vital for validating safety-critical systems operating at the edge; you don’t want a system that takes a couple of seconds to react, or you’ll crash the vehicle, she said. Waabi has announced a partnership to bring Nvidia’s Drive Thor in-vehicle computer to its autonomous trucks, which will give the startup access to automotive-grade compute at scale.

On the road, this means the Waabi Driver understands that there is something solid ahead and that it should drive carefully around it. It may not know exactly what that something is, but it knows how to avoid it. Urtasun also said the Waabi Driver has been able to predict how other road users will behave without needing to be trained on each specific case.

“It understands things without us telling the system the concept of objects, how they move in the world, that different things move differently, that there is occlusion, that there is uncertainty, how to behave when it rains a lot,” Urtasun said. “All these things are learned automatically. And because it’s exposed right now to driving scenarios, it learns all of those capabilities.”

She noted that Waabi’s unified, efficient architecture can be applied to other autonomy use cases.

“If you expose it to interactions in a warehouse, picking things up and putting things down, it can learn that, no problem,” she said. “You can expose it to multiple use cases and it can learn to develop all those skills together. There is no limit in terms of what it can do.”