By now you’ve probably seen those top-down graphical images of a self-driving car navigating a neon-hued world of yellow and purple boxes representing other cars and pedestrians. These images translate the raw data produced by a self-driving vehicle’s hardware and software stack into something more visually digestible for operators, helping them better understand how their cars “see” and interact with the world around them.
Now, two big players in the self-driving car space — Uber and GM’s Cruise — are putting their visualization software on the web and making it free for anyone to use. It’s an unprecedented step in the world of closely guarded self-driving secrets, but one that will hopefully encourage developers to build a variety of cool applications that can, in the end, lift up the entire industry.
In a Medium post last week, Cruise introduced its graphics library of two- and three-dimensional scenes called “Worldview.” “We hope Worldview will lower the barrier to entry into the powerful world of WebGL, giving web developers a simple foundation and empowering them to build more complex visualizations,” the company said.
The library “provides 2D and 3D cameras, mouse and keyboard movement controls, click interaction, and a suite of built-in drawing commands,” Cruise said. “Now our engineers can build custom visualizations easily, without having to learn complex graphics APIs or write wrappers to make them work with React.”
Uber’s new tool seems more geared to AV operators specifically. The company’s Autonomous Visualization System (AVS for short) is a “customizable web-based platform that allows self-driving technology developers — big or small — to transform their vehicle data into an easily digestible visual representation of what the vehicle is seeing in the real world,” Uber says.
As autonomous vehicles log more and more miles on public roads, there is an even greater need to isolate certain edge cases to help operators understand why their cars made certain decisions. The visualization system allows engineers to break out and play back certain trip intervals for closer inspection.
Today, many AV operators rely on off-the-shelf visualization systems that weren’t designed with self-driving cars in mind. Often they are limited to bulky desktop computers that are difficult to navigate. Uber is now letting rival AV operators piggyback on its web-based visualization platform so they don’t have to “learn complex computer graphics and data visualization techniques in order to deliver effective tooling solutions,” the company says in a blog post.
“Being able to visually explore the sensor data, predicted paths, tracked objects, and state information like acceleration and velocity is invaluable to the triage process,” said Drew Gray, chief technology officer at self-driving startup Voyage, in a statement provided by Uber. “At Voyage we use this information to make data-driven decisions on engineering priorities.”
The move comes less than two months after Uber returned to public roads for the first time since one of the company’s vehicles struck and killed a pedestrian in Tempe, Arizona, in March 2018. Uber’s autonomous vehicles are back on the road in Pittsburgh, albeit in a much more scaled-back fashion.