AV Testing Costs Slashed by a Factor of 1,000 with Simulated Terrible Drivers

University of Michigan
From left, Sean Shen, a research engineer at UMTRI, Xintao Yan and Haowei Sun, both civil engineering PhD students, prepare to take out one of the autonomous vehicles for a test run at Mcity on North Campus of the University of Michigan in Ann Arbor on Wednesday, January 18, 2023. Image credit: Brenda Ahearn/University of Michigan, College of Engineering, Communications and Marketing

Study: Dense reinforcement learning for safety validation of autonomous vehicles (DOI: 10.1038/s41586-023-05732-2)

The push toward truly autonomous vehicles has been hindered by the cost and time associated with safety testing, but a new system developed at the University of Michigan shows that artificial intelligence can reduce the testing miles required by 99.99%.

It could kick off a paradigm shift that enables manufacturers to verify more quickly whether their autonomous vehicle technology can save lives and reduce crashes. In a simulated environment, vehicles trained by artificial intelligence perform perilous maneuvers, forcing the AV to handle the kinds of situations that drivers confront only rarely on the road but that are needed to train the vehicles well.

To encounter those kinds of situations often enough for data collection, real-world test vehicles would need to drive hundreds of millions to hundreds of billions of miles.
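A rough back-of-envelope sketch shows where numbers of that size come from. The event rate and sample size below are illustrative assumptions, not figures from the study; only the 99.99% reduction is the reported result.

```python
# Back-of-envelope sketch of the "curse of rarity" (illustrative numbers only).
# Assumption: a safety-critical event occurs roughly once per 1 million miles,
# and a statistically meaningful evaluation needs on the order of 1,000 observed events.

event_rate_per_mile = 1 / 1_000_000   # assumed frequency of safety-critical events
events_needed = 1_000                 # assumed sample size for a confident estimate

naturalistic_miles = events_needed / event_rate_per_mile
print(f"Naturalistic driving required: {naturalistic_miles:,.0f} miles")   # 1,000,000,000

# If AI-trained background traffic concentrates those events, cutting the
# required mileage by 99.99% (the reduction reported in the study) leaves:
accelerated_miles = naturalistic_miles * (1 - 0.9999)
print(f"With a 99.99% reduction: {accelerated_miles:,.0f} miles")          # 100,000
```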

"The safety critical events-the accidents, or the near misses-are very rare in the real world, and often time AVs have difficulty handling them," said Henry Liu, U-M professor of civil engineering and director of both Mcity and the Center for Connected and Automated Transportation, a regional transportation research center funded by the U.S. Department of Transportation.

U-M researchers refer to the problem as the "curse of rarity," and they're tackling it by learning from real-world traffic data that contains rare safety-critical events. Testing conducted on test tracks mimicking urban as well as highway driving showed that the AI-trained virtual vehicles can accelerate the testing process by thousands of times. The study appears on the cover of Nature.

Detail photograph of screens during the virtual reality test run of an autonomous vehicle at Mcity on North Campus of the University of Michigan in Ann Arbor on Wednesday, January 18, 2023. Image credit: Brenda Ahearn/University of Michigan, College of Engineering, Communications and Marketing

"The AV test vehicles we're using are real, but we've created a mixed reality testing environment. The background vehicles are virtual, which allows us to train them to create challenging scenarios that only happen rarely on the road," Liu said.

U-M's team trained the background vehicles with an approach that strips away non-safety-critical information from the driving data used in the simulation. Essentially, it discards the long stretches when other drivers and pedestrians behave in responsible, expected ways, but preserves the dangerous moments that demand action, such as another driver running a red light.

By using only safety-critical data to train the neural networks that make maneuver decisions, test vehicles can encounter more of those rare events in a shorter amount of time, making testing much cheaper.
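A minimal sketch of that idea, assuming a generic policy-gradient setup: the criticality test and the small network below are simplified stand-ins for the study's dense deep-reinforcement-learning method, not its actual implementation.

```python
# Sketch of "dense" training: keep only the safety-critical steps of each
# simulated episode and update the background-vehicle policy on those alone.
# The criticality test is a placeholder; the actual method defines criticality
# from the estimated likelihood of a crash or near miss.

import torch
import torch.nn as nn

policy = nn.Sequential(            # maps a traffic state to maneuver scores
    nn.Linear(16, 64), nn.ReLU(),
    nn.Linear(64, 5),              # e.g., 5 candidate maneuvers for a background vehicle
)
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)

def is_safety_critical(step):
    """Placeholder: flag steps where a crash or near miss is plausible."""
    return step["risk_estimate"] > 0.01

def dense_update(episode):
    """Policy-gradient step using only the safety-critical transitions."""
    critical = [s for s in episode if is_safety_critical(s)]
    if not critical:
        return  # long uneventful stretches contribute nothing to the update
    loss = torch.zeros(())
    for step in critical:
        logits = policy(step["state"])
        log_prob = torch.log_softmax(logits, dim=-1)[step["action"]]
        loss = loss - log_prob * step["return"]   # REINFORCE-style weighting
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

The point of the filtering is that gradient signal from hours of uneventful driving would otherwise swamp the handful of rare events the networks are meant to learn from.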

"Dense reinforcement learning will unlock the potential of AI for validating the intelligence of safety-critical autonomous systems such as AVs, medical robotics and aerospace systems," said Shuo Feng, assistant professor in the Department of Automation at Tsinghua University and former assistant research scientist at the U-M Transportation Research Institute.


"It also opens the door for accelerated training of safety-critical autonomous systems by leveraging AI-based testing agents, which may create a symbiotic relationship between testing and training, accelerating both fields."

And it's clear that training, along with the time and expense involved, is an impediment. An October Bloomberg article stated that although robotaxi leader Waymo's vehicles had driven 20 million miles over the previous decade, far more data was needed.

"That means," the author wrote, "its cars would have to drive an additional 25 times their total before we'd be able to say, with even a vague sense of certainty, that they cause fewer deaths than bus drivers."

Testing was conducted at Mcity's urban environment in Ann Arbor, as well as the highway test track at the American Center for Mobility in Ypsilanti.

Launched in 2015, Mcity was the world's first purpose-built test environment for connected and autonomous vehicles. With new support from the National Science Foundation, outside researchers will soon be able to run remote, mixed reality tests using both the simulation and the physical test track, similar to those reported in this study.

Real-world data sets that support Mcity simulations are collected from smart intersections in Ann Arbor and Detroit, with more intersections to be equipped. Each intersection is fitted with privacy-preserving sensors to capture and categorize each road user, identifying its speed and direction. The research was funded by the Center for Connected and Automated Transportation and the National Science Foundation.
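As an illustration of the kind of record such an instrumented intersection might produce, the fields below are an assumption about the data, not the project's published schema.

```python
# Hypothetical shape of a privacy-preserving road-user observation from a smart
# intersection: no images or identities, just class, kinematics, and time.

from dataclasses import dataclass

@dataclass
class RoadUserObservation:
    intersection_id: str   # which instrumented intersection produced the record
    user_class: str        # "car", "truck", "bicycle", "pedestrian", ...
    speed_mps: float       # estimated speed in meters per second
    heading_deg: float     # direction of travel, degrees clockwise from north
    timestamp_s: float     # observation time (Unix seconds)

obs = RoadUserObservation("ann_arbor_intersection_01", "pedestrian", 1.4, 270.0, 1_674_050_000.0)
```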
