Real-World Road Rules
Self-driving cars could make driving safer, with fatigue and distraction—not to mention irritation at that idiot who just cut you off—all taken out of the equation. (Plus, you might get to read or catch a few extra z’s.)
Autonomous vehicles will hit the market only when they can respond to driving challenges without fail, whether from ice, downpours and other environmental challenges, or from erratic drivers and headphone-wearing pedestrians with a death wish. A full determination of the capabilities of the vehicles’ systems could require hundreds of millions of miles of test drives.
But don’t cancel your plans for a nap while commuting just yet: University of Maryland computer scientist Dinesh Manocha, in collaboration with colleagues from Baidu Research and the University of Hong Kong, has developed a photo-realistic simulation system for training and validating self-driving vehicles, one that could make the whole process faster and easier to carry out in the lab.
In a paper published yesterday in the journal Science Robotics, the researchers described their new system, called Augmented Autonomous Driving Simulation (AADS), which aims to provide a richer, more authentic simulation than current systems that use game engines or high-fidelity computer graphics and mathematically rendered traffic patterns.
“This work represents a new simulation paradigm in which we can test the reliability and safety of automatic driving technology before we deploy it on real cars and test it on the highways or city roads,” said Manocha, one of the paper’s corresponding authors, and a professor with joint appointments in computer science, electrical and computer engineering, and the Institute for Advanced Computer Studies (UMIACS).
Current state-of-the-art simulation systems fall short both in rendering photo-realistic environments and in reproducing real-world traffic flow patterns and driver behaviors.
AADS is a data-driven system that more accurately represents the inputs a self-driving car would receive on the road. Self-driving cars rely on a perception module, which receives and interprets information about the real world from cameras and other sensors, and a navigation module, which makes decisions, such as where to steer or whether to brake or accelerate, based on the perception module's output.
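To make that two-module split concrete, here is a minimal sketch in Python. The class names, fields, and thresholds are illustrative assumptions, not taken from the paper or any real driving stack.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical structures: what a perception module might hand to navigation.
@dataclass
class DetectedObject:
    kind: str           # e.g. "car", "pedestrian", "cyclist"
    distance_m: float   # distance ahead of the ego vehicle, in meters
    closing_mps: float  # closing speed toward the ego vehicle, in m/s

@dataclass
class WorldState:
    objects: List[DetectedObject]
    lane_offset_m: float  # drift from lane center, in meters

class PerceptionModule:
    """Interprets raw sensor data (camera frames, lidar points) into a WorldState."""
    def interpret(self, camera_frame, lidar_points) -> WorldState:
        # Stub: a real module would run detection and tracking models here.
        return WorldState(objects=[DetectedObject("car", 8.0, 1.5)], lane_offset_m=0.4)

class NavigationModule:
    """Chooses steering and braking commands from the perceived world."""
    def decide(self, state: WorldState) -> dict:
        brake = any(o.distance_m < 10.0 and o.closing_mps > 0 for o in state.objects)
        steer = -0.1 * state.lane_offset_m  # steer back toward lane center
        return {"steer": steer, "brake": brake}

# A simulator's job is to feed realistic sensor inputs into this pipeline.
state = PerceptionModule().interpret(camera_frame=None, lidar_points=None)
print(NavigationModule().decide(state))  # {'steer': -0.04..., 'brake': True}
```

The value of a simulator lies in how realistic the camera frames and lidar points fed into the first stage are, which is where AADS differs from earlier systems.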
In current simulator technology, the perception module receives input from computer-generated imagery and mathematically modeled movement patterns for pedestrians, bicycles and other cars. As a simulation of the real world, it is both crude and time-consuming to generate.
By contrast, the AADS system combines photos, videos and lidar point clouds—which are like 3D renderings—with real-world trajectory data for road users. Because it’s based on actual driving, it’s far more realistic.
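As a rough illustration of that data-driven idea, the sketch below combines a static background point cloud with the position of another vehicle replayed from a recorded trajectory, producing the scene a simulated car would "see" at a given moment. The names, array shapes, and use of NumPy are assumptions for illustration, not the authors' code.

```python
import numpy as np

# Stand-in for a real background scan (x, y, z points in meters) with moving objects removed.
background_points = np.random.rand(10_000, 3) * 100.0

# Recorded trajectory of one other vehicle: (time_s, x, y) samples.
recorded_trajectory = np.array([
    [0.0,  5.0, 0.0],
    [1.0,  9.0, 0.2],
    [2.0, 13.0, 0.1],
])

def agent_position(trajectory: np.ndarray, t: float) -> np.ndarray:
    """Interpolate the agent's (x, y) position at time t from recorded samples."""
    x = np.interp(t, trajectory[:, 0], trajectory[:, 1])
    y = np.interp(t, trajectory[:, 0], trajectory[:, 2])
    return np.array([x, y])

def compose_scene(t: float) -> np.ndarray:
    """Return the point cloud at time t: static background plus a crude cluster
    of points standing in for the replayed vehicle at its recorded position."""
    pos = agent_position(recorded_trajectory, t)
    vehicle_points = np.random.rand(200, 3) * [4.0, 2.0, 1.5] + [pos[0], pos[1], 0.0]
    return np.vstack([background_points, vehicle_points])

scene = compose_scene(1.5)
print(scene.shape)  # background points plus the replayed vehicle's points
```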
“We’re capturing real behavior and patterns of movement,” Manocha said. “The way humans drive is not easy to capture by mathematical models and laws of physics.”
So the researchers extracted real trajectory data from a wide range of driving videos and modeled driving behaviors using social science methodologies.
“This data-driven approach has given us a much more realistic and beneficial traffic simulator,” Manocha said. “Because of the realism of the simulator, we can better evaluate navigation strategies of an autonomous driving system.”
[Video: demonstration of the AADS system]
Source: Maryland Today
Published March 29, 2019