
Creating 3D Virtual Driving Environments for Simulation-Aided Development of Autonomous Driving and Active Safety

By Arvind Jayaraman, MathWorks, and Ashley Micks and Ethan Gross, Ford Motor Company


Recreating traffic scenarios to test autonomous driving in the real world requires significant time, resources, and expense, and can present a safety risk if hazardous scenarios are tested. Using a 3D virtual environment to test many of these traffic scenarios on the desktop or on a cluster significantly reduces the number of required road tests. This paper details an approach that facilitates the development of perception and control algorithms for Level 4 autonomy: a shared memory interface between MATLAB®, Simulink®, and Unreal Engine® 4 that conveys simulated sensor data from the virtual environment and sends information (such as vehicle control signals) back to it. The shared memory interface conveys arbitrary numerical data, RGB image data, and point cloud data for the simulation of lidar sensors. The interface consists of a plugin for Unreal Engine, which contains the necessary read/write functions, and a beta toolbox for MATLAB, capable of reading from and writing to the same shared memory locations specified in Unreal Engine, MATLAB, and Simulink. The lidar sensor model was tested by generating point clouds with beam patterns that mimic the Velodyne® HDL-32E (32-beam) sensor and was demonstrated to run at frame rates sufficient for real-time computation by leveraging the graphics processing unit (GPU).
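
To illustrate how such a shared memory interface might be consumed on the MATLAB side, the sketch below maps a shared data region and reads one RGB camera frame from it. This is a minimal sketch only: the file name, image dimensions, and the use of memmapfile are assumptions made for illustration and are not the actual API of the beta toolbox or the Unreal Engine plugin described in the paper.

    % Minimal sketch (illustrative assumptions): read one RGB camera frame
    % that the Unreal Engine plugin is assumed to expose through a
    % memory-mapped file. File name and image size are hypothetical.
    frameFile = 'camera_front.dat';   % hypothetical shared file written by the plugin
    imgHeight = 720;                  % assumed frame height in pixels
    imgWidth  = 1280;                 % assumed frame width in pixels

    % Map the file as a height-by-width-by-3 array of uint8 pixel values.
    m = memmapfile(frameFile, ...
        'Format', {'uint8', [imgHeight imgWidth 3], 'rgb'}, ...
        'Writable', false);

    % Copy the current frame out of the mapped memory and display it.
    rgbFrame = m.Data(1).rgb;
    imshow(rgbFrame);

Similarly, a beam pattern that mimics a 32-beam sensor such as the HDL-32E can be parameterized by a fixed set of vertical angles; the values below are approximate, nominal figures included only to show how such a pattern might be specified, not the parameters used in the paper.

    % Approximate HDL-32E-style beam pattern: 32 beams spanning
    % roughly +10.67 to -30.67 degrees of vertical field of view.
    numBeams       = 32;
    verticalAngles = linspace(10.67, -30.67, numBeams);   % degrees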

Copyright © 2020 by The MathWorks, Inc. Published by SAE International, with permission.

This paper was presented at WCX17: SAE World Congress Experience.


Published 2017
