Automated Driving Toolbox



Design, simulate, and test ADAS and autonomous driving systems

Get Started:

Reference Applications

Use reference applications as a basis for developing automated driving functionality. Automated Driving Toolbox includes reference applications for forward collision warning (FCW), lane keeping assist (LKA), and automated parking valet.

Driving Scenario Simulation

Author driving scenarios, use sensor models, and generate synthetic data to test automated driving algorithms in simulated environments.

Cuboid Driving Simulation

Generate synthetic detections from radar and camera sensor models, and incorporate those detections into driving scenarios to test automated driving algorithms with a cuboid-based simulator. Define road networks, actors, and sensors using the Driving Scenario Designer app. Import prebuilt Euro NCAP tests and OpenDRIVE road networks.
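As a minimal sketch of the cuboid workflow, the snippet below builds a scenario programmatically rather than in the app; the road geometry, waypoints, and sensor mounting values are illustrative, and running it requires Automated Driving Toolbox.

```matlab
% Minimal cuboid scenario: one straight road, one ego vehicle, and a
% camera sensor model producing synthetic detections each time step.
scenario = drivingScenario;
roadCenters = [0 0; 50 0];                 % 50 m straight road segment
road(scenario, roadCenters);

egoVehicle = vehicle(scenario, 'ClassID', 1);
trajectory(egoVehicle, [1 0; 49 0], 15);   % follow waypoints at 15 m/s

% Vision sensor model mounted near the front bumper (values illustrative)
sensor = visionDetectionGenerator('SensorIndex', 1, ...
    'SensorLocation', [3.4 0], 'Height', 1.1);

while advance(scenario)
    t = scenario.SimulationTime;
    targets = targetPoses(egoVehicle);     % other actors in the ego frame
    [dets, numDets, isValid] = sensor(targets, t);
end
```

The same scenario could instead be authored interactively in the Driving Scenario Designer app and exported as a MATLAB function.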

Unreal Engine Driving Scenario Simulation

Develop, test, and visualize the performance of driving algorithms in a 3D simulated environment rendered using the Unreal Engine® from Epic Games®.

Use a 3D simulation environment to record synthetic sensor data, develop a lane marker detection system, and test that system under different scenarios.

Ground Truth Labeling

Automate labeling of ground truth data and compare output from an algorithm under test with ground truth data.

Automating Ground Truth Labeling

Use the Ground Truth Labeler app for interactive and automated ground truth labeling to facilitate object detection, semantic segmentation, and scene classification.

Testing Perception Algorithms

Evaluate the performance of perception algorithms by comparing ground truth data against algorithm outputs.

Evaluating lane detection output against ground truth.

Perception with Computer Vision and Lidar

Develop and test vision and lidar processing algorithms for automated driving.

Vision System Design

Develop computer vision algorithms for vehicle and pedestrian detection, lane detection, and classification.

Monocular camera sensor simulation output.
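A common first step in vision system design is configuring a monocular camera model so that image points can be mapped onto the road surface. The sketch below assumes Automated Driving Toolbox; all intrinsics and mounting values are illustrative.

```matlab
% Configure a monocular camera model and convert an image point
% (e.g., a detected lane-marker pixel) into vehicle coordinates.
focalLength    = [800 800];        % [fx fy] in pixels (illustrative)
principalPoint = [320 240];        % [cx cy] in pixels
imageSize      = [480 640];        % [rows cols]
intrinsics = cameraIntrinsics(focalLength, principalPoint, imageSize);

height = 1.5;                      % camera mounting height in meters
sensor = monoCamera(intrinsics, height);

imagePoint   = [300 300];                      % pixel location
vehiclePoint = imageToVehicle(sensor, imagePoint);  % [x y] on the road, m
```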

Sensor Fusion and Tracking

Perform multisensor fusion using a multi-object tracking framework with Kalman filters.
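A hedged sketch of that framework: wrap per-sensor measurements as `objectDetection` objects and feed them to a `multiObjectTracker`, which maintains Kalman-filter track states. The measurement values below are made up for illustration, and the code requires Automated Driving Toolbox.

```matlab
% Multi-object tracker initialized with a constant-velocity extended
% Kalman filter for each new track.
tracker = multiObjectTracker('FilterInitializationFcn', @initcvekf, ...
    'AssignmentThreshold', 30);

% One radar and one camera position measurement of the same object,
% both taken at t = 0.1 s (values illustrative)
detections = {objectDetection(0.1, [10;   0;   0]); ...
              objectDetection(0.1, [10.2; 0.1; 0])};

confirmedTracks = updateTracks(tracker, detections, 0.1);
```

Calling `updateTracks` at each simulation step associates new detections with existing tracks, corrects the filters, and confirms or deletes tracks over time.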


HERE HD Live Map

Access and visualize high-definition map data from the HERE HD Live Map service. Display vehicle and object locations on streaming map viewers.

Accessing HERE HD Live Map Data

Read map data from the HERE HD Live Map web service, including tiled map layers that contain detailed road, lane, and localization information.

Using HERE HD Live Map to verify lane configurations.
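Reading map tiles can be sketched as below; this assumes valid HERE HD Live Map credentials have been configured, and the coordinates are illustrative.

```matlab
% Create a reader for the map tiles covering a geographic location,
% then read one map layer (requires HERE HD Live Map credentials).
latitude  = 42.30;
longitude = -83.68;
reader = hereHDLMReader(latitude, longitude);

% Topology geometry layer: road and lane topology for those tiles
topology = read(reader, 'TopologyGeometry');
```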

Visualizing Map Data

Use streaming geographic coordinates to plot the positions of vehicles as they drive.

Displaying streaming map data.
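A minimal sketch of streaming display with `geoplayer`, assuming Automated Driving Toolbox; the coordinates simulate a short straight-line drive for illustration.

```matlab
% Open a streaming geographic player centered on a starting position
lat = 42.2989;  lon = -83.6847;    % illustrative coordinates
zoomLevel = 16;
player = geoplayer(lat, lon, zoomLevel);

% Plot positions as they arrive (here, synthetic northbound motion)
for k = 1:50
    plotPosition(player, lat + k*1e-4, lon);
end
```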

Path Planning

Plan driving paths with vehicle costmaps and motion-planning algorithms.
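The costmap-plus-planner workflow can be sketched as follows; the cost matrix, obstacle region, and poses are illustrative, and the code requires Automated Driving Toolbox.

```matlab
% Plan a path with RRT* over a vehicle costmap
costVal = zeros(50, 50);            % 50 x 50 m map, 0 = free space
costVal(20:30, 20:30) = 1;          % mark an obstacle region as occupied
costmap = vehicleCostmap(costVal);

planner   = pathPlannerRRT(costmap);
startPose = [5  5  0];              % [x (m), y (m), heading (deg)]
goalPose  = [45 45 90];
refPath   = plan(planner, startPose, goalPose);
```

The planned reference path can then be interpolated into poses and handed to the vehicle controllers described below.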

Vehicle Controllers

Use lateral and longitudinal controllers to follow a planned trajectory.

Stanley lateral controller for computing steering angles.
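As a sketch, the Stanley controller computes a steering angle from the reference pose, the current pose, and the current speed; the values below are illustrative, and the function requires Automated Driving Toolbox.

```matlab
% Compute a steering angle with the Stanley lateral controller
refPose  = [10 10 45];     % reference pose [x (m), y (m), heading (deg)]
currPose = [10  9 40];     % current vehicle pose
currVelocity = 10;         % forward speed in m/s

steerAngle = lateralControllerStanley(refPose, currPose, currVelocity);
```

The controller combines the heading error with a term proportional to the cross-track error divided by speed, so it corrects lateral offset more aggressively at low speeds.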