Automated Driving Toolbox
Design, simulate, and test ADAS and autonomous driving systems
Automated Driving Toolbox™ provides algorithms and tools for designing, simulating, and testing ADAS and autonomous driving systems. You can design and test vision and lidar perception systems, as well as sensor fusion, path planning, and vehicle controllers. Visualization tools include a bird’s-eye-view plot and scope for sensor coverage, detections and tracks, and displays for video, lidar, and maps. The toolbox lets you import and work with HERE HD Live Map data and OpenDRIVE® road networks.
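For example, the bird's-eye plot can display sensor coverage areas alongside detections. A minimal MATLAB sketch (the coverage and detection positions below are placeholder values):

```matlab
% Bird's-eye plot with a radar coverage area and example detections.
bep = birdsEyePlot('XLim', [0 90], 'YLim', [-35 35]);

% Coverage area: sensor at the ego origin, 100 m range, 20 deg field of view.
caPlotter = coverageAreaPlotter(bep, 'DisplayName', 'Radar coverage');
plotCoverageArea(caPlotter, [0 0], 100, 0, 20);

% Plot a few hypothetical detection positions in ego coordinates.
detPlotter = detectionPlotter(bep, 'DisplayName', 'Radar detections');
plotDetection(detPlotter, [30 5; 50 -10; 70 2]);
```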
Using the Ground Truth Labeler app, you can automate the labeling of ground truth to train and evaluate perception algorithms. For hardware-in-the-loop (HIL) testing and desktop simulation of perception, sensor fusion, path planning, and control logic, you can generate and simulate driving scenarios. You can simulate camera, radar, and lidar sensor output in a photorealistic 3D environment and sensor detections of objects and lane boundaries in a 2.5-D simulation environment.
Automated Driving Toolbox provides reference application examples for common ADAS and automated driving features, including forward collision warning (FCW), autonomous emergency braking (AEB), adaptive cruise control (ACC), lane keeping assist (LKA), and parking valet. The toolbox supports C/C++ code generation for rapid prototyping and HIL testing, with support for sensor fusion, tracking, path planning, and vehicle controller algorithms.
Cuboid Driving Simulation
Generate synthetic detections from radar and camera sensor models, and incorporate those detections into driving scenarios to test automated driving algorithms with a cuboid-based simulator. Define road networks, actors, and sensors using the Driving Scenario Designer app. Import prebuilt Euro NCAP tests and OpenDRIVE road networks.
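As a sketch of the programmatic workflow (the road geometry, speeds, and sensor settings below are illustrative values, not a specific example from the toolbox):

```matlab
% Cuboid driving scenario: straight road, ego and lead vehicles, radar sensor.
scenario = drivingScenario;
road(scenario, [0 0 0; 80 0 0]);               % 80 m straight road

egoVehicle = vehicle(scenario, 'ClassID', 1);
trajectory(egoVehicle, [1 0 0; 70 0 0], 15);   % ego travels at 15 m/s

leadVehicle = vehicle(scenario, 'ClassID', 1);
trajectory(leadVehicle, [20 0 0; 75 0 0], 10); % slower lead vehicle

% Statistical radar model mounted on the ego vehicle.
sensor = radarDetectionGenerator('SensorIndex', 1, 'MaxRange', 100);

% Step the simulation and generate synthetic detections.
while advance(scenario)
    tgts = targetPoses(egoVehicle);
    [dets, numDets, isValid] = sensor(tgts, scenario.SimulationTime);
end
```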
Unreal Engine Driving Scenario Simulation
Develop, test, and visualize the performance of driving algorithms in a 3D simulated environment rendered using the Unreal Engine® from Epic Games®.
Automating Ground Truth Labeling
Use the Ground Truth Labeler app for interactive and automated ground truth labeling to facilitate object detection, semantic segmentation, and scene classification.
Testing Perception Algorithms
Evaluate the performance of perception algorithms by comparing ground truth data against algorithm outputs.
Vision System Design
Develop computer vision algorithms for vehicle and pedestrian detection, lane detection, and classification.
Lidar Processing
Use lidar data to detect obstacles and segment ground planes.
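A minimal sketch of ground removal and obstacle clustering, assuming `ptCloud` is an organized lidar point cloud (for example, one frame from a recorded Velodyne® sequence) and that a 1 m clustering threshold suits the data:

```matlab
% Separate the ground plane from an organized lidar point cloud.
groundIdx = segmentGroundFromLidarData(ptCloud);
groundCloud = select(ptCloud, groundIdx, 'OutputSize', 'full');
obstacleCloud = select(ptCloud, ~groundIdx, 'OutputSize', 'full');

% Cluster the remaining points into obstacles (1 m distance threshold).
[labels, numClusters] = segmentLidarData(obstacleCloud, 1);
```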
Accessing HERE HD Live Map Data
Read map data from the HERE HD Live Map web service, including tiled map layers that contain detailed road, lane, and localization information.
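A sketch of reading tiled layers programmatically (the coordinates are placeholders, and valid HERE HD Live Map credentials must be configured beforehand):

```matlab
% Create a reader for the map tiles covering a latitude-longitude point.
reader = hereHDLMReader(42.30, -83.68);        % example coordinates

% Read road topology and lane-level layers from the tiled map.
topology = read(reader, 'TopologyGeometry');
lanes = read(reader, 'LaneTopology');
```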
Visualizing Map Data
Use streaming coordinates to map the positions of vehicles as they drive.
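For instance, a geographic player can animate recorded GPS coordinates on a map. A minimal sketch, assuming `lat` and `lon` are vectors of recorded latitude and longitude values:

```matlab
% Stream latitude/longitude coordinates to a geographic map display.
zoomLevel = 12;
player = geoplayer(lat(1), lon(1), zoomLevel);
plotRoute(player, lat, lon);             % show the full route

% Animate the vehicle position along the route.
for i = 1:numel(lat)
    plotPosition(player, lat(i), lon(i));
end
```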
Multisignal Ground Truth Labeling
Label synchronized lidar and video signals simultaneously.
Label lidar point clouds to train deep learning models.
3D Scene Customization
Simulate driving scenarios in a 3D environment using scenes created in the Unreal Editor.
Lidar Sensor Model
Generate synthetic point clouds from programmatic driving scenarios.
Bird's-Eye Scope Enhancements
Visualize radar and lidar data from 3D simulation sensors, and visualize actors from custom blocks.
HERE HD Live Map Roads in Scenarios
Create driving scenarios using imported road data from high-definition geographic maps.