Automated Driving Toolbox™ provides algorithms and tools for designing, simulating, and testing ADAS and autonomous driving systems. You can design and test vision and lidar perception systems, as well as sensor fusion, path planning, and vehicle controllers. Visualization tools include a bird's-eye-view plot and scope for sensor coverage, detections, and tracks, and displays for video, lidar, and maps. The toolbox lets you import and work with HERE HD Live Map data and OpenDRIVE® road networks.
Using the Ground Truth Labeler app, you can automate the labeling of ground truth to train and evaluate perception algorithms. You can also generate driving scenarios and simulate radar and camera sensor outputs for hardware-in-the-loop (HIL) and desktop testing of sensor fusion, path planning, and control logic.
Automated Driving Toolbox provides reference application examples for common ADAS and automated driving features, including forward collision warning (FCW), autonomous emergency braking (AEB), adaptive cruise control (ACC), lane keeping assist (LKA), and parking valet. The toolbox supports C/C++ code generation for rapid prototyping and HIL testing, with support for sensor fusion, tracking, path planning, and vehicle controller algorithms.
Interactively label rectangular ROIs, polylines, or pixels in a video or image sequence by using the Ground Truth Labeler app.
Generate a scenario, simulate sensor detections, and use sensor fusion to track simulated vehicles.
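The workflow above can be sketched programmatically. The following is a minimal example, assuming Automated Driving Toolbox is installed; the specific property values (sensor location, speeds, waypoints) are illustrative choices, not values from this document:

```matlab
% Sketch: build a simple two-vehicle scenario, simulate camera
% detections, and track the detected vehicle with a multi-object tracker.

scenario = drivingScenario;
road(scenario, [0 0; 100 0]);                  % 100 m straight road

ego = vehicle(scenario, 'ClassID', 1);
trajectory(ego, [1 0; 99 0], 20);              % ego travels at 20 m/s

lead = vehicle(scenario, 'ClassID', 1);
trajectory(lead, [20 0; 99 0], 15);            % slower lead vehicle

% Monocular camera model mounted near the ego front bumper (illustrative).
sensor = visionDetectionGenerator('SensorLocation', [1.9 0], ...
    'MaxRange', 100);

% Tracker initialized with a constant-velocity extended Kalman filter.
tracker = multiObjectTracker('FilterInitializationFcn', @initcvekf);

while advance(scenario)
    t = scenario.SimulationTime;
    poses = targetPoses(ego);                  % targets in ego coordinates
    [dets, ~, isValid] = sensor(poses, t);
    if isValid
        tracks = updateTracks(tracker, dets, t);
    end
end
```

The same scenario can be built interactively in the Driving Scenario Designer app and exported to MATLAB code for programmatic variation.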
Use the Driving Scenario Designer app to build a driving scenario and generate vision and radar sensor detections from it.
Understand coordinate systems for automated driving.