Automated Driving System Toolbox™ provides algorithms and tools for designing and testing ADAS and autonomous driving systems. You can automate ground-truth labeling, generate synthetic sensor data for driving scenarios, perform multisensor fusion, and design and simulate vision systems.
For open-loop testing, the system toolbox provides a customizable workflow app and evaluation tools that let you automate ground-truth labeling and test your algorithms against that ground truth. For hardware-in-the-loop (HIL) and desktop simulation of sensor fusion and control logic, you can generate driving scenarios and simulate object lists from radar and camera sensors.
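A simulated object list of the kind described above can be illustrated with a toy sensor model. The sketch below is a hypothetical Python example, not the toolbox's MATLAB API: each true object is reported with Gaussian position noise, and detections are occasionally missed.

```python
import random

def simulate_object_list(true_objects, pos_sigma, detect_prob, rng):
    """Toy radar/camera sensor model (illustrative only).

    true_objects: list of (x, y, vx, vy) ground-truth states.
    Returns a list of noisy detections; each object is reported
    with probability detect_prob and position noise pos_sigma.
    """
    detections = []
    for (x, y, vx, vy) in true_objects:
        if rng.random() > detect_prob:
            continue  # missed detection
        detections.append((x + rng.gauss(0.0, pos_sigma),
                           y + rng.gauss(0.0, pos_sigma),
                           vx, vy))
    return detections
```

A fusion or tracking algorithm can then be exercised against such synthetic object lists without real sensor hardware, which is the point of desktop and HIL simulation.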
Automated Driving System Toolbox supports multisensor fusion development with Kalman filters, assignment algorithms, motion models, and a multiobject tracking framework. Algorithms for vision system design include lane marker detection, vehicle detection using machine learning (including deep learning), and image-to-vehicle coordinate transforms.
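The multiobject tracking framework builds on Kalman filters and motion models like those mentioned above. As a concept illustration (not the toolbox's MATLAB API), here is a minimal one-dimensional constant-velocity Kalman filter in pure Python, with a position-only measurement:

```python
def kf_predict(x, P, dt, q):
    """Predict step: constant-velocity motion model F = [[1, dt], [0, 1]]."""
    px, v = x
    x = [px + dt * v, v]
    p00, p01, p10, p11 = P
    # P = F P F' + Q, with Q a simple diagonal process noise (q, q)
    P = (p00 + dt * (p01 + p10) + dt * dt * p11 + q,
         p01 + dt * p11,
         p10 + dt * p11,
         p11 + q)
    return x, P

def kf_update(x, P, z, r):
    """Update step: scalar position measurement, H = [1, 0], noise variance r."""
    p00, p01, p10, p11 = P
    s = p00 + r                  # innovation covariance
    k0, k1 = p00 / s, p10 / s    # Kalman gain
    y = z - x[0]                 # innovation (measurement residual)
    x = [x[0] + k0 * y, x[1] + k1 * y]
    # P = (I - K H) P
    P = ((1 - k0) * p00, (1 - k0) * p01,
         p10 - k1 * p00, p11 - k1 * p01)
    return x, P
```

Fed noisy position detections of a target moving at constant speed, the filter's velocity estimate converges toward the true speed even though velocity is never measured directly; a multiobject tracker runs one such filter per tracked object, with an assignment algorithm matching detections to tracks.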
Discover more about Automated Driving System Toolbox by exploring these resources:
- Documentation: explore Automated Driving System Toolbox functions and features, including release notes and examples.
- Functions: browse the list of available functions.
- Examples: use Automated Driving System Toolbox to solve scientific and engineering challenges.