Sensor Fusion and Tracking Toolbox
Design, simulate, and test multisensor tracking and positioning systems
Track multiple objects using data from active and passive sensors such as radar, ADS-B, and EO/IR sensors. Customize trackers to handle maneuvering objects.
Track multiple spaceborne objects using data from radar sensors to generate space situational awareness. You can configure the trackers to use a Keplerian motion model or other orbital models.
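The Keplerian motion model mentioned above is two-body gravity. The toolbox itself is MATLAB-based; as a language-agnostic sketch of the underlying dynamics (hypothetical helper names, not the toolbox API), here is an RK4 propagation of a circular orbit in normalized units:

```python
import math

MU = 1.0  # normalized gravitational parameter (mu = G*M)

def two_body_accel(x, y):
    """Keplerian two-body acceleration: a = -mu * r / |r|^3."""
    r3 = (x * x + y * y) ** 1.5
    return -MU * x / r3, -MU * y / r3

def rk4_step(state, dt):
    """One RK4 step of the planar two-body equations of motion."""
    def deriv(s):
        x, y, vx, vy = s
        ax, ay = two_body_accel(x, y)
        return (vx, vy, ax, ay)
    k1 = deriv(state)
    k2 = deriv([s + 0.5 * dt * k for s, k in zip(state, k1)])
    k3 = deriv([s + 0.5 * dt * k for s, k in zip(state, k2)])
    k4 = deriv([s + dt * k for s, k in zip(state, k3)])
    return [s + dt / 6.0 * (a + 2 * b + 2 * c + d)
            for s, a, b, c, d in zip(state, k1, k2, k3, k4)]

# Circular orbit at r = 1: circular speed sqrt(mu/r) = 1, period 2*pi.
state = [1.0, 0.0, 0.0, 1.0]
dt = 0.01
for _ in range(628):  # roughly one orbital period
    state = rk4_step(state, dt)
radius = math.hypot(state[0], state[1])
```

A tracker built on this model predicts each object along its orbit between sensor updates; the conserved orbital radius here is a quick sanity check on the propagator.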
Track extended objects with a centralized tracker that fuses data from multiple sensors and sensor modalities. Use a probability hypothesis density (PHD) tracker to estimate the kinematics of moving objects, along with the objects’ dimensions and orientation. For complex urban environments, implement a random finite set (RFS) grid-based tracker to track each grid cell’s occupancy as well as its kinematics.
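A grid-based tracker's occupancy estimate is, at its core, a per-cell Bayesian update. The RFS grid tracker described above also estimates per-cell kinematics; this Python sketch (illustrative only, not the toolbox API) shows just the log-odds occupancy part for a single cell:

```python
import math

def logit(p):
    return math.log(p / (1.0 - p))

def update_cell(log_odds, hit, p_hit=0.7, p_miss=0.4):
    """Bayesian log-odds update of one grid cell's occupancy.

    hit=True  -> the sensor returned a point in this cell
    hit=False -> the cell was observed free (a ray passed through it)
    """
    return log_odds + logit(p_hit if hit else p_miss)

def prob(log_odds):
    """Convert log-odds back to an occupancy probability."""
    return 1.0 - 1.0 / (1.0 + math.exp(log_odds))

cell = 0.0  # log-odds 0  <=>  p = 0.5 (unknown)
for observed_hit in [True, True, False, True]:
    cell = update_cell(cell, observed_hit)
occupancy = prob(cell)
```

Repeated hits drive the cell toward occupied; observed-free rays drive it back, so transient clutter decays while persistent objects accumulate evidence.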
Estimation Filters and Data Association
Estimate object states using a rich library of estimation filters, including linear and nonlinear Kalman filters, multimodel filters, and particle filters. Find the best or k-best solutions to the 2-D assignment problem or the S-D assignment problem. Assign detections to detections, detections to tracks, or tracks to tracks.
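The 2-D assignment problem pairs each track with at most one detection so that the total cost is minimal. As a minimal, language-agnostic illustration (production solvers use Munkres/JV-style algorithms rather than enumeration), here is an exact brute-force solver for small square cost matrices:

```python
from itertools import permutations

def best_assignment(cost):
    """Solve the 2-D assignment problem exactly by enumeration.

    cost[i][j] is the cost of assigning track i to detection j.
    Fine for small square matrices; real systems use polynomial-time
    algorithms such as Munkres for the same optimum.
    """
    n = len(cost)
    best, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best:
            best, best_perm = total, perm
    return best, best_perm

cost = [[4.0, 1.0, 3.0],
        [2.0, 0.0, 5.0],
        [3.0, 2.0, 2.0]]
total, assignment = best_assignment(cost)
```

The k-best variant keeps the k lowest-cost permutations instead of only the minimum; S-D assignment generalizes the same idea to S measurement sets.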
Integrate estimation filters, assignment algorithms, and track management logic into multi-object trackers to fuse detections into tracks. Convert your sensor data into a detection format and use a global nearest neighbor (GNN) tracker for simple scenarios. Easily switch to a joint probabilistic data association (JPDA) tracker, a multiple hypothesis tracker (MHT), or a PHD tracker for challenging scenarios such as tracking closely spaced targets where measurement ambiguities exist.
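One tracker recursion combines exactly the three pieces named above: a filter predict, an association step, and a filter update. This Python sketch uses a scalar Kalman filter and a greedy nearest-neighbor association as a simplified stand-in for GNN's global assignment (hypothetical helper names, not the toolbox API):

```python
def kf_predict(x, P, q=0.1):
    """Constant-position motion model: mean unchanged, uncertainty grows."""
    return x, P + q

def kf_update(x, P, z, r=1.0):
    """Scalar Kalman update with measurement noise variance r."""
    K = P / (P + r)  # Kalman gain
    return x + K * (z - x), (1.0 - K) * P

def gnn_step(tracks, detections, gate=3.0):
    """One tracker step: predict each track, greedily pair it with the
    closest unused detection inside the gate, then Kalman-update."""
    used, out = set(), []
    for x, P in tracks:
        x, P = kf_predict(x, P)
        best_j, best_d = None, gate
        for j, z in enumerate(detections):
            if j not in used and abs(z - x) < best_d:
                best_j, best_d = j, abs(z - x)
        if best_j is not None:
            used.add(best_j)
            x, P = kf_update(x, P, detections[best_j])
        out.append((x, P))
    return out

tracks = [(0.0, 1.0), (10.0, 1.0)]  # (state, variance) pairs
tracks = gnn_step(tracks, [9.8, 0.3])
```

Each detection is pulled toward the track it gates with; track management logic (confirmation, deletion, track initiation from unassigned detections) would wrap around this inner loop.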
Extended Object and Grid-Based Trackers
Use a PHD tracker to track the kinematics, size, and orientation of extended objects. With high-resolution sensor data such as lidar and radar point clouds, use grid-based RFS trackers to estimate the dynamic characteristics of grid cells in complex urban environments.
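Extended-object trackers use richer measurement models than this, but the geometric idea behind estimating an object's size and orientation from a point cloud can be sketched with a sample covariance and its closed-form 2x2 eigen-decomposition (illustrative Python, not the toolbox API):

```python
import math

def extent_from_points(points):
    """Estimate center, orientation, and half-axis lengths of an
    extended object from 2-D points via the sample covariance."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    sxx = sum((p[0] - cx) ** 2 for p in points) / n
    syy = sum((p[1] - cy) ** 2 for p in points) / n
    sxy = sum((p[0] - cx) * (p[1] - cy) for p in points) / n
    # Closed-form eigen-decomposition of the 2x2 covariance.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    disc = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    l1, l2 = tr / 2.0 + disc, tr / 2.0 - disc
    theta = 0.5 * math.atan2(2.0 * sxy, sxx - syy)  # major-axis angle
    return (cx, cy), theta, (math.sqrt(l1), math.sqrt(l2))

# Points along a segment rotated 45 degrees about (1, 1).
pts = [(1 + t, 1 + t) for t in (-2, -1, 0, 1, 2)]
center, angle, axes = extent_from_points(pts)
```

The eigenvectors give the object's orientation and the eigenvalue square roots its spread along each axis, which is the quantity an extended-object filter refines recursively over time.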
Explore tracker architectures and evaluate design trade-offs among track-to-track fusion, central-level tracking, and hybrid tracking architectures. Use static (detection) fusion to combine detections from angle-only and range-only sensors such as IR, ESM, or bistatic radars.
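Static fusion of angle-only detections amounts to intersecting bearing lines from different sensors. A minimal two-sensor triangulation in Python (illustrative geometry only, with a hypothetical function name) looks like this:

```python
import math

def triangulate(s1, b1, s2, b2):
    """Locate a target from two angle-only (bearing) detections.

    s1, s2: 2-D sensor positions; b1, b2: bearings in radians,
    measured counterclockwise from the +x axis.
    Solves s1 + t1*d1 = s2 + t2*d2 for the intersection point.
    """
    d1 = (math.cos(b1), math.sin(b1))
    d2 = (math.cos(b2), math.sin(b2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]  # 2-D cross product
    if abs(denom) < 1e-12:
        raise ValueError("bearings are parallel; no unique fix")
    rx, ry = s2[0] - s1[0], s2[1] - s1[1]
    t1 = (rx * d2[1] - ry * d2[0]) / denom
    return (s1[0] + t1 * d1[0], s1[1] + t1 * d1[1])

# Target at (5, 5) seen from two sensors on the x axis.
p = triangulate((0.0, 0.0), math.atan2(5, 5),
                (10.0, 0.0), math.atan2(5, -5))
```

With noisy bearings the intersection becomes a least-squares problem over all sensor pairs, which is what makes fusing IR, ESM, or bistatic radar detections into full position fixes possible.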
Generate sensor reports to test tracking systems. Define multiplatform scenarios and generate motion profiles for each platform using waypoint-based and kinematics-based trajectories. Attach sensor models and signatures to each platform and statistically simulate their reports. Use simulated ground truth in Monte Carlo simulations to verify and validate tracking systems.
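The simulation loop described above reduces to sampling a ground-truth trajectory and corrupting it with sensor noise. A sketch in Python, assuming piecewise-linear waypoint interpolation and Gaussian position noise (both simplifications of the toolbox's trajectory and sensor models):

```python
import random

def waypoint_position(waypoints, times, t):
    """Position at time t along a piecewise-linear waypoint trajectory."""
    for (t0, t1), (p0, p1) in zip(zip(times, times[1:]),
                                  zip(waypoints, waypoints[1:])):
        if t0 <= t <= t1:
            a = (t - t0) / (t1 - t0)
            return tuple(c0 + a * (c1 - c0) for c0, c1 in zip(p0, p1))
    raise ValueError("t outside trajectory time span")

def simulate_reports(waypoints, times, dt=1.0, sigma=0.5, seed=42):
    """Ground truth sampled every dt seconds, plus Gaussian position noise."""
    rng = random.Random(seed)
    reports, t = [], times[0]
    while t <= times[-1]:
        x, y = waypoint_position(waypoints, times, t)
        reports.append((x + rng.gauss(0, sigma), y + rng.gauss(0, sigma)))
        t += dt
    return reports

truth_wp = [(0.0, 0.0), (10.0, 0.0), (10.0, 10.0)]
truth_t = [0.0, 10.0, 20.0]
reports = simulate_reports(truth_wp, truth_t)
```

Re-running with different seeds gives the Monte Carlo ensemble against which tracker metrics are computed; the known ground truth is what the tracks are scored against.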
Object Trajectory and Pose Generation
Define scenarios interactively with the Tracking Scenario Designer app and generate MATLAB scripts that define and convert the true position, velocity, and orientation of objects in different reference frames.
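Converting quantities between reference frames is a recurring step in scenario scripts. As one concrete, commonly used pair of frames, here is the NED-to-ENU position conversion in Python (a sketch of the convention, not the generated MATLAB code):

```python
def ned_to_enu(p):
    """Convert a position from NED (north, east, down) to
    ENU (east, north, up): swap the horizontal axes, flip the vertical."""
    n, e, d = p
    return (e, n, -d)

def enu_to_ned(p):
    """Inverse conversion; NED <-> ENU is its own kind of involution."""
    e, n, u = p
    return (n, e, -u)

p_ned = (100.0, 50.0, -20.0)  # 20 m above the reference point
p_enu = ned_to_enu(p_ned)
```

Keeping every platform's position, velocity, and orientation in one agreed frame before fusion is what these conversions are for; mixing frames is a classic source of tracking bugs.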
Active and Passive Sensor Models
Model active sensors (including radar, sonar, and lidar) to generate detections of objects. Simulate mechanical and electronic scans across azimuth, elevation, or both. Model radar warning receiver (RWR), electronic support measure (ESM), passive sonar, and infrared sensors to generate angle-only detections for use in tracking scenarios. Model multistatic radar and sonar systems with emitters and sensors.
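An angle-only detection is just the direction from sensor to target expressed as azimuth and elevation. The core geometry, in Python (illustrative, with an assumed sensor-frame convention, not the toolbox's sensor models):

```python
import math

def angle_only_detection(sensor, target):
    """Azimuth/elevation of a target seen from a sensor position, in a
    local frame with x forward, y left, z up. Azimuth is measured in
    the x-y plane; elevation is measured up from that plane."""
    dx, dy, dz = (t - s for s, t in zip(sensor, target))
    az = math.atan2(dy, dx)
    el = math.atan2(dz, math.hypot(dx, dy))
    return az, el

az, el = angle_only_detection((0.0, 0.0, 0.0),
                              (100.0, 100.0, 100.0 * math.sqrt(2)))
```

A passive sensor model reports these two angles (plus noise) and withholds range, which is exactly why the static-fusion and triangulation machinery described earlier is needed downstream.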
Perform IMU, GPS, and altimeter sensor fusion to determine orientation and position over time and enable tracking with moving platforms. Estimate orientation and position for inertial navigation systems (INS) over time with algorithms that are optimized for different sensor configurations, output requirements, and motion constraints.
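The toolbox's INS filters are Kalman-type estimators, but the simplest possible illustration of inertial fusion is a complementary filter: integrate the gyro for fast response and correct the drift with the accelerometer's gravity direction. A scalar-pitch Python sketch (an assumption-laden stand-in, not the toolbox's filters):

```python
import math

def complementary_filter(pitch, gyro_rate, accel, dt, alpha=0.98):
    """Fuse a gyro pitch rate with an accelerometer tilt estimate.

    The gyro term tracks fast motion but drifts; the accelerometer
    term (gravity direction) is noisy but drift-free. alpha sets the
    crossover between the two.
    """
    ax, az = accel                     # body-frame specific force (m/s^2)
    accel_pitch = math.atan2(-ax, az)  # tilt implied by gravity
    gyro_pitch = pitch + gyro_rate * dt
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# Stationary, level sensor, but the gyro has a 0.01 rad/s bias:
pitch = 0.0
for _ in range(1000):
    pitch = complementary_filter(pitch, gyro_rate=0.01,
                                 accel=(0.0, 9.81), dt=0.01)
```

Pure gyro integration would drift to 0.1 rad over these 10 seconds; the accelerometer correction caps the error at a small steady-state offset, which is the essence of why fusing the two sensors beats either alone.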
INS Sensor Models
Model inertial measurement unit (IMU), GPS, altimeter, and INS sensors. Tune environmental parameters, such as temperature, and noise properties of the models to emulate real-world environments.
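A minimal version of such a sensor model is truth plus a constant bias plus white noise; real IMU models add effects such as bias instability, scale-factor error, and temperature dependence. A seeded Python sketch of the simple case (hypothetical function name, not the toolbox model):

```python
import random

def gyro_model(true_rate, bias, sigma, rng):
    """Simple gyro measurement model: truth + constant bias + white noise."""
    return true_rate + bias + rng.gauss(0.0, sigma)

rng = random.Random(7)
bias, sigma = 0.02, 0.005  # rad/s
samples = [gyro_model(0.0, bias, sigma, rng) for _ in range(2000)]
mean = sum(samples) / len(samples)  # estimates the bias when truth is zero
```

Averaging stationary samples recovers the bias while the white noise averages out, which is also the principle behind the stationary calibration step most IMU pipelines start with.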
Estimate pose with and without nonholonomic heading constraints using inertial sensors and GPS. Determine pose without GPS by fusing inertial sensors with altimeters or visual odometry.
Plot the orientation and velocity of objects, ground truth trajectories, sensor measurements, and tracks in 3D. Plot detection and track uncertainties. Visualize track IDs with history trails.
Sensor and Track Metrics
Generate track establishment, maintenance, and deletion metrics, including track length, track breaks, and track ID swaps. Estimate track accuracy with position, velocity, acceleration, and yaw-rate root-mean-square error (RMSE) or average normalized estimation error squared (ANEES). Use integrated OSPA and GOSPA metrics to summarize performance in a single score. Analyze inertial sensor noise using Allan variance.
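RMSE and ANEES have simple closed forms; ANEES additionally checks whether a filter's reported covariance matches its actual errors. A scalar-state Python sketch of both metrics (illustrative, not the toolbox's metric objects):

```python
import math

def position_rmse(truth, est):
    """Root-mean-square position error over a track."""
    se = [(t - e) ** 2 for t, e in zip(truth, est)]
    return math.sqrt(sum(se) / len(se))

def anees(truth, est, variances):
    """Average normalized estimation error squared for a scalar state:
    ((x_true - x_est)^2 / P), averaged over time. Values near 1 mean
    the filter's covariance is consistent with its actual errors."""
    nees = [(t - e) ** 2 / p for t, e, p in zip(truth, est, variances)]
    return sum(nees) / len(nees)

truth = [0.0, 1.0, 2.0, 3.0]
est = [0.1, 0.9, 2.2, 2.9]
rmse = position_rmse(truth, est)
consistency = anees(truth, est, [0.02, 0.02, 0.02, 0.02])
```

An ANEES well above 1 flags an overconfident filter (covariance too small); well below 1 flags an overly pessimistic one. For vector states the squared error generalizes to a Mahalanobis distance against the full covariance matrix.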
Tuning Filters and Trackers
Tune parameters of multi-object trackers such as the assignment threshold, filter initialization function, and confirmation and deletion thresholds to maximize performance. Compare results across trackers and tracker configurations. Automatically tune INS filters to optimize noise parameters.
Generate C/C++ and MEX code for simulation acceleration or desktop prototyping using MATLAB Coder™. Apply cost calculation thresholds to reduce time spent on calculating the assignment cost.
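Cost-calculation thresholding works by coarsely screening track-detection pairs and only computing the expensive fine-grained cost for pairs that pass. A minimal Python sketch of the gating idea (a sentinel-based simplification, not the generated code):

```python
BIG = 1e9  # sentinel marking a forbidden assignment

def gate_costs(cost, threshold):
    """Coarse gating: mark pairs whose coarse cost exceeds the threshold
    as forbidden, so the assignment solver can skip computing the
    expensive fine-grained cost for them."""
    return [[c if c <= threshold else BIG for c in row] for row in cost]

gated = gate_costs([[0.5, 12.0], [9.0, 0.2]], threshold=5.0)
```

In dense scenarios most pairs fail the gate, so the assignment step touches only a small fraction of the full cost matrix; that is where the simulation-time savings come from.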