Sensor Fusion and Tracking Toolbox

Design and simulate multisensor tracking and positioning systems


Sensor Fusion and Tracking Toolbox includes algorithms and tools for the design, simulation, and analysis of systems that fuse data from multiple sensors to maintain position, orientation, and situational awareness. Reference examples provide a starting point for implementing components of airborne, ground-based, shipborne, and underwater surveillance, navigation, and autonomous systems.

The toolbox includes multi-object trackers, sensor fusion filters, motion and sensor models, and data association algorithms that let you evaluate fusion architectures using real and synthetic data. With Sensor Fusion and Tracking Toolbox you can import and define scenarios and trajectories, stream signals, and generate synthetic data for active and passive sensors, including RF, acoustic, EO/IR, and GPS/IMU sensors. You can also evaluate system accuracy and performance with standard benchmarks, metrics, and animated plots.

For simulation acceleration or desktop prototyping, the toolbox supports C code generation. 

Get Started:

Trajectory and Scenario Generation

Generate ground-truth waypoint-based and rate-based trajectories and scenarios. Model platforms and targets for tracking scenarios.
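Conceptually, a waypoint-based trajectory interpolates position between timed waypoints. The following toolbox-independent Python sketch illustrates the idea with simple piecewise-linear interpolation (the function name and the linear motion model are illustrative assumptions, not the toolbox API):

```python
# Sketch: sample a ground-truth trajectory defined by timed waypoints.
# Assumption: piecewise-linear motion between waypoints (real trajectory
# generators typically use smoother, rate-limited interpolation).
from bisect import bisect_right

def sample_trajectory(waypoints, times, t):
    """Return the interpolated (x, y, z) position at time t.

    waypoints: list of (x, y, z) tuples; times: matching arrival times.
    """
    if t <= times[0]:
        return waypoints[0]
    if t >= times[-1]:
        return waypoints[-1]
    i = bisect_right(times, t) - 1
    f = (t - times[i]) / (times[i + 1] - times[i])
    a, b = waypoints[i], waypoints[i + 1]
    return tuple(a[k] + f * (b[k] - a[k]) for k in range(3))

waypoints = [(0, 0, 0), (100, 0, 0), (100, 50, 0)]
times = [0.0, 10.0, 20.0]
print(sample_trajectory(waypoints, times, 5.0))  # midway along the first leg
```

Sampling the trajectory at a fixed rate yields the ground-truth poses against which sensor models and trackers are evaluated.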

Generate Object Poses

Define and convert the true position, velocity, and orientation of objects in different reference frames.

Object pose.
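Converting a pose between reference frames amounts to applying a rotation. As a minimal, toolbox-independent Python sketch (yaw-only, 2D, with an assumed NED scenario frame and a forward/right body frame), the conversion looks like this:

```python
import math

def ned_to_body(v, yaw_deg):
    """Rotate a 2D NED-frame vector (north, east) into the body frame of a
    platform heading yaw_deg degrees clockwise from north.
    Sketch only: roll and pitch are ignored for brevity."""
    y = math.radians(yaw_deg)
    n, e = v
    return (math.cos(y) * n + math.sin(y) * e,    # forward component
            -math.sin(y) * n + math.cos(y) * e)   # rightward component

# A target due east of a platform heading east lies straight ahead in body axes.
print(ned_to_body((0.0, 100.0), 90.0))
```

A full 3D conversion uses the complete rotation (quaternion or matrix) plus a translation by the platform position.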

Create Tracking Scenarios

Model platforms such as aircraft, ground vehicles, or ships. Platforms can be stationary or in motion, carry sensors and emitters, and have aspect-dependent signatures that govern how they reflect signals.

Multiplatform radar detection generation.

Rotations, Orientation, and Quaternions

Represent orientation and rotation using quaternions, Euler angles, rotation matrices, and rotation vectors. Define sensor orientation with respect to body frame. 

Rotations, orientations, and quaternions. 
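To make the quaternion representation concrete, here is a small, toolbox-independent Python sketch that builds a yaw quaternion and rotates a vector with it (the function names are illustrative, and the point-rotation convention q v q* is an assumption of this sketch):

```python
import math

def quat_from_yaw(yaw_deg):
    """Unit quaternion (w, x, y, z) for a rotation about the z-axis."""
    h = math.radians(yaw_deg) / 2.0
    return (math.cos(h), 0.0, 0.0, math.sin(h))

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q (point rotation, q v q*)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 (u x v), then v' = v + w t + u x t, with u = (x, y, z)
    tx, ty, tz = 2 * (y * vz - z * vy), 2 * (z * vx - x * vz), 2 * (x * vy - y * vx)
    return (vx + w * tx + y * tz - z * ty,
            vy + w * ty + z * tx - x * tz,
            vz + w * tz + x * ty - y * tx)

# Rotating the x-axis by 90 degrees of yaw lands it on the y-axis.
print(quat_rotate(quat_from_yaw(90.0), (1.0, 0.0, 0.0)))
```

Euler angles, rotation matrices, and rotation vectors all encode the same orientation; quaternions are often preferred for filtering because they avoid gimbal lock and compose cheaply.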

Sensor Models

Simulate measurements from IMU (accelerometer, gyroscope, magnetometer), GPS receivers, radar, sonar, and IR under different environmental conditions.

Inertial and GPS Sensors

Model IMUs (inertial measurement units), GPS (Global Positioning System) receivers, and INS (inertial navigation systems). Tune environmental parameters, such as temperature, and the noise properties of the models to mimic real-world conditions.

IMU and GPS model.

Active Sensors

Model radar and sonar sensors and emitters to generate detections of targets. Simulate mechanical and electronic scans in azimuth, elevation, or both.

Scanning radar mode configuration.
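A mechanical azimuth scan can be pictured as a beam stepping across a sector one dwell at a time. The following Python sketch (names and the contiguous-dwell assumption are illustrative, not the toolbox API) computes the beam-center angles of one scan:

```python
def scan_centers(fov_deg, lo, hi):
    """Beam-center azimuths (deg) for a sensor stepping a fov_deg-wide beam
    across the sector [lo, hi] with contiguous, non-overlapping dwells."""
    n = int((hi - lo) // fov_deg)
    return [lo + fov_deg / 2 + i * fov_deg for i in range(n)]

# A 10-deg beam covering -45..45 deg needs nine dwells per scan.
print(scan_centers(10.0, -45.0, 45.0))
```

Dividing the number of dwells by the sensor update rate gives the revisit time for the whole sector.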

Passive Sensors

Model RWR (radar warning receiver), ESM (electronic support measure), passive sonar, and infrared sensors to generate angle-only detections for use in tracking scenarios. Define emitters and channel properties to model interferences.

Passive ranging using a single sensor.

Inertial Sensor Fusion

Estimate orientation and position over time with algorithms that are optimized for different sensor configurations, output requirements, and motion constraints.

Orientation Estimation

Fuse accelerometer and magnetometer readings to simulate an electronic compass (eCompass). Fuse accelerometer, gyroscope, and magnetometer readings with an attitude and heading reference system (AHRS) filter.

Orientation through inertial sensor fusion.
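The static core of an eCompass is tilt compensation: lean angles from the accelerometer are used to de-rotate the magnetometer reading into the horizontal plane before taking the heading. This toolbox-independent Python sketch assumes aerospace body axes (x forward, y right, z down) and an accelerometer whose level, static reading is (0, 0, +g); both conventions are assumptions of the sketch:

```python
import math

def ecompass(accel, mag):
    """Tilt-compensated heading (deg, clockwise from magnetic north).

    Sketch only: aerospace body axes (x forward, y right, z down), and a
    level, static accelerometer reading of (0, 0, +g) is assumed."""
    ax, ay, az = accel
    mx, my, mz = mag
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    # de-rotate the magnetic field into the horizontal plane
    bfx = (mx * math.cos(pitch)
           + (my * math.sin(roll) + mz * math.cos(roll)) * math.sin(pitch))
    bfy = my * math.cos(roll) - mz * math.sin(roll)
    return math.degrees(math.atan2(-bfy, bfx)) % 360.0

# Level and facing magnetic north -> heading 0 deg.
print(ecompass((0.0, 0.0, 9.81), (0.4, 0.0, 0.3)))
```

An AHRS filter goes further, fusing the gyroscope as well so that orientation stays accurate during dynamic motion when the accelerometer no longer measures gravity alone.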

Pose Estimation

Estimate pose with and without nonholonomic heading constraints using inertial sensors and GPS. Determine pose without GPS by fusing inertial sensors with altimeters or visual odometry.

Visual-inertial odometry.

Estimation Filters

Use Kalman, particle, and multiple-model filters for different motion and measurement models.

Filters for Object Tracking

Estimate object states using linear, extended, and unscented Kalman filters for linear and nonlinear motion and measurement models. Use Gaussian-sum and particle filters for nonlinear, non-Gaussian state estimation, including tracking with range-only or angle-only measurements. Improve tracking of maneuvering targets with interacting multiple model (IMM) filters.

Filters for object tracking.
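The predict/update cycle shared by all these filters is easiest to see in the scalar case. The following Python sketch (a scalar random-walk Kalman filter, written from the standard equations rather than the toolbox API) tracks a constant value from noisy measurements:

```python
def kf_predict(x, p, q):
    """Predict step for a scalar random-walk state: x' = x, P' = P + Q."""
    return x, p + q

def kf_update(x, p, z, r):
    """Update with measurement z (variance r) using the Kalman gain."""
    k = p / (p + r)                       # Kalman gain
    return x + k * (z - x), (1 - k) * p   # corrected state and covariance

x, p = 0.0, 1.0                            # initial estimate and variance
for z in [1.2, 0.9, 1.1]:
    x, p = kf_predict(x, p, q=0.01)
    x, p = kf_update(x, p, z, r=0.25)
print(round(x, 3), round(p, 3))            # estimate pulled toward ~1.0
```

Extended and unscented Kalman filters generalize exactly this cycle to nonlinear motion and measurement functions, and particle filters replace the Gaussian state with a weighted sample set.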

Motion and Measurement Models

Configure tracking filters with constant-velocity, constant-acceleration, constant-turn, and custom motion models in Cartesian, spherical, and modified spherical coordinate systems. Define position and velocity, range-angle, angle-only, or custom measurement models.

Motion models.
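A motion model is simply the rule a filter uses to propagate a state between updates. The constant-velocity case reduces to applying its state-transition matrix, shown here unrolled in a small Python sketch (illustrative, not the toolbox API):

```python
def constvel_predict(state, dt):
    """Propagate a 2D constant-velocity state [x, vx, y, vy] by dt seconds
    (the constant-velocity F matrix applied by hand)."""
    x, vx, y, vy = state
    return [x + vx * dt, vx, y + vy * dt, vy]

# A target at (0, 5) moving (10, -2) m/s, predicted half a second ahead.
print(constvel_predict([0.0, 10.0, 5.0, -2.0], dt=0.5))
```

Constant-acceleration and constant-turn models differ only in the state vector and the propagation rule; custom models supply their own.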

Multi-Object Tracking

Create multi-object trackers that fuse information from various sensors. Maintain single or multiple hypotheses about the objects they track.


Integrate estimation filters, assignment algorithms, and track management logic into multi-object trackers to fuse detections into tracks. Use a multiple hypothesis tracker (MHT) in challenging scenarios such as tracking closely spaced targets under ambiguity.

Multi-object trackers.
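One common piece of the track management logic mentioned above is M-of-N confirmation: a tentative track is confirmed only after enough of its recent updates were backed by detections. A minimal Python sketch of that rule (the function name and flag encoding are illustrative assumptions):

```python
def confirmed(hits, m, n):
    """M-of-N confirmation logic: a track is confirmed once at least m of
    its last n updates were associated with a detection (hits: 1/0 flags,
    newest last)."""
    return sum(hits[-n:]) >= m

history = [1, 0, 1, 1]                 # hit/miss flags, newest last
print(confirmed(history, m=2, n=3))    # 2 of the last 3 updates hit
```

An analogous N-of-M miss count, or a track score threshold, typically drives deletion.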

Track Assignment

Find the best or k-best solutions to the global nearest neighbor (GNN) assignment problem. Solve the S-D assignment problem. Assign detections to tracks, or tracks to tracks. Confirm and delete tracks based on recent track history or on track score.

Track management and data association.
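The GNN assignment problem asks for the detection-to-track pairing with the lowest total cost. For a handful of tracks it can be solved by brute force, which this Python sketch does purely for illustration (production assignment uses polynomial-time algorithms such as Munkres, JVC, or auction):

```python
from itertools import permutations

def gnn_assign(cost):
    """Globally optimal square assignment by brute force.
    cost[i][j] is the cost of pairing track i with detection j.
    Illustration only: O(n!) - real trackers use Munkres/JVC/auction."""
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return list(enumerate(best))       # (track, detection) pairs

cost = [[2.0, 9.0, 1.5],
        [4.0, 3.0, 8.0],
        [0.5, 6.0, 5.0]]
print(gnn_assign(cost))                # note the optimum is not the diagonal
```

Gating, which discards pairings whose cost exceeds a threshold, is applied before assignment to keep the problem small.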

Track Detection Fusion

Fuse track states and state covariances. Statically fuse synchronous detections, including triangulation of angle-only detections from passive sensors.

Tracking using distributed synchronous passive sensors.
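Triangulation of angle-only detections reduces, in the 2D noise-free case, to intersecting the bearing rays from two sensors. A toolbox-independent Python sketch (bearings measured counterclockwise from the +x axis is an assumption of this sketch):

```python
import math

def triangulate(p1, brg1, p2, brg2):
    """Locate a target from two bearing-only detections (2D sketch).
    Bearings are degrees counterclockwise from the +x axis; the target
    sits at the intersection of the two sensor rays."""
    d1 = (math.cos(math.radians(brg1)), math.sin(math.radians(brg1)))
    d2 = (math.cos(math.radians(brg2)), math.sin(math.radians(brg2)))
    # solve p1 + t1*d1 = p2 + t2*d2 for t1 via 2x2 cross products
    denom = d1[0] * d2[1] - d1[1] * d2[0]
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; bearings do not intersect")
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Two sensors 100 m apart both see a target at (50, 50).
print(triangulate((0.0, 0.0), 45.0, (100.0, 0.0), 135.0))
```

With noisy bearings and more than two sensors, a least-squares intersection replaces the exact one, and the fused position inherits an uncertainty from the bearing variances.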

Visualization and Analytics

Analyze and compare the performance of inertial filters and multi-object tracking systems.

Scenario Visualization

Plot object orientation and velocity, ground-truth trajectories, sensor measurements, and tracks in 3D. Plot detection and track uncertainties. Visualize track IDs with history trails.

Theater plot.

Sensor and Track Metrics

Generate track establishment, maintenance, and deletion metrics, including track length, track breaks, and track ID swaps. Estimate track accuracy with position, velocity, acceleration, and yaw-rate root-mean-square error (RMSE) and average normalized estimation error squared (ANEES). Analyze inertial sensor noise using Allan variance.

Track metrics.
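Position RMSE, the most basic of these accuracy metrics, is just the root of the mean squared distance between truth and track estimates at matched times. A short Python sketch (illustrative, not the toolbox API):

```python
import math

def position_rmse(truth, estimates):
    """Root-mean-square position error between ground truth and track
    estimates, each a list of (x, y) points sampled at the same times."""
    se = [(tx - ex) ** 2 + (ty - ey) ** 2
          for (tx, ty), (ex, ey) in zip(truth, estimates)]
    return math.sqrt(sum(se) / len(se))

truth = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
est = [(0.0, 1.0), (10.0, -1.0), (21.0, 0.0)]
print(position_rmse(truth, est))   # -> 1.0
```

ANEES additionally weights each error by the filter's reported covariance, so it measures estimator consistency rather than raw accuracy.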

Latest Features

Scenario generation

Interactively design tracking scenarios in an app

Monte Carlo simulations

Design and run Monte Carlo simulations for tracking applications

Sensor coverage plots

Visualize beam and coverage area of sensors in a tracking scenario

Tracking performance metric

Evaluate tracking performance against ground truth based on the global optimal subpattern assignment (GOSPA) metric

Simulink blocks

Model TOMHT trackers, IMU sensors, and AHRS inertial fusion using Simulink blocks

Inertial filters

Access residuals and residual covariance of insfilters and ahrs10filter

See the release notes for details on any of these features and corresponding functions.


Sensor Fusion and Tracking for Autonomous Systems: An Overview

Learn how self- and situational awareness capabilities are used for sensing and perception in autonomous systems.

Additional Sensor Fusion and Tracking Toolbox Resources