Choose SLAM Workflow Based on Sensor Data

You can use Computer Vision Toolbox™, Navigation Toolbox™, and Lidar Toolbox™ for Simultaneous Localization and Mapping (SLAM). SLAM is widely used in applications including automated driving, robotics, and unmanned aerial vehicles (UAVs). To learn more about SLAM, see What is SLAM?.

Choose SLAM Workflow

To choose the right SLAM workflow for your application, consider the type of sensor data you are collecting. MATLAB® supports SLAM workflows that use images from a monocular or stereo camera system, as well as point cloud data, including 2-D and 3-D lidar data.

Visual SLAM

There are several approaches to visual SLAM, each suited to different sensors and applications. To compute 3-D points, you must measure the depth from the camera to points in the scene. Monocular, stereo, and RGB-D sensors recover this depth information using different techniques.

  • Monocular visual SLAM — Uses a single camera to estimate motion over time. Because a single camera provides no depth reference, it cannot determine absolute scale, such as the true size or distance of objects. You can use the monovslam (Computer Vision Toolbox) object to estimate a camera's trajectory and reconstruct a sparse 3-D map from a sequence of monocular images.

  • Stereo visual SLAM — Uses a stereo camera pair to derive depth from disparities between the two images, providing absolute scale. You can use the stereovslam (Computer Vision Toolbox) object to estimate camera motion and build a scaled 3-D map from synchronized stereo image pairs.

  • RGB-D visual SLAM — Uses a depth-sensing camera that captures both color and depth information, providing precise, per-pixel depth measurements. You can use the rgbdvslam (Computer Vision Toolbox) object to track camera pose and construct a dense 3-D map by combining RGB and depth data.

  • Visual-inertial SLAM — Fuses visual data with motion measurements from an inertial measurement unit (IMU), improving robustness and accuracy, particularly during rapid movements or in visually challenging environments. You can use the vSLAM objects to fuse IMU measurements with monocular, stereo, and RGB-D data.
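As a concrete starting point, the monocular workflow above can be sketched with the monovslam object. This is a minimal sketch, not a complete application: the image folder path and the camera intrinsics values are placeholders, and you must substitute your own calibration data.

```matlab
% Minimal monocular visual SLAM sketch using monovslam (Computer Vision
% Toolbox). Intrinsics values and the image folder are placeholders.
intrinsics = cameraIntrinsics([535.4 539.2], [320.1 247.6], [480 640]);
vslam = monovslam(intrinsics);

imds = imageDatastore("path/to/images");   % placeholder image sequence
while hasdata(imds)
    I = read(imds);
    addFrame(vslam, I);     % track the camera and update the sparse map
end

% Retrieve the estimated trajectory and the sparse 3-D map points.
camPoses  = poses(vslam);
xyzPoints = mapPoints(vslam);
```

Because monocular SLAM has no absolute scale, the returned trajectory and map points are defined only up to an unknown scale factor.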

The following summarizes the key features, example workflow categories, and toolboxes available for each type of visual SLAM sensor data.

Monocular images

Features:

  • Feature detection, extraction, and matching
  • Triangulation and bundle adjustment
  • Data management for key frames and map points
  • Loop closure detection using bag of features
  • Similarity pose graph optimization
  • Fusion with IMU

Example workflows: performant and deployable workflows; modular workflows

Toolboxes:

  • Computer Vision Toolbox
  • Navigation Toolbox

Stereo images

Features:

  • Stereo image rectification
  • Feature detection, extraction, and matching
  • Reconstruction from disparity, triangulation, and bundle adjustment
  • Data management for key frames and map points
  • Loop closure detection using bag of features
  • Pose graph optimization
  • Fusion with IMU

Example workflows: performant and deployable workflows; modular workflows

Toolboxes:

  • Computer Vision Toolbox
  • Navigation Toolbox

RGB-D images

Features:

  • Feature detection, extraction, and matching
  • Reconstruction from depth images, triangulation, and bundle adjustment
  • Data management for key frames and map points
  • Loop closure detection using bag of features
  • Pose graph optimization
  • Fusion with IMU

Example workflows: performant and deployable workflows; modular workflows

Toolboxes:

  • Computer Vision Toolbox
  • Navigation Toolbox
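When your stereo rig is calibrated, the stereo workflow follows the same pattern with the stereovslam object, and the resulting map has absolute scale. In this sketch, the intrinsics, baseline, and image folders are assumed values for a hypothetical calibrated stereo rig.

```matlab
% Stereo visual SLAM sketch using stereovslam (Computer Vision Toolbox).
% Intrinsics, baseline, and image folders are placeholder assumptions.
intrinsics = cameraIntrinsics([1109 1109], [640 360], [720 1280]);
baseline   = 0.12;   % distance between the two cameras, in meters (assumed)
vslam = stereovslam(intrinsics, baseline);

leftImds  = imageDatastore("path/to/left");    % placeholder folders of
rightImds = imageDatastore("path/to/right");   % synchronized image pairs
while hasdata(leftImds)
    addFrame(vslam, read(leftImds), read(rightImds));
end

camPoses = poses(vslam);   % estimated trajectory, with absolute scale
```

The known baseline is what resolves the scale ambiguity of the monocular case: disparity between a synchronized image pair yields metric depth directly.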

Point Cloud, 2-D, and 3-D Lidar SLAM

The following summarizes the key features and toolboxes available for point cloud data, including 2-D and 3-D lidar SLAM.

2-D lidar scans

Features:

  • Occupancy map building
  • Vehicle pose estimation
  • Pose graph optimization
  • SLAM algorithm tuning
  • SLAM Map Builder app

Toolboxes:

  • Navigation Toolbox
  • Lidar Toolbox

Point cloud data

Features:

  • Point cloud processing
  • Registration
  • Data management for map building
  • Loop closure detection with global features
  • Pose graph optimization
  • Localization in a known map

Toolboxes:

  • Computer Vision Toolbox

3-D lidar scans

Features (feature-based):

  • Registration
  • Loop closure detection
  • Localization in a known map

Toolboxes:

  • Lidar Toolbox
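For the 2-D lidar case, the occupancy-map-building and pose graph features above come together in the lidarSLAM object from Navigation Toolbox. This is a sketch under assumed values: the map resolution, maximum range, loop closure settings, and the source of the scans variable are illustrative, not prescriptive.

```matlab
% 2-D lidar SLAM sketch using lidarSLAM (Navigation Toolbox).
% Resolution, range, thresholds, and the scan source are assumptions.
mapResolution = 20;    % occupancy grid cells per meter
maxLidarRange = 8;     % sensor range, in meters
slamAlg = lidarSLAM(mapResolution, maxLidarRange);
slamAlg.LoopClosureThreshold    = 200;  % tune for your environment
slamAlg.LoopClosureSearchRadius = 8;

% scans is assumed to be a cell array of lidarScan objects, for example
% created as lidarScan(ranges, angles) from recorded sensor data.
for i = 1:numel(scans)
    addScan(slamAlg, scans{i});   % scan matching plus pose graph update
end

% Extract the optimized poses and build the final occupancy map.
[optScans, optPoses] = scansAndPoses(slamAlg);
map = buildMap(optScans, optPoses, mapResolution, maxLidarRange);
```

Loop closure detection corrects accumulated drift: when a revisited area is recognized, the pose graph is re-optimized and the map is rebuilt from the corrected poses.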