Lidar (an acronym for light detection and ranging) is a remote sensing technology that uses pulsed laser light to measure ranges to objects in the surroundings. A lidar sensor emits laser pulses that reflect off objects and records the returning light energy; from the round-trip time of flight of each pulse, it determines the distance to the object, building a 2D or 3D representation of the surroundings.
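The time-of-flight principle reduces to a single formula: range equals the speed of light times the round-trip time, divided by two. A minimal sketch in Python (illustrative values, not real sensor output):

```python
# Conceptual sketch: lidar ranging by pulse time of flight.
# range = (speed of light * round-trip time) / 2

C = 299_792_458.0  # speed of light in m/s

def range_from_time_of_flight(round_trip_s: float) -> float:
    """Distance to a target given the pulse's round-trip travel time."""
    return C * round_trip_s / 2.0

# A pulse returning after ~667 nanoseconds traveled to a target
# roughly 100 m away.
d = range_from_time_of_flight(667e-9)
```

A lidar sensor repeats this measurement millions of times per second across many beam angles, which is what yields a dense point cloud.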
Lidar sensors are among the primary sensors for autonomous driving and robotics applications. They enable 3D perception workflows such as object detection and semantic segmentation, as well as navigation workflows such as mapping, simultaneous localization and mapping (SLAM), and path planning.
Autonomous systems use multiple sensors, such as cameras, IMUs, and radar, in their sensor suites for environmental perception. Lidars can overcome some of the drawbacks of these sensors by providing highly accurate 3D structural information about the surroundings. This advantage has driven the introduction of lidar sensors into the mainstream perception market.
The market adoption of lidars is driven by three key factors:
The introduction of low-cost lidars with improved range, size, and robustness has made the technology available to comparatively low-revenue industrial applications.
Lidars gather high-density 3D information about the surroundings as point clouds, with higher accuracy than other range sensors such as radar and sonar. This, in turn, improves the accuracy of 3D reconstruction.
Recent developments in lidar processing workflows, such as semantic segmentation, object detection and tracking, lidar-camera data fusion, and lidar SLAM, have enabled engineering teams to add lidars to their development workflows. You can use tools such as MATLAB® to develop and apply lidar processing algorithms.
Aerial lidars are lidar sensors mounted on unmanned aerial vehicles (UAVs) or aircraft. They capture 3D point cloud data over large areas of terrain, which you can use for lidar mapping, feature extraction, terrain classification, and other use cases.
Examples of aerial lidar applications include:
Ground lidars fall into two categories: stationary terrestrial lidars and mobile lidars.
Lidars are widely used in indoor robotics applications by mounting them on mobile robots. Apart from 3D lidars, 2D lidars (laser scanners) are also common in indoor robotics applications such as lidar scanning and mapping. These sensors collect depth information about the surroundings, which is then processed according to the use case.
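A 2D laser scanner reports a range for each known beam angle; turning that polar data into Cartesian points is the first step in scan-based mapping. A minimal Python/NumPy sketch with made-up example values:

```python
import numpy as np

# Conceptual sketch: converting a 2D laser scan (ranges measured at
# known beam angles) into Cartesian points for mapping. The angles and
# ranges below are illustrative values, not real sensor output.

def scan_to_points(ranges, angles):
    """Convert polar scan measurements to (x, y) points in the sensor frame."""
    ranges = np.asarray(ranges, dtype=float)
    angles = np.asarray(angles, dtype=float)
    return np.column_stack((ranges * np.cos(angles), ranges * np.sin(angles)))

angles = np.linspace(-np.pi / 2, np.pi / 2, 5)  # 5 beams across 180 degrees
ranges = np.array([2.0, 2.5, 3.0, 2.5, 2.0])    # meters
pts = scan_to_points(ranges, angles)
```

Successive scans converted this way, once aligned with the robot's estimated pose, accumulate into a 2D map of the environment.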
Common uses of indoor lidars include:
MATLAB and Lidar Toolbox™ simplify lidar processing tasks. With dedicated tools and functions, MATLAB helps you overcome common challenges in processing lidar data, such as handling 3D data types, data sparsity, invalid points, and high noise levels.
You can import live and recorded lidar data into MATLAB, implement lidar processing workflows, and generate C/C++ and CUDA code for deployment into production.
Some of the important capabilities MATLAB provides in processing lidar point clouds include:
The first step in processing any sensor data in MATLAB is to get the data into the MATLAB workspace. You can:
You can preprocess lidar data to improve its quality and extract basic information from it. Lidar Toolbox provides functionality for downsampling, median filtering, aligning, transforming, and extracting features from point clouds.
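To make the downsampling step concrete, here is a sketch of the grid-based (voxel-grid) approach in Python/NumPy: bucket points into cubic voxels and keep one averaged point per occupied voxel. This illustrates the general technique, not the Lidar Toolbox implementation:

```python
import numpy as np

# Conceptual sketch of grid-based point cloud downsampling: partition
# space into cubic voxels and replace all points in a voxel with their
# average. This reduces density while preserving overall structure.

def voxel_downsample(points: np.ndarray, voxel_size: float) -> np.ndarray:
    """Average all points that fall into the same cubic voxel."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    # Group points by voxel index and average each group.
    uniq, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = np.asarray(inverse).ravel()
    counts = np.bincount(inverse)
    out = np.zeros((len(uniq), 3))
    for dim in range(3):
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out

rng = np.random.default_rng(0)
cloud = rng.uniform(0.0, 1.0, size=(1000, 3))       # synthetic 1 m cube
down = voxel_downsample(cloud, voxel_size=0.25)     # at most 4^3 = 64 voxels
```

The voxel size trades detail for speed: larger voxels give a sparser, faster-to-process cloud.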
MATLAB enables lidar-camera calibration, which estimates the rigid transform between a lidar sensor and a camera for fusing their data. You can then fuse color information into lidar point clouds and estimate 3D bounding boxes in lidar data from 2D bounding boxes detected by a co-located camera.
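The core of lidar-camera fusion is projecting 3D lidar points into the image: apply the estimated lidar-to-camera extrinsics, then the pinhole camera intrinsics. A Python/NumPy sketch with illustrative placeholder matrices (not a real calibration):

```python
import numpy as np

# Conceptual sketch of lidar-to-camera projection. R and t are the
# extrinsic rotation/translation a calibration would estimate; K holds
# pinhole intrinsics. All values below are illustrative placeholders.

def project_to_image(points_lidar, R, t, K):
    """Project 3D lidar points into pixel coordinates of a pinhole camera."""
    pts_cam = points_lidar @ R.T + t          # lidar frame -> camera frame
    pts_img = pts_cam @ K.T                   # apply intrinsics
    return pts_img[:, :2] / pts_img[:, 2:3]   # perspective divide -> (u, v)

K = np.array([[500.0,   0.0, 320.0],          # fx, 0, cx
              [  0.0, 500.0, 240.0],          # 0, fy, cy
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                                 # assume aligned axes
t = np.array([0.0, 0.0, 0.1])                 # small offset along optical axis
pts = np.array([[0.0, 0.0, 5.0]])             # a point 5 m straight ahead
uv = project_to_image(pts, R, t, K)
```

Once each point has pixel coordinates, its color can be sampled from the image, and 2D detections can be lifted to the points that project inside them.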
With MATLAB, you can apply deep learning algorithms for object detection and semantic segmentation on lidar data.
MATLAB can unify the multiple domains that feed into an end-to-end object tracking workflow. This enables you to read lidar data, preprocess it, apply deep learning to detect objects, track the detected objects with a predefined tracker, and deploy the result on target hardware.
MATLAB provides functions to register lidar point clouds and build 3D maps using SLAM algorithms. You can extract and match fast point feature histogram (FPFH) descriptors from lidar point clouds and then register point clouds based on the matched features.
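After FPFH descriptors are matched, registration reduces to estimating the rigid transform that best aligns the matched point pairs. A standard way to do this is the SVD-based (Kabsch) method, sketched here in Python/NumPy on synthetic data; this illustrates the underlying math, not the toolbox API:

```python
import numpy as np

# Conceptual sketch of the registration step that follows feature
# matching: given matched point pairs from two clouds, estimate the
# rigid transform (Kabsch / SVD method) that aligns them.

def estimate_rigid_transform(src, dst):
    """Find rotation R and translation t minimizing ||src @ R.T + t - dst||."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                  # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Synthetic check: rotate and shift a cloud, then recover the motion.
rng = np.random.default_rng(1)
src = rng.normal(size=(50, 3))
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
dst = src @ R_true.T + np.array([1.0, -2.0, 0.5])
R_est, t_est = estimate_rigid_transform(src, dst)
```

Chaining such pairwise transforms across a sequence of scans is what accumulates individual point clouds into a consistent 3D map.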
You can also implement 3D SLAM algorithms by stitching together sequences of lidar point clouds from ground and aerial lidar data.