Dan Seal, MathWorks
This video shows how to quickly get started acquiring live lidar data from Velodyne LiDAR® sensors into MATLAB®. It covers connecting to hardware, reading data, and performing analysis on lidar point clouds.
With your Velodyne hardware connected to your computer, you can test the connection using the third-party VeloView software. Then in MATLAB, you can connect directly to the hardware using the velodynelidar function.
Once you’ve connected to the hardware from MATLAB, you can read point clouds using the read command or stream point clouds to a buffer using the start and stop commands. You can also perform live analysis while streaming point cloud data into MATLAB.
MATLAB toolboxes provide many point cloud processing functions for different applications. With just a few lines of code, these functions and their corresponding examples can be applied to point clouds acquired live from Velodyne LiDAR sensors.
The functionality shown in this video requires Image Acquisition Toolbox™ as well as Image Acquisition Toolbox Support Package for Velodyne LiDAR Sensors, which can be downloaded from the Add-On Explorer in MATLAB.
You can now acquire live lidar data from select Velodyne LiDAR sensors directly into MATLAB. This video will show you how to get started.
First, I’ll review how to connect to your hardware. Then I’ll show the different options for reading lidar point clouds in MATLAB. Finally, you’ll see how to get started with some of the lidar processing functionality available in MATLAB.
Before you get started, you’ll want to make sure that you have installed all the required products and hardware support packages.
The following Velodyne LiDAR sensors are currently supported. For this example, I’ll be using a Velodyne Puck sensor.
Here is the sensor as it is mounted to my desk. I have it connected to power and to my computer’s Ethernet port.
Before I connect to the sensor from MATLAB, I’ll check the connection using the VeloView software, which is a free third-party tool.
I can open the sensor stream, specifying the sensor model that I’m using and the lidar port. Here I’m using the default, which is 2368. I can see the data coming from the sensor live, and if I move around my office, you can see that movement in the displayed point cloud.
So this is all working well, and now I can connect to the sensor from MATLAB. Before I do so, I should make sure to close the VeloView software, since only one application can receive data from the sensor’s port at a time.
In MATLAB, I can connect to the sensor using the velodynelidar function. Specify the sensor name. Here you can also specify the port and the calibration file if you have one. I’m using the defaults for those. The default calibration file is the one that’s provided by Velodyne LiDAR.
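As a sketch, connecting with the defaults described above might look like the following. The name-value arguments shown in the comment for a custom port and calibration file are assumptions based on the options just mentioned, and `myCalibration.xml` is a hypothetical file name:

```matlab
% Connect to a Velodyne Puck (VLP-16) using the default port (2368)
% and the default calibration file provided by Velodyne LiDAR.
v = velodynelidar('VLP-16');

% A custom port or calibration file could be specified instead, e.g.:
% v = velodynelidar('VLP-16', 'Port', 2368, ...
%                   'CalibrationFile', 'myCalibration.xml');

% Open a live streaming preview of the incoming point clouds.
preview(v)
```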
So you can see I’ve successfully connected to this device. I can preview the data coming from the sensor with the preview function. Now you can see the data is streaming into a MATLAB figure. And once again, if I move around the office, you can see the preview update to reflect this.
But the streaming preview does not save any data. If I want to actually acquire data in MATLAB, I can use the read function. And that will read data into a point cloud variable. So you can see I’ve read a single point cloud with 57,000 points into MATLAB. I can view this point cloud using the pcshow function. So that data has now been acquired and visualized in MATLAB.
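A minimal sketch of this on-demand read, assuming the connected `velodynelidar` object is stored in `v`:

```matlab
% Read a single point cloud on demand from the connected sensor.
ptCloud = read(v, 1);

% Inspect the number of points, then visualize the result.
disp(ptCloud.Count)
pcshow(ptCloud)
```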
In addition to on-demand point cloud reads, I can also stream data into a buffer in MATLAB. If I call the start function on the velodynelidar object v and then look at the object, we can see that Streaming is now true and that the number of point clouds available continues to increase over time.
I can read point clouds from the beginning of the buffer, either one at a time or in groups, and I can analyze this data in MATLAB while data continues to stream into the end of the buffer.
I can also stop the buffered acquisition. When I do so, I still have access to the remaining data in the buffer. So there are a few different options for reading point clouds from the device.
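The buffered workflow above might be sketched as follows; the Streaming and NumPointCloudsAvailable property names reflect the object display described above:

```matlab
% Start streaming point clouds into the buffer.
start(v)
pause(2)                          % let some data accumulate

% While acquiring, Streaming is true and the buffer keeps growing.
disp(v.Streaming)
n = v.NumPointCloudsAvailable;

% Read a group of point clouds from the front of the buffer while
% new data continues to stream into the end of it.
clouds = read(v, min(n, 10));

% Stop the acquisition; the data remaining in the buffer stays readable.
stop(v)
remaining = read(v, v.NumPointCloudsAvailable);
```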
Now let’s say I’d like to start doing some processing of these point clouds. When I want to learn how to do something in MATLAB, I like to search the documentation. If I search for lidar processing, I see that there are entire documentation sections dedicated to this. Let’s say I want to do some segmentation. There is a section here with a number of lidar preprocessing functions available. I see that one of them is pcsegdist, which can segment point clouds into clusters based on Euclidean distance.
Whenever I’m trying to use a new function for the first time, I like to look at the examples that are available. I see here that there’s an example where some artificial sphere data is created and then it’s segmented based on distance and the two clusters that are discovered are differentiated by color. That seems like an interesting application. I’ll open the live script in MATLAB. And I could just run this, but rather than using this artificial data, I’ll replace it with the data specific to my application, which is the live data coming from my sensor.
So now if I run this, it’ll read from my sensor and then perform the rest of these computations on that data.
Now I can see in this plot that most of the points within my office were found to be in the same cluster and there was a second cluster here for the chair in my office which is right behind me.
If I want to find more distinct clusters in my office, I can change this minimum distance so that rather than points needing to be half a meter away to be distinct clusters, let’s say they only need to be ten centimeters away. And I’ll also change the colormap so that adjacent clusters have more distinct colors.
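Adapting the pcsegdist example to the live sensor data might look like this sketch, with the minimum distance lowered to 0.1 m as described above; the hsv colormap is one way to give adjacent clusters more distinct colors:

```matlab
% Replace the example's artificial data with a live read from the sensor.
ptCloud = read(v, 1);

% Points closer than 0.1 m are grouped into the same cluster
% (down from the example's 0.5 m, to find more distinct clusters).
minDistance = 0.1;
labels = pcsegdist(ptCloud, minDistance);

% Color each point by its cluster label, spreading adjacent labels
% across distinct hues.
pcshow(ptCloud.Location, labels)
colormap(hsv(max(labels)))
```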
Now let’s run this again.
And in the figure that pops up, I can see a number of different colors for the different segments found in my office. I appear as one color here, my chair is a different color, this wall with my computer on it is another color, and there are several distinct clusters found on the desk at the far side of my office.
So these are some of the first steps that I would take exploring some new lidar processing function and then adapting it to my own needs.
So to review, I’ve gone through the steps needed to get started with the hardware: first testing your connection with VeloView, and then connecting in MATLAB with the velodynelidar function. You can read point clouds with the read function and stream point clouds with the start and stop functions. And when you move on to processing, you can explore the provided point cloud processing functionality and adapt the examples to suit your own needs.