Scenario Generation from Recorded Vehicle Data

This example shows how to generate a virtual driving scenario from recorded vehicle data. The scenario is generated from position data recorded by a GPS sensor and from object lists processed from a lidar sensor.

Overview

Virtual driving scenarios can be used to recreate a real scenario from recorded vehicle data. These virtual scenarios enable you to visualize and study the original scenario. Because you can programmatically modify virtual scenarios, you can also use them to synthesize scenario variations when designing and evaluating autonomous driving systems.

In this example, you create a virtual driving scenario by generating a drivingScenario object from data that was recorded from a test (ego) vehicle and an OpenDRIVE® file. The OpenDRIVE file describes the road network of the area where the data was recorded. The recorded vehicle data includes:

  • GPS data: Text file containing the latitude and longitude coordinates of the ego vehicle at each timestamp.

  • Lidar object list data: Text file containing the number of non-ego actors and the positions of their centers, relative to the ego vehicle, at each timestamp.

  • Video data: MP4 file recorded from a forward-facing monocular camera mounted on the ego vehicle.

To generate and simulate the driving scenario, you follow these steps:

  1. Explore recorded vehicle data.

  2. Import OpenDRIVE road network into driving scenario.

  3. Add ego vehicle data from GPS to driving scenario.

  4. Add non-ego actors from lidar object list to driving scenario.

  5. Simulate and visualize generated scenario.

The following diagram shows how you use the recorded data in this example. Notice that you create the driving scenario from the GPS, lidar object lists, and OpenDRIVE files. You use the camera data to visualize the original scenario and can compare this data with the scenario you generate. You also visualize the scenario route on a map using geoplayer.

Explore Recorded Vehicle Data

The positions of the ego vehicle were captured using a UBlox GPS NEO M8N sensor. The GPS sensor was placed on the center of the roof of the ego vehicle. This data is saved in the text file EgoUrban.txt.

The positions of the non-ego actors were captured using a Velodyne® VLP-16 lidar sensor with a range of 30 meters. The VLP-16 sensor was placed on the roof of the ego vehicle at a position and height chosen so that the body of the ego vehicle does not obstruct the sensor's field of view. The point cloud from the lidar sensor was processed on the vehicle to detect objects and their positions relative to the ego vehicle. This data is saved in the text file NonEgoUrban.txt.

To help understand the original scenario, video from a monocular camera was recorded as a reference. This video can also be used to visually compare the original and generated scenarios. A preview of this recorded video is saved in the video file urbanpreview.mp4. You can download the full recorded video file from here.

Generate a preview of the urban traffic scenario used in this example.

vidObj = VideoReader('urbanpreview.mp4');
fig = figure;
set(fig,'Position',[0, 0, 800, 600]);
movegui(fig,'center');
pnl = uipanel(fig,'Position',[0 0 1 1],'Title','Urban Traffic Scenario');
plt = axes(pnl);
while hasFrame(vidObj)
    vidFrame = readFrame(vidObj);
    image(vidFrame,'Parent',plt);
    plt.Visible = 'off';
    pause(1/vidObj.FrameRate);
end

Though the sensor coverage area can be defined around the entire ego vehicle, this example shows only the forward-looking scenario.

Import OpenDRIVE Road Network into Driving Scenario

The road network data for generating the virtual scenario is obtained from OpenStreetMap®. The OpenStreetMap data files are converted to OpenDRIVE files and saved with extension .xodr. Use the roadNetwork function to import this road network data to a driving scenario.

Create a driving scenario object and import the desired OpenDRIVE road network into the generated scenario.

scenario = drivingScenario;
openDRIVEFile = 'OpenDRIVEUrban.xodr';
roadNetwork(scenario,'OpenDRIVE',openDRIVEFile);
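
To confirm that the road network was imported as expected, you can optionally plot the scenario before adding any actors.

% Optional check: visualize the imported road network
figure
plot(scenario)
title('Imported OpenDRIVE Road Network')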

Add Ego Vehicle Data from GPS to Generated Scenario

The ego vehicle data is collected from the GPS sensor and stored as a text file. The text file consists of three columns that store the latitude, longitude, and timestamp values for the ego vehicle. Use the helperGetEgoData function to import the ego vehicle data from the text file into a structure in the MATLAB® workspace. The structure contains three fields: lat, lon, and Time.

egoFile = 'EgoUrban.txt';
egoData = helperGetEgoData(egoFile);
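
For reference, helperGetEgoData (listed at the end of this example) parses three whitespace-separated columns per row. The rows below illustrate only the expected layout; the values are hypothetical, not taken from the recorded file.

% Layout of EgoUrban.txt (hypothetical values, for illustration only):
%   latitude     longitude    timestamp
%   17.4258537   78.4493948   0.00
%   17.4258601   78.4494104   0.05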

Compute the trajectory waypoints of the ego vehicle from the recorded GPS coordinates. Use the geodetic2enu function from Mapping Toolbox to convert the raw GPS coordinates to the local east-north-up Cartesian system. The transformed coordinates define the trajectory waypoints of the ego vehicle.

% Specify latitude and longitude at origin of OpenDRIVE file
lat0 = 17.425853702697903;
lon0 = 78.44939480188313;
% Specify latitude and longitude of ego vehicle
lat = egoData.lat;
lon = egoData.lon;
% Compute waypoints of ego vehicle
[X,Y,~] = geodetic2enu(lat,lon,0,lat0,lon0,0,wgs84Ellipsoid);
egoWaypoints(:,1) = X;
egoWaypoints(:,2) = Y;
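
As an optional sanity check, you can convert the first waypoint back to geodetic coordinates using the enu2geodetic function and compare the result against the recorded GPS values. The differences should be negligible.

% Optional: round-trip the first waypoint back to geodetic coordinates
[latCheck,lonCheck] = enu2geodetic(X(1),Y(1),0,lat0,lon0,0,wgs84Ellipsoid);
fprintf('Round-trip difference: %.3g deg (lat), %.3g deg (lon)\n', ...
    abs(latCheck-lat(1)),abs(lonCheck-lon(1)))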

Visualize the GPS path of the ego vehicle using the geoplayer object.

zoomLevel = 17;
player = geoplayer(lat(1),lon(1),zoomLevel);
plotRoute(player,lat,lon);
for i = 1:length(lat)
    plotPosition(player,lat(i),lon(i));
end

Use helperComputeEgoData to compute the speed and the heading angle values of the ego vehicle at each sensor data timestamp.

[egoSpeed,egoAngle] = helperComputeEgoData(egoData,X,Y);
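
Optionally, plot the computed speed profile to spot outliers in the GPS data before generating the trajectory. This sketch assumes that the timestamps in egoData.Time are in seconds.

% Optional: inspect the computed ego vehicle speed profile
figure
plot(egoData.Time,egoSpeed)
xlabel('Time (s)')
ylabel('Speed (m/s)')
title('Ego Vehicle Speed Profile')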

Add the ego vehicle to the driving scenario.

ego = vehicle(scenario,'Length',1,'Width',0.6,'Height',0.6);

Create a trajectory for the ego vehicle from the computed set of ego waypoints and the speed. The ego vehicle follows the trajectory at the specified speed.

trajectory(ego,egoWaypoints,egoSpeed);

Add Non-Ego Actors from Lidar Object Lists to Generated Scenario

The non-ego actor data is collected from the lidar sensor and stored as a text file. The text file consists of five columns that store the actor IDs, x-positions, y-positions, z-positions, and timestamp values, respectively. Use the helperGetNonEgoData function to import the non-ego actor data from the text file into a structure in the MATLAB® workspace. The output is a structure with three fields:

  1. ActorID - Scenario-defined actor identifier, specified as a positive integer.

  2. Position - Position of actor, specified as an [x y z] real vector. Units are in meters.

  3. Time - Timestamp of the sensor recording.

nonEgoFile = 'NonEgoUrban.txt';
nonEgoData = helperGetNonEgoData(nonEgoFile);
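
Similarly, helperGetNonEgoData parses five whitespace-separated columns per row. The rows below are hypothetical and illustrate only the layout.

% Layout of NonEgoUrban.txt (hypothetical values, for illustration only):
%   ActorID   x      y      z     timestamp
%   1         5.2    0.8    0.0   0.05
%   2        12.6   -1.4    0.0   0.05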

Use helperComputeNonEgoData to compute the trajectory waypoints and the speed of each non-ego actor at each timestamp. The trajectory waypoints are computed relative to the ego vehicle.

actors = unique(nonEgoData.ActorID);
[nonEgoSpeed, nonEgoWaypoints] = helperComputeNonEgoData(egoData,egoWaypoints,nonEgoData,egoAngle);

Add the non-ego actors to the driving scenario. Create trajectories for the non-ego actors from the computed set of actor waypoints and the speed.

for i = 1:length(actors)
    actor = vehicle(scenario,'Length',1,'Width',0.6,'Height',0.6);
    trajectory(actor,nonEgoWaypoints{i},nonEgoSpeed{i});
end

Visualize the ego vehicle and non-ego actors that you imported into the generated scenario. Also visualize the corresponding trajectory waypoints of the ego vehicle and non-ego actors.

% Create a custom figure window and define an axes object
fig = figure;
set(fig,'Position',[0, 0, 800, 600]);
movegui(fig,'center');
hViewPnl = uipanel(fig,'Position',[0 0 1 1],'Title','Ego Vehicle and Actors');
hCarPlt = axes(hViewPnl);

% Plot the generated driving scenario.
plot(scenario,'Parent',hCarPlt);
axis([270 320 80 120]);
legend('Imported Road Network','Lanes','Ego Vehicle','Actor 1','Actor 2','Actor 3','Actor 4','Actor 5')
legend(hCarPlt,'boxoff');

figure
plot(egoWaypoints(:,1),egoWaypoints(:,2),'Color',[0 0.447 0.741],'LineWidth',2)
hold on
cMValues = [0.85 0.325 0.098;0.929 0.694 0.125;0.494 0.184 0.556;0.466 0.674 0.188;0.301 0.745 0.933];
for i = 1:length(actors)
    plot(nonEgoWaypoints{i}(:,1),nonEgoWaypoints{i}(:,2),'Color',cMValues(i,:),'LineWidth',2)
end
axis('tight')
xlabel('X (m)')
ylabel('Y (m)')
title('Computed Ego Vehicle and Actor Trajectories')
legend('Ego Vehicle', 'Actor 1', 'Actor 2', 'Actor 3','Actor 4','Actor 5','Location','Best')
hold off

Simulate and Visualize Generated Scenario

Plot the scenario and the corresponding chase plot. Run the simulation to visualize the generated driving scenario. The ego vehicle and the non-ego actors follow their respective trajectories.

% Create a custom figure window to show the scenario and chase plot
close all;
figScene = figure('Name','Driving Scenario','Tag','ScenarioGenerationDemoDisplay');
set(figScene,'Position',[0, 0, 1032, 1032]);
movegui(figScene,'center');

% Add the chase plot
hCarViewPanel = uipanel(figScene,'Position',[0.5 0 0.5 1],'Title','Chase Camera View');
hCarPlot = axes(hCarViewPanel);
chasePlot(ego,'Parent',hCarPlot);

% Add the top view of the generated scenario
hViewPanel = uipanel(figScene,'Position',[0 0 0.5 1],'Title','Top View');
hCarPlot = axes(hViewPanel);
plot(scenario,'Parent',hCarPlot);
% Set the axis limits to display only the active area
xMin = min(egoWaypoints(:,1));
xMax = max(egoWaypoints(:,1));
yMin = min(egoWaypoints(:,2));
yMax = max(egoWaypoints(:,2));
limits = [xMin xMax yMin yMax];
axis(limits);

% Run the simulation
while advance(scenario)
    pause(0.01)
end
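
To replay the simulation, for example after adjusting the plot limits, reset the scenario to its initial state with the restart function and advance it again.

% Optional: reset the scenario and replay the simulation
restart(scenario)
while advance(scenario)
    pause(0.01)
end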

Summary

This example shows how to automatically generate a virtual driving scenario from vehicle data recorded using GPS and lidar sensors.

Helper Functions

helperGetEgoData

This function reads the ego vehicle data from a text file and converts it into a structure.

function [egoData] = helperGetEgoData(egoFile)
%Read the ego vehicle data from text file
fileID = fopen(egoFile);
content = textscan(fileID,'%f %f %f');
fields = {'lat','lon','Time'};
egoData = cell2struct(content,fields,2);
fclose(fileID);
end

helperGetNonEgoData

This function reads the processed lidar data from a text file and converts it into a structure. The processed lidar data contains information about the non-ego actors.

function [nonEgoData] = helperGetNonEgoData(nonEgoFile)
% Read the processed lidar data of non-ego actors from text file.
fileID = fopen(nonEgoFile);
content = textscan(fileID,'%d %f %f %f %f');
newcontent{1} = content{1};
newcontent{2} = [content{2} content{3} content{4}];
newcontent{3} = content{5};
fields = {'ActorID','Position','Time'};
nonEgoData = cell2struct(newcontent,fields,2);
fclose(fileID);
end

helperComputeEgoData

This function calculates the speed and heading angle of the ego vehicle based on the trajectory waypoints and the timestamps.

function [egoSpeed, egoAngle] = helperComputeEgoData(egoData, X, Y)
% Compute the time elapsed between consecutive GPS readings
egoTime = egoData.Time;
timeDiff = diff(egoTime);
% Compute the distance between consecutive trajectory waypoints
points = [X Y];
difference = diff(points, 1);
distance = sqrt(sum(difference .* difference, 2));
% Speed is the distance traveled over the elapsed time; the heading
% angle follows from the waypoint differences
egoSpeed = distance./timeDiff;
egoAngle = atan(diff(Y)./diff(X));
% Repeat the last value so the outputs match the number of waypoints
egoAngle(end+1) = egoAngle(end);
egoSpeed(end+1) = egoSpeed(end);
end

helperComputeNonEgoData

This function calculates the trajectory waypoints and the speed of each non-ego actor based on the recorded lidar data and the ego vehicle trajectory. The waypoints are converted from ego-relative coordinates to world coordinates using the ego heading angle.

function [nonEgoSpeed, nonEgoWaypoints] = helperComputeNonEgoData(egoData, egoWaypoints, nonEgoData, egoAngle)
actors = unique(nonEgoData.ActorID);
nonEgoWaypoints =  cell(length(actors), 1);
nonEgoSpeed = cell(length(actors), 1);
for i = 1:length(actors)
    id = actors(i);
    idx = find([nonEgoData.ActorID] == id);
    actorXData = nonEgoData.Position(idx,1);
    actorYData = nonEgoData.Position(idx,2);
    actorTime = nonEgoData.Time(idx);
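    % Reset the per-actor arrays before computing this actor's trajectory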
    clear actorWaypoints actorSpeed;
    actorWaypoints(:,1) = 0;
    actorWaypoints(:,2) = 0;
    
    % Compute the trajectory waypoints of non-ego actor
    [sharedTimeStamps,nonEgoIdx,egoIdx] = intersect(actorTime,egoData.Time,'stable');
    tempX = actorXData(nonEgoIdx);
    tempY = actorYData(nonEgoIdx);
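    % Rotate the ego-relative lidar offsets into the world frame using the
    % ego heading angle. The sign convention here follows the axis
    % orientation of the recorded lidar data.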
    relativeX = -tempX .* cos(egoAngle(egoIdx)) + tempY .* sin(egoAngle(egoIdx));
    relativeY = -tempX .* sin(egoAngle(egoIdx)) - tempY .* cos(egoAngle(egoIdx));
    actorWaypoints(nonEgoIdx,1) = egoWaypoints(egoIdx,1) + relativeX;
    actorWaypoints(nonEgoIdx,2) = egoWaypoints(egoIdx,2) + relativeY;
    
    % Compute the speed values of non-ego actor
    timeDiff = diff(sharedTimeStamps);
    difference = diff(actorWaypoints, 1);
    distance = sqrt(sum(difference .* difference, 2));
    actorSpeed = distance./timeDiff;
    actorSpeed(end+1) = actorSpeed(end);
    
    % Smooth the trajectory waypoints of non-ego actor
    actorWaypoints = smoothdata(actorWaypoints,'sgolay');
    
    % Store the computed trajectory waypoints and speed of each non-ego actor
    nonEgoWaypoints(i) = {actorWaypoints};
    nonEgoSpeed(i) = {actorSpeed'};
end
end
