Create driving scenario
The drivingScenario object represents a 3-D
arena containing roads, vehicles, pedestrians, and other aspects of a driving scenario.
Use this object to model realistic traffic scenarios and to generate synthetic
detections for testing controllers or sensor fusion algorithms.
To add actors (cars, pedestrians, bicycles, and so on), use the
actor function. To add actors
with properties designed specifically for vehicles, use the
vehicle function. All actors,
including vehicles, are modeled as cuboids (box shapes).
To simulate a scenario, call the
advance function in a loop,
which advances the simulation one time step at a time.
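For example, a minimal simulation loop might look like the following sketch (the road, actor, and trajectory values here are illustrative only):

```matlab
% Sketch of a basic simulation loop; the waypoints and speed are arbitrary.
scenario = drivingScenario('SampleTime',0.1,'StopTime',5);
road(scenario,[0 0 0; 50 0 0]);
car = vehicle(scenario);
trajectory(car,[0 0 0; 50 0 0],10);   % 50 m straight path at 10 m/s

while advance(scenario)               % returns false when the simulation ends
    poses = actorPoses(scenario);     % poses of all actors at this step
    fprintf('t = %.1f s, x = %.1f m\n', ...
        scenario.SimulationTime,poses(1).Position(1));
end
```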
You can also create driving scenarios interactively by using the Driving Scenario
Designer app. In addition, you can export
objects from the app to produce scenario variations for use in either the app or in
Simulink®. For more details, see Create Driving Scenario Variations Programmatically.
scenario = drivingScenario creates an empty driving scenario.
SampleTime — Time interval between scenario simulation steps
0.01 (default) | positive real scalar
Time interval between scenario simulation steps, specified as a positive real scalar. Units are in seconds.
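For instance, to step the simulation every 50 milliseconds instead of the default 10 milliseconds:

```matlab
% Sketch: create a scenario that advances in 50 ms steps
scenario = drivingScenario('SampleTime',0.05);
```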
StopTime — End time of simulation
Inf (default) | positive real scalar
End time of simulation, specified as a positive real scalar. Units are in seconds. The default
StopTime of Inf causes the simulation to end when the first actor reaches the end of its trajectory.
SimulationTime — Current time of simulation
This property is read-only.
Current time of the simulation, specified as a positive real scalar. To reset the time to
zero, call the
restart function. Units
are in seconds.
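As a sketch, restart lets you rerun a completed simulation from time zero (the scenario setup below is illustrative only):

```matlab
% Run a short scenario to completion, then reset its clock.
scenario = drivingScenario('SampleTime',0.1,'StopTime',1);
car = vehicle(scenario);
trajectory(car,[0 0 0; 10 0 0],10);

while advance(scenario); end   % run until StopTime
restart(scenario)              % SimulationTime resets to zero
```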
IsRunning — Simulation state
This property is read-only.
Simulation state, specified as true or false. If the simulation is running, IsRunning is true. Otherwise, it is false.
Actors — Actors and vehicles contained in scenario
Actors and vehicles contained in the scenario, specified as a heterogeneous array of actor and vehicle objects. To add actors to the scenario, use the actor or vehicle function.
GeoReference — Geographic coordinates of road network origin
This property is read-only.
Geographic coordinates of the road network origin, specified as a
three-element numeric row vector of the form [lat lon alt], where:
lat is the latitude of the coordinate in degrees.
lon is the longitude of the coordinate in degrees.
alt is the altitude of the coordinate in meters.
These values are with respect to the WGS84 reference ellipsoid, which is a standard ellipsoid used by GPS data.
The road network origin corresponds to the first set of geographic
coordinates specified as input to the
roadNetwork function. If you specify to import roads by
region, then this point corresponds to the center point of the specified
region. In drivingScenario objects generated from the Driving
Scenario Designer app, if you imported the road network from a
map, then the road network origin corresponds to the first (or only) set of
geographic coordinates specified.
By specifying these coordinates as the origin in the
latlon2local function, you can convert the geographic
coordinates of a driving route into the local coordinates of a driving
scenario. Then, you can specify this converted route as a vehicle trajectory
in the scenario.
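For example, assuming a scenario whose road network was imported with roadNetwork, and a geographic driving route (the coordinates below are placeholders), the conversion might look like this sketch:

```matlab
% Hypothetical driving route in geographic coordinates
lat = [42.3000; 42.2995; 42.2990];    % degrees
lon = [-71.3500; -71.3495; -71.3490]; % degrees
alt = zeros(3,1);                     % meters

% Convert to the local coordinates of the scenario, using the stored
% road network origin as the reference point
origin = scenario.GeoReference;       % [lat lon alt]
[x,y,z] = latlon2local(lat,lon,alt,origin);

% Use the converted route as trajectory waypoints for an ego vehicle
egoVehicle = vehicle(scenario);
trajectory(egoVehicle,[x y z],15);    % 15 m/s along the route
```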
|actor|Add actor to driving scenario|
|actorPoses|Positions, velocities, and orientations of actors in driving scenario|
|actorProfiles|Physical and radar characteristics of actors in driving scenario|
|vehicle|Add vehicle to driving scenario|
|chasePlot|Ego-centric projective perspective plot|
|trajectory|Create actor or vehicle trajectory in driving scenario|
|targetPoses|Target positions and orientations relative to ego vehicle|
|targetOutlines|Outlines of targets viewed by actor|
|driving.scenario.targetsToEgo|Convert target poses from scenario to ego coordinates|
|driving.scenario.targetsToScenario|Convert target poses from ego to scenario coordinates|
|currentLane|Get current lane of actor|
|lanespec|Create road lane specifications|
|laneMarking|Create road lane marking object|
|laneMarkingVertices|Lane marking vertices and faces in driving scenario|
|laneBoundaries|Get lane boundaries of actor lane|
|clothoidLaneBoundary|Clothoid-shaped lane boundary model|
|computeBoundaryModel|Compute lane boundary points from clothoid lane boundary model|
|laneType|Create road lane type object|
Create a driving scenario containing a curved road, two straight roads, and two actors: a car and a bicycle. Both actors move along the road for 60 seconds.
Create the driving scenario object.
scenario = drivingScenario('SampleTime',0.1,'StopTime',60);
Create the curved road using road center points following the arc of a circle with an 800-meter radius. The arc starts at 0°, ends at 90°, and is sampled at 5° increments.
angs = [0:5:90]';
R = 800;
roadcenters = R*[cosd(angs) sind(angs) zeros(size(angs))];
roadwidth = 10;
road(scenario,roadcenters,roadwidth);
Add two straight roads with the default width, using road center points at each end.
roadcenters = [700 0 0; 100 0 0];
road(scenario,roadcenters);
roadcenters = [400 400 0; 0 0 0];
road(scenario,roadcenters);
Get the road boundaries.
rbdry = roadBoundaries(scenario);
Add a car and a bicycle to the scenario. Position the car at the beginning of the first straight road.
car = vehicle(scenario,'Position',[700 0 0],'Length',3,'Width',2,'Height',1.6);
Position the bicycle farther down the road.
bicycle = actor(scenario,'Position',[706 376 0],'Length',2,'Width',0.45,'Height',1.5);
Plot the scenario.
plot(scenario)
Display the actor poses and profiles.
poses = actorPoses(scenario)
poses = 2×1 struct array with fields:
    ActorID
    Position
    Velocity
    Roll
    Pitch
    Yaw
    AngularVelocity
profiles = actorProfiles(scenario)
profiles = 2×1 struct array with fields:
    ActorID
    ClassID
    Length
    Width
    Height
    OriginOffset
    MeshVertices
    MeshFaces
    RCSPattern
    RCSAzimuthAngles
    RCSElevationAngles
Create a driving scenario and show how target outlines change as the simulation advances.
Create a driving scenario consisting of two intersecting straight roads. The first road segment is 45 meters long. The second straight road is 32 meters long and intersects the first road. A car traveling at 12.0 meters per second along the first road approaches a running pedestrian crossing the intersection at 2.0 meters per second.
scenario = drivingScenario('SampleTime',0.1,'StopTime',1);
road(scenario,[-10 0 0; 45 -20 0]);
road(scenario,[-10 -10 0; 35 10 0]);
ped = actor(scenario,'Length',0.4,'Width',0.6,'Height',1.7);
car = vehicle(scenario);
pedspeed = 2.0;
carspeed = 12.0;
trajectory(ped,[15 -3 0; 15 3 0],pedspeed);
trajectory(car,[-10 -10 0; 35 10 0],carspeed);
Create an ego-centric chase plot for the vehicle.
chasePlot(car)
Create an empty bird's-eye plot and add an outline plotter and lane boundary plotter. Then, run the simulation. At each simulation step:
Update the chase plot to display the road boundaries and target outlines.
Update the bird's-eye plot to display the updated road boundaries and target outlines. The plot perspective is always with respect to the ego vehicle.
bepPlot = birdsEyePlot('XLim',[-50 50],'YLim',[-40 40]);
outlineplotter = outlinePlotter(bepPlot);
laneplotter = laneBoundaryPlotter(bepPlot);
legend('off')
while advance(scenario)
    rb = roadBoundaries(car);
    [position,yaw,length,width,originOffset,color] = targetOutlines(car);
    plotLaneBoundary(laneplotter,rb)
    plotOutline(outlineplotter,position,yaw,length,width, ...
        'OriginOffset',originOffset,'Color',color)
    pause(0.01)
end
Create a driving scenario containing an ego vehicle and a target vehicle traveling along a three-lane road. Detect the lane boundaries by using a vision detection generator.
scenario = drivingScenario;
Create a three-lane road by using lane specifications.
roadCenters = [0 0 0; 60 0 0; 120 30 0];
lspc = lanespec(3);
road(scenario,roadCenters,'Lanes',lspc);
Specify that the ego vehicle follows the center lane at 30 m/s.
egovehicle = vehicle(scenario);
egopath = [1.5 0 0; 60 0 0; 111 25 0];
egospeed = 30;
trajectory(egovehicle,egopath,egospeed);
Specify that the target vehicle travels ahead of the ego vehicle at 40 m/s and changes lanes close to the ego vehicle.
targetcar = vehicle(scenario,'ClassID',2);
targetpath = [8 2; 60 -3.2; 120 33];
targetspeed = 40;
trajectory(targetcar,targetpath,targetspeed);
Display a chase plot for a 3-D view of the scenario from behind the ego vehicle.
chasePlot(egovehicle)
Create a vision detection generator that detects lanes and objects. The pitch of the sensor points one degree downward.
visionSensor = visionDetectionGenerator('Pitch',1.0);
visionSensor.DetectorOutput = 'Lanes and objects';
visionSensor.ActorProfiles = actorProfiles(scenario);
Run the simulation.
Create a bird's-eye plot and the associated plotters.
Display the sensor coverage area.
Display the lane markings.
Obtain ground truth poses of targets on the road.
Obtain ideal lane boundary points up to 60 m ahead.
Generate detections from the ideal target poses and lane boundaries.
Display the outline of the target.
Display object detections when the object detection is valid.
Display the lane boundary when the lane detection is valid.
bep = birdsEyePlot('XLim',[0 60],'YLim',[-35 35]);
caPlotter = coverageAreaPlotter(bep,'DisplayName','Coverage area', ...
    'FaceColor','blue');
detPlotter = detectionPlotter(bep,'DisplayName','Object detections');
lmPlotter = laneMarkingPlotter(bep,'DisplayName','Lane markings');
lbPlotter = laneBoundaryPlotter(bep,'DisplayName', ...
    'Lane boundary detections','Color','red');
olPlotter = outlinePlotter(bep);
plotCoverageArea(caPlotter,visionSensor.SensorLocation, ...
    visionSensor.MaxRange,visionSensor.Yaw, ...
    visionSensor.FieldOfView(1));
while advance(scenario)
    [lmv,lmf] = laneMarkingVertices(egovehicle);
    plotLaneMarking(lmPlotter,lmv,lmf)
    tgtpose = targetPoses(egovehicle);
    lookaheadDistance = 0:0.5:60;
    lb = laneBoundaries(egovehicle,'XDistance',lookaheadDistance, ...
        'LocationType','inner');
    [obdets,nobdets,obValid,lb_dets,nlb_dets,lbValid] = ...
        visionSensor(tgtpose,lb,scenario.SimulationTime);
    [objposition,objyaw,objlength,objwidth,objoriginOffset,color] = ...
        targetOutlines(egovehicle);
    plotOutline(olPlotter,objposition,objyaw,objlength,objwidth, ...
        'OriginOffset',objoriginOffset,'Color',color)
    if obValid
        detPos = cellfun(@(d)d.Measurement(1:2),obdets,'UniformOutput',false);
        detPos = vertcat(zeros(0,2),cell2mat(detPos')');
        plotDetection(detPlotter,detPos)
    end
    if lbValid
        plotLaneBoundary(lbPlotter,vertcat(lb_dets.LaneBoundaries))
    end
end
To specify the motion of actors in a driving scenario, you can either define trajectories for the actors or specify their motion manually.
The trajectory function determines actor pose properties based on a
set of waypoints and the speeds at which the actor travels between those
waypoints. Actor pose properties are position, velocity, roll, pitch, yaw, and
angular velocity. With this approach, motion is defined by speed, not velocity,
because the trajectory determines the direction of motion.
The actor moves along the trajectory each time the
advance function is called.
You can manually update actor pose properties at any time during a simulation.
However, these properties are overwritten with updated values at the next call
to the advance function.
When you specify actor motion manually, setting the velocity or angular
velocity properties does not automatically move the actor in successive calls to
the advance function. Therefore,
you must use your own motion model to update the position, velocity, and other
pose parameters at each simulation time step.
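As a sketch of the manual approach, the loop below moves an actor with a simple constant-velocity model (this model is an illustrative assumption, not a built-in one):

```matlab
% Manual motion sketch: the actor has no trajectory, so advance does
% not move it automatically.
scenario = drivingScenario('SampleTime',0.1,'StopTime',1);
car = vehicle(scenario,'Position',[0 0 0],'Velocity',[10 0 0]);

dt = scenario.SampleTime;
while advance(scenario)
    % Update the pose yourself at each time step (constant velocity)
    car.Position = car.Position + car.Velocity*dt;
end
```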
 You need to enter into a separate agreement with HERE in order to gain access to the HDLM services and to get the required credentials (access_key_id and access_key_secret) for using the HERE Service.