After you have completed Setup and Configuration, you can use the Code Generation Simulink® Template for Image Processing to create a model that processes the image from the downward-facing camera of the Parrot® minidrone. When you deploy the model on a Parrot minidrone, the output from the image-processing algorithm is an additional input to control the drone's flight. You can update the template to add your own logic.
At the MATLAB® command prompt, type simulink.
On the Simulink Start Page, navigate to Simulink Support Package for Parrot Minidrones, and select Code Generation Template for Image Processing and Control.
A Simulink model opens. This model is configured for code generation, and it uses image processing as an additional input to control the drone's flight. The template contains two subsystem blocks, Image Processing System and Flight Control System, with corresponding input ports and output ports. For more information, see Image Processing System and Flight Control System.
You can use the Image Processing System to develop an image-processing algorithm. This subsystem contains an input port, an output port, and an optional block:
Image Data – This input port obtains data from the downward-facing camera of a Parrot minidrone, in Y1UY2V format, as a 4-by-9600 array of type uint8.
Vision-based Data – This output port transfers the processed data based on the image-processing algorithm modeled in the subsystem.
PARROT Image Conversion – This is an optional block that helps you convert the Image Data (in Y1UY2V format) to either YUV or RGB format. There are three outputs for this block, which correspond to the three color components of the converted image. Each color component is a 120-by-160 array of type uint8. You can use these output color components to create an image-processing algorithm, and then connect the output of the algorithm to the Vision-based Data output port of the subsystem.
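For reference, the following minimal MATLAB Function block sketch shows how a 4-by-9600 Y1UY2V frame can be unpacked into 120-by-160 Y, U, and V planes. This only illustrates the data layout; the function name and the assumed row-major pixel ordering are not part of the template, and the PARROT Image Conversion block performs this conversion for you.

function [Y, U, V] = unpackY1UY2V(imageData) %#codegen
% Unpack a 4-by-9600 Y1UY2V (YUV 4:2:2) frame into 120-by-160 planes.
% Assumption: pixels are stored in row-major order, two pixels per column.
y1 = imageData(1,:);                 % luma of the first pixel in each pair
u  = imageData(2,:);                 % chroma U, shared by the pixel pair
y2 = imageData(3,:);                 % luma of the second pixel in each pair
v  = imageData(4,:);                 % chroma V, shared by the pixel pair
Y  = reshape([y1; y2], 160, 120)';   % interleave the two luma samples
U  = reshape([u;  u ], 160, 120)';   % replicate chroma for both pixels
V  = reshape([v;  v ], 160, 120)';
end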
You can model an image-processing algorithm in the Image Processing System by using the image obtained from the Parrot minidrone's downward-facing camera. To create a typical image-processing algorithm:
Extract features of interest from the input image, which is captured by the drone's camera every 200 ms. For example:
Use the color components – Convert the input image, which is already in Y1UY2V format (YUV422), to another color space whose components are simpler to work with (for example, RGB888 or YUV444).
Identify the shape of objects in the image – Use methods like corner detection, template matching, and so on.
Use the output from image extraction to perform further image analysis. For example:
Compute the statistics of the colored region in the image.
Set threshold values that help to identify certain patterns on the image.
Based on the output from the final image analysis, set the values that can be used as input to control the position and orientation of the drone. For example, you can use a Stateflow® chart in the same subsystem to design logic that uses the data from the image analysis to generate the pitch and roll values that control the drone. The final values are passed to the Flight Control System using the Vision-based Data output port.
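As an illustration of these steps, the hypothetical MATLAB Function block below thresholds a red region in the RGB components from the PARROT Image Conversion block, computes the centroid of that region, and outputs its offset from the image center as the Vision-based Data signal. The threshold values, output layout, and function name are assumptions; adapt them to your own algorithm.

function visionData = trackRedRegion(R, G, B) %#codegen
% Hypothetical image-processing algorithm: find a red region in the
% 120-by-160 RGB components and return its centroid offset from the
% image center, scaled to the range [-1, 1].
mask = (R > 150) & (G < 100) & (B < 100);   % simple color threshold
[row, col] = find(mask);
if isempty(row)
    visionData = single([0; 0]);            % no red region detected
else
    dx = (mean(col) - 80) / 80;             % horizontal offset from center
    dy = (mean(row) - 60) / 60;             % vertical offset from center
    visionData = single([dx; dy]);
end
end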
In the template, the Sample Time of the Image Processing System subsystem is 200 ms. Do not change this Sample Time.
Flight Control System is the main subsystem that integrates the input data from the Image Processing System and the data from the different sensors on the drone. The control logic modeled in this subsystem starts the motors on the drone, controls the drone's flight, and stops the drone.
The Flight Control System subsystem contains three input ports – AC cmd, Sensors, and Vision-based Data. For details about the AC cmd and Sensors input ports, see Inports in Code Generation Template.
The Vision-based Data input port obtains data from the Image Processing System. For details, see Image Processing System.
In the template, the Vision-based Data input in the Flight Control System subsystem is terminated. However, if the Vision-based Data signal contains the appropriate values, you can use it for flight control by connecting the signal as an input that feeds values to the Motors output port.
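For illustration, here is a minimal MATLAB Function block sketch that maps such a signal to motor commands, assuming the two-element centroid offset produced by the earlier trackRedRegion sketch and the 1-by-4 motor vector format described under the Motors output port below. The base speed, gain, and mixing are placeholder assumptions, not values from the template, and a practical controller also needs the sensor-based stabilization logic.

function motorCmd = visionToMotors(visionData) %#codegen
% Hypothetical mapping from the Vision-based Data signal (a two-element
% centroid offset) to the 1-by-4 motor speed vector for the Motors port.
baseSpeed = single(400);   % same magnitude as the template's Pulse Generators
k = single(50);            % assumed proportional gain
dx = visionData(1);        % horizontal offset of the tracked region
dy = visionData(2);        % vertical offset of the tracked region
% Placeholder mixing: adjust each motor speed by the offset components.
motorCmd = baseSpeed + k * [dx + dy, -dx + dy, -dx - dy, dx - dy];
end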
The output ports in the Code Generation template are:
Motors – This output port is used to send signals to start the four motors on the drone.
In the Code Generation Template for Image Processing and Control, the input speed values for two of the four motors are set to zero. For the other two motors, two Pulse Generator blocks are connected to send signals. The amplitude values in these blocks are set to 400, which spins the motors at the corresponding speed. A phase shift is also incorporated into the Pulse Generator blocks so that the two motors spin cyclically for 2 seconds each.
When you design your own controller logic, each of the signals to the motors can be derived from the output of the controller logic or from the Vision-based Data. (Use a 1-by-4 vector as the input to the Motors output port.)
Flag – This output port is used to set error conditions that can be used to stop the flight of the drone (stop the motors). A value of 0 indicates that there are no errors, and any nonzero value indicates an error. You can model multiple error conditions in the controller logic with multiple nonzero values that can be fed as input to this flag. Do not use the values 1, 69, 88, and 99 for setting error conditions using the Flag output port (these values are reserved for other error conditions).
In the template, the Sample Time of the Flight Control System subsystem is 5 ms. Do not change this Sample Time.
For details about setting up the hardware and deploying the model, see Setting Up the Hardware and Deploying the Model.