Histogram-based object tracking
The histogram-based tracker incorporates the continuously adaptive mean shift (CAMShift) algorithm for object tracking. It uses the histogram of pixel values to identify the tracked object.
Call the step method with the input image, I, the histogram-based tracker object, H, and any optional properties to identify the bounding box of the tracked object.
BBOX = step(H,I) returns the bounding box, BBOX, of the tracked object. The bounding box output is in the format [x y width height]. Before calling the step method, you must identify the object to track and set the initial search window. Use the initializeObject method to do this.
[BBOX,ORIENTATION] = step(H,I) additionally
returns the angle between the x-axis and the major axis of the ellipse
that has the same second-order moments as the object. The returned angle is between –pi/2 and pi/2.
[BBOX,ORIENTATION,SCORE] = step(H,I) additionally returns the confidence score indicating whether the returned bounding box, BBOX, contains the tracked object. The score is between 0 and 1, with the greatest confidence equal to 1.
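For example, a sketch of the full three-output syntax (assuming a tracker H already initialized with initializeObject, a 2-D feature map I, and an illustrative confidence threshold):

```matlab
% H is an initialized vision.HistogramBasedTracker; I is a 2-D feature map
% such as a hue channel. Track the object in the current frame.
[bbox, orientation, score] = step(H, I);

if score < 0.5   % threshold chosen for illustration only
    % Low confidence: the object may be lost or occluded.
    disp('Tracking confidence is low.');
end
```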
Use the initializeObject method before calling the step method in order to set the object to track and to set the initial search window.
initializeObject(H,I,R) sets the object to track by extracting it from the [x y width height] region R, located in the 2-D input image, I. The input image, I, can be any 2-D feature map that distinguishes the object from the background. For example, the image can be a hue channel of the HSV color space. Typically, I will be the first frame in which the object appears. The region, R, is also used for the initial search window in the next call to the step method. For best results, the object must occupy the majority of the region, R.
initializeObject(H,I,R,N) additionally lets you specify N, the number of histogram bins. By default, N is set to 16. Increasing N enhances the ability of the tracker to discriminate the object. However, it also narrows the range of changes to the object's visual characteristics that the tracker can accommodate, and this narrower range increases the likelihood of losing track.
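A minimal sketch of initializing the tracker with a custom bin count (the hue image and region values here are placeholders for illustration):

```matlab
% Initialize with 32 histogram bins instead of the default.
tracker = vision.HistogramBasedTracker;
hue = rgb2hsv(firstFrame);       % firstFrame: first RGB frame (placeholder)
R = [40, 45, 25, 25];            % [x y width height] object region
initializeObject(tracker, hue(:,:,1), R, 32);
```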
You can improve the computational speed of the tracker by setting the class of the input image to uint8.
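For instance, a hue channel returned by rgb2hsv is of class double in [0, 1]; one way to apply this tip is to convert it with im2uint8 (a sketch, and note that the same image class should be used for both initialization and tracking):

```matlab
% Convert the hue channel to uint8 for faster tracking. Use the same
% class for both initializeObject and every subsequent step call.
hue8 = im2uint8(hsvFrame(:,:,1));   % hsvFrame: rgb2hsv output (placeholder)
bbox = step(tracker, hue8);
```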
Starting in R2016b, instead of using the step method to perform the operation defined by the System object™, you can call the object with arguments, as if it were a function. For example, y = step(obj,x) and y = obj(x) perform equivalent operations.
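With this tracker, for example, the two calling styles below are interchangeable (hueFrame is a placeholder for a 2-D feature map):

```matlab
bbox = step(tracker, hueFrame);   % step syntax
bbox = tracker(hueFrame);         % equivalent object-as-function syntax (R2016b+)
```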
H = vision.HistogramBasedTracker returns
a System object,
H, that tracks an object by
using the CAMShift algorithm. It uses the histogram of pixel values
to identify the tracked object. To initialize the tracking process,
you must use the
initializeObject method to specify
an exemplar image of the object. After you specify the image of the
object, use the
step method to track the object in
consecutive video frames.
H = vision.HistogramBasedTracker(Name,Value) returns a tracker System object, H, with one or more name-value pair arguments. Unspecified properties have default values.
Normalized pixel value histogram
An N-element vector. This vector specifies
the normalized histogram of the object's pixel values. Histogram values
must be normalized to a value between 0 and 1.
|initializeObject|Set object to track|
|initializeSearchWindow|Initialize search window|
|step|Histogram-based object tracking|
Track and display a face in each frame of an input video.
Create System objects for reading and displaying video and for drawing a bounding box of the object.
videoFileReader = vision.VideoFileReader('vipcolorsegmentation.avi');
videoPlayer = vision.VideoPlayer();
shapeInserter = vision.ShapeInserter('BorderColor','Custom', ...
    'CustomBorderColor',[1 0 0]);
Read the first video frame, which contains the object. Convert the image to HSV color space. Then define and display the object region.
objectFrame = step(videoFileReader);
objectHSV = rgb2hsv(objectFrame);
objectRegion = [40, 45, 25, 25];
objectImage = step(shapeInserter, objectFrame, objectRegion);
figure
imshow(objectImage)
title('Red box shows object region')
(Optionally, you can select the object region using your mouse. The object must occupy the majority of the region. Use the following command.)
% figure; imshow(objectFrame); objectRegion=round(getPosition(imrect))
Set the object to track, based on the hue channel of the first video frame.
tracker = vision.HistogramBasedTracker;
initializeObject(tracker, objectHSV(:,:,1), objectRegion);
Track and display the object in each video frame. The while loop reads each image frame, converts the image to HSV color space, then tracks the object in the hue channel where it is distinct from the background. Finally, the example draws a box around the object and displays the results.
while ~isDone(videoFileReader)
    frame = step(videoFileReader);
    hsv = rgb2hsv(frame);
    bbox = step(tracker, hsv(:,:,1));
    out = step(shapeInserter, frame, bbox);
    step(videoPlayer, out);
end
Release the video reader and player.
release(videoFileReader);
release(videoPlayer);
Bradski, G. and A. Kaehler, Learning OpenCV: Computer Vision with the OpenCV Library, O'Reilly Media Inc.: Sebastopol, CA, 2008.
Usage notes and limitations:
See System Objects in MATLAB Code Generation (MATLAB Coder).