vision.HistogramBasedTracker System object

Package: vision

Histogram-based object tracking


The histogram-based tracker incorporates the continuously adaptive mean shift (CAMShift) algorithm for object tracking. It uses the histogram of pixel values to identify the tracked object.

Use the step method with input image, I, the histogram-based tracker object H, and any optional properties to identify the bounding box of the tracked object.

BBOX = step(H,I) returns the bounding box, BBOX, of the tracked object. The bounding box output is in the format [x y width height]. Before calling the step method, you must identify the object to track and set the initial search window. Use the initializeObject method to do this.

[BBOX,ORIENTATION] = step(H,I) additionally returns the angle between the x-axis and the major axis of the ellipse that has the same second-order moments as the object. The range of the returned angle can be from –pi/2 to pi/2.

[BBOX,ORIENTATION,SCORE] = step(H,I) additionally returns a confidence score indicating whether the returned bounding box, BBOX, contains the tracked object. SCORE is between 0 and 1, with 1 representing the greatest confidence.

Use the initializeObject method before calling the step method in order to set the object to track and to set the initial search window.

initializeObject(H,I,R) sets the object to track by extracting it from the [x y width height] region R located in the 2-D input image, I. The input image, I, can be any 2-D feature map that distinguishes the object from the background. For example, the image can be a hue channel of the HSV color space. Typically, I will be the first frame in which the object appears. The region, R, is also used for the initial search window, in the next call to the step method. For best results, the object must occupy the majority of the region, R.

initializeObject(H,I,R,N) additionally lets you specify N, the number of histogram bins. By default, N is 16. Increasing N enhances the tracker's ability to discriminate the object. However, it also narrows the range of changes in the object's visual characteristics that the tracker can accommodate, which increases the likelihood of losing track.
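For example, assuming hueFrame is the hue channel of a frame containing the object and objectRegion is an [x y width height] vector around it (both hypothetical names), a finer 32-bin histogram can be requested at initialization:

```matlab
% Create the tracker and initialize it with 32 histogram bins
% instead of the default 16. hueFrame and objectRegion are assumed
% to be defined as described above.
tracker = vision.HistogramBasedTracker;
initializeObject(tracker, hueFrame, objectRegion, 32);
```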

    Tip:   You can improve the computational speed of the HistogramBasedTracker object by setting the class of the input image to uint8.
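For instance (a sketch, assuming frame is an RGB video frame and H is an initialized tracker), the hue channel can be converted to uint8 with im2uint8 before tracking:

```matlab
% Convert the hue channel to uint8 to speed up the tracker.
hsv = rgb2hsv(frame);
hue8 = im2uint8(hsv(:,:,1));
bbox = step(H, hue8);
```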

    Note:   Starting in R2016b, instead of using the step method to perform the operation defined by the System object™, you can call the object with arguments, as if it were a function. For example, y = step(obj,x) and y = obj(x) perform equivalent operations.
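Using the names from this page (H is an initialized tracker and I is a hue-channel image), the two calling syntaxes look like this:

```matlab
% Equivalent operations in R2016b and later:
bbox = step(H, I);   % step-method syntax
bbox = H(I);         % function-call syntax
```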


H = vision.HistogramBasedTracker returns a System object, H, that tracks an object by using the CAMShift algorithm. It uses the histogram of pixel values to identify the tracked object. To initialize the tracking process, you must use the initializeObject method to specify an exemplar image of the object. After you specify the image of the object, use the step method to track the object in consecutive video frames.

H = vision.HistogramBasedTracker(Name,Value) returns a tracker System object, H, with properties set using one or more name-value pair arguments. Unspecified properties have default values.



ObjectHistogram

Normalized pixel value histogram

An N-element vector that specifies the normalized histogram of the object's pixel values. Histogram values must be normalized to between 0 and 1. Use the initializeObject method to set this property. This property is tunable.

Default: []
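As a sketch, the property can be inspected after initialization (hueFrame and objectRegion are assumed names for a hue-channel image and an [x y width height] region around the object):

```matlab
% Initialize the tracker, then read back the normalized histogram.
tracker = vision.HistogramBasedTracker;
initializeObject(tracker, hueFrame, objectRegion);
h = tracker.ObjectHistogram;  % 16-element vector of values in [0,1]
```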


initializeObject - Set object to track
initializeSearchWindow - Initialize search window
step - Histogram-based object tracking
Common to All System Objects

clone - Create System object with same property values
getNumInputs - Expected number of inputs to a System object
getNumOutputs - Expected number of outputs of a System object
isLocked - Check locked states of a System object (logical)
release - Allow System object property value changes



Track and display a face in each frame of an input video.

Create System objects for reading and displaying video and for drawing a bounding box of the object.

videoFileReader = vision.VideoFileReader('vipcolorsegmentation.avi');
videoPlayer = vision.VideoPlayer();
shapeInserter = vision.ShapeInserter('BorderColor','Custom', ...
    'CustomBorderColor',[1 0 0]);

Read the first video frame, which contains the object. Convert the image to HSV color space. Then define and display the object region.

objectFrame = step(videoFileReader);
objectHSV = rgb2hsv(objectFrame);
objectRegion = [40, 45, 25, 25];
objectImage = step(shapeInserter, objectFrame, objectRegion);
figure
imshow(objectImage)
title('Red box shows object region')

(Optionally, you can select the object region using your mouse. The object must occupy the majority of the region. Use the following command.)

% figure; imshow(objectFrame); objectRegion=round(getPosition(imrect))

Set the object, based on the hue channel of the first video frame.

tracker = vision.HistogramBasedTracker;
initializeObject(tracker, objectHSV(:,:,1), objectRegion);

Track and display the object in each video frame. The while loop reads each image frame, converts the image to HSV color space, then tracks the object in the hue channel where it is distinct from the background. Finally, the example draws a box around the object and displays the results.

while ~isDone(videoFileReader)
  frame = step(videoFileReader);
  hsv = rgb2hsv(frame);
  bbox = step(tracker, hsv(:,:,1));

  out = step(shapeInserter, frame, bbox);
  step(videoPlayer, out);
end

Release the video reader and player.

release(videoFileReader);
release(videoPlayer);

[1] Bradski, G. and A. Kaehler, Learning OpenCV: Computer Vision with the OpenCV Library, O'Reilly Media, Inc.: Sebastopol, CA, 2008.


Introduced in R2012a
