SURFPoints class

Object for storing SURF interest points


This object provides the ability to pass data between the detectSURFFeatures and extractFeatures functions. It can also be used to manipulate and plot the data returned by these functions. You can use the object to fill the points interactively. You can use this approach in situations where you might want to mix a non-SURF interest point detector with a SURF descriptor.


Although a SURFPoints object may hold many points, it is a scalar object. Therefore, numel(surfPoints) always returns 1. This value may differ from length(surfPoints), which returns the true number of points held by the object.
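A quick sketch of the distinction (assuming the Image Processing Toolbox demo image 'cameraman.tif' is available):

    points = detectSURFFeatures(imread('cameraman.tif'));
    numel(points)    % always 1, because SURFPoints is a scalar object
    length(points)   % the actual number of points held by the object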


points = SURFPoints(Location) constructs a SURFPoints object from an M-by-2 array of [x y] point coordinates, Location.

points = SURFPoints(Location,Name,Value) constructs a SURFPoints object with optional input properties specified by one or more Name,Value pair arguments. Each additional property can be specified as a scalar or a vector whose length matches the number of coordinates in Location.
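A minimal construction sketch; the coordinate and property values below are made up for illustration:

    location = [100 200; 150 250; 300 50];   % M-by-2 array of [x y] coordinates
    % Scale and Metric are optional Name,Value properties; each is given
    % here as a vector whose length matches the number of coordinates.
    points = SURFPoints(location, 'Scale', [2; 4; 8], 'Metric', [0.5; 0.3; 0.9]);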

Code Generation Support, Usage Notes, and Limitations
Compile-time constant inputs: No restriction.
Supports MATLAB Function block: No
To index locations with this object, use the syntax points.Location(idx,:), where points is a SURFPoints object. See visionRecovertformCodeGeneration_kernel.m, which is used in the Introduction to Code Generation with Feature Matching and Registration example.

Input Arguments


Location — Point coordinates
M-by-2 array of [x y] point coordinates

Point coordinates, specified as an M-by-2 array containing M [x y] points.



Properties

Count — Number of points
Number of points held by the object.

Default: 0


Location — Point coordinates
Array of [x y] point coordinates.


Scale — Scale of detected interest points
Scale at which the interest points were detected. This value must be greater than or equal to 1.6.

Default: 1.6


Metric — Strength of detected feature
Value describing the strength of the detected feature. The SURF algorithm uses the determinant of an approximated Hessian.

Default: 0.0


SignOfLaplacian — Sign of the Laplacian
Sign of the Laplacian determined during the detection process. This value must be an integer: -1, 0, or 1. You can use this property to accelerate the feature matching process.

Blobs with identical metric values but different signs of Laplacian can differ in their intensity values. For example, a white blob on a black background versus a black blob on a white background. You can use this property to quickly eliminate blobs that do not match.

For non-SURF detectors, this property is not relevant. For corner features, for example, you can simply use the default value of 0.

Default: 0
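As a sketch of how this property could speed up matching, you can discard candidates whose sign of Laplacian differs before computing descriptor distances (the threshold choice of +1 below is illustrative):

    points = detectSURFFeatures(imread('cameraman.tif'));
    % Keep only blobs with a positive sign of Laplacian; blobs with the
    % opposite sign cannot match them, so they can be eliminated up front.
    candidates = points(points.SignOfLaplacian == 1);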


Orientation — Orientation of the detected feature
Orientation of the detected feature, specified as an angle in radians. The angle is measured counterclockwise from the x-axis, with the origin specified by the Location property. Do not set this property manually. Instead, rely on the call to extractFeatures to fill in this value; extractFeatures modifies the default value of 0.0. Orientation is mainly useful for visualization purposes.

Default: 0.0
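A sketch of how Orientation gets filled in: the valid points returned by extractFeatures carry the estimated angles, which you can then visualize.

    I = imread('cameraman.tif');
    points = detectSURFFeatures(I);
    [features, validPoints] = extractFeatures(I, points);
    % validPoints.Orientation now holds the estimated angles;
    % plot oriented markers for the strongest points.
    imshow(I); hold on;
    plot(validPoints.selectStrongest(10), 'ShowOrientation', true);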


Methods

isempty — Returns true for empty object
length — Number of stored points
plot — Plot SURF points
selectStrongest — Return points with strongest metrics
size — Size of the SURFPoints object


Examples

Detect SURF Features

Read in image.

    I = imread('cameraman.tif');

Detect SURF features.

    points = detectSURFFeatures(I);

Display location and scale for the 10 strongest points.

    strongest = points.selectStrongest(10);
    imshow(I); hold on;
    plot(strongest);

Display the [x y] coordinates for the 10 strongest points on the command line.

    strongest.Location

ans =

  139.7482   95.9542
  107.4502  232.0347
  116.6112  138.2446
  105.5152  172.1816
  113.6975   48.7220
  104.4210   75.7348
  111.3914  154.4597
  106.2879  175.2709
  131.1298   98.3900
  124.2933   64.4942

Detect SURF Features and Display the Last 5 Points

Read in image.

    I = imread('cameraman.tif');

Detect SURF features.

    points = detectSURFFeatures(I);

Display the last 5 points.

    imshow(I); hold on;
    plot(points(end-4:end));



References

[1] Bay, H., A. Ess, T. Tuytelaars, and L. Van Gool. "SURF: Speeded Up Robust Features." Computer Vision and Image Understanding (CVIU). Vol. 110, No. 3, pp. 346–359, 2008.

Introduced in R2011b
