
Evaluate precision metric for object detection

`averagePrecision = evaluateDetectionPrecision(detectionResults,groundTruthTable)`

`[averagePrecision,recall,precision] = evaluateDetectionPrecision(___)`

`[___] = evaluateDetectionPrecision(___,threshold)`

`averagePrecision = evaluateDetectionPrecision(detectionResults,groundTruthTable)` returns the average precision of `detectionResults` compared to `groundTruthTable`, which is used to measure the performance of the object detector. For a multiclass detector, `averagePrecision` is a vector of scores, one for each object class, in the order specified by `groundTruthTable`.
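A minimal usage sketch of this syntax follows; the class name, box coordinates, and score are illustrative assumptions, not values from this page:

```
% Hedged sketch: class name, boxes, and score below are illustrative.

% Ground truth: one column per object class; each cell holds [x y w h] boxes.
groundTruthTable = table({[856 318 39 41]}, ...
    'VariableNames', {'stopSign'});

% Detector output: one row per image, with predicted boxes and their scores.
detectionResults = table({[858 320 38 40]}, {0.95}, ...
    'VariableNames', {'Boxes', 'Scores'});

averagePrecision = evaluateDetectionPrecision(detectionResults, groundTruthTable);
```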

`[averagePrecision,recall,precision] = evaluateDetectionPrecision(___)` returns data points for plotting the precision–recall curve, using input arguments from the previous syntax.

`[___] = evaluateDetectionPrecision(___,threshold)` specifies the overlap threshold for assigning a detection to a ground truth box.
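The threshold argument can be combined with the three-output syntax to inspect the full precision–recall curve. A sketch, assuming `detectionResults` and `groundTruthTable` are already defined as above; the 0.6 threshold is an illustrative choice, not a documented default:

```
% Hedged sketch: require at least 0.6 bounding-box overlap for a detection
% to count as a match against a ground truth box (0.6 is illustrative).
[ap, recall, precision] = ...
    evaluateDetectionPrecision(detectionResults, groundTruthTable, 0.6);

% Plot the precision-recall curve for this class.
plot(recall, precision)
xlabel('Recall')
ylabel('Precision')
title(sprintf('Average Precision = %.2f', ap))
```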
