Code covered by the BSD License  

27 Downloads (last 30 days) | File Size: 1.57 MB | File ID: #27652

Objective evaluation of binarization methods for document images

by

Reza Farrahi Moghaddam and Hossein Ziaei Nafchi, Synchromedia Lab, ETS, Montreal, Canada

18 May 2010 (Updated 17 Sep 2013)

Several measures are implemented to evaluate the output of document image binarization methods.


File Information
Description

% This function can be used to objectively evaluate the performance of binarization methods for document images.
%
% December 27th, 2012, By Reza FARRAHI MOGHADDAM and Hossein ZIAEI NAFCHI, Synchromedia Lab, ETS, Montreal, Canada
% May 17th, 2010, By Reza FARRAHI MOGHADDAM, Synchromedia Lab, ETS, Montreal, Canada
%
% The implemented measures are as follows [1]:
% Precision:
% Recall:
% Fmeasure: (used as one of the measures in "Document Image Binarization Contest" (DIBCO'09) in ICDAR'09)
% Sensitivity: (the same as Recall)
% Specificity:
% BCR: The balanced classification rate
% AUC: (The same as BCR)
% BER: The balanced error rate
% SFmeasure: F-measure based on sensitivity and specificity
% Accuracy:
% GAccuracy: Geometric mean of sensitivity and specificity (used as the measure in the "Quantitative evaluation of binarization algorithms of images of historical documents with bleeding noise" contest at ICFHR'10)
% pFMeasure: pseudo F-Measure
% NRM: Negative rate metric
% PSNR: Peak signal-to-noise ratio
% DRD: Distance reciprocal distortion metric [2]
% MPM: Misclassification penalty metric [3]
%
% [1] M. Sokolova and G. Lapalme, A systematic analysis of performance
% measures for classification tasks, Information Processing & Management,
% 45, pp. 427-437, 2009. DOI: 10.1016/j.ipm.2009.03.002
%
% [2] H. Lu, A. C. Kot, Y. Q. Shi, Distance-Reciprocal Distortion Measure
% for Binary Document Images, IEEE Signal Processing Letters, vol. 11,
% no. 2, pp. 228-231, 2004.
%
% [3] D. P. Young, J. M. Ferryman, PETS Metrics: On-Line Performance
% Evaluation Service, ICCCN '05 Proceedings of the 14th International
% Conference on Computer Communications and Networks, pp. 317-324, 2005.
%
% USAGE:
% temp_obj_eval = objective_evaluation_core(u, u0_GT, u0_skl_GT);
% where
% u is: the input binarized image to be evaluated.
% u0_GT: is the ground-truth binarized image.
% u0_skl_GT: is an optional input for the ground-truth skeleton of u0_GT. If not specified, the skeleton is automatically calculated using the thinning method.
% temp_obj_eval: is the output. The measures can be reached as the fields of temp_obj_eval. For example:
% fprintf('Precision = %9.5f\n', temp_obj_eval.Precision);
%
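For reference, the confusion-matrix based measures in the list above can be sketched as follows. This is an illustrative Python translation, not the original MATLAB code; it assumes `u` and `u0_GT` are boolean NumPy arrays of the same shape, with True marking foreground (text) pixels, and that neither class is empty (no division-by-zero guards are included).

```python
import numpy as np

def basic_measures(u, u0_GT):
    """Confusion-matrix measures for a binarized image vs. its ground truth."""
    tp = np.sum(u & u0_GT)    # foreground correctly detected
    fp = np.sum(u & ~u0_GT)   # background labelled as foreground
    fn = np.sum(~u & u0_GT)   # foreground missed
    tn = np.sum(~u & ~u0_GT)  # background correctly detected

    precision = tp / (tp + fp)
    recall = tp / (tp + fn)                    # = Sensitivity
    specificity = tn / (tn + fp)
    fmeasure = 2 * precision * recall / (precision + recall)
    sfmeasure = 2 * recall * specificity / (recall + specificity)
    bcr = 0.5 * (recall + specificity)         # balanced classification rate
    ber = 1.0 - bcr                            # balanced error rate
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    gaccuracy = np.sqrt(recall * specificity)  # geometric mean of sens./spec.
    nrm = 0.5 * (fn / (fn + tp) + fp / (fp + tn))  # negative rate metric
    return dict(Precision=precision, Recall=recall, Fmeasure=fmeasure,
                SFmeasure=sfmeasure, Specificity=specificity, BCR=bcr,
                BER=ber, Accuracy=accuracy, GAccuracy=gaccuracy, NRM=nrm)
```

The skeleton-based pFMeasure, as well as DRD and MPM, additionally need the ground-truth skeleton and pixel-distance information, so they are not covered by this simple sketch.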

MATLAB release: MATLAB 7.14 (R2012a)
Updates
28 Dec 2012

Adding five new measures:
1. pFMeasure: pseudo F-Measure
2. NRM: Negative rate metric
3. PSNR: Peak signal-to-noise ratio
4. DRD: Distance reciprocal distortion metric
5. MPM: Misclassification penalty metric
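Of the five, PSNR is straightforward to sketch. The following illustrative Python (not the original MATLAB code) assumes images with values in {0, 1}, so the peak signal is 1 and the MSE reduces to the fraction of mismatched pixels:

```python
import numpy as np

def psnr_binary(u, u0_GT):
    """Peak signal-to-noise ratio between two {0,1}-valued images."""
    # For binary images the squared error is 1 at each mismatch, 0 elsewhere,
    # so the MSE is the mismatch rate.
    mse = np.mean((u.astype(float) - u0_GT.astype(float)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(1.0 / mse)
```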

04 Jan 2013

Minor modification.


17 Sep 2013

Added:
1. A function to calculate the indicators and their statistics for a whole database.
2. An additional average F-measure and average p-F-measure according to the DIBCO series' definition.
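Conceptually, the database-level statistics amount to averaging a per-image indicator over all (result, ground-truth) pairs. A minimal Python sketch, where `evaluate_pair` is a hypothetical stand-in for objective_evaluation_core and the plain arithmetic mean shown is only one of the possible DIBCO-style averaging conventions:

```python
def average_measure(pairs, evaluate_pair, key="Fmeasure"):
    """Average one indicator over all (result, ground_truth) pairs.

    evaluate_pair(u, gt) is assumed to return a dict with one entry per
    indicator (Precision, Recall, Fmeasure, ...), as the MATLAB struct does.
    """
    scores = [evaluate_pair(u, gt)[key] for (u, gt) in pairs]
    return sum(scores) / len(scores)
```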
