
5.0 | 1 rating · 12 Downloads (last 30 days) · File Size: 3.25 KB · File ID: #15426 · Version: 1.2

Fleiss'es kappa



25 Jun 2007 (Updated)

Compute Fleiss' kappa for multiple raters


File Information

Fleiss' kappa is a generalization of Scott's pi statistic, a statistical measure of inter-rater reliability; it is also related to Cohen's kappa statistic. Whereas Scott's pi and Cohen's kappa work for only two raters, Fleiss' kappa works for any number of raters giving categorical ratings (see nominal data) to a fixed number of items. It can be interpreted as expressing the extent to which the observed agreement among raters exceeds what would be expected if all raters made their ratings completely at random. Agreement can be thought of as follows: if a fixed number of people assign categorical ratings to a number of items, kappa gives a measure of how consistent the ratings are. A kappa of 1 indicates perfect agreement; values at or below 0 indicate agreement no better than chance.
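The computation behind the statistic can be sketched in a few lines. This is a plain-Python illustration of the standard Fleiss (1971) formulas, not the `fleiss.m` source; the function name `fleiss_kappa` and the example table (the classic worked example with 14 raters, 10 subjects, 5 categories) are for demonstration only:

```python
# Fleiss' kappa from an n-by-k count table:
# x[i][j] = number of raters who assigned subject i to category j;
# every row must sum to the same number of raters m.

def fleiss_kappa(x):
    n = len(x)            # subjects
    m = sum(x[0])         # raters per subject (assumed constant)
    k = len(x[0])         # categories

    # p_j: overall proportion of ratings falling in category j
    p = [sum(row[j] for row in x) / (n * m) for j in range(k)]

    # P_i: observed agreement on subject i
    P = [(sum(v * v for v in row) - m) / (m * (m - 1)) for row in x]

    P_bar = sum(P) / n                 # mean observed agreement
    P_e = sum(pj * pj for pj in p)     # agreement expected by chance
    return (P_bar - P_e) / (1 - P_e)

# Classic worked example: 14 raters, 10 subjects, 5 categories.
ratings = [
    [0, 0, 0, 0, 14],
    [0, 2, 6, 4, 2],
    [0, 0, 3, 5, 6],
    [0, 3, 9, 2, 0],
    [2, 2, 8, 1, 1],
    [7, 7, 0, 0, 0],
    [3, 2, 6, 3, 0],
    [2, 5, 3, 2, 2],
    [6, 5, 2, 1, 0],
    [0, 2, 2, 3, 7],
]
print(round(fleiss_kappa(ratings), 3))   # 0.21
```

A kappa of about 0.21 means the raters agree only slightly more than chance would predict; a table where every row puts all m votes in one category returns exactly 1.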

You can visit my homepage
My profile on XING
My profile on LinkedIn

MATLAB release MATLAB 7.3 (R2006b)
Comments and Ratings (10)
23 Feb 2015 Giuseppe Cardillo

No, this file is open code: you can modify it as you want, provided you acknowledge me.

Comment only
20 Feb 2015 Daniel Golden

Thanks for contributing this, Giuseppe. I moved the submission to my own git repository in order to make a few changes and facilitate making changes in the future:

Let me know if that's OK, or whether you'd prefer to use your own git repository that others can fork.

My modifications are in the current master and your FEX version from 23 Dec 2009 is

28 Jun 2012 Giuseppe Cardillo

No, you are right.

Comment only
27 Jun 2012 nicolas

Sorry, my mistake: the pj are indeed different. But the kj and zj are not.

With j=2, sum(x.*(m-x)) yields two identical values. Since observers can choose only between category 1 and category 2, n votes for cat 1 imply m-n votes for cat 2.

The term b=pj.*(1-pj) also yields two identical values with j=2.

Am I wrong?

Comment only
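The two-category symmetry discussed in this thread is easy to check numerically. A Python sketch with made-up vote counts (`x1` and `m` are illustrative, not from the submission):

```python
# With two categories, column 2 mirrors column 1 (x2 = m - x1),
# so the quantities entering the per-category kappas coincide.
m = 5                          # raters per subject
x1 = [3, 5, 0, 2, 4]           # votes for category 1 on each subject
x2 = [m - v for v in x1]       # votes for category 2

# the sum(x.*(m-x)) term is element-wise identical for both columns
assert [v * (m - v) for v in x1] == [v * (m - v) for v in x2]

# the pj differ (p2 = 1 - p1), but pj.*(1-pj) coincide
p1 = sum(x1) / (len(x1) * m)
p2 = sum(x2) / (len(x2) * m)
assert p1 != p2
assert abs(p1 * (1 - p1) - p2 * (1 - p2)) < 1e-12
```

This is why, with exactly two categories, k1 always equals k2 even though p1 and p2 differ.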
22 Jun 2012 Giuseppe Cardillo

Of course not. Pj is a function of Kj, and K1=K2 if and only if the sum(x.*(m-x)) values are equal (m is the number of raters).

Comment only
22 Jun 2012 nicolas

I tried your code with only 2 categories (so j=2). For all the data sets I tested, I always got identical values for k1 and k2, and for p1 and p2...

Is this theoretically normal? When j=2, is k1=k2?

Thank you for your answer.

Comment only
25 Sep 2009 Adrian ADEWUNMI

I am pleased to announce that I have solved the problem.

Thanks to Giuseppe Cardillo for this MATLAB function... good job.

Comment only
25 Sep 2009 Adrian ADEWUNMI

Whenever I input any matrix other than a 5 x 10 matrix into MATLAB, your function fleiss(X) gives the following error message:

EDU>> fleiss(X)
??? Error using ==> fleiss at 107
The raters are not the same for each rows

Can you tell me how to fix this?

Comment only
28 Jun 2007 Giuseppe Cardillo

Fleiss' kappa is an overall valuation of agreement. It doesn't recognize differences among individual raters; I think that can be done with Cohen's kappa.
An example of the use of Fleiss' kappa may be the following: consider 14 psychiatrists who are asked to look at ten patients. Each psychiatrist gives one of five possible diagnoses to each patient. Fleiss' kappa can be computed to show the degree of agreement among the psychiatrists above the level of agreement expected by chance.

Comment only
26 Jun 2007 Amy Graham

I think this m-file works with rates, not raters.

Comment only
28 Jun 2007

Corrections in help lines

26 Sep 2007

new output edited

27 May 2008

There is some numerical inaccuracy, so that r*(1/r)' is not numerically equal to a square matrix of ones even when all elements of r are equal. I have therefore changed the test that checks that the number of raters is the same for each row.
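The rounding issue this entry describes is easy to reproduce. A Python illustration (IEEE-754 double precision, the same arithmetic MATLAB uses):

```python
# r * (1/r) need not round back to exactly 1.0 in double precision:
r = 49.0
assert r * (1 / r) != 1.0          # off by one ulp

# a robust check compares against a tolerance instead of exact equality
assert abs(r * (1 / r) - 1.0) < 1e-12
```

This is why an exact equality test against a matrix of ones can fail even for valid input, and a tolerance-based (or count-based) check is safer.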

12 Jun 2008

NORMCDF was replaced by ERFC, so the Statistics Toolbox is no longer needed.

24 Sep 2008

Improvement in input error handling

12 Nov 2008 1.1

Changes in help section

23 Dec 2009 1.2

Changes in description
