Krippendorff's Alpha is a measure of inter-rater agreement: it quantifies how much raters (labellers, coders) agree on the labels assigned to items. It operates on different levels of measurement; nominal, ordinal and interval are implemented here. In limited cases it is identical to Fleiss' Kappa, but Krippendorff's Alpha is applicable to a wider range of problems and can handle missing entries, as it does not require the same number of raters for each item.
This implementation takes a matrix of observations and the desired level of measurement as input and computes Alpha. Results have been verified against an existing SPSS macro.
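The actual implementation is not reproduced here. As an illustration of the coincidence-based computation (alpha = 1 - observed/expected disagreement), a minimal sketch for the nominal level could look like the following; the function name, the use of `None` for missing entries, and the `metric` parameter are assumptions of this sketch, not necessarily the interface of the verified implementation:

```python
from itertools import permutations

def krippendorff_alpha(data, metric=lambda a, b: a != b):
    """Krippendorff's Alpha from a raters x items matrix.

    data:   list of per-rater rating lists; None marks a missing rating.
    metric: squared-difference function delta^2(a, b); the default
            (inequality, 0/1) yields the nominal level of measurement.
    """
    # Group ratings by item, dropping missing entries; items rated by
    # fewer than two raters contribute no pairable information.
    units = [
        [v for v in item if v is not None]
        for item in zip(*data)
    ]
    units = [u for u in units if len(u) >= 2]

    n = sum(len(u) for u in units)  # total number of pairable ratings

    # Observed disagreement: mean pairwise distance within each unit,
    # each unit's ordered pairs weighted by 1 / (m_u - 1).
    d_o = sum(
        sum(metric(a, b) for a, b in permutations(u, 2)) / (len(u) - 1)
        for u in units
    ) / n

    # Expected disagreement: mean pairwise distance over all ratings pooled.
    pooled = [v for u in units for v in u]
    d_e = sum(metric(a, b) for a, b in permutations(pooled, 2)) / (n * (n - 1))

    # All ratings identical: alpha is formally undefined; treat as agreement.
    if d_e == 0:
        return 1.0
    return 1.0 - d_o / d_e
```

Passing, for example, `metric=lambda a, b: (a - b) ** 2` would give interval-level alpha instead of nominal; ordinal requires a rank-based distance and is omitted from this sketch.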
For more details on Krippendorff's Alpha, see http://en.wikipedia.org/wiki/Krippendorff%27s_alpha