Cohen's kappa coefficient is a statistical measure of inter-rater reliability. It is generally considered a more robust measure than simple percent agreement, since kappa takes into account the agreement occurring by chance.
Kappa provides a measure of the degree to which two judges, A and B, concur in their respective sortings of N items into k mutually exclusive categories. A 'judge' in this context can be an individual human being, a set of individuals who sort the N items collectively, or some non-human agency, such as a computer program or diagnostic test, that performs a sorting on the basis of specified criteria.

The original and simplest version of kappa is the unweighted kappa coefficient introduced by J. Cohen in 1960. When the categories are merely nominal, Cohen's simple unweighted coefficient is the only form of kappa that can meaningfully be used. If the categories are ordinal, so that category 2 represents more of something than category 1, category 3 represents more of that same something than category 2, and so on, then it is potentially meaningful to take this into account, weighting each cell of the matrix according to how near it is to the cell in that row that contains the absolutely concordant items. This function can compute either linear or quadratic weights.
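The function itself is MATLAB; as an illustrative sketch only (not the author's code), the unweighted and weighted coefficients can be computed from a k-by-k confusion matrix as below, assuming the usual agreement weights w_ij = 1 - |i-j|/(k-1) for the linear scheme and w_ij = 1 - ((i-j)/(k-1))^2 for the quadratic scheme:

```python
import numpy as np

def weight_matrix(k, scheme=None):
    """Agreement weights: 1 on the diagonal, decreasing with distance from it."""
    i, j = np.indices((k, k))
    if scheme is None:                        # unweighted (nominal categories)
        return (i == j).astype(float)
    d = np.abs(i - j) / (k - 1)               # normalized distance from the diagonal
    if scheme == 'linear':
        return 1.0 - d
    if scheme == 'quadratic':
        return 1.0 - d ** 2
    raise ValueError(scheme)

def cohen_kappa(X, scheme=None):
    """Cohen's kappa (optionally weighted) from a square confusion matrix X."""
    X = np.asarray(X, dtype=float)
    n = X.sum()
    w = weight_matrix(X.shape[0], scheme)
    p_obs = (w * X).sum() / n                              # (weighted) observed agreement
    expected = np.outer(X.sum(axis=1), X.sum(axis=0)) / n  # chance counts from the marginals
    p_exp = (w * expected).sum() / n                       # (weighted) chance agreement
    return (p_obs - p_exp) / (1.0 - p_exp)

# Example: two judges, two categories
print(cohen_kappa([[20, 5], [10, 15]]))  # 0.4
```

Note that with k = 2 categories both weighting schemes reduce to the unweighted case, so weighting only matters for three or more ordinal categories.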

The output of this function is:
- Observed agreement percentage
- Random agreement percentage
- Agreement percentage due to true concordance
- Residual not random agreement percentage
- Cohen's kappa
- kappa error
- kappa confidence interval
- Maximum possible kappa
- k observed as proportion of maximum possible
- k benchmarks by Landis and Koch
- z test results
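As a sketch of how the main reported quantities relate (again Python rather than the function's MATLAB; the standard error and z test here use common large-sample approximations, which may differ in detail from the function's own implementation):

```python
import math

def kappa_report(X, z_crit=1.959963984540054):
    """Illustrative computation of kappa-related quantities from a square
    confusion matrix X (list of lists of counts). kappa follows Cohen (1960)."""
    k = len(X)
    n = float(sum(sum(row) for row in X))
    rows = [sum(row) for row in X]                              # judge A's marginals
    cols = [sum(X[i][j] for i in range(k)) for j in range(k)]   # judge B's marginals
    po = sum(X[i][i] for i in range(k)) / n                     # observed agreement
    pe = sum(rows[i] * cols[i] for i in range(k)) / n ** 2      # random agreement
    kappa = (po - pe) / (1 - pe)
    po_max = sum(min(rows[i], cols[i]) for i in range(k)) / n
    kappa_max = (po_max - pe) / (1 - pe)        # ceiling imposed by the marginals
    se = math.sqrt(po * (1 - po) / n) / (1 - pe)  # SE for the confidence interval
    ci = (kappa - z_crit * se, kappa + z_crit * se)
    # z test of H0: kappa = 0, using the standard error under the null
    var0 = (pe + pe ** 2 - sum((rows[i] / n) * (cols[i] / n) * (rows[i] / n + cols[i] / n)
                               for i in range(k))) / n
    z = kappa / (math.sqrt(var0) / (1 - pe))
    return {'po': po, 'pe': pe, 'kappa': kappa, 'se': se, 'ci': ci,
            'kappa_max': kappa_max, 'k_ratio': kappa / kappa_max, 'z': z}

def landis_koch(kappa):
    """Landis & Koch (1977) verbal benchmark for a kappa value."""
    for bound, label in [(0.0, 'poor'), (0.2, 'slight'), (0.4, 'fair'),
                         (0.6, 'moderate'), (0.8, 'substantial')]:
        if kappa <= bound:
            return label
    return 'almost perfect'
```

For the 2x2 table [[20, 5], [10, 15]], for instance, this gives po = 0.7, pe = 0.5, kappa = 0.4 ('fair' on the Landis-Koch scale), and kappa_max = 0.8, so the observed kappa is 50% of the maximum attainable given the marginals.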

1 & 2) Yes. The confusion matrix is a square matrix, so the function will compute kappa. Cohen's kappa is used to test the agreement between judges: if they classify "objects" into 16 categories, you will have a 16x16 square matrix, and on the main diagonal you will have the "objects" that both judges classified into the same category.
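To make the diagonal point concrete, here is a small Python sketch with hypothetical labels that builds a confusion matrix from two judges' classifications; the main diagonal holds the items they place in the same category:

```python
import numpy as np

# Hypothetical class labels (0..2) assigned by two judges to 10 items
judge_a = [0, 0, 1, 1, 1, 2, 2, 2, 0, 1]
judge_b = [0, 1, 1, 1, 2, 2, 2, 0, 0, 1]

k = 3
cf_mat = np.zeros((k, k), dtype=int)
for a, b in zip(judge_a, judge_b):
    cf_mat[a, b] += 1            # row = judge A's category, column = judge B's

agreements = np.trace(cf_mat)    # items on the main diagonal
print(cf_mat)
print("observed agreement:", agreements / len(judge_a))  # 0.7
```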
3) No; see the help section.

I have a confusion matrix (dimension 16x16) resulting from a classification into 16 classes.

I use >> kappa(cf_mat);

1) If I give this matrix to your function, will it calculate the kappa coefficient for this classification? You only specify X as a square data matrix, not as a confusion matrix.

2) Does your function also work on multi-class data?

3) Do I need to provide weights if the classes are not balanced?
