Assessing Agreement between Raters from the Point of Coefficients and Loglinear Models
Volume 15, Issue 1 (2017), pp. 1–24
Pub. online: 4 August 2022
Type: Research Article
Open Access
Abstract
In square contingency tables, analysis of the agreement between row and column classifications is of interest. For nominal categories, the kappa coefficient is used to summarize the degree of agreement between two raters, and numerous extensions and generalizations of kappa statistics have been proposed in the literature. In addition to the kappa coefficient, several authors describe agreement in terms of log-linear models. This paper focuses on approaches to the study of inter-rater agreement for contingency tables with nominal or ordinal categories and multiple raters. We present a detailed overview of agreement studies and illustrate the use of these approaches in the evaluation of agreement through three numerical examples.
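For orientation, the standard two-rater Cohen's kappa referred to above can be sketched as follows; the notation (cell proportions p_ij with row and column marginals p_i+ and p_+i) is a common textbook convention and is not quoted from the paper itself.

% Cohen's kappa for a square two-rater table with cell proportions p_{ij}
% (standard form; notation assumed, not taken verbatim from the paper)
\[
  \kappa = \frac{p_o - p_e}{1 - p_e},
  \qquad
  p_o = \sum_{i} p_{ii},
  \qquad
  p_e = \sum_{i} p_{i+}\, p_{+i},
\]
% where p_o is the observed proportion of agreement (the diagonal total)
% and p_e is the agreement expected by chance under independent ratings.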