Cohen's Kappa is a statistic sometimes calculated to measure inter-rater reliability: the extent to which two researchers give the same ratings when evaluating the same material. Unlike simple percentage agreement, it adjusts for the agreement that would be expected by chance alone.
Read about inter-rater reliability
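Cohen's Kappa is defined as κ = (pₒ − pₑ)/(1 − pₑ), where pₒ is the observed proportion of agreement and pₑ is the agreement expected by chance from each rater's label frequencies. A minimal sketch of this calculation, assuming the two raters' judgements are given as equal-length lists of categorical labels (the function name and example data are illustrative):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical ratings of the same items."""
    assert len(rater_a) == len(rater_b) and rater_a, "need matched, non-empty ratings"
    n = len(rater_a)
    # Observed agreement: proportion of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal label frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[label] * counts_b.get(label, 0) for label in counts_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical researchers coding six interview excerpts:
a = ["yes", "yes", "no", "yes", "no", "no"]
b = ["yes", "no", "no", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # → 0.333
```

Here the raters agree on 4 of 6 items (pₒ ≈ 0.67), but since each rater used "yes" and "no" equally often, chance agreement is pₑ = 0.5, giving κ ≈ 0.33, a much more modest figure than the raw 67% agreement suggests.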
Prof. Keith S. Taber's site