πŸ“• Node [[inter rater_agreement]]
πŸ“„ Inter-Rater_Agreement.md by @KGBicheno

inter-rater agreement

Go back to the [[AI Glossary]]

A measurement of how often human raters agree when doing a task. If raters disagree, the task instructions may need to be improved. Also sometimes called inter-annotator agreement or inter-rater reliability. See also Cohen's kappa, which is one of the most popular inter-rater agreement measurements.
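As a minimal sketch of how Cohen's kappa captures agreement beyond chance, the snippet below computes it directly from two raters' labels; the label names, item counts, and the `cohen_kappa` helper are illustrative, not from the original note (libraries such as scikit-learn offer an equivalent `cohen_kappa_score`).

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)

    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Expected chance agreement, from each rater's own label distribution.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((counts_a[label] / n) * (counts_b[label] / n)
              for label in set(rater_a) | set(rater_b))

    # Kappa = (observed - chance) / (1 - chance).
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters labelling six items as spam / not spam.
a = ["spam", "spam", "not spam", "spam", "not spam", "not spam"]
b = ["spam", "not spam", "not spam", "spam", "not spam", "spam"]
print(cohen_kappa(a, b))  # ~0.33, i.e. only fair agreement beyond chance
```

A low kappa like the one above is the kind of signal that suggests the task instructions may need to be clarified.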
