
Inter-annotator agreement (IAA)

/ih'nter a.nuh'tay.ter uh'gree.muhnt eye.ay.ay/ (noun)

A measure of how consistently multiple human annotators label the same items. Low IAA is usually a sign that the rubric or schema needs clearer definitions.

Certain model responses had low IAA, so we made edits to our rubric.
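One common way to quantify IAA between two annotators is Cohen's kappa, which corrects raw agreement for the agreement expected by chance. The sketch below is a minimal, self-contained illustration (not Braintrust's implementation); the function name and example labels are hypothetical.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators labeling the same items."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: fraction of items both annotators label identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: derived from each annotator's overall label distribution.
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    # Kappa rescales observed agreement onto [chance, perfect] -> [0, 1].
    return (p_o - p_e) / (1 - p_e)

# Hypothetical pass/fail labels from two annotators on the same four items:
kappa = cohens_kappa(["pass", "pass", "fail", "pass"],
                     ["pass", "fail", "fail", "pass"])
print(kappa)  # prints 0.5
```

A kappa near 1 indicates strong agreement, near 0 indicates agreement no better than chance; low values are a cue to tighten the rubric before trusting the labels.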
