
Human review
/'hyoo.muhn ree'vyoo/ (noun)
A workflow in which team members manually examine and score traces or dataset records through a dedicated UI. Human review is often reserved for high-risk or high-impact cases.
“We scheduled human review for the most sensitive edge cases before the launch.”
Customer example
Navan's ops team reviews only the subset of calls that the eval system flagged for follow-up.
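The pattern in the example above, sending only flagged or low-scoring records to human reviewers, can be sketched in a few lines. This is a minimal illustration, not any product's API; the `Record` class, `flagged` field, and `review_queue` function are all hypothetical names chosen for this sketch.

```python
from dataclasses import dataclass

@dataclass
class Record:
    """A hypothetical trace/record with an automated eval result."""
    id: str
    score: float   # automated eval score in [0.0, 1.0]
    flagged: bool  # set by the eval system to request follow-up

def review_queue(records, score_threshold=0.5):
    """Select the subset of records that need manual human review:
    anything the eval system flagged, or anything scoring below
    the threshold."""
    return [r for r in records if r.flagged or r.score < score_threshold]

records = [
    Record("call-1", score=0.92, flagged=False),
    Record("call-2", score=0.31, flagged=False),  # low score
    Record("call-3", score=0.88, flagged=True),   # explicitly flagged
]

queue = review_queue(records)
print([r.id for r in queue])  # only the flagged/low-scoring subset
```

The design choice mirrors the definition: automated evals run over everything, while scarce reviewer time is spent only on the high-risk subset the system surfaces.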