Every production request flows through the same observability system you used during development. The Monitor page provides custom dashboards to track performance, costs, errors, and quality metrics across your deployed prompts and functions.

View production metrics

The Monitor page shows custom dashboards for tracking deployed prompts and functions. For details on creating custom charts, filtering data, selecting timeframes, and configuring dashboards, see Monitor with dashboards. Production-specific metrics include:
  • Request count: Volume of production traffic
  • Latency: Response time (total duration, time to first token)
  • Token count: Prompt tokens, completion tokens, and total usage
  • Cost: Estimated spend based on model pricing
  • Scores: Quality metrics from online scoring
  • Tools: Tool call frequency and success rates
Filter by production environments, models, or errors to focus on specific segments. Group by model, environment, user, or custom metadata to analyze patterns across your deployments.
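These grouping options correspond to ordinary aggregations over per-request records. As a rough sketch of what a model-level breakdown computes (the field names here are hypothetical, not an export schema from the platform):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-request records, similar to what a request log export might contain.
requests = [
    {"model": "model-a", "environment": "production", "latency_s": 1.2,
     "prompt_tokens": 310, "completion_tokens": 95, "error": False},
    {"model": "model-b", "environment": "production", "latency_s": 0.4,
     "prompt_tokens": 280, "completion_tokens": 60, "error": True},
]

# Group by model, then aggregate the metrics the dashboard charts per group.
by_model = defaultdict(list)
for r in requests:
    by_model[r["model"]].append(r)

for model, rows in by_model.items():
    print(
        model,
        "requests:", len(rows),
        "avg latency (s):", round(mean(r["latency_s"] for r in rows), 2),
        "total tokens:", sum(r["prompt_tokens"] + r["completion_tokens"] for r in rows),
        "error rate:", round(sum(r["error"] for r in rows) / len(rows), 2),
    )
```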

Set up alerts

Configure alerts to notify you when metrics exceed thresholds:
  1. Navigate to Configuration > Automations
  2. Click + Alert
  3. Define your conditions using SQL queries
  4. Set notification channels (email, Slack, webhooks)
Example alerts:
  • Error rate exceeds 5% for 10 minutes
  • Average latency above 2 seconds
  • Daily cost exceeds budget threshold
  • Score drops below 0.8
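To make the first condition concrete, here is a minimal sketch of the check an alert of this kind performs, written against an in-memory list of events rather than the platform's SQL interface (the function and field names are illustrative only):

```python
from datetime import datetime, timedelta

def error_rate_breached(events, now, window=timedelta(minutes=10), threshold=0.05):
    """events: list of (timestamp, is_error) tuples for recent requests."""
    recent = [(t, err) for t, err in events if now - t <= window]
    if not recent:
        return False
    rate = sum(err for _, err in recent) / len(recent)
    return rate > threshold

# One request per minute over the last 15 minutes; every third request errored.
now = datetime(2024, 1, 1, 12, 0)
events = [(now - timedelta(minutes=m), m % 3 == 0) for m in range(15)]
print(error_rate_breached(events, now))  # True: ~36% of requests in the last 10 minutes errored
```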
See Alerts for detailed configuration.

Track costs

Cost charts estimate spending based on model pricing. Costs are calculated from:
  • Token counts (prompt and completion)
  • Model pricing rates
  • Provider-specific pricing tiers
Cost estimates are approximate. Actual billing from providers may vary based on rate limits, batch discounts, and other factors.
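The per-request estimate is simply token counts multiplied by the model's per-token rates. A minimal sketch of that arithmetic, using placeholder rates rather than any provider's current pricing:

```python
# Placeholder per-million-token rates in USD; real provider pricing differs and changes over time.
PRICING = {
    "example-model": {"prompt": 2.50, "completion": 10.00},
}

def estimate_cost(model, prompt_tokens, completion_tokens):
    rates = PRICING[model]
    return (prompt_tokens * rates["prompt"]
            + completion_tokens * rates["completion"]) / 1_000_000

# 1,200 prompt tokens and 350 completion tokens -> $0.0065 at these rates.
print(f"${estimate_cost('example-model', 1200, 350):.4f}")
```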

Monitor quality

Online scoring automatically evaluates production requests. View score distributions and trends in the Monitor page:
  • Group by score name to compare different quality metrics
  • Filter by low scores to find problematic requests
  • Track score changes over time to detect quality regressions
Configure online scoring in Configuration > Online scoring. See Score online for details.
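As an illustration of the trend check described above (not the platform's scoring API), a quality regression can be flagged by comparing the recent average of a score against a baseline window; the function and thresholds below are assumptions for the sketch:

```python
from statistics import mean

def score_regressed(scores, baseline_n=100, recent_n=20, drop=0.1):
    """scores: chronological list of per-request score values (e.g. 0.0-1.0)."""
    if len(scores) < baseline_n + recent_n:
        return False  # not enough data to compare yet
    baseline = mean(scores[-(baseline_n + recent_n):-recent_n])
    recent = mean(scores[-recent_n:])
    return recent < baseline - drop

scores = [0.9] * 100 + [0.7] * 20  # quality dropped in the latest requests
print(score_regressed(scores))  # True: recent average fell more than 0.1 below baseline
```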

Next steps