Projects organize AI features in your application. Each project contains logs, experiments, datasets, prompts, and other functions. Configure project-specific settings to customize behavior for your use case.

Create a project

  1. Navigate to your organization’s project list.
  2. Click + Project.
  3. Enter a project name.
  4. Optionally add a description.
  5. Click Create.

Add tags

Tags help organize and filter logs, datasets, and experiments:
  1. Navigate to Configuration in your project.
  2. Click Add tag.
  3. Enter tag details:
    • Name: Tag identifier.
    • Color: Visual indicator.
    • Description: Optional explanation.
  4. Click Save.
Use tags to track data by user type, feature, environment, or any custom category. Filter by tags in logs, experiments, and datasets. For more information about using tags, see View logs.
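Conceptually, filtering by tags selects the records that carry every tag you pick. This is an illustrative in-memory sketch only (the actual filtering happens in the UI or via queries), assuming each log record carries a list of tag names:

```python
def filter_by_tags(logs, required_tags):
    """Return only the logs that carry every tag in required_tags."""
    required = set(required_tags)
    return [log for log in logs if required.issubset(log.get("tags", []))]

# Hypothetical log records for illustration.
logs = [
    {"id": 1, "tags": ["production", "chatbot"]},
    {"id": 2, "tags": ["staging"]},
    {"id": 3, "tags": ["production"]},
]
matches = filter_by_tags(logs, ["production"])  # logs 1 and 3
```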

Configure human review

Define scores for manual review by users or your team:
  1. Navigate to Configuration > Human review.
  2. Click Add human review score.
  3. Configure the score:
    • Name: Score identifier.
    • Type: Continuous (0-1), categorical, or free-form text.
    • Options: For categorical scores, define possible values.
    • Write to expected: Allow setting expected values instead of scores.
    • Multiple choice: Allow selecting multiple categories.
  4. Click Save.
Review scores appear in all logs and experiments in the project. Use them for quality control, data labeling, or feedback collection. For more information, see Collect human feedback.
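The three score types above accept different kinds of values. A minimal sketch of the validation rules they imply (the function and type names here are illustrative, not part of any SDK):

```python
def validate_review_score(score_type, value, options=None):
    """Check a submitted human-review value against the score's type.

    score_type: "continuous" (a number in [0, 1]), "categorical"
    (one of a fixed set of options), or "free_form" (any text).
    """
    if score_type == "continuous":
        return isinstance(value, (int, float)) and 0 <= value <= 1
    if score_type == "categorical":
        return value in (options or [])
    if score_type == "free_form":
        return isinstance(value, str)
    raise ValueError(f"unknown score type: {score_type}")
```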

Create aggregate scores

Combine multiple scores into a single metric:
  1. Navigate to Configuration > Aggregate scores.
  2. Click Add aggregate score.
  3. Define the aggregation:
    • Name: Score identifier.
    • Type: Weighted average, minimum, or maximum.
    • Selected scores: Scores to aggregate.
    • Weights: For weighted averages, set score weights.
    • Description: Optional explanation.
  4. Click Save.
Aggregate scores appear in experiment summaries and comparisons. Use them to create composite quality metrics or overall performance indicators. For more information, see Interpret evaluation results.

Set up online scoring

Automatically evaluate production logs as they arrive:
  1. Navigate to Configuration > Online scoring.
  2. Click Add rule.
  3. Configure the rule:
    • Name: Rule identifier.
    • Scorers: Select which scorers to run.
    • Sampling rate: Percentage of logs to evaluate (1-100%).
    • Filter: Optional SQL query to select specific logs.
    • Span type: Apply to root spans or all spans.
  4. Click Save.
Online scoring runs asynchronously in the background. View results in the logs page alongside other scores. For more information, see Score logs automatically.
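The sampling rate decides, per incoming log, whether the rule's scorers run at all. A minimal sketch of that decision (illustrative, not the actual implementation):

```python
import random

def should_score(sampling_rate_pct, rng=random):
    """Decide whether to run scorers on an incoming log.

    sampling_rate_pct: percentage of logs to evaluate (1-100).
    A 100% rate scores every log; a 0% rate scores none.
    """
    return rng.random() * 100 < sampling_rate_pct
```

Sampling trades coverage for cost: a 10% rate evaluates roughly one log in ten, which is often enough to track quality trends in high-volume production traffic.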

Configure span iframes

Customize how specific span fields render in the UI:
  1. Navigate to Configuration > Span iframes.
  2. Click Add iframe.
  3. Configure rendering:
    • Field path: Which field to render (e.g., output.html).
    • iframe URL: Template for the iframe src attribute.
  4. Click Save.
Use span iframes to render HTML, charts, or custom visualizations directly in trace views. For more information, see Extend traces.
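The iframe URL is a template whose placeholders are filled from span fields. The exact template syntax is product-specific; this sketch assumes {field.path} placeholders purely for illustration:

```python
import re
from urllib.parse import quote

def render_iframe_src(template, span):
    """Fill a src template with URL-encoded span field values.

    Assumes {field.path} placeholders, e.g.
    "https://viewer.example.com/?data={output.html}" (hypothetical URL).
    """
    def lookup(match):
        value = span
        for part in match.group(1).split("."):
            value = value[part]
        return quote(str(value), safe="")
    return re.sub(r"\{([^}]+)\}", lookup, template)

src = render_iframe_src(
    "https://viewer.example.com/?data={output.html}",
    {"output": {"html": "<b>hi</b>"}},
)
```

URL-encoding the substituted value matters: span fields often contain HTML or JSON that would otherwise break the query string.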

Set comparison key

Customize how experiments match test cases:
  1. Navigate to Configuration > Comparison key.
  2. Enter a SQL expression (default: input).
  3. Click Save.
Examples:
  • input.question - Match by question field only.
  • input.user_id - Match by user.
  • [input.query, metadata.category] - Match by multiple fields.
The comparison key determines which test cases are considered the same across experiments. For more information, see Compare experiments.
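In effect, the comparison key projects each test case onto one or more fields, and cases with equal projections are matched across experiments. A minimal sketch of that matching rule (illustrative; the real key is a SQL expression evaluated server-side):

```python
def comparison_key(record, fields):
    """Build the key used to match test cases across experiments.

    fields: dotted paths into the record,
    e.g. ["input.query", "metadata.category"].
    """
    def get(path):
        value = record
        for part in path.split("."):
            value = value[part]
        return value
    return tuple(get(f) for f in fields)

a = {"input": {"query": "hi"}, "metadata": {"category": "greeting"}}
b = {"input": {"query": "hi"}, "metadata": {"category": "greeting"}}
# Records with equal keys are treated as the same test case.
same_case = comparison_key(a, ["input.query", "metadata.category"]) == \
            comparison_key(b, ["input.query", "metadata.category"])
```

A narrower key (e.g. just input.question) matches cases even when other input fields drift between dataset versions; a broader key is stricter.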

Configure environments

Create environments to version prompts and functions:
  1. Navigate to Configuration > Environments.
  2. Click + Environment.
  3. Enter environment name (e.g., “production”, “staging”, “dev”).
  4. Optionally set as default.
  5. Click Create.
Assign prompt and function versions to environments to separate development from production. See Manage environments for details.
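Environments act as a mapping from environment name to an assigned version. A sketch of the lookup this implies, with a fallback to the default environment (names and version ids here are hypothetical):

```python
def resolve_version(assignments, environment, default_env="production"):
    """Look up which prompt version an environment should use.

    assignments: mapping of environment name -> version id.
    Falls back to the default environment when the requested
    environment has no assignment of its own.
    """
    return assignments.get(environment, assignments.get(default_env))

assignments = {"production": "v3", "staging": "v5"}
```

This is why marking one environment as the default is useful: callers in unassigned environments (e.g. "dev") still resolve to a known-good version.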

Edit project details

Update project name and description:
  1. Navigate to your project.
  2. Click Edit project in the top-right.
  3. Modify name and description.
  4. Click Save.

Delete a project

Deleting a project permanently removes all of its logs, experiments, datasets, and functions. This action cannot be undone.
  1. Navigate to Configuration.
  2. Scroll to the bottom of the page.
  3. Click Delete project.
  4. Confirm by typing the project name.
  5. Click Delete.

Next steps