Projects organize AI features in your application. Each project contains logs, experiments, datasets, prompts, and other functions. Configure project-specific settings to customize behavior for your use case.
Create a project
- Navigate to your organization’s project list
- Click + Project
- Enter a project name
- Optionally add a description
- Click Create
There is no separate `.get()` method in the SDK: `projects.create()` returns a handle to the project, creating it only if it doesn't already exist.
```typescript
import * as braintrust from "braintrust";

// Get a handle to the project (creates it if it doesn't exist)
const project = braintrust.projects.create({ name: "my-project" });

// Use the project to create functions
project.prompts.create({...});
project.tools.create({...});
```
Projects are also created automatically when you initialize an experiment or logger:

```typescript
import * as braintrust from "braintrust";

// Creates "my-project" if it doesn't exist
const experiment = braintrust.init("my-project", {
  experiment: "my-experiment",
});
```
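The same holds for loggers. A minimal sketch, assuming the SDK's `initLogger` entry point:

```typescript
import { initLogger } from "braintrust";

// Creates "my-project" if it doesn't exist and streams logs into it
const logger = initLogger({ projectName: "my-project" });
```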
For more details, see the SDK reference for Python or TypeScript.
Create tags
Tags help organize and filter logs, datasets, and experiments:
- Navigate to Configuration in your project.
- Click Add tag.
- Enter tag details:
- Name: Tag identifier.
- Color: Visual indicator.
- Description: Optional explanation.
- Click Save.
Use tags to track data by user type, feature, environment, or any custom category. Filter by tags in logs, experiments, and datasets. For more information about using tags, see View logs.
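Tags can also be attached from code when logging. A minimal sketch, assuming the logger shown earlier and the `tags` field on `log()`; the tag names and log fields here are illustrative:

```typescript
import { initLogger } from "braintrust";

const logger = initLogger({ projectName: "my-project" });

// Tag this log so it can be filtered by environment and user type
logger.log({
  input: { question: "How do I reset my password?" },
  output: { answer: "Go to Settings > Security > Reset password." },
  tags: ["production", "free-tier"],
});
```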
Set up human review
Define scores for manual review by users or your team:
- Navigate to Configuration > Human review.
- Click Add human review score.
- Configure the score:
- Name: Score identifier.
- Type: Continuous (0-1), categorical, or free-form text.
- Options: For categorical scores, define possible values.
- Write to expected: Allow setting expected values instead of scores.
- Multiple choice: Allow selecting multiple categories.
- Click Save.
Review scores appear in all logs and experiments in the project. Use them for quality control, data labeling, or feedback collection. For more information, see Collect human feedback.
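Feedback can also be written programmatically, for example when collecting end-user ratings. A minimal sketch, assuming the SDK's `logFeedback` method and a span id captured when the original request was logged; the `helpfulness` score name is an example and should match a configured human review score:

```typescript
import { initLogger } from "braintrust";

const logger = initLogger({ projectName: "my-project" });

// `spanId` is captured earlier, when the original request is logged
function recordThumbsUp(spanId: string) {
  logger.logFeedback({
    id: spanId,
    scores: { helpfulness: 1 }, // should match a configured human review score
    comment: "User marked this answer as helpful",
  });
}
```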
Create aggregate scores
Combine multiple scores into a single metric:
- Navigate to Configuration > Aggregate scores.
- Click Add aggregate score.
- Define the aggregation:
- Name: Score identifier.
- Type: Weighted average, minimum, or maximum.
- Selected scores: Scores to aggregate.
- Weights: For weighted averages, set score weights.
- Description: Optional explanation.
- Click Save.
Aggregate scores appear in experiment summaries and comparisons. Use them to create composite quality metrics or overall performance indicators. For more information, see Interpret evaluation results.
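As an illustration of the weighted-average type, here is the arithmetic behind a composite metric over two scores; a standalone sketch, not the platform's internal implementation:

```typescript
// Weighted average: sum(weight_i * score_i) / sum(weight_i)
function weightedAverage(
  scores: Record<string, number>,
  weights: Record<string, number>
): number {
  let total = 0;
  let weightSum = 0;
  for (const [name, weight] of Object.entries(weights)) {
    total += weight * (scores[name] ?? 0);
    weightSum += weight;
  }
  return weightSum === 0 ? 0 : total / weightSum;
}

// e.g. quality = 0.7 * factuality + 0.3 * conciseness
weightedAverage({ factuality: 0.9, conciseness: 0.5 }, { factuality: 0.7, conciseness: 0.3 });
// => 0.78
```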
Set up online scoring
Automatically evaluate production logs as they arrive:
- Navigate to Configuration > Online scoring.
- Click Add rule.
- Configure the rule:
- Name: Rule identifier.
- Scorers: Select which scorers to run.
- Sampling rate: Percentage of logs to evaluate (1-100%).
- Filter: Optional SQL query to select specific logs.
- Span type: Apply to root spans or all spans.
- Click Save.
Online scoring runs asynchronously in the background. View results in the logs page alongside other scores. For more information, see Score logs automatically.
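Custom code-based scorers can be defined from the SDK and then selected in an online scoring rule. A minimal sketch, assuming the `project.scorers.create` API on the projects handle shown earlier; the scorer name, slug, and handler logic are illustrative:

```typescript
import * as braintrust from "braintrust";

const project = braintrust.projects.create({ name: "my-project" });

// A simple code-based scorer; once pushed, it can be selected
// under Scorers in an online scoring rule
project.scorers.create({
  name: "Non-empty output",
  slug: "non-empty-output",
  handler: async ({ output }: { output: string }) =>
    output && output.length > 0 ? 1 : 0,
});
```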
Set up span iframes
Customize how specific span fields render in the UI:
- Navigate to Configuration > Span iframes.
- Click Add iframe.
- Configure rendering:
- Field path: Which field to render (e.g., `output.html`).
- iframe URL: Template for the iframe src attribute.
- Click Save.
Use span iframes to render HTML, charts, or custom visualizations directly in trace views. For more information, see Extend traces.
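For example, to populate a field like `output.html` for a configured span iframe to render, log it alongside the rest of the span. A minimal sketch using the logger from earlier; the field names and content are illustrative:

```typescript
import { initLogger } from "braintrust";

const logger = initLogger({ projectName: "my-project" });

// The configured span iframe can render output.html in the trace view
logger.log({
  input: { prompt: "Summarize Q3 revenue as a table" },
  output: {
    text: "Revenue grew 12% quarter over quarter.",
    html: "<table><tr><td>Q3 revenue</td><td>+12%</td></tr></table>",
  },
});
```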
Set comparison key
Customize how experiments match test cases:
- Navigate to Configuration > Comparison key.
- Enter a SQL expression (default: `input`).
- Click Save.
Examples:
- `input.question` - Match by question field only.
- `input.user_id` - Match by user.
- `[input.query, metadata.category]` - Match by multiple fields.
The comparison key determines which test cases are considered the same across experiments. For more information, see Compare experiments.
Create environments
Create environments to version prompts and functions:
- Navigate to Configuration > Environments.
- Click + Environment.
- Enter environment name (e.g., “production”, “staging”, “dev”).
- Optionally set as default.
- Click Create.
Assign prompt and function versions to environments to separate development from production. See Manage environments for details.
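A hypothetical sketch of pulling the version assigned to an environment at runtime; `loadPrompt` is the SDK's prompt loader, but the `environment` option here is an assumption, so check the SDK reference for the exact parameter:

```typescript
import { loadPrompt } from "braintrust";

// Hypothetical: the `environment` option is an assumption; see the SDK
// reference for the exact way to pin a prompt to an environment's version
const prompt = await loadPrompt({
  projectName: "my-project",
  slug: "summarize-article",
  environment: "production",
});
```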
Edit project details
Update project name and description:
- Navigate to your project.
- Click Edit project in the top-right.
- Modify name and description.
- Click Save.
Delete a project
Deleting a project permanently removes all logs, experiments, datasets, and functions. This cannot be undone.
- Navigate to Configuration.
- Scroll to the bottom of the page.
- Click Delete project.
- Confirm by typing the project name.
- Click Delete.
Next steps