Create a tool
Currently, you must define tools in code. To define a tool, use `project.tools.create` and pick a name and a unique slug, then push it to Braintrust with `braintrust push`.
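For example, a minimal calculator.ts sketch, assuming the TypeScript SDK's `braintrust.projects.create` and `project.tools.create` APIs with zod for the parameter schema (the names and slug here are illustrative):

```typescript
// calculator.ts
import braintrust from "braintrust";
import { z } from "zod";

// Register the project the tool belongs to.
const project = braintrust.projects.create({ name: "calculator" });

// Define the tool: a handler plus a name, a unique slug, and a parameter schema.
project.tools.create({
  handler: ({ op, a, b }: { op: "add" | "subtract"; a: number; b: number }) =>
    op === "add" ? a + b : a - b,
  name: "Calculator",
  slug: "calculator",
  description: "Perform basic arithmetic on two numbers",
  parameters: z.object({
    op: z.enum(["add", "subtract"]),
    a: z.number(),
    b: z.number(),
  }),
});
```

Pushing this file (for example, `braintrust push calculator.ts`) bundles it and registers the tool under its slug.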
Dependencies
Braintrust takes care of bundling the dependencies your tool needs. In TypeScript, Braintrust uses esbuild to bundle your code and its dependencies together. This works for most dependencies, but it does not support native (compiled) libraries like SQLite. In Python, Braintrust uses uv to cross-bundle a specified list of dependencies for the target platform (Linux). This works for binary dependencies, except for libraries that require on-demand compilation.
If you have trouble bundling your dependencies, file an issue in the braintrust-sdk repo.
Use tools in the UI
Once you define a tool in Braintrust, you can access it through the UI and API. However, the real advantage lies in calling a tool from an LLM. Most models support tool calling, which allows them to select a tool from a list of available options. Normally, it’s up to you to execute the tool, retrieve its results, and re-run the model with the updated context. Braintrust simplifies this process dramatically by:
- Automatically passing the tool’s definition to the model
- Running the tool securely in a sandbox environment when called
- Re-running the model with the tool’s output
- Streaming the whole output along with intermediate progress to the client
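You can also call a pushed tool directly from code. A minimal sketch, assuming the SDK exports an `invoke` helper and using the calculator tool's project name and slug from above:

```typescript
import { invoke } from "braintrust";

async function main() {
  // Invoke the pushed tool by project name and slug; Braintrust runs the
  // handler in its sandbox and returns the result.
  const result = await invoke({
    projectName: "calculator",
    slug: "calculator",
    input: { op: "add", a: 2, b: 3 },
  });
  console.log(result); // 5
}

main();
```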
View tools in the UI
Available tools are listed on the Tools page. You can run single datapoints right inside the tool to test its functionality.
Add tools to a prompt
To add a tool to a prompt, select it in the Tools dropdown in your Prompt window. Braintrust will automatically:
- Include it in the list of available tools to the model
- Invoke the tool if the model calls it, and append the result to the message history
- Call the model again with the tool’s result as context
- Continue for up to 5 iterations (the default) or until the model produces a non-tool result
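For reference, a tool like the one selected above might be defined as in this hedged github.ts sketch (the endpoint, slug, and parameter names are illustrative):

```typescript
// github.ts
import braintrust from "braintrust";
import { z } from "zod";

const project = braintrust.projects.create({ name: "github" });

// A tool the model can call to look up the most recent commits in a repository.
project.tools.create({
  handler: async ({ org, repo }: { org: string; repo: string }) => {
    const res = await fetch(
      `https://api.github.com/repos/${org}/${repo}/commits?per_page=5`,
    );
    if (!res.ok) {
      throw new Error(`GitHub API error: ${res.status}`);
    }
    return await res.json();
  },
  name: "Recent commits",
  slug: "recent-commits",
  description: "Fetch the most recent commits for a GitHub repository",
  parameters: z.object({
    org: z.string(),
    repo: z.string(),
  }),
});
```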

Embed tool calls into a prompt
In addition to selecting from the tool menu to add a tool to a prompt, you can also add a tool call directly from the Assistant or Tool messages within a prompt. To add a tool call to an Assistant prompt, select Assistant from the dropdown menu. Then select the Toggle tool calls icon to add the tool code directly into the prompt editor. You can also select Tool from the dropdown menu to enter a tool call ID, such as {{input.3.function_responses.0.id}}.
Structured outputs
Another use case for tool calling is to coerce a model into producing structured outputs that match a given JSON schema. You can do this without creating a tool function, and instead use the Raw tab in the Tools dropdown. Enter an array of tool definitions following the OpenAI tool format:
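For example, a single extraction tool in the OpenAI format might look like this (the function name and schema fields are illustrative):

```json
[
  {
    "type": "function",
    "function": {
      "name": "extract_user",
      "description": "Extract structured information about the user",
      "parameters": {
        "type": "object",
        "properties": {
          "name": { "type": "string" },
          "age": { "type": "number" }
        },
        "required": ["name", "age"]
      }
    }
  }
]
```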
Two modes control how the tool calls are returned:
- `auto` returns the arguments of the first tool call as a JSON object. This is the default mode.
- `parallel` returns an array of all tool calls, including both function names and arguments.

`response_format: { type: "json_object" }` does not get parsed as a JSON object and will be returned as a string.
Use tools in code
You can also attach a tool to a prompt defined in code. This example defines a tool and a prompt that uses it, and pushes both to Braintrust.
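A hedged github.ts sketch under the same assumptions as above, additionally assuming `project.prompts.create` accepts a `tools` list (the model, slugs, and message text are illustrative):

```typescript
// github.ts
import braintrust from "braintrust";
import { z } from "zod";

const project = braintrust.projects.create({ name: "github" });

// The tool: fetch recent commits for a repository.
const recentCommits = project.tools.create({
  handler: async ({ org, repo }: { org: string; repo: string }) => {
    const res = await fetch(`https://api.github.com/repos/${org}/${repo}/commits`);
    return await res.json();
  },
  name: "Recent commits",
  slug: "recent-commits",
  description: "Fetch recent commits for a GitHub repository",
  parameters: z.object({ org: z.string(), repo: z.string() }),
});

// The prompt: attach the tool so the model can call it when the prompt is invoked.
project.prompts.create({
  name: "Commit summarizer",
  slug: "commit-summarizer",
  model: "gpt-4o",
  messages: [
    {
      role: "user",
      content: "Summarize the most recent commits in {{org}}/{{repo}}.",
    },
  ],
  tools: [recentCommits],
});
```

Pushing the file with `braintrust push github.ts` uploads both the tool and the prompt, and invoking the prompt runs the tool-calling loop described above.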