OpenRouter gives you a unified way to call models from many providers. Braintrust traces OpenRouter either through the native openrouter SDK with wrap_openrouter() or through OpenRouter’s OpenAI-compatible endpoint with wrap_openai().
This guide covers manual instrumentation. For quicker setup, use auto-instrumentation.

Setup

Install Braintrust and the native OpenRouter SDK:
pip install braintrust openrouter
Set your API keys before you run your app:
.env
OPENROUTER_API_KEY=<your-openrouter-api-key>
BRAINTRUST_API_KEY=<your-braintrust-api-key>
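If your framework does not load `.env` files automatically, a stdlib-only loader is enough for local development. This is a minimal sketch, assuming `KEY=VALUE` lines; libraries like python-dotenv handle quoting and interpolation more robustly.

```python
import os


def load_env(path=".env"):
    """Minimal .env loader: KEY=VALUE lines; blank lines and '#' comments are skipped.

    Variables already set in the environment take precedence over file values.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip())


if os.path.exists(".env"):
    load_env()
```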

Trace with OpenRouter

Use wrap_openrouter() when you want to trace the native OpenRouter SDK directly.
trace-openrouter.py
import os

from braintrust import init_logger, wrap_openrouter
from openrouter import OpenRouter

init_logger(project="My Project")

client = wrap_openrouter(OpenRouter(api_key=os.environ["OPENROUTER_API_KEY"]))

response = client.beta.responses.send(
    model="openai/gpt-5-mini",
    input="Summarize tracing in one sentence.",
)

print(response.output_text)

Use the OpenAI-compatible endpoint

If your app already uses the OpenAI Python SDK against OpenRouter’s OpenAI-compatible endpoint, keep that setup and use wrap_openai().
trace-openrouter-openai-compatible.py
import os

from braintrust import init_logger, wrap_openai
from openai import OpenAI

init_logger(project="My Project")

client = wrap_openai(
    OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],
    )
)

response = client.responses.create(
    model="openai/gpt-5-mini",
    input="Explain routing in one sentence.",
)

print(response.output_text)
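wrap_openai() also traces the Chat Completions API, so code that has not yet migrated to the Responses API works without changes. A minimal sketch, reusing the same wrapped client setup (calling it requires valid API keys):

```python
import os

from braintrust import init_logger, wrap_openai
from openai import OpenAI

init_logger(project="My Project")

client = wrap_openai(
    OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],
    )
)

# Chat Completions calls through the wrapped client are traced too.
completion = client.chat.completions.create(
    model="openai/gpt-5-mini",
    messages=[{"role": "user", "content": "Explain routing in one sentence."}],
)

print(completion.choices[0].message.content)
```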

What Braintrust traces

For the native openrouter SDK, Braintrust traces:
  • Chat completions, including chat.send() and send_async()
  • Streaming chat responses
  • Embeddings via embeddings.generate()
  • Responses API calls via beta.responses.send() and send_async()
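For example, an embeddings call through the wrapped native client is traced the same way as chat and responses calls. This is a sketch only: the model name is an assumption, and the exact parameters and return shape depend on your openrouter SDK version, so check its reference before relying on it.

```python
import os

from braintrust import init_logger, wrap_openrouter
from openrouter import OpenRouter

init_logger(project="My Project")

client = wrap_openrouter(OpenRouter(api_key=os.environ["OPENROUTER_API_KEY"]))

# embeddings.generate() calls are captured by the wrapper.
# The model name below is an assumption; use any embedding model OpenRouter exposes.
embedding = client.embeddings.generate(
    model="openai/text-embedding-3-small",
    input="Tracing captures inputs, outputs, and latency.",
)
```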

Resources