OpenRouter provides access to models from multiple providers through a single SDK. Braintrust integrates with OpenRouter in TypeScript through wrapOpenRouter and auto-instrumentation.
This guide covers both approaches: automatic instrumentation, which is the quicker setup, and explicit wrapping with wrapOpenRouter.

Setup

Install the dependencies:
# pnpm
pnpm add braintrust @openrouter/sdk
# npm
npm install braintrust @openrouter/sdk
Set your API keys:
.env
OPENROUTER_API_KEY=<your-openrouter-api-key>
BRAINTRUST_API_KEY=<your-braintrust-api-key>

# If you are self-hosting Braintrust, set the URL of your hosted dataplane
# BRAINTRUST_API_URL=<your-braintrust-api-url>
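Before constructing any clients, it can help to fail fast when a key is missing. A minimal sketch (the `requireEnv` helper below is hypothetical, not part of either SDK; Node 20.6+ can also load the file directly with `node --env-file=.env app.js`):

```typescript
// Hypothetical helper: read a required environment variable or fail fast
// with a clear error, instead of passing `undefined` to an SDK constructor.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Example usage before client construction:
// const client = new OpenRouter({ apiKey: requireEnv("OPENROUTER_API_KEY") });
```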

Trace with OpenRouter

Trace your OpenRouter calls for observability and monitoring.

Trace automatically

Braintrust provides automatic tracing for OpenRouter calls. This is the recommended setup for most projects.
import { initLogger } from "braintrust";
import OpenRouter from "@openrouter/sdk";

initLogger({
  projectName: "My Project",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const client = new OpenRouter({ apiKey: process.env.OPENROUTER_API_KEY });

const response = await client.chat.send({
  chatGenerationParams: {
    model: "openai/gpt-5-mini",
    messages: [{ role: "user", content: "What is observability?" }],
  },
});
Run with the import hook:
node --import braintrust/hook.mjs app.js
If you’re using a bundler or Next.js Turbopack, see Trace LLM calls for plugin/loader setup.

Wrap an OpenRouter client

If you prefer explicit instrumentation, wrap the OpenRouter client with wrapOpenRouter.
import { initLogger, wrapOpenRouter } from "braintrust";
import OpenRouter from "@openrouter/sdk";

initLogger({
  projectName: "My Project",
  apiKey: process.env.BRAINTRUST_API_KEY,
});

const client = wrapOpenRouter(
  new OpenRouter({ apiKey: process.env.OPENROUTER_API_KEY }),
);

const response = await client.chat.send({
  chatGenerationParams: {
    model: "openai/gpt-5-mini",
    messages: [{ role: "user", content: "What is observability?" }],
  },
});
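Whichever tracing approach you use, the response object itself is plain data. As a sketch, assuming an OpenAI-style `choices[].message.content` payload (an assumption modeled on what OpenRouter proxies; verify against your @openrouter/sdk version), the assistant text can be extracted defensively:

```typescript
// Assumed response shape, modeled on OpenAI-style chat payloads;
// this is NOT a type exported by @openrouter/sdk.
type ChatResponseLike = {
  choices?: Array<{ message?: { content?: string | null } }>;
};

// Return the first assistant message, or "" when the shape doesn't match.
function firstMessageContent(response: ChatResponseLike): string {
  return response.choices?.[0]?.message?.content ?? "";
}
```

The optional chaining keeps the helper safe to call on unexpected shapes, so a malformed or empty response yields `""` rather than a runtime error.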