| Method | Best for | Setup |
|---|---|---|
| Vercel AI SDK | Fine-grained control over tracing, selective instrumentation | Install packages + add code |
| Vercel Marketplace | Quick setup, automatic tracing of all AI calls | Configure in Vercel dashboard |
Trace with Vercel AI SDK
The Braintrust SDK provides native support for the Vercel AI SDK, automatically tracing AI calls with full input/output logging, metrics, and tool execution.
Setup
Install the Braintrust SDK alongside the Vercel AI SDK. The Braintrust SDK supports Vercel AI SDK v3, v4, v5, and v6. Then use wrapAISDK to wrap the Vercel AI SDK functions (generateText, streamText, generateObject, streamObject).
trace-vercel-ai-sdk.ts
Trace tools calls
wrapAISDK automatically traces tool call suggestions from the LLM and the tool execution results.
trace-vercel-ai-sdk-tools.ts
Stream tool responses
You can also use streamText for streaming responses with tool calls. Streaming creates doStream child spans for each LLM call (see Multi-round tool interactions below):
trace-vercel-ai-sdk-streaming.ts
Add metadata
To attach custom metadata to your wrapAISDK traces, wrap your AI calls in a parent span using traced. The wrapAISDK function automatically creates child spans for AI SDK calls, and you attach your metadata to the parent span:
trace-with-span-metadata.ts
This pattern lets you:
- Add custom metadata like user IDs, session IDs, or feature flags.
- Compute metadata based on async operations (e.g., fetching user context).
- Add metadata conditionally based on the response.
- Group multiple AI calls under a single parent span with shared metadata.
If you’re using the Vercel AI SDK with OpenTelemetry tracing (not wrapAISDK), you can use the native experimental_telemetry.metadata parameter instead. See the OpenTelemetry integration guide for details.
Multi-round tool interactions
When using tools, the AI SDK often makes multiple LLM calls to complete the task. Braintrust automatically creates nested spans to give you visibility into each step:
- Parent span: generateText, streamText, generateObject, or streamObject represents the overall operation.
- Child spans: doGenerate or doStream, one span for each individual LLM call during the operation.
- Tool spans: tool executions appear as separate spans showing inputs and outputs.
This hierarchy shows you:
- How many LLM calls were needed to complete the task
- What each LLM call received and returned
- The complete flow of tool calls and responses
For example, a generateText call that uses tools might produce this span hierarchy:
- generateText (parent)
  - doGenerate (1st LLM call: decides to use the tool)
  - getWeather (tool execution)
  - doGenerate (2nd LLM call: uses the tool result to form the response)
Trace agents
The AI SDK’s Agent classes (Agent, Experimental_Agent, ToolLoopAgent) are automatically wrapped and traced when using wrapAISDK.
trace-vercel-ai-sdk-agent.ts
Trace with Vercel Marketplace
The Vercel Marketplace integration provides automatic tracing for all AI calls in your Vercel applications with minimal setup. No package installation is required.
Setup
- Visit the Vercel Marketplace listing and select Install.
- Create or link your Braintrust account.
- Select a plan (Free or Pro) and create a project name.
- Select Add Drain to configure trace collection.
Configure log drain
In the Add Drain panel:
- Select Traces, then select Next.
- Choose which Vercel projects to trace (All Projects or specific projects).
- Set the sampling rate for trace collection.
Enable OpenTelemetry
In your Next.js project, create an instrumentation.ts file and call registerOTel. See the Vercel OpenTelemetry docs for details.