While tracing setup automatically logs LLM calls, you often need to trace additional application logic like data retrieval, preprocessing, business logic, or tool invocations. Custom tracing lets you capture these operations.
Braintrust SDKs provide tools to trace function execution and capture inputs, outputs, and errors:
- **Python SDK**: the `@traced` decorator automatically wraps functions
- **TypeScript SDK**: `wrapTraced()` creates traced function wrappers
- **Go SDK**: OpenTelemetry's manual span management with `tracer.Start()` and `span.End()`
All approaches achieve the same result—capturing function-level observability—but with different ergonomics suited to each language’s idioms.
```typescript
import { initLogger, wrapTraced } from "braintrust";

const logger = initLogger({ projectName: "My Project" });

// Wrap a function to trace it automatically
const fetchUserData = wrapTraced(async function fetchUserData(userId: string) {
  // This function's input (userId) and output (return value) are logged
  const response = await fetch(`/api/users/${userId}`);
  return response.json();
});

// Use the function normally
const userData = await fetchUserData("user-123");
```
The traced function automatically creates a span that captures the function's input arguments, its return value as the output, and any errors thrown during execution.
Attach metadata and tags from within the function body. This is useful for data that’s only available during execution, like computed values or results from intermediate steps.
In TypeScript and Python, use span.log().
In Go, C#, Ruby, and Java, use the OTel setAttribute API. Custom attributes appear in the span’s metadata field. Use braintrust.tags for tags. For LLM-specific OTel attributes, see OpenTelemetry.
The TypeScript and Python SDKs support passing metadata and tags at span creation time, which avoids a separate span.log() call. This is useful at request entry points where you have request-scoped data — like a user ID or org ID — already available and don’t want to thread it through helper functions.
```typescript
import OpenAI from "openai";
import { initLogger } from "braintrust";

const logger = initLogger({ projectName: "My Project" });
const openai = new OpenAI();

async function handleRequest(userId: string, orgId: string, prompt: string) {
  return logger.traced(
    async (span) => {
      const response = await openai.responses.create({
        model: "gpt-5-mini",
        input: prompt,
      });
      return response.output_text;
    },
    {
      event: {
        metadata: { userId, orgId },
        tags: ["handle-request"],
      },
    },
  );
}

await handleRequest("user-123", "org-456", "What is the capital of France?");
```
If you pass a non-string value (such as an object or array) to the name field of a span, your logs will not appear in the UI: they are hidden because they fail schema validation. Span names must always be strings. Before passing a value to the name parameter in tracing functions, ensure it is a string:
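One way to guard against this is a small coercion helper. The `toSpanName` function below is hypothetical (not part of the Braintrust SDK); it simply guarantees a string before the value reaches the name parameter:

```typescript
// Hypothetical helper: coerce any value to a safe string span name.
// Objects and arrays would fail schema validation, so serialize them.
function toSpanName(value: unknown): string {
  if (typeof value === "string") return value;
  try {
    // JSON.stringify(undefined) returns undefined, so fall back to String().
    return JSON.stringify(value) ?? String(value);
  } catch {
    return String(value);
  }
}

console.log(toSpanName("fetchUserData")); // "fetchUserData"
console.log(toSpanName({ step: 1 })); // '{"step":1}'
```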