Traceloop's OpenLLMetry is an open-source observability framework for LLM applications. Braintrust integrates with Traceloop via OpenTelemetry to capture LLM calls, workflows, and application traces.
Setup
This integration uses Braintrust’s Python SDK OpenTelemetry configuration.
Install the Traceloop SDK alongside the Braintrust SDK with OpenTelemetry support and the OpenAI client:

```bash
pip install "braintrust[otel]" traceloop-sdk openai
```
Configure your environment variables:

```bash
TRACELOOP_BASE_URL=https://api.braintrust.dev/otel
TRACELOOP_HEADERS="Authorization=Bearer%20<Your API Key>, x-bt-parent=project_id:<Your Project ID>"
```

When setting the bearer token, encode the space between `Bearer` and your API key as `%20`.
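If you prefer to build the header value in code, Python's standard-library `urllib.parse.quote` handles the `%20` encoding for you. This is a minimal sketch; the API key and project ID below are hypothetical placeholders:

```python
import os
from urllib.parse import quote

# Hypothetical placeholder credentials; substitute your real values.
api_key = "sk-example-123"
project_id = "my-project-id"

# quote() percent-encodes the space in "Bearer <key>" as %20
auth = quote(f"Bearer {api_key}")

os.environ["TRACELOOP_BASE_URL"] = "https://api.braintrust.dev/otel"
os.environ["TRACELOOP_HEADERS"] = (
    f"Authorization={auth}, x-bt-parent=project_id:{project_id}"
)
```

Set both variables before calling `Traceloop.init()` so the SDK picks them up.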
Trace with Traceloop
Initialize Traceloop and your traces will automatically be sent to the Braintrust project specified in the x-bt-parent header:
```python
from openai import OpenAI
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import workflow

# Send spans immediately instead of batching them
Traceloop.init(disable_batch=True)

client = OpenAI()

@workflow(name="story")
def run_story_stream(client):
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "user", "content": "Tell me a short story about LLM evals."}
        ],
    )
    return completion.choices[0].message.content

print(run_story_stream(client))
```