This integration uses the OpenTelemetry configuration from Braintrust's Python SDK. To export OTel traces from Traceloop's OpenLLMetry to Braintrust, set the following environment variables:
```
TRACELOOP_BASE_URL=https://api.braintrust.dev/otel
TRACELOOP_HEADERS="Authorization=Bearer%20<Your API Key>, x-bt-parent=project_id:<Your Project ID>"
```
When setting the bearer token, be sure to encode the space between “Bearer” and your API key using %20.
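If you prefer to configure this in code, you can set the same variables with `os.environ` before calling `Traceloop.init()`. A minimal sketch, assuming your key and project ID live in `BRAINTRUST_API_KEY` and `BRAINTRUST_PROJECT_ID` (illustrative names, not required by either SDK):

```python
import os
from urllib.parse import quote

# Illustrative environment variables; substitute your own values.
api_key = os.environ["BRAINTRUST_API_KEY"]
project_id = os.environ["BRAINTRUST_PROJECT_ID"]

# quote() percent-encodes the space in "Bearer <key>" as %20, which is how
# the Authorization value must appear inside TRACELOOP_HEADERS.
auth = quote(f"Bearer {api_key}")

# Set these before Traceloop.init() runs so the exporter picks them up.
os.environ["TRACELOOP_BASE_URL"] = "https://api.braintrust.dev/otel"
os.environ["TRACELOOP_HEADERS"] = f"Authorization={auth}, x-bt-parent=project_id:{project_id}"
```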
Traces will then appear under the Braintrust project or experiment provided in
the x-bt-parent header.
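To route traces to a specific experiment rather than a project, point `x-bt-parent` at the experiment instead. A hedged example, assuming the `experiment_id:` prefix is accepted the same way as `project_id:`:

```
TRACELOOP_HEADERS="Authorization=Bearer%20<Your API Key>, x-bt-parent=experiment_id:<Your Experiment ID>"
```

With the environment configured, install the packages and run a traced workflow: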
```
pip install braintrust[otel] traceloop-sdk openai
```
```python
from openai import OpenAI
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import workflow

# Initialize Traceloop; disable_batch=True sends each span immediately,
# which is convenient when experimenting locally.
Traceloop.init(disable_batch=True)

client = OpenAI()


@workflow(name="story")
def run_story_stream(client):
    # The chat completion call is captured automatically by OpenLLMetry
    # and exported to Braintrust via the OTel endpoint configured above.
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Tell me a short story about LLM evals."}],
    )
    return completion.choices[0].message.content


print(run_story_stream(client))
```
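OpenLLMetry's decorators can also nest spans within a single trace. A short sketch, assuming the `task` decorator from `traceloop.sdk.decorators`; the function names and prompts are illustrative:

```python
from openai import OpenAI
from traceloop.sdk import Traceloop
from traceloop.sdk.decorators import task, workflow

Traceloop.init(disable_batch=True)
client = OpenAI()


# Each @task call becomes a child span of the enclosing @workflow span,
# so the whole pipeline is exported as a single trace.
@task(name="summarize")
def summarize(text: str) -> str:
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": f"Summarize in one sentence: {text}"}],
    )
    return completion.choices[0].message.content


@workflow(name="story_pipeline")
def story_pipeline() -> str:
    story = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Tell me a short story about LLM evals."}],
    ).choices[0].message.content
    return summarize(story)


print(story_pipeline())
```

The `summarize` call should then appear as a child span under the `story_pipeline` trace in Braintrust.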