LlamaIndex is a data framework for connecting LLMs with external data sources. Braintrust traces LlamaIndex applications using OpenTelemetry to capture queries, retrievals, and LLM interactions.

Setup

This integration uses Braintrust’s Python SDK OpenTelemetry configuration. Install LlamaIndex with OpenTelemetry support:
pip install "braintrust[otel]" llama-index openai python-dotenv
Configure your environment variables:
.env
BRAINTRUST_API_KEY=your-api-key
BRAINTRUST_PARENT=project_name:llamaindex-demo
OPENAI_API_KEY=your-openai-key
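If you want to confirm these variables are picked up before wiring up the handler, a quick check like the following works (a minimal sketch; the filename verify_env.py is just an example):
verify_env.py
import os

from dotenv import load_dotenv

load_dotenv()

# These names match the .env file above; the script only checks that they are present.
required = ["BRAINTRUST_API_KEY", "BRAINTRUST_PARENT", "OPENAI_API_KEY"]
missing = [name for name in required if not os.environ.get(name)]
if missing:
    raise SystemExit(f"Missing environment variables: {', '.join(missing)}")
print("Environment configured.")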

Trace with LlamaIndex

Configure LlamaIndex’s global handler to send OpenTelemetry traces to Braintrust:
llamaindex_braintrust.py
import os

import llama_index.core
from dotenv import load_dotenv
from llama_index.core.llms import ChatMessage
from llama_index.llms.openai import OpenAI

load_dotenv()

# Configure LlamaIndex to send OTel traces to Braintrust
# Note: "arize_phoenix" is LlamaIndex's OTel handler name.
# We redirect it to Braintrust by overriding the endpoint.
braintrust_api_url = os.environ.get("BRAINTRUST_API_URL", "https://api.braintrust.dev")
llama_index.core.set_global_handler("arize_phoenix", endpoint=f"{braintrust_api_url}/otel/v1/traces")

# Your LlamaIndex application code
messages = [
    ChatMessage(role="system", content="Speak like a pirate. ARRR!"),
    ChatMessage(role="user", content="What do llamas sound like?"),
]
result = OpenAI().chat(messages)
print(result)
LlamaIndex exposes its OpenTelemetry integration under the handler name "arize_phoenix". Overriding the handler's endpoint redirects those traces to Braintrust instead of Phoenix.
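The same global handler also captures retrieval spans, so a RAG query appears in Braintrust with its retrieval and LLM steps. The following is a minimal sketch under the same configuration; the data/ directory, the filename, and the query text are placeholders:
rag_example.py
import os

import llama_index.core
from dotenv import load_dotenv
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

load_dotenv()

# Same handler configuration as above: redirect LlamaIndex's OTel traces to Braintrust.
braintrust_api_url = os.environ.get("BRAINTRUST_API_URL", "https://api.braintrust.dev")
llama_index.core.set_global_handler("arize_phoenix", endpoint=f"{braintrust_api_url}/otel/v1/traces")

# Build a small in-memory index over local files and run a query.
# The retrieval and LLM calls made by the query engine show up as spans in the trace.
documents = SimpleDirectoryReader("data").load_data()  # "data/" is a placeholder directory
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

response = query_engine.query("What do llamas sound like?")
print(response)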

Resources