Documentation Index
Fetch the complete documentation index at: https://braintrust.dev/docs/llms.txt
Use this file to discover all available pages before exploring further.
Summary
Goal: Configure JSON schema response formats in TypeScript SDK prompts.
Features: prompts.create(), params object, response_format, json_schema
Configuration Steps
Model parameters must be placed inside the params object, not at the top level.
project.prompts.create({
  name: 'Generate Card Headlines',
  slug: 'generate-card-headlines',
  model: 'gemini-2.5-flash',
  messages: [/* ... */],
  params: {
    response_format: {
      type: 'json_schema',
      json_schema: {
        name: 'headlines_response',
        schema: {
          type: 'object',
          properties: {
            storyLabel: { type: 'string' },
            headlines: {
              type: 'array',
              items: {
                type: 'object',
                properties: {
                  cardIndex: { type: 'number' },
                  headline: { type: 'string' },
                },
                required: ['cardIndex', 'headline'],
              },
            },
          },
          required: ['storyLabel', 'headlines'],
        },
      },
    },
  },
});
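For reference, a model response that satisfies this schema is a JSON object with a storyLabel string and an array of headline entries. The values below are hypothetical, chosen only to illustrate the shape:

```typescript
// Hypothetical model output conforming to the headlines_response schema above.
const sample = {
  storyLabel: 'Quarterly Earnings',
  headlines: [
    { cardIndex: 0, headline: 'Revenue climbs 12% year over year' },
    { cardIndex: 1, headline: 'Cloud division leads growth' },
  ],
};

// Quick structural check mirroring the schema's `required` fields.
const valid =
  'storyLabel' in sample &&
  sample.headlines.every((h) => 'cardIndex' in h && 'headline' in h);
console.log(valid); // true
```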
Push prompts with bt functions push prompts.ts and confirm JSON output renders correctly in the Braintrust UI.
Common Mistakes
Placing response_format at the top level fails silently: the parameter is ignored and the output comes back as plain text.
// ❌ INCORRECT - response_format is ignored
project.prompts.create({
  name: 'My Prompt',
  model: 'gemini-2.5-flash',
  response_format: { type: 'json_schema', /* ... */ }, // Wrong location
});

// ✅ CORRECT - wrapped in params
project.prompts.create({
  name: 'My Prompt',
  model: 'gemini-2.5-flash',
  params: {
    response_format: { type: 'json_schema', /* ... */ },
  },
});
Additional Notes
This structure applies to all model-specific parameters (temperature, max_tokens, etc.), not just response_format.
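As a sketch (illustrative values, not tuning recommendations), a prompt configuration combining several model parameters nests them all under params, while the top level carries only prompt metadata:

```typescript
// Sketch: every model-specific parameter lives under `params`.
// temperature and max_tokens values here are illustrative only.
const promptConfig = {
  name: 'My Prompt',
  slug: 'my-prompt',
  model: 'gemini-2.5-flash',
  params: {
    temperature: 0.7,                         // sampling temperature
    max_tokens: 1024,                         // output length cap
    response_format: { type: 'json_schema' }, // schema elided for brevity
  },
};

console.log(Object.keys(promptConfig.params).join(', '));
// temperature, max_tokens, response_format
```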