Summary

Goal: Configure JSON schema response formats in TypeScript SDK prompts. Features: prompts.create(), params object, response_format, json_schema

Configuration Steps

Step 1: Wrap response_format in params object

Model parameters must be placed inside the params object, not at the top level.
project.prompts.create({
  name: 'Generate Card Headlines',
  slug: 'generate-card-headlines',
  model: 'gemini-2.5-flash',
  messages: [...],
  params: {
    response_format: {
      type: 'json_schema',
      json_schema: {
        name: 'headlines_response',
        schema: {
          type: 'object',
          properties: {
            storyLabel: { type: 'string' },
            headlines: {
              type: 'array',
              items: {
                type: 'object',
                properties: {
                  cardIndex: { type: 'number' },
                  headline: { type: 'string' }
                },
                required: ['cardIndex', 'headline']
              }
            }
          },
          required: ['storyLabel', 'headlines']
        }
      }
    }
  }
});

Step 2: Verify output format in UI

Push the prompt file with npx braintrust push prompts.ts, then confirm the JSON output renders correctly in the Braintrust UI.

Common Mistakes

Placing response_format at the top level causes a silent failure: the parameter is ignored and the output is returned as plain text.
// ❌ INCORRECT - response_format is ignored
project.prompts.create({
  name: 'My Prompt',
  model: 'gemini-2.5-flash',
  response_format: { type: 'json_schema', ... }  // Wrong location
});

// ✅ CORRECT - wrapped in params
project.prompts.create({
  name: 'My Prompt',
  model: 'gemini-2.5-flash',
  params: {
    response_format: { type: 'json_schema', ... }
  }
});

Additional Notes

This structure applies to all model-specific parameters (temperature, max_tokens, etc.), not just response_format.
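As a minimal sketch of that rule, the object below nests temperature and max_tokens under params next to response_format. The prompt name, slug, and schema contents here are hypothetical placeholders, not part of the original example.

```typescript
// Hypothetical sketch: every model-specific knob lives under `params`,
// alongside response_format -- never at the top level.
const promptArgs = {
  name: 'Summarize Ticket',        // hypothetical prompt name
  slug: 'summarize-ticket',
  model: 'gemini-2.5-flash',
  params: {
    temperature: 0.2,              // model parameters go here...
    max_tokens: 512,               // ...not at the top level
    response_format: {
      type: 'json_schema',
      json_schema: {
        name: 'summary_response',  // hypothetical schema name
        schema: {
          type: 'object',
          properties: { summary: { type: 'string' } },
          required: ['summary']
        }
      }
    }
  }
};

// Passing promptArgs to project.prompts.create(...) would register the prompt
// with all tuning parameters in the location the SDK expects.
console.log(Object.keys(promptArgs.params));
```

The same shape works for any other generation parameter the model supports; only keys placed inside params reach the model.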