Summary

Issue: The temperature parameter appears disabled or has no effect when using GPT-5 models in Braintrust prompts.

Cause: OpenAI’s GPT-5 model family does not support the temperature parameter in their API.

Resolution: Use a different model (GPT-4, GPT-3.5-turbo, or another compatible model) if temperature control is required.

Resolution Steps

If temperature control is required

Step 1: Switch to a compatible model

Change your prompt to use GPT-4, GPT-4-turbo, or GPT-3.5-turbo, which support the temperature parameter.

Step 2: Configure temperature setting

Set your desired temperature value (0.0 to 2.0) in the model parameters.
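As a minimal sketch of the two steps above, the request body for a temperature-compatible model might look like the following. The helper name, model name, and temperature value are illustrative examples, not Braintrust or OpenAI API specifics:

```python
# Hypothetical helper: build a chat-completions-style request body for a
# model that supports temperature. Values shown are examples only.
def build_request(model: str, prompt: str, temperature: float) -> dict:
    # Keep temperature inside OpenAI's documented 0.0-2.0 range.
    temperature = max(0.0, min(2.0, temperature))
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

request = build_request("gpt-4-turbo", "Summarize this ticket.", 0.7)
```

Lower values (near 0.0) make output more deterministic; higher values make it more varied.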

Otherwise, if using GPT-5 is required

Step 1: Accept default behavior

GPT-5 models use OpenAI’s default sampling behavior without temperature control.

Step 2: Adjust prompt design

Modify prompt instructions to guide output style instead of relying on temperature.

Additional Information

This is expected behavior from OpenAI’s API, not a Braintrust limitation. Refer to OpenAI’s model documentation for complete parameter support details for each model family.