Summary
Issue: Invoking Amazon Nova models from Braintrust fails with an AccessDeniedException even when the IAM policy allows bedrock:InvokeModel and bedrock:InvokeModelWithResponseStream on the foundation model.
Cause: Amazon Nova models must be invoked through an inference profile (for example, us.amazon.nova-2-lite-v1:0 or global.amazon.nova-pro-v1:0), not directly against the foundation model ARN. The IAM policy attached to your Bedrock credentials must grant invocation permissions on the inference profile resource in addition to the foundation model resource.
Resolution: Update the IAM policy to include both resource types, then configure a Custom Bedrock provider in Braintrust with the exact inference-profile model identifier.
Resolution steps
Step 1: Update the IAM policy to cover both resources
Add both the foundation model and inference profile ARNs to the Resource list, and include bedrock:UseInferenceProfile so the principal can route through the inference profile.
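For example, a policy statement covering both resource types might look like the following sketch. The account ID, region, and Nova variant are placeholders; adjust them to your environment.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream",
        "bedrock:UseInferenceProfile"
      ],
      "Resource": [
        "arn:aws:bedrock:*::foundation-model/amazon.nova-2-lite-v1:0",
        "arn:aws:bedrock:us-east-1:XXXXXXXXXXXX:inference-profile/us.amazon.nova-2-lite-v1:0"
      ]
    }
  ]
}
```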
The policies above are examples. Always consult the AWS Bedrock IAM documentation and/or your AWS admin for guidance on scoping permissions to your environment.
Replace XXXXXXXXXXXX with your AWS account ID, and add or remove inference profile ARNs to match the regions and Nova variants you plan to use.
Step 2: Match the inference profile prefix to the model identifier
Nova inference profiles are prefixed by routing scope:
- us.amazon.nova-* — routes within US regions.
- eu.amazon.nova-* — routes within EU regions.
- global.amazon.nova-* — routes globally.
The prefix must exactly match the model identifier you configure: if your IAM policy grants access to us.amazon.nova-2-lite-v1:0 but you call global.amazon.nova-2-lite-v1:0, the request fails with the same AccessDeniedException.
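To illustrate the matching rule, here is a small hypothetical checker (not part of Braintrust or AWS tooling) that tests whether a configured model identifier is covered by an inference-profile ARN in your policy's Resource list:

```python
def profile_allowed(model_id: str, policy_resources: list[str]) -> bool:
    """Return True if any policy resource is an inference-profile ARN
    whose profile ID exactly matches the configured model identifier."""
    return any(
        resource.endswith(f":inference-profile/{model_id}")
        for resource in policy_resources
    )

# Resources from the example policy above (XXXXXXXXXXXX is the account ID placeholder).
resources = [
    "arn:aws:bedrock:*::foundation-model/amazon.nova-2-lite-v1:0",
    "arn:aws:bedrock:us-east-1:XXXXXXXXXXXX:inference-profile/us.amazon.nova-2-lite-v1:0",
]

profile_allowed("us.amazon.nova-2-lite-v1:0", resources)      # matches: True
profile_allowed("global.amazon.nova-2-lite-v1:0", resources)  # wrong prefix: False
```

A mismatched prefix fails the check just as the real request fails with AccessDeniedException: the global.* identifier is not covered by a us.* inference-profile ARN.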