AWS Bedrock Setup

Enterprise Feature

AWS Bedrock requires a paid AxonFlow tier. Community users can start with OpenAI, Anthropic, Google Gemini, Azure OpenAI, or Ollama, then move to enterprise when Bedrock, runtime operations, and stronger governance guarantees become deployment requirements.

Bedrock is the right fit when your organization needs AWS-native model access, tighter regional control, or enterprise governance patterns built around AWS infrastructure.

Runtime Defaults

The Bedrock provider expects:

  • BEDROCK_REGION (required)
  • BEDROCK_MODEL (optional)

If you do not set BEDROCK_MODEL, AxonFlow defaults to:

anthropic.claude-sonnet-4-20250514-v1:0

The runtime detects model family from the model ID and currently supports Anthropic, Amazon Titan, Meta Llama, and Mistral model families.

Environment Variables

export BEDROCK_REGION=us-east-1
export BEDROCK_MODEL=anthropic.claude-sonnet-4-20250514-v1:0

Bedrock credentials use the standard AWS credential chain on the host where AxonFlow is running.

IAM Configuration

The ECS task role (or EC2 instance profile) running AxonFlow needs Bedrock invoke permissions:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:*::foundation-model/*"
    }
  ]
}

For tighter controls, scope the resource ARN to specific models:

arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-sonnet-4-20250514-v1:0
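
For example, a policy statement scoped to that single model (adjust the region and model ID to match your deployment) would look like:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:us-east-1::foundation-model/anthropic.claude-sonnet-4-20250514-v1:0"
    }
  ]
}
```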

Supported Model Families

| Family           | Example Model IDs                                                                 | Notes          |
| ---------------- | --------------------------------------------------------------------------------- | -------------- |
| Anthropic Claude | anthropic.claude-sonnet-4-20250514-v1:0, anthropic.claude-haiku-4-5-20251001-v1:0 | Default family |
| Amazon Titan     | amazon.titan-text-express-v1                                                      | AWS-native     |
| Meta Llama       | meta.llama3-1-70b-instruct-v1:0                                                   | Open-weight    |
| Mistral          | mistral.mistral-large-2407-v1:0                                                   | Open-weight    |

The runtime detects model family from the model ID prefix and uses the appropriate request/response format.
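
A minimal sketch of that prefix dispatch, using a hypothetical `detect_family` helper (the real runtime's mapping may differ, and regional inference-profile prefixes like `us.` would need extra handling):

```shell
# Hypothetical sketch of prefix-based model family detection.
# Assumption: plain foundation-model IDs without a regional prefix.
detect_family() {
  case "$1" in
    anthropic.*) echo "anthropic" ;;
    amazon.*)    echo "titan" ;;
    meta.*)      echo "llama" ;;
    mistral.*)   echo "mistral" ;;
    *)           echo "unknown" ;;
  esac
}

detect_family "anthropic.claude-sonnet-4-20250514-v1:0"   # anthropic
detect_family "amazon.titan-text-express-v1"              # titan
```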

YAML Configuration

version: "1.0"

llm_providers:
  bedrock:
    enabled: true
    config:
      region: ${BEDROCK_REGION}
      model: ${BEDROCK_MODEL:-anthropic.claude-sonnet-4-20250514-v1:0}
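
The `${BEDROCK_MODEL:-...}` fallback follows shell parameter-expansion semantics (assuming AxonFlow's config substitution mirrors them): the default applies only when the variable is unset or empty. A quick illustration:

```shell
# Default applies when BEDROCK_MODEL is unset or empty.
unset BEDROCK_MODEL
echo "${BEDROCK_MODEL:-anthropic.claude-sonnet-4-20250514-v1:0}"
# -> anthropic.claude-sonnet-4-20250514-v1:0

# An explicit value wins over the default.
BEDROCK_MODEL=anthropic.claude-haiku-4-5-20251001-v1:0
echo "${BEDROCK_MODEL:-anthropic.claude-sonnet-4-20250514-v1:0}"
# -> anthropic.claude-haiku-4-5-20251001-v1:0
```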

Good Fits

  • AWS-first regulated deployments
  • Organizations that want Bedrock model governance without changing their AxonFlow request pattern
  • Teams standardizing on AWS networking, IAM, and regional controls
  • Enterprises building governed AI platforms in AWS for healthcare, finance, legal, and other regulated workloads

Operational Notes

  • Bedrock is a configured provider for Proxy Mode and MAP.
  • In Gateway Mode, your application can still call Bedrock directly while AxonFlow handles pre-check and audit.
  • Network controls such as private connectivity and VPC endpoints are handled in your AWS environment, not through a special AxonFlow bedrock config flag.

Inference Profile IDs

Newer Anthropic models on Bedrock require inference profile IDs rather than direct model IDs for on-demand throughput. Inference profile IDs use a regional prefix:

us.anthropic.claude-sonnet-4-20250514-v1:0

If you receive ValidationException when using a direct model ID like anthropic.claude-sonnet-4-20250514-v1:0, switch to the inference profile ID with the us. (or your region's) prefix. The IAM resource ARN format for inference profiles is:

arn:aws:bedrock:us-east-1:*:inference-profile/us.anthropic.claude-sonnet-4-20250514-v1:0

AxonFlow passes the model ID to Bedrock as-is, so set BEDROCK_MODEL to the inference profile ID when required.
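
If you want to derive the prefix from the region, a hypothetical helper might look like this (the `us.`/`eu.`/`apac.` mapping is an assumption; verify the correct inference-profile ID in your Bedrock console):

```shell
# Hypothetical helper: prepend a regional inference-profile prefix
# to a model ID based on BEDROCK_REGION. Mapping is an assumption.
profile_id() {
  region="$1"; model="$2"
  case "$region" in
    us-*) echo "us.$model" ;;
    eu-*) echo "eu.$model" ;;
    ap-*) echo "apac.$model" ;;
    *)    echo "$model" ;;
  esac
}

profile_id us-east-1 anthropic.claude-sonnet-4-20250514-v1:0
# -> us.anthropic.claude-sonnet-4-20250514-v1:0
```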

Troubleshooting

| Issue | Cause | Fix |
| --- | --- | --- |
| AccessDeniedException | IAM role missing Bedrock permissions | Add bedrock:InvokeModel to the task role |
| ValidationException on model ID | Model not enabled in your account | Enable the model in the Bedrock console for your region |
| High latency (>2s) | Cross-region calls | Set BEDROCK_REGION to match your deployment region |
| ValidationException on newer Anthropic models | Direct model ID no longer accepted for on-demand throughput | Use the inference profile ID with a regional prefix (e.g., us.anthropic.claude-sonnet-4-20250514-v1:0) |
| Empty response | Model family not detected | Verify the model ID starts with a supported prefix (anthropic., amazon., meta., mistral.) |
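
The last two rows can be pre-checked locally before starting AxonFlow; a small sketch (the `check_bedrock_env` helper is hypothetical, and the supported-prefix list mirrors the table above):

```shell
# Hypothetical local sanity check: env vars and model ID prefix only.
# It cannot detect IAM or model-enablement issues, which surface at call time.
check_bedrock_env() {
  [ -n "$BEDROCK_REGION" ] || { echo "BEDROCK_REGION must be set" >&2; return 1; }
  model="${BEDROCK_MODEL:-anthropic.claude-sonnet-4-20250514-v1:0}"
  # Strip a regional inference-profile prefix (us./eu./apac.) before checking.
  bare="${model#us.}"; bare="${bare#eu.}"; bare="${bare#apac.}"
  case "$bare" in
    anthropic.*|amazon.*|meta.*|mistral.*) echo "OK: $model" ;;
    *) echo "WARN: unsupported model family prefix in '$model'" >&2; return 1 ;;
  esac
}

BEDROCK_REGION=us-east-1 BEDROCK_MODEL=meta.llama3-1-70b-instruct-v1:0 check_bedrock_env
# -> OK: meta.llama3-1-70b-instruct-v1:0
```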

Why Teams Usually Upgrade Here

Bedrock tends to matter when teams move beyond one or two experimental flows and need:

  • stronger cloud governance boundaries
  • AWS-native production deployment patterns
  • enterprise support and operational guarantees

Next Steps