LLM Provider Configuration File Setup

This guide shows how to configure LLM providers using YAML configuration files instead of environment variables. This is the recommended approach for production deployments of the Community edition.

Overview

AxonFlow supports a three-tier configuration priority:

  1. Database (Enterprise - Customer Portal managed)
  2. Config File (Community - YAML/JSON file)
  3. Environment Variables (Fallback)

This page covers the Config File approach for Community users.

Quick Start

1. Create Configuration File

Create a file named axonflow.yaml:

version: "1.0"

llm_providers:
  openai:
    enabled: true
    credentials:
      api_key: ${OPENAI_API_KEY}

  anthropic:
    enabled: true
    credentials:
      api_key: ${ANTHROPIC_API_KEY}

2. Set Environment Variable

Tell AxonFlow where to find your config file:

export AXONFLOW_CONFIG_FILE=/path/to/axonflow.yaml

3. Start Orchestrator

The orchestrator will automatically load the configuration on startup:

./orchestrator
# [Config File] Config file loader initialized: /path/to/axonflow.yaml
# [LLM Config] Loaded 2 providers from config_file (tenant: default)

Environment Variables

Variable                    Description
AXONFLOW_CONFIG_FILE        Primary: path to the unified config file
AXONFLOW_LLM_CONFIG_FILE    Alternative: path to an LLM-specific config file

The primary variable takes precedence if both are set.
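
For example, with both variables set (the paths here are illustrative), only the primary file is loaded:

export AXONFLOW_LLM_CONFIG_FILE=/etc/axonflow/llm.yaml
export AXONFLOW_CONFIG_FILE=/etc/axonflow/axonflow.yaml
# AXONFLOW_CONFIG_FILE wins: /etc/axonflow/axonflow.yaml is loaded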

Configuration File Format

Full Example

version: "1.0"

llm_providers:
  # OpenAI
  openai:
    enabled: true
    credentials:
      api_key: ${OPENAI_API_KEY}
    config:
      model: gpt-4o
      max_tokens: 4096
    priority: 10
    weight: 0.4

  # Anthropic
  anthropic:
    enabled: true
    credentials:
      api_key: ${ANTHROPIC_API_KEY}
    config:
      model: claude-3-5-sonnet-20241022
      max_tokens: 8192
    priority: 8
    weight: 0.4

  # AWS Bedrock (uses the AWS credential chain)
  bedrock:
    enabled: true
    config:
      region: us-east-1
      model: anthropic.claude-3-5-sonnet-20240620-v1:0
    priority: 5
    weight: 0.1

  # Ollama (self-hosted)
  ollama:
    enabled: true
    config:
      endpoint: http://localhost:11434
      model: llama3.1:70b
    priority: 3
    weight: 0.1

Provider Configuration

Each provider supports the following fields:

Field         Type      Required  Description
enabled       boolean   Yes       Whether the provider is active
credentials   map       Varies    Provider-specific credentials
config        map       Varies    Provider-specific configuration
priority      integer   No        Higher = preferred for failover
weight        float     No        Traffic distribution (0.0-1.0)
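
For example, the following split sends most routine traffic to OpenAI while also keeping it first in line for failover (illustrative values; the exact routing semantics depend on the orchestrator's load balancing):

llm_providers:
  openai:
    enabled: true
    credentials:
      api_key: ${OPENAI_API_KEY}
    priority: 10  # preferred failover target
    weight: 0.7   # ~70% of routine traffic
  anthropic:
    enabled: true
    credentials:
      api_key: ${ANTHROPIC_API_KEY}
    priority: 8
    weight: 0.3   # ~30% of routine traffic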

Environment Variable Expansion

Use ${VAR_NAME} syntax to reference environment variables:

credentials:
  api_key: ${OPENAI_API_KEY}
config:
  endpoint: ${OLLAMA_ENDPOINT:-http://localhost:11434}

The :- syntax provides default values if the variable is not set.
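
For example, given the endpoint entry above (the alternate hostname is illustrative):

# OLLAMA_ENDPOINT is unset -> endpoint resolves to http://localhost:11434
unset OLLAMA_ENDPOINT

# OLLAMA_ENDPOINT is set -> the default after :- is ignored
export OLLAMA_ENDPOINT=http://ollama.internal:11434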

Provider-Specific Configuration

OpenAI

openai:
  enabled: true
  credentials:
    api_key: ${OPENAI_API_KEY}
  config:
    model: gpt-4o      # Optional, defaults to gpt-4o
    max_tokens: 4096   # Optional

Anthropic

anthropic:
  enabled: true
  credentials:
    api_key: ${ANTHROPIC_API_KEY}
  config:
    model: claude-3-5-sonnet-20241022  # Optional
    max_tokens: 8192                   # Optional

AWS Bedrock

Bedrock uses the AWS credential chain (environment variables, IAM role, etc.):

bedrock:
  enabled: true
  config:
    region: us-east-1                                 # Required
    model: anthropic.claude-3-5-sonnet-20240620-v1:0  # Required

warning

Both region and model are required for Bedrock. The provider will be disabled if either is missing.
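
If the orchestrator is not running under an IAM role, one common way to satisfy the credential chain is the standard AWS SDK environment variables (placeholder values shown):

export AWS_ACCESS_KEY_ID=AKIA...
export AWS_SECRET_ACCESS_KEY=...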

Ollama (Self-Hosted)

ollama:
  enabled: true
  config:
    endpoint: http://localhost:11434  # Required
    model: llama3.1:70b               # Optional

warning

The endpoint is required for Ollama. The provider will be disabled without it.
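
Before enabling the provider, you can check that the endpoint is reachable and the model is available using Ollama's standard tags API:

curl http://localhost:11434/api/tags
# Lists locally pulled models; llama3.1:70b should appear in the output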

Hot Reloading

Config changes are picked up automatically through cache invalidation. The default cache TTL is 30 seconds, so edits to your config file take effect within 30 seconds.

tip

No manual refresh is needed. Simply edit your config file and the orchestrator will automatically pick up changes on the next request after the cache expires.
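
For example, taking a provider out of rotation is just an edit to the file; the change applies on the first request after the cache expires:

anthropic:
  enabled: false  # Takes effect within the 30-second cache TTL; no restart needed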

Validation and Error Handling

The config loader validates:

  • File exists and is accessible
  • File is not a directory
  • YAML syntax is valid
  • Required provider fields are present

Validation errors are logged but do not prevent startup; the orchestrator falls back to environment variables.

Common Errors

Error Message                                      Cause                     Solution
Config file not found                              File path incorrect       Check the AXONFLOW_CONFIG_FILE path
Permission denied                                  File not readable         Check file permissions
Config path is a directory                         Path points to a folder   Use the path to a YAML file
Failed to parse config file                        Invalid YAML              Validate the YAML syntax
Bedrock provider requires both region and model    Missing config            Add both region and model
Ollama provider requires endpoint                  Missing endpoint          Add endpoint to config
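
To catch syntax errors before deploying, you can parse the file locally, for example with Python's PyYAML (assuming python3 and PyYAML are installed):

python3 -c 'import yaml, sys; yaml.safe_load(open(sys.argv[1])); print("OK")' axonflow.yaml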

Docker Deployment

Mount your config file into the container:

# docker-compose.yaml
services:
  orchestrator:
    image: axonflow/orchestrator:latest
    environment:
      - AXONFLOW_CONFIG_FILE=/config/axonflow.yaml
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
    volumes:
      - ./axonflow.yaml:/config/axonflow.yaml:ro
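
The equivalent plain docker run invocation (same image and mount; -e VAR with no value forwards the variable from the host environment):

docker run -d \
  -e AXONFLOW_CONFIG_FILE=/config/axonflow.yaml \
  -e OPENAI_API_KEY \
  -e ANTHROPIC_API_KEY \
  -v "$PWD/axonflow.yaml:/config/axonflow.yaml:ro" \
  axonflow/orchestrator:latest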

Kubernetes Deployment

Use a ConfigMap for your configuration:

apiVersion: v1
kind: ConfigMap
metadata:
  name: axonflow-config
data:
  axonflow.yaml: |
    version: "1.0"
    llm_providers:
      openai:
        enabled: true
        credentials:
          api_key: ${OPENAI_API_KEY}
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: orchestrator
spec:
  selector:
    matchLabels:
      app: orchestrator
  template:
    metadata:
      labels:
        app: orchestrator
    spec:
      containers:
        - name: orchestrator
          image: axonflow/orchestrator:latest
          env:
            - name: AXONFLOW_CONFIG_FILE
              value: /config/axonflow.yaml
            - name: OPENAI_API_KEY
              valueFrom:
                secretKeyRef:
                  name: llm-credentials
                  key: openai-api-key
          volumeMounts:
            - name: config
              mountPath: /config
      volumes:
        - name: config
          configMap:
            name: axonflow-config
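
To wire this together, create the referenced Secret and apply the manifests (the manifest filename is illustrative):

kubectl create secret generic llm-credentials \
  --from-literal=openai-api-key="$OPENAI_API_KEY"
kubectl apply -f axonflow-k8s.yaml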

Migrating from Environment Variables

If you're currently using environment variables, migration is straightforward:

Before (Environment Variables)

export OPENAI_API_KEY=sk-xxx
export ANTHROPIC_API_KEY=sk-ant-xxx
export BEDROCK_REGION=us-east-1
export BEDROCK_MODEL=anthropic.claude-3-5-sonnet-20240620-v1:0

After (Config File)

# axonflow.yaml
version: "1.0"

llm_providers:
  openai:
    enabled: true
    credentials:
      api_key: ${OPENAI_API_KEY}  # Still uses env var for the secret

  anthropic:
    enabled: true
    credentials:
      api_key: ${ANTHROPIC_API_KEY}

  bedrock:
    enabled: true
    config:
      region: us-east-1
      model: anthropic.claude-3-5-sonnet-20240620-v1:0

Then point AxonFlow at the file, keeping the secrets in the environment:

export AXONFLOW_CONFIG_FILE=/path/to/axonflow.yaml
export OPENAI_API_KEY=sk-xxx  # Secrets still live in env vars
export ANTHROPIC_API_KEY=sk-ant-xxx

tip

Keep API keys in environment variables and reference them with ${VAR_NAME}. This is more secure than hardcoding credentials in config files.
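
As an extra precaution, you can also restrict read access to the config file itself with standard file permissions:

chmod 600 axonflow.yaml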

See Also