Tutorials

The tutorials section is designed for engineers who want a guided path through the product before they branch out into the larger example set.

These pages are especially useful for teams assessing whether AxonFlow can support real multi-agent or governed workflow development, not just isolated policy demos. The tutorial path is meant to get a developer from "the stack is running" to "I understand how governed requests, workflows, and upgrade paths fit together."

Available Tutorials

Your First Agent

Link: Your First Agent

What you will learn: How to start the AxonFlow stack locally, send your first governed LLM request through the agent gateway, and observe how policies evaluate that request. This tutorial covers the core developer loop: configure a provider, create a client, send a request, and inspect the audit result.
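The core loop described above can be sketched in plain Python. This is an illustrative stand-in only: the real AxonFlow client, policy, and audit APIs are introduced in the tutorial itself, and none of the names below come from the actual SDK. The sketch simulates policy evaluation in-process rather than calling a running gateway.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Illustrative stand-ins only; the real AxonFlow SDK surface is
# covered in the Your First Agent tutorial.

@dataclass
class AuditRecord:
    request_id: str
    policy_decision: str                      # "allow" or "deny"
    matched_rules: list = field(default_factory=list)
    timestamp: str = ""

def evaluate_policies(prompt: str) -> AuditRecord:
    """Toy policy check: deny prompts that mention blocked topics."""
    blocked = {"secrets", "credentials"}
    hits = [word for word in blocked if word in prompt.lower()]
    return AuditRecord(
        request_id="req-001",
        policy_decision="deny" if hits else "allow",
        matched_rules=hits,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

def send_governed_request(prompt: str) -> dict:
    """Core loop: evaluate the request, then (if allowed) call the provider."""
    audit = evaluate_policies(prompt)
    if audit.policy_decision == "deny":
        return {"status": "blocked", "audit": audit}
    # In a real deployment, this is where the configured LLM
    # provider would be called through the agent gateway.
    return {"status": "ok", "completion": "<model output>", "audit": audit}

result = send_governed_request("Summarize our deployment runbook")
print(result["status"], result["audit"].policy_decision)
```

The point of the shape, not the names: every request produces an audit record whether or not it reaches the provider, which is what "inspect the audit result" refers to in the loop above.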

Prerequisites: Docker and Docker Compose installed. An API key for at least one LLM provider (OpenAI, Anthropic, or any supported provider).

Estimated time: 15-20 minutes.

Best for: Engineers assessing AxonFlow for the first time, or anyone who wants to understand the basic request path before reading deeper documentation.

Workflow Examples

Link: Workflow Examples

What you will learn: How to build multi-step governed workflows using both the Multi-step Autonomous Pipeline (MAP) and Workflow Control Protocol (WCP) patterns. This tutorial walks through creating workflows with step gates, tool gates, and human-in-the-loop approval checkpoints.
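The gating pattern can be simulated in a few lines of Python. This is not the MAP or WCP API; it is a hypothetical sketch that models step gates and the approval checkpoint as plain functions, to show why execution halts as soon as any gate fails.

```python
# Illustrative simulation of the gating pattern only; the actual
# MAP/WCP APIs are introduced in the Workflow Examples tutorial.

def require_approval(step_name: str) -> bool:
    """Human-in-the-loop checkpoint: stand-in for a real approval UI."""
    print(f"[approval requested] {step_name}")
    return True  # pretend the reviewer approved

def run_workflow(steps: list) -> list:
    """Run steps in order; every gate must pass before a step executes."""
    completed = []
    for step in steps:
        if not step["gate"](step):  # step gate: policy check before the step runs
            print(f"blocked at step gate: {step['name']}")
            break
        if step.get("needs_approval") and not require_approval(step["name"]):
            print(f"rejected at approval checkpoint: {step['name']}")
            break
        step["run"]()  # the step's actual work (e.g. an LLM or tool call)
        completed.append(step["name"])
    return completed

steps = [
    {"name": "draft", "gate": lambda s: True, "run": lambda: None},
    {"name": "call-tool", "gate": lambda s: True,
     "needs_approval": True, "run": lambda: None},
    {"name": "publish", "gate": lambda s: False, "run": lambda: None},
]
done = run_workflow(steps)
print(done)
```

Running the sketch completes "draft" and "call-tool" (after its approval checkpoint) and stops at "publish", whose gate fails. Governance applying "at each stage" means exactly this: a failed gate at any step prevents every later step from running.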

Prerequisites: Completion of the Your First Agent tutorial. Familiarity with at least one AxonFlow SDK (Python, TypeScript, Go, or Java).

Estimated time: 30-45 minutes.

Best for: Teams building production AI applications that involve multiple steps, tool calls, or approval workflows where governance needs to apply at each stage.

What You Should Learn Here

The tutorial path is meant to help you answer four questions quickly:

  • How do I get a request through AxonFlow successfully?
  • How do policies shape request and response behavior?
  • How do SDKs and integrations fit around the runtime?
  • How should I think about AxonFlow as a control plane for multi-agent or workflow-driven systems?

For many teams, that is also the point where the commercial story becomes clearer:

  • Community is enough to learn and prototype
  • Evaluation becomes attractive when the workflow needs richer governance features
  • Enterprise becomes the natural fit once the product is moving toward company-wide or regulated usage

Suggested Order

  1. Complete Your First Agent.
  2. Read Workflow Examples once you understand the core request path.
  3. Move into Examples Overview for use-case-specific implementation patterns.

Prerequisites for All Tutorials

Before starting any tutorial, make sure you have:

  • Docker and Docker Compose installed and running. All tutorials use the local compose stack.
  • An LLM provider API key. At minimum, an OpenAI or Anthropic key. The tutorials will show you how to configure this in the compose environment.
  • curl or an HTTP client for making API calls. The tutorials show curl commands, but any HTTP client works.
  • A terminal and basic familiarity with running shell commands.

If you plan to follow the Workflow Examples tutorial with SDK code instead of raw HTTP, you will also need the runtime for your chosen language (Python 3.10+, Node.js 18+, Go 1.21+, or Java 17+) and the corresponding AxonFlow SDK installed.

What Comes After Tutorials

Tutorials are intentionally narrower than the rest of the docs. Once you have the basics, the next step depends on your goal:

If you are assessing AxonFlow for a serious internal platform or customer-facing AI product, the tutorial section should be treated as the first guided pass, not the final destination. The next useful step is usually the SDK, integration, MCP, and governance sections that match your actual architecture.