Getting Started with Octopus
Octopus is the multi-model, multi-agent AI orchestration engine built into the BizFirstAi platform. This guide gets you from zero to a running agent pipeline in under five minutes.
Platform requirement: Octopus is part of BizFirstAi. You will need a BizFirstAi account to use the SDK and hosted runtime; sign-up is free.
Installation
Add the Octopus NuGet package to your .NET project:
```shell
dotnet add package BizFirstAi.Octopus
```
Basic Usage
The following example creates an Octopus client, registers the OpenAI provider, defines a simple agent, runs a prompt, and prints the result.
```csharp
using BizFirstAi.Octopus;
using BizFirstAi.Octopus.Providers;
using BizFirstAi.Octopus.Agents;

// 1. Create the Octopus client
var client = new OctopusClient(new OctopusOptions
{
    ApiKey = Environment.GetEnvironmentVariable("BIZFIRSTAI_API_KEY"),
    Workspace = "my-workspace"
});

// 2. Register the OpenAI provider
client.Providers.Add(new OpenAiProvider
{
    ApiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY"),
    Model = "gpt-4o"
});

// 3. Define an agent
var agent = new OctopusAgent
{
    Name = "research-agent",
    Description = "Answers questions with concise, factual responses.",
    Provider = "openai",
    SystemPrompt = "You are a helpful research assistant. Be concise and factual."
};

// 4. Register the agent
await client.Agents.RegisterAsync(agent);

// 5. Run a prompt
var result = await client.Agents.RunAsync("research-agent", new AgentRunRequest
{
    Prompt = "What is the Agentic Node Communication Protocol?"
});

// 6. Print the result
Console.WriteLine(result.Output);
Console.WriteLine($"Tokens used: {result.TokensUsed} Cost: ${result.EstimatedCostUsd:F4}");
```
Core Concepts
Providers
A Provider is a connection to an LLM service. Octopus supports multiple providers registered on the same client. Routing rules determine which provider handles each agent invocation.
| Provider | Class | Notes |
|---|---|---|
| OpenAI | `OpenAiProvider` | Supports GPT-4o, GPT-4o mini, and all chat completion models |
| Anthropic | `AnthropicProvider` | Supports Claude 3.5 Sonnet, Claude 3 Haiku, and Claude 3 Opus variants |
| DeepSeek | `DeepSeekProvider` | DeepSeek-V3 and DeepSeek-R1 series |
| Google Gemini | `GeminiProvider` | Gemini 1.5 Pro and Flash variants |
| Ollama | `OllamaProvider` | Self-hosted models via a local Ollama instance |
| Custom | `CustomHttpProvider` | Any OpenAI-compatible HTTP endpoint |
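Because multiple providers can be registered on the same client, a second provider is simply another `client.Providers.Add(...)` call. The sketch below assumes `AnthropicProvider` takes the same `ApiKey`/`Model` shape as the `OpenAiProvider` shown earlier; those property names are an assumption, not confirmed SDK surface.

```csharp
// Sketch: two providers on one client, so routing rules can pick between them.
// AnthropicProvider's property names mirror OpenAiProvider and are assumed.
client.Providers.Add(new OpenAiProvider
{
    ApiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY"),
    Model = "gpt-4o"
});

client.Providers.Add(new AnthropicProvider
{
    ApiKey = Environment.GetEnvironmentVariable("ANTHROPIC_API_KEY"),
    Model = "claude-3-5-sonnet-latest"
});
```

Each agent then names its provider (as `research-agent` does with `"openai"` above), and routing decides which registered connection serves the call.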
Agents
An Agent is a named, configurable unit of AI work. It wraps a provider connection with a system prompt, a set of tools, memory configuration, and lifecycle hooks. Agents are the building blocks of every Octopus pipeline.
- Each agent is bound to one primary provider, with optional fallbacks.
- Agents maintain their own context window, with automatic summarisation when limits are approached.
- Tool definitions are registered per-agent and injected into the model call automatically.
- Agents can share a pipeline-scoped memory store for cross-agent context passing.
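The bullet points above can be sketched as a single agent definition. The `Fallbacks` and `Tools` property names and the `OctopusTool` constructor shown here are assumptions chosen to illustrate the concepts, not confirmed SDK surface.

```csharp
// Sketch of the concepts above: one primary provider with a fallback,
// plus a per-agent tool. Property names are assumed, not verified.
var summariser = new OctopusAgent
{
    Name = "summariser",
    Description = "Condenses long text into short bullet summaries.",
    Provider = "openai",                 // primary provider
    Fallbacks = new[] { "anthropic" },   // tried if the primary call fails
    SystemPrompt = "Summarise the input text in three bullet points.",
    Tools =
    {
        // Tool definitions are injected into the model call automatically.
        new OctopusTool("fetch_url", "Fetches a URL and returns its text content")
    }
};

await client.Agents.RegisterAsync(summariser);
```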
Pipelines
A Pipeline is an ordered graph of agents and routing logic. Pipelines support sequential execution, parallel branches, conditional routing, and recursive spawning. You can define a pipeline using the visual designer or the Octopus SDK.
- Pipelines are versioned and can be promoted between environments.
- Each pipeline run produces a full execution trace logged in BizFirst Observe.
- Pipelines can be triggered via webhook, SDK call, scheduled cron, or from within a BizFirstAi Flow workflow.
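As a rough illustration of SDK-defined pipelines, the sketch below chains the two agents from earlier sections into a sequential run. The `OctopusPipeline` builder, `client.Pipelines` accessor, and `PipelineRunRequest` type are hypothetical names invented for this sketch; consult the SDK reference for the actual pipeline API.

```csharp
// Sketch: a two-step sequential pipeline, triggered via an SDK call.
// All pipeline-related type and member names here are assumptions.
var pipeline = new OctopusPipeline("research-and-summarise")
    .AddStep("research-agent")   // step 1: gather facts
    .AddStep("summariser");      // step 2: condense the output

await client.Pipelines.RegisterAsync(pipeline);

var run = await client.Pipelines.RunAsync("research-and-summarise", new PipelineRunRequest
{
    Prompt = "Explain the Agentic Node Communication Protocol."
});

Console.WriteLine(run.Output);   // final step's output; full trace lands in BizFirst Observe
```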
Next Steps
Ready to go further? Explore these resources: