Octopus combines deep AI reasoning with workflow automation — every agent decision is planned, tool-assisted, and fully auditable from the first token to the final result.
Octopus is a workflow-driven AI Agent Engine. Agents reason, plan, call tools, execute sub-workflows, and explain every decision — all inside a single auditable runtime.
Each agent uses its assigned LLM to reason step-by-step, form plans, and select tools dynamically. The engine surfaces the chain-of-thought so every decision is explainable and reviewable.
AI agents are first-class citizens inside ProcessServer workflows. An AI Agent Node runs a full Octopus agent as one workflow step — combining intelligent reasoning with 26+ standard executors like IF/Switch, HTTP Request, Loop, and Human Approval.
Every agent action, tool call, LLM response, and workflow step is written to an immutable audit log. Debug, replay, and prove compliance at any point in time — nothing is a black box.
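One common way to make an append-only log tamper-evident is to hash-chain its records. The sketch below is illustrative only, not Octopus's actual implementation; the `AuditLog` class and its record shape are assumptions.

```python
import hashlib
import json

class AuditLog:
    """Append-only log sketch: each record hashes the previous one, so any
    after-the-fact edit breaks the chain and is detectable on replay."""

    def __init__(self):
        self.records = []

    def append(self, event):
        prev = self.records[-1]["hash"] if self.records else "genesis"
        body = json.dumps(event, sort_keys=True)
        digest = hashlib.sha256((prev + body).encode()).hexdigest()
        self.records.append({"event": event, "prev": prev, "hash": digest})

    def verify(self):
        # Replay the chain; any mutated record breaks every hash after it.
        prev = "genesis"
        for rec in self.records:
            body = json.dumps(rec["event"], sort_keys=True)
            if rec["prev"] != prev:
                return False
            if hashlib.sha256((prev + body).encode()).hexdigest() != rec["hash"]:
                return False
            prev = rec["hash"]
        return True

log = AuditLog()
log.append({"step": "tool_call", "tool": "get_order_status"})
log.append({"step": "llm_response", "tokens": 42})
intact = log.verify()
log.records[0]["event"]["tool"] = "tampered"   # simulate an edit
tampered_detected = not log.verify()
```

The same replay that powers debugging doubles as an integrity check: a verifier can walk the chain at any point in time and prove no step was altered.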
Connect any major LLM provider — or run local models — without changing your agent logic. Each agent carries its own model config independently from every other agent.
First-class support for today's leading LLM providers.
Every agent carries a fully independent LLM configuration.
Define ordered fallback sequences that keep pipelines running even when a provider is unavailable or over budget.
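An ordered fallback chain can be sketched as a list of per-agent model configs tried in sequence. This is a minimal illustration under assumed names (`ModelConfig`, `ProviderUnavailable`, the `call` hook), not the engine's real API.

```python
from dataclasses import dataclass

@dataclass
class ModelConfig:
    provider: str
    model: str

class ProviderUnavailable(Exception):
    """Raised when a provider is down, rate-limited, or over budget."""

def complete_with_fallback(prompt, chain, call):
    """Try each provider in order; return the first successful completion."""
    errors = []
    for cfg in chain:
        try:
            return cfg.provider, call(cfg, prompt)
        except ProviderUnavailable as exc:
            errors.append((cfg.provider, str(exc)))
    raise RuntimeError(f"all providers failed: {errors}")

# Demo: the primary provider is "down", so the chain falls through to a backup.
def fake_call(cfg, prompt):
    if cfg.provider == "primary":
        raise ProviderUnavailable("rate limited")
    return f"[{cfg.model}] answer to: {prompt}"

chain = [ModelConfig("primary", "gpt-large"), ModelConfig("backup", "local-llm")]
provider, text = complete_with_fallback("hello", chain, fake_call)
```

Because the chain lives in each agent's own config, one agent can fail over to a local model while another keeps a cloud-only sequence.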
ProcessServer hosts two special AI executors alongside 26+ standard nodes — letting you blend deterministic workflow logic with generative AI in a single, unified pipeline.
Runs a complete Octopus agent as a single workflow step inside ProcessServer.
Describe logic in plain language — the LLM generates executable code at runtime.
AI nodes and standard executors work side by side in a single workflow graph.
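Mixing deterministic executors with AI nodes in one graph might look like the toy runner below. The node names and dispatch loop are invented for illustration; in a real ProcessServer workflow the AI node would call an agent or an LLM-generated function.

```python
# Deterministic node: a plain IF branch, no model involved.
def if_node(ctx):
    return "ai_step" if ctx["amount"] > 100 else "done"

# AI node stand-in: a real AI Function Node would invoke the LLM here.
def ai_function_node(ctx):
    ctx["summary"] = f"review required for amount {ctx['amount']}"
    return "done"

NODES = {"check": if_node, "ai_step": ai_function_node}

def run(start, ctx):
    """Walk the graph: each node returns the name of the next node."""
    node = start
    while node != "done":
        node = NODES[node](ctx)
    return ctx

result = run("check", {"amount": 250})
```

The point of the sketch is the uniform contract: an IF/Switch executor and an AI node are both just steps that read context and name a successor, so they compose freely in one pipeline.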
Octopus implements the Model Context Protocol for tool calling — giving agents a standardised, discoverable way to act on the world beyond text generation.
A production-ready tool library ships with Octopus.
Extend the tool library with your own business-specific tools in minutes.
The WebDriver plugin gives agents full browser control via Selenium and Playwright.
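MCP describes each tool with a name, a description, and a JSON-Schema-style input contract, so agents can discover what a server offers before calling it. The registry below is a simplified sketch of that shape, with a hypothetical `get_order_status` tool; it is not the Octopus tool API itself.

```python
TOOLS = {}

def register_tool(name, description, input_schema, fn):
    """Register a tool with an MCP-style descriptor plus its implementation."""
    TOOLS[name] = {"description": description, "inputSchema": input_schema, "fn": fn}

def list_tools():
    # What an agent sees when it asks the server what it can do.
    return [{"name": n, "description": t["description"], "inputSchema": t["inputSchema"]}
            for n, t in TOOLS.items()]

def call_tool(name, arguments):
    return TOOLS[name]["fn"](**arguments)

register_tool(
    "get_order_status",
    "Look up the status of an order by id.",
    {"type": "object",
     "properties": {"order_id": {"type": "string"}},
     "required": ["order_id"]},
    lambda order_id: {"order_id": order_id, "status": "shipped"},
)

catalog = list_tools()
result = call_tool("get_order_status", {"order_id": "A-17"})
```

Discovery plus a typed input contract is what makes the tools "standardised": an agent never needs bespoke glue code per tool, it just reads the catalog and fills the schema.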
Octopus retrieves relevant knowledge before every LLM call, so agents answer from your actual business data rather than model priors. Fully configurable per agent, per tenant.
Ingest enterprise content from wherever it lives today.
Vector similarity search returns the most relevant chunks before each LLM prompt.
Organise documents into named, reusable collections — assigned per agent.
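Retrieval before each prompt typically means scoring stored chunk embeddings against the query embedding and keeping the top-K above a threshold. The snippet below sketches that with cosine similarity and tiny hand-made vectors; real embeddings have hundreds of dimensions and come from an embedding model.

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, collection, top_k=2, threshold=0.0):
    """Return the top_k chunks whose similarity clears the threshold."""
    scored = [(cosine(query_vec, vec), chunk) for chunk, vec in collection]
    scored = [(s, c) for s, c in scored if s >= threshold]
    scored.sort(key=lambda sc: sc[0], reverse=True)
    return [chunk for _, chunk in scored[:top_k]]

# A toy named collection: (chunk text, embedding) pairs.
returns_collection = [
    ("refund policy: 30 days", [0.9, 0.1, 0.0]),
    ("shipping times: 3-5 days", [0.1, 0.9, 0.0]),
    ("warranty: 1 year", [0.5, 0.5, 0.1]),
]
hits = retrieve([1.0, 0.0, 0.0], returns_collection, top_k=2)
```

`top_k` and `threshold` correspond to the per-collection tuning knobs described above: raising the threshold trades recall for precision in what reaches the prompt.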
ConversationComposite carries the full interaction state from the first message to the final outcome — across agents, sessions, and users — with nothing lost between turns.
Every turn is remembered and made available to the LLM within the configured context window.
Conversations can be paused, handed off to another agent, and resumed without losing a single byte of state.
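Pause, hand-off, and resume all reduce to the same requirement: the full conversation state must serialise losslessly. A minimal sketch, assuming a `Conversation` shape with an id, an owning agent, and a turn list (the real ConversationComposite carries much more):

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class Turn:
    role: str
    text: str

@dataclass
class Conversation:
    conversation_id: str
    agent_id: str
    turns: list = field(default_factory=list)

    def add(self, role, text):
        self.turns.append(Turn(role, text))

    def suspend(self):
        # Full state serialises to JSON for persistence or hand-off.
        return json.dumps(asdict(self))

    @classmethod
    def resume(cls, blob, agent_id=None):
        data = json.loads(blob)
        data["turns"] = [Turn(**t) for t in data["turns"]]
        if agent_id:
            data["agent_id"] = agent_id  # hand-off to another agent
        return cls(**data)

conv = Conversation("c-1", "triage-agent")
conv.add("user", "My order is late.")
blob = conv.suspend()
restored = Conversation.resume(blob, agent_id="escalation-agent")
```

Because `suspend` round-trips through plain data, the receiving agent rehydrates exactly what the first agent saw, which is what "without losing a single byte of state" demands.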
Users watch every step of agent reasoning and execution as it happens — no polling, no page refresh.
Octopus is built on a composable plugin model. Five plugins are complete and production-ready today; the ecosystem grows with every release.
Composite persistence for all Octopus objects via SQL Server. Efficient selective hydration and caching keep query costs low even as conversation histories grow.
Microsoft Semantic Kernel integration for LLM orchestration. Enables local model execution and memory connectors — keeping data on-premises when required by policy.
Selenium and Playwright browser automation surfaced as a standard MCP tool. Agents can navigate, extract, fill forms, and screenshot any web page as part of their tool-calling chain.
Embeddable chatbot component that connects directly to the Octopus runtime via SignalR. Fully brandable — drop it into any web app and your users interact with agents instantly.
The full ProcessServer workflow engine as a plugin. Brings AI Agent Node, AI Function Node, and 26+ standard executors into the Octopus runtime as a single installable unit.
The plugin SDK is open. Implement a plugin interface, declare your capabilities, and the Octopus runtime discovers and loads your plugin automatically. The extension-app gives admins a UI for install, configure, and health monitoring.
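The contract described above, implement an interface, declare capabilities, get discovered automatically, can be sketched with a base class that self-registers its subclasses. The class names and `health()` shape here are assumptions for illustration, not the SDK's real surface.

```python
REGISTRY = {}

class OctopusPlugin:
    """Base contract: subclass it, declare capabilities, and the runtime
    discovers the plugin the moment the class is defined."""
    name = "unnamed"
    capabilities = ()

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        REGISTRY[cls.name] = cls  # discovery happens at definition time

    def health(self):
        # Hook the extension-app could poll for its monitoring UI.
        return {"name": self.name, "status": "ok"}

class WebDriverPlugin(OctopusPlugin):
    name = "webdriver"
    capabilities = ("browser.navigate", "browser.screenshot")

# The runtime instantiates whatever was discovered and checks health.
plugins = {name: cls() for name, cls in REGISTRY.items()}
report = [p.health() for p in plugins.values()]
```

In a compiled runtime the same idea is usually done with assembly scanning or a manifest rather than subclass hooks, but the shape is identical: plugins announce themselves, the host never hard-codes them.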
TenantID is enforced at every service layer — AgentService, ConversationService, KnowledgeService, ProcessEngine. No cross-tenant data is ever accessible, even by accident.
Complete data separation baked into the architecture — not a filter layer bolted on top.
Access control is defined at the resource level — not just the service level.
Immutable records and enterprise-grade encryption protect every interaction end to end.
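Enforcing TenantID at the service layer means the tenant is part of every lookup key, not a filter applied afterwards. A minimal sketch with an assumed `ConversationStore` (in-memory here, SQL Server in the real system):

```python
class CrossTenantAccess(Exception):
    pass

class ConversationStore:
    """Every read is scoped by tenant_id; a mismatched id raises, never leaks."""

    def __init__(self):
        self._rows = {}  # (tenant_id, conv_id) -> data

    def save(self, tenant_id, conv_id, data):
        self._rows[(tenant_id, conv_id)] = data

    def get(self, tenant_id, conv_id):
        key = (tenant_id, conv_id)
        if key not in self._rows:
            # Same error whether the row is missing or belongs to another
            # tenant, so even existence never leaks across tenant boundaries.
            raise CrossTenantAccess(f"no conversation {conv_id} for {tenant_id}")
        return self._rows[key]

store = ConversationStore()
store.save("acme", "c-1", {"turns": 3})
ok = store.get("acme", "c-1")
try:
    store.get("globex", "c-1")   # another tenant, same conversation id
    leaked = True
except CrossTenantAccess:
    leaked = False
```

Because the tenant is baked into the key, there is no code path that can return another tenant's row "by accident", which is the architectural point rather than a bolted-on filter.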
Every role gets a focused micro-frontend, all loading into a shared shell via Module Federation (React/Next.js). Admin, builder, end-user, and developer — each sees exactly what they need.
Admin and builder interface for creating and managing agents. Configure LLM providers, prompt templates, tool registrations, knowledge collection assignments, fallback chains, and related agent hand-offs — all without touching code.
End-user multi-turn conversation interface. Connected via SignalR — every tool call, LLM reasoning step, and workflow node transition is visible to the user as it happens. Structured step log available for post-run review.
Knowledge curator interface for document upload, collection management, and retrieval tuning. Upload PDFs, Word docs, and HTML pages; organise them into named collections; adjust top-K, similarity threshold, and chunk size per collection.
Developer and admin interface for the plugin ecosystem. Install, configure, and monitor plugin health in one place. Supports all five current plugins and any future custom plugins that implement the standard plugin interface.
Three rich composite objects carry the complete runtime state for every agent, conversation, and user. Selective hydration keeps memory lean; full rehydration restores every detail instantly.
The complete definition of an agent, loaded in one object.
Full multi-turn state persisted across every interaction.
Rich user model linking identity, access, and history.
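Selective hydration usually means expensive fields load lazily on first access and are cached afterwards. The descriptor below sketches that pattern; the `ConversationComposite` shape and `load_history` loader are illustrative assumptions, not the persistence plugin's actual code.

```python
class LazyField:
    """Descriptor that loads an expensive field only on first access."""

    def __init__(self, loader):
        self.loader = loader

    def __set_name__(self, owner, name):
        self.name = name

    def __get__(self, obj, objtype=None):
        if obj is None:
            return self
        if self.name not in obj.__dict__:
            obj.__dict__[self.name] = self.loader(obj)  # hydrate once
            obj.loads += 1
        return obj.__dict__[self.name]

def load_history(conv):
    # Stand-in for a database read of the full turn history.
    return [f"turn-{i}" for i in range(3)]

class ConversationComposite:
    history = LazyField(load_history)

    def __init__(self, conv_id):
        self.conv_id = conv_id
        self.loads = 0

conv = ConversationComposite("c-1")
before = conv.loads      # nothing hydrated yet
first = conv.history     # triggers exactly one load
again = conv.history     # served from the instance cache
```

Full rehydration is then just touching every lazy field after restoring the lightweight shell, so memory stays lean until a detail is actually needed.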
The MVP is complete and production-ready. The roadmap extends Octopus with deeper memory, richer models, and self-improving agent capabilities across the next two phases.
Deeper intelligence and collaborative agents arriving in the next release cycle.
Foundation model capabilities and agentic self-improvement on the horizon.
Two additional plugins are on the confirmed roadmap to extend the storage and retrieval ecosystem.