
Spindrel

Self-hosted AI agent server with persistent channels, composable expertise, workspace-driven memory, multi-step workflows, and a pluggable integration framework.

Built on FastAPI + PostgreSQL (pgvector). Bring your own API keys — use any LLM provider.

Early Access

Spindrel is under active development and in daily use by the maintainer. Core features are stable, but APIs, configuration formats, and database schemas may change between releases. Bug reports, feature requests, and contributions are welcome.


  • Any LLM Provider


    OpenAI, Anthropic, Gemini, Ollama, OpenRouter, vLLM — or any OpenAI-compatible endpoint. Mix providers across bots. Automatic retry with fallback models. Cost tracking via LiteLLM pricing data.

  • Composable Expertise (Carapaces)


    Snap-on skillsets that bundle tools, knowledge, and behavioral instructions. Give a bot `carapaces: [qa, code-review]` and it instantly knows how to test and review code. Carapaces compose via includes for layered expertise.
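
    As a rough sketch, a bot definition might attach carapaces like this — note that every field name here except `carapaces` is an illustrative assumption, not Spindrel's documented schema:

    ```yaml
    # Hypothetical bot config -- field names other than `carapaces`
    # are assumptions for illustration only.
    name: reviewer-bot
    provider: anthropic              # any configured LLM provider
    carapaces: [qa, code-review]     # snap-on skillsets as described above
    ```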

  • Workspace Memory + Conversation Continuity


    Bots maintain `MEMORY.md`, daily logs, and reference docs — all on disk, all indexed for RAG. Conversations are automatically archived into searchable sections that persist across fresh starts. Per-channel file stores with schema templates keep project context structured.

  • Workflows


    Reusable multi-step automations defined in YAML. Conditions, approval gates, parallel branches, cross-bot delegation, and scoped secrets. Trigger via API, bot tool, or heartbeat. Manage and monitor from the admin UI.
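
    A workflow combining the features above might look roughly like this — the structure and key names are assumptions sketched for illustration, not the actual workflow schema:

    ```yaml
    # Hypothetical workflow -- keys are illustrative assumptions, not the real schema.
    name: triage-and-fix
    steps:
      - id: triage
        run: analyze_issue
      - id: approve
        type: approval_gate          # pauses until a human approves
        when: triage.severity == "high"
      - id: fix
        run: open_pull_request
        after: approve               # runs only once the gate clears
    ```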

  • Heartbeats + Task Scheduling


    Periodic autonomous check-ins with quiet hours and repetition detection. Schedule one-off or recurring tasks with cron-like flexibility. Bots can self-schedule via `schedule_task`. Results dispatch to Slack, webhooks, or the UI.

  • Integration Activation + Templates


    Activate an integration on a channel and it instantly gets the right tools, skills, and behavioral instructions — no manual configuration. Pick a compatible workspace template and the bot knows exactly how to organize files. One click to go from blank channel to structured project.

  • Integration Framework


    Pluggable integrations with auto-discovery. Shipped: Slack, GitHub, Discord, Gmail, Frigate, Mission Control, Arr, Claude Code, BlueBubbles, Ingestion. Each provides routers, dispatchers, tools, and lifecycle hooks. Extend with your own via `INTEGRATION_DIRS`.

  • Usage Tracking + Budgeting


    Per-bot token usage and cost tracking. Budget limits with configurable enforcement. Usage forecasting and breakdown by model. Powered by LiteLLM pricing data when available. Cost data is best-effort — always verify against your provider's billing dashboard.

  • Web Search


    Built-in web search via SearXNG (self-hosted) or DuckDuckGo (zero-config). Switch backends at runtime from the admin UI. No external API keys required.

  • Docker Sandboxes


    Long-lived containers for isolated code execution. Per-bot sandbox profiles with configurable images, mount points, and resource limits. Scope modes: session, client, agent, or shared.
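
    A per-bot sandbox profile covering those knobs could look something like this — the keys shown are assumptions for illustration; only the four scope modes are taken from the description above:

    ```yaml
    # Hypothetical sandbox profile -- keys are illustrative assumptions.
    sandbox:
      image: python:3.12-slim
      scope: session           # one of: session, client, agent, shared
      mounts:
        - ./workspace:/work
      limits:
        cpus: "1.0"
        memory: 512m
    ```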


Quick Start

git clone https://github.com/mtotho/spindrel.git
cd spindrel
bash setup.sh          # interactive wizard: deployment, LLM provider, auth
docker compose up -d

The setup wizard configures `.env`, starts services, and creates a default bot. The Orchestrator bot then guides you through the rest conversationally.

Full setup guide

Guides