
Setting up

TL;DR: Jump straight into building your first agent in minutes.

Start here

Follow our quickstart guide.

Introduction

Soma is an open-source, self-hostable AI agent & workflow runtime designed for everyone from startups through to enterprise. It ships as a single binary and provides a security and governance plane across your agents. You can use Soma to build many different types of workflows and agents using plain old code; see the quickstart guide for more details. Key features include:
  • Define your custom MCP functions and AI agents in regular code (TypeScript supported, Python coming soon); a rough sketch follows this list
  • Bring whichever tools and supporting frameworks you would like into Soma (e.g. LangChain, Vercel AI SDK, etc.)
  • Make your code fault-tolerant and resumable. Crash or suspend execution at any point and resume from where it left off
  • Automatically get A2A (Agent2Agent) and (coming soon) OpenAI Streaming compatible API endpoints to message your agent
  • An MCP server pre-integrated with well-known third-party SaaS providers that handles credential encryption and rotation for you
  • A UI to debug your Agent & MCP functions, configure MCP provider credentials and see historical agent executions and chats
  • Deployable to local desktop or cloud environments
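As a rough illustration of the first two bullets, here is what an agent handler might look like as plain TypeScript, using the Vercel AI SDK for the model call. The function name is hypothetical and the way a handler is registered with Soma is intentionally not shown here; see the quickstart guide for the actual SDK surface.

```typescript
// Illustrative sketch only: the handler name is hypothetical, and the way it
// is registered with Soma is intentionally omitted (see the quickstart).
// The model call uses the Vercel AI SDK, one of the frameworks listed above
// that you can bring into Soma.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// A plain TypeScript function acting as an agent's message handler.
export async function supportAgent(userMessage: string): Promise<string> {
  const { text } = await generateText({
    model: openai("gpt-4o-mini"), // any model your provider supports
    system: "You are a concise support assistant.",
    prompt: userMessage,
  });
  return text;
}
```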
Features that are landing soon:
  • Fine-grained API key access management to restrict access to your agent
  • Human-in-the-loop approvals before actions are executed
  • Outbound AI gateway to intercept all agent requests to model providers for deeper observability
This is all packaged into a single binary and leverages:
  • Restate for fault tolerance
  • Turso (SQLite) for data storage requirements
  • Local, AWS or (soon) GCP KMS encryption for secrets management (MCP credentials, API keys, Agent secrets)
See our deployment guide for self-hosting instructions.

Framework vs runtime

We view Soma as the bottom of the pyramid: it provides the core building blocks for your agent along with a DX that increases developer velocity. You can use any framework you want to build your agent, including:
  • Vercel AI SDK
  • Langchain
  • OpenAI Agents SDK
Whilst we don’t have lower-level language bindings for all of these libraries just yet, you can still use them with Soma directly. Soma provides:
  • A secure MCP server for your agent to use
  • Ready-to-use OpenAI Streaming and Agent2Agent (A2A) compatible API endpoints to message your agent (a client sketch follows this list)
  • MCP and agent chat debug tools
  • An easy path from local development to production
  • API credential management (i.e. protect the chat endpoints via API key, OAuth, etc.)
  • Human-in-the-loop approvals
  • Fault tolerance and resumability
all of which complement your chosen framework and agent / workflow business logic.
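To give a concrete feel for the OpenAI Streaming compatible endpoint, below is a minimal client sketch using the official openai npm package. The base URL, the SOMA_API_KEY variable and the model id are placeholder assumptions rather than documented values, and as noted in the feature list the OpenAI Streaming endpoint is still landing.

```typescript
// Minimal client sketch, assuming a locally running Soma deployment.
// The base URL, SOMA_API_KEY variable and "my-agent" model id are
// placeholders; check your own deployment for the real values.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:8080/v1", // assumed endpoint, not documented here
  apiKey: process.env.SOMA_API_KEY ?? "",
});

async function main() {
  // Stream the agent's reply token by token via the OpenAI-compatible API.
  const stream = await client.chat.completions.create({
    model: "my-agent", // assumed: the agent's name used as the model id
    messages: [{ role: "user", content: "Summarise my open tickets" }],
    stream: true,
  });

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
  }
}

main().catch(console.error);
```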

SDK support matrix

Language     Platform                   Status
TypeScript   Mac OSX X86                🟢
             Mac OSX AARCH / ARM        🟢
             Linux GNU X86              🟢
             Linux GNU AARCH / ARM      🟢
             Windows                    ⚪
Python       Mac OSX X86                ⚪
             Mac OSX AARCH / ARM        ⚪
             Linux GNU X86              ⚪
             Linux GNU AARCH / ARM      ⚪
             Windows                    ⚪
Python support is planned and underway. Windows support is also planned; however, it is not natively supported yet due to our use of Unix domain sockets in Rust (related issue).