SOAT
SOAT is an open-source framework designed to provide persistent memory capabilities for autonomous AI agents.
claude mcp add --transport stdio ttoss-soat node packages/server/dist/server.js \
  --env MCP_HOST="0.0.0.0" \
  --env MCP_PORT="8080" \
  --env DATABASE_URL="<PostgreSQL connection string for pgvector-backed storage>" \
  --env EMBEDDING_MODEL="<embedding model; optional if using embedded services>" \
  --env PGVECTOR_ENABLED="true"
How to use
SOAT provides a persistent memory store for autonomous agents, exposed both through the Model Context Protocol (MCP) and through a REST API. It ingests documents and files, vectorizes their content with pgvector-backed embeddings, and stores the vectors for fast semantic recall. Through the MCP interface, agents can open a session, persist context, and retrieve relevant context via semantic search. Alongside MCP, SOAT offers REST endpoints for management tasks such as ingestion, indexing, and file handling, enabling system-to-system integration over standard HTTP. Together, persistent memory and semantic search let agents recall relevant context across sessions, supporting more coherent long-running behavior.
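As an illustration of the system-to-system side, an integration over HTTP might look like the following. Note that the endpoint paths and JSON shapes here are assumptions for the sake of the sketch, not documented SOAT routes; check the server's actual API before relying on them.

```shell
# Hypothetical examples — endpoint paths and payload shapes are assumptions,
# not documented SOAT routes. Assumes the server listens on localhost:8080.

# Ingest a document so it is embedded and stored for later recall
curl -X POST "http://localhost:8080/documents" \
  -H "Content-Type: application/json" \
  -d '{"content": "Deploy notes: service X was rolled back after the canary failed"}'

# Later, retrieve relevant context via semantic search
curl -G "http://localhost:8080/search" \
  --data-urlencode "q=why was service X rolled back?"
```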
How to install
Prerequisites
- Node.js (recommended LTS version)
- Git
- Optional: Docker (if you prefer running with Docker Compose)
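If you take the Docker path for the database, one option is the image published by the pgvector project, which ships PostgreSQL with the extension preinstalled. A sketch (the image tag and all credentials below are example assumptions; pick your own):

```shell
# Run PostgreSQL with pgvector preinstalled (credentials are example values)
docker run -d --name soat-db \
  -e POSTGRES_USER=soat \
  -e POSTGRES_PASSWORD=soat \
  -e POSTGRES_DB=soat \
  -p 5432:5432 \
  pgvector/pgvector:pg16

# A matching DATABASE_URL for the server would then be:
# postgres://soat:soat@localhost:5432/soat
```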
Installation steps
1. Clone the repository:
   git clone https://github.com/ttoss/soat.git
   cd soat
2. Install dependencies for the server package:
   cd packages/server
   npm install
3. Build or prepare the server if required by the project setup:
   npm run build
4. Configure the environment: create or edit a .env file with the required variables (see Additional notes for options).
5. Run the MCP server:
   node dist/server.js
6. Verify the server is running by hitting the MCP REST endpoint on the host/port you configured.
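The environment variables from the `claude mcp add` example near the top of this README can also live in the .env file. A minimal sketch — every value below is a placeholder assumption to adapt to your deployment:

```shell
# .env — example configuration (all values are placeholders)
MCP_HOST=0.0.0.0
MCP_PORT=8080
DATABASE_URL=postgres://soat:soat@localhost:5432/soat
# Optional if using embedded services; can point to a local Ollama model
# or a hosted embedding service (model name here is an assumption)
EMBEDDING_MODEL=nomic-embed-text
PGVECTOR_ENABLED=true
```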
Additional notes
- The server relies on a PostgreSQL database with pgvector extensions for vector storage. Ensure PostgreSQL is available and the pgvector extension is enabled on the target database.
- Environment variables can control embedding model choices, host/port, and enabling embeddings at ingest time. Example: EMBEDDING_MODEL can point to a local Ollama instance or a hosted embedding service.
- If you use Docker, you can containerize the server and database. This README focuses on a Node-based run, but the project supports Docker Compose as an alternative path.
- For MCP interoperability, ensure your client agents use the MCP protocol version supported by SOAT and point to the MCP REST endpoints provided by the server.
- Common issues: database connection failures (check DATABASE_URL), missing pgvector extension, or port conflicts. Review logs for more details.
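The database-related issues above can be checked quickly from the command line. Assuming psql is installed and DATABASE_URL is set as in the install steps (enabling the extension requires a role with sufficient privileges on the target database):

```shell
# Can we reach the database at all?
psql "$DATABASE_URL" -c 'SELECT 1;'

# Enable the pgvector extension if it is missing (one-time step per database)
psql "$DATABASE_URL" -c 'CREATE EXTENSION IF NOT EXISTS vector;'

# Confirm the extension is installed
psql "$DATABASE_URL" -c "SELECT extname, extversion FROM pg_extension WHERE extname = 'vector';"

# Is something already listening on the MCP port? (example port: 8080)
lsof -i :8080
```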