
post-cortex

Post-Cortex provides durable memory infrastructure with automatic knowledge graph construction, intelligent entity extraction, and semantic search powered by local transformer models.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio julymetodiev-post-cortex \
  --env PC_HOST=127.0.0.1 \
  --env PC_PORT=3737 \
  --env PC_DATA_DIR=~/.post-cortex/data \
  --env PC_STORAGE_BACKEND=rocksdb \
  -- docker run -i julymetodiev/post-cortex

The values shown are the defaults: PC_HOST is the bind address, PC_PORT the port for the MCP HTTP API, PC_DATA_DIR the storage location, and PC_STORAGE_BACKEND either rocksdb or surrealdb.

How to use

Post-Cortex provides long-term memory for AI assistants by persisting conversations, decisions, and insights in a knowledge base. It supports semantic search with embeddings, a graph-enhanced retrieval mechanism (Graph-RAG), and automatic entity extraction to build a searchable knowledge graph. The server can run locally with embedded RocksDB storage or connect to a distributed SurrealDB backend, all while keeping processing on-device to preserve privacy. Tools exposed to the Claude workflow include session management, updating conversation context, semantic search, structured summaries, and workspace organization, making it straightforward to enrich AI agents with historical context and meaningful relationships.

To use Post-Cortex, first install the binary or container, then register it with your MCP client. Once the daemon or server is running, register it over the http transport with a host and port, or use the stdio transport for a local, daemon-free workflow. After registration, your AI agent can query past conversations, fetch related memories via semantic search, and retrieve structured summaries that highlight decisions and entities. The combination of persistent memory and a robust vector search backend enables rapid recall and more capable reasoning across long-running projects.
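Once the server is listening, MCP clients talk to it over JSON-RPC. As a minimal sketch (assuming the /mcp endpoint and default port from the registration example below; tools/list is a standard MCP protocol method, not specific to Post-Cortex), you can inspect the exposed tools by hand:

```shell
# Build a standard MCP JSON-RPC request that lists the server's tools.
REQ='{"jsonrpc":"2.0","id":1,"method":"tools/list"}'
echo "$REQ"
# With a running server, send it to the HTTP endpoint:
#   curl -s -X POST http://127.0.0.1:3737/mcp \
#     -H "Content-Type: application/json" -d "$REQ"
```

The response lists tool names and input schemas, which is a quick way to see the session-management and search tools described above.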

How to install

Prerequisites:

  • Docker (recommended for containerized usage) or a binary compatible with your OS
  • Optional: RocksDB or SurrealDB backend prerequisites if you choose a non-embedded setup

Install via Docker (recommended for quick start):

  1. Install Docker on your system (https://docs.docker.com/get-docker/)
  2. Pull and run the Post-Cortex image: docker run -it -p 3737:3737 julymetodiev/post-cortex
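The quick-start command above keeps data inside the container, so memories are lost when it is removed. A sketch of a persistent setup (the in-container data path is an assumption; adjust it to wherever the image actually stores PC_DATA_DIR):

```shell
# Map the MCP HTTP port and persist the data directory on the host.
docker run -it \
  -p 3737:3737 \
  -v "$HOME/.post-cortex/data:/root/.post-cortex/data" \
  -e PC_STORAGE_BACKEND=rocksdb \
  julymetodiev/post-cortex
```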

Install via binary (as provided in release):

  1. Download the binary for your platform from the releases page: curl -L https://github.com/julymetodiev/post-cortex/releases/latest/download/pcx-<your-platform> -o /usr/local/bin/pcx
  2. Make it executable: chmod +x /usr/local/bin/pcx
  3. Run the binary to start the server locally (adjust as needed): pcx

Configure MCP (once globally):

  1. Register Post-Cortex with your MCP using the HTTP transport (recommended): claude mcp add --scope user --transport http post-cortex http://127.0.0.1:3737/mcp

If you prefer a local, daemon-free setup, register the stdio transport instead: claude mcp add --scope user --transport stdio post-cortex -- pcx
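Whichever transport you choose, you can confirm the registration afterward (claude mcp list is part of the Claude Code CLI):

```shell
# List registered MCP servers; post-cortex should appear with its transport.
claude mcp list
```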

Note: If you use Docker, ensure ports are mapped correctly and that the data directory is persisted if you need long-term storage.

Additional notes

Environment variables: PC_HOST, PC_PORT, PC_DATA_DIR, and PC_STORAGE_BACKEND control binding, port, storage location, and backend type (rocksdb or surrealdb).

  • The default setup uses embedded RocksDB storage for zero-configuration operation. For production workloads requiring distributed memory, configure SurrealDB as the backend.
  • If you expect high query load, consider running the server in a container orchestrator and provisioning sufficient RAM for embeddings and the HNSW index.
  • For privacy, all processing can remain local; avoid exposing the MCP HTTP endpoint to untrusted networks.
  • If using daemon mode, the Daemon Mode docs provide guidance on multiple Claude instances sharing the same memory and on managing the daemon lifecycle (start/status/stop).
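For a non-default local setup, the variables can be exported before starting the server. A sketch using the documented defaults (the values here are illustrative, not requirements):

```shell
# Illustrative local configuration using the documented variables.
export PC_HOST=127.0.0.1
export PC_PORT=3737
export PC_DATA_DIR="$HOME/.post-cortex/data"
export PC_STORAGE_BACKEND=rocksdb   # or: surrealdb
# pcx   # then start the server with this configuration
```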
