mcp-ai-memory
A production-ready Model Context Protocol (MCP) server for semantic memory management
claude mcp add scanadi-mcp-ai-memory \
  --transport stdio \
  --env LOG_LEVEL="info" \
  --env REDIS_URL="redis://localhost:6379" \
  --env DATABASE_URL="postgresql://username:password@localhost:5432/mcp_ai_memory" \
  --env EMBEDDING_MODEL="Xenova/all-MiniLM-L6-v2" \
  -- npx -y mcp-ai-memory
How to use
This MCP server provides a semantic memory system that lets AI agents store, retrieve, and manage contextual information across sessions. It uses PostgreSQL with pgvector for vector similarity search, Redis for optional caching, and a type-safe TypeScript implementation to ensure robust memory operations. Available tools:
- memory_search: perform natural-language memory lookups
- memory_store: persist new memories while checking for duplicates
- memory_update / memory_delete: metadata and lifecycle management
- memory_batch: bulk imports
- memory_graph_search / memory_traverse: explore relationships in memory graphs
- memory_stats: query database insights
- memory_consolidate: merge or cluster similar memories
- memory_get_relations: inspect relationships between memories
- memory_relate / memory_unrelate: manage connections between items
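As an illustration, MCP clients invoke these tools via JSON-RPC tools/call requests over stdio. The sketch below builds such a request for memory_search; the argument names ("query", "limit") are assumptions based on the tool description, not the server's actual schema, so check the schema advertised via tools/list.

```shell
# Build a tools/call request for memory_search. The fields under
# "arguments" are illustrative; the server's tool schema is authoritative.
request='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"memory_search","arguments":{"query":"user preferences","limit":5}}}'
echo "$request"
```

A client such as Claude Desktop or Claude Code sends these requests for you; the raw shape is mainly useful when debugging the server directly.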
How to install
Prerequisites:
- Node.js 18+ (or Bun) installed on your system
- PostgreSQL installed with pgvector extension enabled
- Optional Redis for caching
Install from npm (recommended):
npm install -g mcp-ai-memory
Install from source:
- Install dependencies:
bun install
- Set up PostgreSQL with pgvector:
CREATE DATABASE mcp_ai_memory;
\c mcp_ai_memory
CREATE EXTENSION IF NOT EXISTS vector;
- Create environment file:
# Create .env with your database credentials
touch .env
- Run migrations (if using source version):
bun run migrate
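A minimal .env for the steps above might look like this, using the variables referenced elsewhere in this document (the credentials are placeholders; adjust them for your environment):

```shell
# Example .env values. DATABASE_URL must point at the
# pgvector-enabled database created in the previous step.
DATABASE_URL=postgresql://username:password@localhost:5432/mcp_ai_memory
REDIS_URL=redis://localhost:6379
EMBEDDING_MODEL=Xenova/all-MiniLM-L6-v2
LOG_LEVEL=info
```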
Usage:
- Development:
bun run dev
- Production:
bun run build
bun run start
Additional notes
Environment tips:
- Ensure DATABASE_URL (or MEMORY_DB_URL) points to a PostgreSQL database with the vector extension enabled.
- If Redis is unavailable, the server will fall back to in-memory caching, which may affect performance under heavy load.
- The model named by EMBEDDING_MODEL determines the embedding dimensions; a mismatch between the model's output dimension and the vectors already stored in the database can cause embedding errors. Pin a model, and re-embed existing memories if you switch to a different one.
- MEMORY_DB_URL and related env vars can be customized in your deployment config or Claude Desktop integration to fit your environment.
- For large-scale deployments, monitor memory_decay_status and memory_preserve to manage lifecycle and keep important memories intact.
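For a Claude Desktop integration, the same environment variables can be supplied in claude_desktop_config.json. A sketch, assuming the standard mcpServers layout and placeholder credentials (the server name "ai-memory" is arbitrary):

```json
{
  "mcpServers": {
    "ai-memory": {
      "command": "npx",
      "args": ["-y", "mcp-ai-memory"],
      "env": {
        "DATABASE_URL": "postgresql://username:password@localhost:5432/mcp_ai_memory",
        "REDIS_URL": "redis://localhost:6379",
        "EMBEDDING_MODEL": "Xenova/all-MiniLM-L6-v2",
        "LOG_LEVEL": "info"
      }
    }
  }
}
```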