
MemGPT

A Model Context Protocol (MCP) server that provides persistent memory and multi-model LLM support.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio vic563-memgpt-mcp-server node build/index.js \
  --env OPENAI_API_KEY="your-openai-key" \
  --env ANTHROPIC_API_KEY="your-anthropic-key" \
  --env OPENROUTER_API_KEY="your-openrouter-key"

How to use

MemGPT is a TypeScript-based MCP server that provides a memory-enabled interface for chatting with multiple large language model providers. It exposes tools to send messages to the current provider, retrieve and manage memory, and switch both providers and models while persisting these selections. You can chat with OpenAI, Anthropic, OpenRouter, and Ollama-backed providers, with memory preserved across interactions. Use get_memory to fetch past conversations (optionally limiting how many memories are returned), clear_memory to wipe history, use_provider to switch providers, and use_model to select provider-specific models. The server is designed to run as an MCP endpoint that communicates via stdio, and it can be integrated with Claude Desktop via a provided configuration.

Key capabilities include:

  • chat: Send a message to the active LLM provider and receive a response. Works with multiple providers and models.
  • get_memory: Retrieve stored memories in chronological order; supports limit with a null value for unlimited history.
  • clear_memory: Remove all stored memories to start fresh.
  • use_provider: Change the current provider (OpenAI, Anthropic, OpenRouter, Ollama) with persistent selection.
  • use_model: Change the model for the current provider. Supports Claude 3/3.5 series for Anthropic, gpt-4o and related OpenAI models, and provider/model format for OpenRouter and locally hosted Ollama models. Selections persist across sessions.
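The memory and selection semantics above can be sketched as a small in-memory model. Note this is illustrative only, not the server's actual internals: `MemoryStore`, `useProvider`, and the field names are assumptions made for the example.

```typescript
// Illustrative sketch of the tool semantics described above.
// MemoryStore and the state object are hypothetical names, not
// the server's real internals.

type Memory = { role: "user" | "assistant"; content: string; timestamp: number };

class MemoryStore {
  private memories: Memory[] = [];

  add(memory: Memory): void {
    this.memories.push(memory);
  }

  // get_memory: chronological order; limit === null means unlimited history,
  // limit === n means the most recent n memories.
  get(limit: number | null = null): Memory[] {
    const sorted = [...this.memories].sort((a, b) => a.timestamp - b.timestamp);
    return limit === null ? sorted : sorted.slice(-limit);
  }

  // clear_memory: wipe all stored history.
  clear(): void {
    this.memories = [];
  }
}

// use_provider / use_model: selections persist across sessions (a real
// server would serialize this state to disk; here it is a plain object).
const state = { provider: "anthropic", model: "claude-3-5-sonnet" };

function useProvider(provider: "openai" | "anthropic" | "openrouter" | "ollama"): void {
  state.provider = provider;
}
```

The sketch only mirrors the described behavior; the actual server exposes these operations as MCP tools over stdio rather than as a local class.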

How to install

Prerequisites:

  • Node.js (recommended LTS) and npm installed on your system
  • Git to clone the repository (optional if you already have the code)

  1. Clone the repository (or navigate to the MemGPT MCP server project directory).
  2. Install dependencies:
  • npm install
  3. Build the server (transpile TypeScript to JavaScript):
  • npm run build
  4. (Optional) Enable auto-rebuild during development:
  • npm run watch
  5. Run the MCP server (example):
  • npm run start
  6. If integrating with Claude Desktop, add the server config to Claude’s config file as shown in the README example (path may vary by OS).
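The Claude Desktop step above points at the README's config example; a typical entry has the shape below. The server name and build path are illustrative — adjust them to your checkout and consult the README for the exact values:

```json
{
  "mcpServers": {
    "memgpt": {
      "command": "node",
      "args": ["/path/to/memgpt-mcp-server/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "your-openai-key",
        "ANTHROPIC_API_KEY": "your-anthropic-key",
        "OPENROUTER_API_KEY": "your-openrouter-key"
      }
    }
  }
}
```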

Prerequisites recap: ensure API keys for OpenAI, Anthropic, and OpenRouter are available and set in environment variables or in a .env file that the server loads at runtime.
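A startup check along these lines (illustrative, not part of the server) makes a missing key fail fast instead of surfacing later as an authentication error; which keys are truly required depends on the providers you use — Ollama, for example, runs locally and needs no key:

```typescript
// Illustrative startup check: verify the required API keys are set.
const REQUIRED_KEYS = ["OPENAI_API_KEY", "ANTHROPIC_API_KEY", "OPENROUTER_API_KEY"];

// Returns the names of required keys that are unset or blank.
function missingKeys(env: Record<string, string | undefined>): string[] {
  return REQUIRED_KEYS.filter((key) => !env[key] || env[key]!.trim() === "");
}

const missing = missingKeys(process.env);
if (missing.length > 0) {
  console.error(`Missing environment variables: ${missing.join(", ")}`);
}
```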

Additional notes

Environment variables:

  • OPENAI_API_KEY: Your OpenAI API key
  • ANTHROPIC_API_KEY: Your Anthropic API key
  • OPENROUTER_API_KEY: Your OpenRouter API key

Tips:

  • Memory retrieval: Use get_memory with limit: null to fetch all memories or limit: n to fetch the most recent n memories.
  • Provider and model persistence: use_provider and use_model selections persist across sessions, simplifying subsequent runs.
  • Debugging: Since MCP servers operate over stdio, use the MCP Inspector tool to inspect traffic and debugging output when developing locally.
  • OpenRouter and Ollama support depends on the provider’s model naming conventions (provider/model format for OpenRouter, and locally available models for Ollama).
  • When deploying, ensure network access and API keys are secured (e.g., use environment variables or secret management in your hosting environment).
