# chatgpt2md

Convert a ChatGPT export to Markdown, with full-text search and an MCP server for Claude. To register the server with Claude:

```shell
claude mcp add --transport stdio nextstat-chatgpt2md /path/to/chatgpt2md serve --index /path/to/chatgpt_chats/.index --chats /path/to/chatgpt_chats
```
## How to use
chatgpt2md exposes an MCP server that gives Claude access to your converted ChatGPT history. The server runs as a small, stand-alone process and provides three MCP tools:

- `search_conversations` — full-text search across all conversations
- `get_conversation` — read the full content of a specific conversation
- `list_conversations` — browse conversations by year and month

Once the server is running (via the `serve` command), Claude communicates with it over the standard MCP transport, enabling natural-language queries such as finding conversations about Rust or listing all chats from a given month. A built-in Tantivy index speeds up searches, and the converter step turns your ChatGPT export into Markdown files that Claude can read and index efficiently.
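The `claude mcp add` command above writes the server into your Claude MCP configuration. For clients configured by hand, the resulting entry looks roughly like this (a sketch only — the exact file location and schema depend on your Claude client, and the server name key is whatever you choose):

```json
{
  "mcpServers": {
    "nextstat-chatgpt2md": {
      "command": "/path/to/chatgpt2md",
      "args": [
        "serve",
        "--index", "/path/to/chatgpt_chats/.index",
        "--chats", "/path/to/chatgpt_chats"
      ]
    }
  }
}
```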
## How to install
Prerequisites:
- A supported platform (macOS, Windows, or Linux)
- Rust toolchain for building from source (optional if you use pre-built binaries)
- A ChatGPT export ZIP or conversations.json file (Step 3 in the README)
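A quick sanity check of these prerequisites can be scripted; a minimal sketch for macOS/Linux, run from the directory containing your export (the status messages are ours, not chatgpt2md output):

```shell
# Check for the Rust toolchain (only needed when building from source)
if command -v cargo >/dev/null 2>&1; then
  rust_status="ok"
else
  rust_status="missing (only needed for source builds)"
fi
echo "rust toolchain: $rust_status"

# Check for the ChatGPT export file in the current directory
if [ -f conversations.json ]; then
  export_status="found conversations.json"
else
  export_status="conversations.json not found here"
fi
echo "chatgpt export: $export_status"
```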
Installation options:
Option A — Pre-built binaries (recommended):
- Go to the Releases page of the repository and download the binary for your platform (macOS, Windows, or Linux).
- Extract (if needed) and place the binary somewhere on your PATH, e.g. /usr/local/bin/chatgpt2md.
Example:

```shell
# macOS/Linux: extract the release archive and move the binary onto PATH
tar xzf chatgpt2md-*.tar.gz
sudo mv chatgpt2md /usr/local/bin/
```
Option B — Install from source (requires Rust):
- Ensure Rust is installed (rustup recommended); check with `rustc --version && cargo --version`
- Install directly from Git: `cargo install --git https://github.com/NextStat/chatgpt2md`
Option C — Build locally (from a cloned repo):

```shell
git clone https://github.com/NextStat/chatgpt2md
cd chatgpt2md
cargo build --release
# the binary is at ./target/release/chatgpt2md
```
## Notes and tips
- The MCP server serves the index and chats from your local paths. Replace `/path/to/chatgpt_chats` with the actual output directory produced by `chatgpt2md export`.
- The index directory (`.index`) is generated by the convert step and is what Claude uses for fast searches. Make sure `--index` points to that directory.
- If Claude cannot reach the MCP server, verify that the server is running, the paths are correct, and Claude has permission to access the specified directories.
- You can rename the server in the MCP config (the key under `mcpServers`) to something meaningful for your setup, such as `chatgpt-history`.
- For troubleshooting, check that the binary is executable and that the CLI arguments (`--index` and `--chats`) match the paths you intend to use.
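The path checks in the tips above can be scripted. A minimal sketch — the helper name is ours and it only verifies that the two directories exist before you launch `serve`:

```shell
# check_paths: verify the --index and --chats directories exist before
# launching `chatgpt2md serve` (illustrative helper, not part of the tool)
check_paths() {
  index_dir="$1"
  chats_dir="$2"
  [ -d "$chats_dir" ] || { echo "missing chats dir: $chats_dir"; return 1; }
  [ -d "$index_dir" ] || { echo "missing index dir: $index_dir"; return 1; }
  echo "paths ok"
}

# usage: check_paths /path/to/chatgpt_chats/.index /path/to/chatgpt_chats
```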