MCP server enabling LLMs to interact with WhatsApp - send messages with fuzzy name matching, full-text search across conversations, manage chats and contacts, and download media
claude mcp add --transport stdio eddmann-whatsapp-mcp -- \
  docker run -i --rm \
  -v /ABSOLUTE/PATH/TO/whatsapp-store:/app/store \
  ghcr.io/eddmann/whatsapp-mcp:latest

Environment variables:
- DB_DIR: Directory for SQLite databases and downloaded media (default: store)
- LOG_LEVEL: Logging level — DEBUG, INFO, WARN, or ERROR (default: INFO)
- FFMPEG_PATH: Path to the ffmpeg binary used for audio conversion (default: ffmpeg)
How to use
This MCP server provides seven tools for managing and interacting with your WhatsApp data through Claude and other LLMs: list_chats, list_messages, search_messages, send_message, download_media, get_connection_status, and catch_up. The server runs as a Docker container and persists its data to a mounted host directory, so chat history and media survive restarts.

To use it, start the container with a volume mounted at /app/store, then invoke the tools from Claude or any MCP-compatible client. The tools let you list conversations with filtering and pagination, fetch and search message history with full-text search, send messages or media with fuzzy name matching and reply threading, download media to local storage, check connection status and database statistics, and generate summaries of recent activity. Configure environment variables such as DB_DIR and LOG_LEVEL if you need custom paths or more logging detail.
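Under the hood, MCP tool invocations are JSON-RPC 2.0 `tools/call` requests sent over the container's stdin. The sketch below shows roughly what a client sends to invoke search_messages; the argument names ("query", "limit") are illustrative assumptions, not the server's published schema.

```shell
# Hypothetical sketch of the JSON-RPC request an MCP client writes to the
# server's stdin to invoke the search_messages tool. The argument names
# ("query", "limit") are assumed for illustration.
REQUEST='{"jsonrpc":"2.0","id":1,"method":"tools/call","params":{"name":"search_messages","arguments":{"query":"dinner plans","limit":10}}}'

# A client such as Claude would pipe requests like this into the container:
#   printf '%s\n' "$REQUEST" | docker run -i --rm -v "$(pwd)/whatsapp-store:/app/store" ghcr.io/eddmann/whatsapp-mcp:latest
printf '%s\n' "$REQUEST"
```

In practice you never write these frames by hand; the MCP client builds them for you. The sketch is only to show why the container must run with `-i` (stdin kept open).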
How to install
Prerequisites:
- Docker installed on your machine
- Sufficient permissions to run containers
Installation steps:
- Pull the latest image:
docker pull ghcr.io/eddmann/whatsapp-mcp:latest
- Create a directory for persistent storage on your host:
mkdir -p whatsapp-store
- Run the container with a mounted volume to persist data:
docker run -it --rm \
-v "$(pwd)/whatsapp-store:/app/store" \
ghcr.io/eddmann/whatsapp-mcp:latest
- On first run, a QR code will appear in the terminal for pairing. Open WhatsApp on your phone, go to Settings → Linked Devices → Link a Device, and scan the QR code. Wait for the history sync to complete (look for logs indicating history sync persistence counts).
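After pairing and history sync complete, you can sanity-check that the session persisted by looking for the database files in the mounted store. This is a sketch; the file names whatsapp.db and messages.db are taken from the Notes section, and the simulated directory at the end only demonstrates the check.

```shell
# Sketch: verify a mounted store directory contains the expected SQLite
# database files after a successful pairing and history sync.
check_store() {
  dir="$1"
  [ -f "$dir/whatsapp.db" ] && [ -f "$dir/messages.db" ]
}

# Simulated example (real usage: check_store ./whatsapp-store):
demo=$(mktemp -d)
touch "$demo/whatsapp.db" "$demo/messages.db"
check_store "$demo" && echo "store looks healthy"
```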
Optional with Claude Desktop integration:
- Use the provided Claude configuration snippet to add the WhatsApp MCP server, and adjust the volume path and environment variables as needed.
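If the provided snippet is not to hand, a minimal claude_desktop_config.json entry could look like the following. This is a sketch: the server key "whatsapp" and the /ABSOLUTE/PATH placeholder are illustrative, and the `-e LOG_LEVEL=INFO` flag is optional (it passes the variable into the container).

```json
{
  "mcpServers": {
    "whatsapp": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "LOG_LEVEL=INFO",
        "-v", "/ABSOLUTE/PATH/TO/whatsapp-store:/app/store",
        "ghcr.io/eddmann/whatsapp-mcp:latest"
      ]
    }
  }
}
```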
Notes:
- Session data is saved under store/whatsapp.db and messages under store/messages.db
- Media downloads go to store/<chatJID>/ directories
- If you need persistent storage, mount an absolute path to /app/store as shown above.
Additional notes
Tips and common issues:
- Always mount an absolute host path to /app/store to ensure persistence across container restarts.
- If you encounter login issues after a restart, ensure the store directory contains the expected WhatsApp session data; the server should automatically reconnect using stored credentials.
- Configure LOG_LEVEL for debugging (e.g., DEBUG) if you are troubleshooting.
- The available environment variables (DB_DIR, LOG_LEVEL, FFMPEG_PATH) can be customized in your orchestration tool or Claude config to suit your environment.
- For production, consider securing access to the container and restricting Claude's access to the MCP endpoints.
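As an example of customizing the environment, a debugging run might pass overrides with docker's `-e` flag. The values below are illustrative:

```shell
# Illustrative: override LOG_LEVEL and FFMPEG_PATH for a troubleshooting run.
docker run -i --rm \
  -e LOG_LEVEL=DEBUG \
  -e FFMPEG_PATH=/usr/bin/ffmpeg \
  -v "$(pwd)/whatsapp-store:/app/store" \
  ghcr.io/eddmann/whatsapp-mcp:latest
```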
Related MCP Servers
fast-filesystem
A high-performance Model Context Protocol (MCP) server that provides secure filesystem access for Claude and other AI assistants.
Amazing-Marvin
Model Context Provider for Amazing Marvin productivity app - Access your tasks, projects, and categories in AI assistants
work-memory
Never lose context again - persistent memory management system for AI-powered workflows across multiple tools
local-gateway
Aggregate multiple MCP servers into a single endpoint with web UI, OAuth 2.1, and profile-based tool management
mail-bridge
Connect macOS Mail to AI through Model Context Protocol
openapi-to
Transform OpenAPI specifications into production-ready MCP servers with AI-powered evaluation and enhancement. Leverages LLMs to analyze, improve, and generate Model Context Protocol implementations from your existing API documentation.