
wa_llm

An AI-powered WhatsApp bot that participates in group conversations: it monitors group messages and responds when mentioned.

Installation
Run the following command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio ilanbenb-wa_llm python -m uv sync --all-extras --dev \
  --env DB_URI="PostgreSQL URI" \
  --env LOG_LEVEL="Log level" \
  --env LOGFIRE_TOKEN="Logfire monitoring key" \
  --env WHATSAPP_HOST="WhatsApp Web API URL" \
  --env VOYAGE_API_KEY="Voyage AI key" \
  --env ANTHROPIC_API_KEY="Anthropic API key" \
  --env DM_AUTOREPLY_ENABLED="Enable auto-reply for direct messages" \
  --env DM_AUTOREPLY_MESSAGE="Message to send as auto-reply" \
  --env WHATSAPP_BASIC_AUTH_USER="WhatsApp API user" \
  --env WHATSAPP_BASIC_AUTH_PASSWORD="WhatsApp API password"

How to use

wa_llm is an AI-powered WhatsApp bot that joins groups, tracks conversations, and generates summaries using LLM-based processing. It supports knowledge base integration for context-aware answers, persistent message history in PostgreSQL with the pgvector extension, and a REST API with Swagger docs exposed at http://localhost:8000/docs.

Via direct messages to the bot you can manage groups, adjust settings, and opt users out of being tagged in summaries. To run locally, use the provided uv task, which launches the application along with its dependencies; once it is running, connect a WhatsApp device to the web client and invite the bot to the target groups to begin summarization and context-aware chat. The API endpoints let you load new knowledge base topics and trigger generation and distribution of group summaries.
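The exact endpoint paths are not documented on this page, so the sketch below uses placeholder routes (`/kb/topics` is an assumption; check the Swagger UI at /docs for the real paths and payload shapes). It only illustrates how you might call the REST API from Python using the standard library:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # Swagger docs live at BASE_URL + "/docs"


def build_request(path: str, payload: dict) -> urllib.request.Request:
    """Build a JSON POST request against the wa_llm API.

    The endpoint path passed in is a placeholder; consult the Swagger UI
    at /docs for the actual routes exposed by the server.
    """
    return urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


if __name__ == "__main__":
    # Hypothetical call -- verify the path and payload shape in /docs first.
    req = build_request("/kb/topics", {"topic": "onboarding", "text": "..."})
    print(req.full_url)      # http://localhost:8000/kb/topics
    print(req.get_method())  # POST
```

Sending the request (`urllib.request.urlopen(req)`) requires the stack to be running locally.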

How to install

Prerequisites:

  • Docker and Docker Compose (for easy local setup and production-like environments)
  • Python 3.13+ (for development via uv runner)
  • PostgreSQL with pgvector extension (for persistent history and vector search)
  • Voyage AI API key (for the embeddings behind vector search)
  • WhatsApp account for the bot

Step-by-step installation:

  1. Clone the repository:

```bash
git clone https://github.com/ilanbenb/wa_llm.git
cd wa_llm
```

  2. Install dependencies (development):

```bash
# Using uv (development commands)
uv sync --all-extras --dev
```

  3. Create and configure the environment:

```bash
cp .env.example .env
```

Fill in the required values for WHATSAPP_HOST, WHATSAPP_BASIC_AUTH_USER, WHATSAPP_BASIC_AUTH_PASSWORD, VOYAGE_API_KEY, DB_URI, LOG_LEVEL, ANTHROPIC_API_KEY, LOGFIRE_TOKEN, DM_AUTOREPLY_ENABLED, and DM_AUTOREPLY_MESSAGE.
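Before starting the stack, a small sanity check like the following (an illustrative helper, not part of the repository; which variables are strictly required versus optional is an assumption based on the list above) can confirm the environment is populated:

```python
import os

# Variables from the configuration step above. The split into required vs.
# optional is an assumption: LOGFIRE_TOKEN and the DM_* settings only matter
# if you enable those features.
REQUIRED = [
    "WHATSAPP_HOST",
    "WHATSAPP_BASIC_AUTH_USER",
    "WHATSAPP_BASIC_AUTH_PASSWORD",
    "VOYAGE_API_KEY",
    "DB_URI",
    "ANTHROPIC_API_KEY",
]


def missing_vars(env=os.environ):
    """Return the names of required variables that are unset or empty."""
    return [name for name in REQUIRED if not env.get(name)]


if __name__ == "__main__":
    missing = missing_vars()
    if missing:
        raise SystemExit(f"Missing required env vars: {', '.join(missing)}")
    print("environment looks complete")
```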


  4. Run services (development):

```bash
docker compose up -d
```

For production deployment, use the pre-built images and the production docker-compose file:

```bash
docker compose -f docker-compose.prod.yml up -d
```

  5. Connect the WhatsApp device: link your phone through the WhatsApp Web client and invite the bot to the target groups.
  6. Access the API docs at http://localhost:8000/docs
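Once the stack is up, a quick way to confirm the web server is reachable is to request the Swagger page. This is a minimal stdlib sketch; adjust the host and port if you changed the defaults:

```python
import urllib.error
import urllib.request


def docs_url(host: str = "localhost", port: int = 8000) -> str:
    """URL of the Swagger UI served by the wa_llm web server."""
    return f"http://{host}:{port}/docs"


def api_is_up(url: str = docs_url(), timeout: float = 3.0) -> bool:
    """Return True if the docs page answers with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


if __name__ == "__main__":
    print("API up:", api_is_up())
```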

Additional notes

  • The system relies on PostgreSQL with the pgvector extension to store and search vector representations of conversations and KB topics.
  • Environment variables control bot behavior such as auto-reply, opt-out status, and logging level; ensure you provide real keys for Voyage AI, Anthropic, and Logfire if used.
  • The repository provides multiple docker-compose files for development, local execution, and production; choose the appropriate one for your environment.
  • If you modify group management (via the database), you may need to restart the wa_llm-web-server service as described in the README to apply changes.
  • The WhatsApp integration requires a valid WhatsApp Web session; ensure the device is connected and authorized to access the target groups.
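The message and knowledge-base tables are not documented on this page, so the table and column names below are placeholders. The sketch only shows the shape of a pgvector nearest-neighbour query (`<=>` is pgvector's cosine-distance operator) that a context-aware lookup might run against the database:

```python
# Hypothetical schema: kb_topics(content text, embedding vector).
# The real table and column names live in the wa_llm repository.
def similarity_query(table: str = "kb_topics", k: int = 5) -> str:
    """Build a parameterized pgvector cosine-distance query (sketch only).

    "<=>" is pgvector's cosine-distance operator; %(query_vec)s is a
    driver-level parameter for the query embedding.
    """
    return (
        f"SELECT content, embedding <=> %(query_vec)s AS distance "
        f"FROM {table} "
        f"ORDER BY embedding <=> %(query_vec)s "
        f"LIMIT {k}"
    )


if __name__ == "__main__":
    print(similarity_query())
```

Executing it requires a PostgreSQL driver (e.g. psycopg) and an embedding produced by the same model that populated the table.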
