slack-mcp-client
A Slack bot and MCP client acts as a bridge between Slack and Model Context Protocol (MCP) servers. Using Slack as the interface, it enables large language models (LLMs) to connect and interact with various MCP servers through standardized MCP tools.
```shell
claude mcp add --transport stdio tuannvm-slack-mcp-client -- docker run -i \
  --env LOG_LEVEL="info" \
  --env SLACK_APP_TOKEN="your-slack-app-level-token" \
  --env SLACK_BOT_TOKEN="your-slack-bot-token" \
  --env MCP_API_ENDPOINT="https://your-mcp-server.example.com" \
  ghcr.io/tuannvm/slack-mcp-client:latest
```
How to use
This Slack MCP Client acts as a production-ready bridge that lets AI models interact with real tools and systems via Slack conversations using the Model Context Protocol (MCP). It supports multiple LLM providers, including OpenAI and Anthropic, and enables interactions with local or remote tools such as filesystems, Git repositories, and Kubernetes clusters. The client exposes MCP-compatible transport methods (HTTP, SSE, and stdio) and maintains per-thread context for accurate, thread-aware interactions. To use it, deploy the Docker image and configure the environment with your Slack credentials and MCP server endpoint. Once running, you can initiate Slack conversations that trigger LLM processing, tool discovery, and tool execution across MCP-enabled backends. The setup also supports agent mode for multi-step workflows and a knowledge layer (RAG) to enhance responses with retrieved documents or semantic searches.
How to install
Prerequisites:

- Docker installed and running on the host
- Slack access tokens (Bot Token and App Token) with the necessary permissions
- An MCP-compatible server endpoint (or use the included MCP bridge deployment)

Installation steps:

1) Pull the MCP Slack client Docker image (or build locally if you prefer):

```shell
docker pull ghcr.io/tuannvm/slack-mcp-client:latest
```

2) Create a configuration file or set environment variables to define the Slack and MCP endpoints. Example environment variables (adjust to your environment):

- SLACK_BOT_TOKEN=your-slack-bot-token
- SLACK_APP_TOKEN=your-slack-app-level-token
- MCP_API_ENDPOINT=https://your-mcp-server.example.com
- LOG_LEVEL=info

3) Run the container with the required environment variables:

```shell
docker run -i \
  -e SLACK_BOT_TOKEN=your-slack-bot-token \
  -e SLACK_APP_TOKEN=your-slack-app-level-token \
  -e MCP_API_ENDPOINT=https://your-mcp-server.example.com \
  -e LOG_LEVEL=info \
  --name slack-mcp-client \
  ghcr.io/tuannvm/slack-mcp-client:latest
```

4) Alternatively, deploy via docker-compose: define the service parameters in a compose file and supply environment variables from a .env file.

5) Check the logs to verify that the Slack app is properly authorized and connected to the MCP backends.
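Step 4 above can be sketched as a minimal docker-compose.yml. This is an illustrative fragment, not the project's official compose file; the service name and .env layout are assumptions:

```yaml
# docker-compose.yml — minimal sketch; adjust to your environment
services:
  slack-mcp-client:
    image: ghcr.io/tuannvm/slack-mcp-client:latest
    env_file:
      - .env            # holds SLACK_BOT_TOKEN, SLACK_APP_TOKEN, MCP_API_ENDPOINT, LOG_LEVEL
    stdin_open: true    # equivalent of `docker run -i`
    restart: unless-stopped
```

Run `docker compose up -d` in the directory containing both files, keeping the .env file out of version control.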
Additional notes
Tips and caveats:

- Ensure your Slack tokens and MCP endpoint are stored securely and never checked into version control. Prefer environment-specific secret management in production.
- If you use Agent Mode, ensure your LLM providers have proper API keys configured and that your tool discovery endpoints are accessible to the MCP client.
- When debugging, enable verbose logging (LOG_LEVEL=debug) to see transport handshake details and MCP message routing.
- For production, consider deploying behind a reverse proxy with TLS termination and enabling OpenTelemetry tracing for observability.
- If you encounter authentication failures with SSE MCP servers, verify the Authorization header setup on the server side and ensure the client provides valid tokens.
- The client supports per-thread context; use separate Slack threads for related conversations to maximize contextual accuracy.
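The debugging tip above might look like this in practice, assuming the container was started with `--name slack-mcp-client` as in the install steps (the commands are standard docker CLI, not project-specific):

```shell
# Recreate the container with verbose logging enabled
docker rm -f slack-mcp-client 2>/dev/null
docker run -d --name slack-mcp-client \
  -e LOG_LEVEL=debug \
  -e SLACK_BOT_TOKEN=your-slack-bot-token \
  -e SLACK_APP_TOKEN=your-slack-app-level-token \
  -e MCP_API_ENDPOINT=https://your-mcp-server.example.com \
  ghcr.io/tuannvm/slack-mcp-client:latest

# Follow the logs to watch the transport handshake and MCP message routing
docker logs -f slack-mcp-client
```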
Related MCP Servers
nunu
A CLI tool for building Go applications.
slack
The most powerful MCP Slack Server with no permission requirements, Apps support, GovSlack, DMs, Group DMs and smart history fetch logic.
mcp-proxy
An MCP proxy server that aggregates and serves multiple MCP resource servers through a single HTTP server.
mcp-client-go
An MCP client for Go (Golang) that integrates multiple Model Context Protocol (MCP) servers.
scaled
ScaledMCP is a horizontally scalable MCP and A2A server. You know, for AI.
muster
MCP tool management and workflow proxy