confluence
MCP server for Confluence — search and fetch pages from any MCP-compatible AI assistant (Claude, Cursor, Windsurf)
```
claude mcp add --transport stdio pankaj28843-confluence-mcp-server \
  --env CONFLUENCE_URL="https://confluence.example.com" \
  --env CONFLUENCE_PERSONAL_ACCESS_TOKEN="your_token_here" \
  -- uv run python -m confluence_search.fastmcp_app
```
How to use
This MCP server provides two tools for interacting with a Confluence instance: search_confluence and fetch_confluence_page. The search tool translates natural language queries into Confluence CQL searches, supports optional filters like spaces, labels, and date ranges, and returns a ranked list with titles, URLs, spaces, excerpts, and labels. The fetch tool hydrates a specific page by its content_id, returning a Markdown document with metadata (URL, space, version, last-modified date, labels, and ancestors) followed by the full page body. You can connect via stdio-based clients (e.g., Claude Desktop, Cursor) or HTTP-based transports (e.g., Windsurf) by running the server in the appropriate mode.
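As an illustration, an MCP client invokes the search tool with a JSON-RPC `tools/call` request. The argument names below (`query`, `spaces`, `labels`) follow the description above but are assumptions; confirm the exact schema against the server's tool listing:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_confluence",
    "arguments": {
      "query": "deployment runbook",
      "spaces": ["ENG"],
      "labels": ["runbook"]
    }
  }
}
```

A matching `fetch_confluence_page` call would then pass the `content_id` returned in the search results.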
To use the stdio workflow with Claude Desktop or similar clients, start the MCP server with uv and configure the client to communicate over stdio. For HTTP-based clients, run the server in HTTP mode (e.g., via Docker) and point the client at http://&lt;host&gt;:&lt;port&gt;/mcp. In either case, supply the required environment variables, CONFLUENCE_URL and CONFLUENCE_PERSONAL_ACCESS_TOKEN, as in the registration command above.
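For stdio clients that read a JSON configuration file (e.g., Claude Desktop's `claude_desktop_config.json`), an entry along these lines wires up the uv-based runtime. The server name and values are placeholders; adjust them to your setup:

```json
{
  "mcpServers": {
    "confluence": {
      "command": "uv",
      "args": ["run", "python", "-m", "confluence_search.fastmcp_app"],
      "env": {
        "CONFLUENCE_URL": "https://confluence.example.com",
        "CONFLUENCE_PERSONAL_ACCESS_TOKEN": "your_token_here"
      }
    }
  }
}
```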
How to install
Prerequisites
- Python 3.11+ and UV runtime (uv) installed on your system, or Docker for container deployment
- A Confluence instance accessible over HTTP(S)
- A Confluence Personal Access Token with appropriate permissions
Step 1: Prepare environment
- Install Python 3.11+ (from python.org or your OS package manager)
- Install the UV runtime following the project’s guidance (per the repository’s prerequisites)
Step 2: Obtain and configure credentials
- Create a .env file (or equivalent) with:

```
CONFLUENCE_URL=https://confluence.example.com
CONFLUENCE_PERSONAL_ACCESS_TOKEN=your_token_here
```

  Optional TLS/verification variables may be added as needed, e.g. CONFLUENCE_VERIFY_TLS=true.
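A quick way to catch missing credentials before launch is a small pre-flight check. This is a sketch, not part of the project, and assumes only the two required variables named above:

```python
import os

# Required settings for the Confluence MCP server (from the step above).
REQUIRED = ("CONFLUENCE_URL", "CONFLUENCE_PERSONAL_ACCESS_TOKEN")

def missing_confluence_env(env=None):
    """Return the names of required variables that are unset or empty."""
    env = dict(os.environ) if env is None else env
    return [name for name in REQUIRED if not env.get(name)]
```

Calling `missing_confluence_env()` before starting the server gives a clearer error than a failed Confluence request later.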
Step 3: Install dependencies (if applicable)
- If the project uses a virtual environment:

```
python -m venv venv
source venv/bin/activate  # On Windows use venv\Scripts\activate
```

- Install project dependencies (adjust to the project's setup, e.g., poetry, pip requirements):

```
pip install -r requirements.txt  # if a requirements file exists
```
Step 4: Run the MCP server
- Quickstart (stdio mode for Claude Desktop / Cursor):

```
uv sync
uv run python -m confluence_search.fastmcp_app
```

- Docker (HTTP transport):

```
python deploy_confluence_mcp.py  # builds image + starts container on :43043
```

  or manually:

```
docker build -t confluence-mcp-server .
docker run -d --name confluence-mcp-server \
  --env-file .env \
  -p 43043:43043 \
  confluence-mcp-server
```

- If using Docker, connect clients to http://127.0.0.1:43043/mcp
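To smoke-test the HTTP endpoint by hand, note that an MCP client begins with a JSON-RPC `initialize` request POSTed to http://127.0.0.1:43043/mcp. The body looks roughly like this (the protocol version and client info are illustrative; check the MCP specification for the current handshake details):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": { "name": "smoke-test", "version": "0.0.1" }
  }
}
```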
Additional notes
Environment variables drive runtime behavior. Keep CONFLUENCE_URL and CONFLUENCE_PERSONAL_ACCESS_TOKEN secret. The server supports a read-only mode for safety. The cache directory (CONFLUENCE_CACHE_DIR) defaults to .cache/confluence_mcp and can be customized. When using the HTTP transport, the port defaults to 43043 unless overridden by environment variables. If you encounter TLS issues, check your TLS settings (CONFLUENCE_VERIFY_TLS) and token permissions.
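The cache-directory fallback described above amounts to a one-line default lookup. A minimal sketch, assuming the CONFLUENCE_CACHE_DIR variable and default named in the notes:

```python
import os
from pathlib import Path

def resolve_cache_dir(env=None):
    """Use CONFLUENCE_CACHE_DIR if set, else the documented default."""
    env = dict(os.environ) if env is None else env
    return Path(env.get("CONFLUENCE_CACHE_DIR") or ".cache/confluence_mcp")
```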
Related MCP Servers
mcp-atlassian
MCP server for Atlassian tools (Confluence, Jira)
lc2mcp
Convert LangChain tools to FastMCP tools
MCPHammer
MCP security testing framework for evaluating Model Context Protocol server vulnerabilities
skill-to
Convert AI Skills (Claude Skills format) to MCP server resources - Part of BioContextAI
md2confluence
MCP server to upload Markdown to Confluence. Auto-converts Mermaid diagrams, code blocks, images, and tables.
fastmcp-builder
A comprehensive Claude Code skill for building production-ready MCP servers using FastMCP. Includes reference guides, runnable examples, and a complete implementation with OAuth, testing, and best practices.