bitbucket
MCP server for Bitbucket Cloud REST API — works with any MCP client
claude mcp add --transport stdio or2ooo-bitbucket-mcp \
  --env BITBUCKET_READONLY="false" \
  --env ATLASSIAN_API_TOKEN="your-api-token" \
  --env ATLASSIAN_USER_EMAIL="your-email@example.com" \
  -- npx -y @or2ooo/bitbucket-mcp@latest
How to use
This Bitbucket Cloud MCP Server provides 32 tools organized into 5 toolsets for interacting with the Bitbucket Cloud REST API v2.0. It exposes context tools such as bb_whoami and bb_list_workspaces; repository tools to inspect and modify repositories (read and write where allowed); pull request tools to review and manage PRs; issue tools; and pipelines tools to monitor and trigger pipelines. The server works with MCP clients such as Claude Code, GitHub Copilot, and OpenAI Codex, and returns compact, LLM-friendly output with safety controls.

To use the server, configure your MCP client to launch it via the provided command (typically through npx). Pass your Atlassian user email and API token as environment variables, and optionally enable read-only mode or restrict access with workspace and repository allowlists. The server then exposes a command for each tool, letting you query Bitbucket resources, perform safe writes, and manage PRs, issues, and pipelines through natural-language prompts.
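For clients that read a JSON MCP configuration file instead of using a CLI, the server can be registered with an entry along these lines. The file name and exact schema depend on your client, and the credential values are placeholders; this is a sketch of the common `mcpServers` shape, not a definitive configuration:

```json
{
  "mcpServers": {
    "bitbucket": {
      "command": "npx",
      "args": ["-y", "@or2ooo/bitbucket-mcp@latest"],
      "env": {
        "ATLASSIAN_USER_EMAIL": "your-email@example.com",
        "ATLASSIAN_API_TOKEN": "your-api-token",
        "BITBUCKET_READONLY": "false"
      }
    }
  }
}
```

Whatever client you use, the command, arguments, and environment variables should match the CLI invocation above so both paths launch the same server.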
How to install
Prerequisites:
- Node.js v24+
- npm
- A Bitbucket API token with appropriate scopes
Installation steps:
- Ensure Node.js and npm are installed:
  - node -v
  - npm -v
- Install or run the MCP server via npx (recommended for quick setup):
  - npx -y @or2ooo/bitbucket-mcp@latest
- Alternatively, install as a global CLI (optional):
  - npm i -g @or2ooo/bitbucket-mcp@latest
  - Then run the server with a command equivalent to the npx call
- Set up environment variables (examples shown below):
  - ATLASSIAN_USER_EMAIL=your-email@example.com
  - ATLASSIAN_API_TOKEN=your-api-token
  - BITBUCKET_READONLY=false
- Verify the installation by listing the configured MCP servers in your client (e.g., Claude Code) and confirm that the bitbucket server appears in the MCP configuration.
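As a concrete sketch of the environment-variable step above, the credentials can be exported in the shell before launching the server. The values are placeholders, and BITBUCKET_READONLY is set to true here as a cautious default for a first run:

```shell
# Placeholder credentials -- substitute your own Atlassian account values
export ATLASSIAN_USER_EMAIL="your-email@example.com"
export ATLASSIAN_API_TOKEN="your-api-token"

# Start in read-only mode until you have verified the setup
export BITBUCKET_READONLY="true"

# Launch the server over stdio (uncomment once the variables are set):
# npx -y @or2ooo/bitbucket-mcp@latest
```

Because the server reads its configuration from the environment at startup, changing any of these values requires restarting it.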
Notes:
- The server relies on your Bitbucket API token; keep tokens secure and rotate them periodically.
- In development, you can build from source and run node dist/index.js, but using the npm package via npx is recommended for production usage.
Additional notes
Tips and common issues:
- If you run into authentication errors, double-check ATLASSIAN_USER_EMAIL and ATLASSIAN_API_TOKEN and ensure the token has the required scopes as documented.
- Set BITBUCKET_READONLY=true to block write operations during testing or in shared environments.
- You can restrict access by setting BITBUCKET_DEFAULT_WORKSPACE or BITBUCKET_ALLOWED_WORKSPACES and BITBUCKET_ALLOWED_REPOS to limit what the MCP server can access.
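For example, scoping could look like the following. The comma-separated list format and the workspace/repository slugs are assumptions for illustration; check the server's documentation for the exact syntax it expects:

```shell
# Pin the server to a single default workspace...
export BITBUCKET_DEFAULT_WORKSPACE="my-team"

# ...or allowlist specific workspaces and repositories
# (hypothetical slugs; comma-separated format is assumed)
export BITBUCKET_ALLOWED_WORKSPACES="my-team,platform-team"
export BITBUCKET_ALLOWED_REPOS="my-team/website,my-team/api"
```

Combining an allowlist with BITBUCKET_READONLY=true gives the narrowest footprint for shared or experimental setups.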
- In a local development setup, you can run node dist/index.js directly; in production, using npx with the latest package is recommended to keep the tools up to date.
- For platform-specific tooling like Claude Code, Codex, or Copilot, ensure their MCP configuration points to the same command and environment variables so prompts surface consistent results.
Related MCP Servers
augments
Comprehensive MCP server providing real-time framework documentation access for Claude Code with intelligent caching, multi-source integration, and context-aware assistance.
google-ai-mode
MCP server for free Google AI Mode search with citations. Query optimization, CAPTCHA handling, multi-agent support. Works with Claude Code, Cursor, Cline, Windsurf.
docmole
Dig through any documentation with AI - MCP server for Claude, Cursor, and other AI assistants
devtap
Bridge build/dev process output to AI coding sessions via MCP — supports Claude Code, Codex, OpenCode, Gemini CLI, and aider
RLM-Memory
A Model Context Protocol (MCP) server that provides AI agents with persistent memory and semantic file discovery.
mcp-local-llm
MCP server for delegating mechanical tasks to local LLMs via Ollama. Claude does the thinking, your local model does the grunt work.