CodeScene
The CodeScene MCP Server exposes CodeScene’s Code Health analysis as local AI-friendly tools.
claude mcp add --transport stdio codescene-oss-codescene-mcp-server -- docker run -i \
  --env CS_ONPREM_URL="<your_on_prem_instance_url_if_any>" \
  --env CS_ACCESS_TOKEN="<your_code_scene_personal_access_token>" \
  --env CS_ACE_ACCESS_TOKEN="<ACE_access_token_if_using_ACE>" \
  codescene/codescene-mcp
How to use
Once running, AI agents and assistants can query the local repository for maintainability signals such as complexity hotspots, deep nesting, and low cohesion, enabling safer and more informed suggestions. If you have CodeScene ACE enabled, you can also leverage automated refactoring workflows in which AI-driven prompts are guided by precise modularity improvements.

The server runs locally and communicates with your CodeScene instance via your personal access token; analysis is performed on your repository data without sending code to external services. To use it, register the Docker command with your AI tool as a stdio MCP server and supply your CS_ACCESS_TOKEN. If you want ACE, also provide CS_ACE_ACCESS_TOKEN so ACE guidance can be applied during analysis and refactoring prompts.
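For AI tools configured through an mcp.json file, the stdio command can be registered roughly as follows. This is a sketch based on the common MCP client config format: the server name "codescene" and the `--rm` flag are illustrative choices, and your client's documentation determines the exact file location.

```json
{
  "mcpServers": {
    "codescene": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-e", "CS_ACCESS_TOKEN",
        "codescene/codescene-mcp"
      ],
      "env": {
        "CS_ACCESS_TOKEN": "<your_code_scene_personal_access_token>"
      }
    }
  }
}
```

Passing `-e CS_ACCESS_TOKEN` without a value tells Docker to forward the variable from the environment the client sets via the `env` block, so the token does not appear in the process argument list.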
How to install
Prerequisites:
- Docker installed and running on your machine
- A CodeScene account to obtain a CS_ACCESS_TOKEN
- (Optional) ACE add-on and CS_ACE_ACCESS_TOKEN if you plan to use ACE features
Installation steps:
- Retrieve your access token from CodeScene (CS_ACCESS_TOKEN).
- Start the MCP server via Docker:

      docker run -i codescene/codescene-mcp

  Note: you can customize environment variables as needed (see below).
- If you need to pass authentication or other environment variables, run with explicit env vars:
      docker run -i \
        -e CS_ACCESS_TOKEN="<your_token>" \
        -e CS_ONPREM_URL="<your_on_prem_url>" \
        -e CS_ACE_ACCESS_TOKEN="<ace_token>" \
        codescene/codescene-mcp

- Verify the MCP server is up by testing a simple query from your AI tool.
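As a minimal sketch of such a check, you can speak MCP's JSON-RPC handshake directly over stdio. The "initialize" message below is standard MCP, not CodeScene-specific; the token value is a placeholder, and the block is guarded so it is a no-op where Docker is unavailable.

```shell
# Standard MCP JSON-RPC "initialize" request, sent over stdin.
INIT_REQUEST='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.1"}}}'

# A healthy server replies on stdout with a JSON-RPC result
# describing its capabilities.
if command -v docker >/dev/null 2>&1; then
  echo "$INIT_REQUEST" |
    docker run -i --rm -e CS_ACCESS_TOKEN="<your_token>" codescene/codescene-mcp
fi
```

If the container starts but no response arrives, check the token variables before anything else.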
Notes:
- If you are behind a corporate proxy or firewall, ensure outbound access to your CodeScene instance's API is allowed so the CS_ACCESS_TOKEN can authenticate.
- For ACE usage, ensure the ACE token is valid and passed to the container via CS_ACE_ACCESS_TOKEN.
Additional notes
Tips and considerations:
- The MCP server runs locally and keeps analysis on your machine; no code is uploaded to external services.
- If using Docker, you may want to map a mount path for persistent configuration and per-project setups, especially when you have multiple repos.
- When mounting volumes, consider per-project mcp.json configurations or a root-mounted workspace with project-specific references.
- If you encounter token-related errors, double-check that CS_ACCESS_TOKEN is valid and not expired.
- ACE is an optional add-on. If you’re not using ACE, you can omit CS_ACE_ACCESS_TOKEN; the MCP will still provide Code Health insights.
- For best AI results, use frontier models where supported by your AI tool, as Code Health prompts and ACE guidance are designed to complement them.
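The volume-mount approach above can be sketched as follows. The `/workspace` mount point and the `--rm` flag are illustrative assumptions rather than documented defaults, and the block is guarded so it is a no-op where Docker is unavailable.

```shell
# Mount the current repository into the container so the analysis can
# see it; the /workspace target path is an assumed convention.
WORKSPACE_DIR="$(pwd)"

if command -v docker >/dev/null 2>&1; then
  docker run -i --rm \
    -e CS_ACCESS_TOKEN="<your_token>" \
    -v "$WORKSPACE_DIR:/workspace" \
    codescene/codescene-mcp
fi
```

For multiple repositories, repeating this with a per-project mount (or a per-project mcp.json entry) keeps each server instance scoped to one codebase.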