aider
An experimental MCP server to use aider as a coding agent.
claude mcp add --transport stdio danielscholl-aider-mcp-server --env TRANSPORT="stdio" -- python -m aider_mcp_server
How to use
The Aider MCP Server exposes AI coding capabilities via a standardized MCP interface. It uses Aider to perform coding tasks, model selection, and general question answering, accessible through tools such as ai_code for coding tasks, get_models to list available models, and ask_question for direct prompts to an LLM. The server supports multiple transports (SSE or stdio) to fit into your integration workflow, and lets you configure Aider sessions with a range of environment variables, including API keys for OpenAI, Anthropic, and Google Gemini. You can run the server in SSE mode to expose a streaming API endpoint, or use stdio mode where the MCP client starts the server automatically. The included MCP tools provide concrete JSON inputs and examples to help you orchestrate AI-assisted coding tasks or queries within your applications.
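The tool inputs can be sketched as plain JSON payloads. The tool names and the ai_coding_prompt / editable-files idea come from the description above; the remaining argument names are illustrative assumptions, so consult the server's actual tool schemas before relying on them.

```python
# Illustrative MCP tool-call payloads for the three tools named above.
# The tool names and "ai_coding_prompt" follow the text; the other
# argument names are assumptions, not the server's exact schema.
ai_code_request = {
    "name": "ai_code",
    "arguments": {
        "ai_coding_prompt": "Add input validation to parse_config()",
        "relative_editable_files": ["src/config.py"],  # keep edits in project scope
    },
}

get_models_request = {
    "name": "get_models",
    "arguments": {"substring": "gemini"},  # hypothetical filter argument
}

ask_question_request = {
    "name": "ask_question",
    "arguments": {"question": "What does this traceback mean?"},
}
```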
How to install
Prerequisites:
- Python 3.10 or higher
- An internet-connected environment for installing dependencies
- An MCP client or client library that supports SSE (HTTP streaming) or stdio integration
Install steps:
- Clone the repository:
  git clone https://github.com/your-username/aider-mcp.git
  cd aider-mcp
- Create and activate a Python virtual environment (recommended):
  python -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
- Install the package in editable mode:
  uv pip install -e .
- Run tests to verify the setup:
  uv run pytest
- Start the server (example for SSE mode):
  uv run python -m aider_mcp_server
Note: Ensure API keys for any models you intend to use (OpenAI, Anthropic, Gemini) are provided via environment variables (OPENAI_API_KEY, ANTHROPIC_API_KEY, GEMINI_API_KEY) as needed.
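As a quick post-install sanity check, a short script can confirm that the server module is importable and report which provider keys are set. This is a sketch: the module name mirrors the run command above, and the check itself is not part of the project.

```python
import importlib.util
import os

def preflight(module="aider_mcp_server"):
    """Illustrative preflight check: is the server module importable in this
    environment, and which provider API keys are currently set?"""
    keys = ("OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GEMINI_API_KEY")
    return {
        "module_importable": importlib.util.find_spec(module) is not None,
        "keys_present": [k for k in keys if os.environ.get(k)],
    }

result = preflight()
```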
Additional notes
Environment variables and configuration:
- TRANSPORT: sse (default), or stdio when the MCP client spawns the server over stdio.
- HOST, PORT: Bind address and port for SSE transport as needed.
- API keys: Provide OPENAI_API_KEY, ANTHROPIC_API_KEY, GEMINI_API_KEY only if you intend to use the corresponding models.
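The variables above can be consumed in the usual os.environ way. A minimal sketch follows; the fallback host and port values shown are placeholders, not the server's documented defaults.

```python
import os

# Sketch of how the documented variables might be read at startup.
# The fallback host/port values are placeholders, not actual defaults.
transport = os.environ.get("TRANSPORT", "sse")   # "sse" or "stdio"
host = os.environ.get("HOST", "127.0.0.1")       # SSE bind address (placeholder)
port = int(os.environ.get("PORT", "8000"))       # SSE port (placeholder)
api_keys = {
    name: os.environ.get(name)
    for name in ("OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GEMINI_API_KEY")
}
```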
Common issues:
- If the server fails to start, verify your Python version and that the aider_mcp_server module is importable in your environment.
- When using stdio mode, the MCP client must be configured to spawn the server with the correct command and environment (e.g., TRANSPORT=stdio).
- Ensure network accessibility if MCP clients connect over SSE; firewall rules may block the SSE endpoint.
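The stdio spawn configuration mentioned above is typically supplied by the MCP client. The shape below follows the common "mcpServers" JSON convention used by Claude Desktop-style clients; the exact layout is client-specific, so treat it as an assumption.

```python
# Common "mcpServers" configuration shape for spawning the server over
# stdio (client-specific; shown in the Claude Desktop-style convention).
stdio_config = {
    "mcpServers": {
        "aider-mcp-server": {
            "command": "python",
            "args": ["-m", "aider_mcp_server"],
            "env": {"TRANSPORT": "stdio"},  # required for stdio mode
        }
    }
}
```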
Tips:
- Use get_models to discover available models before running ai_code tasks.
- For code generation tasks, supply a clear ai_coding_prompt and restrict editable files to the project scope.
- When using Docker for deployment, mount your project path and pass API keys as environment variables.
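The first two tips can be combined into a simple discover-then-code sequence. The sketch below only builds the ordered tool calls; the argument names beyond ai_coding_prompt are illustrative assumptions.

```python
# Illustrative orchestration: query get_models first, then issue an
# ai_code task restricted to files inside the project scope.
def plan_calls(model_hint, prompt, files):
    return [
        {"name": "get_models", "arguments": {"substring": model_hint}},
        {
            "name": "ai_code",
            "arguments": {
                "model": model_hint,  # hypothetical argument name
                "ai_coding_prompt": prompt,
                "relative_editable_files": files,
            },
        },
    ]

calls = plan_calls("gemini", "Add unit tests for parser.py",
                   ["parser.py", "tests/test_parser.py"])
```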
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers. Written using FastMCP.