code-review-assistant
The Code Review Assistant is a simple multi-agent system built using the Model Context Protocol (MCP) and LangChain. Its purpose is to provide automated, preliminary feedback on code snippets, including syntax checking, code explanation, and improvement suggestions. It can be integrated with MCP-compatible clients such as Cursor IDE or Claude Desktop.
claude mcp add dasunmihiranga-code-review-assistant-mcp --transport stdio --env GROQ_API_KEY="paste-your-groq-api-key-here" -- uv run code_review_server.py
How to use
The Code Review Assistant exposes a multi-agent code review toolset via MCP. Its simple, modular design provides automated, preliminary feedback on code snippets: syntax checks, a high-level explanation of what the code does, and actionable improvement suggestions. MCP clients access these capabilities through a single tool named review_code. When you call review_code with a code snippet, the server processes the input using the configured language model backend (local Ollama models or the Groq API) and returns a consolidated review covering potential syntax issues, a plain-language explanation of the code, and suggested improvements. This makes it suitable for quick first-pass reviews during development or as a companion assistant integrated into IDEs or chat-driven MCP clients.
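For example, a standalone MCP client can spawn the server over stdio and call review_code directly. The sketch below uses the official MCP Python SDK; the launch command and the code argument name are assumptions based on this README, so check the actual tool schema (e.g., via list_tools) before relying on them.

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Spawn the server as a stdio subprocess (adjust the command to your setup).
    params = StdioServerParameters(command="uv", args=["run", "code_review_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "code" is an assumed argument name; verify it with session.list_tools().
            result = await session.call_tool(
                "review_code",
                {"code": "def add(a, b):\n    return a + b"},
            )
            print(result.content)

asyncio.run(main())
```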
How to install
Prerequisites:
- Python 3.8 or newer installed on your system
- The uv tool installed (e.g., pipx install uv)
- Access to the Groq API or local Ollama models, depending on which backend you plan to use
- A clone of this repository containing code_review_server.py and related modules
Step-by-step setup:
- Clone the repository to your machine and navigate into the project directory.
- Install uv (Python tool) if you haven't already:
- pipx install uv
- Create and activate a virtual environment (optional but recommended):
- uv venv
- On Windows: .venv\Scripts\activate
- On macOS/Linux: source .venv/bin/activate
- Install Python dependencies listed in requirements.txt (if not handled by uv):
- uv sync
- Create a .env file at the project root with your configuration. Copy from a sample file if provided and insert any necessary keys (e.g., your Groq API key):
GROQ_API_KEY=your-groq-api-key
Add any other environment variables your setup requires.
- Ensure any optional backends (like Ollama) are installed and running per their instructions.
- Start the MCP server using uv:
- uv run code_review_server.py
The server will listen for connections from MCP clients and expose the review_code tool.
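To give a sense of what starting the server involves, here is a minimal, hypothetical sketch of a server entry point that loads the .env file and exposes a review_code tool over stdio. It assumes python-dotenv and the MCP Python SDK's FastMCP helper; the project's real code_review_server.py and its LangChain agent pipeline will differ.

```python
import os

from dotenv import load_dotenv           # assumes python-dotenv is installed
from mcp.server.fastmcp import FastMCP

load_dotenv()  # read GROQ_API_KEY and other settings from the local .env file

mcp = FastMCP("code-review-assistant")

@mcp.tool()
def review_code(code: str) -> str:
    """Return a consolidated review: syntax issues, an explanation, and suggestions."""
    # The real server would dispatch to its LangChain agents (Groq or Ollama backend) here.
    backend = "groq" if os.getenv("GROQ_API_KEY") else "ollama"
    return f"[{backend}] review placeholder for:\n{code}"

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```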
Notes:
- If you use Ollama or Groq, ensure the respective services are running and accessible before starting the server.
- You can customize prompts and agent behavior by editing the files under prompts/ and agents/ as needed.
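For instance, a review prompt in the style of the templates under prompts/ might look roughly like this LangChain sketch; the wording and structure are illustrative, not the project's actual templates.

```python
from langchain_core.prompts import ChatPromptTemplate

# Illustrative only: the real templates live under prompts/ and may differ.
review_prompt = ChatPromptTemplate.from_messages([
    ("system",
     "You are a code reviewer. For the given snippet, report any syntax issues, "
     "explain in plain language what the code does, and suggest concrete improvements."),
    ("human", "Review this code:\n\n{code}"),
])

messages = review_prompt.format_messages(code="def add(a, b): return a + b")
```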
Additional notes
Tips and common issues:
- Environment variables: Keep sensitive keys (GROQ_API_KEY, Ollama endpoints) out of version control; use a local .env file and load it at startup.
- Backends: The server supports both local Ollama models and the Groq API. If a Groq API key is not configured, the server may fall back to an alternative backend if one is available (see the sketch after this list).
- Connecting clients: Ensure your MCP client is configured with the server's name and, for stdio transport, its launch command (or host and port for network transports).
- Prompts: The quality of the reviews depends on the prompt templates in prompts/. If you need more detailed explanations or stricter style guidelines, adjust the prompt templates accordingly.
- Troubleshooting: If the server fails to start, check for port conflicts, Python dependencies, and that uv is correctly installed in your environment.
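As a rough illustration of the backend fallback mentioned above, selection between Groq and a local Ollama model could look like the following; the langchain_groq / langchain_ollama packages and the model names are assumptions, not details confirmed by this project.

```python
import os

from langchain_groq import ChatGroq        # assumed Groq backend wrapper
from langchain_ollama import ChatOllama    # assumed local Ollama backend wrapper

def get_llm():
    """Prefer Groq when an API key is configured, otherwise fall back to Ollama."""
    if os.getenv("GROQ_API_KEY"):
        return ChatGroq(model="llama-3.1-8b-instant")  # illustrative model name
    return ChatOllama(model="llama3")                  # illustrative model name

llm = get_llm()
print(llm.invoke("Summarize what a code review should cover in one sentence.").content)
```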
Related MCP Servers
mcp-ical
A Model Context Protocol server that allows you to interact with your macOS Calendar through natural language.
mcp-odoo
A Model Context Protocol (MCP) server that enables AI assistants to securely interact with Odoo ERP systems through standardized resources and tools for data retrieval and manipulation.
oxylabs
Official Oxylabs MCP integration
lc2mcp
Convert LangChain tools to FastMCP tools
LLaMa-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and the Model Context Protocol (MCP).
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.