lad_mcp_server
Lad MCP Server: Autonomous code & system design review for AI coding agents (Claude Code, Cursor, Codex, etc.). Features multi-model consensus via OpenRouter and context-aware reviews via Serena.
claude mcp add --transport stdio shelpuk-ai-technology-consulting-lad_mcp_server node lad_server.js \
  --env SERENA_API_KEY="YOUR_SERENA_API_KEY" \
  --env OPENROUTER_API_KEY="YOUR_OPENROUTER_API_KEY"
(SERENA_API_KEY is optional and only needed for Serena integration; OPENROUTER_API_KEY is optional and used for dual-reviewer model metadata retrieval.)
How to use
Lad is a project-aware AI design and code review MCP server. It enhances AI-generated code reviews by providing a second, independent evaluation (dual-reviewer) and, when available, integrates with Serena to reference project history, requirements, and design decisions. This enables the reviewer LLMs to spot inconsistencies with broader project constraints and ensures code changes align with long-term goals rather than isolated diffs. Use Lad alongside other MCP servers in the suite to get richer feedback on AI-generated code, with the option to enable Serena-backed context for deeper analysis.
To use Lad effectively, run the Lad MCP server in your environment and query its code review and system design tools as part of your agent workflow. Lad exposes two primary review modes: code_review for diffs and system_design_review for architectural planning. When Serena is available, Lad connects to Serena's repository index and memories to ground reviews in your project history. If Serena is not used, Lad still provides robust dual-reviewer analysis based on the submitted code and context provided in the request.
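As a sketch of how an agent might invoke one of these tools, the request below follows the standard MCP `tools/call` shape over JSON-RPC; the argument names (`diff`, `context`) are illustrative assumptions, not confirmed parameter names from Lad's schema, so check the server's tool listing for the actual fields:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "code_review",
    "arguments": {
      "diff": "--- a/app.js\n+++ b/app.js\n@@ ...",
      "context": "Refactors the request handler to use async/await"
    }
  }
}
```

A `system_design_review` call would take the same shape with a different tool name and arguments describing the proposed architecture.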
How to install
Prerequisites:
- Node.js (v14 or newer) and npm installed on your system
- Access to OpenRouter for dual-reviewer model metadata (optional but recommended)
- Optionally Serena set up if you want project-aware memory integration
Installation steps:
1. Clone the Lad MCP Server repository:
   git clone https://github.com/Shelpuk-AI-Technology-Consulting/lad_mcp_server.git
   cd lad_mcp_server
2. Install dependencies:
   npm install
3. Configure the environment (example):
   - Create a .env file or export the variables in your shell
   - Example content:
     OPENROUTER_API_KEY=your-openrouter-api-key
     SERENA_API_KEY=your-serena-api-key
4. Start the server:
   npm run start (or node lad_server.js)
5. Verify the server is running by checking the logs or the configured port (default: 3000), then add the mcp_config snippet to your MCP orchestration to enable Lad in your workflow.
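One possible shape for that configuration, assuming your MCP client uses the common `mcpServers` convention (the server name "lad" and the file layout here are illustrative; adapt paths and keys to your client):

```json
{
  "mcpServers": {
    "lad": {
      "command": "node",
      "args": ["lad_server.js"],
      "env": {
        "OPENROUTER_API_KEY": "your-openrouter-api-key",
        "SERENA_API_KEY": "your-serena-api-key"
      }
    }
  }
}
```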
Additional notes:
- If you enable Serena integration, Lad can reference persistent project memories, design decisions, and requirements to improve review quality. This requires proper Serena setup and API access.
- For dual-reviewer mode, set OPENROUTER_SECONDARY_REVIEWER_MODEL; this variable controls whether Lad runs in single- or dual-reviewer mode.
- Keep API keys secret. Do not commit them to version control; use environment variables or a secrets manager.
- If you encounter token or model loading issues, verify network access to OpenRouter and Serena endpoints, and ensure the correct API keys are loaded.
- You can customize environment variables to control model selection, review verbosity, and memory behavior depending on your security and performance needs.
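Putting these notes together, a minimal environment setup might look like the following. The variable names come from this document; the model identifier is a placeholder example, and the repository's documentation remains the authoritative reference:

```shell
# Required for dual-reviewer model metadata retrieval via OpenRouter
export OPENROUTER_API_KEY="your-openrouter-api-key"
# Example OpenRouter model id (placeholder); enables dual-reviewer mode
export OPENROUTER_SECONDARY_REVIEWER_MODEL="openai/gpt-4o"
# Optional: only needed for Serena-backed project memory integration
export SERENA_API_KEY="your-serena-api-key"

node lad_server.js
```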