mandoline
MCP server that enables LLMs to evaluate themselves
claude mcp add --transport stdio mandoline-ai-mandoline-mcp-server \
  --env PORT="8080" \
  -- node server.js
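For clients that are configured through a JSON file rather than the CLI (Claude Desktop, for example, uses an `mcpServers` map), an equivalent entry might look like the sketch below. The server name `mandoline` and the path to `server.js` are placeholders to adapt to your checkout.

```json
{
  "mcpServers": {
    "mandoline": {
      "command": "node",
      "args": ["server.js"],
      "env": { "PORT": "8080" }
    }
  }
}
```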
How to use
Mandoline provides an MCP server that exposes a suite of evaluation tools and resources to help AI assistants reflect on, critique, and improve their own performance using Mandoline's evaluation framework.
To use it, integrate the Mandoline MCP server into your assistant as described in the client setup sections. When you start the server locally with npm start, it listens at http://localhost:8080 by default, and you can configure clients to point at that local endpoint during development. The available tools are organized under Health, Metrics, Evaluations, and Resources:
- Health: get_server_health helps verify the MCP server is reachable and healthy.
- Metrics: create_metric, batch_create_metrics, get_metric, get_metrics, update_metric allow you to define and manage evaluation criteria for tasks you care about.
- Evaluations: create_evaluation, batch_create_evaluations, get_evaluation, get_evaluations, update_evaluation enable you to score prompts and responses against your metrics and inspect historical results.
- Resources: llms.txt and mcp provide quick access to Mandoline docs and MCP setup guidance for assistants.
Once configured, your AI assistant can invoke these tools during conversations to evaluate outputs, compare options, and generate structured feedback, improving alignment with your evaluation criteria over time.
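Under the hood, MCP tool invocations are JSON-RPC 2.0 requests. As a rough sketch, this is the shape of a tools/call request a client would send to invoke get_server_health; the empty arguments object assumes the tool takes no parameters, and the helper name buildToolCall is illustrative, not part of Mandoline's API.

```typescript
// Sketch of the JSON-RPC 2.0 envelope an MCP client sends to invoke a tool.
// Field names (method "tools/call", params.name, params.arguments) follow the
// MCP specification; get_server_health is the health tool listed above.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  id: number,
  tool: string,
  args: Record<string, unknown> = {}
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: tool, arguments: args },
  };
}

// A health-check call carries no arguments.
const healthCheck = buildToolCall(1, "get_server_health");
```

In practice your MCP client library constructs and transports these messages for you; the sketch only shows what travels over the wire.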
How to install
Prerequisites
- Node.js 18+ and npm
- A working Mandoline account and API keys if you plan to use Mandoline tools that require authentication
Installation steps
- Clone the repository and install dependencies
git clone https://github.com/mandoline-ai/mandoline-mcp-server.git
cd mandoline-mcp-server
npm install
- Build (if the project uses a build step)
npm run build
- Optional: configure environment for local run
cp .env.example .env.local
# Edit .env.local to customize PORT, LOG_LEVEL, etc.
- Start the server
npm start
The server will run on http://localhost:8080 by default (unless you override the PORT in your environment).
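The default-port behavior described above follows the usual Node convention: read PORT from the environment and fall back to 8080 when it is unset or invalid. A minimal sketch (resolvePort is an illustrative name, not part of the Mandoline server's API):

```typescript
// Resolve the listen port the way a typical Node server does: use the PORT
// environment variable when it parses to a number, else fall back to 8080.
function resolvePort(env: Record<string, string | undefined>): number {
  const parsed = env.PORT !== undefined ? Number.parseInt(env.PORT, 10) : NaN;
  return Number.isNaN(parsed) ? 8080 : parsed;
}

// In a real server this would be called as resolvePort(process.env).
```

So running `PORT=9090 npm start` would move the server to http://localhost:9090, assuming the server reads PORT in this conventional way.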
Using the hosted server
If you prefer the hosted Mandoline MCP server, configure clients to point to https://mandoline.ai/mcp as shown in the client setup instructions. Replace any local endpoints accordingly in your assistant integration.
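For clients that only speak stdio, one common way to reach a remote endpoint is the mcp-remote bridge package. A hedged example configuration follows; the server name "mandoline" is arbitrary, and some clients support remote URLs directly, in which case no bridge is needed.

```json
{
  "mcpServers": {
    "mandoline": {
      "command": "npx",
      "args": ["mcp-remote", "https://mandoline.ai/mcp"]
    }
  }
}
```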
Additional notes
- If you run locally, ensure PORT 8080 is open in your environment and not used by another process.
- When updating configuration (e.g., API keys), restart your client tools (Claude Code, Codex, Claude Desktop, Cursor) to pick up changes.
- For local development, ensure you have a valid Mandoline API key if you plan to enable any authenticated features.
- The MCP server supports a variety of tools (health, metrics, evaluations, resources); you can expose or customize endpoints as needed in your deployment.
- Check the official documentation linked in the README for up-to-date usage guidance and compatibility notes.
Related MCP Servers
context7
Context7 MCP Server -- Up-to-date code documentation for LLMs and AI code editors
mcp-langfuse
Model Context Protocol (MCP) Server for Langfuse Prompt Management. This server allows you to access and manage your Langfuse prompts through the Model Context Protocol.
rohlik
MCP server that lets you shop groceries across the Rohlik Group platforms (Rohlik.cz, Knuspr.de, Gurkerl.at, Kifli.hu, Sezamo.ro)
mcp-chart-minio
A self-hosted (private) deployment solution for mcp-server-chart
kanban
MCP Kanban is a specialized middleware designed to facilitate interaction between Large Language Models (LLMs) and Planka, a Kanban board application. It serves as an intermediary layer that provides LLMs with a simplified and enhanced API to interact with Planka's task management system.
codemesh
The Self-Improving MCP Server - Agents write code to orchestrate multiple MCP servers with intelligent TypeScript execution and auto-augmentation