
frappe

AI power for Frappe Framework

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio vyogotech-frappe-mcp-server -- \
  docker run -i --rm \
  --env DEBUG="false" \
  --env MODEL_API_KEY="your-model-api-key" \
  --env FRAPPE_API_KEY="your-frappe-api-key" \
  --env MODEL_BASE_URL="https://your-llm-provider-or-local-model" \
  --env FRAPPE_BASE_URL="https://your-erpnext-instance" \
  vyogotech/frappe-mcp-server:latest

How to use

This MCP server acts as a bridge between ERPNext/Frappe applications and AI assistants via the MCP protocol. It exposes capabilities to run natural-language queries against ERPNext data, perform CRUD operations on any doctype, search documents, and analyze documents or projects. Clients such as Cursor IDE and Claude Desktop (or plain HTTP API calls) can issue commands like listing documents, creating or updating records, and requesting deep analyses that relate documents across doctype boundaries. The server supports a generic LLM configuration workflow, so you can point it at a local Ollama model or a cloud provider (OpenAI, Together.ai, Groq, etc.). Typical usage involves sending a chat-like payload to the REST API (e.g., /api/v1/chat) or speaking the MCP protocol through your preferred client.
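For illustration, a minimal REST call against the /api/v1/chat endpoint mentioned above might look like the sketch below. The payload field name (`message`), the port, and the `FRAPPE_MCP_URL` variable are assumptions for this example; check the project's API Reference for the actual request schema.

```shell
# Hypothetical chat request; field names and port are assumptions.
BASE_URL="${FRAPPE_MCP_URL:-http://localhost:8080}"
PAYLOAD='{"message": "How many open sales orders do we have?"}'

# Uncomment to send against a running server:
# curl -s -X POST "$BASE_URL/api/v1/chat" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
echo "POST $BASE_URL/api/v1/chat"
echo "$PAYLOAD"
```

The same interaction is available without REST by letting an MCP client (Cursor, Claude Desktop) speak the protocol over stdio instead.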

How to install

Prerequisites:

  • Docker installed and running (for the recommended docker deployment)
  • ERPNext/Frappe instance accessible from the deployment environment
  • Optional: Ollama or other OpenAI-compatible providers for local or remote AI models
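Before installing, a quick preflight sketch like the one below can confirm the prerequisites. The `/api/method/ping` endpoint is a standard Frappe health check, but the `FRAPPE_BASE_URL` value here is a placeholder you must replace with your own instance:

```shell
# Preflight: check that Docker is running and the Frappe instance is reachable.
if command -v docker >/dev/null 2>&1 && docker info >/dev/null 2>&1; then
  echo "docker: ok"
else
  echo "docker: missing or not running"
fi

FRAPPE_BASE_URL="${FRAPPE_BASE_URL:-https://your-erpnext-instance}"
# Requires network access from this machine to your ERPNext/Frappe instance:
if curl -sf "$FRAPPE_BASE_URL/api/method/ping" >/dev/null 2>&1; then
  echo "frappe: reachable"
else
  echo "frappe: not reachable at $FRAPPE_BASE_URL"
fi
```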

Option A: Automated Install (Recommended)

  1. Run the install script provided by the project to install the MCP server binary for MCP clients (Cursor, Claude Desktop):
curl -fsSL https://raw.githubusercontent.com/vyogotech/frappe-mcp-server/main/install.sh | bash

This installs the MCP server binary to ~/.local/bin/frappe-mcp-server-stdio.

Option B: Manual Install (Go binary)

  1. Clone the repository or download the pre-built binary from releases:
# Clone
git clone https://github.com/vyogotech/frappe-mcp-server
cd frappe-mcp-server
  2. Build the STDIO server (for MCP clients like Cursor/Claude Desktop):
make build-stdio
  3. Configure the server:
cp config.yaml.example config.yaml
# Edit config.yaml with your ERPNext/Frappe credentials and model/provider details
  4. Run the server locally (example using the built binary):
./frappe-mcp-server-stdio --config /path/to/config.yaml
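The configuration step above copies `config.yaml.example`, which is the authoritative reference for key names. As a hypothetical sketch only, the file likely groups the same credentials used in the Docker command:

```yaml
# Hypothetical layout; consult config.yaml.example for the real key names.
frappe:
  base_url: "https://your-erpnext-instance"
  api_key: "your-frappe-api-key"

model:
  base_url: "http://localhost:11434"   # e.g. a local Ollama endpoint
  api_key: "your-model-api-key"

debug: false
```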

Notes:

  • If using Docker, ensure network access to your ERPNext/Frappe instance and to any AI model providers.
  • Ensure that API keys and base URLs are securely managed (prefer environment variables or secret management in production).

Additional notes

Tips and common considerations:

  • Environment variables: FRAPPE_BASE_URL, FRAPPE_API_KEY, MODEL_BASE_URL, MODEL_API_KEY, DEBUG. Use secure values and avoid hardcoding in your config.
  • The server exposes tools like CRUD operations (get_document, list_documents, create_document, update_document, delete_document), search_documents, analyze_document, and project tools (get_project_status, portfolio_dashboard, analyze_project_timeline). Leverage these in your LLM prompts to interact with ERPNext data.
  • If running locally with Ollama, ensure Ollama is up and the model is loaded/accessible by the server.
  • For Claude Desktop or Cursor, configure the client to point at the server's REST API or the stdio binary as shown in the quick start examples.
  • Review the API Reference and Quick Start guides in the documentation for detailed request formats and payload schemas.
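As one concrete example of the client-side wiring mentioned above, Claude Desktop registers stdio servers in its `claude_desktop_config.json`. An entry pointing at the binary installed earlier might look like this (the paths are illustrative; Claude Desktop does not expand `~`, so use an absolute path):

```json
{
  "mcpServers": {
    "frappe": {
      "command": "/home/you/.local/bin/frappe-mcp-server-stdio",
      "args": ["--config", "/path/to/config.yaml"]
    }
  }
}
```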
