
mcp-smart-notes

A prototype Model Context Protocol (MCP) note-taking system with intelligent auto-tagging powered by local LLMs. It offers full MCP specification compliance, JSON-RPC 2.0 transport, and robust error handling with graceful fallbacks.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio spewp-mcp-smart-notes python -m note_server \
  --env MODEL_NAME="qwen2.5:7b" \
  --env OLLAMA_HOST="localhost"

Set MODEL_NAME to any local LLM model available through Ollama.

How to use

The mcp-smart-notes server provides a local, MCP-compliant note-taking system with automatic tagging powered by a local LLM via Ollama. It exposes a set of MCP tools to create, search, and manage notes with intelligent categorization. Communication with clients runs over JSON-RPC 2.0, and built-in error handling recovers gracefully from common issues such as the model or Ollama being unavailable. A typical workflow: create a note with a title and content, and the system automatically assigns tags from predefined categories such as Coding, Education, Greeting, and Finance.

Available tools include create_note (to add new notes with automatic tagging), search_notes (full-text search across titles, content, and tags), search_by_tag (find notes by a specific tag), list_notes (overview of notes with IDs and tagging status), update_note (modify title, content, or tags), and delete_note (permanently remove a note by ID). These tools enable a compact CLI or integration layer to drive a local knowledge base with privacy-preserving, on-device inference.
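As a hedged illustration of the JSON-RPC 2.0 transport, here is roughly what a client-side `tools/call` request for create_note might look like. The `title` and `content` argument names are assumptions, not the server's documented schema; a real client should discover the actual parameters via the MCP `tools/list` method.

```python
import json

# Sketch of a JSON-RPC 2.0 request an MCP client might send over stdio
# to invoke the create_note tool. The "title"/"content" argument names
# are assumptions; query tools/list for the server's actual schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_note",
        "arguments": {
            "title": "Mortgage refinancing checklist",
            "content": "Compare fixed vs. variable rates before June.",
        },
    },
}

# The MCP stdio transport exchanges newline-delimited JSON messages.
line = json.dumps(request)
print(line)
```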

How to install

Prerequisites:

  • Python 3.8 or newer
  • Ollama installed and running locally
  • A compatible local LLM model available to Ollama (recommended: qwen2.5:7b)

Installation steps:

  1. Clone the repository:
     git clone https://github.com/yourusername/mcp-smart-notes.git
     cd mcp-smart-notes

  2. Install Python dependencies:
     python -m pip install --upgrade pip
     pip install -r requirements.txt

  3. Start Ollama in a separate terminal:
     ollama serve

  4. Pull a compatible model (in another terminal):
     ollama pull qwen2.5:7b

  5. Run the MCP server:
     python -m note_server

If you prefer a different invocation, adjust the command to point to your server module or entry point as needed.
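Before launching the server, it can help to confirm that Ollama is actually reachable. A minimal sketch, assuming Ollama's default HTTP API on port 11434 (its `/api/tags` endpoint lists locally pulled models):

```python
import json
import urllib.request
import urllib.error

def ollama_available(host="http://localhost:11434", timeout=2.0):
    """Return True if an Ollama server responds at `host`.

    Probes the /api/tags endpoint, which lists locally pulled models.
    """
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=timeout) as resp:
            json.load(resp)  # valid JSON response means a healthy instance
            return True
    except (urllib.error.URLError, OSError, ValueError):
        return False
```

If this returns False, start `ollama serve` (and pull your model) before running the MCP server, so auto-tagging does not immediately fall back.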

Additional notes

Environment and configuration tips:

  • Ensure Ollama is running and the chosen model is available locally to prevent failures in auto-tagging.
  • The server uses JSON-RPC 2.0; clients should send properly structured requests and handle standard MCP error codes.
  • If the model or Ollama becomes temporarily unavailable, the system will attempt fallbacks and report actionable errors rather than crashing.
  • You can customize tag categories by adjusting the tagging bridge or rules within the codebase, provided you maintain MCP compatibility.
  • For production deployments, consider running Ollama with sufficient resources and enable log levels that aid debugging without leaking sensitive note content.
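The fallback behavior described above can be sketched as a simple keyword-based tagger that steps in when the LLM is unreachable. This is a hypothetical illustration, not the server's actual rules: the categories mirror those mentioned earlier, but the keyword lists and function name are assumptions.

```python
import re

# Hypothetical keyword-based fallback tagger for when the local LLM is
# unavailable. Categories mirror the ones named above; the keyword lists
# are illustrative assumptions, not the server's real tagging rules.
FALLBACK_KEYWORDS = {
    "Coding": ["python", "bug", "function", "deploy"],
    "Education": ["course", "study", "lecture"],
    "Greeting": ["hello", "welcome", "hi"],
    "Finance": ["budget", "invoice", "rates", "mortgage"],
}

def fallback_tags(text):
    """Assign tags by whole-word keyword matching instead of LLM inference."""
    words_in_text = set(re.findall(r"[a-z]+", text.lower()))
    matched = [
        tag
        for tag, keywords in FALLBACK_KEYWORDS.items()
        if words_in_text & set(keywords)
    ]
    return matched or ["Untagged"]

print(fallback_tags("Fix the Python deploy bug"))  # → ['Coding']
```

Matching whole words (rather than raw substrings) avoids false positives such as "hi" matching inside "this"; a real implementation would also report that it fell back, per the error-handling note above.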
