mcp-smart-notes
Prototype Model Context Protocol (MCP) note-taking system with intelligent auto-tagging powered by local LLMs. Full MCP specification compliance, JSON-RPC 2.0 transport, and robust error handling with graceful fallbacks.
claude mcp add --transport stdio spewp-mcp-smart-notes \
  --env MODEL_NAME="qwen2.5:7b" \
  --env OLLAMA_HOST="localhost" \
  -- python -m note_server
Set MODEL_NAME to any other local LLM model available via Ollama if you prefer a different model.
How to use
The mcp-smart-notes server provides a local, MCP-compliant note-taking system with automatic tagging powered by a local LLM via Ollama. It exposes a set of MCP tools to create, search, and manage notes with intelligent categorization. Communication with clients uses JSON-RPC 2.0, and built-in error handling recovers gracefully from common issues such as the model or Ollama being unavailable. A typical workflow is to create a note with a title and content; the system then automatically assigns tags from predefined categories such as Coding, Education, Greeting, and Finance.
Available tools include create_note (to add new notes with automatic tagging), search_notes (full-text search across titles, content, and tags), search_by_tag (find notes by a specific tag), list_notes (overview of notes with IDs and tagging status), update_note (modify title, content, or tags), and delete_note (permanently remove a note by ID). These tools enable a compact CLI or integration layer to drive a local knowledge base with privacy-preserving, on-device inference.
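As a sketch of how a client might invoke one of these tools, the snippet below builds a JSON-RPC 2.0 `tools/call` request for `create_note`. The tool name comes from the list above, but the argument field names (`title`, `content`) are assumptions and should be checked against the server's published tool schema.

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical create_note invocation; the argument keys are assumed.
msg = make_tool_call(1, "create_note", {
    "title": "Budget review",
    "content": "Summarize Q3 spending before Friday.",
})
print(msg)
```

The server's response would carry the note ID and the automatically assigned tags in the JSON-RPC `result` field.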
How to install
Prerequisites:
- Python 3.8 or newer
- Ollama installed and running locally
- A compatible local LLM model available to Ollama (recommended: qwen2.5:7b)
Installation steps:
1. Clone the repository:
   git clone https://github.com/yourusername/mcp-smart-notes.git
   cd mcp-smart-notes
2. Install Python dependencies:
   python -m pip install --upgrade pip
   pip install -r requirements.txt
3. Start Ollama in a separate terminal:
   ollama serve
4. Pull a compatible model (in another terminal):
   ollama pull qwen2.5:7b
5. Run the MCP server:
   python -m note_server
If you prefer a different invocation, adjust the command to point to your server module or entry point as needed.
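Once the server is running, an MCP client talks to it over stdio using newline-delimited JSON-RPC 2.0 messages. A minimal sketch of that framing is shown below; the `initialize` parameters (protocol version, client info) are illustrative values, not this server's exact requirements.

```python
import json

def frame(message: dict) -> bytes:
    """Encode one JSON-RPC message for the stdio transport (newline-delimited)."""
    return (json.dumps(message) + "\n").encode("utf-8")

def unframe(line: bytes) -> dict:
    """Decode one newline-delimited JSON-RPC message."""
    return json.loads(line.decode("utf-8"))

# An MCP session begins with an initialize request from the client.
init = frame({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2025-03-26",  # assumption: match the server's supported version
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1"},
    },
})
print(unframe(init)["method"])
```

In practice a client like Claude Code handles this handshake for you; the sketch only shows what travels over the pipe.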
Additional notes
Environment and configuration tips:
- Ensure Ollama is running and the chosen model is available locally to prevent failures in auto-tagging.
- The server uses JSON-RPC 2.0; clients should send properly structured requests and handle standard MCP error codes.
- If the model or Ollama becomes temporarily unavailable, the system will attempt fallbacks and report actionable errors rather than crashing.
- You can customize tag categories by adjusting the tagging bridge or rules within the codebase, provided you maintain MCP compatibility.
- For production deployments, consider running Ollama with sufficient resources and enable log levels that aid debugging without leaking sensitive note content.
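The graceful-fallback behavior described above could be structured roughly as follows. This is a sketch, not the server's actual implementation: `auto_tag`, the `classify` callable wrapping the Ollama call, and the "untagged" fallback marker are all illustrative names; only the category list mirrors the ones mentioned earlier.

```python
from typing import Callable, List

# Example category set mirroring the ones this README mentions.
CATEGORIES = ["Coding", "Education", "Greeting", "Finance"]

def auto_tag(content: str, classify: Callable[[str], str]) -> List[str]:
    """Ask the LLM (via `classify`) for a category; degrade rather than crash.

    Any exception from the classifier (e.g. Ollama unreachable) or an
    out-of-vocabulary answer falls back to an 'untagged' marker, so the
    note is still saved and the error stays actionable.
    """
    try:
        tag = classify(content).strip()
    except Exception:
        return ["untagged"]  # model/Ollama unavailable: keep the note, skip tagging
    return [tag] if tag in CATEGORIES else ["untagged"]

# Usage: stub classifiers standing in for the real Ollama-backed one.
print(auto_tag("def add(a, b): return a + b", lambda _: "Coding"))  # ['Coding']
def unavailable(_):
    raise ConnectionError("ollama unreachable")
print(auto_tag("hello", unavailable))  # ['untagged']
```

Keeping the classifier behind a plain callable like this also makes the tag categories easy to customize in one place.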