
opennote

A notebook app built with Rust, Flutter, and an AI tech stack. Supports semantic search, MCP, webpage import, and more.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add opennote-org-opennote

How to use

OpenNote is a Rust-based backend service that powers a personal notebook with semantic search and multi-user workspaces. It exposes an MCP (Model Context Protocol) endpoint, so you can route queries through your preferred LLM client or agent while benefiting from OpenNote's embedding and vector-search pipeline. With MCP support, you can issue structured requests to manage notes, perform semantic searches, and coordinate multi-step actions over the streamable HTTP interface.

To begin, run the MCP-enabled server and point your MCP client at its base URL. You can then register the opennote MCP server in your client configuration and start issuing requests to store, retrieve, and search notes with your chosen model provider.
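As a concrete illustration of a "structured request" over the streamable HTTP transport: MCP clients exchange JSON-RPC 2.0 messages with the server. A tools/call request to a note-search tool might look like the sketch below (the tool name `search_notes` and its arguments are illustrative assumptions, not OpenNote's documented tool list):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_notes",
    "arguments": {
      "query": "meeting notes about the Qdrant migration",
      "limit": 5
    }
  }
}
```

Your MCP client builds and sends messages like this for you; you normally only see the tool names and results.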

How to install

Prerequisites:

  • Docker and Docker Compose installed on your machine
  • Optionally, the Rust toolchain if you want to build the backend from source or run local tooling

Installation steps:

  1. Clone the OpenNote repository to your machine.
  2. Use Docker Compose (recommended) to deploy the full stack (Frontend, Backend, Qdrant, and vLLMEmbedder) as described in the documentation. Example: docker compose up --build
  3. If you prefer a manual setup, ensure Docker is running, start a local Qdrant instance, and run the backend Rust service. A typical local backend configuration is provided in backend/config.prod.json; tailor its paths and endpoints as needed.
  4. After the services are up, the OpenNote application will be accessible at http://localhost:3000 (or the port configured in your setup). To configure the MCP server entry, use the provided MCP JSON snippet and register it in your MCP client.
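The stack deployed in step 2 can be pictured roughly as the following compose file. Service names, build contexts, images, and ports here are illustrative assumptions to show the shape of the deployment; consult the repository's actual compose.yaml for the real values:

```yaml
services:
  frontend:            # Flutter web build, served at http://localhost:3000
    build: ./frontend
    ports:
      - "3000:3000"
  backend:             # Rust service exposing the API and the MCP endpoint
    build: ./backend
    ports:
      - "8086:8086"
    depends_on:
      - qdrant
      - embedder
  qdrant:              # vector database backing semantic search
    image: qdrant/qdrant
    ports:
      - "6333:6333"
  embedder:            # vLLM-based embedding service
    image: vllm/vllm-openai
```

Keeping Qdrant and the embedder as separate services lets you swap either one (e.g., a CPU-only embedder image) without touching the backend.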

Notes:

  • The project uses a multi-stage Docker build to compile both frontend (Flutter) and backend (Rust).
  • You can customize the embedder backend by editing backend/config.prod.json or backend/config.docker.json if using Docker Compose.
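To make the second note concrete, an embedder section in backend/config.prod.json might look like the fragment below. Every key and value here is a hypothetical sketch of what such a config typically contains, not the file's actual schema; check the repository's shipped config files for the real field names:

```json
{
  "embedder": {
    "provider": "vllm",
    "base_url": "http://embedder:8000/v1",
    "model": "nomic-ai/nomic-embed-text-v1.5"
  },
  "qdrant": {
    "url": "http://qdrant:6333"
  }
}
```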

Additional notes

Tips and common issues:

  • Ensure Qdrant is running and accessible before starting the backend, as it serves as the vector database for semantic search.
  • If you’re using GPU acceleration with vLLM, make sure the NVIDIA Container Toolkit is installed and that you’ve configured compose.yaml accordingly. For CPU-only setups, switch the embedder image to a CPU-compatible one.
  • The MCP server block uses baseUrl pointing to the backend MCP endpoint (e.g., http://localhost:8086/mcp). Replace with your deployed host if needed and provide appropriate authorization headers in your MCP client configuration.
  • If you modify service names or hosts in docker-compose, ensure the baseUrl in your MCP config matches the actual deployed URL.
  • The project supports multiple embedding providers (OpenAI, Cohere, Nomic, vLLM, etc.). Configure backend/config.prod.json to select your embedding provider and model.
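Putting the MCP tips together, a client-side registration for a self-hosted deployment might look like this. This is a minimal sketch in the common mcpServers JSON shape; the exact schema and header handling depend on your MCP client, and the bearer token is a placeholder you must supply:

```json
{
  "mcpServers": {
    "opennote": {
      "type": "http",
      "url": "http://localhost:8086/mcp",
      "headers": {
        "Authorization": "Bearer <your-token>"
      }
    }
  }
}
```

Replace localhost with your deployed host if the backend runs elsewhere, and keep the URL in sync with any docker-compose changes.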
