mcp-lancedb
MCP server from kyryl-opens-ml/mcp-server-lancedb
claude mcp add --transport stdio kyryl-opens-ml-mcp-server-lancedb uvx mcp-lance-db
How to use
This MCP server provides a lightweight semantic memory layer backed by LanceDB. It exposes two MCP tools:
- add-memory: stores a piece of text content, along with its vector embedding, in a LanceDB collection named memories.
- search-memories: retrieves semantically similar memories for a query string; an optional limit parameter (default 5) controls how many results are returned.
The server maintains state and notifies connected clients of resource changes as memories are added or queried. Embeddings are generated by sentence-transformers with the BAAI/bge-small-en-v1.5 model, running on CPU by default, which makes the server suitable for local or lightweight deployments. The result is a persistent, queryable semantic memory store for LLM workflows, chat assistants, or other AI tools that need to recall prior context.
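Both tools are invoked through the standard MCP tools/call request. A sketch of the two payloads, assuming the argument names content, query, and limit implied by the description above (verify against the server's tool schema):

```json
{
  "method": "tools/call",
  "params": {
    "name": "add-memory",
    "arguments": { "content": "User prefers concise answers." }
  }
}
```

```json
{
  "method": "tools/call",
  "params": {
    "name": "search-memories",
    "arguments": { "query": "What answer style does the user like?", "limit": 3 }
  }
}
```

In practice your MCP client (Claude Desktop, the MCP Inspector, etc.) constructs these messages for you; the JSON is shown only to make the tool contract concrete.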
How to install
Prerequisites:
- Python 3.9+ (latest stable release recommended)
- Basic familiarity with running shell commands
Installation steps:
- Ensure Python is installed and accessible: python3 --version
- Install uv, which provides the uvx runner used to fetch and launch MCP servers. Example with pipx:
- python3 -m pip install --user pipx
- python3 -m pipx ensurepath
- pipx install uv
- Install the LanceDB-based MCP server package (mcp-lance-db) as a uv tool, or skip this step and let uvx fetch it on demand:
- uv tool install mcp-lance-db
- Install Python dependencies for LanceDB and embedding models (as needed by the server code):
- python3 -m pip install lancedb sentence-transformers
- (Optional) python3 -m pip install transformers
- Create or verify the configuration is accessible to the MCP runtime (see mcp_config below). You can customize the LanceDB path and collection as needed in your environment.
- Run the server (or follow your environment's packaging):
- uvx mcp-lance-db
Notes:
- If you installed uv by another method (e.g. the standalone installer), the same commands apply: uv tool install mcp-lance-db to install, uvx mcp-lance-db to run.
- Ensure the LanceDB database path ./lancedb exists or is writable by the process. The default collection name is memories.
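The mcp_config referenced above typically takes the following shape in Claude Desktop's claude_desktop_config.json (the "lancedb" key is an arbitrary label you choose; adapt the structure for other MCP clients):

```json
{
  "mcpServers": {
    "lancedb": {
      "command": "uvx",
      "args": ["mcp-lance-db"]
    }
  }
}
```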
Additional notes
Tips and reminders:
- The server uses LanceDB as a vector store; ensure the database directory (./lancedb) is accessible and has sufficient disk space for embeddings and memories.
- Embedding generation runs on the CPU and can become a bottleneck; for large datasets, consider a more capable CPU or a GPU-enabled environment if available.
- The Quickstart config demonstrates how Claude Desktop reads the LanceDB config; you can adapt the same structure to other clients by pointing them to the mcp-lance-db server.
- Debugging MCP servers over stdio can be challenging; for a better experience, use the MCP Inspector tool: npx @modelcontextprotocol/inspector uv --directory "$(pwd)" run mcp-lance-db
- If you change the embedding provider or model, ensure compatibility with your retrieval logic and any similarity threshold you set (default 0.7 for distance-based filtering).
- When deploying, consider adding environment variables to configure the database path, collection name, model, and threshold without changing code.
- If you encounter issues with installing or running, verify permissions, Python environment isolation (virtualenv/venv), and that the required Python packages are installed in the active environment.
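The environment-variable approach suggested above can be sketched as follows. The variable names here (LANCEDB_PATH, LANCEDB_COLLECTION, EMBEDDING_MODEL, SIMILARITY_THRESHOLD) are hypothetical, not ones the published server is known to read; the fallback values are the defaults described in this document:

```python
import os


def load_config(env=None):
    """Read server settings from environment variables, falling back to the
    defaults described above. Variable names are illustrative only --
    adapt them to whatever your deployment actually reads."""
    env = os.environ if env is None else env
    return {
        "db_path": env.get("LANCEDB_PATH", "./lancedb"),
        "collection": env.get("LANCEDB_COLLECTION", "memories"),
        "model": env.get("EMBEDDING_MODEL", "BAAI/bge-small-en-v1.5"),
        # Distance-based filtering threshold (default 0.7, per the notes above).
        "threshold": float(env.get("SIMILARITY_THRESHOLD", "0.7")),
    }
```

Reading every tunable through one function like this lets you change the database path, collection name, model, and threshold per deployment without touching code.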
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP