
txtai-assistant

Model Context Protocol (MCP) server implementation for semantic vector search and memory management using TxtAI. This server provides a robust API for storing, retrieving, and managing text-based memories with semantic search over a vector database. It works with MCP clients such as Claude and Cline.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio \
  --env HOST="0.0.0.0" \
  --env PORT="8000" \
  --env LOG_LEVEL="DEBUG" \
  --env CORS_ORIGINS="*" \
  --env MAX_MEMORIES="0" \
  rmtech1-txtai-assistant-mcp -- python server/main.py

How to use

The TxtAI Assistant MCP server exposes semantic memory storage and retrieval capabilities built on top of txtai. It provides a simple API for storing memories, performing semantic searches, filtering by tags, and inspecting the health and statistics of the memory store. Once the server is running, you can use MCP tools such as store_memory, retrieve_memory, search_by_tag, delete_memory, get_stats, and check_health to manage and query memories in a semantic memory space. This makes it suitable for AI assistants that need persistent context, fast semantic lookup, and organized memory tagging across conversations.
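As a rough sketch, a client could exercise these capabilities over the server's HTTP interface. The endpoint paths, payload field names, and port below are assumptions for illustration, not the server's documented API:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed default HOST/PORT


def build_store_payload(content, tags=None, metadata=None):
    """Build a store_memory-style request body (field names are assumptions)."""
    return {"content": content, "tags": tags or [], "metadata": metadata or {}}


def post(path, payload):
    """POST a JSON payload to the server (paths like /store are hypothetical)."""
    req = urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


if __name__ == "__main__":
    # Store a memory, then query it back semantically.
    post("/store", build_store_payload("User prefers dark mode", tags=["prefs"]))
    print(post("/search", {"query": "UI theme preference", "n_results": 3}))
```

The network calls are kept behind the `__main__` guard so the helpers can be reused or tested without a running server.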

To use it with Claude or Cline, configure their MCP settings to point at this server. After configuration, the available tools will be exposed to your assistant, enabling you to store new memories, query for relevant past content, filter by tags, delete memories by content hash, and monitor the health and statistics of the underlying data store. The server exposes endpoints for storing and querying memories via HTTP, and it maintains memory data in a file-based backend with optional persistent storage and logging for debugging and auditing.
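For Cline, the MCP settings entry might look like the following sketch; the server name, the exact settings file location, and the env values are assumptions, and `command`/`args` should match how you run server/main.py locally:

```json
{
  "mcpServers": {
    "txtai-assistant": {
      "command": "python",
      "args": ["server/main.py"],
      "env": {
        "HOST": "0.0.0.0",
        "PORT": "8000"
      }
    }
  }
}
```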

How to install

Prerequisites:

  • Python 3.8 or higher
  • pip (Python package installer)
  • virtualenv (recommended)

Install and run the server locally:

  1. Clone the repository:
     git clone https://github.com/yourusername/txtai-assistant-mcp.git
     cd txtai-assistant-mcp
  2. Create and activate a virtual environment:
     python3 -m venv venv
     source venv/bin/activate    # macOS/Linux
     venv\Scripts\activate       # Windows
  3. Install dependencies:
     pip install -r server/requirements.txt
  4. Configure environment variables from the provided template:
     cp .env.template .env
     # Edit .env to customize HOST, PORT, CORS_ORIGINS, LOG_LEVEL, MAX_MEMORIES
  5. Start the server using the provided startup script:
     bash scripts/start.sh

The script will set up the environment, install dependencies if needed, create required directories, and start the MCP server with the configured settings.

Additional notes

Environment variables can customize behavior without code changes. Key variables include HOST, PORT, CORS_ORIGINS, LOG_LEVEL, and MAX_MEMORIES. If you encounter port conflicts, change PORT in the .env file or in your MCP configuration. Ensure the data and logs directories are writable by the process. If using a production environment, consider securing the server behind a reverse proxy and enabling appropriate CORS restrictions. The server stores memories in JSON files under data/ by default; you can rotate logs or adjust the memory size constraints via MAX_MEMORIES.
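A minimal .env along these lines would reproduce the defaults used in the Installation command above; the values are assumptions to adjust per deployment, and MAX_MEMORIES="0" is assumed here to mean no memory-count limit:

```shell
# Assumed example .env — values mirror the Installation command above
HOST="0.0.0.0"        # interface to bind
PORT="8000"           # change if the port is already in use
LOG_LEVEL="DEBUG"     # lower to INFO/WARNING in production
CORS_ORIGINS="*"      # restrict to specific origins in production
MAX_MEMORIES="0"      # assumed: 0 = unlimited
```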
