txtai-assistant
Model Context Protocol (MCP) server implementation for semantic search and memory management using txtai. This server provides a robust API for storing, retrieving, and managing text-based memories, with semantic search backed by a vector database. It works with MCP clients such as Claude and Cline.
claude mcp add --transport stdio rmtech1-txtai-assistant-mcp python server/main.py \
  --env HOST="0.0.0.0" \
  --env PORT="8000" \
  --env LOG_LEVEL="DEBUG" \
  --env CORS_ORIGINS="*" \
  --env MAX_MEMORIES="0"
How to use
The TxtAI Assistant MCP server exposes semantic memory storage and retrieval capabilities built on top of txtai. It provides a simple API for storing memories, performing semantic searches, filtering by tags, and inspecting the health and statistics of the memory store. Once the server is running, you can use MCP tools such as store_memory, retrieve_memory, search_by_tag, delete_memory, get_stats, and check_health to manage and query memories in a shared semantic memory space. This makes it suitable for AI assistants that need persistent context, fast semantic lookup, and organized memory tagging across conversations.
To use it with Claude or Cline, configure their MCP settings to point at this server. After configuration, the available tools will be exposed to your assistant, enabling you to store new memories, query for relevant past content, filter by tags, delete memories by content hash, and monitor the health and statistics of the underlying data store. The server exposes endpoints for storing and querying memories via HTTP, and it maintains memory data in a file-based backend with optional persistent storage and logging for debugging and auditing.
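Beyond MCP tooling, the HTTP endpoints can be called directly. The sketch below shows what a minimal client might look like; the endpoint paths (`/store`, `/search`) and payload field names (`content`, `tags`, `metadata`, `query`, `limit`) are assumptions for illustration, not the server's documented API.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # HOST/PORT from your .env

def build_memory_payload(content, tags=None, metadata=None):
    """Build the JSON body for a store request (field names are assumed)."""
    return {
        "content": content,
        "tags": tags or [],
        "metadata": metadata or {},
    }

def post_json(path, payload):
    """POST a JSON payload to the server and return the decoded response."""
    req = urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example usage once the server is running (paths are hypothetical):
#   post_json("/store", build_memory_payload("Kickoff is Monday", tags=["planning"]))
#   post_json("/search", {"query": "when is the kickoff?", "limit": 5})
```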
How to install
Prerequisites:
- Python 3.8 or higher
- pip (Python package installer)
- virtualenv (recommended)
Install and run the server locally:
- Clone the repository
git clone https://github.com/yourusername/txtai-assistant-mcp.git
cd txtai-assistant-mcp
- Create and activate a virtual environment
python3 -m venv venv
# macOS/Linux
source venv/bin/activate
# Windows
venv\Scripts\activate
- Install dependencies
pip install -r server/requirements.txt
- Configure environment variables (example using the provided template)
cp .env.template .env
# Edit .env to customize HOST, PORT, CORS_ORIGINS, LOG_LEVEL, MAX_MEMORIES
- Start the server using the provided startup script
bash scripts/start.sh
The script will set up the environment, install dependencies if needed, create required directories, and start the MCP server with the configured settings.
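Inside the server, the environment variables above translate into runtime settings. A minimal sketch of how they might be read, assuming the defaults shown in the install commands (the exact defaults and parsing in server/main.py may differ):

```python
import os

def load_settings(env=None):
    """Read server settings from environment variables with assumed defaults."""
    env = os.environ if env is None else env
    return {
        "host": env.get("HOST", "0.0.0.0"),
        "port": int(env.get("PORT", "8000")),
        "log_level": env.get("LOG_LEVEL", "INFO"),
        # CORS_ORIGINS is a comma-separated list; "*" allows all origins
        "cors_origins": [o.strip() for o in env.get("CORS_ORIGINS", "*").split(",")],
        # 0 is assumed to mean "no limit on stored memories"
        "max_memories": int(env.get("MAX_MEMORIES", "0")),
    }

settings = load_settings({})
print(settings["port"])  # 8000 when PORT is unset
```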
Additional notes
Environment variables can customize behavior without code changes. Key variables include HOST, PORT, CORS_ORIGINS, LOG_LEVEL, and MAX_MEMORIES. If you encounter port conflicts, change PORT in the .env file or in your MCP configuration. Ensure the data and logs directories are writable by the process. If using a production environment, consider securing the server behind a reverse proxy and enabling appropriate CORS restrictions. The server stores memories in JSON files under data/ by default; you can rotate logs or adjust the memory size constraints via MAX_MEMORIES.
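Since memories live as JSON files under data/ and are deleted by content hash, the storage layer can be pictured roughly as follows. This is a sketch under assumptions (SHA-256 as the hash, one file per memory named after its hash); the actual file layout and hash function used by the server may differ.

```python
import hashlib
import json
from pathlib import Path

DATA_DIR = Path("data")

def content_hash(text):
    """Stable identifier for a memory; deletion is keyed on a hash like this."""
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def store_memory(text, tags=(), data_dir=DATA_DIR):
    """Write one memory as a JSON file named after its content hash."""
    data_dir.mkdir(parents=True, exist_ok=True)
    h = content_hash(text)
    (data_dir / f"{h}.json").write_text(
        json.dumps({"content": text, "tags": list(tags)}), encoding="utf-8"
    )
    return h

def delete_memory(h, data_dir=DATA_DIR):
    """Remove a memory by content hash; returns True if it existed."""
    path = data_dir / f"{h}.json"
    if path.exists():
        path.unlink()
        return True
    return False
```

A layout like this makes the MAX_MEMORIES cap easy to enforce (count files before writing) and keeps deletion idempotent.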