fegis
Define AI tools in YAML with natural-language schemas. Every tool invocation is automatically stored in a Qdrant vector database, enabling semantic search, filtering, and memory retrieval across sessions.
claude mcp add --transport stdio p-funk-fegis uv --directory /absolute/path/to/fegis run fegis \
  --env AGENT_ID="claude_desktop" \
  --env QDRANT_URL="http://localhost:6333" \
  --env ARCHETYPE_PATH="/absolute/path/to/fegis-wip/archetypes/default.yaml" \
  --env QDRANT_API_KEY="" \
  --env COLLECTION_NAME="fegis_memory" \
  --env EMBEDDING_MODEL="BAAI/bge-small-en"
How to use
Fegis is a productivity-focused MCP server that lets you author prompts as YAML-defined tools, store every tool execution with its full context in a vector database, and search across past invocations. Tools are described in archetypes and invoked via structured YAML or through the available UI/CLI integrations. When you run Fegis, it automatically saves tool inputs, outputs, and session context as embeddings in Qdrant, enabling semantic search and retrieval of prior analyses, ideas, or privacy considerations. The included SearchMemory capability lets you query what you previously analyzed or generated, making it easy to revisit and refine earlier work.
To use Fegis, make sure your Qdrant instance is running and configure the environment as shown in the configuration example. You then have access to the tools defined by your YAML archetypes (e.g., the BiasDetector and Introspection tools) as well as the memory-related capabilities. You can craft prompts in YAML to drive complex cognitive tasks, and rely on Fegis to persist the results for later retrieval via semantic search or direct memory lookups.
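Concretely, an archetype is a YAML file that declares tools whose fields are described in natural language. The sketch below is illustrative only: the key names (title, tools, parameters) are assumptions, not the actual Fegis schema, so consult the bundled archetypes/default.yaml for the real format.

```yaml
# Hypothetical archetype sketch -- key names are illustrative,
# not the real Fegis schema; see archetypes/default.yaml.
title: ExampleArchetype
tools:
  BiasDetector:
    description: Examine a passage of text for potential bias.
    parameters:
      text: The passage to analyze.
      focus: Optional aspect to concentrate on, such as tone or framing.
```

Each invocation of a tool defined this way is embedded and stored in Qdrant, so its inputs and outputs become searchable later.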
How to install
Prerequisites:
- uv package manager (and Node.js, if your MCP client setup requires it)
- Python 3.13+ (if using Python-based tooling)
- Docker (for Qdrant)
- Git
Installation steps:
- Install uv (see the Quick Start for your OS):
  - Windows: winget install --id=astral-sh.uv -e
  - macOS/Linux: curl -LsSf https://astral.sh/uv/install.sh | sh
- Clone the repository:
  - git clone https://github.com/p-funk/fegis.git
- Start Qdrant (vector database) in a container:
  - docker run -d --name qdrant -p 6333:6333 -p 6334:6334 qdrant/qdrant:latest
- Install Python dependencies (if applicable) and ensure Python 3.13+ is available
- Set up the configuration files:
  - Create or update claude_desktop_config.json (as shown in the README), or adjust your environment to point to the Fegis archetypes
- Run the MCP server using uv with the provided command (see the mcp_config example), or start it in your preferred runtime environment
Notes:
- Ensure that ARCHETYPE_PATH points to your YAML archetype definitions and that QDRANT_URL is reachable from the host running the MCP server.
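For Claude Desktop users, the same settings from the claude mcp add command above can be expressed in claude_desktop_config.json. The mcpServers shape is the standard Claude Desktop MCP format; the paths are placeholders you must replace with your own, and the server name "fegis" is just a label.

```
{
  "mcpServers": {
    "fegis": {
      "command": "uv",
      "args": ["--directory", "/absolute/path/to/fegis", "run", "fegis"],
      "env": {
        "AGENT_ID": "claude_desktop",
        "QDRANT_URL": "http://localhost:6333",
        "QDRANT_API_KEY": "",
        "COLLECTION_NAME": "fegis_memory",
        "EMBEDDING_MODEL": "BAAI/bge-small-en",
        "ARCHETYPE_PATH": "/absolute/path/to/fegis-wip/archetypes/default.yaml"
      }
    }
  }
}
```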
Additional notes
Tips and common considerations:
- Ensure Qdrant is up before starting the MCP server; the embedding model and collection name must match the configuration (the default collection is fegis_memory).
- The ARCHETYPE_PATH should point to valid archetype YAML files; invalid paths will cause tool registration to fail.
- If using claude_desktop, ensure the environment variables (QDRANT_URL, QDRANT_API_KEY, COLLECTION_NAME, EMBEDDING_MODEL, ARCHETYPE_PATH, AGENT_ID) are correctly set in claude_desktop_config.json.
- The embedding model (default BAAI/bge-small-en) can be swapped for different accuracy/speed trade-offs.
- The system stores a complete history of tool invocations; be mindful of potential sensitive data in tool inputs/outputs and adjust retention policies as needed.
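Several of the failure modes above (a missing environment variable, an ARCHETYPE_PATH that does not resolve to a file) can be caught before the server starts. The stdlib-only sketch below is a hypothetical preflight helper, not part of Fegis itself; the variable names are the ones used in the configuration example, with QDRANT_API_KEY deliberately excluded since it may legitimately be empty.

```python
import os
from pathlib import Path

# Environment variables the Fegis configuration example uses.
# QDRANT_API_KEY may legitimately be empty, so it is not required here.
REQUIRED_VARS = [
    "AGENT_ID",
    "QDRANT_URL",
    "COLLECTION_NAME",
    "EMBEDDING_MODEL",
    "ARCHETYPE_PATH",
]

def preflight(env: dict) -> list[str]:
    """Return a list of configuration problems; an empty list means ready."""
    problems = [f"missing {name}" for name in REQUIRED_VARS if not env.get(name)]
    archetype = env.get("ARCHETYPE_PATH")
    if archetype and not Path(archetype).is_file():
        problems.append(f"ARCHETYPE_PATH is not a file: {archetype}")
    return problems

# Example: run preflight(dict(os.environ)) before launching the server
# and surface any returned problems instead of a cryptic startup failure.
```

A check like this is cheap to run from a wrapper script and turns a silent tool-registration failure into an actionable message.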