
fegis

Define AI tools in YAML with natural language schemas. All tool usage is automatically stored in Qdrant vector database, enabling semantic search, filtering, and memory retrieval across sessions.

Installation
Run the following command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio p-funk-fegis \
  --env AGENT_ID="claude_desktop" \
  --env QDRANT_URL="http://localhost:6333" \
  --env ARCHETYPE_PATH="/absolute/path/to/fegis-wip/archetypes/default.yaml" \
  --env QDRANT_API_KEY="" \
  --env COLLECTION_NAME="fegis_memory" \
  --env EMBEDDING_MODEL="BAAI/bge-small-en" \
  -- uv --directory /absolute/path/to/fegis run fegis

How to use

Fegis is a productivity-focused MCP server that lets you author prompts as YAML-defined tools, store all tool executions with full context in a vector database, and search across past tool usages. Tools are described in archetypes and invoked via structured YAML or through the available UI/CLI integrations. When you run Fegis, it automatically saves tool inputs, outputs, and session context as embeddings in Qdrant, enabling semantic search and retrieval of prior analyses, ideas, or privacy considerations. The included SearchMemory capability lets you query what you previously analyzed or generated, making it easy to revisit and refine your work.
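Archetype files are plain YAML. The exact schema is defined by Fegis itself (see the archetypes shipped with the repository, e.g. default.yaml); the sketch below is illustrative only, and field names such as tools, description, and parameters are assumptions, not Fegis's documented format:

```yaml
# Hypothetical archetype sketch -- field names are illustrative,
# not Fegis's documented schema. Consult the repository's
# archetypes/default.yaml for the real format.
tools:
  BiasDetector:
    description: >
      Examine a passage of text and surface possible cognitive or
      statistical biases, with a short rationale for each finding.
    parameters:
      passage: The text to analyze, described in natural language.
      focus: Optional hint about which kinds of bias to look for.
```

The key idea the project advertises is that parameter schemas are written as natural-language descriptions rather than formal JSON Schema.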

To use Fegis, ensure your Qdrant instance is running and configure the environment as shown in the configuration example. You’ll have access to tools defined by the YAML archetypes (e.g., BiasDetector and Introspection tools) as well as memory-related capabilities. You can craft prompts in YAML to drive complex cognitive tasks, and rely on Fegis to persist results for future retrieval via semantic search or direct memory lookups.

How to install

Prerequisites:

  • Node.js or uv package manager (as needed for your setup)
  • Python 3.13+ (if using Python-based tooling)
  • Docker (for Qdrant)
  • Git

Installation steps:

  1. Install uv (as shown in the Quick Start, depending on your OS)
  2. Clone the repository
  3. Start Qdrant (vector database) in a container
    • docker run -d --name qdrant -p 6333:6333 -p 6334:6334 qdrant/qdrant:latest
  4. Install Python dependencies (if applicable) and ensure Python 3.13+ is available
  5. Set up configuration files
    • Create or update claude_desktop_config.json (as shown in the README) or adjust your environment to point to the Fegis archetypes
  6. Run the MCP server using uv with the provided command (see mcp_config example) or start your preferred runtime environment
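The claude_desktop_config.json entry in step 5 follows the standard Claude Desktop MCP layout (mcpServers → server name → command/args/env). A minimal sketch that generates such an entry from the environment variables used above; the two paths are placeholders you must replace with your own:

```python
import json

# Placeholder paths -- replace with your actual checkout and archetype file.
FEGIS_DIR = "/absolute/path/to/fegis"
ARCHETYPES = "/absolute/path/to/fegis-wip/archetypes/default.yaml"

# Standard Claude Desktop MCP server entry: the app launches `command`
# with `args`, passing `env` to the spawned process.
config = {
    "mcpServers": {
        "fegis": {
            "command": "uv",
            "args": ["--directory", FEGIS_DIR, "run", "fegis"],
            "env": {
                "AGENT_ID": "claude_desktop",
                "QDRANT_URL": "http://localhost:6333",
                "QDRANT_API_KEY": "",
                "COLLECTION_NAME": "fegis_memory",
                "EMBEDDING_MODEL": "BAAI/bge-small-en",
                "ARCHETYPE_PATH": ARCHETYPES,
            },
        }
    }
}

print(json.dumps(config, indent=2))
```

Paste the printed object into claude_desktop_config.json (merging with any mcpServers entries you already have).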

Notes:

  • Ensure that ARCHETYPE_PATH points to your YAML archetype definitions and that QDRANT_URL is reachable from the host running the MCP server.
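Both of these conditions can be checked before launching with a small stdlib-only preflight. The function below is a local convenience, not part of Fegis; it only verifies that the archetype file exists and that something answers HTTP at the Qdrant URL:

```python
import os
import urllib.request
import urllib.error

def preflight(archetype_path: str, qdrant_url: str) -> list[str]:
    """Return a list of problems; an empty list means both checks passed."""
    problems = []
    if not os.path.isfile(archetype_path):
        problems.append(f"archetype file not found: {archetype_path}")
    try:
        # Qdrant serves REST on its configured port (6333 by default);
        # any HTTP response at all means the server is reachable.
        urllib.request.urlopen(qdrant_url, timeout=3)
    except urllib.error.HTTPError:
        pass  # got an HTTP response, so the server is up
    except (urllib.error.URLError, OSError):
        problems.append(f"Qdrant not reachable at {qdrant_url}")
    return problems
```

For example, preflight("/missing.yaml", "http://localhost:6333") reports the missing archetype file, plus a reachability problem if Qdrant is not running.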

Additional notes

Tips and common considerations:

  • Ensure Qdrant is up before starting the MCP server; the embedding model and collection name must match the configuration (the default collection is fegis_memory).
  • The ARCHETYPE_PATH should point to valid archetype YAML files; invalid paths will cause tool registration to fail.
  • If using claude_desktop, ensure the environment variables (QDRANT_URL, QDRANT_API_KEY, COLLECTION_NAME, EMBEDDING_MODEL, ARCHETYPE_PATH, AGENT_ID) are correctly set in claude_desktop_config.json.
  • The embedding model (default BAAI/bge-small-en) can be swapped for different accuracy/speed trade-offs.
  • The system stores a complete history of tool invocations; be mindful of potential sensitive data in tool inputs/outputs and adjust retention policies as needed.
