
MakerAi

The AI operating system for Delphi: a 100% native framework with RAG 2.0 for knowledge retrieval, autonomous agents with semantic memory, visual workflow orchestration, and a universal LLM connector. Supports OpenAI, Claude, Gemini, Ollama, and more. Enterprise-grade AI for Delphi 10.3+.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio \
  --env MAKERAI_API_KEY="your-api-key-for-MakerAI-providers" \
  --env MAKERAI_LOG_LEVEL="info" \
  gustavoeenriquez-makerai -- python -m makerai.server

How to use

The MakerAI MCP server exposes tools from the MakerAI Delphi-based AI ecosystem over the Model Context Protocol. It wraps the model and tool capabilities provided by MakerAI, such as RAG pipelines, autonomous agents, and the various ChatTools, so that MCP clients can invoke them.

Once the server is running, you can deploy and register tools (functions, commands, or services) that clients invoke via Model Context Protocol messages. The server acts as the integration point between client requests and the underlying MakerAI capabilities, enabling orchestrated tool usage, graph-based reasoning, and human-in-the-loop workflows within Delphi-based deployments. Through the MCP interface you can query provider-specific components, universal connectors, and the RAG/document-management pipelines, and compose multi-step interactions across tools and models.
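As a rough illustration of what "invoke via Model Context Protocol messages" means on the wire: MCP clients talk to a stdio server with JSON-RPC 2.0 messages. The method names below (`tools/list`, `tools/call`) come from the MCP specification, while the tool name `rag_query` and its arguments are hypothetical placeholders, not confirmed MakerAI tool names:

```python
import json

def make_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request as used by the Model Context Protocol."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# Ask the server which tools it exposes.
list_req = make_request(1, "tools/list")

# Invoke a (hypothetical) RAG query tool with arguments.
call_req = make_request(2, "tools/call", {
    "name": "rag_query",  # hypothetical tool name for illustration
    "arguments": {"query": "What is MakerAI?", "top_k": 3},
})

# Over stdio, each message is serialized as a line of JSON.
print(json.dumps(list_req))
print(json.dumps(call_req))
```

A real client would write these lines to the server's stdin and read responses from its stdout; the sketch only shows the message shapes.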

How to install

Prerequisites:
- A supported operating system (Windows, macOS, or Linux) with a compatible runtime for the chosen execution method (Python in this configuration).
- Python 3.8+ installed and available on your PATH.
- Basic networking setup to expose the MCP server endpoints if you intend to reach them from clients.

Step-by-step:
1) Clone the MakerAI repository containing the MCP server components.
2) Create a Python virtual environment and install the required dependencies. Example:
   python -m venv venv
   source venv/bin/activate   # Unix/macOS
   venv\Scripts\activate      # Windows
   pip install -r requirements.txt
3) Configure environment variables as needed (see the mcp_config env section).
4) Run the MCP server using the command configured in mcp_config. Example: python -m makerai.server --config config/mcp.yaml
5) Verify the server is listening on the expected endpoint and can be reached by MCP clients.
6) If you plan to run with Docker, adapt the run command to docker run ... with appropriate port mappings and volumes.
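Before step 2, it can help to confirm the Python prerequisite. A minimal sketch (the 3.8 floor comes from the prerequisites above; the function name is ours, not part of MakerAI):

```python
import sys

def check_python(min_version=(3, 8)):
    """Return True if the running interpreter meets the minimum version."""
    return sys.version_info[:2] >= min_version

if not check_python():
    raise SystemExit(
        "Python 3.8+ required, found "
        f"{sys.version_info.major}.{sys.version_info.minor}"
    )
print("Python version OK")
```

Running this inside the virtual environment verifies that `venv` picked up a suitable interpreter before `pip install` proceeds.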

Additional notes

Tips and considerations:
- Ensure your MakerAI API keys and any provider credentials are securely managed (prefer environment variables over hard-coding).
- If you enable verbose logging, monitor for connection retries and rate limits from external providers.
- When using TAiCapabilities and unified thinking levels, validate your ModelCaps and SessionCaps to avoid unexpected bridging gaps.
- If you modify or extend the tools exposed by the MCP server, keep the JSON-based tool catalog in sync with the runtime registration to avoid runtime discovery issues.
- For production, consider deploying behind a reverse proxy with TLS and setting appropriate authentication for MCP endpoints.
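The first tip, preferring environment variables over hard-coded credentials, can be sketched as follows. The variable names match the install command earlier; the fail-fast behavior and function name are our assumptions, not MakerAI API:

```python
import os

def load_config():
    """Read MakerAI settings from the environment instead of hard-coding them."""
    api_key = os.environ.get("MAKERAI_API_KEY")
    if not api_key:
        # Fail fast rather than silently running without credentials.
        raise RuntimeError("MAKERAI_API_KEY is not set")
    log_level = os.environ.get("MAKERAI_LOG_LEVEL", "info")
    return {"api_key": api_key, "log_level": log_level}
```

Keeping secrets out of source files means they never land in version control, and a deployment can rotate keys without a code change.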
