MakerAi
The AI Operating System for Delphi. 100% native framework with RAG 2.0 for knowledge retrieval, autonomous agents with semantic memory, visual workflow orchestration, and universal LLM connector. Supports OpenAI, Claude, Gemini, Ollama, and more. Enterprise-grade AI for Delphi 10.3+
claude mcp add --transport stdio gustavoeenriquez-makerai python -m makerai.server \
  --env MAKERAI_API_KEY="your-makerai-api-key" \
  --env MAKERAI_LOG_LEVEL="info"
How to use
MakerAI MCP server exposes or consumes tools within the MakerAI Delphi-based AI ecosystem. It is designed to interoperate with the MCP framework by wrapping and/or connecting to model and tool capabilities provided by MakerAI, such as RAG pipelines, autonomous agents, and various ChatTools. Once the MCP server is running, you can deploy and register tools (functions, commands, or services) that clients can invoke via Model Context Protocol messages. The server serves as an integration point between client requests and the underlying capabilities exposed by MakerAI, enabling orchestrated tool usage, graph-based reasoning, and human-in-the-loop workflows within Delphi-based deployments. You can query provider-specific components, universal connectors, and the RAG/Document management pipelines through the MCP interface, and compose multi-step interactions across tools and models.
How to install
Prerequisites:
- A supported operating system (Windows, macOS, or Linux) with a compatible runtime for the chosen execution method (Python in this configuration).
- Python 3.8+ installed and available on your PATH.
- Basic networking setup to expose the MCP server endpoints if you intend to reach them from clients.

Step-by-step:
1) Clone the MakerAI repository containing the MCP server components.
2) Create a Python virtual environment and install the required dependencies. Example:
   python -m venv venv
   source venv/bin/activate    # Unix/macOS
   venv\Scripts\activate       # Windows
   pip install -r requirements.txt
3) Configure environment variables as needed (see the mcp_config env section).
4) Run the MCP server using the command configured in mcp_config. Example: python -m makerai.server --config config/mcp.yaml
5) Verify the server is listening on the expected endpoint and can be reached by MCP clients.
6) If you plan to run with Docker, adapt the run command to docker run ... with appropriate port mappings and volumes.
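For step 5, a quick way to check the wire format by hand is to frame and parse messages the way the stdio transport does (one JSON-RPC message per line). The sketch below builds the `initialize` request that opens every MCP session; the `protocolVersion` string is an assumption about which spec revision the server implements:

```python
import json

def frame(msg):
    """Serialize an MCP message for the newline-delimited stdio transport."""
    return (json.dumps(msg) + "\n").encode("utf-8")

def unframe(line):
    """Parse one newline-delimited JSON message coming back from the server."""
    return json.loads(line.decode("utf-8"))

# The first message a client must send is the `initialize` handshake.
init_req = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # assumed MCP spec revision
        "capabilities": {},
        "clientInfo": {"name": "smoke-test", "version": "0.1"},
    },
}

wire = frame(init_req)
assert unframe(wire)["method"] == "initialize"
```

Writing `wire` to the server process's stdin and reading one line back should yield an `initialize` result; if nothing comes back, re-check the command and environment from the configuration above.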
Additional notes
Tips and considerations:
- Ensure your MakerAI API keys and any provider credentials are securely managed (prefer environment variables over hard-coding).
- If you enable verbose logging, monitor for connection retries and rate limits from external providers.
- When using TAiCapabilities and unified thinking levels, validate your ModelCaps and SessionCaps to avoid unexpected bridging gaps.
- If you modify or extend the tools exposed by the MCP server, keep the JSON-based tool catalog in sync with the runtime registration to avoid runtime discovery issues.
- For production, consider deploying behind a reverse proxy with TLS and setting appropriate authentication for the MCP endpoints.
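Following the first tip above, a minimal sketch of loading credentials from the environment instead of hard-coding them. The variable names match the `--env` flags in the install command; the fail-fast behavior is a suggested pattern, not something MakerAI mandates:

```python
import os

def load_makerai_config():
    """Read MakerAI credentials and settings from environment variables."""
    api_key = os.environ.get("MAKERAI_API_KEY")
    if not api_key:
        # Fail fast: a missing key should stop startup here rather than
        # surface later as an opaque provider authentication error.
        raise RuntimeError("MAKERAI_API_KEY is not set")
    return {
        "api_key": api_key,
        "log_level": os.environ.get("MAKERAI_LOG_LEVEL", "info"),
    }
```

With this pattern, the key never appears in source control, and rotating it is a deployment-environment change rather than a code change.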
Related MCP Servers
UnrealGenAISupport
An Unreal Engine plugin for LLM/GenAI models & MCP UE5 server. Includes OpenAI's GPT 5.1, Deepseek V3.1, Claude Sonnet 4.5 APIs, Gemini 3, Alibaba Qwen, Kimi and Grok 4.1, with plans to add Gemini, audio tts, elevenlabs, OpenRouter, Groq, Dashscope & realtime APIs soon. UnrealMCP is also here!! Automatic scene generation from AI!!
c4-genai-suite
c4 GenAI Suite
forge-orchestrator
Forge Orchestrator: Multi-AI task coordination. File locking, knowledge capture, drift detection. Rust.
ncp
Natural Context Provider (NCP). Your MCPs, supercharged. Find any tool instantly, load on demand, run on schedule, ready for any client. Smart loading saves tokens and energy.
observee
Observee (YC S25) lets you build AI agents with 1000+ integrations using MCPs with managed OAuth, Security and Observability
adal-cli
The self-evolving coding agent that learns from your entire team and codebase. Less syncing. Less waiting. Deliver at the speed of thought.