automagik-tools
From API to AI in 30 Seconds - Transform any API into an intelligent MCP agent that learns, adapts, and speaks human
claude mcp add --transport stdio namastexlabs-automagik-tools uvx automagik-tools
How to use
Automagik Tools is a Model Context Protocol (MCP) toolkit that turns any API into a self-learning AI agent. With it, you can generate an MCP agent from an OpenAPI spec in seconds and have it understand and execute natural-language requests against the API. The README demonstrates using the uvx CLI to point at an API's OpenAPI spec and start a self-learning agent, for example: uvx automagik-tools openapi https://api.stripe.com/openapi.json. Tools like Genie let you orchestrate multiple MCP servers with persistent memory, while the Hub provides a multi-tenant HTTP server that manages tools per user. The project supports multiple transports (stdio, SSE, HTTP) and aims for zero-code integration, enabling natural-language interaction with complex APIs without writing any integration logic.
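The claude mcp add one-liner at the top registers the server with Claude Code over stdio. For MCP clients configured through a JSON file, an equivalent entry might look like the sketch below (the mcpServers key follows the common Claude Desktop-style config schema; that schema is an assumption here, not something shown in this README):

```json
{
  "mcpServers": {
    "automagik-tools": {
      "command": "uvx",
      "args": ["automagik-tools"]
    }
  }
}
```

Any MCP client that can launch a stdio subprocess should accept an equivalent command/args pair.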
How to install
Prerequisites:
- Python and uvx installed (the project uses the uvx workflow shown in the README).
- Internet access to fetch the automagik-tools package via uvx.
Installation steps:
- Install uv if you haven't already (the uvx command ships with the uv package): python -m pip install uv
- Run Automagik Tools via uvx (uvx fetches the package automatically on first use): uvx automagik-tools
- Start using the tool by pointing it at an OpenAPI spec: uvx automagik-tools openapi https://api.stripe.com/openapi.json
- (Optional) Run the Hub or Genie components as described in the README to enable multi-tenant tools and orchestration:
  uvx automagik-tools hub --port 8000
  uvx automagik-tools tool genie -t sse --port 8000
Prerequisites recap:
- Python environment
- uvx command available
- Access to the target OpenAPI document you want to integrate
Additional notes
Tips and considerations:
- The Automagik Tools suite emphasizes zero-code integration by turning OpenAPI specs into conversational agents. Use the openapi flow to quickly onboard new APIs.
- When connecting multiple MCP servers with Genie or the Hub, you can enable persistent memory across agents and coordinate workflows via SSE or HTTP transports.
- Environment variables may be needed for certain integrations (for example, API tokens or credentials in the env section of a tool in Genie’s config). See the README examples for how to set env values per tool, e.g., env: {"GITHUB_TOKEN": "your-token"}.
- If you encounter deployment issues, ensure your network can reach the OpenAPI URL and that the uvx tooling is up-to-date with the repository version you’re using.
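To make the env-per-tool tip concrete, here is a minimal sketch of what a Genie tool entry carrying a credential could look like (only the env mapping with GITHUB_TOKEN comes from the README example; the surrounding structure and the github tool name are illustrative assumptions, so check the README for the actual config schema):

```json
{
  "tools": {
    "github": {
      "env": {
        "GITHUB_TOKEN": "your-token"
      }
    }
  }
}
```

Keep real tokens out of version-controlled config files; load them from the environment where possible.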
Related MCP Servers
awesome-agent-skills
A curated list of skills, tools, tutorials, and capabilities for AI coding agents (Claude, Codex, Antigravity, Copilot, VS Code)
bytechef
Open-source, AI-native, low-code platform for API orchestration, workflow automation, and AI agent integration across internal systems and SaaS products.
claude-debugs-for-you
Enable any LLM (e.g. Claude) to interactively debug any language for you via MCP and a VS Code Extension
claude-emporium
🏛 [UNDER CONSTRUCTION] A (roman) claude plugin marketplace
lc2mcp
Convert LangChain tools to FastMCP tools
context-engineering
🧠 Stop building AI that forgets. Master MCP (Model Context Protocol) with production-ready semantic memory, hybrid RAG, and the WARNERCO Schematica teaching app. FastMCP + LangGraph + Vector/Graph stores. Your AI assistant's long-term memory starts here.