End-to-End-Agentic-Ai-Automation-Lab
This repository contains hands-on projects, code examples, and deployment workflows. Explore multi-agent systems, LangChain, LangGraph, AutoGen, CrewAI, RAG, MCP, automation with n8n, and scalable agent deployment using Docker, AWS, and BentoML.
claude mcp add --transport stdio mdalamin5-end-to-end-agentic-ai-automation-lab -- \
  docker run -i \
    --env LOG_LEVEL="INFO" \
    --env MCP_CONFIG_PATH="path/to/mcp-config.yaml" \
    --env GRAPHQL_ENDPOINT="https://your-graphql-endpoint" \
    mdalamin5/end-to-end-agentic-ai-automation-lab
How to use
This MCP server embodies the End-to-End Agentic AI Automation Lab as a standardized MCP integration point for orchestrating and connecting diverse agentic AI components. It exposes tooling and workflows to manage multi-agent collaboration, RAG pipelines, and automated AI workflows using MCP-compatible tools. Once running, you can explore the MCP-enabled endpoints to query tool capabilities, register new tools, and observe standardized data formats for tool invocation and data exchange. Depending on your deployment, you may interact with it via the containerized MCP server, or through any MCP-aware client that supports the protocol.
Key capabilities typically include: registering and discovering tools, executing agent-driven tool calls with standardized input/output, and streaming telemetry or debugging information for monitoring agent behavior. The server is designed to help you integrate agentic frameworks (such as LangChain, LangFlow, and related components) with RAG systems and workflow automation pipelines (e.g., n8n) under a unified MCP specification.
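To make the "standardized input/output" concrete, here is a minimal sketch of the tool-invocation message an MCP client sends, following the protocol's JSON-RPC 2.0 framing. The tool name "rag_query" and its arguments are hypothetical examples, not tools guaranteed to exist in this repository.

```python
import json

def make_tool_call(request_id, tool_name, arguments):
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical invocation of a RAG-pipeline tool exposed by the server.
request = make_tool_call(1, "rag_query", {"question": "What is MCP?", "top_k": 3})
print(json.dumps(request, indent=2))
```

Every MCP-aware client and server exchanges messages of this shape, which is what lets frameworks like LangChain or n8n interoperate without bespoke adapters.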
How to install
Prerequisites:
- Docker or a compatible container runtime
- Git installed
- Access to the repository (clone URLs)
- Clone the repository:
  git clone https://github.com/MDalamin5/End-to-End-Agentic-Ai-Automation-Lab.git
  cd End-to-End-Agentic-Ai-Automation-Lab
- Review the MCP configuration:
  - Inspect configs or docker-compose files if provided in the configs/ or deployment/ folders.
  - Ensure environment variables (e.g., MCP_CONFIG_PATH, GRAPHQL_ENDPOINT) are set for your environment.
- Run via Docker:
  - Build or pull the Docker image used for the MCP server (as described in the repository or on Docker Hub):
    docker pull mdalamin5/end-to-end-agentic-ai-automation-lab
  - Start the container (example):
    docker run -i \
      --env MCP_CONFIG_PATH=/path/to/config.yaml \
      --env GRAPHQL_ENDPOINT=https://your-graphql-endpoint \
      --env LOG_LEVEL=INFO \
      mdalamin5/end-to-end-agentic-ai-automation-lab
- Verify the server:
  - Check the container logs for startup messages indicating the MCP server is ready.
  - Use an MCP client to enumerate the available tools or invoke a tool per the MCP specification.
Note: If you prefer local development, install Python and run any server entrypoint described in the repo (e.g., a module under a tools/ or src/ directory) following the repository's instructions.
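Enumerating tools during verification comes back as a JSON-RPC response from the server's tools/list method. The sketch below parses such a response, assuming the response shape defined by the MCP specification; the two tool entries shown are illustrative, not taken from this repository.

```python
import json

# A hypothetical tools/list response, shaped per the MCP specification.
response = json.loads("""
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {"name": "rag_query", "description": "Query the RAG pipeline"},
      {"name": "run_workflow", "description": "Trigger an n8n workflow"}
    ]
  }
}
""")

# Collect the names of every tool the server advertises.
tool_names = [tool["name"] for tool in response["result"]["tools"]]
print(tool_names)  # → ['rag_query', 'run_workflow']
```

If this list comes back empty or the request errors out, check the configuration and environment variables described above before debugging individual tools.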
Additional notes
Tips and common issues:
- Ensure your environment variables (MCP_CONFIG_PATH, GRAPHQL_ENDPOINT) are correctly set; missing values can prevent tool registration or proper orchestration.
- If using Docker, ensure the image tag matches the latest release or your built image tag.
- When integrating with n8n or LangFlow, confirm network access between the MCP server container and the workflow orchestrator.
- For debugging, enable verbose logging (LOG_LEVEL=DEBUG) and consult logs for tool invocation traces, errors in data schemas, or authentication issues.
- The MCP integration is designed to be compatible with multiple tool ecosystems; verify your tool definitions follow the MCP input/output contracts to avoid schema mismatches.
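One way to catch schema mismatches early is a sanity check on tool definitions before registration. The sketch below assumes the MCP tool shape (name, description, and a JSON Schema inputSchema); the example tool and its schema are illustrative, not from this repository.

```python
def validate_tool_definition(tool):
    """Return a list of contract problems found in an MCP tool definition."""
    problems = []
    # MCP tool definitions carry a name, a description, and an input schema.
    for key in ("name", "description", "inputSchema"):
        if key not in tool:
            problems.append(f"missing required field: {key}")
    # The input schema is a JSON Schema describing an object of arguments.
    schema = tool.get("inputSchema", {})
    if schema and schema.get("type") != "object":
        problems.append("inputSchema.type should be 'object'")
    return problems

# A hypothetical tool definition for a RAG query tool.
tool = {
    "name": "rag_query",
    "description": "Query the RAG pipeline",
    "inputSchema": {
        "type": "object",
        "properties": {"question": {"type": "string"}},
        "required": ["question"],
    },
}
print(validate_tool_definition(tool))  # → []
```

Running a check like this at registration time turns silent orchestration failures into explicit error messages in the logs.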
Related MCP Servers
generative-ai
Comprehensive resources on Generative AI, including a detailed roadmap, projects, use cases, interview preparation, and coding preparation.
mcp-memory-service
Open-source persistent memory for AI agent pipelines (LangGraph, CrewAI, AutoGen) and Claude. REST API + knowledge graph + autonomous consolidation.
agentscope-runtime
A production-ready runtime framework for agent apps with secure tool sandboxing, Agent-as-a-Service APIs, scalable deployment, full-stack observability, and broad framework compatibility.
CoexistAI
CoexistAI is a modular, developer-friendly research assistant framework. It lets you build, search, summarize, and automate research workflows using LLMs, web search, Reddit, YouTube, and mapping tools, all through simple MCP tool calls, API calls, or Python functions.
wavefront
🔥🔥🔥 Enterprise AI middleware; an alternative to unifyapps, n8n, and lyzr.
langgraph-ai
LangGraph AI Repository