Awesome-AI-Engineering
The Full-Stack LLM Engineering Playbook. Architectural patterns for Agents (MCP) & RAG, coupled with advanced Post-Training recipes (SFT, DPO, QLoRA) for domain adaptation. Covers Data Pipelines, Evaluation Frameworks, and System Design.
Register the MCP server with Claude over stdio, running it via Docker:

claude mcp add --transport stdio eric-llms-awesome-ai-engineering -- docker run -i eric-llms/awesome-mcp-servers:eric-llms-awesome-ai-engineering

Supported environment variables (pass each with --env KEY=VALUE on the docker run command):
- MCP_PORT: port on which the MCP server listens inside the container (default often 8000)
- MCP_LOG_LEVEL: logging level (e.g., info, debug, warn)
- MCP_CONFIG_PATH: path to the MCP configuration inside the container, if overridden
How to use
Awesome AI Engineering provides an MCP (Model Context Protocol) server that exposes a curated set of MCP projects and tooling focused on practical, production-ready AI agent workflows. It acts as a centralized host for MCP-based agents, memory modules, and context management utilities, enabling you to explore, run, and test agent workflows from a single interface. The server aggregates examples and hands-on labs related to production-grade LLMs, agent orchestration, memory management, and MCP best practices, making it suitable for teams looking to standardize their MCP implementations and experiment with reference architectures.
To use, start the server (via Docker in this repository’s recommended deployment) and access the MCP catalog and project endpoints exposed by the container. You can browse the included MCP projects, run sample agents, and inspect model contexts, tool invocations, and memory modules. Depending on the build, you may find tools for task-oriented agents, memory persistence, data analysis pipelines, and demonstration runs that illustrate how MCP can orchestrate multi-step reasoning and action in production-like scenarios.
How to install
Prerequisites:
- Docker installed on your workstation or deployment environment
- Git (optional, for cloning source repositories)
Installation steps:
- Ensure Docker is running on your machine.
- Pull the MCP server image (or build it if you maintain a custom image):
  docker pull eric-llms/awesome-mcp-servers:eric-llms-awesome-ai-engineering
- Run the MCP server container:
  docker run -i -p 8000:8000 eric-llms/awesome-mcp-servers:eric-llms-awesome-ai-engineering
  Note: If you override the port with MCP_PORT, adjust the -p mapping accordingly, or rely on the container defaults.
- Verify the server is up by hitting the exposed endpoint (e.g., http://localhost:8000 or the port you mapped).
- Configure clients or tooling to connect to the MCP server using the provided endpoints and the MCP configuration object.
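The steps above can be sketched end to end. This is a minimal sketch, not a documented invocation: the health-check path (`/`) is an assumption, since the image's exact HTTP endpoints are not listed here.

```shell
# Pull the image at the tag used in this repository
docker pull eric-llms/awesome-mcp-servers:eric-llms-awesome-ai-engineering

# Run the server, mapping host port 8000 to the container's default port
docker run -i -p 8000:8000 \
  --env MCP_LOG_LEVEL=info \
  eric-llms/awesome-mcp-servers:eric-llms-awesome-ai-engineering

# In another terminal: check that something responds on the mapped port
curl -i http://localhost:8000/
```

If the curl check fails, confirm the port mapping matches MCP_PORT and inspect the container logs with docker logs.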
If you prefer to customize, clone the repository and adjust the Dockerfile or container entrypoint to point at your local MCP configuration or additional tooling.
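As a minimal customization sketch, assuming the published image honors MCP_CONFIG_PATH as described above, a derived Dockerfile could bake a local configuration into the image (the file names and paths here are hypothetical):

```dockerfile
FROM eric-llms/awesome-mcp-servers:eric-llms-awesome-ai-engineering

# Copy a local MCP configuration into the image (hypothetical path)
COPY mcp-config.json /etc/mcp/config.json
ENV MCP_CONFIG_PATH=/etc/mcp/config.json
```

Build and run it as you would the upstream image, e.g. docker build -t my-mcp . followed by docker run -i my-mcp.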
Additional notes
Environment variables can customize behavior (logging, ports, and config paths).
- If you encounter port binding issues, confirm that the host port is free and that the container exposes the correct port.
- Some MCP projects may require additional storage mounts or credentials for external tools (e.g., data lakes or email/file processing utilities).
- Check the container logs for hints about missing dependencies or misconfigurations.
- For reproducible results, pin the image to a specific tag and document the MCP versions used for each project in your deployment notes.
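Putting these notes together, a run that overrides the defaults might look like the following sketch; the volume mount path is illustrative, not a documented requirement of the image:

```shell
docker run -i \
  -p 9000:9000 \
  --env MCP_PORT=9000 \
  --env MCP_LOG_LEVEL=debug \
  -v "$(pwd)/data:/data" \
  eric-llms/awesome-mcp-servers:eric-llms-awesome-ai-engineering
```

Note that when MCP_PORT is overridden, the -p host:container mapping must reference the same container port.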
Related MCP Servers
mindsdb
Query Engine for AI Analytics: Build self-reasoning agents across all your live data
ai-engineering-hub
In-depth tutorials on LLMs, RAGs and real-world AI agent applications.
gateway
A blazing fast AI Gateway with integrated guardrails. Route to 200+ LLMs, 50+ AI Guardrails with 1 fast & friendly API.
mcp-agent
Build effective agents using Model Context Protocol and simple workflow patterns
Kiln
Build, Evaluate, and Optimize AI Systems. Includes evals, RAG, agents, fine-tuning, synthetic data generation, dataset management, MCP, and more.
learn-ai-engineering
Learn AI and LLMs from scratch using free resources