# agi-mcp-agent
A modular AGI agent framework built around a Master Control Program (MCP), inspired by Manus, with ChatGPT-style LLM integration and task control.
```shell
claude mcp add --transport stdio ot2net-agi-mcp-agent \
  python -m uvicorn agi_mcp_agent.api.server:app --host 0.0.0.0 --port 8000 --reload \
  --env REDIS_URL="redis://localhost:6379" \
  --env SECRET_KEY="change-me" \
  --env ENVIRONMENT="development" \
  --env DATABASE_URL="sqlite:///./db.sqlite3" \
  --env OPENAI_API_KEY="your-openai-api-key"
```
## How to use
AGI-MCP-Agent provides a Master Control Program (MCP) framework designed to orchestrate intelligent agents, their cognitive workflows, and multi-agent collaboration. The server exposes an API built with FastAPI and managed through an MCP interface, enabling you to deploy, monitor, and coordinate autonomous agents that can plan, reason, interact with tools, and persist memories. The included environment interface and coordination primitives make it possible to connect external APIs, data sources, and tools, while maintaining security through sandboxed execution and robust orchestration. Use the MCP to spawn agent lifecycles, schedule tasks, and observe performance metrics across agent fleets.
Once the server is running, you can interact with the API to create agents, assign tasks, and configure tools. The agent framework supports cognitive processing (planning, reasoning, decision-making), memory management (short- and long-term), tool integrations, perception modules, action generation, and self-monitoring. The environment interface standardizes how external systems are queried and how results are formatted, while multi-agent coordination handles communication, roles, and conflict resolution. This setup enables you to experiment with autonomous problem solving, tool chaining, and collaborative tasks across multiple agents within a single MCP instance.
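A minimal client sketch of the interaction described above, using only the Python standard library. The route `/agents` and the payload field names are assumptions for illustration; consult the live schema at http://localhost:8000/docs for the actual endpoints.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"

def build_agent_payload(name: str, model: str = "gpt-4") -> dict:
    """Assemble a hypothetical agent-creation payload.
    Field names here are illustrative assumptions, not the
    confirmed API schema -- check /docs before relying on them."""
    return {
        "name": name,
        "model": model,
        "capabilities": ["planning", "reasoning", "tool_use"],
    }

def post_json(path: str, payload: dict) -> dict:
    """POST a JSON body to the MCP API and decode the JSON response."""
    req = urllib.request.Request(
        BASE_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# Example (requires a running server; '/agents' is an assumed route):
# agent = post_json("/agents", build_agent_payload("researcher"))
```

The same `post_json` helper can be reused for task assignment once you confirm the task endpoint in the generated docs.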
## How to install
Prerequisites:
- Python 3.9 or newer
- Poetry (optional but recommended) for dependency management
- PostgreSQL 12+ or SQLite for development
- Docker and Docker Compose (optional, for containerized deployment)
- OpenAI API key (for LLM-based agents)
Install and run (local development):

1. Clone the repository:

   ```shell
   git clone https://github.com/ot2net/agi-mcp-agent.git
   cd agi-mcp-agent
   ```

2. Install dependencies (with Poetry): `make install-dev`, or manually: `poetry install`

3. Set up the environment variables: `cp example.env .env`, then edit `.env` with your configuration, including API keys and the database URL.

4. Initialize the database (if using a DB): `make db-init`

5. Run the server (development mode): `make run-dev`, or directly:

   ```shell
   uvicorn agi_mcp_agent.api.server:app --host 0.0.0.0 --port 8000 --reload
   ```
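The variables referenced in the setup step can be sketched as below. The values mirror the registration command at the top of this page and are placeholders only; copy `example.env` and substitute your own secrets.

```shell
# .env -- placeholder values; copy from example.env and adjust
DATABASE_URL="sqlite:///./db.sqlite3"   # or a postgresql:// URL in production
REDIS_URL="redis://localhost:6379"
OPENAI_API_KEY="your-openai-api-key"
SECRET_KEY="change-me"
ENVIRONMENT="development"
```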
Containerized option (Docker Compose):

1. Ensure Docker and Docker Compose are installed.

2. Copy the environment file and configure it: `cp example.env .env`

3. Start the services with Docker Compose: `docker-compose up -d`

4. Access the API:
   - API: http://localhost:8000
   - Documentation: http://localhost:8000/docs
## Additional notes
Tips and common considerations:
- The MCP server is Python-based and relies on uvicorn to serve the FastAPI application. Ensure the port (default 8000) is available on your host.
- Store sensitive keys (OpenAI, database passwords) in a .env file and do not commit it to version control.
- If you’re using Docker, you can customize memory/CPU limits in your docker-compose.yml for stability in resource-constrained environments.
- Use the Makefile commands to simplify development tasks (lint, test, format, migrations, etc.).
- If you encounter database connection issues, verify `DATABASE_URL` and the migration status (`make db-migrate`, `make db-upgrade`).
- When integrating new tools or APIs, ensure proper sandboxing and rate-limiting to prevent cascading failures across agents.
- Review the API docs at /docs to understand available endpoints for agent creation, task assignment, and environment configuration.
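As a quick sanity check after starting the server (locally or via Docker Compose), the snippet below polls the generated docs page to confirm the API is reachable. The URL assumes the default port 8000 mentioned above.

```python
import urllib.request

def server_ready(url: str = "http://localhost:8000/docs",
                 timeout: float = 2.0) -> bool:
    """Return True if the FastAPI docs page responds with HTTP 200.

    Any network failure (connection refused, timeout, HTTP error)
    is treated as 'not ready' rather than raised.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

if __name__ == "__main__":
    print("API reachable:", server_ready())
```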
## Related MCP Servers
python-client
An MCP server for querying the technical documentation of mainstream agent frameworks (supports both stdio and SSE transports), covering langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai.
gtm
An MCP server for Google Tag Manager. Connect it to your LLM, authenticate once, and start managing GTM through natural language.
AI-SOC-Agent
Blackhat 2025 presentation and codebase: AI SOC agent & MCP server for automated security investigation, alert triage, and incident response. Integrates with ELK, IRIS, and other platforms.
alris
Alris is an AI automation tool that transforms natural language commands into task execution.
simple
Simple MCP Server (educational example): a minimal MCP (Model Context Protocol) server built for educational purposes. Its goal is to help developers understand how the MCP protocol works and how to manage decision-making processes based on model-client communication.
llm-bridge
A model-agnostic Model Context Protocol (MCP) server that enables seamless integration with various Large Language Models (LLMs) such as GPT, DeepSeek, Claude, and more.