agentxsuite
AgentxSuite is an open-source platform to connect, manage, and monitor AI Agents and Tools across multiple MCP servers — in one unified interface.
claude mcp add --transport stdio alparn-agentxsuite uvx alparn-agentxsuite \
  --env POSTGRES_DB="your_db_name" \
  --env POSTGRES_USER="your_db_user" \
  --env DJANGO_SECRET_KEY="your-django-secret-key" \
  --env POSTGRES_PASSWORD="your_db_password" \
  --env NEXT_PUBLIC_API_URL="http://localhost:8000" \
  --env CORS_ALLOWED_ORIGINS="http://localhost:3000" \
  --env SECRETSTORE_FERNET_KEY="your-fernet-key" \
  --env NEXT_PUBLIC_MCP_FABRIC_URL="http://localhost:8090"
How to use
AgentxSuite provides a unified platform to manage AI Agents, Tools, and Policies across multiple MCP servers. It exposes a Django-based API alongside a FastAPI MCP Fabric service, enabling secure multi-tenant orchestration and MCP-compatible interactions. Use the MCP Fabric endpoints to discover registered tools, list organizations and environments, and execute tools via the unified run API. The system supports internal system tools, internal MCP Fabric interactions, and external MCP servers, allowing flexible tool execution workflows across environments. Typical usage involves registering organizations and environments, linking connections to MCP servers, registering agents and tools, and then running tools through the unified run endpoint or the MCP Fabric tool endpoints.
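To make the flow above concrete, here is a minimal sketch of assembling a tool-run request for the unified run endpoint. The endpoint path, field names, and tool name are illustrative assumptions, not taken from AgentxSuite's API reference; consult the actual API docs for the real request shape.

```python
import json

# NEXT_PUBLIC_MCP_FABRIC_URL from the install command above.
MCP_FABRIC_URL = "http://localhost:8090"
# Assumed path for the unified run endpoint (hypothetical).
RUN_ENDPOINT = f"{MCP_FABRIC_URL}/runs/execute"

def build_run_request(tool_name: str, arguments: dict,
                      org_id: str, env_id: str) -> dict:
    """Assemble an MCP-style tool-call payload (field names are illustrative)."""
    return {
        "tool": tool_name,
        "arguments": arguments,
        "org_id": org_id,   # multi-tenant scoping, per the docs above
        "env_id": env_id,
    }

payload = build_run_request("search_docs", {"query": "pricing"},
                            "org-1", "env-dev")
print(json.dumps(payload, indent=2))
```

The same payload shape would be POSTed to either the unified run endpoint or a per-tool MCP Fabric endpoint, depending on your workflow.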
How to install
Prerequisites:
- Python 3.11+ and virtual environment support
- Docker and Docker Compose (for Option A) or a local Python toolchain (for Option B)
- Access to a PostgreSQL database or SQLite for development
Option A: Docker Compose (recommended)
- Clone the repository and navigate to the project root.
- Create environment file (optional):
cp .env.example .env
Edit .env with your settings (SECRET_KEY, database passwords, etc.)
- Start all services: docker-compose up -d
- Run Django migrations: docker-compose exec backend python manage.py migrate
- (Optional) Create a superuser: docker-compose exec backend python manage.py createsuperuser
- Start MCP Fabric service (optional): docker-compose --profile mcp-fabric up -d mcp-fabric
- View logs as needed:
  docker-compose logs -f backend
  docker-compose logs -f frontend
  docker-compose logs -f mcp-fabric
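For reference, a minimal .env sketch covering the variables shown in the install command earlier; all values are placeholders you must replace.

```
POSTGRES_DB=your_db_name
POSTGRES_USER=your_db_user
POSTGRES_PASSWORD=your_db_password
DJANGO_SECRET_KEY=your-django-secret-key
# Generate with: python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"
SECRETSTORE_FERNET_KEY=your-fernet-key
NEXT_PUBLIC_API_URL=http://localhost:8000
NEXT_PUBLIC_MCP_FABRIC_URL=http://localhost:8090
CORS_ALLOWED_ORIGINS=http://localhost:3000
```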
Option B: Local Development (Python)
- Create a virtual environment:
  python3 -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
- Install dependencies:
  cd backend
  pip install -r requirements/base.txt
  pip install -r requirements/dev.txt
  pip install -r requirements/test.txt
- Apply migrations: python manage.py migrate
- (Optional) Create a superuser: python manage.py createsuperuser
- Start the Django API server:
  cd backend
  python manage.py runserver
- Start the MCP Fabric service (FastAPI):
  cd backend
  uvicorn mcp_fabric.main:app --reload --port 8090
- If you want to run both concurrently, use two terminals and ensure the virtual environment is active in the MCP Fabric terminal.
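Once both servers are running, a quick standard-library readiness check can confirm the default ports answer. This is a generic HTTP probe, not part of AgentxSuite; the root paths are assumed to return some HTTP response.

```python
import urllib.request
import urllib.error

def is_up(url: str, timeout: float = 2.0) -> bool:
    """Return True if the service at `url` answers any HTTP response."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # The server responded (even with a 4xx/5xx), so it is running.
        return True
    except (urllib.error.URLError, OSError):
        # Connection refused or unreachable.
        return False

for name, url in [("Django API", "http://localhost:8000/"),
                  ("MCP Fabric", "http://localhost:8090/")]:
    print(f"{name}: {'up' if is_up(url) else 'not reachable'}")
```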
Additional notes
- The MCP Fabric service exposes tools via multi-tenant and root endpoints; tokens for the root endpoints must include org_id and env_id claims.
- Environment variables for production should be managed securely (e.g., secret stores, not checked into VCS).
- Use the unified runs/execute endpoint to run tools in a consistent MCP-compatible format across environments.
- When using Docker, ensure the required ports (8000 for Django, 8090 for MCP Fabric, 5432 for Postgres, 6379 for Redis) are accessible and not blocked by firewalls.
- If you encounter authentication issues, verify the Authorization header and token scope for the requested endpoint.
- For development, SQLite is available; switch to PostgreSQL in production for better performance and concurrency.
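The org_id/env_id claim requirement noted above can be illustrated with a minimal HS256 JWT built from the standard library. This is a sketch of the expected token shape only; AgentxSuite's actual token issuance (signing secret, algorithm, and any additional claims) is not documented here and is assumed.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as JWTs require."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_token(secret: str, org_id: str, env_id: str) -> str:
    """Build an HS256 JWT carrying the org_id and env_id claims."""
    header = {"alg": "HS256", "typ": "JWT"}
    payload = {"org_id": org_id, "env_id": env_id}
    signing_input = (
        b64url(json.dumps(header, separators=(",", ":")).encode())
        + "."
        + b64url(json.dumps(payload, separators=(",", ":")).encode())
    )
    sig = hmac.new(secret.encode(), signing_input.encode(),
                   hashlib.sha256).digest()
    return f"{signing_input}.{b64url(sig)}"

token = make_token("your-signing-secret", "org-1", "env-dev")
print(token)
```

Send the token in the Authorization header (e.g. `Authorization: Bearer <token>`) when calling the root endpoints.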