
agentxsuite

AgentxSuite is an open-source platform to connect, manage, and monitor AI Agents and Tools across multiple MCP servers — in one unified interface.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio alparn-agentxsuite uvx alparn-agentxsuite \
  --env POSTGRES_DB="your_db_name" \
  --env POSTGRES_USER="your_db_user" \
  --env DJANGO_SECRET_KEY="your-django-secret-key" \
  --env POSTGRES_PASSWORD="your_db_password" \
  --env NEXT_PUBLIC_API_URL="http://localhost:8000" \
  --env CORS_ALLOWED_ORIGINS="http://localhost:3000" \
  --env SECRETSTORE_FERNET_KEY="your-fernet-key" \
  --env NEXT_PUBLIC_MCP_FABRIC_URL="http://localhost:8090"
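The SECRETSTORE_FERNET_KEY variable expects a Fernet key. As a sketch (assuming the standard Fernet format used by Python's `cryptography` library): a key is 32 random bytes, URL-safe base64 encoded, so you can generate a compatible value with the standard library alone:

```python
import base64
import os

def generate_fernet_key() -> str:
    """Generate a Fernet-compatible key: 32 random bytes,
    URL-safe base64 encoded (a 44-character string)."""
    return base64.urlsafe_b64encode(os.urandom(32)).decode()

print(generate_fernet_key())
```

If you have the `cryptography` package installed, `Fernet.generate_key()` produces the same format.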

How to use

AgentxSuite provides a unified platform to manage AI Agents, Tools, and Policies across multiple MCP servers. It exposes a Django-based API alongside a FastAPI MCP Fabric service, enabling secure multi-tenant orchestration and MCP-compatible interactions. Use the MCP Fabric endpoints to discover registered tools, list organizations and environments, and execute tools through the unified run API. Tool execution can target internal system tools, the internal MCP Fabric, or external MCP servers, giving you flexible workflows across environments. A typical flow is: register organizations and environments, link connections to MCP servers, register agents and tools, and then run tools through the unified run endpoint or the MCP Fabric tool endpoints.
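The flow above can be sketched as a request to the unified run endpoint. Note that the path ("/runs/execute"), the payload field names, and the header shape below are illustrative assumptions, not the documented API contract; only the base URL comes from the install step:

```python
import json

# Assumed base URL, matching NEXT_PUBLIC_MCP_FABRIC_URL from the install command.
MCP_FABRIC_URL = "http://localhost:8090"

def build_run_request(org_id: str, env_id: str, tool: str,
                      arguments: dict, token: str) -> dict:
    """Assemble the pieces of a tool-execution request.
    Endpoint path and field names are hypothetical -- adjust to your deployment."""
    return {
        "url": f"{MCP_FABRIC_URL}/runs/execute",
        "headers": {
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "org_id": org_id,
            "env_id": env_id,
            "tool": tool,
            "arguments": arguments,
        }),
    }

req = build_run_request("acme", "dev", "echo", {"text": "hello"}, "<token>")
# Send with any HTTP client, e.g. urllib.request or requests.
```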

How to install

Prerequisites:

  • Python 3.11+ and virtual environment support
  • Docker and Docker Compose (for Option A) or a local Python setup (for Option B)
  • Access to a PostgreSQL database or SQLite for development

Option A: Docker Compose (recommended)

  1. Clone the repository and navigate to the project root.
  2. Create environment file (optional): cp .env.example .env

    Edit .env with your settings (SECRET_KEY, database passwords, etc.)

  3. Start all services: docker-compose up -d
  4. Run Django migrations: docker-compose exec backend python manage.py migrate
  5. (Optional) Create a superuser: docker-compose exec backend python manage.py createsuperuser
  6. Start MCP Fabric service (optional): docker-compose --profile mcp-fabric up -d mcp-fabric
  7. View logs as needed:
    docker-compose logs -f backend
    docker-compose logs -f frontend
    docker-compose logs -f mcp-fabric

Option B: Local Development (Python)

  1. Create and activate a virtual environment:
    python3 -m venv venv
    source venv/bin/activate  # On Windows: venv\Scripts\activate
  2. Install dependencies:
    cd backend
    pip install -r requirements/base.txt
    pip install -r requirements/dev.txt
    pip install -r requirements/test.txt
  3. Apply migrations: python manage.py migrate
  4. (Optional) Create a superuser: python manage.py createsuperuser
  5. Start the Django API server (from the backend directory):
    python manage.py runserver
  6. Start the MCP Fabric service (FastAPI, from the backend directory):
    uvicorn mcp_fabric.main:app --reload --port 8090
  7. To run both concurrently, use two terminals and ensure the virtual environment is active in the MCP Fabric terminal.
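After both services are started, a quick sanity check is to confirm something is listening on the two ports named above (8000 for the Django API, 8090 for MCP Fabric). A minimal stdlib sketch:

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Ports taken from the steps above; adjust if you changed them.
for name, port in [("Django API", 8000), ("MCP Fabric", 8090)]:
    status = "up" if port_open("localhost", port) else "down"
    print(f"{name} (port {port}): {status}")
```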

Additional notes

  • The MCP Fabric service exposes tools via multi-tenant and root endpoints; tokens for the root endpoints must include org_id and env_id claims.
  • Environment variables for production should be managed securely (e.g., secret stores, not checked into VCS).
  • Use the unified runs/execute endpoint to run tools in a consistent MCP-compatible format across environments.
  • When using Docker, ensure the required ports (8000 for Django, 8090 for MCP Fabric, 5432 for Postgres, 6379 for Redis) are accessible and not blocked by firewalls.
  • If you encounter authentication issues, verify the Authorization header and token scope for the requested endpoint.
  • For development, SQLite is available; switch to PostgreSQL in production for better performance and concurrency.
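Regarding the org_id and env_id claims required by the root endpoints: assuming the tokens are JWTs (the source does not specify the token format), you can inspect a token's claims before sending a request. This sketch decodes the payload without verifying the signature, so it is a debugging aid only:

```python
import base64
import json

def jwt_claims(token: str) -> dict:
    """Decode the (unverified) payload segment of a JWT.
    Debugging aid only -- it does NOT validate the signature."""
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))

def has_fabric_claims(token: str) -> bool:
    """Check for the org_id and env_id claims the root endpoints require."""
    claims = jwt_claims(token)
    return "org_id" in claims and "env_id" in claims

# Example with a synthetic, unsigned token:
body = base64.urlsafe_b64encode(
    json.dumps({"org_id": "acme", "env_id": "dev"}).encode()
).rstrip(b"=").decode()
sample = f"eyJhbGciOiJIUzI1NiJ9.{body}.sig"
print(has_fabric_claims(sample))  # → True
```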
