
evo-ai

Evo AI is an open-source platform for creating and managing AI agents, enabling integration with different AI models and services.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio evolutionapi-evo-ai \
  --env REDIS_URL="redis://localhost:6379/0" \
  --env EMAIL_HOST="smtp.yourprovider.com" \
  --env EMAIL_PORT="587" \
  --env DATABASE_URL="postgresql://USER:PASSWORD@HOST:5432/DB_NAME" \
  --env JWT_SECRET_KEY="your-jwt-secret" \
  --env EMAIL_HOST_USER="your-email@example.com" \
  --env EMAIL_HOST_PASSWORD="email-password" \
  --env LANGFUSE_PUBLIC_KEY="pk-lf-..." \
  --env LANGFUSE_SECRET_KEY="sk-lf-..." \
  -- python -m evo_ai.main

How to use

The Evo AI MCP server exposes APIs to manage agents, tools, and workflows, with support for multiple agent types: LLM-based, A2A, sequential, parallel, loop, workflow graphs via LangGraph, and task-oriented agents. It integrates with Google's A2A protocol for interoperability between agents and provides JWT-based authentication and encrypted API key management.

To get started, run the backend service and point clients or the frontend at the API at the configured URL. Use the MCP server to configure agents, attach tools, define sub-agents and workflows, and manage clients and organizations. Advanced workflows and cross-agent communication enable complex multi-agent orchestration through a unified API surface.
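As a sketch of what "configure agents through the API" can look like, the snippet below builds an authenticated agent-creation request against a local backend. The endpoint path (/api/v1/agents) and the payload schema are assumptions for illustration; check the Evo AI API documentation for the real contract.

```python
import json
import urllib.request

# Hypothetical agent definition -- field names are illustrative only.
agent = {
    "name": "support-bot",
    "type": "llm",
    "model": "gpt-4o-mini",
    "instruction": "Answer customer questions about the product.",
}

def create_agent_request(base_url: str, token: str, payload: dict) -> urllib.request.Request:
    """Build an authenticated POST request for an assumed agent-creation endpoint."""
    return urllib.request.Request(
        f"{base_url}/api/v1/agents",           # assumed path
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",  # JWT obtained from the auth flow
        },
        method="POST",
    )

req = create_agent_request("http://localhost:8000", "YOUR_JWT", agent)
print(req.full_url)      # http://localhost:8000/api/v1/agents
print(req.get_method())  # POST
```

Sending the request (urllib.request.urlopen(req)) only works once the backend from the "How to install" section is running.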

How to install

Prerequisites: Python 3.10+, PostgreSQL 13+, Redis 6+, Git, Make.

  1. Clone the repository

git clone https://github.com/EvolutionAPI/evo-ai.git
cd evo-ai

  2. Set up a Python virtual environment and install dependencies

make venv
source venv/bin/activate  # Linux/macOS

Windows: venv\Scripts\activate

make install-dev

  3. Configure environment

cp .env.example .env

Edit the .env file with your database, Redis, and other settings.
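A minimal .env for local development might look like the fragment below; all values are placeholders, and the keys mirror the ones passed via --env in the Claude Code command above.

```
DATABASE_URL=postgresql://USER:PASSWORD@localhost:5432/evo_ai
REDIS_URL=redis://localhost:6379/0
JWT_SECRET_KEY=change-me
EMAIL_HOST=smtp.yourprovider.com
EMAIL_PORT=587
EMAIL_HOST_USER=your-email@example.com
EMAIL_HOST_PASSWORD=email-password
LANGFUSE_PUBLIC_KEY=pk-lf-...
LANGFUSE_SECRET_KEY=sk-lf-...
```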

  4. Initialize the database and seed data

make alembic-upgrade
make seed-all

  5. Run the application (development)

make run

The backend will be available at http://localhost:8000.

  6. Optional: run frontend in a separate session

From the frontend directory:

cd frontend
pnpm install
pnpm dev

The frontend will be available at http://localhost:3000.

Additional notes

Environment variables to consider: DATABASE_URL, REDIS_URL, JWT_SECRET_KEY, the EMAIL_HOST/PORT/USER/PASSWORD group for notifications, and the Langfuse keys for tracing. If running with Docker, map ports and volumes accordingly and make sure the database and Redis services are reachable from the application container. Common issues include misconfigured database URLs, network restrictions between services, and missing migrations; run make alembic-upgrade to apply migrations after changing models. Because the platform supports A2A protocol integration and LangGraph-based workflows, ensure external dependencies and network rules allow inter-service communication when deploying to production.
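Since misconfigured database and Redis URLs are the most common failure mode, a small standard-library check of URL shape can save a debugging round-trip before you test actual connectivity. This is a generic sketch, not part of Evo AI itself.

```python
from urllib.parse import urlsplit

def check_url(name: str, url: str, expected_scheme: str) -> list[str]:
    """Return a list of problems found in a service URL (empty list = looks OK)."""
    problems = []
    parts = urlsplit(url)
    if parts.scheme != expected_scheme:
        problems.append(f"{name}: expected scheme '{expected_scheme}', got '{parts.scheme}'")
    if not parts.hostname:
        problems.append(f"{name}: missing hostname")
    if parts.port is None:
        problems.append(f"{name}: missing explicit port")
    return problems

# Placeholder values from the examples above; substitute your own settings.
issues = (
    check_url("DATABASE_URL", "postgresql://USER:PASSWORD@HOST:5432/DB_NAME", "postgresql")
    + check_url("REDIS_URL", "redis://localhost:6379/0", "redis")
)
print(issues)  # [] when both URLs are well-formed
```

If the list is empty but services still fail to connect, the problem is usually network reachability (firewalls, Docker networks) rather than the URL itself.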
