
mcp-server-ai

AI MCP Server - connect via HTTP/REST API, gRPC, WebSocket, and Server-Sent Events (SSE) to AWS and Azure AI providers

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio proyectoskevinsvega-mcp-server-ai -- docker run -i proyectoskevinsvega/mcp-server-ai:latest

How to use

MCP Server AI is a unified, high-performance microservice that abstracts access to multiple AI providers (such as AWS Bedrock and Azure OpenAI) behind a single API surface. It exposes HTTP/REST, gRPC, WebSocket, and Server-Sent Events endpoints, so multi-protocol clients can interact with a single backend while benefiting from session management, worker pools, caching, and observability. You can reach the API through the HTTP port for REST endpoints, connect via gRPC for high-throughput calls, or use WebSocket for real-time bidirectional communication. The server also supports batch processing and provides monitoring through Prometheus/Grafana, with structured logging and distributed tracing for troubleshooting and performance analysis.
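As a rough sketch of reaching the REST surface, the call below posts a generation request to the HTTP port; the endpoint path and payload field names are assumptions for illustration, not the documented API.

```shell
# Hypothetical REST call sketch; endpoint path and payload fields are assumptions.
PAYLOAD='{"provider":"azure","model":"gpt-4o","prompt":"Hello"}'
# Requires a running server on the HTTP port (8090 in the Docker example below).
curl -s --max-time 5 -X POST "http://localhost:8090/api/v1/generate" \
  -H "Content-Type: application/json" \
  -d "$PAYLOAD" || echo "server not reachable"
```

For high-throughput or streaming workloads, the gRPC and WebSocket ports serve the same backend, so the choice of protocol is a client-side concern.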

How to install

Prerequisites:

  • Docker and Docker Compose installed
  • Optional: Kubernetes (kubectl) for deployments
  • Environment with access to AWS/Azure AI credentials
  1. Clone the repository or pull the Docker image
  • If using the provided Docker image, you can pull it from Docker Hub (example shown below)
  2. Quickstart with Docker Compose (recommended)
  • Create or edit a docker-compose.yml to include Redis, PostgreSQL, and the MCP Server AI service. Then start the stack:
# Example: bring the stack up (adjust services as needed)
docker-compose up -d
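A minimal docker-compose.yml sketch for that stack could look like the following; the service layout, ports, and environment variables are assumptions based on the Docker run example below, not a verified configuration.

```yaml
version: "3.8"
services:
  redis:
    image: redis:7-alpine
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_PASSWORD: changeme   # placeholder; use a real secret
  mcp-server-ai:
    image: proyectoskevinsvega/mcp-server-ai:latest
    ports:
      - "8090:8090"    # HTTP/REST
      - "8091:8091"    # WebSocket/SSE (assumed)
      - "50051:50051"  # gRPC
    environment:
      SERVER_ENV: production
    depends_on:
      - redis
      - postgres
```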
  3. Run directly with Docker (alternative)
  • Start the MCP server container (example usage):
docker run -d --name mcp-server-ai \
  -p 8090:8090 -p 8091:8091 -p 50051:50051 \
  -e SERVER_ENV=production \
  proyectoskevinsvega/mcp-server-ai:latest
  4. Optional: Kubernetes deployment
  • Use the provided Helm charts or Kubernetes manifests in deploy/k8s to deploy with HPA/VPA and monitoring. Ensure secrets are created for AI credentials before deploying.
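Such a credentials Secret might be sketched as below; the Secret name and key names are assumptions chosen for illustration.

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: mcp-server-ai-credentials   # hypothetical name
type: Opaque
stringData:
  AWS_ACCESS_KEY_ID: "<your-key-id>"
  AWS_SECRET_ACCESS_KEY: "<your-secret>"
  AZURE_OPENAI_API_KEY: "<your-azure-key>"
```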
  5. Environment configuration
  • You can configure via environment variables (see README variables section) or a .env file mounted into the container. Typical variables include provider credentials, Redis/PostgreSQL connection strings, and service ports.
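A .env sketch along those lines is shown below; apart from SERVER_ENV, which appears in the Docker example above, the variable names are assumptions and should be checked against the README variables section.

```
SERVER_ENV=production
REDIS_URL=redis://redis:6379
DATABASE_URL=postgres://postgres:changeme@postgres:5432/mcp
AWS_REGION=us-east-1
AZURE_OPENAI_ENDPOINT=https://<your-resource>.openai.azure.com
```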

Additional notes

Tips and common issues:

  • Ensure your AI provider credentials (AWS/Azure) are valid and have the required permissions.
  • When using Redis, enable TLS and clustering in production for improved resilience.
  • If you see high latency, check worker pool scaling settings and adjust MAX_TOKENS, TEMPERATURE, and the number of workers accordingly.
  • For production deployments, enable RBAC, Secrets management (Vault/Sealed Secrets), and implement network policies for Kubernetes.
  • Review the environment variables for CORS and API ports to avoid conflicts in multi-tenant environments.
  • Use Prometheus/Grafana dashboards to monitor latency, throughput, and resource utilization; enable Jaeger tracing for distributed traces across services.
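To wire up the Prometheus monitoring mentioned above, a scrape job could be sketched as follows; the metrics path and port are assumptions, as the server's actual metrics endpoint is not documented here.

```yaml
scrape_configs:
  - job_name: mcp-server-ai
    metrics_path: /metrics   # assumed path
    static_configs:
      - targets: ["localhost:8090"]
```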
