
gateway

Universal MCP server for your databases, optimized for LLMs and AI agents.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add centralmind-gateway

How to use

CentralMind Gateway exposes your structured database behind an auto-generated API and an MCP (Model Context Protocol) server. It generates an MCP/REST API layer from your database schema and data samples, enabling AI agents to discover, understand, and interact with your data through MCP or OpenAPI endpoints. The MCP server is designed to be combined with your AI tooling to enable function calling, memory, and context-aware queries while providing PII protection, auditing, and secure access.

Typical usage is to start the gateway with a database connection string; it then serves an MCP SSE endpoint for streaming context and a REST API with Swagger UI for exploration and integration. The gateway also supports multiple deployment modes (standalone binary, Docker, Kubernetes), making it flexible for development, testing, and production.
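The MCP SSE endpoint mentioned above speaks the standard Server-Sent Events wire format. As a rough sketch of what a client sees on that stream, here is a minimal SSE parser; the field names (`event:`, `data:`) come from the SSE specification, while the sample payload is illustrative only, not actual gateway output.

```python
# Minimal sketch of parsing Server-Sent Events (SSE), the transport used by
# the gateway's MCP endpoint. The sample payload below is hypothetical.

def parse_sse(stream: str):
    """Yield (event, data) tuples from a raw SSE text stream."""
    event, data_lines = "message", []
    for line in stream.splitlines():
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data_lines.append(line[len("data:"):].strip())
        elif line == "":  # a blank line terminates one event
            if data_lines:
                yield event, "\n".join(data_lines)
            event, data_lines = "message", []

sample = "event: endpoint\ndata: /messages?session=abc123\n\n"
for ev, data in parse_sse(sample):
    print(ev, data)
```

A real client would read the stream incrementally from the HTTP response rather than from a string, but the framing rules are the same.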

How to install

Prerequisites:

  • Docker installed on your host (for the recommended deployment).
  • Internet access to pull the gateway image.

Installation steps (Docker):

  1. Ensure Docker is running on your machine.

  2. Pull and run the gateway image with a connection string to your database:

    docker run -i -p 9090:9090 ghcr.io/centralmind/gateway:v0.2.18 start --connection-string "postgres://db-user:db-password@db-host/db-name?sslmode=require"

  3. Verify the gateway is up. You should see logs indicating the MCP SSE server and REST API endpoints, for example:

    INFO Gateway server started successfully!
    INFO MCP SSE server for AI agents is running at: http://localhost:9090/sse
    INFO REST API with Swagger UI is available at: http://localhost:9090/

  4. Open the REST API (Swagger UI) in your browser at http://localhost:9090/ to inspect generated endpoints, or use the MCP SSE endpoint for streaming model context interactions.

Optional: If you prefer building from source, follow the build instructions in the project's GitHub repo (run go mod download, then go build) and start the resulting binary with a connection string.
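One easy mistake in step 2 is passing credentials that contain URL-special characters (@, /, :) unencoded, which breaks connection-string parsing. As a small sketch (the user, password, and host values here are placeholders), credentials can be percent-encoded before being assembled into the string passed to --connection-string:

```python
from urllib.parse import quote

# Sketch: assemble a Postgres connection string for the gateway's
# --connection-string flag, percent-encoding the credentials so that
# special characters do not corrupt the URL. All values are placeholders.
def pg_conn_string(user, password, host, dbname, sslmode="require"):
    return (f"postgres://{quote(user, safe='')}:{quote(password, safe='')}"
            f"@{host}/{dbname}?sslmode={sslmode}")

print(pg_conn_string("db-user", "p@ss/word", "db-host", "db-name"))
# postgres://db-user:p%40ss%2Fword@db-host/db-name?sslmode=require
```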

Additional notes

Tips and considerations:

  • The gateway exposes both MCP and OpenAPI/REST: use the MCP SSE endpoint for context-rich, streaming AI-agent interactions, and the REST API (with Swagger UI) for standard client integrations and interactive exploration.
  • Ensure your database connection string has appropriate permissions and SSL configuration as required by your environment.
  • For production, consider deploying via Kubernetes and enabling telemetry (OTel) and plugin-based security (API keys, OAuth).
  • If you encounter port conflicts, map the internal port 9090 to a different host port (e.g., -p 8080:9090) and update clients accordingly.
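When remapping ports as in the last tip, clients must be pointed at the host port, not the container port. A trivial sketch of deriving the client-facing URLs (the /sse and / paths come from the gateway's startup logs quoted above; the helper itself is hypothetical):

```python
# Sketch: build client-facing endpoint URLs for a remapped host port,
# e.g. after `docker run -p 8080:9090 ...`. Paths match the startup logs.
def gateway_urls(host="localhost", host_port=9090):
    base = f"http://{host}:{host_port}"
    return {"mcp_sse": f"{base}/sse", "swagger": f"{base}/"}

print(gateway_urls(host_port=8080))
```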
