
GrEBI

HPC aggregation pipeline and API/MCP server for LLM-mediated biomedical data integration

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio ebispot-grebi -- docker run -i \
  --env NEO4J_AUTH="none" \
  --env KG_ENDPOINT="https://wwwdev.ebi.ac.uk/kg/api/v1/mcp" \
  ghcr.io/ebispot/grebi_mcp:latest
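If the command succeeds, the server should appear in Claude Code's MCP registry. A quick check, assuming the `claude` CLI is on your PATH:

```shell
# Verify the registration (guarded so the snippet is safe to run anywhere).
SERVER_NAME="ebispot-grebi"
if command -v claude >/dev/null 2>&1; then
  claude mcp list                  # all registered servers
  claude mcp get "$SERVER_NAME"    # details for this one
else
  echo "claude CLI not found; install Claude Code first"
fi
```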

How to use

GrEBI is an MCP server that orchestrates queries across multiple biomedical knowledge graphs, materialising graph queries and streaming results to enable cross-resource querying. It exposes an MCP endpoint at the configured host and port (for example, https://wwwdev.ebi.ac.uk/kg/api/v1/mcp), to which clients can connect and issue MCP requests. The server integrates data from EMBL-EBI resources, MONARCH, ROBOKOP, Ubergraph, and other sources, supporting integrative queries that span multiple data sources rather than being constrained to a single REST API.

Submit queries to the MCP endpoint and the system routes them across the integrated sources, merges the results, and returns them as a stream. The output typically includes materialised query results and cross-resource mappings for unified analysis; the README also notes that materialised queries and database exports are produced. This supports complex queries across ontologies, mappings, and datasets. To run locally or in a container, start the MCP server image with the command above, point your client at the MCP endpoint, and begin issuing MCP requests (e.g., cross-dataset phenotype or ontology-based queries).
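A first request can be sketched as the MCP `initialize` handshake (JSON-RPC 2.0 over the streamable HTTP transport, per the MCP specification). The `protocolVersion` and `clientInfo` values below are illustrative, not taken from GrEBI's documentation; the endpoint is the public dev endpoint from this page.

```shell
# Build an MCP `initialize` request body (JSON-RPC 2.0, per the MCP spec).
cat > /tmp/mcp_init.json <<'EOF'
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "initialize",
  "params": {
    "protocolVersion": "2024-11-05",
    "capabilities": {},
    "clientInfo": {"name": "example-client", "version": "0.1"}
  }
}
EOF

# Check the payload is well-formed JSON before sending it.
python3 -m json.tool /tmp/mcp_init.json >/dev/null && echo "payload OK"

# Then POST it to the endpoint (uncomment to actually send; streamable HTTP
# servers may answer with plain JSON or with a Server-Sent Events stream):
# curl -sS -X POST https://wwwdev.ebi.ac.uk/kg/api/v1/mcp \
#   -H 'Content-Type: application/json' \
#   -H 'Accept: application/json, text/event-stream' \
#   --data @/tmp/mcp_init.json
```

The dual `Accept` header matters: streaming-capable servers use it to decide whether to reply with a single JSON body or an event stream.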

How to install

Prerequisites:

  • Docker installed on your system (Docker Desktop or equivalent).
  • Access to the internet to pull the MCP server image.
  • Optional: a local Neo4j instance if you wish to run local graph data in parallel.

Installation steps:

  1. Pull the MCP server image (adjust the tag as needed):
docker pull ghcr.io/ebispot/grebi_mcp:latest
  2. Run the MCP server container:
docker run -d --name grebi_mcp -p 8080:8080 ghcr.io/ebispot/grebi_mcp:latest
  3. Verify the server is reachable and the MCP endpoint is exposed (adjust host/port as configured):
curl -sSf http://localhost:8080/api/v1/mcp
  4. If you need to connect to a Neo4j instance for local exports or testing, set the appropriate environment variables (e.g., NEO4J_AUTH) and make sure the container can access the Neo4j data directories.
  5. Optional: configure environment variables or mount points for persistent data, data exports, or mapping resources as required by your deployment environment.
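Steps 4 and 5 can be combined into a single guarded invocation. The host path, port mapping, and container-side volume target below are assumptions for illustration, not taken from the README; adjust them to your deployment.

```shell
# Sketch: start the container with Neo4j auth disabled and a host mount for
# exports. Paths and ports are illustrative assumptions.
IMAGE="ghcr.io/ebispot/grebi_mcp:latest"
if command -v docker >/dev/null 2>&1; then
  docker run -d --name grebi_mcp \
    -p 8080:8080 \
    --env NEO4J_AUTH="none" \
    --env KG_ENDPOINT="https://wwwdev.ebi.ac.uk/kg/api/v1/mcp" \
    -v "$HOME/grebi/exports:/data/exports" \
    "$IMAGE" || echo "docker run failed; check the daemon and image tag"
else
  echo "docker not found; see prerequisites above"
fi
```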

Additional notes

  • The GrEBI MCP server streams results; prefer clients that can consume streaming responses.
  • The image tags for the MCP container may update; verify that the tag you pull corresponds to a compatible release.
  • If you run Neo4j locally, ensure sufficient RAM and disk space for the graph exports mentioned in the README (e.g., multi-hundred GB exports).
  • The official endpoint is https://wwwdev.ebi.ac.uk/kg/api/v1/mcp; adapt your client configuration if you deploy to a different host or enable TLS in production.
  • Some environments require outbound network access (e.g., FTP for external mapping sources); ensure it is available if you rely on external data sources.
