Apollo MCP Server
claude mcp add --transport stdio apollographql-apollo-mcp-server cargo run
How to use
Apollo MCP Server is a Model Context Protocol server that exposes GraphQL operations as MCP tools. It acts as a gateway between your GraphQL API (built with Apollo) and MCP clients such as LLMs or MCP inspectors, allowing those clients to discover, query, and orchestrate GraphQL operations through a standardized MCP interface. To use it effectively, you’ll need to define the GraphQL operations you want to expose as MCP tools and provide a configuration file that guides how the MCP server should run and connect to your GraphQL backend. The server will then surface those operations as MCP tools that AI models can invoke via the MCP protocol.
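As a sketch of what "defining the GraphQL operations you want to expose" looks like, each operation typically lives in its own .graphql file; the file path, query name, and fields below are hypothetical placeholders, not part of the server itself:

```graphql
# operations/GetWeather.graphql — hypothetical example operation
# Each named operation like this can be surfaced as one MCP tool.
query GetWeather($city: String!) {
  weather(city: $city) {
    temperature
    conditions
  }
}
```

The operation name and its typed variables are what an MCP client sees as the tool's name and input parameters.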
To get started, configure a graph for the MCP server to sit in front of, define the GraphQL operations you wish to expose, and establish a connection to an MCP client (e.g., an LLM or MCP inspector). The server documentation covers config file options and examples. Once running, you can access the exposed tools, issue queries or mutations through the MCP interface, and leverage the Apollo GraphQL backend through a standardized, model-friendly protocol.
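A minimal sketch of a configuration file follows; the exact option names should be confirmed against the server documentation, and the endpoint URL and operations path here are placeholders:

```yaml
# config.yaml — illustrative sketch; verify key names in the server docs
endpoint: http://localhost:4000   # the GraphQL API the server sits in front of
operations:
  source: local
  paths:
    - operations/                 # directory of .graphql operation files to expose
transport:
  type: stdio                     # how MCP clients connect to the server
```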
How to install
Prerequisites:
- Rust toolchain (Rust and Cargo) installed on your machine
- Git to clone the repository
Install and run from source:
- Install Rust (if not already installed):
- On macOS/Linux: curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
- On Windows: download and run the rustup-init.exe installer from https://rustup.rs/
- Clone the repository:
- git clone https://github.com/apollographql/apollo-mcp-server.git
- cd apollo-mcp-server
- Build and run the server (development mode):
- cargo build
- cargo run
This compiles the MCP server and starts the debug binary from target/debug by default.
Alternative: follow the project’s official installation guide for any binary releases or containerized options if available.
Additional notes
Notes and tips:
- The Apollo MCP Server documentation covers config-file options, including how to specify the GraphQL graph, the operations to expose, and MCP client connections.
- By default, building from source places the binary in target/debug (or target/release with --release). Use cargo run for a quick start or cargo build --release for a production-oriented binary.
- Ensure you have a reachable GraphQL endpoint for the server to proxy requests to, and verify network access between the MCP server and your MCP clients (LLMs or MCP inspectors).
- If you encounter port or binding issues, check your environment for conflicting services and adjust the server configuration accordingly.
- When running via cargo, you can pass environment variables or command-line flags, as described in the project docs, to tailor logging, config file paths, and backend URLs.
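The endpoint-reachability check mentioned in the notes above can be sketched as a small probe that sends the minimal `{ __typename }` query every GraphQL server answers; the URL below is a placeholder for your own endpoint:

```python
import json
import urllib.request

def graphql_probe(url: str, timeout: float = 5.0) -> dict:
    """POST a minimal `{ __typename }` query and return the decoded JSON reply."""
    payload = json.dumps({"query": "{ __typename }"}).encode("utf-8")
    req = urllib.request.Request(
        url,
        data=payload,
        headers={"content-type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())

# Usage (requires a running GraphQL endpoint at the given URL):
#   graphql_probe("http://localhost:4000/")
```

If the probe returns a JSON body with a "data" key, the MCP server should be able to reach the same endpoint; an error here points at networking or configuration rather than the MCP layer.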
Related MCP Servers
goose
an open source, extensible AI agent that goes beyond code suggestions - install, execute, edit, and test with any LLM
cunzhi
Say goodbye to the hassle of AI terminating early; helps AI stay persistent for longer
probe
AI-friendly semantic code search engine for large codebases. Combines ripgrep speed with tree-sitter AST parsing. Powers AI coding assistants with precise, context-aware code understanding.
mcp-center
A centralized platform for managing and connecting MCP servers. MCP Center provides a high-performance proxy service that enables seamless communication between MCP clients and multiple MCP servers.
backlog-rust
MCP server for Backlog, project management service.
perplexity-web-api
🔍 Perplexity AI MCP without API key