context7-http
Context7 MCP Server (with HTTP SSE and Streamable)
claude mcp add --transport stdio lrstanley-context7-http -- docker run -i ghcr.io/lrstanley/context7-http:latest
How to use
context7-http is an MCP server that exposes Context7 over HTTP(S) streaming endpoints, letting you access Context7 libraries and tooling without running the full local service. The server provides streamable HTTP and SSE endpoints, along with tools such as resolve-library-uri and search-library-docs for locating libraries and their documentation. It also exposes resources such as context7://libraries for high-level library information and context7://libraries/top/<n> for the top n libraries, ranked by trust score (falling back to popularity when no trust score is available).
To use it, run the container image and point your MCP client or IDE at the streamable HTTP endpoint. The server is designed for MCP-enabled tooling (editors, plugins, and agents) that expects an MCP server speaking the Context7 HTTP interface. The README's examples configure clients against the server URL (/mcp for streamable HTTP, /sse for SSE) and cover integrations with tools such as Windsurf, Zed, Claude Code, Claude Desktop, and BoltAI, all of which can consume MCP servers configured with the context7 HTTP endpoint.
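As a concrete illustration, a project-level .mcp.json for Claude Code pointing at a locally running instance might look like the following (a sketch: the server name is arbitrary, the URL assumes the default port mapping shown below, and other clients use their own configuration schema — check your client's documentation):

```json
{
  "mcpServers": {
    "context7-http": {
      "type": "http",
      "url": "http://localhost:8080/mcp"
    }
  }
}
```

For an SSE-based client, the URL would instead target the /sse endpoint.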
How to install
Prerequisites:
- Docker installed and running on your machine or server.
- Internet access to pull the container image from GitHub Container Registry (GHCR).
Installation steps (Docker-based):
- Ensure Docker is installed and running:
docker --version
- Run the MCP server using the provided image (latest tag):
docker run -i --rm -p 8080:8080 ghcr.io/lrstanley/context7-http:latest
This launches the server and exposes the MCP HTTP endpoints on port 8080 by default. If you need a specific tag, replace latest with a tag like 0.4.0 or master as appropriate.
- Verify the server is up by curling the MCP endpoint (adjust the port if you mapped it differently):
curl -si http://localhost:8080/mcp
Any HTTP response, even an error status such as 405 for a plain GET, confirms the server is listening; streamable HTTP endpoints generally expect POSTed JSON-RPC messages rather than bare GETs.
- (Optional) Run with a different host port or additional docker options as needed, for example:
docker run -i --rm -p 9090:8080 ghcr.io/lrstanley/context7-http:latest
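If you'd rather script the health check than eyeball curl output, a small Python probe can confirm the endpoint is reachable (a sketch; the URL assumes the default port mapping above):

```python
import urllib.request
import urllib.error


def is_listening(url: str, timeout: float = 2.0) -> bool:
    """Return True if the URL answers with any HTTP response at all.

    A streamable MCP endpoint may reject a plain GET (e.g. with 405),
    but any HTTP status still proves the server is up and listening.
    """
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        # Got an HTTP status code back -- the server is listening.
        return True
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, timeout, etc.
        return False


if __name__ == "__main__":
    print(is_listening("http://localhost:8080/mcp"))
```

Treating any HTTP status as success is deliberate: the goal here is only to distinguish "container is up" from "nothing is bound to the port."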
Alternative install methods (if you prefer building locally):
- Build from source (Go): follow the repository's Go build instructions to compile the binary, then run the binary directly and bind to a port. See the repository's README for build commands and any required environment variables.
- Use a different container tag or a pre-release image if your environment requires a specific version.
Additional notes
Notes and tips:
- The MCP server supports both streamable HTTP and SSE endpoints. If you plan to use SSE, make sure your client supports the SSE transport defined in the MCP specification.
- The server exposes tools like resolve-library-uri and search-library-docs to locate libraries and browse their documentation. These utilities are particularly useful when composing MCP requests or debugging library availability.
- When deploying behind a reverse proxy or in a cloud environment, consider configuring heartbeat intervals and proxy trust settings as described in MCP client tooling to maintain stable connections.
- The README references container images (ghcr) for multiple versions. Use specific version tags (e.g., 0.4.0) for reproducible deployments.
- No npm package is required; the server is distributed as a Docker image.
- If you run into port conflicts, map a different host port to the container’s internal port (for example, -p 9090:8080).
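The tool calls mentioned above travel as ordinary MCP JSON-RPC 2.0 messages POSTed to the /mcp endpoint. A minimal sketch of building a tools/call request for resolve-library-uri (the "libraryName" argument name is an assumption for illustration; query the server's tools/list to see the actual input schema):

```python
import json
from itertools import count

# Monotonically increasing JSON-RPC request ids.
_ids = count(1)


def mcp_tool_call(tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": next(_ids),
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })


# NOTE: "libraryName" is a hypothetical argument name, not confirmed
# against this server's schema.
request = mcp_tool_call("resolve-library-uri", {"libraryName": "react"})
print(request)
```

The resulting body would be POSTed to http://localhost:8080/mcp with Content-Type: application/json, per the streamable HTTP transport.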
Related MCP Servers
context7
Context7 MCP Server -- Up-to-date code documentation for LLMs and AI code editors
just
Share the same project justfile tasks with your AI Coding Agent.
mcp-lsp-bridge
MCP LSP bridge server bringing LSP to your LLM
relace
Unofficial Relace MCP client with AI features. Personal project; not affiliated with or endorsed by Relace
cco
Real-time audit and approval system for Claude Code tool calls.
vibe-workspace
Manage a vibe workspace with many repos