mcphub
MCPHub is an embeddable Model Context Protocol (MCP) solution for AI services. It integrates MCP servers with the OpenAI Agents, LangChain, and Autogen frameworks through a unified interface, simplifying the configuration, setup, and management of MCP tools across different AI applications.
claude mcp add --transport stdio cognitive-stack-mcphub npx -y @smithery/cli@latest run @smithery-ai/server-sequential-thinking
How to use
MCPHub provides a centralized way to embed and manage MCP servers within AI workflows. It supports both TypeScript-based MCP servers (via npm/npx) and Python-based MCP servers (via uv or other package managers) and handles repository cloning, setup, and running of the server components. With MCPHub, you can load a configured MCP server, enumerate its available tools, and then use those tools from within OpenAI Agents, LangChain, Autogen, or other frameworks. The example server shown in the documentation demonstrates a sequential-thinking MCP server, which exposes a set of capabilities (tools) that you can query and invoke through your agent or orchestrator. The library also supports automatic configuration from GitHub repos by analyzing READMEs to extract server settings, provided you have an OpenAI API key for the analysis step.
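Under the hood, enumerating a server's tools is a JSON-RPC 2.0 `tools/list` exchange over the server's stdio transport, as defined by the MCP specification. MCPHub's own Python API for this is not shown here; the sketch below only illustrates the wire-level message shape, with an example response modeled on the sequential-thinking server:

```python
import json


def make_tools_list_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 message an MCP client sends to enumerate tools."""
    request = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
        "params": {},
    }
    # The MCP stdio transport frames one JSON message per line.
    return json.dumps(request)


def extract_tool_names(response_line: str) -> list[str]:
    """Pull the tool names out of a tools/list response message."""
    response = json.loads(response_line)
    return [tool["name"] for tool in response["result"]["tools"]]


# Example response, modeled on what the sequential-thinking server returns:
example_response = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"tools": [{"name": "sequentialthinking",
                          "description": "Step-by-step reasoning tool",
                          "inputSchema": {"type": "object"}}]},
})
print(extract_tool_names(example_response))  # ['sequentialthinking']
```

Listing tools this way (or via your framework's equivalent) is also a quick connectivity check before wiring the server into an agent.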
How to install
Prerequisites:
- Node.js and npm (for npx-based MCP servers)
- Python 3.x (for uv-based MCP servers)
- Git
- OpenAI API key if you plan to use automatic repo-based configuration
Installation steps:
- Install uv (Python package manager; optional, if you plan to run Python-based MCP servers): curl -LsSf https://astral.sh/uv/install.sh | sh
- Install git (if not already installed):
  - Ubuntu/Debian: sudo apt-get install git
  - macOS: brew install git
- Install Node.js and npm; npx ships with npm 5.2+ (or install it explicitly: npm install -g npx)
- Install MCPHub (Python package): pip install mcphub
- Optional: install framework-specific extras for MCPHub:
  - pip install mcphub[openai] (OpenAI Agents integration)
  - pip install mcphub[langchain] (LangChain integration)
  - pip install mcphub[autogen] (Autogen integration)
  - pip install mcphub[all] (all optional dependencies)
- Create and customize your .mcphub.json configuration in your project root as needed (see README examples).
- Run your MCPHub-enabled workflow or application that loads the MCP servers from the configuration.
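The exact schema of .mcphub.json is defined in MCPHub's README; the fragment below is an assumed sketch following the server-entry shape (command, args, env) common to MCP configurations, using the sequential-thinking server command shown above:

```json
{
  "mcpServers": {
    "sequential-thinking": {
      "command": "npx",
      "args": ["-y", "@smithery/cli@latest", "run", "@smithery-ai/server-sequential-thinking"],
      "env": {
        "OPENAI_API_KEY": "${OPENAI_API_KEY}"
      }
    }
  }
}
```

Check the README examples for the authoritative field names before relying on this shape.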
Additional notes
Tips and considerations:
- Environment variables can be specified per server in the configuration (e.g., connection strings, API keys). Use the env field to map to ${ENV_VAR} placeholders.
- The configuration supports multiple sources: npm/npx, Python uv, GitHub repos, and local development servers. Combine sources as needed.
- If you use automatic configuration from a GitHub repo, an OpenAI API key must be available via OPENAI_API_KEY.
- Ensure that the server you reference (e.g., smithery-ai/server-sequential-thinking) is compatible with MCPHub's expected tooling interface and exposes tools that OpenAI Agents or LangChain can invoke.
- For debugging, list available tools from a running MCP server to verify proper connection and tool availability before integrating into agents.
- MCPHub handles path management and environment setup, but you may need to adjust setup_scripts in the configuration for non-default installation flows.
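The ${ENV_VAR} placeholders mentioned above are typically resolved against the process environment when the server is launched. This is not MCPHub's actual implementation, just a self-contained sketch of how such substitution commonly works:

```python
import os
import re

# Matches ${VAR}-style placeholders with a valid identifier inside.
_PLACEHOLDER = re.compile(r"\$\{([A-Za-z_][A-Za-z0-9_]*)\}")


def expand_placeholders(value: str, env=None) -> str:
    """Replace ${VAR} placeholders in a config value with environment values.

    Raises KeyError for placeholders with no corresponding variable, so a
    missing API key fails loudly instead of launching a misconfigured server.
    """
    env = os.environ if env is None else env

    def repl(match: re.Match) -> str:
        name = match.group(1)
        if name not in env:
            raise KeyError(f"environment variable {name!r} is not set")
        return env[name]

    return _PLACEHOLDER.sub(repl, value)


print(expand_placeholders("Bearer ${API_KEY}", {"API_KEY": "sk-test"}))
# Bearer sk-test
```

Failing fast on unset variables is usually preferable to passing an empty string through to the server process.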
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding coding agents through an end-to-end requirements-to-implementation-plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers, written using FastMCP.