dynamic-mcp
An MCP proxy server that reduces LLM context overhead with on-demand tool loading from multiple upstream servers.
Quick Start
claude mcp add --transport stdio asyrjasalo-dynamic-mcp -- uvx dmcp /path/to/your/dynamic-mcp.json
Alternatively, pass --env DYNAMIC_MCP_CONFIG=/path/to/your/dynamic-mcp.json and omit the config path argument.
How to use
Dynamic-mcp is a proxy that loads tool schemas from upstream MCP servers on demand, reducing the amount of context that needs to be loaded upfront. It starts with only a couple of MCP tools exposed and can import or reference additional servers, resources, and prompts from upstream transports (stdio, HTTP, SSE). This makes it easier to manage large toolsets without overloading the client with tokens. You can connect via the configured transports, and dynamic-mcp will fetch tool schemas as needed, handle OAuth when required, and automatically retry failed connections to keep commands available.
To use it, run the dynamic-mcp process through the provided interface (for Python, via uvx as in the Quick Start example). You can configure the proxy to expose the initial two tools and then gradually enable more upstream servers. When you invoke a tool, dynamic-mcp fetches the corresponding tool schemas for that server on demand, performs any environment variable interpolation, and routes the request through the chosen MCP transport (stdio, HTTP, or SSE). This on-demand loading keeps your MCP ecosystem responsive and token-efficient while supporting features such as OAuth authentication and automatic reconnection.
You can also import tool configurations from popular AI coding tools to bootstrap your dynamic-mcp configuration. The import flow normalizes environment variables to the ${VAR} format and guides you through feature selection (tools, resources, prompts) per server, generating a dynamic-mcp.json that you can point your mcpServers entry at.
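As a sketch, a dynamic-mcp.json describing two upstream servers might look like the following. The exact schema is an assumption based on the common mcpServers format the import flow targets; the server names, command, and URL are placeholders:

```json
{
  "mcpServers": {
    "filesystem": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "${PROJECT_DIR}"]
    },
    "remote-api": {
      "url": "https://example.com/mcp"
    }
  }
}
```

Per the transport notes further down, omitting the type on an entry that provides a URL selects HTTP with SSE detection, and ${PROJECT_DIR} is interpolated from the environment.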
How to install
Prerequisites:
- Either Python with uvx installed or a Rust toolchain to build the native binary
- Access to the internet to download required packages or binaries
Option A: Python package (uvx)
- Ensure Python is installed and uvx is available (uvx ships with uv; install it via pipx install uv or pip install uv).
- Optionally install dmcp persistently from PyPI (e.g. pipx install dmcp); with uvx this step can be skipped, since uvx fetches and runs the package on demand.
- Create or update your dynamic-mcp.json configuration file describing the upstream MCP servers you want to expose on demand.
- Run uvx with the dynamic-mcp configuration, e.g.: uvx dmcp /path/to/your/dynamic-mcp.json
- Set the DYNAMIC_MCP_CONFIG environment variable if you want to omit the config path when running.
Option B: Native binary
- Download the release for your OS from the dynamic-mcp releases page.
- Ensure the binary (dmcp) is in your PATH.
- Prepare your dynamic-mcp.json configuration file.
- Run the binary: dmcp
- Alternatively, set DYNAMIC_MCP_CONFIG to point to your config file and omit args.
Option C: Compile from source (Rust)
- Install Rust and Cargo.
- Build from source: cargo install dynamic-mcp
- The binary will be available at ~/.cargo/bin/dmcp (or $CARGO_HOME/bin/dmcp).
- Create dynamic-mcp.json and start the binary with dmcp, or set DYNAMIC_MCP_CONFIG to point to it.
Additional notes
Tips and considerations:
- The dynamic-mcp format supports stdio, HTTP, and SSE transports. If you omit the type and provide a URL, HTTP transport with SSE detection is used by default.
- The server can be configured to import tool configurations from many AI coding tools to bootstrap its setup. Environment variable normalization is performed during import to convert variables to the ${VAR} format expected by dynamic-mcp.
- OAuth is supported for HTTP/SSE MCP servers. On first connection, an authorization flow may occur in a browser, and tokens are stored under ~/.dynamic-mcp/oauth-servers/<server-name>.json with automatic rotation.
- For environment interpolation inside command arguments, you can use the ${VAR} syntax (and nested formats as illustrated in the docs).
- If you encounter token or connection issues, enable retries and verify that the upstream MCP servers are reachable and that any required OAuth scopes are properly configured.
- The mcp_config example uses a two-step startup (exposing two tools initially). You can customize which tools/resources/prompts are exposed per server during the import or by editing dynamic-mcp.json after import.
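The ${VAR} interpolation mentioned above behaves like shell-style substitution. A minimal Python sketch of the idea (illustrative only, not dynamic-mcp's actual implementation; the pass-through policy for unset variables is an assumption):

```python
import os
import re


def interpolate(value: str, env=os.environ) -> str:
    """Replace ${VAR} references in a string with values from the environment.

    Unset variables are left as-is (an assumed policy), so a missing
    reference stays visible rather than silently becoming empty.
    """
    return re.sub(r"\$\{(\w+)\}", lambda m: env.get(m.group(1), m.group(0)), value)


print(interpolate("${HOME_DIR}/data", {"HOME_DIR": "/home/user"}))  # /home/user/data
print(interpolate("${MISSING}/data", {}))  # ${MISSING}/data
```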
Related MCP Servers
mcp-agent
Build effective agents using Model Context Protocol and simple workflow patterns
openapi
OpenAPI definitions, converters and LLM function calling schema composer.
neurolink
Universal AI Development Platform with MCP server integration, multi-provider support, and a professional CLI. Build, test, and deploy AI applications across multiple AI providers.
offeryn
Build tools for LLMs in Rust using Model Context Protocol
codemesh
The Self-Improving MCP Server - Agents write code to orchestrate multiple MCP servers with intelligent TypeScript execution and auto-augmentation
mcp-protocol-sdk
[DEPRECATED] Moved to prism-mcp-rs - Enterprise-grade Rust MCP SDK