metorial
Connect any AI model to 600+ integrations; powered by MCP 📡 🚀
claude mcp add --transport stdio metorial-metorial node server.js \
  --env NODE_OPTIONS="" \
  --env METORIAL_API_KEY="your-metorial-api-key"
How to use
Metorial provides an MCP server that connects AI models to a broad ecosystem of APIs, data sources, and tooling. The server acts as a gateway and orchestrator: your models deploy and interact with many server deployments through a single, unified interface. Metorial's run flow manages sessions, routes tool calls, and orchestrates multi-step conversations across external services. Built-in OAuth support handles services that require user authentication, giving secure access to tools like calendars, messaging, and storage providers. In practice, you deploy the servers you need (for example, Google Calendar, Slack, Hacker News, or custom tools), then pass their identifiers to the Metorial .run() workflow to execute agent-driven tasks across those tools. This setup targets developers building agentic AI applications who want reliable monitoring, customization, and straightforward self-hosting.
To use it, point your MCP client (your AI model) at the Metorial MCP server as part of a deployment plan. You typically authenticate with a Metorial API key, create OAuth sessions for any services that require user authorization, and then issue a run request specifying a message, the target server deployments, and a model. The server manages sessions, coordinates tool usage across deployments, and returns the composed result of the agent's interactions with the connected tools.
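The run request described above can be sketched as a plain payload. This is illustrative only: the field names (message, serverDeployments, model) mirror the description here, and buildRunRequest is a hypothetical helper, not part of the Metorial SDK.

```typescript
// Hypothetical shape of a Metorial run request, based on the description
// above; the real SDK's types may differ.
interface RunRequest {
  message: string;             // the user/task message for the agent
  serverDeployments: string[]; // IDs of the deployed servers to use
  model: string;               // the model that drives the run
}

// Illustrative helper (not part of the SDK) that validates inputs
// before a run is issued.
function buildRunRequest(
  message: string,
  serverDeployments: string[],
  model: string
): RunRequest {
  if (serverDeployments.length === 0) {
    throw new Error("a run needs at least one server deployment");
  }
  return { message, serverDeployments, model };
}

const req = buildRunRequest(
  "Summarize today's top Hacker News stories",
  ["deploy_hacker_news"],
  "gpt-4o"
);
console.log(req.serverDeployments.length); // 1
```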
How to install
Prerequisites:
- Git
- Node.js (LTS) and npm/yarn
- A Metorial API key (for MCP features and onboarding)
- Install Node.js and npm
  - Download and install from https://nodejs.org/
  - Verify: node -v && npm -v
- Clone the repository
  - git clone https://github.com/metorial/metorial.git
  - cd metorial
- Install dependencies
  - npm install
- Configure environment
  - Create a .env file or export environment variables:
    METORIAL_API_KEY=your-metorial-api-key
  - Optionally add any other config values your deployment requires
- Run the MCP server
  - npm run start (if your package.json uses a different start script, use that instead, e.g., node server.js)
- Verify startup
  - The server should listen on the configured port (e.g., http://localhost:3000) and expose MCP endpoints as defined by the repository.
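The configuration step above can be captured in a .env file. The API key placeholder comes from the steps above; the commented-out PORT entry is a hypothetical example of a deployment-specific value, not a documented setting.

```shell
# .env — loaded at startup; do not commit this file to version control
METORIAL_API_KEY=your-metorial-api-key

# Optional, deployment-specific values (hypothetical example):
# PORT=3000
```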
Notes:
- If you’re using Docker, adapt the commands to docker run -i <image> with the appropriate environment variables.
- If you’re using a hosted Metorial Platform API key, ensure network access and API quotas are set accordingly.
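The Docker note above might look like the following. This is a sketch: <image> is a placeholder for your built image name, and the exact flags depend on how the image is built.

```shell
# Hypothetical invocation; replace <image> with your image name.
docker run -i --rm \
  -e METORIAL_API_KEY="your-metorial-api-key" \
  -e NODE_OPTIONS="" \
  <image>
```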
Additional notes
Tips and common issues:
- Ensure METORIAL_API_KEY is set; without it, MCP route calls may fail or be unauthenticated.
- When using OAuth-enabled tools (e.g., Google Calendar, Slack), you’ll typically create OAuth sessions via the MCP flows and pass the resulting session IDs in serverDeployments.
- For local development, prefer .env files to manage secrets securely and avoid committing keys to version control.
- If your server doesn’t start, check that the entry point (server.js) exists and that dependencies are installed (npm install).
- Use the provided examples in the repository to validate your setup and test end-to-end MCP runs with a simple tool deployment before scaling to more complex workflows.
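The OAuth tip above can be sketched as follows. The shapes here are assumptions for illustration (withOAuth and the object form of a deployment reference are hypothetical, not the real SDK types); the point is that OAuth session IDs travel alongside the deployment IDs in serverDeployments.

```typescript
// Hypothetical sketch: attaching an OAuth session to a server
// deployment entry before a run. Not the real SDK types.
type DeploymentRef =
  | string // plain deployment ID, for tools that need no user auth
  | { serverDeploymentId: string; oauthSessionId: string };

// Illustrative helper pairing a deployment with an OAuth session ID
// obtained from the MCP OAuth flow.
function withOAuth(deploymentId: string, sessionId: string): DeploymentRef {
  return { serverDeploymentId: deploymentId, oauthSessionId: sessionId };
}

const serverDeployments: DeploymentRef[] = [
  "deploy_hacker_news", // public data, no user auth needed
  withOAuth("deploy_google_calendar", "oauth_session_123"),
];
console.log(serverDeployments.length); // 2
```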
Related MCP Servers
agent-scan
Security scanner for AI agents, MCP servers and agent skills.
mcpcan
MCPCAN is a centralized management platform for MCP services. It deploys each MCP service in its own container, and supports container monitoring and MCP service token verification to mitigate security risks and enable rapid deployment. It exposes MCP services over the SSE, stdio, and Streamable HTTP transports.
HyperChat
HyperChat is an open chat client that uses APIs from various LLMs to deliver the best chat experience and implements productivity tools through the MCP protocol.
evo-ai
Evo AI is an open-source platform for creating and managing AI agents, enabling integration with different AI models and services.
DeepWideResearch
Agentic RAG for any scenario. Customize sources, depth, and width
skunit
skUnit is a testing tool for AI units, such as IChatClient, MCP Servers and agents.