🔌 Itential Platform MCP Server
claude mcp add --transport stdio itential-itential-mcp python -m itential_mcp run \
  --env ITENTIAL_MCP_PLATFORM_HOST="your-platform.example.com" \
  --env ITENTIAL_MCP_PLATFORM_USER="your-username" \
  --env ITENTIAL_MCP_PLATFORM_PASSWORD="your-password" \
  --env ITENTIAL_MCP_SERVER_TRANSPORT="stdio" \
  --env ITENTIAL_MCP_SERVER_HOST="0.0.0.0" \
  --env ITENTIAL_MCP_SERVER_PORT="8000"
How to use
This MCP server from Itential provides a comprehensive interface for connecting large language models to the Itential Platform. It exposes hundreds of automation capabilities through tools grouped into categories (network device management, workflow orchestration, platform health, resource management, and more), with flexible authentication and transport options. You can start the server locally, connect via stdio, SSE, or HTTP transports, and discover tools dynamically. Use the included CLI to start the server and then use an MCP client to invoke tools, start workflows, or query platform state. The server supports filtering tool visibility by tags and configuring access controls to match roles in your organization.
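As a sketch of the client side, the following assumes the official `mcp` Python SDK (`pip install mcp`) is available; it launches the server as a stdio subprocess and lists the tools it exposes. All environment values are placeholders — substitute your own.

```python
import asyncio

async def list_itential_tools() -> None:
    # Imports are deferred so this sketch only needs the SDK when actually run.
    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # Launch `itential-mcp run` as a stdio subprocess; env values are placeholders.
    params = StdioServerParameters(
        command="itential-mcp",
        args=["run"],
        env={
            "ITENTIAL_MCP_PLATFORM_HOST": "your-platform.example.com",
            "ITENTIAL_MCP_PLATFORM_USER": "your-username",
            "ITENTIAL_MCP_PLATFORM_PASSWORD": "your-password",
        },
    )
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])

# Run with: asyncio.run(list_itential_tools())
```

From the resulting session you can also call `session.call_tool(...)` to start workflows or query platform state, as described above.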
How to install
Prerequisites:
- Python 3.10+ installed
- pip available
- Optional: uv and make if you plan to run the server from source
Install from PyPI:
pip install itential-mcp
Run directly from source (development workflow):
git clone https://github.com/itential/itential-mcp
cd itential-mcp
make build # if you want to build containers or run local dev setup
# Run the server (from repository) using uv or the installed package
uv run itential-mcp run # if you have uv installed
Basic usage after installation:
# Start with default stdio transport
itential-mcp run
# Start with SSE transport for web clients
itential-mcp run --transport sse --host 0.0.0.0 --port 8000
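With the SSE transport running, a web-oriented client connects over HTTP. The sketch below again assumes the official `mcp` Python SDK, and assumes the SSE endpoint is served at `/sse` (a common FastMCP default — verify the path against your deployment):

```python
import asyncio

async def connect_over_sse(url: str = "http://127.0.0.1:8000/sse") -> None:
    # Deferred imports: the SDK is only needed when the coroutine actually runs.
    from mcp import ClientSession
    from mcp.client.sse import sse_client

    async with sse_client(url) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            print(f"{len(tools.tools)} tools available")

# Run with: asyncio.run(connect_over_sse())
```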
Additional notes
Environment and configuration tips:
- Set ITENTIAL_MCP_PLATFORM_HOST, ITENTIAL_MCP_PLATFORM_USER, and ITENTIAL_MCP_PLATFORM_PASSWORD to connect to your Itential Platform instance.
- Transport and host/port can be adjusted via ITENTIAL_MCP_SERVER_TRANSPORT, ITENTIAL_MCP_SERVER_HOST, and ITENTIAL_MCP_SERVER_PORT. Supported transports include stdio, SSE, and HTTP depending on your deployment.
- You can control tool visibility and behavior with include/exclude tag options when launching the server or via configuration files.
- If you run in container mode, supply environment variables as shown in the container usage examples (e.g., ITENTIAL_MCP_SERVER_TRANSPORT, ITENTIAL_MCP_SERVER_HOST, ITENTIAL_MCP_PLATFORM_HOST, etc.).
- When running from PyPI, you can still enable dynamic tool discovery and role-based access without modifying code.
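If you embed the server in another process, the same variables can be assembled programmatically. A minimal stdlib-only sketch (every value below is a placeholder):

```python
import os

# Environment for the server process; merge with the parent environment
# so PATH and friends survive. All values below are placeholders.
server_env = {
    **os.environ,
    "ITENTIAL_MCP_PLATFORM_HOST": "your-platform.example.com",
    "ITENTIAL_MCP_PLATFORM_USER": "your-username",
    "ITENTIAL_MCP_PLATFORM_PASSWORD": "your-password",
    "ITENTIAL_MCP_SERVER_TRANSPORT": "sse",
    "ITENTIAL_MCP_SERVER_HOST": "0.0.0.0",
    "ITENTIAL_MCP_SERVER_PORT": "8000",
}

# Launch with, e.g.:
# import subprocess
# subprocess.Popen(["itential-mcp", "run"], env=server_env)
```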
Common issues:
- Ensure Python 3.10+ is active in your environment if you see syntax or dependency errors.
- If the server fails to start on a port, verify that the port is not already in use and that the host binding is correct.
- For authentication failures, double-check platform credentials and that the correct host URL is reachable from the MCP server environment.
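To rule out a port conflict before starting the server, a quick stdlib check (the host and port here match the earlier SSE example):

```python
import socket

def port_in_use(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(timeout)
        return sock.connect_ex((host, port)) == 0

if port_in_use("127.0.0.1", 8000):
    print("Port 8000 is taken; pick another via --port or ITENTIAL_MCP_SERVER_PORT")
```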
Related MCP Servers
gpt-researcher
An autonomous agent that conducts deep research on any data using any LLM provider.
jupyter
🪐 🔧 Model Context Protocol (MCP) Server for Jupyter.
mcp-odoo
A Model Context Protocol (MCP) server that enables AI assistants to securely interact with Odoo ERP systems through standardized resources and tools for data retrieval and manipulation.
falcon
Connect AI agents to CrowdStrike Falcon for automated security analysis and threat hunting
beemcp
BeeMCP: an unofficial Model Context Protocol (MCP) server that connects your Bee wearable lifelogger to AI.
MCP-MultiServer-Interoperable-Agent2Agent-LangGraph-AI-System
This project demonstrates a decoupled real-time agent architecture that connects LangGraph agents to remote tools served by custom MCP (Modular Command Protocol) servers. The architecture enables a flexible and scalable multi-agent system where each tool can be hosted independently (via SSE or STDIO), offering modularity and cloud-deployable execution.