
mcp-infranodus

The official InfraNodus MCP server

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio infranodus-mcp-server-infranodus \
  --env INFRANODUS_API_KEY="your InfraNodus API key (if required)" \
  --env INFRANODUS_GRAPH_NAME="default graph name to use, or leave as is" \
  -- npx -y infranodus-mcp-server-infranodus
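If you use Claude Desktop rather than Claude Code, the equivalent registration goes in its claude_desktop_config.json under the mcpServers key. A sketch (the environment variable names follow the command above and may not be required for every deployment; check the package README):

```json
{
  "mcpServers": {
    "infranodus": {
      "command": "npx",
      "args": ["-y", "infranodus-mcp-server-infranodus"],
      "env": {
        "INFRANODUS_API_KEY": "your_api_key",
        "INFRANODUS_GRAPH_NAME": "default_graph"
      }
    }
  }
}
```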

How to use

This InfraNodus MCP Server integrates InfraNodus knowledge graphs and network analysis capabilities into your LLM workflows and AI assistants. It exposes a set of tools that let you generate knowledge graphs from text, analyze existing graphs, detect content gaps, and derive actionable insights such as topics, clusters, and research questions. Use these tools to augment responses from your LLMs, improve prompt engineering with topical context, and create richer, graph-informed outputs for tasks like content planning, SEO, and knowledge management.

To use the tools, start the MCP server via the command above (typically using npx to run the InfraNodus MCP package). Once the server is running, you can invoke tools such as generate_knowledge_graph to convert text into a graph, analyze_text to extract topics and clusters from a URL or transcript, and generate_research_questions or generate_research_ideas to surface questions and ideas based on identified gaps. The memory-related tools (memory_add_relations, memory_get_relations, retrieve_from_knowledge_base) help you persist and retrieve graph-based context across sessions, enabling more coherent long-running conversations with your AI assistant.
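Under the hood, MCP tool invocations over the stdio transport are JSON-RPC 2.0 messages using the tools/call method. A minimal sketch of what such a request looks like (the tool name comes from the list above; the "text" argument key is an assumption — check the server's tool schema for the actual parameter names):

```python
import json

def make_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Serialize an MCP 'tools/call' request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Example: ask the server to turn a piece of text into a knowledge graph.
request = make_tool_call(1, "generate_knowledge_graph",
                         {"text": "AI and knowledge graphs"})
print(request)
```

In practice you would not hand-craft these messages — an MCP client such as Claude Code or Claude Desktop frames them for you — but this shows what crosses the wire when a tool is invoked.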

How to install

Prerequisites:

  • Node.js installed (recommended LTS version)
  • Access to npm or npx
  • Optional: InfraNodus API key if required by your deployment

Step 1: Install and run the MCP server via npx

npx -y infranodus-mcp-server-infranodus

Step 2: Set environment variables (if needed)

  • Create a .env file or export variables in your shell, e.g.
export INFRANODUS_API_KEY="your_api_key"
export INFRANODUS_GRAPH_NAME="default_graph"

Step 3: Verify the server is running

  • With the stdio transport (as in the install command above), the server communicates over stdin/stdout rather than a network port; check the console output for startup messages or errors
  • If you run the server with an HTTP-based transport behind a reverse proxy or firewall, ensure the relevant port is accessible

Step 4: Integrate with your client or workflow

  • Use the MCP adapter or API calls defined by the package to call the available tools
  • Pass in text, URLs, or transcripts as inputs to analyze and generate graph-based insights

Additional notes

Tips and common issues:

  • Ensure your InfraNodus API key (if required by your deployment) is kept secure; avoid committing it to code repositories.
  • If you see timeouts, increase the request timeout in your API client or adjust server resource limits.
  • When using memory-related tools, consistently provide a graph name to store and retrieve memory context.
  • If you need to customize the default graph, set INFRANODUS_GRAPH_NAME to your preferred value.
  • Review the list of available tools in the server documentation to determine which tool best suits your prompt engineering needs (e.g., generate_contextual_hint for prompts, or analyze_google_search_results for SEO contexts).
