cognigraph
Uses markmap-cli and @mermaid-js/mermaid-cli to create CogniGraph outputs.
```shell
claude mcp add --transport stdio xsp52hz-cognigraph-mcp-server node /path/to/cognigraph-mcp-server/build/index.js \
  --env OPENAI_API_KEY="sk-..." \
  --env OPENAI_BASE_URL="http://localhost:11434/v1" \
  --env OPENAI_DEFAULT_MODEL="llama3" \
  --env MINDMAP_DEFAULT_SAVE_DIR="C:\Users\YourUser\Desktop"
```
How to use
CogniGraph MCP server provides tools to generate mind maps, relationship graphs, and knowledge graphs from Markdown input and Mermaid syntax, using external CLI tools and an OpenAI-compatible API for AI-driven analysis. The server exposes these capabilities via the MCP interface and is designed to work with MCP clients like Claude Desktop, Cherry Studio, DeepChat, and HyperChat. You can access the tools through the use_mcp_tool command in your MCP client, selecting the appropriate tool and supplying the required inputs and optional parameters. The server also supports saving outputs to disk when requested, using the MINDMAP_DEFAULT_SAVE_DIR environment variable as the default location.
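For example, a `use_mcp_tool` invocation of `generate_mindmap` might carry a payload like the following. The exact envelope (`server_name`, `tool_name`) varies by MCP client, and the Markdown body here is purely illustrative:

```json
{
  "server_name": "cognigraph-mcp-server",
  "tool_name": "generate_mindmap",
  "arguments": {
    "markdown": "# CogniGraph\n## Tools\n## Configuration",
    "outputFormat": "svg"
  }
}
```

Because `generate_mindmap` returns the rendered content directly, the response can be displayed or saved by the client itself.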
Available tools:
- generate_mindmap: create a mind map from Markdown text and return HTML or SVG content directly. Inputs: markdown (string, required), outputFormat (html | svg, optional, default html).
- generate_and_save_mindmap: create a mind map from Markdown, then save the result (HTML or SVG) to disk. Inputs: markdown (string, required), outputFormat (html | svg, optional, default html), outputDir (string, optional), outputFilename (string, optional).
- generate_relationship_graph: generate a relationship graph from Mermaid syntax, saving as SVG or PNG. Inputs: mermaid_text (string, required), outputFormat (svg | png, optional, default svg), outputDir (string, optional), outputFilename (string, optional).
- generate_knowledge_graph: analyze Markdown with an AI model (OpenAI API compatible) to produce Mermaid code and render it to SVG or PNG. Inputs: markdown (string, required), outputFormat (svg | png, optional, default svg), outputDir (string, optional), outputFilename (string, optional), prompt (string, optional), model (string, optional), apiKey (string, optional), baseURL (string, optional).
Note: generate_knowledge_graph requires configuration via environment variables or parameters for OpenAI access (OPENAI_API_KEY, OPENAI_BASE_URL, OPENAI_DEFAULT_MODEL) and may rely on MINDMAP_DEFAULT_SAVE_DIR for saved outputs.
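As a sketch of per-call overrides, a `generate_knowledge_graph` request can supply its own model settings instead of relying on the environment variables; the values below (model name, local endpoint, prompt text) are illustrative only:

```json
{
  "markdown": "# Plants\n## Roots absorb water\n## Leaves perform photosynthesis",
  "outputFormat": "png",
  "prompt": "Extract entities and relationships as a Mermaid graph.",
  "model": "llama3",
  "baseURL": "http://localhost:11434/v1"
}
```

When `apiKey`, `baseURL`, or `model` are omitted, the server falls back to OPENAI_API_KEY, OPENAI_BASE_URL, and OPENAI_DEFAULT_MODEL.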
How to install
Prerequisites:
- Node.js and npm installed on the host (if using the default Node-based server).
- Access to an MCP client configuration file (e.g., mcp_settings.json) to register the cognigraph-mcp-server.
- Optional: the external tools the server relies on (markmap-cli and @mermaid-js/mermaid-cli) if using the non-npx workflow.

Install steps:
1) Clone the repository containing cognigraph-mcp-server.
2) Build the server if needed (follow the repository's build steps, typically npm install && npm run build).
3) In your MCP client configuration, register the server with the name cognigraph-mcp-server, the command node, and the path to build/index.js in args. Provide the necessary environment variables, such as MINDMAP_DEFAULT_SAVE_DIR and any OpenAI-related keys.
4) If you plan to use npx for quick startup, ensure Node.js and npm are installed, then set command to npx and args to ["cognigraph-mcp-server"] in your MCP client settings, keeping env unchanged.
5) Start or restart your MCP client to load the Cognigraph MCP server.
6) If you encounter command-not-found issues, install markmap-cli and @mermaid-js/mermaid-cli globally: npm install -g markmap-cli @mermaid-js/mermaid-cli.
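The registration in step 3 might look like this in mcp_settings.json. The path and key values are placeholders taken from the install command earlier in this page; adjust them to your deployment:

```json
{
  "mcpServers": {
    "cognigraph-mcp-server": {
      "command": "node",
      "args": ["/path/to/cognigraph-mcp-server/build/index.js"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "OPENAI_BASE_URL": "http://localhost:11434/v1",
        "OPENAI_DEFAULT_MODEL": "llama3",
        "MINDMAP_DEFAULT_SAVE_DIR": "C:\\Users\\YourUser\\Desktop"
      }
    }
  }
}
```

For the npx variant described in step 4, replace "command" with "npx" and "args" with ["cognigraph-mcp-server"], keeping the "env" block unchanged.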
Additional notes
Environment variables and configuration tips:
- MINDMAP_DEFAULT_SAVE_DIR sets the default output directory for the saving tools (generate_and_save_mindmap, generate_relationship_graph, generate_knowledge_graph). If it is not set, outputs default to the user's home directory. generate_mindmap returns content directly and does not save to disk.
- OPENAI_API_KEY is required for generate_knowledge_graph. If it is missing and no apiKey parameter is provided in the request, that tool will fail.
- OPENAI_BASE_URL and OPENAI_DEFAULT_MODEL customize the OpenAI-compatible endpoint and model; they are only relevant to generate_knowledge_graph.
- When using npx, environment variables must still be passed through the MCP client's env configuration.
- If tools cannot locate markmap or mmdc, install them globally: npm install -g markmap-cli @mermaid-js/mermaid-cli.
- The configuration example in mcp_settings.json demonstrates how to wire the server with the Node-based entrypoint and path adjustments; adjust paths to match your deployment.
Related MCP Servers
zen
Self-hosted notes app. Single Go binary; notes stored as Markdown within SQLite, full-text search, very low resource usage.
MCP-Deepseek_R1
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3)
mcp-fhir
A Model Context Protocol implementation for FHIR
mcp
Inkdrop Model Context Protocol Server
mcp-appium-gestures
This is a Model Context Protocol (MCP) server providing resources and tools for Appium mobile gestures using the Actions API.
dubco-npm
The (Unofficial) dubco-mcp-server enables AI assistants to manage Dub.co short links via the Model Context Protocol. It provides three MCP tools: create_link for generating new short URLs, update_link for modifying existing links, and delete_link for removing short links.