kortx
Kortx: MCP server for AI-powered consultation. GPT-5 strategic planning, Perplexity real-time search, GPT Image visual creation, with intelligent context gathering
```shell
claude mcp add --transport stdio effatico-kortx-mcp npx -y @effatico/kortx-mcp@latest \
  --env OPENAI_API_KEY="${OPENAI_API_KEY}" \
  --env PERPLEXITY_API_KEY="${PERPLEXITY_API_KEY}"
```
How to use
Kortx is a lightweight MCP server that gives copilots access to a suite of planning, research, and enhancement tools built on the Model Context Protocol. It ships with seven consultation tools (think-about-plan, suggest-alternative, improve-copy, solve-problem, consult, search-content, create-visual) plus a batch runner for parallel execution. It also supports file-based context enrichment through a default gatherer, with optional connectors to the Serena, MCP Knowledge Graph, and CCLSP MCP servers when those services are running.

The server is designed for stdio transport and includes structured logging, rate limiting, and request caching, along with a hardened Docker build that runs as a non-root user. To use Kortx, add it to your MCP client configuration with your credentials and preferred model settings, then invoke the available tools through the MCP interface. The Quick Start above shows how to wire Kortx into an MCP client and pass the API keys that enable OpenAI generation and citation-backed Perplexity results.
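As an alternative to the `claude mcp add` command, the equivalent manual entry in an MCP client's configuration file might look like the following. This is a hedged sketch assuming the common `mcpServers` schema used by Claude Desktop and similar clients; the key values are placeholders, not defaults from the Kortx docs.

```json
{
  "mcpServers": {
    "effatico-kortx-mcp": {
      "command": "npx",
      "args": ["-y", "@effatico/kortx-mcp@latest"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "PERPLEXITY_API_KEY": "pplx-..."
      }
    }
  }
}
```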
How to install
Prerequisites:
- Node.js >= 22.12.0
- npm >= 9
- Optional: Docker for containerized runs
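The Node floor can be checked before installing. The helper below is our own sketch, not part of Kortx; it compares a version string against the stated minimum of 22.12.0.

```shell
# Check whether a Node version string satisfies Kortx's minimum of 22.12.0.
node_ok() {
  awk -v v="$1" 'BEGIN {
    split(v, p, ".")
    if (p[1] + 0 > 22) exit 0
    if (p[1] + 0 == 22 && p[2] + 0 >= 12) exit 0
    exit 1
  }'
}

if node_ok "$(node --version | tr -d v)"; then
  echo "Node version OK"
else
  echo "Upgrade Node to >= 22.12.0"
fi
```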
Recommended steps:
- Clone the repository (or install the npm package):

  ```shell
  git clone https://github.com/effatico/kortx-mcp.git
  cd kortx-mcp
  ```

- Install dependencies:

  ```shell
  npm install
  ```

- Build the project (if applicable for your setup):

  ```shell
  npm run build
  ```

- Run in development mode (for local testing):

  ```shell
  npm run dev
  ```

- If you prefer Docker, build and run the image as described in the Docker section of the README:

  ```shell
  docker build -t kortx-mcp .
  docker run -i --rm \
    -e OPENAI_API_KEY=$OPENAI_API_KEY \
    -e PERPLEXITY_API_KEY=$PERPLEXITY_API_KEY \
    kortx-mcp
  ```

- Optionally, follow the Quick Start example above to add Kortx to your MCP client configuration.
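For longer-lived containerized runs, a docker-compose file could wrap the same image. This is a minimal sketch: the service name, restart policy, and mount path are assumptions, not taken from the README; check its Docker section for the suggested volume mounts.

```yaml
services:
  kortx:
    build: .
    stdin_open: true        # MCP stdio transport needs an open stdin
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - PERPLEXITY_API_KEY=${PERPLEXITY_API_KEY}
    volumes:
      - ./data:/app/data    # illustrative mount point only
    restart: unless-stopped
```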
Additional notes
- The server uses stdio for transport; HTTP is not implemented yet (as per the README).
- Set OPENAI_API_KEY and PERPLEXITY_API_KEY in your environment to enable model access and Perplexity-powered search.
- You can customize model choices, reasoning effort, verbosity, and retry behavior via configuration (OPENAI_MODEL, OPENAI_REASONING_EFFORT, OPENAI_VERBOSITY, etc.).
- If you run Docker, the image runs as UID/GID 1001 (nodejs) and performs npm audits during build.
- The Quick Start shows how to wire Kortx into an MCP client using an npx-based installation; you can replace it with a package-manager install or a local build as needed.
- For longer-lived deployments, consider using docker-compose with volume mounts as suggested in the Docker section of the README.
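As an illustration of the configuration variables named above, a shell profile or env file might set something like the following. The variable names come from the README; every value here is a placeholder, not a documented default.

```shell
# Illustrative values only; consult the README for the supported options.
export OPENAI_API_KEY="sk-..."            # enables OpenAI-backed tools
export PERPLEXITY_API_KEY="pplx-..."      # enables search-content
export OPENAI_MODEL="gpt-5"               # which model to consult (placeholder)
export OPENAI_REASONING_EFFORT="medium"   # reasoning-effort hint (placeholder)
export OPENAI_VERBOSITY="low"             # response verbosity (placeholder)
```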
Related MCP Servers
Pare
Dev tools, optimized for agents. Structured, token-efficient MCP servers for git, test runners, npm, Docker, and more.
vibe-workspace
Manage a vibe workspace with many repos
mini_claude
Give Claude Code persistent memory across sessions. Track habits, log mistakes, prevent death spirals. Runs locally with Ollama.
cadre-ai
Your AI agent squad for Claude Code. 17 specialized agents, persistent memory, desktop automation, and a common sense engine.
mcp-tidy
CLI tool to visualize and manage MCP server configurations in Claude Code. List servers, analyze usage statistics, and clean up unused servers
mcpman
The package manager for MCP servers — install, manage & monitor across Claude Desktop, Cursor, VS Code, Windsurf