toolkit
A Model Context Protocol server providing LLM Agents with system utilities and tools, including IP geolocation, network diagnostics, system monitoring, cryptographic operations, and QR code generation.
claude mcp add --transport stdio --env NODE_ENV="production" cyanheads-toolkit-mcp-server \
  node node_modules/@cyanheads/toolkit-mcp-server/build/index.js
How to use
The toolkit MCP server provides a collection of utility tools for LLM agents, including network operations (geolocation, connectivity checks, and basic network diagnostics), system information access, security utilities (hash generation and UUIDs), and a QR code generator. Clients access these capabilities via the MCP protocol by specifying the toolkit server name and the desired tool (e.g., geolocate, checkConnectivity, getSystemInfo, generateUUID, generateQRCode). The server follows the standard MCP contract: agents send requests that specify a tool name and parameters, and receive structured results.
To use a tool, call mcp.use('toolkit', '<toolName>', <params>). For example, geolocate returns IP-based location data for a given IP, checkConnectivity tests reachability to a host/port, getSystemInfo fetches OS and hardware details, and generateQRCode produces a QR code in various formats (SVG, terminal, or Base64). Typical usage patterns include:
- Geolocation: mcp.use('toolkit', 'geolocate', { query: '8.8.8.8' })
- Connectivity: mcp.use('toolkit', 'checkConnectivity', { host: 'example.com', port: 443 })
- System: mcp.use('toolkit', 'getSystemInfo', {})
- Security: mcp.use('toolkit', 'hashData', { input: 'data', algorithm: 'sha256' }) and mcp.use('toolkit', 'generateUUID', {})
- QR Code: mcp.use('toolkit', 'generateQRCode', { data: 'https://example.com', type: 'svg' })
If you’re integrating into an application, ensure your MCP client is aligned with the MCP protocol version (1.4.0 in this project), and apply appropriate error handling and timeouts as defined by your orchestration layer.
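The mcp.use shorthand above maps onto the standard MCP "tools/call" JSON-RPC request. The sketch below illustrates that wire shape; the buildToolCall helper is purely illustrative and not part of this server's API.

```typescript
// Illustrative sketch of the JSON-RPC message an MCP client sends
// when an agent invokes a tool. buildToolCall is a hypothetical
// helper, not part of the toolkit server or the MCP SDK.
type ToolCallRequest = {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
};

function buildToolCall(
  id: number,
  tool: string,
  args: Record<string, unknown>
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "tools/call",
    params: { name: tool, arguments: args },
  };
}

// The geolocation example from the list above, as it would go over the wire.
const req = buildToolCall(1, "geolocate", { query: "8.8.8.8" });
console.log(JSON.stringify(req));
```

In practice an MCP client library handles this framing for you; the point is that each tool call is just a named method plus a parameter object.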
How to install
Prerequisites:
- Node.js (v14+) and npm installed on your system
- A terminal with Git (for installing from source) or npm registry access
Install from npm (recommended):
npm install @cyanheads/toolkit-mcp-server
Or install from source:
git clone https://github.com/cyanheads/toolkit-mcp-server.git
cd toolkit-mcp-server
npm install
npm run build
Configure your MCP client to launch the server under the name toolkit and invoke tools by the identifiers shown in the examples above. Ensure environment variables (like NODE_ENV) are set as needed for your deployment.
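Many MCP clients read a JSON configuration mapping server names to launch commands. A typical entry might look like the following; the exact file name and schema depend on your client, and the path below assumes the npm install location from above.

```json
{
  "mcpServers": {
    "toolkit": {
      "command": "node",
      "args": ["node_modules/@cyanheads/toolkit-mcp-server/build/index.js"],
      "env": { "NODE_ENV": "production" }
    }
  }
}
```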
Additional notes
Notes and tips:
- The server exposes multiple utilities; be mindful of rate limiting and access control if you expose it to external agents.
- The QR code generation supports terminal output, SVG, and Base64 formats; specify type accordingly when calling generateQRCode.
- For production deployments, consider configuring NODE_ENV and additional security measures (auth, IP allowlists) depending on your environment.
- If you encounter build or runtime issues, ensure the build output path (build/index.js) matches the configured entry point in your MCP client settings.
- The project adheres to MCP 1.4.0; verify compatibility with your MCP client version when upgrading.
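For sanity-checking the security utilities, it can help to compare against Node's built-in crypto module, which the server's hashData and generateUUID tools presumably wrap (an assumption; the actual implementation may differ).

```typescript
// Local equivalents of the hashData and generateUUID examples above,
// using Node's built-in crypto module. Assumes the server produces
// standard sha256 hex digests and RFC 4122 UUIDs.
import { createHash, randomUUID } from "node:crypto";

const digest = createHash("sha256").update("data").digest("hex");
console.log(digest); // 64 lowercase hex characters for sha256

const id = randomUUID(); // same shape as generateUUID's output
console.log(id);
```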
Related MCP Servers
obsidian
Obsidian Knowledge-Management MCP (Model Context Protocol) server that enables AI agents and development tools to interact with an Obsidian vault. It provides a comprehensive suite of tools for reading, writing, searching, and managing notes, tags, and frontmatter, acting as a bridge to the Obsidian Local REST API plugin.
mcp-langfuse
Model Context Protocol (MCP) Server for Langfuse Prompt Management. This server allows you to access and manage your Langfuse prompts through the Model Context Protocol.
mcp-svelte-docs
🔍 MCP server that lets you search and access Svelte documentation with built-in caching
neurolink
Universal AI Development Platform with MCP server integration, multi-provider support, and a professional CLI. Build, test, and deploy AI applications with multiple AI providers.
mentor
A Model Context Protocol server providing LLM Agents a second opinion via AI-powered Deepseek-Reasoning R1 mentorship capabilities, including code review, design critique, writing feedback, and idea brainstorming through the Deepseek API.
codemesh
The Self-Improving MCP Server - Agents write code to orchestrate multiple MCP servers with intelligent TypeScript execution and auto-augmentation