mcp-servers-kagi
A Model Context Protocol server implementation for Kagi's API
claude mcp add --transport stdio ac3xx-mcp-servers-kagi node build/index.js \
  --env KAGI_API_KEY="your_kagi_api_key_here"
How to use
The kagi-server MCP server integrates the Kagi Search API into the MCP ecosystem. It exposes a single implemented tool, kagi_search, which accepts a query string and an optional limit, and returns search results from Kagi's API. Once connected to Claude Desktop, you can request web searches in natural language and the server will route the request through the tool to fetch results from Kagi. The server relies on a KAGI_API_KEY environment variable to authenticate with Kagi's API, so ensure you provide a valid key in your environment or in your client configuration when starting the server. Planned tools such as kagi_summarize, kagi_fastgpt, and kagi_enrich are not yet implemented, so only kagi_search will be available for now.
To use the tool, start the MCP server and connect it to Claude Desktop. When you ask Claude to look up information, it invokes kagi_search behind the scenes and returns a list of search results. For example, you can ask for "latest advancements in quantum computing" and optionally set a limit to control how many results come back.
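As a sketch of what kagi_search does under the hood, the snippet below builds the HTTP request the server would send to Kagi's Search API. The helper name buildKagiSearchRequest is hypothetical (it is not taken from this repository's source); the endpoint and the "Bot" authorization scheme follow Kagi's public API documentation.

```typescript
interface KagiSearchRequest {
  url: string;
  headers: Record<string, string>;
}

// Hypothetical helper: assemble the request kagi_search would make.
// Endpoint and auth header per Kagi's public API docs; verify against
// the actual server source before relying on this.
function buildKagiSearchRequest(
  query: string,
  limit?: number,
  apiKey: string = process.env.KAGI_API_KEY ?? ""
): KagiSearchRequest {
  const params = new URLSearchParams({ q: query });
  if (limit !== undefined) {
    params.set("limit", String(limit)); // optional result cap
  }
  return {
    url: `https://kagi.com/api/v0/search?${params.toString()}`,
    // Kagi's API authenticates with "Authorization: Bot <token>".
    headers: { Authorization: `Bot ${apiKey}` },
  };
}
```

The tool then fetches this URL and maps Kagi's JSON response into MCP tool results.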
How to install
Prerequisites:
- Node.js (LTS) and npm installed on your system
- Access to the repository containing the kagi-server MCP server
Step 1: Install dependencies
npm install
Step 2: Create environment configuration
- Create a .env file in the project root and add your Kagi API key:
KAGI_API_KEY=your_api_key_here
- Ensure .env is added to .gitignore to keep keys secure.
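Because the server cannot talk to Kagi without the key, it makes sense to fail fast at startup if KAGI_API_KEY is absent. A minimal sketch of such a check (the function name is illustrative; the actual server may validate differently):

```typescript
// Fail fast when KAGI_API_KEY is missing, instead of erroring on the
// first search request. Illustrative only.
function requireApiKey(
  env: Record<string, string | undefined> = process.env
): string {
  const key = env.KAGI_API_KEY;
  if (!key) {
    throw new Error("KAGI_API_KEY environment variable is required");
  }
  return key;
}
```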
Step 3: Build the server
npm run build
Step 4: Run in development or production
- Development (with auto-rebuild, if configured):
npm run watch
- Production-like run (no auto-rebuild):
node build/index.js
Step 5: Connect to Claude Desktop
- In Claude Desktop, configure the MCP server entry with the following (example):
{
  "mcpServers": {
    "kagi-server": {
      "command": "/path/to/kagi-server/build/index.js",
      "env": {
        "KAGI_API_KEY": "your_api_key_here"
      }
    }
  }
}
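Once connected, the client calls the tool with an MCP tools/call request. The shape below follows the MCP specification; the arguments object matches the query/limit inputs described above, and the query value is just an example:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "kagi_search",
    "arguments": {
      "query": "latest advancements in quantum computing",
      "limit": 5
    }
  }
}
```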
Optional: If you install via Smithery, follow the Smithery installation flow described in the repository README and then start the server as you would with a local Node.js MCP server.
Additional notes
Notes and tips:
- The MCP server requires KAGI_API_KEY to access Kagi's API. Do not expose this key in public repositories.
- The implemented tool is kagi_search. Other tools (kagi_summarize, kagi_fastgpt, kagi_enrich) are planned but not yet available.
- When debugging, MCP communicates over stdio. Using the MCP Inspector can help visualize requests and responses.
- If you customize the path to the built index (build/index.js), ensure your MCP config points to the correct path.
- Ensure environment variable provisioning is consistent between development and production deployments.
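For the stdio debugging mentioned above, the MCP Inspector can be launched against the built server; the command pattern below follows the Inspector's own documentation:

```shell
npx @modelcontextprotocol/inspector node build/index.js
```

This opens a local UI where you can list tools and issue test kagi_search calls while watching the raw requests and responses.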