
akyn-sdk

Turn any data source into an MCP server in 5 minutes. Build AI-agent-ready knowledge bases.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio ilyestal-akyn-sdk \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env AKYN_API_BASE_URL="https://api.openai.com/v1" \
  -- npx akyn-ai --name ilyestal-akyn-sdk --dir ./docs

How to use

The akyn-ai MCP server turns any document collection, website, or text corpus into a queryable knowledge base that an AI assistant can reference via the Model Context Protocol. You can index files, directories, URLs, and raw text, then expose them as an MCP endpoint that clients such as Cursor or Claude can query directly.

The server supports multiple transports, including stdio for desktop workflows and HTTP for web clients, so it can run locally or in the cloud. Embeddings, chunking, and the vector store (in-memory or Qdrant) are all configurable, letting you tune performance and cost to your data and use case. To connect a client, configure an MCP channel (for example in Cursor or Claude Desktop) that points at this server; the assistant can then retrieve relevant chunks and reason over them during conversations.
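For intuition, the chunking step mentioned above can be sketched as follows. This is illustrative only: it is not akyn-ai's actual implementation, and the function name and parameters are hypothetical.

```python
def chunk_text(text, chunk_size=200, overlap=50):
    """Split text into fixed-size chunks with overlap, so context that
    spans a chunk boundary is still retrievable from at least one chunk."""
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        # Stop once this chunk reaches the end of the text.
        if start + chunk_size >= len(text):
            break
    return chunks
```

Smaller chunks give more precise retrieval at the cost of more embedding calls; the overlap keeps sentences that straddle a boundary intact in at least one chunk.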

How to install

Prerequisites:

  • Node.js (LTS version) installed on your machine
  • npm or pnpm to install packages

Step-by-step installation:

  1. Ensure Node.js is installed. Check with:
       node -v
       npm -v
  2. Install the akyn-ai package globally or locally if needed. The Quick Start uses npx to run the CLI without installing globally:
       npm install -g akyn-ai   # optional; you can also rely on npx directly
  3. Run the MCP server with your data sources. For example:
       npx akyn-ai --name "ilyestal-akyn-sdk" --dir ./docs
     This indexes the contents of ./docs and starts an MCP server accessible to MCP clients.
  4. If you need to customize the run (e.g., using your own embeddings or a different vector store), refer to the library options and environment variables described in the docs.

Note: If you are wiring this into an MCP config file (as recommended for clients like Cursor), ensure the command and arguments match your environment (e.g., npx akyn-ai --name ...).
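For reference, a config-file entry matching the Quick Start command might look like the following. The field names follow the common `mcpServers` shape used by clients such as Cursor and Claude Desktop; check your client's documentation for its exact format.

```json
{
  "mcpServers": {
    "ilyestal-akyn-sdk": {
      "command": "npx",
      "args": ["akyn-ai", "--name", "ilyestal-akyn-sdk", "--dir", "./docs"],
      "env": {
        "OPENAI_API_KEY": "your-openai-api-key",
        "AKYN_API_BASE_URL": "https://api.openai.com/v1"
      }
    }
  }
}
```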

Additional notes

Tips and common issues:

  • Environment variables: OpenAI API keys or other provider keys should be supplied through the env block when embedding or retrieval requires authentication.
  • If you update content sources (files/directories/URLs), re-run indexing; depending on your setup, the server may also refresh automatically.
  • For production, consider using a persistent vector store (e.g., Qdrant) and a stable hosting environment for the MCP server to ensure fast responses.
  • When using HTTP transport, ensure your firewall or hosting platform allows inbound requests on the port you choose (default 3000 in examples).
  • If the server fails to start, check Node.js version compatibility and ensure that the target directory exists and contains accessible content.
  • The CLI supports various options; review the package docs for additional flags like --port, --config, or --http to adapt to your deployment scenario.
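To make the in-memory vs. persistent trade-off in the tips above concrete, here is a minimal sketch of an in-memory vector store with cosine-similarity search. It is illustrative only; the class and method names are hypothetical and not part of akyn-ai. Persistent stores such as Qdrant add durability, payload filtering, and approximate-nearest-neighbor indexes on top of this basic idea.

```python
import math

class InMemoryVectorStore:
    """Minimal in-memory vector store: holds (id, vector, payload) triples
    and returns the top-k entries ranked by cosine similarity."""

    def __init__(self):
        self._items = []

    def add(self, item_id, vector, payload):
        self._items.append((item_id, vector, payload))

    @staticmethod
    def _cosine(a, b):
        # Cosine similarity: dot product over the product of norms.
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return dot / (na * nb) if na and nb else 0.0

    def search(self, query, k=3):
        scored = [(self._cosine(query, v), i, p) for i, v, p in self._items]
        scored.sort(key=lambda t: t[0], reverse=True)
        return scored[:k]
```

Everything lives in a Python list, so all indexed data is lost when the process exits — which is exactly why a persistent store is recommended for production.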
