
serverless-rag

MCP server from sionic-ai/serverless-rag-mcp-server

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio --env STORM_API_KEY="Your Storm API key" \
  sionic-ai-serverless-rag-mcp-server -- sh /Users/sigridjineth/Desktop/work/storm-mcp-server/scripts/run.sh

How to use

This MCP server implements the Model Context Protocol (MCP) to bridge an LLM application with Sionic AI's Storm Platform for retrieval-augmented generation (RAG) workflows. It hosts resources and tools that the host LLM can call over MCP, providing standardized context sharing, tool invocation, file management, and integration with Storm API endpoints.

The server-defined tools include listing agents and buckets, sending non-streaming chat messages, uploading documents, and other file operations, so connecting LLM-driven prompts to your RAG data sources and tooling stack is straightforward.

To use it, configure Claude Desktop (or another MCP client) to connect to this Storm MCP server by adding an entry under mcpServers with the provided command, arguments, and environment (notably the API key). Once connected, you can invoke the defined tools from your LLM context to query vectors, manage files, or call Storm APIs as part of your RAG pipelines.
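As a sketch, the mcpServers entry described above might look like the following in a Claude Desktop config file (the script path is the one used throughout this guide, and the key value is a placeholder; adjust both to your own checkout and credentials):

```json
{
  "mcpServers": {
    "storm": {
      "command": "sh",
      "args": ["/Users/sigridjineth/Desktop/work/storm-mcp-server/scripts/run.sh"],
      "env": {
        "STORM_API_KEY": "your-api-key-here"
      }
    }
  }
}
```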

How to install

Prerequisites:

  • Python 3.8+ (or a compatible Python runtime) and a functioning environment for running the MCP server components.
  • Access/credentials for Storm API (if required by your setup).
  • Optional: Claude Desktop or another MCP-compatible client for testing the integration.
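A quick way to sanity-check the Python prerequisite above (this assumes python3 is on your PATH):

```shell
# Verify the interpreter meets the 3.8+ requirement before installing anything.
python3 -c 'import sys; assert sys.version_info >= (3, 8), sys.version; print("python ok:", sys.version.split()[0])'
```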

Installation steps:

  1. Clone or download the Storm MCP server repository to your environment.
  2. Install dependencies (example with Python). If there is a requirements.txt or poetry.lock in the project, install accordingly:
    • python3 -m pip install -r requirements.txt
    • or if using Poetry: poetry install
  3. Ensure environment variables are configured. Set the Storm API key or other necessary credentials in your environment or via the mcp_config env section. For example: export STORM_API_KEY="your-api-key-here"
  4. Verify the server starts using the provided script: sh /Users/sigridjineth/Desktop/work/storm-mcp-server/scripts/run.sh (when used over stdio, the MCP client launches this script itself, so this step is only a manual check.)
  5. In your MCP client (e.g., Claude Desktop), add the Storm MCP server configuration:

     {
       "mcpServers": {
         "storm": {
           "command": "sh",
           "args": ["/Users/sigridjineth/Desktop/work/storm-mcp-server/scripts/run.sh"]
         }
       }
     }
  6. Test basic functionality by listing available tools (send_nonstream_chat, list_agents, list_buckets, upload_document_by_file) and performing a simple chat or file operation to ensure the server responds as expected.
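Steps 3–4 above can be wrapped in a small guard so the server never starts without its key. This is a sketch; check_storm_env is a hypothetical helper, not part of the repository:

```shell
# Return success only when STORM_API_KEY is set to a non-empty value.
check_storm_env() {
  [ -n "${STORM_API_KEY:-}" ]
}

if check_storm_env; then
  echo "STORM_API_KEY is set; safe to start the server"
  # sh /Users/sigridjineth/Desktop/work/storm-mcp-server/scripts/run.sh
else
  echo "STORM_API_KEY is missing; export it first" >&2
fi
```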

Additional notes

Tips and considerations:

  • Ensure the path to run.sh in mcp_config matches your actual deployment environment; adjust the absolute path if you relocate the repository.
  • Keep your Storm API key secure; do not expose it in client-side configurations.
  • If the MCP server cannot connect to Storm services, verify network access, API endpoints, and any required CORS or firewall rules.
  • The server exposes tools for file management and data retrieval; use proper indexing and access controls to prevent unintended data exposure.
  • Review logs emitted by scripts/run.sh to troubleshoot startup or runtime issues; ensure dependencies are installed and environment variables are correctly set.
  • If migrating to Docker, uvx, or npm equivalents, adapt the mcp_config accordingly and ensure the entrypoint script remains compatible with the host client.
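For example, a Docker-based mcp_config entry might look like this sketch (the image name storm-mcp-server is hypothetical; build and tag your own image from the repository first):

```json
{
  "mcpServers": {
    "storm": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "STORM_API_KEY", "storm-mcp-server"],
      "env": {
        "STORM_API_KEY": "your-api-key-here"
      }
    }
  }
}
```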
