
LLMTornado

The .NET library to build AI agents with 30+ built-in connectors.

Installation
Run the following command in your terminal to register the MCP server with Claude Code. Note that the `--env` options belong to `claude mcp add` and must come before the `--` that separates the server command:

claude mcp add --transport stdio lofcz-llmtornado \
  --env ASPNETCORE_URLS="http://0.0.0.0:5000" \
  --env LLMTORNADO_LOG_LEVEL="Information" \
  -- dotnet run --project src/LlmTornado.Mcp/LlmTornado.Mcp.csproj

How to use

The LlmTornado MCP server exposes a Model Context Protocol (MCP) interface so you can connect AI agents to data sources, tools, and workflows from a single control plane. With the server running, you can register providers, data connectors, and tool adapters that LlmTornado orchestrates, enabling agents to fetch knowledge, execute actions, and assemble multi-step workflows across many backends. The server is provider-agnostic: first-party and third-party providers plug in through strongly-typed adapters behind a consistent MCP surface.

To use the MCP capabilities, start the server and point your agent orchestration layer at the MCP endpoint. The server exposes providers, data sources, and tools, as well as workflow orchestration features that let agents coordinate complex tasks with Orchestrator graphs, Runners, and Advancers. Built-in connectors cover popular providers, vector databases, and multimodal inputs/outputs, and you can extend the MCP surface with new adapters without altering the core protocol. The tooling also supports exporting Mermaid diagrams, builder-pattern orchestration, and guardrails for safe agent execution, making it suitable for enterprise-grade automation and rapid prototyping alike.
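As an illustration, a generic MCP client configuration for a stdio connection might look like the following (the server name and project path mirror the installation command above; adjust the path to your checkout):

```json
{
  "mcpServers": {
    "lofcz-llmtornado": {
      "command": "dotnet",
      "args": ["run", "--project", "src/LlmTornado.Mcp/LlmTornado.Mcp.csproj"],
      "env": {
        "LLMTORNADO_LOG_LEVEL": "Information"
      }
    }
  }
}
```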

How to install

Prerequisites:

  • .NET SDK (latest LTS recommended) installed on your platform
  • Git installed
  • A compatible development environment (optional but helpful for local builds)

Steps:

  1. Clone the repository or install via your preferred source.
  2. Restore and build the MCP server package:
     • dotnet restore src/LlmTornado.Mcp/LlmTornado.Mcp.csproj
     • dotnet build src/LlmTornado.Mcp/LlmTornado.Mcp.csproj -c Release
  3. Run the MCP server locally:
     • dotnet run --project src/LlmTornado.Mcp/LlmTornado.Mcp.csproj
  4. Verify the server is listening.
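Once the server is started, a quick way to confirm something is listening on the configured port is a plain TCP probe. This is a minimal sketch assuming the default port 5000; it only checks that the socket is open, not that the MCP handshake succeeds:

```shell
# Probe the port the MCP server is expected to listen on (default 5000).
PORT="${PORT:-5000}"
if timeout 1 bash -c "echo > /dev/tcp/127.0.0.1/${PORT}" 2>/dev/null; then
  echo "listening on port ${PORT}"
else
  echo "nothing listening on port ${PORT}"
fi
```

If the probe fails, check the server logs and confirm the ASPNETCORE_URLS value matches the port you are probing.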

Optional: customize environment variables for deployment

  • ASPNETCORE_URLS: specify the listening URL (default http://0.0.0.0:5000)
  • LLMTORNADO_LOG_LEVEL: set log verbosity (Information, Debug, etc.)
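For example, to move the server to port 8080 with more verbose logging (the values shown are illustrative), export the variables before starting it:

```shell
# Override the defaults before starting the server.
export ASPNETCORE_URLS="http://0.0.0.0:8080"
export LLMTORNADO_LOG_LEVEL="Debug"
echo "URLs=${ASPNETCORE_URLS} LogLevel=${LLMTORNADO_LOG_LEVEL}"
```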

Deployment note: In production, consider containerizing the MCP server and exposing only the necessary ports to your infrastructure, using a reverse proxy or API gateway as appropriate.
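One possible containerization, sketched under assumptions (the .NET 8 base image tags and the published assembly name LlmTornado.Mcp.dll are illustrative; match them to your project):

```dockerfile
# Build stage: restore and publish the MCP server project.
FROM mcr.microsoft.com/dotnet/sdk:8.0 AS build
WORKDIR /src
COPY . .
RUN dotnet publish src/LlmTornado.Mcp/LlmTornado.Mcp.csproj -c Release -o /app

# Runtime stage: copy the published output into a smaller runtime image.
FROM mcr.microsoft.com/dotnet/aspnet:8.0
WORKDIR /app
COPY --from=build /app .
ENV ASPNETCORE_URLS=http://0.0.0.0:5000
EXPOSE 5000
ENTRYPOINT ["dotnet", "LlmTornado.Mcp.dll"]
```

A multi-stage build like this keeps the SDK out of the runtime image; put the reverse proxy or API gateway in front of the single exposed port.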

Additional notes

Tips and common considerations:

  • The MCP surface is designed to be extended with new providers and adapters; if a connector you need is missing, you can implement a strongly-typed adapter to expose its capabilities through MCP.
  • Use the guardrails and OpenTelemetry support to improve governance and observability in production deployments.
  • When integrating with vector databases or multimodal inputs, ensure proper data routing and transformation layers are configured to preserve data integrity across steps.
  • If you encounter port conflicts, override the default port using the ASPNETCORE_URLS environment variable at startup.
  • For CI/CD, pin the .NET SDK version to avoid breaking changes across builds.
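SDK pinning is commonly done with a global.json at the repository root (the version shown is an example; use the SDK your build targets):

```json
{
  "sdk": {
    "version": "8.0.100",
    "rollForward": "latestFeature"
  }
}
```

With rollForward set to latestFeature, local and CI builds stay on the same feature band while still picking up patch releases.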
