
vllora

Debug your AI agents

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio vllora-vllora docker run -i vllora/vllora

How to use

vLLora is a lightweight, real-time debugging and observability tool for AI agents that integrates with MCP servers. It traces and analyzes interactions between your AI agents and their providers (such as OpenAI-compatible APIs) in real time, giving you visibility into prompts, responses, tool calls, and agent workflows. The MCP capability lets you connect external tooling and dashboards to the server via HTTP and Server-Sent Events, enabling seamless data streaming and integration with your existing debugging pipelines.

To start, run the MCP server (the recommended approach here is via Docker for portability). Once it is running, open the UI at the configured port to set provider API keys and inspect traces as they occur. You can test tracing by sending a chat completion request to the server’s HTTP endpoint and watching the live trace feed appear in the UI.
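To illustrate the test described above, here is a minimal sketch that sends a chat completion request to the local server. The endpoint path `/v1/chat/completions` and the model name are assumptions based on vLLora proxying an OpenAI-compatible API on port 9090; adjust them to match your setup.

```python
import json
import urllib.request

# Hypothetical endpoint: assumes vLLora exposes an OpenAI-compatible
# chat completions API on port 9090 (adjust path/port to your setup).
URL = "http://localhost:9090/v1/chat/completions"

# A minimal OpenAI-style chat completion payload; the model name is
# illustrative and should match a provider configured in the UI.
payload = {
    "model": "gpt-4o-mini",
    "messages": [
        {"role": "user", "content": "Say hello"},
    ],
}

def send_request(url=URL):
    """POST the request; return the parsed reply, or None if the server is unreachable."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return json.load(resp)
    except OSError:
        return None  # server not running yet; start the Docker container first

if __name__ == "__main__":
    print(send_request() or "vLLora not reachable on :9090")
```

If the server is running and a provider key is configured, the corresponding trace should appear in the live feed in the UI as the request completes.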

How to install

Prerequisites:

  • Docker installed on your machine (recommended for simplicity and portability).
  • Internet access to pull the vLLora image from Docker Hub.

Install and run:

  1. Pull and run the vLLora MCP server via Docker:

    docker run -d --name vllora-mcp -p 9090:9090 -p 9091:9091 vllora/vllora

  2. Verify the server is running:

    docker ps | grep vllora

  3. Open the UI at http://localhost:9091 and configure your provider API keys.

  4. Optional: If you prefer building locally, you can clone the repository and build the Rust project (requires Rust toolchain):

    git clone https://github.com/vllora/vllora.git
    cd vllora
    cargo build --release
    ./target/release/vllora

    Then access http://localhost:9090 (API) and http://localhost:9091 (UI).

Additional notes

Tips and considerations:

  • The UI on port 9091 is used to configure provider API keys and view traces. Make sure your OpenAI-compatible API keys are set there for tracing to begin.
  • By default, vLLora exposes HTTP endpoints for MCP-style interactions. You can integrate with other MCP-compatible tooling via HTTP/SSE streams.
  • If you run behind a firewall or reverse proxy, ensure that ports 9090 (API) and 9091 (UI) are accessible from your development environment.
  • The Docker image vllora/vllora may be updated over time; pin to a specific tag if version stability is required, e.g., docker run vllora/vllora:1.0.0.
  • Review the docs at the official site for updated MCP integration guidance and any platform-specific notes.
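Since the MCP capability streams data over HTTP/SSE, integrating other tooling mostly comes down to parsing the Server-Sent Events wire format. Below is a minimal, generic sketch of an SSE parser; the event names and fields in the sample are illustrative, not vLLora's actual trace schema.

```python
import json

def parse_sse(stream_lines):
    """Yield JSON-decoded `data:` payloads from an SSE line stream.

    Per the SSE format, consecutive `data:` lines belong to one event,
    and a blank line terminates the event.
    """
    buffer = []
    for line in stream_lines:
        line = line.rstrip("\n")
        if line.startswith("data:"):
            buffer.append(line[len("data:"):].strip())
        elif line == "" and buffer:
            yield json.loads("\n".join(buffer))
            buffer = []

# Example: two trace events as they might arrive on the wire
# (the field names here are made up for illustration).
sample = [
    'data: {"event": "trace.start", "span": "chat.completion"}\n',
    '\n',
    'data: {"event": "trace.end", "span": "chat.completion"}\n',
    '\n',
]
events = list(parse_sse(sample))
```

In practice you would feed `parse_sse` the line iterator of a streaming HTTP response to the server's SSE endpoint instead of the hard-coded sample.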
