
litellm-agent

MCP server giving AI agents access to 100+ LLMs through LiteLLM

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio berriai-litellm-agent-mcp \
  --env OPENAI_API_KEY="sk-..." \
  --env ANTHROPIC_API_KEY="sk-..." \
  -- python -m litellm_agent_mcp

How to use

This MCP server exposes LiteLLM's multi-model API to your AI agents, enabling them to call and compare outputs across 100+ models through LiteLLM's unified interface. Exposed tools include:

  • call: OpenAI chat completions format for calling any model
  • responses: OpenAI Responses API format for stateful interactions and structured tool output
  • messages: Anthropic Claude format
  • generate_content: Google Gemini format
  • utilities to compare model outputs, list available models, and get model recommendations

To use it, configure the MCP server in your environment and run it with Python. Your agents can then route tasks through LiteLLM, selecting the appropriate tool and model for each task (e.g., code tasks to a code-optimized model, writing tasks to a strong general-purpose model, long-document tasks to a long-context option). The included utilities let agents inspect model strengths, compare outputs side by side, and request recommendations based on task type.
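As an illustration, an agent-side dispatcher might map task types to the exposed tool names. This is a minimal sketch: the tool names come from the list above, but the routing policy and the `pick_tool` helper are hypothetical, not part of the server.

```python
# Hypothetical task-to-tool routing for the litellm-agent MCP tools.
# Tool names (call, responses, messages, generate_content) are the ones
# the server exposes; the mapping itself is an example policy only.

TASK_TOOL_MAP = {
    "chat": "call",               # OpenAI chat completions format
    "stateful": "responses",      # OpenAI Responses API format
    "claude": "messages",         # Anthropic Claude format
    "gemini": "generate_content", # Google Gemini format
}

def pick_tool(task_type: str) -> str:
    """Return the MCP tool name for a task type, defaulting to `call`."""
    return TASK_TOOL_MAP.get(task_type, "call")
```

An agent would then invoke the chosen tool over MCP with the request body in that tool's native format.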

How to install

Prerequisites:

  • Python 3.8+ and pip
  • Git (optional if installing from source)

Option A: Install from PyPI

  1. Ensure Python and pip are installed and in your PATH.

  2. Install the MCP package:

    pip install litellm-agent-mcp

  3. Run the MCP server (example):

    python -m litellm_agent_mcp

Option B: Install from Source

  1. Clone the repository:

git clone https://github.com/BerriAI/litellm-agent-mcp
    cd litellm-agent-mcp

  2. Install in editable mode:

    pip install -e .

  3. Run the MCP server:

    python -m litellm_agent_mcp

Option C: Configuration snippet (example to include in your MCP config)

{
  "mcpServers": {
    "litellm": {
      "command": "python",
      "args": ["-m", "litellm_agent_mcp"],
      "env": {
        "OPENAI_API_KEY": "sk-...",
        "ANTHROPIC_API_KEY": "sk-..."
      }
    }
  }
}
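Before wiring the snippet into your MCP orchestration, it can be worth sanity-checking it, since a leftover `"sk-..."` placeholder is an easy mistake. The following is a small sketch; `validate_mcp_server` is a hypothetical helper, not part of the package.

```python
import json

# The config snippet from above, with placeholder keys left in on purpose.
SAMPLE = """
{
  "mcpServers": {
    "litellm": {
      "command": "python",
      "args": ["-m", "litellm_agent_mcp"],
      "env": {"OPENAI_API_KEY": "sk-...", "ANTHROPIC_API_KEY": "sk-..."}
    }
  }
}
"""

def validate_mcp_server(cfg, name):
    """Return a list of problems found in one mcpServers entry (empty if OK)."""
    server = cfg.get("mcpServers", {}).get(name)
    if server is None:
        return ["no '%s' entry under mcpServers" % name]
    problems = []
    if not server.get("command"):
        problems.append("missing 'command'")
    if not isinstance(server.get("args"), list):
        problems.append("'args' should be a list")
    for key, value in server.get("env", {}).items():
        if not value or value.endswith("..."):
            problems.append("env var %s looks like a placeholder" % key)
    return problems
```

Running it against the sample flags both API keys as placeholders, which is exactly what you want to catch before deploying.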

Additional notes

Tips and common considerations:

  • Set API keys for the providers you intend to use (OpenAI, Anthropic, Google Gemini, Mistral, etc.).
  • You can route lighter tasks to cheaper models by leveraging the compare and recommend tools to optimize cost/performance.
  • If you use a LiteLLM proxy, set LITELLM_API_BASE and LITELLM_API_KEY instead of provider keys.
  • Be mindful of rate limits and model availability across providers; use the model-listing and comparison tools to choose appropriate models for each task.
  • The MCP server can be configured via the provided mcp config snippet to fit into your existing MCP orchestration.
  • For production deployments, consider setting additional environment controls (e.g., MODEL_TIME_LIMITS, TRACE/DEBUG flags) and securing API keys with proper secret management.
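The first and third tips above can be checked at startup. Below is a minimal sketch, assuming the environment variable names listed (LiteLLM's conventional provider keys plus the proxy pair from the tips); `configured_providers` is a hypothetical helper, not part of the package.

```python
import os

# Conventional LiteLLM provider key names (assumed; add others as needed).
PROVIDER_KEYS = ["OPENAI_API_KEY", "ANTHROPIC_API_KEY",
                 "GEMINI_API_KEY", "MISTRAL_API_KEY"]
# Per the tips, these replace provider keys when routing via a LiteLLM proxy.
PROXY_KEYS = ["LITELLM_API_BASE", "LITELLM_API_KEY"]

def configured_providers(env=None):
    """Return which credentials are set; proxy keys take precedence."""
    if env is None:
        env = os.environ
    if all(k in env for k in PROXY_KEYS):
        return ["litellm-proxy"]
    return [k for k in PROVIDER_KEYS if k in env]
```

Calling this once at startup and logging the result makes a missing-key misconfiguration obvious before the first model call fails.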
