
mcp

An MCP server built with Python, FastMCP, Ollama, Open-webUI, FastAPI, and Docker, following a microservice architecture.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

    claude mcp add --transport stdio rainer85ah-mcp-server -- docker run -i rainer85ah/mcp-server:latest

How to use

This MCP server is a Python-based FastMCP implementation designed to manage and expose AI/LLM capabilities via a REST API, with integration points for Ollama and Open-webUI. The project emphasizes rapid prototyping and production readiness, offering a pluggable architecture for adding models and routes, along with a documented FastAPI backend. When running with Docker Compose (the recommended approach), the server provides an API surface at the MCP endpoint and OpenAPI docs, an Ollama-based LLM execution layer, and a web UI for chat-style interactions through Open-webUI.

To use it, start the container (via Docker Compose as recommended in the repository) and then interact with the exposed endpoints. The MCP service is available at the path /service/mcp/ on the FastAPI server, Ollama is exposed at its standard port, and the OpenAPI docs are accessible under the API docs route. The Open-webUI interface allows chat-style interactions with integrated models, while the REST API enables programmatic model routing and management. This setup is ideal for AI chat platforms, model routing gateways, and developer LLM sandboxes that require a FastAPI-based ML backend with a ready-made MCP surface.
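The endpoints described above can be probed from the command line. A minimal sketch, assuming the default local ports (FastAPI/MCP on 8000, Ollama on 11434) and the /service/mcp/ and /docs paths mentioned above:

```shell
# Assumed defaults from the description above: FastAPI/MCP on 8000,
# Ollama on 11434. Adjust if you have remapped ports.
BASE=http://localhost:8000
MCP_URL="$BASE/service/mcp/"
DOCS_URL="$BASE/docs"
OLLAMA_URL=http://localhost:11434/api/tags

echo "MCP endpoint: $MCP_URL"

# Probe each service if curl is available; a failure just means it is not up yet.
if command -v curl >/dev/null 2>&1; then
  curl -fsS --max-time 2 "$DOCS_URL" >/dev/null 2>&1 && echo "FastAPI: up" || echo "FastAPI: not reachable"
  curl -fsS --max-time 2 "$OLLAMA_URL" >/dev/null 2>&1 && echo "Ollama: up" || echo "Ollama: not reachable"
fi
```

Ollama's /api/tags endpoint lists the models that have been pulled, which makes it a convenient health probe for the LLM layer.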

How to install

Prerequisites:

  • Docker and Docker Compose installed on your machine
  • Basic familiarity with containerized apps and FastAPI concepts

Step-by-step installation:

  1. Clone the repository

    git clone https://github.com/rainer85ah/mcp-server.git
    cd mcp-server

  2. Install and run with Docker Compose (recommended)

    • Ensure Docker is running
    • Start the services in detached mode using Docker Compose

    docker compose up --build -d

  3. Verify the services are up (for example, check container status with docker compose ps, then open the FastAPI docs and Open-webUI in a browser)

  4. Optional: expose or customize ports and environment variables in a docker-compose.override.yml or .env file as needed for your environment
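For step 4, a docker-compose.override.yml can remap ports or inject environment variables without editing the main compose file. A minimal sketch; the service name mcp-server and the variable name below are assumptions, so match them to the repository's compose file:

```yaml
# docker-compose.override.yml (hedged sketch; align names with the repo's compose file)
services:
  mcp-server:
    ports:
      - "8080:8000"         # remap the FastAPI/MCP port if 8000 is busy on the host
    environment:
      - OLLAMA_HOST=ollama  # assumed variable name for pointing at the Ollama service
```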

If you prefer running the server directly without Compose, you can use the provided Docker image with a simple docker run command, but using Docker Compose is the recommended approach in this project.
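A hedged sketch of that Compose-free path, assuming the container's FastAPI server listens on port 8000 (the flags here are illustrative, not taken from the repository):

```shell
# Pull and run the published image on its own; -p maps the assumed FastAPI port.
docker run --rm -p 8000:8000 rainer85ah/mcp-server:latest
```

Without Compose you lose the bundled Ollama and Open-webUI services, so this is mainly useful for testing the MCP/REST surface in isolation.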

Additional notes

Tips and common issues:

  • Ports: Ensure ports 11434 (Ollama), 8000 (FastAPI/MCP), and 3000 (Open-webUI) are available on your host or adjust in your docker-compose configuration.
  • Environment variables: You may need to configure Ollama host/port or Open-webUI URLs if you customize the setup; see your deployment environment for specifics.
  • OpenAPI docs will reflect configured endpoints; use the /docs path for interactive API exploration.
  • Docker-based deployment is the primary recommended path for reproducibility and ease of setup. If you modify models or routes, keep the FastAPI app compatible with MCP routing conventions.
  • This boilerplate emphasizes pluggability: you can add new model backends or routes without changing core MCP logic.
  • If you encounter API authentication needs, plan to extend the environment or middleware accordingly; the base setup focuses on accessibility and rapid prototyping.
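The ports tip above can be checked with a small script before starting the stack; lsof is an assumption about the host (substitute ss or netstat if you prefer):

```shell
# Report whether each default port (FastAPI 8000, Ollama 11434, Open-webUI 3000)
# is already bound on this host before starting the stack.
for port in 8000 11434 3000; do
  if command -v lsof >/dev/null 2>&1 && lsof -i ":$port" >/dev/null 2>&1; then
    echo "port $port: in use"
  else
    echo "port $port: free (or lsof unavailable)"
  fi
done
```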
