mcp
An MCP server built with Python, FastMCP, Ollama, Open-webUI, FastAPI, and Docker, following a microservice architecture.
claude mcp add --transport stdio rainer85ah-mcp-server -- docker run -i rainer85ah/mcp-server:latest
How to use
This MCP server is a Python-based FastMCP implementation designed to manage and expose AI/LLM capabilities via a REST API, with integration points for Ollama and Open-webUI. The project emphasizes rapid prototyping and production readiness, offering a pluggable architecture for adding models and routes and a documented FastAPI backend. When running with Docker Compose (the recommended approach), the server provides an API surface at the MCP endpoint and OpenAPI docs, along with an Ollama-based LLM execution layer and a web UI for chat-style interactions through Open-webUI.
To use it, start the container (via Docker Compose as recommended in the repository) and then interact with the exposed endpoints. The MCP service is available at the path /service/mcp/ on the FastAPI server, Ollama is exposed at its standard port, and the OpenAPI docs are accessible under the API docs route. The Open-webUI interface allows chat-style interactions with integrated models, while the REST API enables programmatic model routing and management. This setup is ideal for AI chat platforms, model routing gateways, and developer LLM sandboxes that require a FastAPI-based ML backend with a ready-made MCP surface.
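As a sketch of what programmatic access might look like, assuming the MCP endpoint speaks JSON-RPC over HTTP at the default path above (the client name, protocol version string, and `send` helper here are illustrative, not taken from the repository):

```python
import json
import urllib.request

MCP_URL = "http://localhost:8000/service/mcp/"  # default endpoint from the compose setup

def build_initialize_request(request_id: int = 1) -> dict:
    """Build a minimal MCP 'initialize' JSON-RPC request per the MCP spec."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",
            "capabilities": {},
            "clientInfo": {"name": "example-client", "version": "0.1"},
        },
    }

def send(payload: dict) -> dict:
    """POST the payload to the MCP endpoint; requires the stack to be running."""
    req = urllib.request.Request(
        MCP_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json, text/event-stream",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Print the request we would send; call send(...) with the stack running.
    print(json.dumps(build_initialize_request(), indent=2))
```

In practice most users will let an MCP-aware client (such as Claude via the `claude mcp add` command above) handle this handshake rather than issuing raw JSON-RPC.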
How to install
Prerequisites:
- Docker and Docker Compose installed on your machine
- Basic familiarity with containerized apps and FastAPI concepts
Step-by-step installation:
1. Clone the repository:
   git clone https://github.com/rainer85ah/mcp-server.git
   cd mcp-server
2. Install and run with Docker Compose (recommended):
   - Ensure Docker is running
   - Start the services in detached mode:
     docker compose up --build -d
3. Verify the services are up:
   - Ollama: http://localhost:11434
   - API docs: http://localhost:8000/docs
   - OpenAPI schema: http://localhost:8000/openapi.json
   - MCP server endpoint: http://localhost:8000/service/mcp/
   - Open-webUI: http://localhost:3000
4. Optional: expose or customize ports and environment variables in a docker-compose.override.yml or .env file as needed for your environment.
If you prefer running the server directly without Compose, you can use the provided Docker image with a simple docker run command, but using Docker Compose is the recommended approach in this project.
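The verification step can also be scripted. The following is a minimal sketch, assuming the default ports and paths listed above (adjust the URLs if you override them in your compose configuration):

```python
import urllib.error
import urllib.request

# Default endpoints from the compose setup; adjust if you override ports.
ENDPOINTS = {
    "Ollama": "http://localhost:11434",
    "API docs": "http://localhost:8000/docs",
    "OpenAPI": "http://localhost:8000/openapi.json",
    "MCP server": "http://localhost:8000/service/mcp/",
    "Open-webUI": "http://localhost:3000",
}

def check(url: str, timeout: float = 3.0) -> bool:
    """Return True if the URL answers at all (i.e. the service is listening)."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True   # an HTTP error status still means the service answered
    except OSError:
        return False  # connection refused or timeout: service not up

if __name__ == "__main__":
    for name, url in ENDPOINTS.items():
        print(f"{name:12} {'up' if check(url) else 'DOWN'}  ({url})")
```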
Additional notes
Tips and common issues:
- Ports: Ensure ports 11434 (Ollama), 8000 (FastAPI/MCP), and 3000 (Open-webUI) are available on your host or adjust in your docker-compose configuration.
- Environment variables: You may need to configure Ollama host/port or Open-webUI URLs if you customize the setup; see your deployment environment for specifics.
- OpenAPI docs will reflect configured endpoints; use the /docs path for interactive API exploration.
- Docker-based deployment is the primary recommended path for reproducibility and ease of setup. If you modify models or routes, keep the FastAPI app compatible with MCP routing conventions.
- This boilerplate emphasizes pluggability: you can add new model backends or routes without changing core MCP logic.
- If you encounter API authentication needs, plan to extend the environment or middleware accordingly; the base setup focuses on accessibility and rapid prototyping.
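If you do bolt on authentication, the core check can be small. The sketch below illustrates a bearer-token comparison suitable for a FastAPI dependency or middleware; the MCP_API_TOKEN variable name is an assumption for illustration, not part of the base project:

```python
import hmac
import os

# Illustrative: read the expected token from the environment so it can be
# injected via docker-compose. MCP_API_TOKEN is an assumed variable name.
EXPECTED_TOKEN = os.environ.get("MCP_API_TOKEN", "change-me")

def is_authorized(headers: dict) -> bool:
    """Constant-time check of a bearer token in the Authorization header."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return False
    token = auth[len("Bearer "):]
    # hmac.compare_digest avoids leaking token length/content via timing.
    return hmac.compare_digest(token, EXPECTED_TOKEN)
```

In a FastAPI app this would typically live in a dependency that raises a 401 for unauthorized requests, keeping the MCP routes themselves unchanged.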
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock data
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP servers. Written using FastMCP.