
Little_MCP

A simple yet powerful local AI assistant that runs entirely on your machine. Built for learning and experimentation, Little MCP combines open-source LLMs with retrieval-augmented generation (RAG) over your personal documents. Included tools: real-time weather, a calculator, RAG over local documents, and a local SQL database.
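The calculator tool, for example, must evaluate arithmetic coming from the model without handing it to raw `eval`. The repository's actual implementation is not shown here; the sketch below (the function name `calculate` is an assumption) shows one safe way such a tool could work, using Python's `ast` module to allow only basic arithmetic.

```python
import ast
import operator

# Map AST operator nodes to plain functions; anything else is rejected.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def calculate(expression: str) -> float:
    """Safely evaluate a basic arithmetic expression (hypothetical tool body)."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError("unsupported expression")
    return _eval(ast.parse(expression, mode="eval"))

print(calculate("2 + 3 * 4"))  # 14
```

Because only whitelisted node types are walked, expressions like `__import__('os')` raise `ValueError` instead of executing.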

Installation
Run the following command in your terminal to register the MCP server with Claude Code:
claude mcp add --transport stdio ricard1406-little_mcp \
  --env PYPATH="path_to_your_python_env_if_needed" \
  --env OLLAMA_HOST="http://localhost:11434" \
  --env OPENWEATHER_API_KEY="your-openweather-api-key" \
  -- python mcp_server.py

How to use

Little MCP is a local, privacy-focused AI assistant that runs entirely on your machine. It combines a FastAPI MCP server with a LangChain-based client to form a multi-tool agent capable of RAG-based document QA, weather and time queries, arithmetic calculations, and more. Start the server with the included Python script, then run the client to begin interacting with your documents and local tools. A dual-mode switch lets you optionally display the model's thinking process, and conversational memory is maintained across the session to provide context-aware responses.

How to install

Prerequisites:

  • Python 3.8+
  • Ollama installed and running locally
  • OpenWeather API key (free tier)
  1. Clone or download the Little MCP repository:
     git clone https://github.com/ricard1406-little_mcp.git
     cd Little_MCP

  2. Create and activate a Python virtual environment (optional but recommended):
     python -m venv .venv
     source .venv/bin/activate   # macOS/Linux
     .venv\Scripts\activate      # Windows

  3. Install Python dependencies:
     pip install -r requirements.txt

  4. Prepare environment variables (optional but recommended). Create a .env file or export the variables in your shell:
     export OPENWEATHER_API_KEY=your_openweather_key
     export OLLAMA_HOST=http://localhost:11434

  5. Pull the required Ollama models:
     ollama pull qwen3:4b
     ollama pull nomic-embed-text

  6. Run the MCP server:
     python mcp_server.py

  7. In another terminal, run the MCP client:
     python little_mcp.py

Notes:

  • Ensure Ollama is running and models are downloaded before starting the server.
  • Ensure your OpenWeather API key is valid and configured if you plan to use the weather tool.
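A quick preflight script can catch both of the notes above before you start the server. The sketch below is illustrative (the helper names are assumptions; `/api/tags` is Ollama's standard model-listing endpoint), with the pure parsing logic kept separate so it works without a live Ollama.

```python
import json
import os
import urllib.request

def missing_models(tags_json: dict, required: list) -> list:
    """Return required model names absent from an Ollama /api/tags response."""
    installed = [m["name"] for m in tags_json.get("models", [])]
    def have(req):
        # Ollama may report "name:latest", so match the bare name too.
        return any(n == req or n.startswith(req + ":") for n in installed)
    return [r for r in required if not have(r)]

def preflight(host: str = "http://localhost:11434") -> list:
    """Collect human-readable problems; an empty list means good to go."""
    problems = []
    if not os.environ.get("OPENWEATHER_API_KEY"):
        problems.append("OPENWEATHER_API_KEY is not set (weather tool will fail)")
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=3) as resp:
            tags = json.load(resp)
    except OSError:
        problems.append(f"Ollama is not reachable at {host} (run `ollama serve`)")
        return problems
    for name in missing_models(tags, ["qwen3:4b", "nomic-embed-text"]):
        problems.append(f"model {name} not pulled (run `ollama pull {name}`)")
    return problems

if __name__ == "__main__":
    for p in preflight():
        print("WARNING:", p)
```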

Additional notes

Tips and common issues:

  • If the server fails to start, check that port 8000 is free and that the .env file contains required API keys.
  • For the Weather tool, verify your OpenWeather API key is active.
  • If you encounter vector store issues, you can remove the chroma_db_rag directory to rebuild the embeddings.
  • Ensure Ollama is running (ollama serve) and that the specified models are downloaded (ollama list).
  • If the client cannot connect, verify SERVER_URL and network access to the MCP server.
  • The RAG system uses a PDF document loader and a vector store; ensure your data/ directory contains the documents you want to query.
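In the real project, Chroma and the nomic-embed-text model handle embedding and retrieval; the dependency-free sketch below only illustrates the underlying RAG idea: embed document chunks once, then answer queries from the nearest chunks. The toy "embedding" here is a bag-of-words count, not a real model.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a real embedding model: bag-of-words term counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_chunks(query: str, chunks: list, k: int = 2) -> list:
    # Rank document chunks by similarity to the query; in a full RAG
    # pipeline the top-k chunks are pasted into the LLM prompt as context.
    q = embed(query)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

chunks = [
    "The invoice total for March is 1200 euros.",
    "Our cat enjoys sleeping on the keyboard.",
    "Payment terms are net 30 days from the invoice date.",
]
print(top_chunks("what are the invoice payment terms", chunks, k=1))
```

Deleting chroma_db_rag simply forces this embed-and-index step to run again over everything in data/ on the next start.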
