Little_MCP
A simple yet powerful local AI assistant that runs entirely on your machine. Built for learning and experimentation, Little MCP combines the power of open-source LLMs with advanced RAG over your personal documents. Included tools: real-time weather, a calculator, local-document RAG, and a local SQL database.
```shell
claude mcp add --transport stdio ricard1406-little_mcp python mcp_server.py \
  --env PYPATH="path_to_your_python_env_if_needed" \
  --env OLLAMA_HOST="http://localhost:11434" \
  --env OPENWEATHER_API_KEY="your-openweather-api-key"
```
How to use
Little MCP is a local, privacy-focused AI assistant that runs entirely on your machine. It combines a FastAPI MCP server with a LangChain-based client to provide a multi-tool agent capable of RAG-based document QA, weather and time queries, arithmetic calculations, and more. Start the server with the included Python script, then run the client to begin interacting with your documents and local tools. The system supports a dual-mode approach, allowing you to see the thinking process if you enable it, and it maintains conversational memory across the session to provide context-aware responses.
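The session-scoped conversational memory described above can be sketched with a minimal buffer. This is an illustrative sketch only: the class name `ConversationMemory` and its methods are hypothetical, not Little MCP's actual (LangChain-based) implementation.

```python
# Hypothetical sketch of session-scoped conversational memory.
# Little MCP's real client builds on LangChain; this only shows the idea.

class ConversationMemory:
    """Keep the last `max_turns` user/assistant exchanges for context."""

    def __init__(self, max_turns: int = 10):
        self.max_turns = max_turns
        self.turns: list[tuple[str, str]] = []

    def add(self, user_msg: str, assistant_msg: str) -> None:
        self.turns.append((user_msg, assistant_msg))
        # Drop the oldest exchanges once the window is full.
        self.turns = self.turns[-self.max_turns:]

    def as_prompt_context(self) -> str:
        """Render history so it can be prepended to the next prompt."""
        return "\n".join(f"User: {u}\nAssistant: {a}" for u, a in self.turns)


memory = ConversationMemory(max_turns=2)
memory.add("What is the weather in Rome?", "Sunny, 24 degrees.")
memory.add("And tomorrow?", "Light rain expected.")
memory.add("Thanks!", "You're welcome.")
print(memory.as_prompt_context())  # only the last two turns survive
```

Feeding the rendered history back into each prompt is what makes later answers context-aware, at the cost of a growing prompt, which is why the window is bounded.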
How to install
Prerequisites:
- Python 3.8+
- Ollama installed and running locally
- OpenWeather API key (free tier)

1. Clone or download the Little MCP repository:

   ```shell
   git clone https://github.com/ricard1406-little_mcp.git
   cd Little_MCP
   ```

2. Create and activate a Python virtual environment (optional but recommended):

   ```shell
   python -m venv .venv
   source .venv/bin/activate   # macOS/Linux
   .venv\Scripts\activate      # Windows
   ```

3. Install Python dependencies:

   ```shell
   pip install -r requirements.txt
   ```

4. Prepare environment variables (optional but recommended). Create a .env file or export the variables in your shell:

   ```shell
   export OPENWEATHER_API_KEY=your_openweather_key
   export OLLAMA_HOST=http://localhost:11434
   ```

5. Pull the required Ollama models (as documented):

   ```shell
   ollama pull qwen3:4b
   ollama pull nomic-embed-text
   ```

6. Run the MCP server:

   ```shell
   python mcp_server.py
   ```

7. In another terminal, run the MCP client:

   ```shell
   python little_mcp.py
   ```
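Server-side, the environment variables from step 4 would typically be read with local fallbacks. The sketch below is an assumption for illustration: the variable names come from the steps above, but the defaults and the `load_config` helper are hypothetical, not necessarily what mcp_server.py does.

```python
import os


def load_config() -> dict:
    """Read Little MCP settings from the environment.

    Variable names match the install steps; the fallback values are
    illustrative assumptions, not necessarily mcp_server.py's own.
    """
    return {
        "ollama_host": os.getenv("OLLAMA_HOST", "http://localhost:11434"),
        "openweather_api_key": os.getenv("OPENWEATHER_API_KEY", ""),
    }


config = load_config()
if not config["openweather_api_key"]:
    # The weather tool needs a key; the other tools work without it.
    print("Warning: OPENWEATHER_API_KEY not set; weather tool disabled.")
print(f"Using Ollama at {config['ollama_host']}")
```

Reading everything through `os.getenv` with defaults is why the variables are "optional but recommended": the server still starts locally without them.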
Notes:
- Ensure Ollama is running and models are downloaded before starting the server.
- Ensure your OpenWeather API key is valid and configured if you plan to use the Weather tool.
Additional notes
Tips and common issues:
- If the server fails to start, check that port 8000 is free and that the .env file contains required API keys.
- For the Weather tool, verify your OpenWeather API key is active.
- If you encounter vector store issues, you can remove the chroma_db_rag directory to rebuild the embeddings.
- Ensure Ollama is running (ollama serve) and that the specified models are downloaded (ollama list).
- If the client cannot connect, verify SERVER_URL and network access to the MCP server.
- The RAG system uses a PDF document loader and a vector store; ensure your data/ directory contains the documents you want to query.
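The first tip above (checking that port 8000 is free) can be done programmatically. A small sketch, assuming the server listens on localhost:8000; the helper name `port_in_use` is illustrative:

```python
import socket


def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1.0)
        # connect_ex returns 0 when the connection succeeds,
        # i.e. when a server is already listening there.
        return sock.connect_ex((host, port)) == 0


if port_in_use(8000):
    print("Port 8000 is busy; stop the other process or change the port.")
else:
    print("Port 8000 is free; safe to start mcp_server.py.")
```

The same check, pointed at the Ollama port (11434 by default), tells you whether `ollama serve` is running before you start the server.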
Related MCP Servers
npcpy
The python library for research and development in NLP, multimodal LLMs, Agents, ML, Knowledge Graphs, and more.
multimodal-agents-course
An MCP Multimodal AI Agent with eyes and ears!
AgentChat
AgentChat is an LLM-based agent communication platform that ships with a default Agent and supports user-defined Agents. Through multi-turn dialogue and task collaboration, Agents can understand and help complete complex tasks. The project integrates LangChain, Function Calling, the MCP protocol, RAG, Memory, Milvus, and ElasticSearch for efficient knowledge retrieval and tool invocation, with a high-performance backend built on FastAPI.
phone
A phone-control plugin for MCP that lets you control your Android phone through ADB commands, connecting it with any human.
mcp-use-ts
mcp-use is the framework for MCP with the best DX - Build AI agents, create MCP servers with UI widgets, and debug with built-in inspector. Includes client SDK, server SDK, React hooks, and powerful dev tools.
python-client
An MCP server for querying the technical documentation of mainstream agent frameworks (supports both stdio and SSE transport protocols), covering langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai.