mcp-local-analyst

Talk to your data locally 💬📊. A private AI Data Analyst built with the Model Context Protocol (MCP), Ollama, and SQLite. Turn natural language into SQL queries without data leaving your machine. Includes a Dockerized Streamlit UI.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

    claude mcp add --transport stdio fabiothiroki-mcp-local-analyst docker compose up --build

How to use

MCP Local Analyst is a private AI data assistant that lets you talk to your own database locally. It combines the Model Context Protocol (MCP), Ollama for local LLMs, and SQLite for persistent storage, all exposed through a user-friendly Streamlit UI delivered via Docker Compose. The goal is to turn natural language queries into SQL and have the results returned without sending data to external services. The UI provides controls to select the data model, run queries, and view results in real time, with the underlying data staying on your machine.
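Under the hood, the flow is: an MCP tool receives the natural-language question, asks the local Ollama model to produce a SQL statement, and executes it against the SQLite database. As a minimal sketch of that pattern (not the repository's actual code; the database path, model name, and prompt are assumptions), using the official MCP Python SDK and the ollama client:

    # Sketch of a natural-language-to-SQL MCP tool; illustrative only.
    import sqlite3

    import ollama
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("local-analyst")
    DB_PATH = "data/analyst.db"  # hypothetical filename; check the data/ directory

    @mcp.tool()
    def ask_data(question: str) -> str:
        """Translate a natural-language question into SQL and run it locally."""
        # Ask the local model for a single SQLite statement.
        response = ollama.chat(
            model="mistral",
            messages=[{
                "role": "user",
                "content": "Write one SQLite SELECT statement that answers: "
                           f"{question}. Return only the SQL, no explanation.",
            }],
        )
        sql = response["message"]["content"].strip()
        # Execute against the local database; nothing leaves the machine.
        with sqlite3.connect(DB_PATH) as conn:
            rows = conn.execute(sql).fetchall()
        return f"{sql}\n{rows}"

    if __name__ == "__main__":
        mcp.run()  # stdio transport by default

Everything in this loop, the model call included, happens on localhost, which is what keeps the data private.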

To use it, first ensure Ollama is running and that you have Docker with Docker Compose installed. After starting the service with docker-compose up --build, you can open the Streamlit UI in your browser at http://localhost:8501. From there, you can pose questions in natural language, and the system will convert them into SQL against the local SQLite database, execute the queries, and display the results. The data is stored locally in the data/ directory, and the database can be adjusted or seeded by editing src/seed_data.py if needed. You can also configure model selection and parameters from the UI to tailor the responses and performance to your environment.
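To seed your own data, the pattern in src/seed_data.py is ordinary sqlite3 usage. The schema below is purely illustrative (the repository defines its own tables and database filename):

    # Illustrative seeding script; the real src/seed_data.py may differ.
    import sqlite3

    conn = sqlite3.connect("data/analyst.db")  # hypothetical filename
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales "
        "(id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO sales (region, amount) VALUES (?, ?)",
        [("north", 1200.0), ("south", 950.5), ("west", 430.25)],
    )
    conn.commit()
    conn.close()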

How to install

Prerequisites

  • Docker & Docker Compose installed on your machine
  • Ollama installed and a local model available (e.g., mistral)

Installation steps

  1. Install Docker Desktop (includes Docker Compose by default)
  2. Install Ollama from https://ollama.ai
  3. Start Ollama and pull a local model if needed, e.g.: ollama pull mistral
  4. Clone this repository to your local machine
  5. Ensure Ollama is serving locally (in a separate terminal): ollama serve
  6. Build and run the MCP server with Docker Compose: docker-compose up --build
  7. Open your browser to http://localhost:8501 to access the Streamlit UI and begin querying your local data.
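Once the stack is up, you can sanity-check the seeded database directly from the host. The filename below is an assumption; use whatever SQLite file appears in data/:

    # Quick check that the local SQLite database exists and has tables.
    import sqlite3

    conn = sqlite3.connect("data/analyst.db")  # adjust to the actual file in data/
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    print("tables:", tables)
    conn.close()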

Additional notes

  • Ollama must be running on http://localhost:11434 for the MCP to connect to a local LLM. If you configure a different port, adjust the settings accordingly in the UI or environment (see the connectivity check after this list).
  • All data remains on your machine; no data is sent externally when using this setup.
  • The SQLite database and seeded data live in the data/ directory. Modify src/seed_data.py to change the initial dataset.
  • The UI is powered by Streamlit inside the Dockerized environment; you can customize model selection and parameters via the app’s interface.
  • If you encounter port conflicts, verify that port 8501 (Streamlit) and 11434 (Ollama) are free or adjust container configuration to use alternate ports.
  • For troubleshooting, check docker logs, ensure the Ollama service is reachable, and confirm that the repository is on the latest version with the expected file paths (src/seed_data.py, data/).
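If the app cannot reach a model, a quick way to confirm that Ollama is up, and to see which models have been pulled, is its /api/tags endpoint:

    # List models available from the local Ollama server.
    import json
    import urllib.request

    with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
        payload = json.load(resp)

    for model in payload.get("models", []):
        print(model["name"])

An empty list means no model has been pulled yet; run ollama pull mistral (or another model) first.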
