Local_MCP_Client
Local MCP Client is a cross-platform web and API interface for interacting with configurable MCP servers using natural language, powered by Ollama and any local LLM of choice, enabling structured tool execution and dynamic agent behavior.
claude mcp add --transport stdio mytechnotalent-local_mcp_client python local_mcp_client.py \
  --env BNJLAT="<your-binja-api-token>"
How to use
Local MCP Client is a cross-platform web and API interface that lets you interact with configurable MCP servers using natural language. It pairs a local, Ollama-hosted LLM with a configurable MCP ecosystem, so you can issue structured tool calls and define agent behaviors without sending data to external cloud services. The client can be extended with additional MCP servers to perform diverse tasks based on your prompts and the tools those servers expose. Use the included uv-based workflow to start the client, connect to Ollama, and then issue commands that map to actions on your configured MCP servers.
To run it, first ensure Ollama is serving a local LLM (e.g., llama3:8b), then start local_mcp_client.py in your preferred Python environment. Once running, you can ask the client to locate data, execute malware-analysis MCP workflows, or orchestrate multi-step operations across the available MCP servers through natural language prompts. The client routes each intent to the appropriate server and manages tool invocations and responses, enabling dynamic agent behavior based on the configured MCP suite.
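The routing step described above can be sketched in a few lines. This is a hypothetical illustration, not the actual code in local_mcp_client.py: the model's reply is treated as a JSON tool call when it parses as one, and dispatched to a registered tool; the TOOLS registry and dispatch_tool_call name are assumptions for the sketch.

```python
import json

# Illustrative registry mapping tool names to callables; a real client would
# populate this from the tools advertised by each configured MCP server.
TOOLS = {
    "malwarebazaar.lookup_hash": lambda args: f"looked up {args['sha256']}",
}

def dispatch_tool_call(model_reply: str) -> str:
    """Parse the LLM reply; run a tool if it is a JSON tool call,
    otherwise pass the natural-language answer through unchanged."""
    try:
        call = json.loads(model_reply)
    except json.JSONDecodeError:
        return model_reply  # plain natural-language answer, no tool needed
    if not isinstance(call, dict):
        return model_reply
    tool = TOOLS.get(call.get("tool"))
    if tool is None:
        return f"unknown tool: {call.get('tool')}"
    return tool(call.get("args", {}))
```

For example, a reply of `{"tool": "malwarebazaar.lookup_hash", "args": {"sha256": "abc"}}` would be dispatched to the lookup tool, while a plain sentence would be returned to the user as-is.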
How to install
Prerequisites:
- Python 3.8+ installed on your system
- A working Ollama installation serving a local LLM (e.g., llama3:8b)
- git for cloning repositories
- Network access to pull the required dependencies
Installation steps:
- Set up a Python virtual environment and install requirements
  - mac/linux:
    curl -LsSf https://astral.sh/uv/install.sh | sh
    cd Local_MCP_Client
    uv init .
    uv venv
    source .venv/bin/activate
    uv pip install -r requirements.txt
  - Windows:
    powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"
    cd Local_MCP_Client
    uv init .
    uv venv
    .venv\Scripts\activate
    uv pip install -r requirements.txt
- Install Ollama and pull the LLM model
  - mac:
    brew install ollama
    ollama serve
    ollama pull llama3:8b
  - linux:
    curl -fsSL https://ollama.com/install.sh | sh
    ollama serve
    ollama pull llama3:8b
  - Windows: download and install Ollama, then run:
    ollama serve
    ollama pull llama3:8b
- Clone the required MCP servers
  - mac/linux:
    cd ~/Documents
    git clone https://github.com/mytechnotalent/MalwareBazaar_MCP.git
    git clone https://github.com/Invoke-RE/binja-lattice-mcp
  - Windows:
    cd "%USERPROFILE%\Documents"
    git clone https://github.com/mytechnotalent/MalwareBazaar_MCP.git
    git clone https://github.com/Invoke-RE/binja-lattice-mcp
- Run Ollama (in a separate terminal):
    ollama serve
- Run the Local MCP Client. Set your Binary Ninja (binja-lattice) API token (if applicable) and start the client:
  - mac/linux:
    export BNJLAT="<your-binja-api-token>"
    uv run local_mcp_client.py
  - Windows:
    $env:BNJLAT = "<your-binja-api-token>"
    uv run local_mcp_client.py
- Run tests (optional):
    python -m unittest discover -s tests
    uv pip install coverage==7.8.0
    coverage run --branch -m unittest discover -s tests
    coverage report -m
    coverage html
    open htmlcov/index.html        # macOS
    xdg-open htmlcov/index.html    # Linux
    start htmlcov\index.html       # Windows
    coverage erase
Additional notes
- The Local MCP Client expects a locally available Ollama LLM. Ensure Ollama is running and the specified model (e.g., llama3:8b) is pulled before starting the client.
- The BNJLAT environment variable holds the API token used by the binja-lattice MCP server, if your setup requires it. Replace the placeholder with your actual Binary Ninja API token when running in a real environment.
- The mcp_config structure uses a Python-based execution model. If you reorganize scripts or add new server adapters, ensure the local_mcp_client.py invocation remains aligned with the script path and virtual environment activation.
- If you encounter issues with dependencies, re-create the venv (uv venv) and reinstall requirements.txt, then re-run the client.
- The client can be extended by adding additional MCP servers under the mcpServers map and wiring their commands/args to the corresponding tools.
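Extending the mcpServers map might look like the sketch below. The exact schema depends on local_mcp_client.py, so treat the keys and the server script paths here as assumptions, not the canonical format:

```python
import json

# Hypothetical mcp_config showing two servers wired to local Python scripts.
# "command"/"args"/"env" follow the common MCP server-config convention;
# the script paths are placeholders for wherever you cloned the repos.
mcp_config = {
    "mcpServers": {
        "malwarebazaar": {
            "command": "python",
            "args": ["../MalwareBazaar_MCP/server.py"],  # placeholder path
        },
        "binja-lattice": {
            "command": "python",
            "args": ["../binja-lattice-mcp/server.py"],  # placeholder path
            "env": {"BNJLAT": "<your-binja-api-token>"},
        },
    }
}

print(json.dumps(mcp_config, indent=2))
```

Each new entry only needs a command, its arguments, and any environment variables the server expects; the client picks up the tools the server advertises at startup.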
Related MCP Servers
nerve
The Simple Agent Development Kit.
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.
LLaMa-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and Model Control Protocol (MCP).
mcp-aoai-web-browsing
A minimal Model Context Protocol 🖥️ server/client 🧑‍💻 with Azure OpenAI and 🌐 web browser control via Playwright.
jiki
MCP server from teilomillet/jiki
mcp-llama3-client
A client for the MCP Flight Search service using Ollama and Llama 3.2 to provide a user-friendly flight search interface with Model Context Protocol tools