zin-mcp-client
An MCP client that serves as a bridge between MCP servers and local LLMs running on Ollama. It was created for the MCP servers developed by me, though other MCP servers may work as well.
To register the client with Claude Code over the stdio transport:
claude mcp add --transport stdio zinja-coder-zin-mcp-client /path/to/uv --directory /path/to/zin-mcp-client/ run zin_mcp_client.py
How to use
ZIN MCP Client is a lightweight command-line and web UI bridge designed to interact with multiple MCP servers from your local environment. It leverages Ollama for local LLM integration and provides a clean CLI to query and orchestrate tools exposed by various MCP servers. The client focuses on simplicity and speed, enabling you to connect to one or more MCP servers, issue tool invocations, and receive structured responses from the servers through a unified interface. It also includes a minimal web UI to monitor and interact with tools when you prefer a browser-based workflow.
To get started, install the required prerequisites, configure your MCP servers in the provided mcp-config.json, and run the client. The client will connect to the configured MCP servers, expose their capabilities via the CLI, and allow you to invoke specific tools, chain tool calls, or perform code reviews and analyses using the connected MCP servers. You can switch between servers, view logs, and manage sessions from the interactive CLI.
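Conceptually, the client reads the mcp-config.json mentioned above and launches each configured server as a subprocess over stdio. A minimal sketch of that lookup step, assuming the standard mcpServers config layout (the function name here is illustrative, not the client's actual API):

```python
import json

def load_server_commands(config_text):
    """Map each configured MCP server name to the argv used to launch it over stdio."""
    config = json.loads(config_text)
    commands = {}
    for name, spec in config.get("mcpServers", {}).items():
        # "command" is the executable, "args" its arguments (both per the MCP config format)
        commands[name] = [spec["command"], *spec.get("args", [])]
    return commands

sample = '''
{
  "mcpServers": {
    "example-server": {
      "command": "/path/to/uv",
      "args": ["--directory", "/path/to/server/", "run", "server.py"]
    }
  }
}
'''
print(load_server_commands(sample))
```

Each entry becomes one stdio child process; the client then speaks MCP to it and exposes the server's tools through the CLI.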
How to install
Prerequisites:
- Python 3.10+ installed on your system
- Optional: uv for dependency and environment management (used in the recommended setup below)
- Ollama installed if you plan to use local LLMs for tool invocation
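Before installing, you can sanity-check the prerequisites with a short script like this (a convenience sketch, not part of the project):

```python
import shutil
import sys

def check_prereqs(min_python=(3, 10), tools=("uv", "ollama")):
    """Return a list of human-readable problems; an empty list means all good."""
    problems = []
    if sys.version_info < min_python:
        problems.append(
            f"Python {min_python[0]}.{min_python[1]}+ required, found {sys.version.split()[0]}"
        )
    for tool in tools:
        # shutil.which reports whether the executable is on PATH
        if shutil.which(tool) is None:
            problems.append(f"'{tool}' not found on PATH")
    return problems

if __name__ == "__main__":
    for problem in check_prereqs():
        print("WARNING:", problem)
```

Warnings about uv or Ollama are only relevant if you plan to use them; Python 3.10+ is the hard requirement.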
Installation steps:
- Download the release package for zin-mcp-client from the GitHub Releases page and unzip it.
- Navigate to the client directory:
cd zin-mcp-client
- Install dependencies (preferred via uv):
# Install uv if you don't have it
curl -LsSf https://astral.sh/uv/install.sh | sh
# Create a venv (optional but recommended)
uv venv
source .venv/bin/activate # on Windows use .venv\Scripts\activate
# Install Python dependencies
uv pip install -r requirements.txt
- Alternatively, install with plain pip (not recommended but supported):
pip install -r requirements.txt
- Ensure you have a properly configured mcp-config.json in the zin-mcp-client directory. The sample structure mirrors the MCP server configuration format and specifies how each server is launched (for example, via uv) as a stdio entrypoint.
- If you want to run optional local tooling (e.g., for Python dependencies or isolated environments), follow the uv venv workflow above to keep dependencies contained.
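A minimal mcp-config.json sketch, following the standard mcpServers layout (the server name and paths here are illustrative; substitute your own):

```json
{
  "mcpServers": {
    "example-mcp-server": {
      "command": "/path/to/uv",
      "args": [
        "--directory",
        "/path/to/example-mcp-server/",
        "run",
        "example_mcp_server.py"
      ]
    }
  }
}
```

Add one entry per MCP server you want the client to connect to; the client launches each over stdio using the given command and arguments.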
Additional notes
Tips and considerations:
- This client is designed to be used primarily with local MCP servers. Exposing it over a network may pose security risks; follow best practices for securing local services.
- The configuration relies on uv to manage the environment and run the client as if it were a server process. Adjust paths in mcp-config.json to reflect your actual installation directories.
- If you encounter dependency errors, using the uv-based environment (uv venv and uv pip install -r requirements.txt) is recommended over plain pip installations.
- The project emphasizes lightweight usage; for heavy feature sets or web UIs, consider other MCP tooling or dedicated server implementations.
- Ensure Ollama is running if you intend to leverage local LLM capabilities for tool invocation.
- When updating, review release notes for potential breaking changes, as stated in the project README.
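To verify that Ollama is up before starting the client, you can query its local REST API (a convenience sketch; Ollama listens on port 11434 by default):

```python
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local API endpoint

def list_ollama_models(base_url=OLLAMA_URL):
    """Return names of locally pulled models, or None if Ollama is not reachable."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
    except (urllib.error.URLError, OSError):
        return None
    return [model["name"] for model in data.get("models", [])]

if __name__ == "__main__":
    models = list_ollama_models()
    if models is None:
        print("Ollama does not appear to be running; start it with 'ollama serve'.")
    else:
        print("Available models:", models)
```

An empty model list means Ollama is running but no models have been pulled yet (e.g., via ollama pull).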