
quick-mcp-example

Short and sweet example MCP server / client implementation for Tools, Resources and Prompts.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add alucek-quick-mcp-example --transport stdio --env PYTHONUNBUFFERED=1 -- python mcp_server.py

How to use

This MCP server implements a simple knowledge-base chatbot flow that demonstrates the core MCP concepts: tools, resources, and prompts. It exposes a minimal set of capabilities that let an LLM query a vector database (RAG-style retrieval), select predefined resources to add context, and invoke standard prompts for structured analytical workflows. Clients can connect to the server, discover its tools and resources, and drive conversations through templated prompts. The included CLI client (client.py) provides a sample interaction path to exercise the server's functionality end-to-end.
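Under the hood, the stdio transport carries newline-delimited JSON-RPC 2.0 messages. A minimal sketch (stdlib only, no MCP SDK) of what a client's tools/list request and a server's response could look like on the wire; the tool name and schema below are illustrative, not the repo's actual definitions:

```python
import json

# A client discovers tools by sending a JSON-RPC request over the pipe.
# MCP's stdio transport frames each message as one JSON object per line.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
wire_line = json.dumps(request) + "\n"  # what actually crosses the pipe

# A server like mcp_server.py would answer with its tool catalog,
# e.g. (illustrative tool, not the repo's real one):
response_line = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "query_knowledge_base",
                "description": "Retrieve passages from the vector store",
                "inputSchema": {
                    "type": "object",
                    "properties": {"query": {"type": "string"}},
                    "required": ["query"],
                },
            }
        ]
    },
})

# The client parses the line and matches the response to its request id.
response = json.loads(response_line)
assert response["id"] == request["id"]
tool_names = [t["name"] for t in response["result"]["tools"]]
print(tool_names)
```

The same framing carries resources/list, prompts/list, and tools/call; only the method and params change.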

How to install

Prerequisites:

  • Python (recommended 3.10+)
  • A development environment (virtualenv or uv) as described below
  • Access to a terminal or command prompt

Step-by-step installation:

  1. Clone the repository
git clone https://github.com/ALucek/quick-mcp-example.git
cd quick-mcp-example
  2. Create and activate a virtual environment using uv (recommended)
uv venv
source .venv/bin/activate  # macOS/Linux
# OR
.venv\Scripts\activate     # Windows
  3. Install dependencies
uv sync
  4. Run the example server/client
python client.py mcp_server.py
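The final step runs the client, which launches the server script as a subprocess and talks to it over stdin/stdout. That spawn-and-pipe pattern can be sketched with the standard library alone; the inline "server" below is a stand-in for illustration, not the repo's mcp_server.py, and the real client additionally performs the MCP initialization handshake:

```python
import json
import subprocess
import sys

# Stand-in server: reads one JSON-RPC line, answers it, and exits.
SERVER_CODE = """
import json, sys
req = json.loads(sys.stdin.readline())
resp = {"jsonrpc": "2.0", "id": req["id"], "result": {"ok": True}}
sys.stdout.write(json.dumps(resp) + "\\n")
sys.stdout.flush()
"""

# client.py spawns its server the same way, but as: python mcp_server.py
proc = subprocess.Popen(
    [sys.executable, "-c", SERVER_CODE],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# Send a request and read the newline-delimited reply.
proc.stdin.write(json.dumps({"jsonrpc": "2.0", "id": 1, "method": "ping"}) + "\n")
proc.stdin.flush()
reply = json.loads(proc.stdout.readline())
proc.wait()
print(reply["result"])
```

Because the transport is just pipes, the client needs only the path to the server script, which is why the run command takes mcp_server.py as an argument.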

Notes:

  • The setup references a simple in-repo mcp_server.py and client.py. The server is a minimal MCP server exposing tools, resources, and prompts for a basic knowledge-base chatbot workflow.
  • If you prefer a different Python environment, you can also run the server directly with Python from a standard venv, e.g. python mcp_server.py, provided you install the project dependencies beforehand.

Additional notes

Tips and common considerations:

  • If you modify tools/resources/prompts, ensure their schemas remain valid JSON and align with the LLM's expectations.
  • For large vector backends, ensure your embeddings/database are up and accessible from your runtime environment.
  • Use PYTHONUNBUFFERED=1 to ensure real-time logs when running in certain shells.
  • If your client cannot connect, verify network access and that the server script path is correct relative to your working directory.
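On the first tip, a quick sanity check of a tool's inputSchema catches malformed JSON before the LLM ever sees it. A minimal sketch, using an illustrative schema rather than one from the repo:

```python
import json

# An illustrative tool input schema, as it might appear in a tools/list response.
schema_text = """
{
    "type": "object",
    "properties": {"query": {"type": "string"}},
    "required": ["query"]
}
"""

# json.loads fails loudly on malformed JSON (trailing commas, single quotes, ...).
schema = json.loads(schema_text)

# Minimal structural checks an MCP client typically relies on:
# the schema describes an object, and every required key is declared.
assert schema.get("type") == "object"
assert set(schema["required"]) <= set(schema["properties"])
print("schema ok")
```

Running this kind of check after editing a tool definition is cheaper than discovering the problem mid-conversation.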
