mcp-llama3-client
A client for the MCP Flight Search service using Ollama and Llama 3.2 to provide a user-friendly flight search interface with Model Context Protocol tools
claude mcp add --transport stdio arjunprabhulal-mcp-llama3-client python mcp_flight_client.py
How to use
This MCP Llama3 Flight Search Client provides a UI-driven interface to search for flights by querying the MCP Flight Search service. It leverages Ollama to run Llama 3.2 locally and uses MCP tooling to construct and send flight search queries through the MCP backend. You can run the client to interact with the flight search backend and rely on the included prompt templates to generate accurate, context-aware search queries. The client acts as a bridge between the local Llama model, the MCP flight search service, and the end-user, enabling seamless flight discovery using natural language prompts.
To use it, start by ensuring the MCP Flight Search service is running (as documented by the MCP Flight Search project). Then install the Python dependencies for the client, pull and run the local Llama 3.2 model with Ollama, and execute the client script. The client will prompt you for search parameters (e.g., origin, destination, dates) or allow you to input a query in natural language. The internal tools, such as search_flights_tool, handle the MCP interaction and present results returned by the MCP server.
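The flow above — taking user search parameters and handing them to the MCP backend — can be sketched as follows. This is a minimal illustration, not the client's actual code: the `FlightQuery` type, field names, and validation rules are assumptions, and the real schema expected by `search_flights_tool` may differ.

```python
# Hypothetical sketch: how a client like this might normalize user input into
# the structured arguments that search_flights_tool forwards to the MCP server.
# Field names (origin, destination, date) are assumptions, not the real schema.
from dataclasses import dataclass


@dataclass
class FlightQuery:
    origin: str       # departure airport code, e.g. "SFO"
    destination: str  # arrival airport code, e.g. "JFK"
    date: str         # ISO 8601 date, e.g. "2025-06-01"


def build_tool_arguments(query: FlightQuery) -> dict:
    """Validate a FlightQuery and convert it into a tool-call payload."""
    for code in (query.origin, query.destination):
        if not code or len(code) != 3:
            raise ValueError("airport codes are expected as 3-letter IATA codes")
    return {
        "origin": query.origin.upper(),
        "destination": query.destination.upper(),
        "date": query.date,
    }
```

In the real client, a payload like this would be produced by the LLM (guided by the prompt templates) and sent through the MCP tool layer rather than built by hand.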
How to install
Prerequisites:
- Python 3.9+ and pip
- Access to the MCP Flight Search service (per its repository)
- Ollama installed locally with the Llama 3.2 model available
Steps:
- Clone this repository and navigate into it
- Create and activate a Python virtual environment (optional but recommended)
- Install the required Python packages:
  - pip install -r requirements.txt
  - If requirements.txt is not available, install manually: llama-index, llama-index-llms-ollama, llama-index-tools-mcp, langchain-community
- Install Ollama and Llama 3.2:
  - Download Ollama from https://ollama.com/download and install it
  - Ensure Ollama is running, then pull the model: ollama pull llama3.2
- Ensure the MCP Flight Search service is running as per its documentation (from the linked repository)
- Run the client:
  - python mcp_flight_client.py
Notes:
- If you need to run the MCP Flight Search server separately, follow its startup instructions: mcp-flight-search --connection_type http or python -m mcp_flight_search.server --connection_type http
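The steps above can be consolidated into one shell session. This is a sketch under the defaults described in this README; the repository URL is a placeholder, and paths may differ on your system.

```shell
# Clone the client (repo URL is a placeholder) and enter the directory.
git clone <repo-url> mcp-llama3-client
cd mcp-llama3-client

# Optional but recommended: isolate dependencies in a virtual environment.
python -m venv .venv
source .venv/bin/activate

# Install the client dependencies.
pip install -r requirements.txt

# Pull the local model (Ollama must already be installed and running).
ollama pull llama3.2

# In another terminal, start the MCP Flight Search server per its docs:
#   mcp-flight-search --connection_type http

# Run the client.
python mcp_flight_client.py
```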
Additional notes
Environment tips:
- Ensure Ollama is running and the llama3.2 model is loaded before starting the client.
- The client expects the MCP Flight Search service to be accessible over HTTP; confirm the base URL if you deploy the backend differently.
- If you encounter model context issues, verify that llama-index-tools-mcp and related MCP tool integrations are correctly installed and that the prompt templates align with the flight search schema.
- Review prompt_templates.py to customize search prompts or tool usage for different flight data providers.
- The project is designed to work offline for the LLM portion via Ollama, which helps preserve privacy and reduces cloud dependency.
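As a concrete illustration of the prompt_templates.py customization mentioned above, a template might look like the following. The template text and placeholder names here are hypothetical, chosen only to show the pattern; consult the actual file for the real templates.

```python
# Hypothetical example of a template like those in prompt_templates.py;
# the wording and placeholder names are assumptions for illustration.
FLIGHT_SEARCH_TEMPLATE = (
    "You are a flight search assistant. Use the available MCP tools to find "
    "flights from {origin} to {destination} departing on {date}. "
    "Return the options sorted by price."
)


def render_prompt(origin: str, destination: str, date: str) -> str:
    """Fill the template with concrete search parameters."""
    return FLIGHT_SEARCH_TEMPLATE.format(
        origin=origin, destination=destination, date=date
    )
```

Swapping in a different template (or different placeholders) is how you would adapt the client to another flight data provider's schema.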
Related MCP Servers
mcp-for-beginners
This open-source curriculum introduces the fundamentals of Model Context Protocol (MCP) through real-world, cross-language examples in .NET, Java, TypeScript, JavaScript, Rust and Python. Designed for developers, it focuses on practical techniques for building modular, scalable, and secure AI workflows from session setup to service orchestration.
minima
On-premises conversational RAG with configurable containers
zin-client
An MCP client that serves as a bridge between MCP servers and local LLMs running on Ollama. Created for the author's own MCP servers, though other MCP servers may work as well
mxcp
Model eXecution + Context Protocol: Enterprise-Grade Data-to-AI Infrastructure
LLaMa-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and the Model Context Protocol (MCP).
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.