
mcp-llama3-client

A client for the MCP Flight Search service using Ollama and Llama 3.2 to provide a user-friendly flight search interface with Model Context Protocol tools

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio arjunprabhulal-mcp-llama3-client python mcp_flight_client.py

How to use

This MCP Llama3 Flight Search Client provides a UI-driven interface to search for flights by querying the MCP Flight Search service. It leverages Ollama to run Llama 3.2 locally and uses MCP tooling to construct and send flight search queries through the MCP backend. You can run the client to interact with the flight search backend and rely on the included prompt templates to generate accurate, context-aware search queries. The client acts as a bridge between the local Llama model, the MCP flight search service, and the end-user, enabling seamless flight discovery using natural language prompts.
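
As a rough illustration, a flight-search prompt template might look like the sketch below; the variable names and wording here are assumptions for illustration only, not the actual contents of prompt_templates.py.

# Hypothetical prompt template sketch; the real templates live in
# prompt_templates.py and may differ.
FLIGHT_SEARCH_PROMPT = (
    "You are a flight search assistant. Use the available MCP tools to "
    "find flights that match the user's request.\n"
    "Origin: {origin}\n"
    "Destination: {destination}\n"
    "Departure date: {departure_date}\n"
    "Return only results provided by the tool."
)

prompt = FLIGHT_SEARCH_PROMPT.format(
    origin="SFO", destination="JFK", departure_date="2025-07-01"
)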

To use it, start by ensuring the MCP Flight Search service is running (as documented by the MCP Flight Search project). Then install the Python dependencies for the client, pull and run the local Llama 3.2 model with Ollama, and execute the client script. The client will prompt you for search parameters (e.g., origin, destination, dates) or allow you to input a query in natural language. The internal tools, such as search_flights_tool, handle the MCP interaction and present results returned by the MCP server.
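
A minimal sketch of how that wiring can look with llama-index and Ollama is shown below; the server URL, model tag, and exact classes used are assumptions, so check mcp_flight_client.py for the real implementation.

# Sketch: connect a local Llama 3.2 model (served by Ollama) to the
# MCP Flight Search server and ask a natural-language question.
from llama_index.llms.ollama import Ollama
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec
from llama_index.core.agent import ReActAgent

# Local LLM served by Ollama (ensure `ollama pull llama3.2` has been run).
llm = Ollama(model="llama3.2", request_timeout=120.0)

# MCP client pointed at the flight search server; adjust the URL to match
# your deployment (this endpoint is an assumed default).
mcp_client = BasicMCPClient("http://127.0.0.1:3001/sse")
tools = McpToolSpec(client=mcp_client).to_tool_list()

# A ReAct agent lets the model decide when to call the flight search tool.
agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)
print(agent.chat("Find flights from SFO to JFK on 2025-07-01"))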

How to install

Prerequisites:

  • Python 3.9+ and pip
  • Access to the MCP Flight Search service (per its repository)
  • Ollama installed locally with the Llama 3.2 model available

Steps (a combined command sketch follows this list):

  1. Clone this repository and navigate into it
  2. Create and activate a Python virtual environment (optional but recommended)
  3. Install required Python packages
    • pip install -r requirements.txt
    • If requirements.txt is not available, install manually: llama-index, llama-index-llms-ollama, llama-index-tools-mcp, langchain-community
  4. Install Ollama and Llama 3.2
  5. Ensure the MCP Flight Search service is running as per its documentation (from the linked repository)
  6. Run the client
    • python mcp_flight_client.py
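
For reference, a typical end-to-end sequence of these steps might look like the following; the directory name and model tag are assumed defaults, and the clone URL is omitted here.

# Clone this repository and enter it
git clone <repository-url>
cd mcp-llama3-client

# Optional: isolate dependencies in a virtual environment
python -m venv .venv && source .venv/bin/activate

# Install Python dependencies
pip install -r requirements.txt

# Make sure Ollama is installed, then pull the model
ollama pull llama3.2

# Run the client
python mcp_flight_client.py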

Notes:

  • If you need to run the MCP Flight Search server separately, follow its startup instructions: mcp-flight-search --connection_type http or python -m mcp_flight_search.server --connection_type http

Additional notes

Environment tips:

  • Ensure Ollama is running and the llama3.2 model is loaded before starting the client (a quick check is shown after these tips).
  • The client expects the MCP Flight Search service to be accessible over HTTP; confirm the base URL if you deploy the backend differently.
  • If you encounter model context issues, verify that llama-index-tools-mcp and related MCP tool integrations are correctly installed and that the prompt templates align with the flight search schema.
  • Review prompt_templates.py to customize search prompts or tool usage for different flight data providers.
  • The project is designed to work offline for the LLM portion via Ollama, which helps preserve privacy and reduces cloud dependency.
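
A quick way to confirm the local model is available before launching the client (standard Ollama CLI commands):

# List locally available models; llama3.2 should appear in the output
ollama list

# Pull the model if it is missing
ollama pull llama3.2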
