

Agentic RAG with MCP Server

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio ashishpatel26-agentic-rag-with-mcp-server python server.py \
  --env GEMINI_API_KEY="your-gemini-api-key-here" \
  --env OPENAI_MODEL_NAME="your-openai-model-name-here"

How to use

Agentic RAG with MCP Server provides an MCP-backed service and client that expose a set of intelligent tools to enhance retrieval-augmented generation workflows. The server hosts tools such as entity extraction, query refinement, time retrieval, and relevance checking, enabling you to extract entities from queries, refine user questions for better retrieval, fetch the current time with a prefix, and filter results by relevance. The client demonstrates how to connect to this MCP server, list available tools, and invoke them with custom arguments. You can combine OpenAI and Gemini capabilities with MCP tools to build more capable RAG pipelines.
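As an illustrative, stdlib-only sketch (not the repository's actual implementation), the logic behind a tool like get_time_with_prefix could be as simple as:

```python
from datetime import datetime

def get_time_with_prefix(prefix: str = "Current time:") -> str:
    # Return the current date/time preceded by a short label, mirroring
    # what the server's get_time_with_prefix tool is described as doing.
    # The default prefix here is a placeholder, not the server's actual text.
    return f"{prefix} {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}"

print(get_time_with_prefix())
```

On the real server, a function like this would be registered as an MCP tool so clients can invoke it by name over the protocol.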

To use it, start the MCP server and run the client to explore the tools:

  • Start the server: python server.py
  • Run the client: python mcp-client.py

The client will list tools like get_time_with_prefix, extract_entities_tool, refine_query_tool, and check_relevance, and you can call any tool with appropriate arguments. For example, you can extract entities from a user query, refine the query for better retrieval results, and then check the relevance of retrieved chunks using an LLM, all in tandem with your preferred model (OpenAI or Gemini) through the MCP interface.
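Chained together, the query-side tools form a simple pipeline. The sketch below is hypothetical: call_tool is a local stand-in for session.call_tool(name, arguments) on a connected MCP client, the tool names come from the list above, and the argument names and toy handler logic are assumptions, not the repository's LLM-backed implementations.

```python
# Hypothetical end-to-end flow: refine the query, extract entities, then
# screen a retrieved chunk for relevance.
def call_tool(name: str, arguments: dict) -> str:
    handlers = {
        # Toy refinement: normalize whitespace (the real tool asks an LLM).
        "refine_query_tool": lambda a: " ".join(a["query"].split()),
        # Toy entity extraction: keep capitalized words.
        "extract_entities_tool": lambda a: ", ".join(
            w for w in a["query"].split() if w[:1].isupper()
        ),
        # Toy relevance check: keyword overlap instead of an LLM verdict.
        "check_relevance": lambda a: (
            "relevant"
            if any(w.lower() in a["chunk"].lower() for w in a["query"].split())
            else "irrelevant"
        ),
    }
    return handlers[name](arguments)

query = "  What does the  Model Context Protocol define? "
refined = call_tool("refine_query_tool", {"query": query})
entities = call_tool("extract_entities_tool", {"query": refined})
verdict = call_tool("check_relevance",
                    {"query": refined,
                     "chunk": "The Model Context Protocol defines..."})
```

The same three calls, issued through a real MCP client session against the running server, are what mcp-client.py demonstrates.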

How to install

Prerequisites:

  • Python 3.9 or higher
  • Internet access to install dependencies
  • Access tokens/keys for OpenAI and Gemini (as configured in .env)

Step-by-step installation:

  1. Clone the repository:

     git clone https://github.com/ashishpatel26/Agentic-RAG-with-MCP-Server.git

  2. Navigate into the project directory:

     cd Agentic-RAG-with-MCP-Server

  3. Create and activate a Python virtual environment (optional but recommended):

     macOS/Linux

     python3 -m venv venv
     source venv/bin/activate

     Windows

     python -m venv venv
     venv\Scripts\activate

  4. Install dependencies:

     pip install -r requirements.txt

  5. Create a .env file from the sample and configure the API keys:

     cp .env.sample .env

     Edit .env to set OPENAI_MODEL_NAME and GEMINI_API_KEY.

  6. Run the server to verify it starts correctly:

     python server.py

  7. In a separate terminal, run the MCP client to test interactions:

     python mcp-client.py
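After step 5, the .env file should define both variables. A minimal example with placeholder values (the model name shown is only an example, not a requirement of the project):

```
OPENAI_MODEL_NAME=gpt-4o-mini
GEMINI_API_KEY=your-gemini-api-key-here
```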

Additional notes


  • Ensure OPENAI_MODEL_NAME and GEMINI_API_KEY are set in the .env file before starting the server.
  • The server uses the FastMCP class from the mcp library; ensure your Python environment has compatible dependencies.
  • If you encounter connection issues with the MCP client, verify that server.py is running and that the environment variables are correctly loaded (you may need to install python-dotenv or load env vars manually).
  • The available tools include: get_time_with_prefix (returns current date/time), extract_entities_tool (uses OpenAI to extract entities from a query), refine_query_tool (OpenAI-powered query refinement), and check_relevance (filters content by chunk relevance using an LLM).
  • When integrating with Gemini, ensure GEMINI_API_KEY is valid and that your calls are permitted by your Gemini setup. Adjust OpenAI/Gemini usage in mcp-client.py as needed.
  • If you modify tooling, update mcp-client.py to reflect new tool names or arguments.
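If python-dotenv is not installed, the environment-loading fallback mentioned above can be a minimal stdlib-only loader for simple KEY=VALUE files. This is an illustrative sketch, not part of the repository, and it deliberately skips the quoting and interpolation features python-dotenv provides:

```python
import os

def load_env(path: str = ".env") -> None:
    # Parse simple KEY=VALUE lines; skip blanks and '#' comments.
    # Existing environment variables are left untouched.
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ.setdefault(key.strip(), value.strip().strip('"'))
```

Call load_env() before starting server.py so that GEMINI_API_KEY and OPENAI_MODEL_NAME are present in os.environ.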
