
a2a-tutorial

A tutorial on using Anthropic's Model Context Protocol (MCP) and Google's Agent2Agent (A2A) Protocol

Installation
Run this command in your terminal to add the crawler MCP server to Claude Code (a Serper.dev API key is required for the Google search scraping tool):

claude mcp add --transport stdio tsadoq-a2a-mcp-tutorial \
  --env SERPER_DEV_API_KEY=your_serper_dev_key \
  -- python -m crawler_server

How to use

This MCP tutorial demonstrates two separate MCP servers. The stock_retriever server exposes tools to look up a company symbol (optional) and fetch live stock data such as current price, daily high/low, opening price, and previous close. The crawler_service server exposes a tool to perform Google-like searches and fetch readable text from web pages. Together, these servers illustrate a simple end-to-end MCP-enabled agent pipeline in which agents request data from external sources through standardized Tool interfaces.

To use them, start both Python MCP servers in your environment so they listen for MCP Tool requests, then connect an MCP client or an agent that can discover, request, and chain Tools across the two servers. Tools are exposed as endpoints you can invoke with prompts or programmatic calls within the MCP protocol, enabling agents to fetch stock information or scrape web content as part of a larger task flow.
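The pipeline above can be sketched with a pure-Python mock. This is not the tutorial's real API: `ToolServer`, the tool names, and the sample data are all hypothetical stand-ins that only illustrate the discover → invoke → chain flow between the two servers.

```python
class ToolServer:
    """Minimal stand-in for an MCP server: a named bag of tools."""

    def __init__(self, name, tools):
        self.name = name
        self.tools = tools  # tool name -> callable

    def list_tools(self):
        # Discovery step: advertise available tool names
        return sorted(self.tools)

    def call_tool(self, tool, **kwargs):
        # Invocation step: dispatch to the named tool
        return self.tools[tool](**kwargs)


# crawler_service mock: pretend web search that returns a symbol mention
crawler = ToolServer("crawler_service", {
    "search": lambda query: {"snippet": "Apple Inc. trades as AAPL"},
})

# stock_retriever mock: pretend quote lookup keyed by ticker symbol
quotes = {"AAPL": {"current": 190.0, "high": 192.5, "low": 188.1}}
stocks = ToolServer("stock_retriever", {
    "get_stock_quote": lambda symbol: quotes[symbol],
})


def agent_pipeline(company):
    """Chain tools across servers: search first, then fetch the quote."""
    hit = crawler.call_tool("search", query=f"{company} ticker")
    symbol = hit["snippet"].split()[-1]  # crude extraction, for the demo only
    return stocks.call_tool("get_stock_quote", symbol=symbol)


print(agent_pipeline("Apple"))  # the AAPL quote dict from the mock data
```

A real client would do the same three steps (discover, invoke, chain) over an MCP transport instead of direct method calls.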

How to install

Prerequisites:

  • Python 3.8+ installed on your system
  • Access keys for external APIs (FINNHUB_API_KEY and SERPER_DEV_API_KEY) if you plan to run the example locally
  • Basic familiarity with running Python modules as scripts

Step 1: Clone the repository

  • git clone <repository-url>
  • cd into the cloned directory

Step 2: Install dependencies

  • It is recommended to use a virtual environment. For example:
    • python -m venv venv
    • source venv/bin/activate (on macOS/Linux) or venv\Scripts\activate (on Windows)
  • Install required packages (adjust if a requirements file exists in the repo):
    • pip install -r requirements.txt

Step 3: Configure API keys

  • Set the environment variables for API keys before launching the servers:
    • export FINNHUB_API_KEY=your_finnhub_key (Linux/macOS)
    • set FINNHUB_API_KEY=your_finnhub_key (Windows CMD)
    • export SERPER_DEV_API_KEY=your_serper_dev_key (Linux/macOS)
    • set SERPER_DEV_API_KEY=your_serper_dev_key (Windows CMD)
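A small check like the following (a hypothetical helper, not part of the repo) can fail fast with a clear message if a key is missing before you launch the servers:

```python
import os

# Keys the two servers read from the environment
REQUIRED_KEYS = ("FINNHUB_API_KEY", "SERPER_DEV_API_KEY")


def check_api_keys(env=os.environ):
    """Raise RuntimeError naming any required key that is unset or empty."""
    missing = [key for key in REQUIRED_KEYS if not env.get(key)]
    if missing:
        raise RuntimeError(f"Missing API keys: {', '.join(missing)}")
    return True
```

Calling `check_api_keys()` at the top of each server's startup surfaces configuration mistakes immediately instead of as a failed API call later.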

Step 4: Run the MCP servers

  • Start the stock retriever server:
    • python -m stock_server
  • Start the crawler server:
    • python -m crawler_server

Step 5: Verify operation

  • Ensure both servers start cleanly (and, for network transports, are listening on their configured ports), and that the MCP client can discover and invoke Tools from each server.
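For servers on a network transport, a generic TCP probe is a quick way to confirm something is listening (this is an assumption that applies only to network transports; stdio servers have no port to probe):

```python
import socket


def is_listening(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP server accepts connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out, or host unreachable
        return False
```

Tool discovery itself still has to be verified through an MCP client; this probe only tells you the process is up and accepting connections.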

Notes:

  • If your setup uses uvx (the uv tool runner) or another dedicated runtime, adjust the commands accordingly. The configuration shown here uses plain Python module execution for simplicity.

Additional notes

Tips and common issues:

  • Ensure environment variables are exported in the same shell session where you start the servers.
  • If a server fails to start, check for port conflicts and verify that the module names (stock_server, crawler_server) match the actual Python modules in your repo.
  • MCP Tool discovery relies on consistent Tool metadata emitted by the servers; confirm that each server advertises its Tools and capabilities properly.
  • For production usage, consider containerizing the servers (Docker) and securing API keys via secrets management.
  • Debug by directly invoking the modules in isolation (e.g., python -m stock_server --help) to confirm API surface before MCP integration.
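The `--help` debugging tip above assumes each server module defines a command-line entry point. A hypothetical sketch of such an entry point is below; the flag names (`--transport`, `--port`) are assumptions for illustration, not the repo's actual interface:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Argument parser for a sketch of a server entry point."""
    parser = argparse.ArgumentParser(
        prog="stock_server",
        description="MCP stock retriever (illustrative sketch)",
    )
    parser.add_argument(
        "--transport", choices=["stdio", "sse"], default="stdio",
        help="MCP transport to serve on",
    )
    parser.add_argument(
        "--port", type=int, default=8000,
        help="port to bind when using a network transport",
    )
    return parser
```

With an entry point like this, `python -m stock_server --help` prints the supported flags, letting you confirm the module's surface before wiring it into an MCP client.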
