MCP-Chinese-Getting-Started-Guide
A Quick-Start Guide to Model Context Protocol (MCP) Programming
claude mcp add --transport stdio liaokongvfx-mcp-chinese-getting-started-guide uv run web_search.py
How to use
This MCP server implements a web search tool for use with the MCP ecosystem. It exposes a single tool named web_search that, when invoked, queries an external search API and returns a summarized result set suitable for consumption by clients and LLMs. The server is designed to be run via the uv Python tool manager and can be explored interactively with the MCP Inspector to validate tool availability and behavior. To use it from a client, you start the server process (for example by running uv run web_search.py) and connect an MCP client session over stdio, which allows you to list available tools and call web_search with a query.
Once connected, you can list tools to confirm web_search is registered, then call the tool with a dictionary of arguments, e.g., {"query": "杭州天气"} ("Hangzhou weather"). The tool performs an internet search through the external API and returns a concatenated summary of the results. This setup demonstrates how MCP servers can wire external data sources and services into a standardized tool interface for LLMs and other clients.
How to install
Prerequisites:
- Python 3.11 or compatible environment
- uv, the Python package and project manager (install it separately before starting)
- Access to an internet-enabled environment to reach the web-search API
Step-by-step:
- Install Python 3.11+ on your system.
- Create a project directory and initialize it with uv:
  uv init mcp_getting_started
  cd mcp_getting_started
- Create and activate a virtual environment using uv:
  uv venv
  source .venv/bin/activate   # on Unix-like systems
  .venv\Scripts\activate.bat  # on Windows
- Install the MCP CLI, the httpx HTTP client, and the openai SDK:
  uv add "mcp[cli]" httpx openai
- Save the provided web_search.py script (or create your own) in the project root.
- Run the MCP server with uv (or wire it into your client's MCP configuration):
  uv run web_search.py
- Optionally, use the MCP Inspector to debug and visualize the server:
  npx -y @modelcontextprotocol/inspector uv run web_search.py
Notes:
- Ensure you have an API key for the web-search API if required by your chosen tool.
- The exact Python environment may vary; adapt the activate commands to your shell.
Additional notes
Tips and common issues:
- If the Inspector cannot connect, verify that the port used by the inspector is not blocked and that the server is running with the expected transport (stdio).
- When using APIs that require authentication, store keys in environment variables or .env and load them in your script as shown in the example.
- The web_search tool’s behavior depends on the external API; expect potential cost and rate limits depending on your provider.
- You can adapt the tool name and arguments by editing the web_search.py file; MCP uses the function name as the tool identifier and the function signature as the argument schema.
- If you switch transport (e.g., from stdio to SSE), ensure the client and server configurations align with MCP’s transport layer expectations.
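For the key-handling tip above, a minimal stdlib-only sketch of reading the key from the environment at call time (the variable name WEB_SEARCH_API_KEY is a hypothetical placeholder; use whatever your provider expects):

```python
import os


def auth_headers() -> dict[str, str]:
    """Build the Authorization header for the search API request.

    Reads the key on each call so the process fails fast with a clear
    message when the environment variable is missing.
    """
    # Hypothetical variable name; substitute your provider's convention.
    key = os.environ.get("WEB_SEARCH_API_KEY", "")
    if not key:
        raise RuntimeError("Set WEB_SEARCH_API_KEY before starting the server")
    return {"Authorization": f"Bearer {key}"}
```

If you prefer a .env file, load it (e.g., with python-dotenv) before the first call to this helper; the lookup logic stays the same.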
Related MCP Servers
ida-pro
AI-powered reverse engineering assistant that bridges IDA Pro with language models through MCP.
zenfeed
Make RSS 📰 great again with AI 🧠✨!! [Web page monitoring tool; pre-order to receive a membership: https://waitlist.dingding.glidea.app]
mcp-graphql
Model Context Protocol server for GraphQL
git
An MCP (Model Context Protocol) server enabling LLMs and AI agents to interact with Git repositories. Provides tools for comprehensive Git operations including clone, commit, branch, diff, log, status, push, pull, merge, rebase, worktree, tag management, and more, via the MCP standard. STDIO & HTTP.
ContextPods
Model Context Protocol management suite/factory. An MCP that can generate and manage other local MCPs in multiple languages. Uses the official SDKs for code gen.
mcp-gemini
This project provides a dedicated MCP (Model Context Protocol) server that wraps the @google/genai SDK. It exposes Google's Gemini model capabilities as standard MCP tools, allowing other LLMs (like Cline) or MCP-compatible systems to leverage Gemini's features as a backend workhorse.