hk-transportation
This is an MCP server that provides access to transportation-related data in Hong Kong through a FastMCP interface.
To register the server with Claude Code:
claude mcp add --transport stdio hkopenai-hk-transportation-mcp-server python server.py
How to use
This MCP server provides access to Hong Kong transportation data through a FastMCP interface. It exposes tools for three main data groups:
- Daily passenger traffic statistics at Hong Kong control points, with date-range filtering and visitor-type breakdowns
- Real-time arrival data for Kowloon Motor Bus (KMB) and Long Win Bus (LWB) routes, with language options (English, Traditional Chinese, Simplified Chinese)
- Current waiting times at land boundary control points, also filterable by language
Clients can query these datasets over the MCP protocol once the server is running. By default the server exposes a standard stdio interface; it can also run in SSE mode for streaming, which is useful for live-updating data feeds.
Typical usage involves starting the server, then issuing MCP queries to retrieve the specific data you need. For example, you can request daily passenger statistics for a date range, fetch all KMB/LWB routes in a chosen language, or obtain current waiting times at Hong Kong land boundary points. If you are integrating with Cline or other MCP clients, you can configure stdio-based connections or use the provided SSE-enabled mode to receive updates as data changes over time.
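To make the query flow concrete, the sketch below builds an MCP tools/call request as a JSON-RPC 2.0 message, which is what an MCP client sends over stdio. The tool name and parameter names here are illustrative assumptions, not the server's actual API; check the server's tool listing for the real names.

```python
import json

def build_tool_call(tool_name: str, arguments: dict, request_id: int = 1) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    })

# Hypothetical tool and parameter names, for illustration only.
request = build_tool_call(
    "get_passenger_stats",
    {"start_date": "2025-01-01", "end_date": "2025-01-07"},
)
print(request)
```

An MCP client library normally handles this framing for you; the sketch only shows the shape of the message that carries a date-range query.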
How to install
Prerequisites:
- Python 3.8+ installed on your system
- Access to the project repository and its requirements.txt file
Installation steps:
- Clone the repository:
  git clone https://github.com/hkopenai/hk-transportation-mcp-server.git
  cd hk-transportation-mcp-server
- Create and activate a Python virtual environment (optional but recommended):
  python -m venv venv
  source venv/bin/activate   # macOS/Linux
  .\venv\Scripts\activate    # Windows
- Install Python dependencies:
  pip install -r requirements.txt
- Run the server (default stdio mode):
  python server.py
- (Optional) Run in SSE mode (port 8000):
  python server.py --sse
- Confirm the server is running by checking its logs or issuing a test MCP query against the running process.
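One way to smoke-test the stdio transport is to pipe an MCP initialize message into the process and check for a response. The sketch below constructs that first handshake message; the protocol-version string is an assumption based on a published MCP spec revision and may need updating.

```python
import json

def build_initialize(request_id: int = 0) -> str:
    """Build the JSON-RPC initialize message an MCP client sends first."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # assumed spec revision
            "capabilities": {},
            "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
        },
    })

print(build_initialize())
```

Piping this message, newline-terminated, into `python server.py` should produce an initialize response on stdout if the server is healthy.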
Additional notes
Tips and troubleshooting:
- If you plan to use the SSE streaming mode, ensure network policies allow long-lived connections to port 8000 as configured by the server.
- When integrating with Cline, you can use an stdio connection or switch to an SSE-based feed if you need live updates.
- No environment variables appear to be required for basic operation; configuration via environment variables may be added in a future update.
- If you modify data sources or endpoints, ensure the dependencies in requirements.txt are up to date and that any external data sources remain accessible.
- For debugging, run with verbose logging if available (check server.py options or logging configuration) to surface query errors and data source issues.
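For reference, a Cline MCP configuration entry for the stdio mode might look like the fragment below. This assumes Cline's `mcpServers` settings format; the absolute path is a placeholder you should adjust to your checkout, and the schema may differ across Cline versions.

```json
{
  "mcpServers": {
    "hk-transportation": {
      "command": "python",
      "args": ["/path/to/hk-transportation-mcp-server/server.py"]
    }
  }
}
```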
Related MCP Servers
datagouv
Official data.gouv.fr Model Context Protocol (MCP) server that allows AI chatbots to search, explore, and analyze datasets from the French national Open Data platform, directly through conversation.
sympy
An MCP server for symbolic manipulation of mathematical expressions
okta
The Okta MCP Server is a groundbreaking tool built by the team at Fctr that enables AI models to interact directly with your Okta environment using the Model Context Protocol (MCP). Built specifically for IAM engineers, security teams, and Okta administrators, it implements the MCP specification to help work with Okta entities.
mcp-manager
CLI tool for managing Model Context Protocol (MCP) servers in one place and using them across different clients
linkedin_mcp
A powerful Model Context Protocol server for LinkedIn interactions that enables AI assistants to search for jobs, generate resumes and cover letters, and manage job applications programmatically.
metacoder
Wrapper for multiple AI coders (goose, claude code, codex)