azure-ai-travel-agents
A robust enterprise application sample (deployed on Azure Container Apps) that leverages MCP and multiple AI agents orchestrated by LangChain.js, LlamaIndex.TS, and the Microsoft Agent Framework.
claude mcp add --transport stdio azure-samples-azure-ai-travel-agents -- python -m echo_ping
How to use
The azure-ai-travel-agents MCP suite provides multiple specialized MCP servers that together form an orchestration workflow for travel planning. Each server exposes one capability as a tool that MCP clients can invoke:
- Customer Query Understanding extracts user preferences from natural-language input.
- Destination Recommendation suggests suitable travel options based on those preferences.
- Itinerary Planning generates a detailed travel plan.
- Echo Ping is a simple round-trip MCP server for testing connectivity.
By combining these servers with LangChain.js, LlamaIndex.TS, or the Microsoft Agent Framework, the application coordinates multiple agents to answer user queries with coherent itineraries and destination suggestions. Use MCP clients to route each request to the appropriate server and chain responses across servers to build end-to-end travel experiences.
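The chaining described above can be sketched as a sequence of MCP `tools/call` requests. This is a hedged illustration: the tool names (`extract_preferences`, `recommend_destinations`, `plan_itinerary`) and argument shapes are assumptions for the sketch, not the repo's actual identifiers.

```python
import json

def make_tool_call(request_id, tool, arguments):
    """Build a JSON-RPC 2.0 `tools/call` request, as MCP clients send them."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }

# Chain: query understanding -> destination recommendation -> itinerary.
# Tool names and arguments below are hypothetical placeholders.
steps = [
    ("extract_preferences", {"query": "A week of hiking in the Alps in June"}),
    ("recommend_destinations", {"preferences": "<output of step 1>"}),
    ("plan_itinerary", {"destination": "<output of step 2>"}),
]

requests = [make_tool_call(i + 1, tool, args) for i, (tool, args) in enumerate(steps)]
print(json.dumps(requests[0], indent=2))
```

In the real application, an orchestrator (LangChain.js, LlamaIndex.TS, or the Microsoft Agent Framework) dispatches each request to the matching server and feeds the previous response into the next call's arguments.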
How to install
Prerequisites:
- Docker (for local previews and the Docker Model Runner workflow)
- Node.js v22+ (for the Node.js MCP server variant)
- Python 3.9+ (for Python MCP servers)
- Java JDK 11+ (for Java-based MCP servers)
- .NET SDK (if you plan to run the .NET MCP server)
Step-by-step installation:
- Install prerequisites on your machine according to your platform (Windows/macOS/Linux).
- Clone the repository:
git clone https://github.com/Azure-Samples/azure-ai-travel-agents.git
cd azure-ai-travel-agents
- Install Node.js MCP variant (Destination Recommendation):
cd path/to/destination-recommendation
npm install
# or if using pnpm/yarn, as appropriate
npm run build
- Install Python MCP variants (Customer Query Understanding and Echo Ping), from each server's directory:
pip install -r requirements.txt
- Install Java MCP variant (Itinerary Planning):
# If a build is provided
mvn clean package
- Run MCP servers (examples):
# Node.js server
node dist/server.js
# Python servers
python -m customer_query_understanding
python -m echo_ping
# Java server
java -jar itinerary-planning.jar
- Verify MCP client connectivity and ensure the servers appear in the MCP registry. For local testing, you can use MCP clients to call each server and confirm responses.
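The connectivity check above can be sketched as a minimal stdio round-trip. The child process below just echoes one line back, standing in as a stub for the real Echo Ping server (`python -m echo_ping`), which speaks JSON-RPC over stdin/stdout the same way; swap in the real command to test it.

```python
import json
import subprocess
import sys

# Stand-in for an MCP stdio server: reads one line, writes it back unchanged.
ECHO_CHILD = "import sys; sys.stdout.write(sys.stdin.readline())"

def stdio_ping(command):
    """Send one JSON-RPC message over a server's stdin and read the reply."""
    proc = subprocess.Popen(
        command, stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True
    )
    request = {"jsonrpc": "2.0", "id": 1, "method": "ping"}
    reply, _ = proc.communicate(json.dumps(request) + "\n")
    return json.loads(reply)

# Substitute [sys.executable, "-m", "echo_ping"] to target the real server.
response = stdio_ping([sys.executable, "-c", ECHO_CHILD])
print(response)
```

A healthy server should answer with a JSON-RPC response carrying the same `id`; with the echo stub the request simply comes back verbatim.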
Additional notes
Notes and tips:
- The repository supports multi-language MCP servers; ensure each server's dependencies are installed according to its language.
- If you are previewing locally, Docker Model Runner can help emulate the MCP environment and containerized execution.
- Review the adventure.config.json and llms.txt files to understand how LLMs are guided to utilize these MCP tools effectively.
- If you encounter port conflicts, adjust the server port mappings in the run commands or container configurations.
- Ensure environment variables for any API keys or credentials are set as needed by each server (e.g., MCP_AUTH_TOKEN, API_KEYS).
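The last two notes (port conflicts and required credentials) can be checked before launch with a small preflight script. This is a sketch: `MCP_AUTH_TOKEN` is just the example variable from the notes above, and each server documents its own required variables and default port.

```python
import os
import socket

# Example only: replace with the variables your servers actually require.
REQUIRED_ENV = ["MCP_AUTH_TOKEN"]

def missing_env(names):
    """Return the required environment variables that are not set."""
    return [n for n in names if not os.environ.get(n)]

def port_free(port, host="127.0.0.1"):
    """True if nothing is listening on host:port, i.e. safe to bind."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        return s.connect_ex((host, port)) != 0

print("missing env vars:", missing_env(REQUIRED_ENV))
print("port 3000 free:", port_free(3000))
```

If a port is taken, change the server's port mapping in the run command or container configuration rather than killing the other process.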
Related MCP Servers
mastra
From the team behind Gatsby, Mastra is a framework for building AI-powered applications and agents with a modern TypeScript stack.
voltagent
An AI agent engineering platform built on an open-source TypeScript AI agent framework.
microsandbox
Open-source, self-hosted sandboxes for AI agents.
mcp-context-forge
An AI Gateway, registry, and proxy that sits in front of any MCP, A2A, or REST/gRPC APIs, exposing a unified endpoint with centralized discovery, guardrails and management. Optimizes Agent & Tool calling, and supports plugins.
mcpcan
MCPCAN is a centralized management platform for MCP services. It deploys each MCP service in its own container, and supports container monitoring and MCP service token verification, addressing security risks and enabling rapid deployment of MCP services. It supports the SSE, STDIO, and Streamable HTTP access protocols.
sdk-typescript
A model-driven approach to building AI agents in just a few lines of code.