azure-ai-agent-demos

Azure AI Foundry Agent service demos and resources

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio jcentner-azure-ai-agent-demos \
  --env MCP_PORT=8000 \
  --env MCP_BIND_ADDRESS=127.0.0.1 \
  -- python -m azure_ai_agent_demos.server

MCP_PORT is the port the server listens on (8000 shown here as an example; common defaults are 8000 or 8080) and MCP_BIND_ADDRESS is the address it binds to (use 0.0.0.0 to listen on all interfaces). Adjust both to your environment.

How to use

This repository provides demonstrations of MCP-enabled Azure AI Foundry Agents. The demos show how to run a local MCP server, inspect it with an MCP inspector client, and connect an agent to it with authentication at runtime. Core components include a local MCP server demo, an inspector workflow, and an agent that can use tools such as Bing Search grounding and OpenAPI schema integration. Use the mcp_local_server_agent demo to experiment with a self-contained MCP server and an agent that talks to it, so you can see the end-to-end flow from server startup to agent tool use.

Once the MCP server is running, you can connect an agent to it and issue queries that trigger tool use like Bing-backed grounding or OpenAPI-based operations. The examples are designed to illustrate how an agent can discover tools, invoke them through the MCP protocol, and receive structured responses that integrate with the agent's reasoning loop. This makes it easier to prototype agent behaviors in a controlled environment before moving to enterprise-scale deployments.
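As an illustration of what "invoke them through the MCP protocol" means on the wire: MCP messages are JSON-RPC 2.0, and an agent typically sends a tools/list request to discover tools, then tools/call to invoke one. A minimal sketch of building those messages with the standard library (the tool name and arguments below are hypothetical, not taken from this repo):

```python
import json

def make_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request dict, the framing MCP uses."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return msg

# Step 1: the agent asks the server which tools it exposes.
list_req = make_request(1, "tools/list")

# Step 2: it invokes a discovered tool by name with JSON arguments.
# "bing_grounding_search" is a made-up example name for illustration.
call_req = make_request(2, "tools/call", {
    "name": "bing_grounding_search",
    "arguments": {"query": "Azure AI Foundry"},
})

print(json.dumps(list_req))
print(json.dumps(call_req))
```

The structured result that comes back feeds directly into the agent's reasoning loop, which is what makes the request/response shape worth knowing when debugging with an inspector.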

How to install

Prerequisites:

  • Python 3.8+ and virtual environment support
  • Access to a shell/terminal

Installation steps:

  1. Clone or download the repository.
  2. Create and activate a virtual environment:
       python -m venv venv
       source venv/bin/activate    # On Windows use: venv\Scripts\activate
  3. Install dependencies (if a requirements.txt is provided): pip install -r requirements.txt
  4. Run the MCP server (example): python -m azure_ai_agent_demos.server

    If the module name differs, adjust it to match the actual server entry point in the repo.

  5. Run the agent (example): python -m azure_ai_agent_demos.agent --server http://localhost:port
  6. If needed, configure environment variables for authentication and tool endpoints as described in the repo's docs.
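For step 6, one common pattern is for the server to read its settings from the environment. A hedged sketch, assuming the server honors the MCP_PORT and MCP_BIND_ADDRESS variables used in the install command above (check the repo's docs for the actual variable names and defaults):

```python
import os

def load_server_config():
    """Read MCP server settings from the environment, with fallbacks."""
    return {
        # 8000 is assumed here as a common default, not confirmed by the repo.
        "port": int(os.environ.get("MCP_PORT", "8000")),
        # Bind to localhost unless told otherwise; 0.0.0.0 exposes all interfaces.
        "bind_address": os.environ.get("MCP_BIND_ADDRESS", "127.0.0.1"),
    }

config = load_server_config()
print(config)
```

Credentials for external tools (e.g., a Bing Search API key) would follow the same environment-variable pattern rather than being hard-coded.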

Additional notes

This is labeled as a work-in-progress repo. Replace placeholder module paths (azure_ai_agent_demos.server, azure_ai_agent_demos.agent) with the actual server/agent entry points if they differ in your setup. Ensure network connectivity to any external services used by tools (e.g., Bing Search) and make sure API keys or credentials are provided via environment variables. If you encounter port conflicts, change MCP_PORT in your environment or command line and update the client configuration accordingly.
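To diagnose the port-conflict case mentioned above before changing MCP_PORT, a quick standard-library check (purely illustrative) can tell you whether something is already listening on a port:

```python
import socket

def port_in_use(port, host="127.0.0.1"):
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(0.5)
        # connect_ex returns 0 on a successful connection, i.e. port taken.
        return sock.connect_ex((host, port)) == 0

if port_in_use(8000):
    print("Port 8000 is taken; set MCP_PORT to a free port.")
else:
    print("Port 8000 looks free.")
```

If the port is taken, export a different MCP_PORT and update the client configuration to match.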
