azure-ai-agent-demos
Azure AI Foundry Agent service demos and resources
```
claude mcp add --transport stdio jcentner-azure-ai-agent-demos python -m azure_ai_agent_demos.server \
  --env MCP_PORT="Port for the MCP server to listen on (default often 8000 or 8080)" \
  --env MCP_BIND_ADDRESS="Bind address for the MCP server (e.g., 0.0.0.0 for all interfaces)"
```
How to use
This repository provides demonstrations of MCP-enabled Azure AI Foundry Agents. The demos show how to run a local MCP server, inspect it with an MCP inspector client, and connect an agent to it with authentication at runtime. Core components include a local MCP server demo, an inspector workflow, and an agent that can use tools such as Grounding with Bing Search and OpenAPI-defined operations. Use the mcp_local_server_agent demo to experiment with a self-contained MCP server and an agent that talks to it, so you can see the end-to-end flow from server startup to agent tool usage.
Once the MCP server is running, you can connect an agent to it and issue queries that trigger tool use like Bing-backed grounding or OpenAPI-based operations. The examples are designed to illustrate how an agent can discover tools, invoke them through the MCP protocol, and receive structured responses that integrate with the agent's reasoning loop. This makes it easier to prototype agent behaviors in a controlled environment before moving to enterprise-scale deployments.
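To make the discover-and-invoke flow concrete, here is a minimal sketch of the JSON-RPC 2.0 messages an MCP client exchanges with a server. The `tools/list` and `tools/call` method names come from the MCP specification; the tool name and arguments shown are hypothetical placeholders, not tools this repo necessarily exposes.

```python
import json

def make_request(req_id, method, params=None):
    """Build a JSON-RPC 2.0 request string, the wire format MCP uses."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# Step 1: ask the server which tools it exposes.
list_tools = make_request(1, "tools/list")

# Step 2: invoke a tool by name with arguments
# (the tool name "search_web" is a made-up example).
call_tool = make_request(2, "tools/call", {
    "name": "search_web",
    "arguments": {"query": "Azure AI Foundry"},
})
```

The server replies to each request with a response carrying the same `id`, which is how the agent matches structured tool results back into its reasoning loop.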
How to install
Prerequisites:
- Python 3.8+ and virtual environment support
- Access to a shell/terminal
Installation steps:
- Clone or download the repository.
- Create and activate a virtual environment:
  python -m venv venv
  source venv/bin/activate  # On Windows use: venv\Scripts\activate
- Install dependencies (if a requirements.txt is provided):
  pip install -r requirements.txt
- Run the MCP server (example):
  python -m azure_ai_agent_demos.server
  If the module name differs, adjust it to match the actual server entry point.
- Run the agent (example):
  python -m azure_ai_agent_demos.agent --server http://localhost:port
- If needed, configure environment variables for authentication and tool endpoints as described in the repo's docs.
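The MCP_PORT and MCP_BIND_ADDRESS variables from the command above can be read in code like this. This is an illustrative sketch, not the repo's actual configuration code; the `server_config` helper and the fallback defaults (8000, 127.0.0.1) are assumptions.

```python
import os

def server_config():
    """Read MCP server settings from the environment, with assumed defaults
    (8000 and 127.0.0.1) used when the variables are unset."""
    return {
        "port": int(os.environ.get("MCP_PORT", "8000")),
        "bind_address": os.environ.get("MCP_BIND_ADDRESS", "127.0.0.1"),
    }

cfg = server_config()
```

Reading configuration from the environment keeps credentials and deployment-specific values out of the codebase, which matches the env-var approach the install steps describe.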
Additional notes
This is labeled as a work-in-progress repo. Replace placeholder module paths (azure_ai_agent_demos.server, azure_ai_agent_demos.agent) with the actual server/agent entry points if they differ in your setup. Ensure network connectivity to any external services used by tools (e.g., Bing Search) and make sure API keys or credentials are provided via environment variables. If you encounter port conflicts, change MCP_PORT in your environment or command line and update the client configuration accordingly.
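If you suspect a port conflict, a quick check like the following can confirm whether something is already listening before you change MCP_PORT. This is a generic stdlib sketch, not part of the repo; the `port_is_free` helper name is made up for illustration.

```python
import socket

def port_is_free(port, host="127.0.0.1"):
    """Return True if nothing is currently listening on host:port.

    connect_ex returns 0 only when a connection succeeds, i.e. when
    some process is already accepting connections on that port.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.settimeout(1.0)
        return sock.connect_ex((host, port)) != 0
```

For example, `port_is_free(8000)` returning False suggests the default MCP port is taken and you should pick another via MCP_PORT.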
Related MCP Servers
activepieces
AI Agents & MCPs & AI Workflow Automation • (~400 MCP servers for AI agents) • AI Automation / AI Agent with MCPs • AI Workflows & AI Agents • MCPs for AI Agents
google_ads_mcp
The Google Ads MCP Server is an implementation of the Model Context Protocol (MCP) that enables Large Language Models (LLMs), such as Gemini, to interact directly with the Google Ads API.
mcp
🤖 Taskade MCP · Official MCP server and OpenAPI to MCP codegen. Build AI agent tools from any OpenAPI API and connect to Claude, Cursor, and more.
mcp-js
MCP server that exposes YepCode processes as callable tools for AI platforms. Securely connect AI assistants to your YepCode workflows, APIs, and automations.
mcpresso
TypeScript framework to build robust, agent-ready MCP servers around your APIs.
AI-SOC-Agent
Blackhat 2025 presentation and codebase: AI SOC agent & MCP server for automated security investigation, alert triage, and incident response. Integrates with ELK, IRIS, and other platforms.