openai-mcp-agent-dotnet
Sample to create an AI Agent using OpenAI models with any MCP server running on Azure Container Apps
claude mcp add --transport stdio azure-samples-openai-mcp-agent-dotnet dotnet openai-mcp-agent-dotnet.dll \
  --env USE_LOGIN="true or false (default true; controls ACA login protection for client app)" \
  --env GPT_MODEL_NAME="azure-openai model name, e.g., gpt-5-mini" \
  --env AOAI_USE_API_KEY="true to enable API key authentication for Azure OpenAI, false for default keyless" \
  --env ENABLE_DEVELOPMENT_MODE="Both | Client | None (for more verbose logs in development)"
How to use
This sample is a .NET-based agent that interacts with an MCP-enabled TypeScript server running a to-do workflow. The agent talks to the remote MCP server to manage AI-assisted to-do tasks via Azure OpenAI, while the Blazor client UI handles authentication and user interaction. To use it, deploy the MCP client and server in a compatible environment (e.g., Azure Container Apps) and configure the client to talk to the server. The agent supports configurable model selection (default gpt-5-mini) and configurable authentication modes. Typical usage: start the server locally or deploy it to your environment, then use the client app to issue prompts such as listing to-dos, adding items, or marking tasks complete; the MCP server orchestrates the TypeScript backend and Azure OpenAI.
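As a concrete illustration, the registration command above could be filled in like this (the values are examples, not defaults mandated by the sample; adjust them to your deployment):

```shell
# Hypothetical concrete invocation of the `claude mcp add` template;
# env values shown here are illustrative choices, not requirements.
claude mcp add --transport stdio azure-samples-openai-mcp-agent-dotnet dotnet openai-mcp-agent-dotnet.dll \
  --env USE_LOGIN="false" \
  --env GPT_MODEL_NAME="gpt-5-mini" \
  --env AOAI_USE_API_KEY="false" \
  --env ENABLE_DEVELOPMENT_MODE="None"
```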
How to install
Prerequisites:
- .NET 9 SDK
- Visual Studio Code with C# Dev Kit (optional but recommended)
- Node.js LTS (for any frontend tooling)
- Docker Desktop or Podman Desktop
- Azure CLI (az) and an Azure subscription
- Clone the repository or download the sample package.
- Restore and build the .NET agent project:
# replace with actual project path if different
cd path/to/openai-mcp-agent-dotnet
# restore dependencies
dotnet restore
# build the project
dotnet build -c Release
- Run the agent locally (from the build output directory, once the DLL is built):
dotnet openai-mcp-agent-dotnet.dll
- Optional: containerize or deploy to Azure following the guidance in the readme, using azd or your preferred deployment method. If deploying via azd, ensure the environment variables for login, model name, and authentication are set as described in the prerequisites and notes.
- Verify connectivity:
- Ensure the MCP client UI is deployed and can authenticate with ACA or your chosen auth method.
- Confirm the client and server communicate with the remote MCP server (TypeScript) and that Azure OpenAI is accessible with the configured model.
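Putting the local-run steps together, a session might look like the following sketch (the environment variable names come from the command template above; the values are example assumptions):

```shell
# Hypothetical local run: build, set the sample's environment variables,
# then start the agent from the build output. Values are illustrative.
cd path/to/openai-mcp-agent-dotnet
dotnet restore
dotnet build -c Release

export USE_LOGIN="false"                  # skip ACA login protection locally
export GPT_MODEL_NAME="gpt-5-mini"        # default model; override as needed
export AOAI_USE_API_KEY="false"           # keyless auth against Azure OpenAI
export ENABLE_DEVELOPMENT_MODE="Client"   # more verbose client-side logs

dotnet openai-mcp-agent-dotnet.dll
```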
Additional notes
- By default, the MCP client is protected by ACA built-in authentication; you can disable this by setting USE_LOGIN to false during deployment.
- The agent uses gpt-5-mini by default; you can override with GPT_MODEL_NAME in your environment.
- If you enable API key authentication for Azure OpenAI (AOAI_USE_API_KEY true), ensure you provide the necessary credentials in your environment.
- Development mode settings (ENABLE_DEVELOPMENT_MODE) can help surface more verbose logs for troubleshooting.
- When deploying to Azure, the MCP server and client should be in the same resource group or properly networked to allow secure communication.
- Review the remote MCP server (TypeScript) docs for any endpoint access requirements and model deployment details.
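If you deploy with azd, the settings discussed in the notes above could be supplied before provisioning; `azd env set` and `azd up` are standard azd commands, while the variable names are the ones this sample documents (values shown are examples):

```shell
# Hypothetical azd deployment configuration; variable values are examples.
azd env set USE_LOGIN false                  # disable ACA built-in auth on the client
azd env set GPT_MODEL_NAME gpt-5-mini        # override the default model if desired
azd env set AOAI_USE_API_KEY false           # keep keyless Azure OpenAI auth
azd env set ENABLE_DEVELOPMENT_MODE Both     # verbose logs on client and server
azd up
```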
Related MCP Servers
context-space
Ultimate Context Engineering Infrastructure, starting from MCPs and Integrations
multimodal-agents-course
An MCP Multimodal AI Agent with eyes and ears!
python-client
MCP server for querying the technical documentation of mainstream agent frameworks (supports both stdio and sse transports), covering langchain, llama-index, autogen, agno, openai-agents-sdk, mcp-doc, camel-ai, and crew-ai
n8n-workflow-builder
MCP server that lets an LLM in agent mode build n8n workflows for you
zerodha
Zerodha MCP Server & Client - AI Agent (w/Agno & w/Google ADK)
AIFoundry Connector-FabricGraphQL
MCP Client and Server apps to demo integration of Azure OpenAI-based AI agent with a Data Warehouse, exposed through GraphQL in Microsoft Fabric.