
meai-azure-ai-foundry-mcp-sample

.NET also makes it easy to build AI-infused apps. This repository provides several sample apps using Microsoft.Extensions.AI (MEAI), Azure AI Foundry, Foundry Local, and an MCP server/client.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio devkimchi-meai-azure-ai-foundry-mcp-sample dotnet run --project ./src/McpTodo.ServerApp

How to use

This sample provides a .NET-based MCP server/client workflow for MEAI with Azure AI Foundry, Foundry Local, and various LLM integrations (OpenAI, Azure OpenAI, GitHub Models, Anthropic, Google Vertex AI). The server component exposes an MCP-compliant interface that coordinates prompts, models, and memory across clients, while the client apps exercise the same capabilities locally. Typical usage is: configure secrets for your chosen AI providers, select the desired ChatClient in appsettings.json, then run the MCP server and, in another terminal, a client app to exchange messages or tasks with the configured AI providers over MCP.
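As an illustration, the ChatClient switch described above might look like this in the client's appsettings.json (a minimal sketch; only the ChatClient key is taken from the description above, and the accepted values are the provider names listed in the notes below — check the repository's actual appsettings.json for the authoritative shape):

```json
{
  // Selects the LLM backend the client app talks to.
  // Possible values per this sample: openai, foundry, local, anthropic, google.
  "ChatClient": "openai"
}
```

.NET's JSON configuration provider tolerates comments, so annotating the file like this is safe in appsettings.json.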

How to install

Prerequisites

  • .NET 9 SDK
  • Git
  • (Optional) Foundry Local if you plan to test Foundry Local flows

Installation and setup

  1. Clone the repository:

     git clone https://github.com/devkimchi/meai-azure-ai-foundry-mcp-sample.git
     cd meai-azure-ai-foundry-mcp-sample

  2. Restore and build the solution:

     dotnet restore
     dotnet build

  3. Prepare the MCP server configuration. The MCP server is configured to run via the MCP server app project; no extra npm/pipx steps are required for this .NET sample.

  4. Run the MCP server:

     dotnet run --project ./src/McpTodo.ServerApp

  5. In a separate terminal, run the MCP client app compatible with your server:

     dotnet watch run --project ./src/McpTodo.ClientApp

Prerequisites recap

  • Install .NET 9 SDK from the official site
  • Ensure network access to your chosen AI providers (OpenAI, Azure OpenAI, GitHub Models, Anthropic, Google Vertex AI)
  • Have API keys ready and configured via user secrets for the client project as shown in the README

Additional notes

  • The sample relies on user secrets to store API keys. Update ConnectionStrings entries (openai, foundry, anthropic, google) in the Meai.ClientApp and Foundry.ClientApp projects as needed.
  • To switch the LLM, change the ChatClient value in appsettings.json (e.g., openai, foundry, local, anthropic, google).
  • If you plan to use Foundry Local, ensure you have Foundry Local CLI installed and a model added to Foundry Local before running the Foundry-related samples.
  • For Azure AI Foundry integration, provide Endpoint and Key via user secrets as demonstrated in the README.
  • If you encounter compatibility issues, verify that the environment variables and secrets match the provider you intend to use and that network access to provider endpoints is allowed in your environment.
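The user-secrets configuration mentioned in the notes above can be set with the dotnet user-secrets CLI. A sketch, assuming the ConnectionStrings entry names listed above (openai, foundry, anthropic, google); the exact value format each provider expects may differ, so treat the values here as placeholders:

```shell
# Run from the client project directory (e.g. ./src/Meai.ClientApp).
# Initializes a user-secrets store for the project if it does not have one yet.
dotnet user-secrets init

# Key paths follow the ConnectionStrings entries named in the notes above;
# replace the placeholder values with your real endpoints/keys.
dotnet user-secrets set "ConnectionStrings:openai" "Key=<your-openai-key>"
dotnet user-secrets set "ConnectionStrings:foundry" "Endpoint=<your-endpoint>;Key=<your-key>"

# Verify what is stored (values are kept outside the repo, never committed).
dotnet user-secrets list
```

User secrets live in your user profile rather than the working tree, which is why the sample prefers them over putting keys in appsettings.json.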
