
ai-workshop

Building GenAI Apps in C#: AI Templates, GitHub Models, Azure OpenAI & More

Installation
Run this command in your terminal to add the MCP server to Claude Code:
Command

claude mcp add --transport stdio dotnet-presentations-ai-workshop -- \
  docker run -i \
  -e ACP_ENV=DEVELOPMENT \
  -e MCP_LOG_LEVEL=info \
  ai-workshop-server

Set ACP_ENV to DEVELOPMENT or PRODUCTION as appropriate. Note that the variables are passed with docker's -e flag, after the -- separator, so they reach the server inside the container rather than the docker CLI itself.
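Once added, the registration can be confirmed with the Claude Code CLI's own MCP subcommands (the server name matches the one used above):

```shell
# List registered MCP servers and show this one's details.
# Guarded so the snippet is a no-op on machines without the claude CLI.
if command -v claude >/dev/null 2>&1; then
  claude mcp list
  claude mcp get dotnet-presentations-ai-workshop || true
fi
```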

How to use

This MCP server hosts the .NET AI Workshop's Model Context Protocol (MCP) workflow, covering Parts 7-9 (MCP servers). It provides a runnable server image that orchestrates the MCP-enabled tooling so you can test how models, vectors, and prompts are combined in a .NET AI scenario. The server exposes a standardized interface for loading an MCP-enabled model provider, a vector store, and a prompt context, letting you experiment with GitHub Models in development and Azure OpenAI in production. Use the included tooling to validate how contextual information is retrieved from the vector store and fed into the model to generate informed responses within the workshop's AI-assisted workflow.

To interact with the MCP server, run the container (as defined in the mcp_config) and point your MCP client or development scripts at the server endpoint. Typical usage involves configuring the MCP client to request a context-rich response by supplying a user query along with a set of context documents or embeddings. The server coordinates the flow: ingesting inputs, retrieving relevant vectors from the vector store, and returning a contextually aware AI-generated response. This setup mirrors the lab’s MCP flow from development (GitHub Models) to production (Azure OpenAI), enabling you to observe how the system handles model switching, vector search, and prompt augmentation.
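The flow above can be sketched concretely. Over the stdio transport, an MCP client sends newline-delimited JSON-RPC 2.0 messages; the initialize handshake below follows the MCP specification, while the exact capabilities this workshop server advertises are not confirmed here:

```shell
# Build a minimal MCP initialize request (JSON-RPC 2.0, per the MCP spec).
INIT='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-client","version":"0.1"}}}'

# Sanity-check that the message is well-formed JSON before sending it.
echo "$INIT" | python3 -m json.tool > /dev/null && echo "request OK"

# To exercise the server, pipe the message into the container's stdin:
#   echo "$INIT" | docker run -i ai-workshop-server
```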

How to install

Prerequisites:

  • Docker Desktop or Podman installed and running
  • Access to the ai-workshop MCP server image (or a build pipeline that produces the ai-workshop-server Docker image)
  • Sufficient permissions to run containers on your host

Step-by-step:

  1. Verify prerequisites

    • docker version
    • network access to any required model endpoints
  2. Pull or build the MCP server image

    • If using a prebuilt image: docker pull ai-workshop-server
    • If building locally: docker build -t ai-workshop-server . (from the directory containing the Dockerfile)
  3. Run the MCP server (as defined in mcp_config)

    • docker run -i ai-workshop-server
    • Ensure environment variables are supplied if needed (see mcp_config.env)
  4. Validate the server startup

    • Check container logs for readiness messages indicating MCP server is listening on the expected port (e.g., 8080 or 5000)
    • Confirm network accessibility from your MCP client
  5. Connect an MCP client

    • With the stdio transport from the install command, the client launches the container itself and communicates over stdin/stdout; if your build exposes an HTTP endpoint instead, point the client at it (e.g., http://localhost:8080/mcp)
    • Send a test MCP request to verify a contextual response flow
  6. Optional: configure production endpoints

    • If integrating with Azure OpenAI, ensure route and credentials are supplied through environment variables or configuration files as required by your deployment environment
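The step-by-step above can be condensed into a short smoke-test script. The image name and log check come from the steps; the detached run and container name are illustrative, and the body is guarded so it does nothing on hosts without Docker:

```shell
# Condensed sketch of steps 1-4 (assumes a Dockerfile in the current directory).
# Note: a pure stdio server may exit when no client is attached; use
# `docker run -i` instead of -d when driving it from an MCP client.
IMAGE=ai-workshop-server

if command -v docker >/dev/null 2>&1; then
  docker version > /dev/null                # step 1: Docker is installed and running
  docker build -t "$IMAGE" .                # step 2: build the server image locally
  docker run -d --name mcp-smoke "$IMAGE"   # step 3: start the server detached
  docker logs mcp-smoke | head -n 20        # step 4: look for a readiness message
fi
```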

Additional notes

Tips and common issues:

  • If the container fails to start, inspect logs for missing dependencies or misconfigured environment variables.
  • Ensure the vector DB (e.g., Qdrant) endpoint is reachable if the MCP server expects a local or remote vector store.
  • When switching between development (GitHub Models) and production (Azure OpenAI), confirm that the MCP provider configuration points to the correct model endpoint and that authentication tokens or API keys are provisioned.
  • Use environment variables to toggle between Development and Production modes without rebuilding the image.
  • Verify compatibility of .NET AI Workshop prerequisites with your local environment if you attempt to run any integration tests outside the container.
  • If you need to run locally, you can adapt the Docker command to mount volumes for embeddings or product data during development.
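The last two tips can be combined into one invocation. ACP_ENV and MCP_LOG_LEVEL come from the install command above; the /data mount path inside the container is an assumption, not confirmed by the image:

```shell
# Toggle Development/Production via an environment variable and mount local
# data for embeddings during development. Guarded for hosts without Docker.
MODE=${MODE:-DEVELOPMENT}    # override with MODE=PRODUCTION

if command -v docker >/dev/null 2>&1; then
  docker run -i \
    -e ACP_ENV="$MODE" \
    -e MCP_LOG_LEVEL=info \
    -v "$(pwd)/data:/data" \
    ai-workshop-server
fi
```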
