
snippy

🧩 Build AI-powered MCP Tools with Azure Functions, Durable Agents & Cosmos vector search. Features orchestrated multi-agent workflows using OpenAI.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio azure-samples-snippy -- docker run -i azure-samples/snippy

How to use

Snippy exposes MCP-compatible tools as Azure Functions, so AI assistants such as GitHub Copilot can discover and invoke them. The server acts as a bridge between your code snippet repository and the MCP tooling surface: it can save, retrieve, and analyze code snippets, generate style guides, and create comprehensive documentation. Through MCP, the tools save_snippet, get_snippet, code_style, deep_wiki, and generate_comprehensive_documentation become discoverable endpoints an AI agent can call to index snippets, retrieve them by name, produce language-specific style guides, and orchestrate multi-agent workflows that generate deeper documentation artifacts.
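As a rough sketch of what an agent's tool invocation looks like on the wire, an MCP tools/call request is a JSON-RPC 2.0 message. The argument names (name, code) below are hypothetical; check the save_snippet tool's actual schema via tools/list:

```python
import json

# Sketch of an MCP JSON-RPC 2.0 request invoking the save_snippet tool.
# The snippet name/code argument keys shown here are assumptions, not
# the verified schema of Snippy's save_snippet tool.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "save_snippet",
        "arguments": {
            "name": "hello-azure",                   # key used later by get_snippet
            "code": "def main():\n    print('hi')",  # body to embed and index
        },
    },
}

# Serialize for sending over the MCP transport (stdio or HTTP).
payload = json.dumps(request)
print(payload)
```

An AI client sends this over the configured transport and receives a JSON-RPC response containing the tool's result.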

How to install

Prerequisites:

  • An active Azure subscription (for full deployment options) or a local development environment for emulation
  • Docker installed and running
  • azd (Azure Developer CLI) installed for infrastructure deployment
  • Python 3.11 and Node.js 18+ if you plan to run local functions or emulators
  1. Prepare your environment
  • Install the required tools (Docker, azd, Python, Node.js)
  • Authenticate with Azure if you intend to deploy: az login
  2. Obtain the Snippy repository
  3. Initialize the project with azd
  • If using the provided azd template, initialize the project: azd init --template Azure-Samples/snippy
  • Authenticate for the deployment environment: azd auth login
  4. Deploy locally or to Azure
  • To deploy locally using the Docker emulators and azd: azd up
  • To deploy to Azure (cloud): azd up
  5. Verify the MCP endpoint
  • After deployment, note the Function App URL and the MCP endpoint surfaced by azd. Use these endpoints to interact with the MCP tools (save_snippet, get_snippet, code_style, deep_wiki, generate_comprehensive_documentation).
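A quick way to verify the endpoint after deployment is to send a JSON-RPC tools/list request and confirm the five tools are enumerated. This is a minimal sketch: the URL, including the /runtime/webhooks/mcp path, is an assumption — substitute whatever endpoint azd actually prints after azd up:

```python
import json
import urllib.request

# Hypothetical endpoint: azd prints the real Function App URL after `azd up`.
# The /runtime/webhooks/mcp path is an assumption, not a verified route.
MCP_ENDPOINT = "https://<your-function-app>.azurewebsites.net/runtime/webhooks/mcp"


def build_tools_list_request(endpoint: str) -> urllib.request.Request:
    """Build a JSON-RPC 2.0 tools/list request to enumerate the exposed MCP tools."""
    body = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"}).encode()
    return urllib.request.Request(
        endpoint,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_tools_list_request(MCP_ENDPOINT)
print(req.get_method(), req.full_url)
# Send with urllib.request.urlopen(req) once the placeholder URL is replaced;
# the response's result.tools array should list save_snippet, get_snippet, etc.
```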

Notes:

  • For local development, you can run and test using Docker-based emulators before deploying to Azure.
  • The exact startup command for the MCP-enabled function app may vary by environment; the Docker-based run command example shown in mcp_config can be adjusted to your local or CI/CD setup.

Additional notes

Tips and considerations:

  • Security: Snippy uses secure service-to-service authentication with a managed identity setup in the deployment. In production, restrict inbound traffic using Private Endpoints and VNet integration.
  • Dependencies: Ensure your environment has Python 3.11 and Node.js 18+ if you intend to run function runtimes locally. Azure OpenAI requires appropriate region support and quotas.
  • MCP tooling: Tools exposed include save_snippet (index and save snippet with vector embeddings), get_snippet (retrieve by name), code_style (generate code style guides), deep_wiki (create wiki docs from code), and generate_comprehensive_documentation (orchestrates multi-agent workflows for deep documentation).
  • Observability: Use the Durable Task Scheduler (DTS) dashboard to monitor orchestrations in real time (local: localhost:8082; cloud: via the Azure portal).
  • Upgrades: When updating code, re-run azd up to refresh infrastructure and deployments.
  • Troubleshooting: Refer to the lab/tutorial resources in the Snippy repository for common issues related to emulators, IAM permissions, and OpenAI embeddings.
  • Environment variables: Depending on deployment, you may need to configure OPENAI_API_KEY or Azure OpenAI credentials, Cosmos DB connection details, and storage accounts in your environment or via Azure Key Vault.
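For local runs, these settings typically live in the Function App's local.settings.json. The fragment below is a sketch: IsEncrypted, Values, FUNCTIONS_WORKER_RUNTIME, and AzureWebJobsStorage are standard Azure Functions keys, but the OpenAI and Cosmos variable names are placeholders — use the names the Snippy repository actually reads:

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AZURE_OPENAI_ENDPOINT": "https://<your-openai-resource>.openai.azure.com/",
    "AZURE_OPENAI_KEY": "<key-or-prefer-managed-identity>",
    "COSMOS_ENDPOINT": "https://<your-cosmos-account>.documents.azure.com:443/"
  }
}
```

In production, prefer managed identity and Azure Key Vault references over inline keys, as noted in the Security tip above.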
