terragrunt
Model Context Protocol (MCP) server providing AI assistants with comprehensive Terragrunt documentation, code examples, and intelligent tooling for infrastructure-as-code development
claude mcp add --transport stdio omattsson-terragrunt-mcp-server npx -y omattsson-terragrunt-mcp-server
How to use
This Terragrunt MCP Server provides centralized access to Terragrunt documentation, function references, CLI guidance, and configuration best practices. It exposes a suite of tools for searching, browsing, and retrieving Terragrunt content, including an indexed documentation search, function references, CLI help, and HCL configuration references. The server also supports backend configuration generation and guidance to accelerate safe Terragrunt usage in CI/CD pipelines and Terraform workflows.

Use the search_docs tool to find topics or code examples, function_reference to learn about Terragrunt's built-in functions, cli_reference for command usage, and get_hcl_config_reference to understand the HCL blocks used in terragrunt.hcl files. The build_config tool helps generate Terragrunt configurations from templates and backend settings. The observability tool provides server metrics for monitoring your integration with AI assistants such as Copilot in VS Code.
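As an illustration, a search_docs invocation from an MCP client corresponds to a standard tools/call request. The argument names below (query, mode, detailLevel) are assumptions inferred from the tool descriptions above, not a confirmed schema — check the server's advertised tool schema via tools/list for the exact shape:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "search_docs",
    "arguments": {
      "query": "remote_state s3 backend",
      "mode": "search",
      "detailLevel": "summary"
    }
  }
}
```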
How to install
Prerequisites:
- Node.js (14.x or newer) and npm installed on your machine
- Internet access to fetch the MCP server package via npm/pnpm/yarn

1. Install Node.js and npm if not already installed. For example, on macOS with Homebrew:
   - brew install node
2. Install and run the Terragrunt MCP Server using npx (recommended):
   - npx -y omattsson-terragrunt-mcp-server
3. Alternatively, install it locally in your project (optional):
   - npm install -D omattsson-terragrunt-mcp-server
   - npx omattsson-terragrunt-mcp-server

Once running, connect your MCP client to the server process (communication is over stdio, as in the claude mcp add command above) and start issuing tool calls such as search_docs, function_reference, cli_reference, and get_hcl_config_reference.
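For clients configured through a JSON file rather than the claude mcp add command (for example, Claude Desktop's claude_desktop_config.json), an equivalent entry might look like the following sketch, reusing the same npx invocation shown above:

```json
{
  "mcpServers": {
    "terragrunt": {
      "command": "npx",
      "args": ["-y", "omattsson-terragrunt-mcp-server"]
    }
  }
}
```

The server entry key ("terragrunt" here) is arbitrary; the client launches the command and exchanges MCP messages with it over stdio.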
Additional notes
- The server caches Terragrunt documentation to improve performance and resilience. If you encounter stale content, you may need to trigger a refresh or rebuild the cache.
- The search_docs tool supports modes such as search, list, section, and examples. Use appropriate detailLevel to balance response size and completeness.
- For production deployments, consider configuring a persistent cache location and enabling any built-in security controls exposed by the MCP server package.
- If you run behind a proxy or a corporate network, ensure your environment variables for HTTP(S) proxy are set (e.g., http_proxy, https_proxy).
- Review the Advanced Backend Templates and documentation links in the repo to tailor the configuration generator (build_config) for your organization's backends and templates.
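As a rough sketch of the kind of output build_config can help produce, a minimal terragrunt.hcl with an S3 remote state backend typically looks like this. The bucket, region, and lock-table values are placeholders, and the exact structure depends on the backend template you select:

```hcl
remote_state {
  backend = "s3"
  generate = {
    path      = "backend.tf"
    if_exists = "overwrite_terragrunt"
  }
  config = {
    bucket         = "my-terraform-state"   # placeholder bucket name
    key            = "${path_relative_to_include()}/terraform.tfstate"
    region         = "us-east-1"            # placeholder region
    encrypt        = true
    dynamodb_table = "terraform-locks"      # placeholder lock table
  }
}
```

The generate block writes a matching backend.tf into each module so plain Terraform commands see the same backend Terragrunt manages.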
Related MCP Servers
bear-notes
MCP server for Bear app - Full Read + Write AI-powered note management with Claude Desktop
taskflow
A task management Model Context Protocol (MCP) server that helps AI assistants break down user requests into manageable tasks with subtasks, dependencies, and notes. Enforces a structured workflow with user approval steps.
mcpdog
🐕 Universal MCP Server Manager - Configure once, manage multiple MCP servers through a single interface. Perfect for Claude Desktop, Claude Code, Cursor, Gemini CLI & AI assistants. Web dashboard, auto-detection, unified proxy layer.
MCP-Client-Project-using-NodeJS
A minimal Model Context Protocol (MCP) implementation built with Node.js and TypeScript. This project demonstrates client–server communication over stdio, structured message handling, and local data access, developed with VS Code and GitHub Copilot to explore modern AI tool integration workflows.
ConferenceHaven-Community
Community feedback, documentation, and discussions for ConferenceHaven MCP - Your AI conference assistant
warp-sql
🗄️ Model Context Protocol (MCP) server for SQL Server integration with Warp terminal. Execute queries, explore schemas, export data, and analyze performance with natural language commands.