otel-instrumentation
Python MCP server providing AI coding assistants with OpenTelemetry documentation, examples, and instrumentation guidance
claude mcp add --transport stdio liatrio-labs-otel-instrumentation-mcp uv run otel-instrumentation-mcp \
  --env GITHUB_TOKEN="your_github_token" \
  --env GITHUB_APP_ID="your_github_app_id" \
  --env GITHUB_INSTALLATION_ID="your_github_installation_id" \
  --env GITHUB_APP_PRIVATE_KEY_PATH="path/to/private-key.pem" \
  --env OTEL_EXPORTER_OTLP_ENDPOINT="optional_endpoint"
How to use
This MCP server exposes OpenTelemetry repositories, documentation, examples, and semantic conventions as tools that help AI assistants generate, validate, and score instrumentation in code. It supports multiple transports (stdio, HTTP, SSE) and runs in local development or production-like environments, using GitHub authentication to access repository and API data. Connect with your preferred MCP client, and the assistant can browse these references, fetch examples, and apply best-practice instrumentation patterns.
To use the server, run it with uv as described in the Quick Start. Once it is running, connect an MCP-enabled assistant (e.g., Claude Desktop, Windsurf, Cursor, or a VS Code extension) by adding a configuration entry that points to the uv command and the server identifier. The configuration typically includes the command, its arguments, and an environment block with your GitHub token or GitHub App credentials; adapt the working directory and environment variables to your client, following the provided usage examples for Claude Desktop, VS Code, Windsurf, or Cursor. For remote access, the server also supports the HTTP and SSE transports, so you can manage instrumentation tasks from different environments while keeping authentication secure via GitHub tokens or apps.
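As an illustration, a Claude Desktop entry in claude_desktop_config.json might look like the sketch below. The server name, project path, and token value are placeholders, and uv's --directory flag is shown as one way to set the working directory; adjust all of these to your setup.

```json
{
  "mcpServers": {
    "otel-instrumentation-mcp": {
      "command": "uv",
      "args": ["run", "--directory", "/path/to/otel-instrumentation-mcp", "otel-instrumentation-mcp"],
      "env": {
        "GITHUB_TOKEN": "your_github_pat"
      }
    }
  }
}
```

If you use GitHub App authentication instead, replace the GITHUB_TOKEN entry with the GITHUB_APP_ID, GITHUB_INSTALLATION_ID, and GITHUB_APP_PRIVATE_KEY_PATH variables from the installation steps.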
How to install
Prerequisites:
- Python 3.13+
- uv package manager (https://docs.astral.sh/uv/)
- GitHub authentication method (Personal Access Token or GitHub App credentials)
Installation steps:
- Clone the repository:
git clone https://github.com/liatrio/otel-instrumentation-mcp.git
cd otel-instrumentation-mcp
- Install dependencies (using uv will manage runtime in your environment):
uv sync
- Set up GitHub authentication (choose one):
Option A: Personal Access Token
export GITHUB_TOKEN="your_github_pat"
Option B: GitHub App (recommended for production)
export GITHUB_APP_ID="your_app_id"
export GITHUB_INSTALLATION_ID="your_installation_id"
export GITHUB_APP_PRIVATE_KEY_PATH="/path/to/private-key.pem"
- Run the MCP server:
uv run otel-instrumentation-mcp
Note: You may also configure additional environment variables as described in the repository's .env.example to customize behavior (e.g., OTEL_EXPORTER_OTLP_ENDPOINT).
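For example, OTEL_EXPORTER_OTLP_ENDPOINT can be exported before starting the server. The address below is the conventional local OTLP/gRPC collector endpoint and is only an assumption, not a value the server requires:

```shell
# Optional: export the server's own telemetry to an OTLP collector.
# The endpoint below is an assumed local collector address; adjust as needed.
export OTEL_EXPORTER_OTLP_ENDPOINT="http://localhost:4317"
echo "OTLP endpoint: $OTEL_EXPORTER_OTLP_ENDPOINT"
```

With the variable set, start the server as usual with uv run otel-instrumentation-mcp.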
Additional notes
Environment variables: GITHUB_TOKEN or GitHub App credentials must be set to access GitHub resources. If running remotely, ensure your transport (HTTP or SSE) is properly exposed and that authentication is configured for the remote client. The server includes production-ready Kubernetes manifests and health checks, but for local development you can simply run it via uv. If you encounter token scope issues, verify that the token has access to the necessary repositories and GraphQL endpoints. For debugging, inspect the logs for authentication errors, transport initialization, and MCP server registration. Remember to reload your MCP client configuration after any changes to the server setup.
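As a quick sanity check before digging into logs, a small shell sketch (using the variable names from the installation steps above) can report which authentication method is actually visible in the environment:

```shell
# Sketch: report which GitHub auth method the current environment provides.
if [ -n "$GITHUB_TOKEN" ]; then
  AUTH_MODE="pat"
elif [ -n "$GITHUB_APP_ID" ] && [ -n "$GITHUB_INSTALLATION_ID" ] \
  && [ -f "$GITHUB_APP_PRIVATE_KEY_PATH" ]; then
  AUTH_MODE="app"
else
  AUTH_MODE="none"
fi
echo "GitHub auth mode: $AUTH_MODE"
```

Run this in the same shell (or container) that launches the server; "none" usually means the variables were exported in a different session than the one the MCP client spawns.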
Related MCP Servers
claude-talk-to-figma
A Model Context Protocol (MCP) that allows Claude Desktop and other AI tools (Claude Code, Cursor, Antigravity, etc.) to read, analyze, and modify Figma designs
ollama
An MCP Server for Ollama
statelessagent
Your AI forgets everything between sessions. SAME fixes that. Local-first, no API keys, single binary.
mcproc
A Model Context Protocol (MCP) server for comfortable background process management on AI agents.
voice-status-report
A Model Context Protocol (MCP) server that provides voice status updates using OpenAI's text-to-speech API.
mcp-tidy
CLI tool to visualize and manage MCP server configurations in Claude Code. List servers, analyze usage statistics, and clean up unused servers