langfuse
A Model Context Protocol (MCP) server for Langfuse that lets AI agents query Langfuse trace data for enhanced debugging and observability.
claude mcp add --transport stdio avivsinai-langfuse-mcp uvx --python 3.11 langfuse-mcp \
  --env LANGFUSE_HOST="https://cloud.langfuse.com" \
  --env LANGFUSE_PUBLIC_KEY="pk-..." \
  --env LANGFUSE_SECRET_KEY="sk-..."
How to use
Langfuse MCP Server provides a comprehensive observability toolkit for Langfuse, covering traces, observations, sessions, exceptions, and prompts, along with dataset management and schema access. Use it to query traces, debug errors, analyze sessions, manage prompts, and inspect datasets through a unified MCP interface. The server runs via the uvx runtime, which launches the Python-based langfuse-mcp package and exposes a standardized set of tools that you can invoke from your CLI clients. You can selectively load tool groups to minimize token overhead, or run in read-only mode for safer access.
To start using it, configure your credentials and host, then run the MCP server using the uvx invocation described in the Quick Start. You’ll gain access to a suite of tools like fetch_traces, fetch_observations, fetch_sessions, get_session_details, find_exceptions, list_prompts, create_text_prompt, update_prompt_labels, and dataset operations such as list_datasets and create_dataset. The server is designed to integrate with Claude Code or Codex CLI workflows, enabling you to manage Langfuse observability directly from your coding environment or CI pipelines.
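If you want to browse the exposed tools before wiring the server into an agent, the official MCP Inspector can launch it interactively. This invocation is a sketch: it assumes Node.js is available and that your Langfuse credentials are already exported in the environment.

```shell
# Launch langfuse-mcp under the MCP Inspector for interactive tool exploration.
# Assumes LANGFUSE_HOST, LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY are already set.
npx @modelcontextprotocol/inspector uvx --python 3.11 langfuse-mcp
```

The Inspector opens a local web UI where you can list the server's tools and try calls like fetch_traces before handing the server to an agent.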
How to install
Prerequisites:
- Python 3.10–3.13 installed
- uv (uvx) installed or available in your environment
- Langfuse API keys (LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY) and the host URL (LANGFUSE_HOST) for Langfuse Cloud or your self-hosted instance
Option A: Install from PyPI (recommended)
- Install the MCP package: pip install langfuse-mcp
- Verify installation: langfuse-mcp --help
Option B: Install from source (development or custom setup)
- Clone the repository: git clone https://github.com/avivsinai/langfuse-mcp.git
- Navigate to the project directory: cd langfuse-mcp
- Create a virtual environment and install in editable mode with development dependencies:
  uv venv --python 3.11 .venv && source .venv/bin/activate
  uv pip install -e ".[dev]"
- Run the MCP server using uvx (see Quick Start for exact command)
Starting the server with uvx (example): uvx --python 3.11 langfuse-mcp
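Putting it together, a typical launch exports the credentials first and then starts the server over stdio (the keys below are placeholders; substitute your own):

```shell
# Configure credentials (placeholders) and start the server over stdio.
export LANGFUSE_HOST="https://cloud.langfuse.com"
export LANGFUSE_PUBLIC_KEY="pk-..."   # placeholder - use your real public key
export LANGFUSE_SECRET_KEY="sk-..."   # placeholder - use your real secret key
uvx --python 3.11 langfuse-mcp
```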
Additional notes
- Use environment variables to securely provide credentials (LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY) and host URL (LANGFUSE_HOST).
- To limit loaded capabilities, use selective tool loading, e.g., langfuse-mcp --tools traces,prompts.
- For read-only access, enable the read-only flag: langfuse-mcp --read-only or set LANGFUSE_MCP_READ_ONLY=true in the environment.
- When configuring other clients such as Cursor, or running under Docker, adapt the example configurations in the README, supplying your own host and credentials.
- Docker usage is available: docker run --rm -i -e LANGFUSE_PUBLIC_KEY=pk-... -e LANGFUSE_SECRET_KEY=sk-... -e LANGFUSE_HOST=https://cloud.langfuse.com ghcr.io/avivsinai/langfuse-mcp:latest
- The MCP exposes a rich set of tools across traces, observations, sessions, exceptions, prompts, datasets, and schema. Consult the README sections for exact tool names and usage patterns.
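The notes above combine naturally: a minimal, safer launch might load only a couple of tool groups and force read-only mode, using the flags documented above.

```shell
# Read-only server exposing only the trace and prompt tool groups.
LANGFUSE_MCP_READ_ONLY=true uvx --python 3.11 langfuse-mcp --tools traces,prompts
```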
Related MCP Servers
everything-claude-code
The agent harness performance optimization system. Skills, instincts, memory, security, and research-first development for Claude Code, Codex, Cowork, and beyond.
deepcontext
DeepContext is an MCP server that adds symbol-aware semantic search to Claude Code, Codex CLI, and other agents for faster, smarter context on large codebases.
sandboxed.sh
Self-hosted orchestrator for AI autonomous agents. Run Claude Code & Open Code in isolated linux workspaces. Manage your skills, configs and encrypted secrets with a git repo.
ask-user-questions
Better 'AskUserQuestion' - A lightweight MCP server/OpenCode plugin/Agent Skills + CLI tool that lets your LLMs ask questions to you. Be the human in the human-in-the-loop!
floop
Spreading activation memory for AI coding agents - corrections in, context-aware behaviors out.
HydraMCP
Connect agents to agents. MCP server for querying any LLM through your existing subscriptions: compare, vote, and synthesize across GPT, Gemini, Claude, and local models from one terminal.