
mcp-server-conceal

Privacy-focused MCP proxy that pseudo-anonymizes PII in real time before data reaches external AI providers, preserving semantic relationships for accurate analysis.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio gbrigandi-mcp-server-conceal mcp-server-conceal --target-command python3 --target-args database-server.py --host localhost --config /path/to/mcp-server-conceal.toml \
  --env DATABASE_URL="postgresql://localhost/mydb"

How to use

MCP Conceal is an MCP proxy that sits between your MCP client and your MCP server, pseudo-anonymizing PII before it reaches external AI providers. It detects sensitive data using a configurable mix of regex patterns and optional AI-based analysis (via an Ollama LLM), then replaces real identifiers with consistent fake data while preserving structure and relationships. The proxy keeps mappings in a local database, so the same real data maps to the same fake data across requests.

To use it, run the conceal proxy in front of your MCP server by pointing the proxy at your target server process and configuring the target command and arguments. You can customize the detection approach (regex, LLM, or a hybrid), tailor the fake-data generation (locale, seed, and consistency), and supply prompts to guide LLM-based detection. In practice, you start mcp-server-conceal with a target command that runs your MCP server logic (for example, a Python script) and pass a config file that defines how detection works and how the LLM is invoked.
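The config file is where detection and fake-data generation are tuned. Below is an illustrative sketch of what such a TOML might contain, built only from the settings named on this page (detection mode, Ollama endpoint, locale/seed, database_path, llm_cache); the section and key names are assumptions, so check the project's documented example config for the exact schema:

```toml
# Illustrative only -- consult the project's example config for exact keys.
[detection]
mode = "regex_llm"   # regex, llm, or the hybrid regex_llm

[llm]
endpoint = "http://localhost:11434"   # local Ollama instance (assumed default port)
model = "llama3"                      # assumed model name

[faker]
locale = "en_US"
seed = 42            # deterministic fake data across runs

[mapping]
database_path = "/var/lib/mcp-server-conceal/mappings.db"   # use absolute paths in production
llm_cache = "/var/lib/mcp-server-conceal/llm-cache.db"
```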

How to install

Prerequisites:

  • Rust toolchain (cargo), if building from source
  • Optionally, a running Ollama instance, if you plan to use LLM-based detection

Install from source (recommended if you need to customize):

  1. git clone https://github.com/gbrigandi/mcp-server-conceal
  2. cd mcp-server-conceal
  3. cargo build --release

Binary location after build:

  • target/release/mcp-server-conceal

Alternative: Download pre-built binaries from the Releases page on GitHub, make them executable, then move them to a directory in your PATH. For example:

  • chmod +x mcp-server-conceal-linux-amd64
  • sudo mv mcp-server-conceal-linux-amd64 /usr/local/bin/mcp-server-conceal

Usage example to run the proxy:

  • mcp-server-conceal \
      --target-command python3 \
      --target-args "my-mcp-server.py" \
      --config mcp-server-conceal.toml
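Conceptually, the consistent pseudo-anonymization described above works like the following minimal Python sketch. This is illustrative, not the tool's actual code: it only handles emails via regex, the fake-value scheme is made up, and the real proxy persists its mapping in a local database rather than in memory.

```python
import re

# Matches email addresses (a deliberately simple pattern for illustration).
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

class Pseudonymizer:
    def __init__(self):
        self._mapping = {}   # real -> fake; the real tool persists this mapping
        self._counter = 0

    def _fake_email(self, real):
        # Reuse the existing fake value so the same real identifier
        # always maps to the same pseudonym across requests.
        if real not in self._mapping:
            self._counter += 1
            self._mapping[real] = f"user{self._counter}@example.com"
        return self._mapping[real]

    def conceal(self, text):
        return EMAIL_RE.sub(lambda m: self._fake_email(m.group(0)), text)

p = Pseudonymizer()
a = p.conceal("Contact alice@corp.com and bob@corp.com")
b = p.conceal("Reply to alice@corp.com")
# alice@corp.com receives the same pseudonym in both calls,
# so relationships between records are preserved for analysis.
```

Because the mapping is stable, an external AI provider can still reason about "the same user appearing twice" without ever seeing the real identifier.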

Additional notes

Tips and common considerations:

  • Keep the mappings database secure, with restrictive file permissions, since it stores the real-to-fake data mappings.
  • If using LLM-based detection, ensure Ollama (or your chosen local LLM runner) is reachable and properly configured in the endpoint settings.
  • The detection mode and thresholds affect latency and accuracy; start with regex_llm for a balance, then tune as needed.
  • Use absolute paths for database_path and llm_cache in production to avoid path issues.
  • When integrating with your MCP server, you can adjust target-command/target-args to point to your server process and pass environment variables as needed (e.g., database URLs, API keys).
  • Enable verbose/debug logs during troubleshooting by setting RUST_LOG=debug.
  • Regularly back up the mappings database to prevent data loss during migrations or crashes.
