SchemaPin
The SchemaPin protocol for cryptographically signing and verifying AI agent tool schemas to prevent supply-chain attacks.
claude mcp add --transport stdio thirdkeyai-schemapin -- docker run -i schemapin/server:latest
How to use
SchemaPin is a cryptographic signer and verifier for AI tool schemas and skill folders. It provides ECDSA P-256 signatures, RFC 8615-style public key discovery via a .well-known/schemapin.json endpoint, and Trust-On-First-Use (TOFU) pinning to guard against key substitution. The MCP server exposes endpoints that other agents can query to fetch trusted keys, verify signatures, and validate schema integrity in your environment. With the server running, agents can rely on the hosted trust bundle and the TOFU pins to verify tool schemas and related manifests online or offline.
To use the server alongside the client libraries, point your agent or orchestrator at the server's .well-known discovery path and its verification endpoints. The server is designed to work with the Python, JavaScript, Go, and Rust SDKs described in the project: signers publish their public keys and revocation documents, and verifiers fetch and check signatures against the published trust bundles. A typical flow requests the signed schema's metadata, obtains the signer's public key, and validates the signature against the canonical schema data before any tool or skill is executed or deployed.
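The Trust-On-First-Use pinning described above can be sketched in a few lines. The snippet below is a minimal, stdlib-only illustration of the pinning logic, not the SchemaPin SDK's actual API; the in-memory store and fingerprint format are assumptions, and a real verifier also checks the ECDSA P-256 signature itself, not just the pin.

```python
import hashlib


def key_fingerprint(public_key_pem: str) -> str:
    """SHA-256 fingerprint of a PEM-encoded public key (illustrative format)."""
    return hashlib.sha256(public_key_pem.encode("utf-8")).hexdigest()


class TofuPinStore:
    """Pins a domain's public key on first use; rejects silent key swaps."""

    def __init__(self):
        self.pins = {}  # domain -> fingerprint; a real store would persist this

    def check(self, domain: str, public_key_pem: str) -> bool:
        fp = key_fingerprint(public_key_pem)
        if domain not in self.pins:
            self.pins[domain] = fp  # first use: trust and pin
            return True
        return self.pins[domain] == fp  # later uses: must match the pin


store = TofuPinStore()
print(store.check("tools.example.com", "KEY-A"))  # True: first use, key pinned
print(store.check("tools.example.com", "KEY-A"))  # True: same key as pinned
print(store.check("tools.example.com", "KEY-B"))  # False: key substitution detected
```

On a pin mismatch, a real verifier would refuse to load the tool schema and fall back to the server's key-rotation and revocation documents rather than silently accepting the new key.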
How to install
Prerequisites:
- Docker or a container runtime installed on the host
- Internet access to pull the server image
Installation steps (Docker):
- Pull and run the SchemaPin server image:
  docker pull schemapin/server:latest
  docker run -d --name schemapin-server -p 8080:80 schemapin/server:latest
- Verify the container is running:
  docker ps
- Configure your environment to trust the server's endpoints (see environment notes).
Alternative (if you have a local build or a different deployment method):
- Build or deploy according to your environment's container orchestration or language-specific deployment guide, ensuring the server exposes the .well-known/schemapin.json discovery endpoint and supporting verification services.
Prerequisites for development builds (optional):
- Docker and Docker Compose (if using compose)
- Access to the schemapin server repository for any local builds or custom configurations
Additional notes
Environment and configuration tips:
- The server relies on .well-known/schemapin.json for public-key discovery; ensure the path is accessible and correctly served over HTTPS in production.
- TOFU pinning requires that clients update their trust anchors when new keys are published or revoked; consider setting a short rotation window during initial deployment.
- If you encounter signature verification failures, verify that the signed data, the signature, and the signer's public key match the canonical encoding and digest formats used by the server (ECDSA P-256 with SHA-256 as described in the docs).
- For offline or air-gapped deployments, deploy the appropriate trust bundle or offline verification package that SchemaPin supports, so verification can occur without external network access.
- If using the JavaScript/Node.js SDK or another language SDK, align your tool-signing workflow with the server's discovery and revocation docs to ensure seamless verification.
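The troubleshooting note about canonical encoding can be made concrete. The sketch below shows one common JSON canonicalization approach (sorted keys, compact separators) and the SHA-256 digest taken over it; the exact canonical form SchemaPin uses should be taken from the project docs, so treat this as an assumption-laden illustration, not the protocol's definition.

```python
import hashlib
import json


def canonical_digest(schema: dict) -> str:
    """SHA-256 over a deterministic JSON encoding (assumed canonical form:
    sorted keys, compact separators, UTF-8). Signer and verifier must agree
    on these bytes exactly, or ECDSA signature verification will fail."""
    canonical = json.dumps(
        schema, sort_keys=True, separators=(",", ":"), ensure_ascii=False
    )
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


# Key order in the source dict must not change the digest:
a = canonical_digest({"name": "search", "version": "1.0"})
b = canonical_digest({"version": "1.0", "name": "search"})
print(a == b)  # True: both orderings canonicalize to the same bytes
```

If a signature fails to verify, recomputing this digest on both sides is a quick way to tell whether the mismatch is in the canonical encoding rather than in the key material.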
Related MCP Servers
bytebot
Bytebot is a self-hosted AI desktop agent that automates computer tasks through natural language commands, operating within a containerized Linux desktop environment.
mcp-agent
Build effective agents using Model Context Protocol and simple workflow patterns
klavis
Klavis AI (YC X25): MCP integration platforms that let AI agents use tools reliably at any scale
dexto
A coding agent and general agent harness for building and orchestrating agentic applications.
aser
Aser is a lightweight, self-assembling AI agent framework.
mcp-reticle
Reticle intercepts, visualizes, and profiles JSON-RPC traffic between your LLM and MCP servers in real-time, with zero latency overhead. Stop debugging blind. Start seeing everything.