uspto_fpd_mcp
High-performance MCP server for USPTO Final Petition Decisions API with context reduction and cross-MCP integration
claude mcp add --transport stdio john-walkoe-uspto_fpd_mcp uv --directory C:/Users/YOUR_USERNAME/uspto_fpd_mcp run fpd-mcp \
  --env USPTO_API_KEY="your_actual_USPTO_api_key_here" \
  --env FPD_PROXY_PORT="8081" \
  --env MISTRAL_API_KEY="your_mistral_api_key_here_OPTIONAL" \
  --env CENTRALIZED_PROXY_PORT="none"
How to use
This MCP server provides access to the USPTO Final Petition Decisions data with emphasis on context reduction to minimize prompt payloads while preserving essential petition details. It supports token-saving workflows, secure API key handling, and cross-MCP integration to enable holistic patent lifecycle analysis. Use the included petition-focused tools to run targeted searches (by art unit, application, or petition type), perform structured extractions, and access documents via a unified proxy when configured. The platform is designed for high-performance retrieval with progressive disclosure: discover a concise set of petitions first, then drill down into transformation-ready data as needed. When combined with other MCPs like PTAB or Patent File Wrapper, you can correlate petition activity across the lifecycle and leverage cross-MCP downloads for complete context.
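The progressive-disclosure workflow described above can be illustrated with a short Python sketch. The function names (`search_petitions`, `get_petition_detail`) are hypothetical stand-ins for the server's petition tools, not its actual API:

```python
# Hypothetical sketch of progressive disclosure: fetch concise petition
# summaries first, then drill into full detail only where needed.
# Function names are illustrative, not the server's real tool names.

def search_petitions(art_unit: str, limit: int = 10) -> list[dict]:
    """Stage 1: return lightweight summaries to keep prompt payloads small."""
    # In practice this would invoke the MCP search tool.
    return [{"id": f"{art_unit}-{i}", "type": "revival"} for i in range(limit)]

def get_petition_detail(petition_id: str) -> dict:
    """Stage 2: fetch full, transformation-ready data for one petition."""
    # In practice this would invoke the MCP detail/extraction tool.
    return {"id": petition_id, "decision": "granted", "documents": []}

# Discover a concise set first, then drill down selectively.
summaries = search_petitions("3649", limit=3)
details = [get_petition_detail(s["id"]) for s in summaries if s["type"] == "revival"]
```

The two-stage pattern is what keeps token usage low: full records are only pulled for petitions that survive the first-pass filter.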
How to install
Prerequisites:
- Python 3.11+ and pip
- Git
- Access to uv (Astral UV) for Python-based MCP execution
Step 1: Prepare your environment
- Ensure Python is installed and available in your PATH
- Install uv if not already present (example shown using pipx):
pipx install uv
Step 2: Obtain API keys
- Acquire a USPTO API key and, optionally, a Mistral API key. Keep both secure and never commit them to version control.
Step 3: Clone the repository
git clone https://github.com/john-walkoe/uspto_fpd_mcp.git
cd uspto_fpd_mcp
Step 4: Configure the MCP server
- Create or edit your MCP configuration to point uv at the fpd-mcp entry point. An example configuration is provided in the mcp_config section below; save it as a JSON file if desired (e.g., mcp_config.json).
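A minimal configuration mirroring the `claude mcp add` command above might look like the following. The exact schema depends on your MCP client, and the path and key placeholders are values you should adapt:

```json
{
  "mcpServers": {
    "john-walkoe-uspto_fpd_mcp": {
      "command": "uv",
      "args": ["--directory", "C:/Users/YOUR_USERNAME/uspto_fpd_mcp", "run", "fpd-mcp"],
      "env": {
        "USPTO_API_KEY": "your_actual_USPTO_api_key_here",
        "FPD_PROXY_PORT": "8081",
        "MISTRAL_API_KEY": "your_mistral_api_key_here_OPTIONAL",
        "CENTRALIZED_PROXY_PORT": "none"
      }
    }
  }
}
```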
Step 5: Run the MCP server
- Using uv as shown in the configuration, start the server. Example command (adjust paths as needed):
uv --directory C:/Users/YOUR_USERNAME/uspto_fpd_mcp run fpd-mcp
Step 6: Verify operation
- Check the logs for startup messages, confirm that the API keys are accepted, and, if you configured a proxy, verify that the local proxy port (FPD_PROXY_PORT) is accessible.
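A quick way to confirm the proxy port is reachable is a simple TCP connect check, as in this Python sketch (the host and port are whatever you configured via FPD_PROXY_PORT):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check the standalone FPD proxy on its default port.
if port_open("127.0.0.1", 8081):
    print("FPD proxy is accepting connections on 8081")
else:
    print("Nothing listening on 8081; is the server running?")
```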
Additional notes
Tips and common considerations:
- If using a centralized proxy (CENTRALIZED_PROXY_PORT not set to none), ensure the PFW MCP is installed and configured to expose the proxy on port 8080, as described in the integration docs.
- The FPD_PROXY_PORT defaults to 8081 when running standalone; it will switch to the centralized proxy port when cross-MCP integration is detected.
- For security, prefer the secure key storage flow described in the documentation. Avoid placing API keys in plain text within configuration files.
- The system supports field customization via YAML-based configurations; consult the Field Customization guide for tailoring the data you retrieve and store.
- If you encounter issues with UV detection or installation, refer to the INSTALL.md guide in the repository for automated setup scripts and troubleshooting steps.
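The port-selection behavior described in the tips above can be sketched as follows. This is a simplification of the assumed fallback logic, not the server's actual implementation:

```python
import os

def resolve_proxy_port() -> int:
    """Use the centralized proxy port when cross-MCP integration is configured,
    otherwise fall back to the standalone FPD default (8081)."""
    centralized = os.environ.get("CENTRALIZED_PROXY_PORT", "none")
    if centralized.lower() != "none":
        return int(centralized)  # e.g. 8080 when the PFW MCP hosts the proxy
    return int(os.environ.get("FPD_PROXY_PORT", "8081"))

# Standalone: CENTRALIZED_PROXY_PORT is "none", so FPD_PROXY_PORT wins.
os.environ["CENTRALIZED_PROXY_PORT"] = "none"
os.environ["FPD_PROXY_PORT"] = "8081"
print("standalone port:", resolve_proxy_port())

# Cross-MCP: the centralized proxy port takes precedence.
os.environ["CENTRALIZED_PROXY_PORT"] = "8080"
print("centralized port:", resolve_proxy_port())
```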