
parsey mcparseface

[Archive] A simple Python Flask app to run Parsey McParseface.

Installation
Run this command in your terminal to add the MCP server to Claude Code:

claude mcp add --transport stdio joshdata-parsey-mcparseface-server python server.py

How to use

The Parsey McParseface server is a lightweight Python Flask application that exposes an HTTP API wrapper around the Parsey McParseface model. It is intended to run as a local service: you send it text over HTTP and receive syntactic parsing results back. Start the server with the command from the installation section, then submit text to the API endpoints to receive parse trees and related metadata. Keep in mind that this server targets a customized Parsey McParseface branch and may not be compatible with the standard TensorFlow models package. Once running, you can integrate it into your tooling to obtain parsing results for natural-language input programmatically.
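The README does not pin down the exact endpoint or payload shape, so as a hedged illustration, here is a minimal standard-library client sketch. It assumes a `/parse` route on `localhost:5000` that accepts a JSON body of the form `{"text": ...}`; both assumptions should be checked against what server.py actually exposes.

```python
import json
from urllib import request

# Assumed endpoint -- check server.py for the real route and port.
PARSE_URL = "http://localhost:5000/parse"

def build_payload(text):
    """Encode the text as the JSON body the server is assumed to accept."""
    return json.dumps({"text": text}).encode("utf-8")

def parse_text(text, url=PARSE_URL):
    """POST text to the running server and return the decoded JSON reply.

    The request/response shape here is an assumption; adapt it to the
    actual API implemented in server.py.
    """
    req = request.Request(
        url,
        data=build_payload(text),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

With the server running, `parse_text("The quick brown fox jumps.")` would return whatever parse structure the server emits, e.g. a parse tree or CoNLL-style token rows.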

How to install

Prerequisites:

  • Python 3.8+ (preferably a virtual environment)
  • Git
  • Basic build tools if compiling dependencies (as needed by Flask and any ML components)
  1. Set up a virtual environment (optional but recommended):

     python3 -m venv venv
     source venv/bin/activate

  2. Install Flask and any required dependencies (adjust to your environment):

     pip install flask

    If there is a requirements.txt in the project, install it instead:

    pip install -r requirements.txt

  3. Obtain the Parsey McParseface code and the modified TensorFlow models branch as described in the project README. This typically involves:

    • Following the TensorFlow models installation steps for Ubuntu 14.04 x64 as referenced
    • Cloning the models repository from the specified GitHub location and checking out the online-parsing branch
  4. Place the server script (as referenced in the README) in your working directory. The README indicates a Flask app that serves Parsey McParseface over HTTP.

  5. Run the server using the command defined in the MCP config (example):

     python server.py

  6. Optional: If you keep environment variables or additional configuration, set them in the environment before starting the server or configure them inside the script as needed.
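To make the wrapper's shape concrete, here is a hedged sketch of what a server script like the one described might look like. The route name, payload shape, and the `run_parser` stub are all assumptions: the real server.py invokes the Parsey McParseface model from the online-parsing branch, not the echo stub below.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_parser(text):
    # Placeholder: the actual server.py calls into the Parsey
    # McParseface model here. This stub just tokenizes on whitespace
    # so the HTTP plumbing can be exercised on its own.
    return {"tokens": text.split()}

@app.route("/parse", methods=["POST"])  # route name is an assumption
def parse():
    data = request.get_json(force=True) or {}
    return jsonify(run_parser(data.get("text", "")))

# To serve locally (roughly what `python server.py` would do):
# app.run(host="127.0.0.1", port=5000)
```

The value of the wrapper pattern is that callers only need HTTP and JSON; the heavyweight model dependencies stay isolated inside this one process.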

Additional notes

  • This server is built to work with a modified Parsey McParseface setup from a specific branch of the TensorFlow models repository. It is not guaranteed to work with the standard TensorFlow models package.
  • If you encounter issues, ensure you are on the correct branch of the models repository (online-parsing) and that any required dependencies for that branch are installed.
  • Since this is a local API wrapper, consider securing endpoints if exposing the service beyond a trusted network. Use standard Flask/WSGI deployment practices for production.
  • Environment variables (if used) may include settings for model paths, port, and host; define them as needed in your environment.
  • If you plan to deploy, consider containerizing the server for reproducibility and easier dependency management.
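As a sketch of the environment-variable approach mentioned above, the snippet below reads hypothetical settings with fallback defaults. The variable names are assumptions for illustration, not names defined by the project.

```python
import os

# Hypothetical variable names -- the actual server may use different
# ones (or none at all); check server.py before relying on these.
MODEL_PATH = os.environ.get("PARSEY_MODEL_PATH", "models/parsey_mcparseface")
HOST = os.environ.get("PARSEY_HOST", "127.0.0.1")
PORT = int(os.environ.get("PARSEY_PORT", "5000"))
```

You would then export the overrides before launching, e.g. `PARSEY_PORT=8080 python server.py`.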
