parsey-mcparseface
[Archive] A simple Python Flask app to run Parsey McParseface.
claude mcp add --transport stdio joshdata-parsey-mcparseface-server python server.py
How to use
Parsey McParseface server is a lightweight Python Flask application that exposes an HTTP API wrapper around the Parsey McParseface model. It is intended to run as a local service: you send it text over HTTP and receive syntactic parsing results back. Start the server using the command given in the installation section, then use the API endpoints to submit text and receive parse trees and related metadata. Keep in mind that this server targets a customized Parsey McParseface branch and may not be compatible with the standard TensorFlow models package. Once running, you can integrate it into your tooling to obtain parsing results for natural language input programmatically.
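A client for the service can be a few lines of standard-library Python. Note that the `/parse` endpoint name and the `{"text": ...}` JSON shape below are assumptions for illustration; check `server.py` for the actual routes and payload format.

```python
import json
import urllib.request

# Hypothetical endpoint; the real route is defined in server.py.
PARSE_URL = "http://127.0.0.1:5000/parse"

def build_parse_request(text, url=PARSE_URL):
    """Build a POST request carrying the sentence to parse as JSON."""
    payload = json.dumps({"text": text}).encode("utf-8")
    return urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )

def parse_text(text, url=PARSE_URL):
    """Send the sentence to the local server and return the decoded result."""
    with urllib.request.urlopen(build_parse_request(text, url)) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

With the server running locally, `parse_text("The quick brown fox jumps.")` would return whatever parse structure the server emits for that sentence.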
How to install
Prerequisites:
- Python 3.8+ (preferably in a virtual environment)
- Git
- Basic build tools, if any dependencies need compiling (as required by Flask and the ML components)

Steps:
- Set up a virtual environment (optional but recommended):
  python3 -m venv venv
  source venv/bin/activate
- Install Flask and any required dependencies (adjust to your environment):
  pip install flask
  If the project includes a requirements.txt, install from it instead:
  pip install -r requirements.txt
- Obtain the Parsey McParseface code and the modified TensorFlow models branch described in the README. This typically involves:
  - Following the referenced TensorFlow models installation steps for Ubuntu 14.04 x64
  - Cloning the models repository from the specified GitHub location and checking out the online-parsing branch
- Place the server script (as referenced in the README) in your working directory. The README describes a Flask app that serves Parsey McParseface over HTTP.
- Run the server using the command defined in the MCP config, for example:
  python server.py
- Optional: if you use environment variables or additional configuration, set them in the environment before starting the server, or configure them inside the script as needed.
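The overall shape of the Flask wrapper looks roughly like the sketch below. This is not the project's actual `server.py`: the `/parse` route and response fields are assumptions, and `run_parser` is a stub standing in for the real call into the online-parsing branch of the models code.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def run_parser(text):
    """Stub for the actual Parsey McParseface invocation.

    The real server feeds text into the modified TensorFlow models
    branch; here we return a dummy structure so the HTTP shape is
    visible without the model installed.
    """
    return {"input": text, "tokens": text.split(), "tree": None}

@app.route("/parse", methods=["POST"])  # route name is an assumption
def parse():
    data = request.get_json(force=True)
    if not data or "text" not in data:
        return jsonify({"error": "missing 'text' field"}), 400
    return jsonify(run_parser(data["text"]))

if __name__ == "__main__":
    app.run(host="127.0.0.1", port=5000)
```

Keeping the model invocation behind a single function like `run_parser` makes it easy to swap the stub for the real subprocess or in-process call.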
Additional notes
- This server is built to work with a modified Parsey McParseface setup from a specific branch of the TensorFlow models repository. It is not guaranteed to work with the standard TensorFlow models package.
- If you encounter issues, ensure you are on the correct branch of the models repository (online-parsing) and that any required dependencies for that branch are installed.
- Since this is a local API wrapper, consider securing endpoints if exposing the service beyond a trusted network. Use standard Flask/WSGI deployment practices for production.
- Environment variables (if used) may include settings for model paths, port, and host; define them as needed in your environment.
- If you plan to deploy, consider containerizing the server for reproducibility and easier dependency management.
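If the script reads its configuration from the environment, the pattern is typically a handful of `os.environ` lookups with defaults. The variable names below are hypothetical; substitute whatever names `server.py` actually uses.

```python
import os

# Hypothetical variable names -- the actual server.py may use different ones.
MODEL_PATH = os.environ.get("PARSEY_MODEL_PATH", "models/parsey_mcparseface")
HOST = os.environ.get("PARSEY_HOST", "127.0.0.1")
PORT = int(os.environ.get("PARSEY_PORT", "5000"))
```

You would then export these (e.g. `export PARSEY_PORT=8080`) before launching `python server.py`.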
Related MCP Servers
mcp-vegalite
MCP server from isaacwasserman/mcp-vegalite-server
github-chat
A Model Context Protocol (MCP) for analyzing and querying GitHub repositories using the GitHub Chat API.
nautex
MCP server for guiding Coding Agents via end-to-end requirements to implementation plan pipeline
pagerduty
PagerDuty's official local MCP (Model Context Protocol) server which provides tools to interact with your PagerDuty account directly from your MCP-enabled client.
futu-stock
MCP server for Futu NiuNiu stock
mcp-boilerplate
Boilerplate using one of the 'better' ways to build MCP Servers. Written using FastMCP