hayhooks
Easily deploy Haystack pipelines as REST APIs and MCP Tools.
claude mcp add --transport stdio deepset-ai-hayhooks --env HOST="0.0.0.0" --env PORT="1416" -- python -m hayhooks run
How to use
Hayhooks is a Python-based MCP server that deploys Haystack pipelines and agents as REST APIs and exposes them as MCP Tools for use in AI development environments. After installation, start the server with hayhooks run, which launches an HTTP API for each deployed pipeline or agent. The server supports Open WebUI as a streaming backend, an embedded Chainlit UI, and MCP endpoints, so you can interact with your Haystack components from tools like Cursor, Claude Desktop, or other MCP clients. Use it to deploy, undeploy, list, or run pipelines and agents through MCP tool invocations or direct HTTP requests. The quick start shows how to install Hayhooks, start the server, and deploy a minimal agent wrapper that exposes a /my_agent/run endpoint as well as a streaming /chat/completions interface, illustrating how Hayhooks bridges Haystack with MCP-based tooling.
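Once a pipeline or agent is deployed, its run endpoint can be called with a plain HTTP POST. A minimal client sketch using only the standard library is shown below; the host, port, agent name ("my_agent"), and the "question" parameter are assumptions taken from the quick start example, not fixed names.

```python
import json
from urllib import request

# Hypothetical endpoint: adjust host/port and the agent name
# ("my_agent") to match your deployment.
HAYHOOKS_URL = "http://localhost:1416/my_agent/run"

def build_run_payload(question: str) -> bytes:
    # The run endpoint accepts a JSON body whose keys mirror the
    # wrapper's run_api() parameters (here, a single "question").
    return json.dumps({"question": question}).encode("utf-8")

def run_agent(question: str) -> dict:
    req = request.Request(
        HAYHOOKS_URL,
        data=build_run_payload(question),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # Requires a running Hayhooks server with the agent deployed.
    with request.urlopen(req) as resp:
        return json.load(resp)
```

The same request can of course be issued with curl or any HTTP client; only the JSON body shape matters.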
How to install
Prerequisites:
- Python 3.8+ installed on your system
- pip available in your PATH
- Internet access to install dependencies
Step 1: Create a Python environment (optional but recommended)
- macOS/Linux: python3 -m venv hayhooks-env && source hayhooks-env/bin/activate
- Windows: python -m venv hayhooks-env && hayhooks-env\Scripts\activate
Step 2: Install Hayhooks
pip install hayhooks
Step 3: Run Hayhooks (MCP server)
hayhooks run
Step 4: Deploy a pipeline or agent (example shown in docs)
- Prepare a wrapper directory as described in the Hayhooks quick start, then deploy it via the Hayhooks CLI, for example:
hayhooks pipeline deploy-files -n my_agent ./my_agent_dir
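The wrapper directory above contains a pipeline wrapper file. A minimal sketch of its shape is shown below, following the wrapper interface from the Hayhooks quick start; note that a real wrapper imports its base class from hayhooks, while here a stand-in base class and a trivial echo callable are used so the sketch is self-contained, and the "question" parameter name is an illustrative assumption.

```python
# Real wrapper files import the base class from hayhooks:
#     from hayhooks import BasePipelineWrapper
# A stand-in is defined here so the sketch runs without dependencies.
class BasePipelineWrapper:
    def setup(self) -> None: ...
    def run_api(self, *args, **kwargs): ...

class PipelineWrapper(BasePipelineWrapper):
    def setup(self) -> None:
        # Build your Haystack pipeline or agent here; a trivial echo
        # callable stands in for a real Pipeline in this sketch.
        self.pipeline = lambda question: f"echo: {question}"

    def run_api(self, question: str) -> str:
        # Hayhooks maps run_api()'s signature onto the request body,
        # so POST /my_agent/run with {"question": "..."} reaches this.
        return self.pipeline(question)
```

The parameters of run_api() define the JSON fields the deployed endpoint accepts, which is why the deploy step needs only the wrapper directory.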
Step 5: Access the server
- REST API endpoint (default port 1416): http://localhost:1416/my_agent/run
- MCP tooling endpoints are exposed for integration with MCP clients.
Note: If you prefer using a Docker-based workflow, you can run Hayhooks in a container and map port 1416 to your host. See the project documentation for Docker usage and additional configuration options.
Additional notes
Tips and caveats:
- The default port is 1416; set the PORT environment variable (e.g., in your MCP client's mcp_config) if you need a different port.
- Hayhooks supports both YAML-based and wrapper-based Haystack pipeline deployments; refer to the documentation for details on both approaches.
- When integrating with MCP clients (e.g., Cursor, Claude Desktop), expose each pipeline or agent as an MCP Tool via Hayhooks so they show up as usable tools in your MCP environment.
- For streaming OpenAI-compatible chat interfaces, you can leverage the /chat/completions endpoint with streaming enabled as demonstrated in the quick start example.
- If you encounter issues, enable verbose logging in Hayhooks and check for Python dependency compatibility with your Haystack version.
- To uninstall or upgrade, use pip uninstall hayhooks or pip install --upgrade hayhooks respectively.
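For the streaming chat interface mentioned in the tips above, the /chat/completions endpoint takes an OpenAI-style request body. A hedged sketch of building one is below; selecting the deployed pipeline via the "model" field is an assumption based on the OpenAI-compatible convention, so check the Hayhooks docs for the exact contract.

```python
import json

def build_chat_payload(pipeline_name: str, user_message: str,
                       stream: bool = True) -> str:
    # OpenAI-style chat body; addressing the deployed pipeline through
    # the "model" field is an assumption, not confirmed by this page.
    return json.dumps({
        "model": pipeline_name,
        "messages": [{"role": "user", "content": user_message}],
        "stream": stream,
    })
```

With "stream": true, responses arrive as server-sent chunks, which is what Open WebUI-style frontends consume.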
Related MCP Servers
lihil
A 2x faster ASGI web framework for Python, offering high-level development with low-level performance.
mcp-odoo
A Model Context Protocol (MCP) server that enables AI assistants to securely interact with Odoo ERP systems through standardized resources and tools for data retrieval and manipulation.
ContextPods
Model Context Protocol management suite/factory. An MCP that can generate and manage other local MCPs in multiple languages. Uses the official SDKs for code gen.
BinAssistMCP
Binary Ninja plugin to provide MCP functionality.
mcp-docy
A Model Context Protocol server that provides documentation access capabilities. This server enables LLMs to search and retrieve content from documentation websites by scraping them with crawl4ai. Built with FastMCP v2.
mockloop
Intelligent Model Context Protocol (MCP) server for AI-assisted API development. Generate mock servers from OpenAPI specs with advanced logging, performance analytics, and server discovery. Optimized for AI development workflows with comprehensive testing insights and automated analysis.