flock
Flock is a workflow-based low-code platform for rapidly building chatbots, RAG pipelines, agents, and multi-agent applications, powered by LangGraph, LangChain, FastAPI, and NextJS.
claude mcp add --transport stdio onelevenvy-flock npx -y onelevenvy-flock
How to use
Flock is a flexible, low-code workflow platform for orchestrating collaborative agents and tools. Built on LangChain and LangGraph, it provides a suite of node types, from Input and LLM nodes to Tool Nodes and Subgraph Nodes, so you can assemble end-to-end workflows that process inputs, reason with LLMs, use retrieved data, and execute tools or actions. The MCP integration connects to multiple MCP servers and loads their tools as LangGraph-compatible tools, supporting both the standard stdio and SSE transport modes for communication. This makes it possible to convert MCP tools into LangChain tools, orchestrate multi-agent tasks, and embed MCP-driven tooling directly within your workflows. You can compose complex workflows that route based on intent, validate outputs via Human-in-the-loop nodes, and reuse modular subgraphs for maintainability.
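The stdio transport mentioned above exchanges JSON-RPC 2.0 messages with the MCP server over stdin/stdout. As a minimal sketch (not Flock's actual client code), this shows the shape of the request a client would write to a server such as `onelevenvy-flock` to enumerate its tools; `tools/list` and `tools/call` are standard MCP method names:

```python
import json

def make_jsonrpc_request(method, params=None, request_id=1):
    """Build a JSON-RPC 2.0 request string, as used by MCP's stdio transport."""
    req = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        req["params"] = params
    return json.dumps(req)

# Ask the server which tools it exposes; the resulting line would be
# written to the server process's stdin, one message per line.
list_request = make_jsonrpc_request("tools/list")

# Invoke a tool by name (the tool name and arguments here are placeholders).
call_request = make_jsonrpc_request(
    "tools/call",
    params={"name": "example_tool", "arguments": {}},
    request_id=2,
)
```

In practice a client library (such as Flock's MCP integration) handles this framing for you; the sketch only illustrates what travels over the wire.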
How to install
Prerequisites:
- Node.js (v18.x recommended) and npm or yarn installed on your machine.
- Internet access to fetch packages from the npm registry.

Installation steps:
1) Verify Node.js and npm are available:
   node -v
   npm -v

2) Install and run the Flock MCP server via npx (recommended):
   npx -y onelevenvy-flock

   This fetches and launches the Flock MCP server from the npm registry. If you prefer a global package install, you can also try:
   npm install -g onelevenvy-flock
   flock-server

3) Check the available CLI options (if the server was not started by the npx command):
   flock --help

4) Configure your MCP clients to connect to the server using the supported transport (stdio or SSE) as documented by the project.

Note: some environments may require additional build tools for native modules; install build-essential (Linux) or the Xcode Command Line Tools (macOS) if prompted.
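The client configuration in step 4 varies by client. As an illustration, a Claude Desktop-style configuration mirroring the `claude mcp add` command above might look like the following (the `mcpServers` key is the convention used by Claude-family clients; check your own client's documentation for the exact schema):

```json
{
  "mcpServers": {
    "onelevenvy-flock": {
      "command": "npx",
      "args": ["-y", "onelevenvy-flock"]
    }
  }
}
```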
Additional notes
Tips and notes:
- The MCP integration supports multiple transport modes; stdio and SSE are the most common. Choose the one that best fits your deployment environment.
- When enabling MCP tools in LangChain, you can convert MCP tools into LangChain tools for seamless integration with LangGraph agents.
- If you encounter port or binding issues, verify that the host machine allows the server to listen on the configured port and that any firewalls allow incoming connections.
- Use the Subgraph Node to encapsulate reusable workflow fragments, and the Human-in-the-loop node to add human validation or intervention to tool calls and outputs.
- Check the project release notes for updates and new MCP tools, to stay current with capabilities like Streamable HTTP MCP tools and enhanced agent support.
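For the port-binding tip above, a quick self-contained check can tell you whether something is already listening on the port you intend to use. This is a sketch; port 8000 is a placeholder, not a documented Flock default:

```python
import socket

def port_is_free(host: str, port: int) -> bool:
    """Return True if nothing currently accepts TCP connections on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(1.0)
        # connect_ex returns 0 when something is listening,
        # and a nonzero errno when the port appears free.
        return s.connect_ex((host, port)) != 0

# Placeholder port; substitute the one your server is configured to use.
free = port_is_free("127.0.0.1", 8000)
```

If the check reports the port as busy, either stop the conflicting process or reconfigure the server to listen elsewhere.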
Related MCP Servers
better-chatbot
Just a Better Chatbot. Powered by Agent & MCP & Workflows.
headroom
The Context Optimization Layer for LLM Applications
evo-ai
Evo AI is an open-source platform for creating and managing AI agents, enabling integration with different AI models and services.
argo
ARGO is an open-source AI Agent platform that brings local Manus to your desktop. With one-click model downloads, seamless closed-LLM integration, and offline-first RAG knowledge bases, ARGO becomes a DeepResearch powerhouse for autonomous thinking and task planning, while 100% of your data stays local. Supports Win/Mac/Docker.
CoexistAI
CoexistAI is a modular, developer-friendly research assistant framework. It enables you to build, search, summarize, and automate research workflows using LLMs, web search, Reddit, YouTube, and mapping tools, all via simple MCP tool calls, API calls, or Python functions.
Autono
A ReAct-Based Highly Robust Autonomous Agent Framework.