workflowy
Powerful CLI and MCP server for WorkFlowy: reports, search/replace, backup support, and AI integration (Claude, LLMs)
claude mcp add --transport stdio mholzen-workflowy workflowy mcp --expose=all \
  --env WORKFLOWY_API_KEY="your-api-key-if-needed" \
  --env WORKFLOWY_API_BASE_URL="https://workflowy.com/api"
How to use
This MCP server wraps the Workflowy CLI and exposes a set of read and write tools that you can invoke via an MCP-compatible AI assistant. The server provides read tools for querying and inspecting your outline (such as workflowy_get, workflowy_list, workflowy_search, workflowy_id, and various report tools), as well as write tools to create, update, move, delete, and transform nodes (workflowy_create, workflowy_update, workflowy_move, workflowy_delete, workflowy_complete, workflowy_uncomplete, workflowy_replace, workflowy_transform). With the MCP interface you can compose prompts that leverage these capabilities, for example requesting a list of overdue tasks, transforming a node’s content, or bulk-replacing text across your outline. You can connect this server to Claude, ChatGPT, or any MCP-enabled assistant using the standard MCP workflow, and expose the appropriate read/write tools in your assistant’s configuration.
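As a concrete illustration, an MCP tools/call request for one of the read tools might look like the following sketch. The request shape follows the standard MCP JSON-RPC format; the argument name (query) is an assumption for illustration, so consult the server's tool schema for the exact parameters:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "workflowy_search",
    "arguments": {
      "query": "overdue"
    }
  }
}
```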
To use the server with Claude Desktop or Claude Code, configure the MCP server entry (workflowy) in your Claude config or Claude Desktop config file, and start the server with the appropriate exposure flags. Then instruct your assistant to invoke the desired tools (e.g., workflowy_search, workflowy_replace, workflowy_transform) to interact with your Workflowy data. The tools are designed to be safe by default, with more powerful write operations requiring explicit exposure settings.
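For Claude Desktop, the server entry follows the standard mcpServers config format; a minimal sketch, assuming the workflowy executable is on your PATH and using placeholder env values:

```json
{
  "mcpServers": {
    "workflowy": {
      "command": "workflowy",
      "args": ["mcp", "--expose=all"],
      "env": {
        "WORKFLOWY_API_KEY": "your-api-key-if-needed"
      }
    }
  }
}
```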
How to install
Prerequisites:
- A supported environment with Go toolchain installed or access to the Workflowy CLI binary via Homebrew, Scoop, or prebuilt binaries.
- Workflowy API access key if you plan to use API-backed features (optional for local CLI usage).
Installation steps (example using the CLI as the MCP server):
- Install the Workflowy CLI if you haven’t already:
  - macOS/Linux via Homebrew: brew install mholzen/workflowy/workflowy-cli
  - Windows via Scoop: scoop bucket add workflowy https://github.com/mholzen/scoop-workflowy, then scoop install workflowy
  - Or build from source with Go: go install github.com/mholzen/workflowy/cmd/workflowy@latest
- Ensure the CLI is accessible in your PATH (the executable is typically named 'workflowy').
- Run the MCP server using the embedded CLI with MCP exposure: workflowy mcp --expose=all
- Optional: set up your API key for Workflowy if required by your workflow: mkdir -p ~/.workflowy && echo "your-api-key-here" > ~/.workflowy/api.key
- If you’re wiring this into an AI assistant, use the mcp_config below to point the assistant at the server and tools you want to expose.
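The optional API-key step can be scripted. A minimal sketch, assuming the ~/.workflowy/api.key location used above (WORKFLOWY_DIR is an illustrative override, not a documented variable):

```shell
#!/bin/sh
set -eu

# Private config directory for the key file (location per the steps above).
KEY_DIR="${WORKFLOWY_DIR:-$HOME/.workflowy}"
mkdir -p "$KEY_DIR"

# Write the key and restrict read access to the current user.
printf '%s\n' "your-api-key-here" > "$KEY_DIR/api.key"
chmod 600 "$KEY_DIR/api.key"

# Warn (but do not fail) if the CLI is not yet on PATH.
command -v workflowy >/dev/null 2>&1 || echo "workflowy not found in PATH" >&2
```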
Additional notes
Tips and common considerations:
- The Workflowy MCP server leverages the same CLI you use manually; exposing all write tools (--expose=all) should be done with caution in production environments. For safer setups, selectively enable only the needed tools.
- If you encounter API rate limits or authentication issues, verify your API key and base URL, and consider using offline/backup options for read-only workflows.
- The MCP tools are categorized as Read Tools (safe) and Write Tools (state-changing). Ensure your integration respects the access level you intend to grant your assistant.
- For offline workflows, use the built-in capabilities of Workflowy (where supported) to access cached data or backup files as described in the CLI docs.
- If you customize environments, you may want to pin a specific version of the CLI to avoid breaking changes when new MCP tools are released.
- The following environment variable can help with authentication and API access: WORKFLOWY_API_KEY. Replace the placeholder with your actual key when needed.
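If your own scripts need the key, one way to wire this up is a small helper that prefers WORKFLOWY_API_KEY and falls back to the key file written during installation. This resolution order is an assumption for illustration, not documented CLI behavior:

```shell
#!/bin/sh
# resolve_api_key [keyfile]: print the Workflowy API key.
# Prefers the WORKFLOWY_API_KEY environment variable, then falls back
# to the key file (default: ~/.workflowy/api.key); fails if neither exists.
resolve_api_key() {
  keyfile="${1:-$HOME/.workflowy/api.key}"
  if [ -n "${WORKFLOWY_API_KEY:-}" ]; then
    printf '%s\n' "$WORKFLOWY_API_KEY"
  elif [ -f "$keyfile" ]; then
    cat "$keyfile"
  else
    echo "no API key configured" >&2
    return 1
  fi
}
```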
Related MCP Servers
rod
Model Context Protocol Server of Rod
gtm
An MCP server for Google Tag Manager. Connect it to your LLM, authenticate once, and start managing GTM through natural language.
memory
A MCP (Model Context Protocol) server providing long-term memory for LLMs
shellguard
MCP server that gives LLM agents read-only shell access over SSH
backlog
Help coding agents and developers to keep track of a project's backlog by storing tasks as markdown in git.
interop
Interop CLI: Go command-line tool for efficient project management and command execution across your development workspace.