mcp-flowcore-platform
MCP server for managing and interacting with the Flowcore Platform
claude mcp add --transport stdio flowcore-io-mcp-flowcore-platform npx @flowcore/platform-mcp-server \
  --env PAT="Flowcore PAT (Personal Access Token)" \
  --env USERNAME="Flowcore username"
How to use
This MCP server exposes the Flowcore Platform via the Model Context Protocol (MCP), allowing AI assistants and automation to query, manage, and interact with Flowcore resources through a standardized interface. The server can be run with npx (no local install required) by providing your Flowcore credentials. Once running, you can issue MCP queries and commands against the Flowcore Platform endpoints the server supports, giving programmatic access to resources, configurations, and state.
To use it, start the server with your Flowcore username and Personal Access Token (PAT), supplied either as environment variables or as command-line arguments. Typical usage is through npx, which fetches the latest package version and starts the MCP interface. Once started, you can connect the server to your AI tooling or chat assistants to perform actions such as listing resources, querying metadata, and executing platform operations via MCP-compliant requests. For local testing or development, you can also explore the related local read model MCP server mentioned in the repository to compare capabilities and performance.
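Many MCP clients that read a JSON configuration file (Claude Desktop, for example) can register the same stdio server declaratively. The sketch below is illustrative: the "flowcore-platform" key and the placeholder credential values are assumptions, while the package name and the PAT/USERNAME variable names come from the install command above.

```json
{
  "mcpServers": {
    "flowcore-platform": {
      "command": "npx",
      "args": ["@flowcore/platform-mcp-server"],
      "env": {
        "PAT": "<your Flowcore Personal Access Token>",
        "USERNAME": "<your Flowcore username>"
      }
    }
  }
}
```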
How to install
Prerequisites:
- Node.js installed (npx ships with npm 5.2+).
- Internet access to fetch the package from the npm registry.
Option A: Run without installing (npx)
- Ensure Node.js is installed.
- Run:
npx @flowcore/platform-mcp-server --username <username> --pat <pat>
- The MCP server starts and awaits requests using your provided credentials.
Option B: Install globally
- Ensure Node.js is installed.
- Install the package globally:
npm install -g @flowcore/platform-mcp-server
- Run the server:
platform-mcp-server --username <username> --pat <pat>
Option C: Development workflow (optional)
- Install Bun (for development steps in the repository):
bun install
- Run directly with Bun (for development):
bun run src/index.ts --username <username> --pat <pat>
- Build and run the built project (if applicable):
bun run build
node dist/cli.js --username <username> --pat <pat>
Additional notes
- Environment variables: USERNAME and PAT are required. They correspond to your Flowcore username and Personal Access Token.
- Authentication: if you encounter authentication issues, double-check that your PAT has the scopes needed for MCP interactions.
- Client compatibility: the server is designed for MCP-compliant clients; ensure your client sends properly structured MCP requests.
- Local testing: if you're experimenting locally, the repository's guidance covers using the local read model MCP server for comparison and potential latency improvements.
- Troubleshooting: verify your Node.js and npm versions, network access to the npm registry, and that the environment variables are set correctly in your execution environment.
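Since missing credentials are the most common failure mode, a small wrapper script can fail fast before the server ever starts. This is a minimal sketch, assuming the USERNAME and PAT environment variables described above; the check_credentials helper name is hypothetical, and the final echo stands in for the actual npx launch (shown in the comment).

```shell
#!/usr/bin/env bash
# Sketch: fail fast if the credentials described above are missing.
# Variable names (USERNAME, PAT) match the server's expected environment.
set -u

check_credentials() {
  if [ -z "${USERNAME:-}" ] || [ -z "${PAT:-}" ]; then
    echo "error: USERNAME and PAT must both be set" >&2
    return 1
  fi
  return 0
}

if check_credentials; then
  # Hand off to the server once the credentials are present:
  #   exec npx @flowcore/platform-mcp-server --username "$USERNAME" --pat "$PAT"
  echo "credentials ok for user: $USERNAME"
fi
```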
Related MCP Servers
context7
Context7 MCP Server -- Up-to-date code documentation for LLMs and AI code editors
obsidian-tools
Add Obsidian integrations like semantic search and custom Templater prompts to Claude or any MCP client.
MiniMax-JS
Official MiniMax Model Context Protocol (MCP) JavaScript implementation that provides seamless integration with MiniMax's powerful AI capabilities including image generation, video generation, text-to-speech, and voice cloning APIs.
mcp-bundler
Is the MCP configuration too complicated? You can easily share your own simplified setup!
akyn-sdk
Turn any data source into an MCP server in 5 minutes. Build AI-agents-ready knowledge bases.
promptboard
The Shared Whiteboard for Your AI Agents via MCP. Paste screenshots, mark them up, and share with AI.