
composer

Composable Streaming Intelligence

Installation
Run this command in your terminal to add the MCP server to Claude Code:

    claude mcp add --transport stdio winkjs-composer node server.js
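Once added, you can confirm the registration from the same terminal. `claude mcp list` and `claude mcp remove` are standard Claude Code CLI subcommands; the guard below just avoids a hard failure if the CLI is not on your PATH:

```shell
# Check that the winkjs-composer server from the command above is registered.
if command -v claude >/dev/null 2>&1; then
  claude mcp list          # the output should include a line for winkjs-composer
else
  echo "claude CLI not found on PATH"
fi
```

To unregister the server later, run `claude mcp remove winkjs-composer`.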

How to use

Wink Composer is a framework for building and running composable streaming-analytics pipelines, exposed as an MCP server. At its core is a Flow language that lets you declaratively stitch together small, focused building blocks (nodes) to process live streams, extract features, detect events, and persist insights.

The server produces AI-ready insights by turning streaming data into structured, contextualized results, so retrieval happens over insights rather than raw data, and it supports edge-to-cloud deployments with consistent code paths across environments. Its building-block nodes cover signal conditioning, feature extraction, detection, flow control, emission, and data quality, all orchestrated through a single Flow pipeline that scales from individual pipelines to fleet-wide analytics. With connected storage (e.g., QuestDB) and input/output integrations, Composer covers end-to-end streaming intelligence, from ingestion to persisted insights, while keeping state isolated per sensor, user, or session for resilience and scalability.

How to install

Prerequisites:

  • Node.js (LTS) installed on your system
  • npm or pnpm to install dependencies
  • Git to clone the repository (optional if downloading as a package)

Step-by-step installation:

  1. Clone the repository or download the source:

         git clone https://github.com/winkjs/composer.git
         cd composer

  2. Install dependencies:

         npm install

  3. Build or prepare the server if the project provides a build step (adjust as needed):

         npm run build

  4. Start the MCP server locally (adjust if your package defines a different start script):

         npm run start

  5. Verify the server is running by checking its logs or, if the project provides one, hitting a health endpoint. If you use a custom entry point, make sure server.js (or your specified entry file) is present and accessible.
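For step 5, a minimal liveness probe might look like the sketch below. The port and `/health` path are assumptions, not documented by this guide, so substitute whatever the server actually logs on startup:

```shell
# Hypothetical port and endpoint -- adjust both to your deployment.
PORT=3000
if curl -sf "http://localhost:${PORT}/health" >/dev/null 2>&1; then
  echo "server is up on port ${PORT}"
else
  echo "no response on port ${PORT}; check the startup logs instead"
fi
```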

Notes:

  • If your project uses environment variables for configuration (e.g., database URLs, broker addresses), create a .env file or export them in your shell before starting the server. The MCP config can also reference these env vars as needed by your deployment.
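For example, exported directly in the shell before startup. The variable names below are illustrative only, not documented by the project; check its README for the exact keys:

```shell
# Illustrative names; 9000 is QuestDB's default HTTP port, 1883 is MQTT's default.
export QUESTDB_URL="http://localhost:9000"
export MQTT_BROKER_URL="mqtt://localhost:1883"
# Alternatively, put the same KEY=value pairs in a .env file in the project root.
```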

Additional notes

Tips and considerations:

  • Environment variables: configure your data storage (QuestDB), message brokers (MQTT, Kafka), and AI/LLM integrations via environment variables as documented by the project.
  • Flow language: design modular pipelines using the six categories of blocks (Signal Conditioning, Feature Extraction, Detection, Flow Control, Emission, Data Quality) to keep pipelines readable and maintainable.
  • Observability: leverage built-in logging, metrics, and error handling to diagnose issues quickly in streaming scenarios.
  • Deployment: this MCP server is intended to run consistently from edge to cloud; keep dependencies and entry points identical to minimize drift between environments.
  • Common issues: ensure correct broker URLs, storage endpoints, and network access; verify that storage like QuestDB is reachable and listening on expected ports; confirm that the Flow steps reference valid node names and that any required prerequisites (e.g., QuestDB, MQTT broker) are up before starting.
  • Configuration: use the provided mcp_config block to wire the server into your MCP orchestration layer, and tailor env vars to your deployment environment.
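A quick way to act on the "common issues" bullet is a pre-flight TCP check before starting the server. The ports below are the common defaults for QuestDB's HTTP interface and for MQTT, and may differ in your setup:

```shell
# Pre-flight reachability check. Relies on bash's /dev/tcp pseudo-device;
# use "nc -z host port" instead if your shell doesn't support it.
check_tcp() {
  (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null \
    && echo "$1:$2 reachable" \
    || echo "$1:$2 unreachable"
}
check_tcp localhost 9000   # QuestDB HTTP (default port)
check_tcp localhost 1883   # MQTT broker (default port)
```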
