
mcp-chat

Examples of using Pipedream's MCP server in your app or AI agent.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio pipedreamhq-mcp-chat node path/to/server.js \
  --env AUTH_SECRET="your-auth-secret" \
  --env EXA_API_KEY="your-exa-api-key" \
  --env POSTGRES_URL="postgresql://postgres@localhost:5432/postgres" \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env PIPEDREAM_CLIENT_ID="your-pipedream-client-id" \
  --env PIPEDREAM_PROJECT_ID="your-pipedream-project-id" \
  --env PIPEDREAM_CLIENT_SECRET="your-pipedream-client-secret" \
  --env PIPEDREAM_PROJECT_ENVIRONMENT="your-environment"

How to use

MCP Chat is a chat application built on the MCP framework that accesses thousands of APIs via Pipedream MCP integrations. It uses the AI SDK to orchestrate model calls, tool invocations, and structured responses within a single chat interface, and ships with built-in authentication, data persistence, and multi-model support for interactive AI-powered conversations that can call external tools, query databases, or fetch real-time data.

To use it, connect your MCP-enabled tools and models (e.g., OpenAI, Anthropic, Gemini) and sign in with the supported auth flow. The app demonstrates how to perform tool calls from within the chat, enabling automated actions across various APIs without leaving the conversation, while persistence and sign-in provide a continuous, personalized chat experience.
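The tool-call loop described above can be sketched in a few lines. This is an illustrative, self-contained example of the general pattern (model emits a tool call, the app dispatches it and returns the result to the conversation); the names `ToolDef`, `registry`, and `dispatchToolCall` are invented for this sketch and are not the actual mcp-chat or AI SDK API — in mcp-chat the registry entries come from Pipedream MCP integrations instead.

```typescript
// Hypothetical shape of a tool: a description plus an executor that
// takes JSON arguments and returns a JSON-serializable result.
type ToolDef = {
  description: string;
  execute: (args: Record<string, unknown>) => unknown;
};

// Example registry with two toy tools standing in for MCP integrations.
const registry: Record<string, ToolDef> = {
  get_time: {
    description: "Return the current server time as an ISO string",
    execute: () => new Date().toISOString(),
  },
  add: {
    description: "Add two numbers",
    execute: (args) => (args.a as number) + (args.b as number),
  },
};

// The model emits { name, arguments }; the app looks the tool up, runs
// it, and feeds a tool-role message back into the chat transcript.
function dispatchToolCall(call: {
  name: string;
  arguments: Record<string, unknown>;
}): { role: "tool"; name: string; content: string } {
  const tool = registry[call.name];
  if (!tool) {
    return { role: "tool", name: call.name, content: `Unknown tool: ${call.name}` };
  }
  const result = tool.execute(call.arguments);
  return { role: "tool", name: call.name, content: JSON.stringify(result) };
}

console.log(dispatchToolCall({ name: "add", arguments: { a: 2, b: 3 } }));
```

In the real app this dispatch happens inside the AI SDK's streaming loop, but the contract is the same: named tools, JSON arguments in, serialized results back to the model.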

How to install

Prerequisites:

  • Node.js and pnpm installed on your machine
  • A Pipedream account and a configured MCP project
  • OpenAI API key and optional credentials for other providers

Steps:

  1. Clone the repository and enter it:
     git clone https://github.com/PipedreamHQ/mcp-chat.git
     cd mcp-chat

  2. Install dependencies using pnpm: pnpm install

  3. Create or copy environment configuration: cp .env.example .env # Edit with your credentials

  4. Start the development server: pnpm dev

  5. Open the app in your browser: http://localhost:3000
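Step 3 above copies .env.example into place. Judging from the environment variables passed in the Claude Code install command earlier, the file is shaped roughly like this (all values are placeholders; check .env.example in the repo for the authoritative list):

```
AUTH_SECRET="your-auth-secret"
EXA_API_KEY="your-exa-api-key"
POSTGRES_URL="postgresql://postgres@localhost:5432/postgres"
OPENAI_API_KEY="your-openai-api-key"
PIPEDREAM_CLIENT_ID="your-pipedream-client-id"
PIPEDREAM_CLIENT_SECRET="your-pipedream-client-secret"
PIPEDREAM_PROJECT_ID="your-pipedream-project-id"
PIPEDREAM_PROJECT_ENVIRONMENT="your-environment"
```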

Optional: Deploy to production (e.g., Vercel) by following the Deploy Your Own instructions in the README and setting the required environment variables in your hosting platform.

Additional notes

Tips and caveats:

  • Ensure your OpenAI API key and any MCP provider keys are kept secure and not checked into source control.
  • If you enable chat persistence and authentication in development, you may need to run additional services (e.g., a local Postgres instance via Docker) and run migrations as shown in the README.
  • The app relies on Pipedream MCP for API access; make sure MCP permissions and OAuth credentials are properly configured.
  • When testing tool calls, monitor network requests and tool response formats to ensure compatibility with your LLM's expectations.
  • If you run into port conflicts, adjust the dev server port in your environment or hosting platform configuration.
  • For local development, using asdf to manage core dependencies (like Node versions) is recommended as described in the README.
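The persistence tip above mentions running a local Postgres instance via Docker. One common way to start one that matches the passwordless POSTGRES_URL shown in the install command is the official postgres image with trust auth (image tag is a placeholder; the README's setup takes precedence):

```shell
docker run -d --name mcp-chat-postgres \
  -e POSTGRES_HOST_AUTH_METHOD=trust \
  -p 5432:5432 postgres:16
```

Remember to run the project's migrations afterwards, as noted in the README.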
