# mcp-chat
Examples of using Pipedream's MCP server in your app or AI agent.
```sh
claude mcp add --transport stdio pipedreamhq-mcp-chat node path/to/server.js \
  --env AUTH_SECRET="your-auth-secret" \
  --env EXA_API_KEY="your-exa-api-key" \
  --env POSTGRES_URL="postgresql://postgres@localhost:5432/postgres" \
  --env OPENAI_API_KEY="your-openai-api-key" \
  --env PIPEDREAM_CLIENT_ID="your-pipedream-client-id" \
  --env PIPEDREAM_PROJECT_ID="your-pipedream-project-id" \
  --env PIPEDREAM_CLIENT_SECRET="your-pipedream-client-secret" \
  --env PIPEDREAM_PROJECT_ENVIRONMENT="your-environment"
```
## How to use
MCP Chat is a chat application built on the Model Context Protocol (MCP) that can access thousands of APIs via Pipedream's MCP integrations. It uses the AI SDK to orchestrate model calls, tool invocations, and structured responses within a single chat interface. Built-in authentication, chat persistence, and multi-model support let you build interactive AI-powered conversations that call external tools, query databases, or fetch real-time data. To use it, connect your MCP-enabled tools and models (e.g., OpenAI, Anthropic, Gemini) and sign in with the supported auth flow. The app demonstrates how to make tool calls directly from the chat, enabling automated actions across many APIs without leaving the conversation, while persistence and sign-in provide a continuous, personalized experience for each user.
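The core pattern — discovering an MCP server's tools and handing them to a model — can be sketched with the AI SDK. This is a minimal illustration, not the app's actual code: it assumes the `ai` and `@ai-sdk/openai` packages, a stdio MCP server, and placeholder paths.

```typescript
// Sketch: wire an MCP server's tools into an AI SDK chat call.
// Assumes the Vercel AI SDK ("ai", "@ai-sdk/openai"); the server
// path and model name below are illustrative placeholders.
import { experimental_createMCPClient, streamText } from 'ai';
import { Experimental_StdioMCPTransport } from 'ai/mcp-stdio';
import { openai } from '@ai-sdk/openai';

async function chat(prompt: string) {
  // Connect to the MCP server over stdio and discover its tools.
  const mcpClient = await experimental_createMCPClient({
    transport: new Experimental_StdioMCPTransport({
      command: 'node',
      args: ['path/to/server.js'],
    }),
  });
  const tools = await mcpClient.tools();

  // Let the model invoke MCP tools while streaming its answer.
  const result = streamText({
    model: openai('gpt-4o'),
    tools,
    prompt,
    onFinish: () => mcpClient.close(),
  });

  for await (const chunk of result.textStream) {
    process.stdout.write(chunk);
  }
}
```

In the real app this loop runs behind the chat UI; the same `tools` object is what lets a single conversation fan out to any API the MCP server exposes.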
## How to install
Prerequisites:
- Node.js and pnpm installed on your machine
- A Pipedream account and a configured MCP project
- OpenAI API key and optional credentials for other providers
Steps:

1. Clone the repository:

   ```sh
   git clone https://github.com/PipedreamHQ/mcp-chat.git
   cd mcp-chat
   ```

2. Install dependencies using pnpm:

   ```sh
   pnpm install
   ```

3. Create the environment configuration:

   ```sh
   cp .env.example .env
   # Edit .env with your credentials
   ```

4. Start the development server:

   ```sh
   pnpm dev
   ```

5. Open the app in your browser at http://localhost:3000.
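After copying `.env.example`, the file should end up looking roughly like the following. This is a sketch assembled from the environment variables in the `claude mcp add` example above; all values are placeholders for your own credentials.

```sh
AUTH_SECRET="your-auth-secret"
OPENAI_API_KEY="your-openai-api-key"
EXA_API_KEY="your-exa-api-key"
POSTGRES_URL="postgresql://postgres@localhost:5432/postgres"
PIPEDREAM_CLIENT_ID="your-pipedream-client-id"
PIPEDREAM_CLIENT_SECRET="your-pipedream-client-secret"
PIPEDREAM_PROJECT_ID="your-pipedream-project-id"
PIPEDREAM_PROJECT_ENVIRONMENT="your-environment"
```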
Optional: Deploy to production (e.g., Vercel) by following the Deploy Your Own instructions in the README and setting the required environment variables in your hosting platform.
## Additional notes
Tips and caveats:
- Ensure your OpenAI API key and any MCP provider keys are kept secure and not checked into source control.
- If you enable chat persistence and authentication in development, you may need to run additional services (e.g., a local Postgres instance via Docker) and run migrations as shown in the README.
- The app relies on Pipedream MCP for API access; make sure MCP permissions and OAuth credentials are properly configured.
- When testing tool calls, monitor network requests and tool response formats to ensure compatibility with your LLM's expectations.
- If you run into port conflicts, adjust the dev server port in your environment or hosting platform configuration.
- For local development, using asdf to manage core dependencies (like Node versions) is recommended as described in the README.
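For the persistence tip above, one quick way to get a local Postgres that matches the example `POSTGRES_URL` (passwordless `postgres` user on the default port) is the official Docker image. The container name and image tag here are my own choices, not from the README:

```shell
# Local Postgres for chat persistence; trust auth matches the
# passwordless postgresql://postgres@localhost:5432/postgres URL.
docker run -d --name mcp-chat-postgres \
  -e POSTGRES_HOST_AUTH_METHOD=trust \
  -p 5432:5432 \
  postgres:16
```

Once the container is up, run the migrations as described in the README before enabling persistence in development.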