chatgpt-app-typescript-template
ChatGPT app template using Pomerium, OpenAI Apps SDK and Model Context Protocol (MCP), with a Node.js server and React widgets.
To add this server to Claude via the Claude CLI:

claude mcp add --transport stdio pomerium-chatgpt-app-typescript-template node server/src/server.ts
How to use
This MCP server is a Node.js backend that hosts the Echo tool example and serves as the MCP endpoint for ChatGPT apps. It preserves `_meta` fields in tool responses and works alongside a React widget UI that can call tools via `window.openai.callTool`. The Echo tool demonstrates input validation, structured responses, and widget integration, so you can test end-to-end tool invocation from ChatGPT.

To use it, start the server and the widget development server, then connect your ChatGPT app to the MCP endpoint (usually `/mcp`) and try a tool invocation such as `echo today is a great day` in a chat. You can inspect tool definitions, run local tests with MCP Inspector, and view widget output in the interactive demo widget.
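As a rough sketch of the pattern the Echo tool follows, a handler validates its input, returns a structured response, and passes any `_meta` fields through untouched. The template itself uses Zod schemas and the MCP TypeScript SDK types; the `ToolResult` shape and `handleEcho` function below are hypothetical stand-ins to illustrate the flow:

```typescript
// Hypothetical shapes illustrating the Echo tool pattern; the real template
// uses Zod validation and the MCP TypeScript SDK instead of hand-rolled checks.
interface ToolResult {
  content: { type: "text"; text: string }[];
  isError?: boolean;
  _meta?: Record<string, unknown>; // preserved so the widget can read it
}

// Validate input the way a Zod schema would: a non-empty string `message`.
function handleEcho(args: unknown, meta?: Record<string, unknown>): ToolResult {
  const message = (args as { message?: unknown } | null)?.message;
  if (typeof message !== "string" || message.length === 0) {
    return {
      content: [{ type: "text", text: "Invalid input: expected { message: string }" }],
      isError: true,
    };
  }
  // Structured response; _meta is passed through for the widget to consume.
  return { content: [{ type: "text", text: `Echo: ${message}` }], _meta: meta };
}
```

The key detail is the last line: whatever `_meta` the caller supplied is copied onto the result rather than dropped, which is what lets the React widget receive tool metadata end to end.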
How to install
Prerequisites:
- Node.js 22+ (ES2023 support)
- npm 10+ (ships with Node 22)
- Clone the repository:
git clone https://github.com/pomerium/chatgpt-app-typescript-template your-chatgpt-app
cd your-chatgpt-app
- Install dependencies (server + widgets):
npm install
- Run the development environment (server + widgets):
npm run dev
This starts the MCP server (http://localhost:8080) and the widget assets server (http://localhost:4444).
- Optional: Run only the server in watch mode:
npm run dev:server
- Optional: Run only the widget dev server:
npm run dev:widgets
For a smooth local development workflow, follow the Quick Start in the README when setting up prerequisites and your environment.
Additional notes
Tips and common considerations:
- The MCP server exposes /mcp and /health endpoints. Use npm run inspect for local testing with the MCP Inspector.
- The Echo tool showcases Zod validation and a widget response; you can customize or add new tools following the same pattern.
- Ensure Node.js 22+ and npm 10+ are installed to support ES2023 features.
- For production deployment, the server and widgets are built separately; see npm run build and related commands in the Available Commands section.
- If connecting from ChatGPT, ensure your public URL and /mcp path are reachable and that the server is health-checked before enabling the connector in ChatGPT.
- Logs are structured with Pino; in development, pretty printing is enabled for readability.
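The health check mentioned above can be scripted before you register the connector. A minimal sketch, with the caveat that `checkHealth` and its injected fetcher are hypothetical helpers (the template's actual `/health` response shape may differ); the fetcher is injected so the probe can be exercised without a running server, and in real use you would pass the global `fetch` that Node 18+ provides:

```typescript
// Minimal readiness probe for the /health endpoint.
type Fetcher = (url: string) => Promise<{ ok: boolean }>;

async function checkHealth(
  baseUrl: string,
  fetcher: Fetcher = fetch,
): Promise<boolean> {
  try {
    // Normalize a trailing slash so "http://host/" and "http://host" both work.
    const res = await fetcher(`${baseUrl.replace(/\/$/, "")}/health`);
    return res.ok;
  } catch {
    return false; // an unreachable server counts as unhealthy
  }
}
```

In practice you would call `await checkHealth("http://localhost:8080")` and only point ChatGPT at your public `/mcp` URL once it returns true.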