surfline
MCP server from englishar/surfline-mcp-server
claude mcp add --transport stdio englishar-surfline-mcp-server npm run dev \
  --env GOOGLE_CLIENT_ID="Your Google OAuth Client ID" \
  --env GOOGLE_CLIENT_SECRET="Your Google OAuth Client Secret" \
  --env COOKIE_ENCRYPTION_KEY="A random hex string used to encrypt cookies"
How to use
This MCP server exposes rich surf data from Surfline over the MCP interface: current conditions, detailed swell analysis, forecasts, tides, and forecaster notes. It works with Claude and any MCP-compatible client, so you can query a single combined report or fetch specific data through individual tools.
The primary tool, get_complete_surf_report, returns everything in one call: forecaster notes, sunrise/sunset times, tide information, current conditions for all covered spots, a swell breakdown, and 8-hour predictions. If you only need particular data, use the specialized tools: get_surf_forecast for spot conditions, get_forecaster_notes for human observations, get_tides for tide data, and get_best_spot for a recommended spot. The server is built on Surfline's public endpoints, with authentication handled via Google OAuth when accessed through Claude integrations, so no Surfline API keys are needed for secure access.
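As a sketch of what a client exchange looks like at the protocol level, the snippet below builds an MCP tools/call request for get_complete_surf_report. The tool name comes from this server's tool list; the request id and the empty argument object are illustrative, and a real client would send this over the configured transport rather than just printing it:

```typescript
// Build a JSON-RPC 2.0 "tools/call" request as defined by the MCP spec.
// The tool name comes from this server; the request id is arbitrary.
interface ToolCallRequest {
  jsonrpc: "2.0";
  id: number;
  method: "tools/call";
  params: { name: string; arguments: Record<string, unknown> };
}

function buildToolCall(
  tool: string,
  args: Record<string, unknown> = {}
): ToolCallRequest {
  return {
    jsonrpc: "2.0",
    id: 1,
    method: "tools/call",
    params: { name: tool, arguments: args },
  };
}

// One combined report in a single call:
const req = buildToolCall("get_complete_surf_report");
console.log(JSON.stringify(req));
```

The granular tools (get_surf_forecast, get_tides, and so on) would be invoked the same way, passing their spot or region parameters in the arguments object.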
How to install
Prerequisites:
- Node.js 18+
- A Cloudflare account (free tier works)
- A Google Cloud project for OAuth (free)
Installation and setup:
- Clone the repository and install dependencies:
git clone https://github.com/englishar/surfline-mcp-server.git
cd surfline-mcp-server
npm install
- Configure Google OAuth credentials:
- Go to Google Cloud Console > APIs & Services > Credentials
- Create a new OAuth 2.0 Client ID (Web application)
- Add the authorized redirect URIs for your local development server (http://localhost:8788) and your deployed worker
- Note your Client ID and Client Secret and set them in your environment or secret store
- Create a KV namespace for OAuth state:
npx wrangler kv namespace create OAUTH_KV
Update wrangler.jsonc with the returned KV ID.
- Set required secrets:
echo 'YOUR_GOOGLE_CLIENT_ID' | npx wrangler secret put GOOGLE_CLIENT_ID
echo 'YOUR_GOOGLE_CLIENT_SECRET' | npx wrangler secret put GOOGLE_CLIENT_SECRET
echo $(openssl rand -hex 32) | npx wrangler secret put COOKIE_ENCRYPTION_KEY
- Start development server locally:
npm run dev
The server will be available at http://localhost:8788. For production deployment, follow your Cloudflare Workers deployment workflow via Wrangler.
- Optional: Test with MCP Inspector to verify the MCP endpoints:
npx @modelcontextprotocol/inspector
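After creating the KV namespace in the step above, the binding in wrangler.jsonc might look like the following sketch. The binding name OAUTH_KV matches the create command; the id placeholder must be replaced with the value Wrangler printed, and the rest of your existing configuration stays as-is:

```jsonc
{
  // Bind the namespace created by `npx wrangler kv namespace create OAUTH_KV`
  "kv_namespaces": [
    {
      "binding": "OAUTH_KV",
      "id": "<id returned by wrangler>"
    }
  ]
}
```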
Additional notes
- This MCP server uses Surfline’s public endpoints; no Surfline API keys are required for basic forecast data, but webcam and premium features are not available.
- To extend coverage to more spots, edit src/index.ts and add new entries to SANTA_CRUZ_SPOTS (or corresponding region map).
- Ensure OAuth credentials are kept secure; use the Cloudflare Secrets store in production.
- If you encounter CORS or OAuth redirect issues, verify the authorized redirect URIs in Google Cloud Console and the worker URL in wrangler configuration.
- The cloud deployment uses Cloudflare Workers with Durable Objects for OAuth state and KV for token persistence; monitor usage to stay within free tier limits.
- You can query either the combined data (get_complete_surf_report) or the granular endpoints (get_surf_forecast, get_forecaster_notes, get_tides, get_best_spot) depending on your AI workflow needs.
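To illustrate the spot-map edit mentioned in the tips above, here is a sketch. The SANTA_CRUZ_SPOTS name comes from the project, but the entry shape and the placeholder ids are assumptions; the real type in src/index.ts may differ, and actual Surfline spot ids must be looked up:

```typescript
// Hypothetical shape of the spot map in src/index.ts: display names
// mapped to Surfline spot ids. The real project's entry type may differ.
const SANTA_CRUZ_SPOTS: Record<string, string> = {
  // existing entries ...
  "Steamer Lane": "SPOT_ID_PLACEHOLDER", // replace with the real Surfline spot id
};

// A new region map could follow the same pattern:
const NEW_REGION_SPOTS: Record<string, string> = {
  "Example Spot": "ANOTHER_SPOT_ID_PLACEHOLDER",
};
```

Whatever the exact shape, the tools that iterate over the map would pick up the new entries without further changes, as long as the ids are valid.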