PurpleSector
An AI-powered telemetry analysis tool for Assetto Corsa that helps drivers improve their lap times through real-time data visualization and intelligent coaching suggestions.
claude mcp add --transport stdio chrismarth-purplesector node server.js \
  --env WS_PORT="8080" \
  --env DATABASE_URL="file:./dev.db" \
  --env OPENAI_API_KEY="your_openai_api_key_here" \
  --env TELEMETRY_UDP_PORT="9996"
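If you prefer to configure the client by hand, the same registration can be expressed in Claude Desktop's `claude_desktop_config.json`. This is a sketch: the env values mirror the command above, and the path to `server.js` is a placeholder you should adjust to where you cloned the repository:

```json
{
  "mcpServers": {
    "chrismarth-purplesector": {
      "command": "node",
      "args": ["server.js"],
      "env": {
        "WS_PORT": "8080",
        "DATABASE_URL": "file:./dev.db",
        "OPENAI_API_KEY": "your_openai_api_key_here",
        "TELEMETRY_UDP_PORT": "9996"
      }
    }
  }
}
```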
How to use
Purple Sector is an AI-powered telemetry analysis tool for Assetto Corsa and Assetto Corsa Competizione. It ingests telemetry data from your sessions, visualizes key performance metrics in real time, and offers coaching suggestions powered by OpenAI-based analysis. The web interface runs locally and provides dashboards to review lap times, sector splits, and throttle, brake, and steering usage, alongside intelligent feedback to help drivers shave tenths off their lap times.

The system is designed to run in a development environment with Kafka-backed services and a database to store telemetry and insights. To get started, ensure you have a valid OpenAI API key and point the server at a local dev database. The tools exposed by Purple Sector include live telemetry streams, visualization dashboards, and AI-driven coaching prompts that you can tailor to your driving style and circuit.
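The kind of sector-split comparison the dashboards surface can be sketched in a few lines of Node.js. Note that the function names and data shape here are purely illustrative, not part of Purple Sector's actual API:

```javascript
// Illustrative sketch: compare a lap's sector times against a reference lap.
// The data shape (arrays of sector times in seconds) is an assumption for
// demonstration, not Purple Sector's internal format.
function sectorDeltas(lapSectors, referenceSectors) {
  // Positive delta = time lost in that sector relative to the reference.
  return lapSectors.map((t, i) => +(t - referenceSectors[i]).toFixed(3));
}

function lapTime(sectors) {
  return +sectors.reduce((sum, t) => sum + t, 0).toFixed(3);
}

const bestLap = [28.412, 41.077, 33.905]; // reference sector times (s)
const thisLap = [28.530, 40.988, 34.102];

console.log(sectorDeltas(thisLap, bestLap)); // per-sector gain/loss
console.log(lapTime(thisLap), lapTime(bestLap));
```

A real implementation would stream these values over the WebSocket connection rather than hold them in arrays, but the per-sector arithmetic is the same.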
How to install
Prerequisites:
- Node.js and npm installed on your machine
- A local clone of the Purple Sector Git repository
- Access to a local or dev database (the example uses a file-based SQLite-like setup via a dev DB URL)
Install and run locally:
- Install dependencies:
  npm install
- Create a file named .env.local in the repo root with the required keys:
  OPENAI_API_KEY=your_openai_api_key_here
  DATABASE_URL="file:./dev.db"
  WS_PORT=8080
  TELEMETRY_UDP_PORT=9996
- Initialize the database schema (if required by the project):
  npm run db:push
- Start the full development environment (Kafka, services, demo collector, frontend):
  npm run dev:start
- Open the web UI at http://localhost:3000
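Once .env.local is in place, the services read these values from the environment. A minimal sketch of the kind of defaulted lookup involved (the variable names come from the steps above; the helper itself is illustrative and not part of the codebase):

```javascript
// Illustrative sketch: read the ports and URLs configured in .env.local,
// falling back to the defaults used in this README. Not Purple Sector's
// actual configuration loader.
function readConfig(env = process.env) {
  return {
    wsPort: Number(env.WS_PORT ?? 8080),             // WebSocket port for live telemetry
    udpPort: Number(env.TELEMETRY_UDP_PORT ?? 9996), // UDP port the collector listens on
    databaseUrl: env.DATABASE_URL ?? "file:./dev.db",
    openaiKey: env.OPENAI_API_KEY ?? null,           // required for AI coaching features
  };
}

const cfg = readConfig();
console.log(cfg.wsPort, cfg.udpPort, cfg.databaseUrl);
```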
Additional notes
Environment variables: OPENAI_API_KEY is required for the AI coaching features. DATABASE_URL should point to a dev/test database. WS_PORT is the WebSocket port over which the frontend receives live telemetry; TELEMETRY_UDP_PORT is the UDP port the telemetry collector listens on. If you run into port conflicts, adjust these values in your .env.local.

The README points to a comprehensive documentation site at https://chrismarth.github.io/PurpleSector/ for architecture and manual startup details. For production usage, review the licensing terms (AGPL-3.0 for non-commercial use; a commercial license for commercial deployments) and ensure your deployment conforms to them.
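The telemetry collector receives binary datagrams on TELEMETRY_UDP_PORT. As a hypothetical sketch of what decoding such a packet looks like in Node.js (the three-float layout below is invented for illustration; Assetto Corsa's real UDP packet format is considerably more involved):

```javascript
// Hypothetical sketch of decoding a binary telemetry datagram. The layout
// (three little-endian 32-bit floats: speed, throttle, brake) is invented
// for illustration and does not match the game's actual packet format.
function parseTelemetryPacket(buf) {
  return {
    speedKmh: buf.readFloatLE(0),
    throttle: buf.readFloatLE(4), // 0.0 – 1.0
    brake: buf.readFloatLE(8),    // 0.0 – 1.0
  };
}

// Round-trip demo: encode a sample packet, then decode it.
const pkt = Buffer.alloc(12);
pkt.writeFloatLE(212.5, 0);
pkt.writeFloatLE(1.0, 4);
pkt.writeFloatLE(0.0, 8);
console.log(parseTelemetryPacket(pkt)); // { speedKmh: 212.5, throttle: 1, brake: 0 }
```

In the real system, a handler like this would sit inside a `dgram` socket's message callback bound to TELEMETRY_UDP_PORT, forwarding the decoded samples over the WebSocket on WS_PORT.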