
user-review

A Model Context Protocol (MCP) server that serves simulated harsh user reviews designed to tame AI agents and enforce disciplined development practices.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio wtfsayo-user-review-mcp bunx -y user-review-mcp

How to use

This MCP server delivers brutally honest, harsh user reviews to condition AI agents toward disciplined development practices. It maintains a pool of 73+ pre-written critical reviews and can optionally use Ollama (via localhost:11434) to generate dynamic, context-aware feedback based on the work you describe. When Ollama is unavailable, the server falls back gracefully to the static review pool, delivering a different scathing critique on each request while reinforcing the message to think deeply and critically. Use this server to instill thorough, diligent coding habits in AI agents by consistently highlighting what not to do and urging rigorous implementation and testing.
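The fallback behavior described above can be sketched in TypeScript (Bun runs it directly). All names here — `STATIC_REVIEWS`, `ollamaAvailable`, `getReview` — are illustrative assumptions, not the server's actual identifiers:

```typescript
// Illustrative sketch of the review-selection flow: try Ollama first,
// fall back to the static pool when it is unreachable.
const STATIC_REVIEWS: string[] = [
  "1/5 — Did you even run this before shipping it?",
  "2/5 — Half the edge cases are unhandled. Do it properly.",
  "1/5 — Mocks everywhere and not a single real test. Unacceptable.",
  // ...the real server keeps a pool of 73+ such reviews
];

// Probe the local Ollama API; any error or timeout means "unavailable".
async function ollamaAvailable(baseUrl = "http://localhost:11434"): Promise<boolean> {
  try {
    const res = await fetch(`${baseUrl}/api/tags`, { signal: AbortSignal.timeout(500) });
    return res.ok;
  } catch {
    return false;
  }
}

// Pick a random entry from the static pool.
function staticReview(): string {
  return STATIC_REVIEWS[Math.floor(Math.random() * STATIC_REVIEWS.length)];
}

// Generate a context-aware review via Ollama, or fall back gracefully.
async function getReview(workDescription: string): Promise<string> {
  if (await ollamaAvailable()) {
    const res = await fetch("http://localhost:11434/api/generate", {
      method: "POST",
      body: JSON.stringify({
        model: "llama3.2",
        prompt: `Write a harsh one-paragraph user review of: ${workDescription}`,
        stream: false,
      }),
    });
    const data = (await res.json()) as { response: string };
    return data.response;
  }
  return staticReview();
}
```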

How to install

Prerequisites:

  • Bun (https://bun.sh/)
  • Bun is the runtime this MCP server is built for; Node.js tooling is not required

Installation steps:

  1. Install Bun if you haven't: curl -fsSL https://bun.sh/install | bash

    Follow on-screen instructions and reopen your terminal

  2. Install dependencies and build the project: bun install && bun run build

  3. Start the server. The project may provide a start script, or you can run it directly via Bunx as in the installation command above: bunx user-review-mcp

Optional: If you intend to use Ollama for dynamic review generation, ensure Ollama is running locally on port 11434 and that the llama3.2 model is installed (ollama pull llama3.2).
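The Ollama prerequisite above can be verified with a small standalone script (a hypothetical helper, not part of the server; `check-ollama.ts` is an assumed filename — run it with Bun):

```typescript
// check-ollama.ts — verify Ollama is up and the llama3.2 model has been
// pulled, using Ollama's GET /api/tags endpoint.
async function hasModel(model: string, baseUrl = "http://localhost:11434"): Promise<boolean> {
  try {
    const res = await fetch(`${baseUrl}/api/tags`, { signal: AbortSignal.timeout(1000) });
    if (!res.ok) return false;
    // /api/tags returns { models: [{ name: "llama3.2:latest", ... }, ...] }
    const data = (await res.json()) as { models?: { name: string }[] };
    return (data.models ?? []).some((m) => m.name.startsWith(model));
  } catch {
    return false;
  }
}

console.log(
  (await hasModel("llama3.2"))
    ? "Ollama is ready with llama3.2"
    : "Ollama missing or model not pulled — run: ollama pull llama3.2",
);
```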

Additional notes

Tips and configuration hints:

  • Ollama dynamic generation is auto-detected. If Ollama is running and accessible, the server will generate context-aware reviews; otherwise it will pick from the 73+ static reviews.
  • Ensure Ollama is listening on localhost:11434 with the llama3.2 model for best results.
  • You can integrate the MCP server into Claude Desktop via claude_desktop_config.json. Example: { "mcpServers": { "review": { "command": "bunx", "args": ["-y", "user-review-mcp"] } } }
  • The server emphasizes a fail-fast philosophy, discouraging mocks and stubs in favor of real implementations and thorough testing.
  • If you update the static review pool, ensure there are 73+ entries to maintain the variety of feedback.
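The Claude Desktop snippet from the tips above, expanded for readability (the "review" server key is arbitrary; rename it as you like):

```json
{
  "mcpServers": {
    "review": {
      "command": "bunx",
      "args": ["-y", "user-review-mcp"]
    }
  }
}
```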
