
hackaton

A lightweight Model Context Protocol (MCP) server built for hackathons. It provides an easy-to-use backend service where developers can plug in custom tools, APIs, and workflows to quickly prototype AI-powered applications.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio pro-dutt-hackaton-mcp-server -- docker run -i \
  --env LM_STUDIO_API_URL="http://localhost:1234" \
  --env DESCOPE_PROJECT_ID="your_descope_project_id" \
  --env DESCOPE_MANAGEMENT_KEY="your_descope_management_key" \
  --env GITHUB_PERSONAL_ACCESS_TOKEN="your_github_token_here" \
  hackaton-mcp-image:latest

How to use

This MCP server powers the repository discovery workflow for the Hackathon project. It exposes MCP tools that analyze GitHub repositories and run custom analyses to surface beginner-friendly repos, good first issues, and comparative insights. The server is containerized with Docker and can be started via a docker run command (as configured in mcp_config) or via docker-compose if you provide a compose setup.

Core capabilities:

  • From the standard GitHub MCP: search_repository, get_issues, get_directory_contents
  • From the custom repo_analyzer MCP: analyze_repository, get_beginner_resources, suggest_good_first_issues, compare_repositories

Together, these tools enable end-to-end discovery, filtering, and analysis, supporting both automated recommendations and developer-driven exploration.
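Under the hood, MCP tool invocations travel as JSON-RPC 2.0 "tools/call" requests over the container's stdin/stdout. As a rough illustration, a minimal Python sketch of the request shape (the tool name and argument fields below are illustrative; consult the server's published tool schemas for the real parameter names):

```python
import json

def make_tool_call(tool_name, arguments, request_id=1):
    # MCP tool invocations are JSON-RPC 2.0 requests with method "tools/call";
    # params carry the tool's name and its arguments object.
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }

# Hypothetical argument shape -- check the server's tool schema for real fields.
req = make_tool_call("suggest_good_first_issues", {"repository": "octocat/Hello-World"})
print(json.dumps(req))
```

In practice an MCP client (Claude Code, or an SDK client session) builds and frames these messages for you; the sketch only shows what crosses the stdio boundary.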

How to install

Prerequisites:

  • Docker installed and running
  • Git installed
  • Optional: LM Studio, if you plan to interface with models via LM Studio

Installation steps:

  1. Clone the repository that contains the MCP server setup (or the parent project that includes the Docker image for the MCP server).
  2. Ensure Docker is running on your machine.
  3. Start the MCP server container(s):
    • If using a docker-compose setup provided by the project: docker-compose up -d
    • If using a direct docker run as configured in mcp_config: docker run -i --env GITHUB_PERSONAL_ACCESS_TOKEN=your_token --env DESCOPE_PROJECT_ID=your_id --env DESCOPE_MANAGEMENT_KEY=your_key --env LM_STUDIO_API_URL=http://localhost:1234 hackaton-mcp-image:latest (note: do not add -d when an MCP client needs to attach — a detached container cannot serve the stdio transport)
  4. Verify the MCP server is up and reachable (e.g., check container logs or health endpoints provided by the MCPs).
  5. Configure environment variables as needed for your environment (see the env section in mcp_config).
  6. If you use LM Studio, start LM Studio on port 1234 and configure MCP integration as described in the project docs.
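For step 3's docker-compose option, a minimal compose sketch might look like the following. The service name is arbitrary, the image name mirrors the docker run command above, and all environment values are placeholders — adapt it to the compose file your project actually ships:

```yaml
services:
  hackaton-mcp:
    image: hackaton-mcp-image:latest
    stdin_open: true   # keep stdin open for the stdio MCP transport
    environment:
      LM_STUDIO_API_URL: "http://localhost:1234"
      DESCOPE_PROJECT_ID: "your_descope_project_id"
      DESCOPE_MANAGEMENT_KEY: "your_descope_management_key"
      GITHUB_PERSONAL_ACCESS_TOKEN: "your_github_token_here"
```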

Additional notes

Tips and caveats:

  • Ensure GITHUB_PERSONAL_ACCESS_TOKEN has appropriate scopes for repository data (repo and read:user at a minimum).
  • When using docker-based MCP servers, make sure the host ports used by LM Studio and the MCPs do not conflict with other services.
  • The standard GitHub MCP methods (search_repository, get_issues, get_directory_contents) work in tandem with the custom repo_analyzer MCP methods (analyze_repository, get_beginner_resources, suggest_good_first_issues, compare_repositories) to provide richer insights.
  • If you encounter stdin/stdout communication issues, verify that the Docker container has proper IO handling and that the CLI stream expectations align with your orchestration layer (LangGraph or other).
  • LM Studio should be accessible at the configured LM_STUDIO_API_URL; ensure network routing allows the MCP server to reach LM Studio endpoints for model inference.
  • For local testing, you can simulate the GitHub MCP calls by configuring a mock token and limited repository scopes to avoid hitting rate limits.
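For the token-scope caveat above: with classic personal access tokens, the GitHub REST API reports the granted scopes in the X-OAuth-Scopes response header. A small Python helper for checking that header (parsing only — the network call and the required-scope list are assumptions you should adapt) might look like:

```python
def missing_scopes(oauth_scopes_header, required=("repo", "read:user")):
    # GitHub lists a classic token's granted scopes in the "X-OAuth-Scopes"
    # response header as a comma-separated string, e.g. "repo, read:user".
    granted = {s.strip() for s in oauth_scopes_header.split(",") if s.strip()}
    return [scope for scope in required if scope not in granted]

# A token granted only "repo" and "workflow" is still missing "read:user":
print(missing_scopes("repo, workflow"))  # → ['read:user']
```

Note that fine-grained personal access tokens do not report their permissions through this header, so this check only applies to classic tokens.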
