
MCP

MCP server from abhinavjain1110/MCP-Server

Installation
Run this command in your terminal to add the MCP server to Claude Code.
```bash
claude mcp add --transport stdio abhinavjain1110-mcp-server node backend/index.js \
  --env PORT="8000" \
  --env REPO_PATH="path to repository to analyze" \
  --env AI_PROVIDER="openai or ollama" \
  --env OPENAI_API_KEY="your OpenAI API key (if using OpenAI provider)" \
  --env VECTOR_DB_PATH="path to vector database (e.g., sqlite file)"
```

How to use

DevInsight is an MCP-powered AI assistant designed to analyze, explain, and refactor code locally. It exposes a suite of MCP tools through a Fastify backend and a React-based frontend interface, enabling tasks such as code analysis, natural language explanations, bug detection, refactoring suggestions, test execution summaries, and RAG-driven code search.

Users can index their codebases for retrieval-augmented generation (RAG), then interact via REST API endpoints or the chat UI to request analyses, explanations, or refactors. The system relies on an AI provider (OpenAI or Ollama) and a local or remote vector store to perform semantic search over indexed code snippets.

In practice, you can index a repository, then ask the chat to analyze a file, explain how a function works, find potential bugs from an error trace, or run tests and summarize results. The API endpoints (/api/mcp/* and /api/rag/*) map to individual MCP tools such as analyze, explain, find-bug, refactor, run-tests, git-history, and rag/index.
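As a sketch of that REST workflow, a file analysis could be requested directly against the backend. The payload field name (`file`) and the default port are assumptions, not confirmed by the project docs:

```bash
# Hypothetical call to the analyze tool over REST.
# Assumes the backend is listening on localhost:8000; the "file"
# field is a guess at the expected payload shape.
BODY='{"file": "backend/index.js"}'
echo "$BODY"
curl -s -X POST "http://localhost:8000/api/mcp/analyze" \
  -H 'Content-Type: application/json' \
  -d "$BODY" \
  || echo "backend not reachable"
```

The other tools (explain, find-bug, refactor, run-tests) are exposed the same way under /api/mcp/*.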

How to install

Prerequisites:

  • Node.js installed (recommended LTS)
  • Access to an AI provider (OpenAI API key or Ollama locally)
  • Git (optional, for repository operations)
1) Clone the project

```bash
git clone <repository-url>
cd devinsight
```

2) Install backend dependencies

```bash
cd backend
npm install
```

3) Install frontend dependencies (optional, for the full UI)

```bash
cd ../frontend
npm install
```

4) Set up environment variables

```bash
cp .env.example .env
```

Edit .env and add required values:

  • OPENAI_API_KEY (if using OpenAI)
  • AI_PROVIDER (openai or ollama)
  • REPO_PATH (path to repository to analyze)
  • VECTOR_DB_PATH (path to vector database)
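A minimal .env sketch covering the variables above (all values are placeholders; adjust paths and keys for your setup):

```bash
# Select the AI provider: "openai" or "ollama"
AI_PROVIDER=openai
# Required only when AI_PROVIDER=openai
OPENAI_API_KEY=sk-...
# Repository the tools will analyze
REPO_PATH=./data/your_repo
# SQLite file used as the vector store
VECTOR_DB_PATH=./data/vectors.db
# Port the Fastify backend listens on
PORT=8000
```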

5) Start the server

```bash
cd backend
npm run dev
```

6) (Optional) Start the frontend UI

```bash
cd frontend
npm run dev
```

Note: The backend server exposes endpoints like /api/mcp/analyze, /api/mcp/explain, /api/mcp/find-bug, /api/mcp/refactor, and /api/mcp/run-tests, plus /api/rag/index for indexing and /api/rag/stats for index statistics. Ensure your environment variables are set correctly to enable the AI provider and vector store integration.
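To confirm the server is up and the index is populated, the stats endpoint can be queried directly (a sketch, assuming the default PORT=8000):

```bash
# Check index statistics; assumes the backend runs on port 8000
STATS_URL="http://localhost:8000/api/rag/stats"
curl -s "$STATS_URL" || echo "backend not reachable"
```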

Additional notes

Tips:

  • Ensure OPENAI_API_KEY or Ollama is configured; without a valid API key, MCP tools may return placeholders.
  • Before using RAG features, index your repository via POST /api/rag/index with {"repoPath": "./data/your_repo"}.
  • For code analysis and refactoring, provide clear references to files or code blocks (e.g., Analyze the code in backend/index.js).
  • Git-related features require a valid Git repository at REPO_PATH.
  • If using Ollama, set AI_PROVIDER=ollama and ensure the local LLM server is running and reachable.
  • The system uses a vector store (SQLite via better-sqlite3) for fast semantic search; monitor disk usage for large codebases.
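Putting the indexing tip above into a concrete request (a sketch; assumes the backend runs locally on the default PORT=8000):

```bash
# Index a repository for RAG before using search/chat features.
# The payload follows the {"repoPath": ...} shape from the tips above.
BODY='{"repoPath": "./data/your_repo"}'
echo "$BODY"
curl -s -X POST "http://localhost:8000/api/rag/index" \
  -H 'Content-Type: application/json' \
  -d "$BODY" \
  || echo "backend not reachable"
```

Once indexing completes, RAG-driven code search and chat queries can draw on the indexed snippets.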
