MCP
MCP server from abhinavjain1110/MCP-Server
```bash
claude mcp add --transport stdio abhinavjain1110-mcp-server node backend/index.js \
  --env PORT="8000" \
  --env REPO_PATH="path to repository to analyze" \
  --env AI_PROVIDER="openai or ollama" \
  --env OPENAI_API_KEY="your OpenAI API key (if using OpenAI provider)" \
  --env VECTOR_DB_PATH="path to vector database (e.g., sqlite file)"
```
How to use
DevInsight is an MCP-powered AI assistant designed to analyze, explain, and refactor code locally. It exposes a suite of MCP tools through a Fastify backend and a React-based frontend interface, enabling tasks such as code analysis, natural language explanations, bug detection, refactoring suggestions, test execution summaries, and RAG-driven code search.

Users can index their codebases for retrieval-augmented generation (RAG), then interact via REST API endpoints or the chat UI to request analyses, explanations, or refactors. The system relies on an AI provider (OpenAI or Ollama) and a local or remote vector store to perform semantic search over indexed code snippets.

In practice, you can index a repository, then ask the chat to analyze a file, explain how a function works, find potential bugs from an error trace, or run tests and summarize the results. The API endpoints (/api/mcp/* and /api/rag/*) map to individual MCP tools such as analyze, explain, find-bug, refactor, run-tests, git-history, and rag/index.
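As a sketch, calling one of these endpoints might look like the following. The endpoint path comes from the list above; the JSON field names (`file`, `question`) are assumptions about the request shape, not confirmed by the project, and the curl call is shown as a comment since it requires a running backend.

```shell
# Hypothetical request body for POST /api/mcp/explain.
# The "file" and "question" field names are assumptions, not project-documented.
payload='{"file": "backend/index.js", "question": "How does this server register MCP tools?"}'

# With the backend running on port 8000, the request would be sent like:
#   curl -s -X POST http://localhost:8000/api/mcp/explain \
#     -H 'Content-Type: application/json' -d "$payload"

echo "$payload"
```

The same pattern applies to the other /api/mcp/* endpoints; only the path and body fields would change.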
How to install
Prerequisites:
- Node.js installed (LTS recommended)
- Access to an AI provider (an OpenAI API key, or Ollama running locally)
- Git (optional, for repository operations)

1) Clone the project
```bash
git clone <repository-url>
cd devinsight
```
2) Install backend dependencies
```bash
cd backend
npm install
```
3) Install frontend dependencies (optional, for the full UI)
```bash
cd ../frontend
npm install
```
4) Set up environment variables
```bash
cp .env.example .env
```
Edit .env and add the required values:
- OPENAI_API_KEY (if using OpenAI)
- AI_PROVIDER (openai or ollama)
- REPO_PATH (path to the repository to analyze)
- VECTOR_DB_PATH (path to the vector database)
5) Start the server
```bash
cd backend
npm run dev
```
6) (Optional) Start the frontend UI
```bash
cd frontend
npm run dev
```
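Putting the variables above together, a minimal .env might look like the sketch below. The variable names come from the setup steps; all values are placeholders to adjust for your environment.

```shell
# Example .env -- placeholder values only
PORT=8000
AI_PROVIDER=openai
# Required only when AI_PROVIDER=openai
OPENAI_API_KEY=sk-your-key-here
REPO_PATH=./data/your_repo
VECTOR_DB_PATH=./data/vectors.sqlite
```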
Note: The backend server exposes endpoints such as /api/mcp/analyze, /api/mcp/explain, /api/mcp/find-bug, /api/mcp/refactor, and /api/mcp/run-tests, plus /api/rag/index for indexing and /api/rag/stats for index statistics. Ensure your environment variables are set correctly so the AI provider and vector store integrations work.
Additional notes
Tips:
- Ensure OPENAI_API_KEY or Ollama is configured; without a valid API key, MCP tools may return placeholders.
- Before using RAG features, index your repository via POST /api/rag/index with {"repoPath": "./data/your_repo"}.
- For code analysis and refactoring, provide clear references to files or code blocks (e.g., Analyze the code in backend/index.js).
- Git-related features require a valid Git repository at REPO_PATH.
- If using Ollama, ensure the local LLM server is running and accessible as configured in AI_PROVIDER.
- The system uses a vector store (SQLite via better-sqlite3) for fast semantic search; monitor disk usage for large codebases.
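The indexing call from the tips above can be issued with curl. The request path and body come from the tips; port 8000 matches the PORT value from the install command, and the curl calls are shown as comments since they require a running backend.

```shell
# Request body taken from the tips above.
payload='{"repoPath": "./data/your_repo"}'

# With the backend running, indexing and then checking index stats would look like:
#   curl -s -X POST http://localhost:8000/api/rag/index \
#     -H 'Content-Type: application/json' -d "$payload"
#   curl -s http://localhost:8000/api/rag/stats

echo "$payload"
```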
Related MCP Servers
zen
Self-hosted notes app. Single Go binary, notes stored as Markdown within SQLite, full-text search, very low resource usage.
MCP-Deepseek_R1
A Model Context Protocol (MCP) server implementation connecting Claude Desktop with DeepSeek's language models (R1/V3).
mcp-fhir
A Model Context Protocol implementation for FHIR.
mcp
Inkdrop Model Context Protocol Server.
mcp-appium-gestures
A Model Context Protocol (MCP) server providing resources and tools for Appium mobile gestures using the Actions API.
dubco-npm
The (unofficial) dubco-mcp-server enables AI assistants to manage Dub.co short links via the Model Context Protocol. It provides three MCP tools: create_link for generating new short URLs, update_link for modifying existing links, and delete_link for removing short links.