
ai-code-helper

A 2025 hands-on AI programming assistant project (by 程序员鱼皮), built on Spring Boot 3.5 + Java 21 + LangChain4j + AI: an intelligent bot for programming study and job-hunt coaching. It covers large-model integration, LangChain4j core features, streaming conversation, prompt engineering, RAG (retrieval-augmented generation), vector databases, tool calling, MCP (Model Context Protocol), web crawling, security hardening, Vue.js frontend development, SSE server-sent push, and other enterprise-grade AI application development techniques. The project helps developers master must-have AI-era skills, get familiar with the LangChain framework, improve learning efficiency and job-market competitiveness, and grow into the AI full-stack developers companies need.

Installation
Run this command in your terminal to add the MCP server to Claude Code:
claude mcp add --transport stdio liyupi-ai-code-helper \
  --env HTTP_PORT=8080 \
  --env TH_API_KEY=<your Tongyi Qianwen API key> \
  --env BIG_MODEL_API_KEY=<your Big Model API key> \
  -- mvn spring-boot:run

HTTP_PORT is the port for the Spring Boot backend (typically 8080 by default).

How to use

This MCP server provides an AI-powered coding mentor and interview-preparation assistant built with LangChain4j and the Tongyi Qianwen model. It pairs a Spring Boot backend with a Vue.js frontend and exposes tools for document search, retrieval-augmented generation (RAG), and safe input handling to help users learn programming and prepare for interviews. The MCP protocol lets you manage contexts and model calls in a unified way, so you can plug this server into your broader AI tooling and pipelines. Typical usage is to run the backend to host the API, optionally serving the frontend for a complete UI. The server coordinates the LangChain4j layer, the MCP context protocol, and the model endpoints to deliver streaming responses and structured outputs.

Key capabilities include:

  • Conversational AI for programming guidance, resume and interview coaching, and code examples.
  • RAG-based retrieval from local knowledge bases to surface relevant information.
  • Tool integration for searching interview questions and web-based resources.
  • Streaming response support for a real-time typing-like user experience.
  • Safety guards for input to protect against sensitive content.
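To make the RAG retrieval step concrete, here is a minimal, dependency-free Java sketch. The character-frequency "embedding" and the sample documents are toy stand-ins; the real project would call an embedding model and a vector store through LangChain4j.

```java
import java.util.Comparator;
import java.util.List;

public class RagSketch {
    // Toy "embedding": a character-frequency vector over lowercase letters.
    // A real deployment would call an embedding model instead.
    static double[] embed(String text) {
        double[] v = new double[26];
        for (char c : text.toLowerCase().toCharArray()) {
            if (c >= 'a' && c <= 'z') v[c - 'a']++;
        }
        return v;
    }

    // Cosine similarity between two vectors; 0 if either is all-zero.
    static double cosine(double[] a, double[] b) {
        double dot = 0, na = 0, nb = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            na += a[i] * a[i];
            nb += b[i] * b[i];
        }
        return (na == 0 || nb == 0) ? 0 : dot / (Math.sqrt(na) * Math.sqrt(nb));
    }

    public static void main(String[] args) {
        List<String> docs = List.of(
            "Java interview questions on collections and generics",
            "Resume tips for software engineers",
            "Spring Boot configuration basics");
        String query = "java collections interview";
        double[] q = embed(query);
        // Retrieve the best-matching snippet to augment the prompt with.
        String best = docs.stream()
            .max(Comparator.comparingDouble(d -> cosine(q, embed(d))))
            .orElseThrow();
        System.out.println(best);
    }
}
```

The retrieved snippet would then be prepended to the user's question before the model call, which is the essence of RAG regardless of the embedding used.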

To use the MCP server, first start the backend (Spring Boot) and ensure your Tongyi Qianwen API key and any required big-model API keys are configured. Then connect your MCP client or frontend to the backend API endpoints to begin interacting with the AI assistant. The frontend (Vue.js) provides a chat UI and real-time streaming, while the backend exposes RESTful API endpoints and MCP-integrated capabilities.
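On the streaming side, the sketch below shows client-side handling of a Server-Sent Events stream in plain Java. The sample frames and the `[DONE]` terminator are illustrative assumptions, not necessarily this server's actual wire format.

```java
import java.util.ArrayList;
import java.util.List;

public class SseParseSketch {
    // Extract the payload of each "data:" line from a raw SSE stream.
    // A live client could feed java.net.http.HttpClient's
    // BodyHandlers.ofLines() output through the same logic.
    static List<String> parseData(String rawStream) {
        List<String> chunks = new ArrayList<>();
        for (String line : rawStream.split("\n")) {
            if (line.startsWith("data:")) {
                chunks.add(line.substring(5).trim());
            }
        }
        return chunks;
    }

    public static void main(String[] args) {
        // Hypothetical frames; real payloads depend on the backend.
        String sample = "data: Hello\n\ndata: , world\n\ndata: [DONE]\n";
        StringBuilder rendered = new StringBuilder();
        for (String chunk : parseData(sample)) {
            if (chunk.equals("[DONE]")) break;  // assumed end-of-stream marker
            rendered.append(chunk);             // render incrementally
        }
        System.out.println(rendered);
    }
}
```

Appending each chunk as it arrives is what produces the typing-like effect in the chat UI.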

How to install

Prerequisites:

  • Java 21+ (JDK)
  • Maven 3.6+ (for building and running the Spring Boot app)
  • Access to Tongyi Qianwen API and any Big Model API keys

Install steps:

  1. Clone the repository:

     git clone <repository-url>
     cd ai-code-helper

  2. Set up API keys (example for application.yml or environment variables):

    • TH_API_KEY=your Tongyi Qianwen API key
    • BIG_MODEL_API_KEY=your Big Model API key
    • If you set these as environment variables, make sure Spring Boot reads them (e.g., via ${...} placeholders in application.yml).
  3. Build and run the backend (Spring Boot): mvn spring-boot:run

  4. Start the frontend (optional, if a frontend is provided):

     cd ai-code-helper-frontend
     npm install
     npm run dev

  5. Access the application:

     • Backend API: http://localhost:8080/api
     • Frontend (if used): http://localhost:5173
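For the API-key configuration in step 2, an illustrative application.yml fragment is shown below. The property names are assumptions for illustration only; match them to the keys the project's code actually binds (via @Value or @ConfigurationProperties).

```yaml
server:
  port: ${HTTP_PORT:8080}   # falls back to 8080 if the env var is unset

# Hypothetical key names -- align these with the project's real configuration.
langchain4j:
  chat-model:
    api-key: ${TH_API_KEY}          # Tongyi Qianwen API key from the environment
bigmodel:
  api-key: ${BIG_MODEL_API_KEY}     # Big Model API key from the environment
```

Using ${...} placeholders keeps secrets out of the file itself, so application.yml can be committed while the keys stay in the environment.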

Additional notes

Tips and common considerations:

  • Ensure your API keys are securely stored and not committed to version control.
  • The MCP integration relies on proper model and context management; if you see context-related errors, verify your MCP settings and ensure the backend is up-to-date with the protocol version.
  • For local development, you may map different ports if 8080 or 5173 are occupied.
  • If you enable streaming, ensure your client supports receiving and rendering incremental messages.
  • The server can be extended with additional tools (e.g., more interview question sources or custom knowledge bases) by configuring LangChain4j pipelines and MCP tool integrations.
  • Check logs for any connectivity issues with Tongyi Qianwen or the big-model API endpoints and update keys as needed.
