Librechat-Advanced
Chat with top LLMs (free), with built-in web search, a code interpreter, and data analysis tools, powered by the Model Context Protocol.
claude mcp add --transport stdio vaibhavard-librechat-advanced npx -y vaibhavard-librechat-advanced --env NODE_ENV="production" --env LIBRECHAT_ADVANCED_CONFIG="default"
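For MCP clients configured via a JSON settings file rather than the CLI, an equivalent entry might look like the sketch below. This is a hypothetical illustration: the exact file location and top-level key (`mcpServers` here) depend on your client; the package name and environment variables are taken from the command above.

```json
{
  "mcpServers": {
    "vaibhavard-librechat-advanced": {
      "command": "npx",
      "args": ["-y", "vaibhavard-librechat-advanced"],
      "env": {
        "NODE_ENV": "production",
        "LIBRECHAT_ADVANCED_CONFIG": "default"
      }
    }
  }
}
```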
How to use
Librechat-Advanced is an all-in-one AI assistant platform that aggregates multiple AI capabilities and agents into a single, accessible interface. It provides free, seamless access to diverse large language models and integrated tools such as Smarty for web search, CodeSandbox for code interpretation and execution, DeepResearch for autonomous search, Excel.AI for Excel editing via a Code Interpreter, and Project.AI for generating Microsoft Project plans. Users can interact with these agents through the unified Librechat-Advanced UI, request specific tasks (like data extraction, code execution, or document processing), and receive outputs with downloadable artifacts where applicable. The platform operates without requiring external API keys, enabling instant experimentation with supported models like Qwen3, Gemini 2.5 Pro, GPT-4o, GPT-4.1, and DeepSeek R1, while leveraging Retrieval-Augmented Generation (RAG) and OCR for document processing.
To use the server, initialize it in your environment (using the MCP configuration) and access the Librechat-Advanced UI. From there, choose the desired agent (e.g., Smarty for live web search, CodeSandbox for coding tasks, or Excel.AI for spreadsheet work) and provide input prompts. The system will route requests to the appropriate internal tools and models, returning structured results and, when relevant, downloadable outputs (XML files for Project.AI, edited Excel files, or code outputs with links).
How to install
Prerequisites:
- Node.js v18+ and npm installed on your machine
- Basic familiarity with running commands in a terminal

1. Install Node.js and npm if not already installed
   - Linux (Debian/Ubuntu): sudo apt-get update && sudo apt-get install -y nodejs npm
   - macOS (Homebrew): brew install node
   - Windows: download and install from nodejs.org
2. Install the Librechat-Advanced package (via npm or npx)
   - npm install -g vaibhavard-librechat-advanced
   - or use npx to run it without a global install: npx -y vaibhavard-librechat-advanced
3. Prepare the environment (optional but recommended)
   - Create a .env file with configuration overrides if needed (not required for the default setup)
   - Example .env entry: LIBRECHAT_ADVANCED_CONFIG=default
4. Run the server
   - npm start
   - If using npx directly, the MCP client will spawn the server as configured in its MCP settings
5. Access the UI
   - Open http://localhost:3000 (the default port) in your browser and begin using the integrated agents
Additional notes
Tips and troubleshooting:
- If the server fails to start, verify Node.js installation and ensure you have network access for pulling the npm package via npx.
- Check environment variables for overrides if the default behavior doesn’t meet your needs (e.g., LIBRECHAT_ADVANCED_CONFIG).
- The platform supports RAG and OCR for document processing; ensure you provide accessible documents in supported formats for best results.
- If you encounter model limitations or latency, consider selecting a different supported model from the available list (Qwen3, Gemini 2.5 Pro, GPT-4o, GPT-4.1, DeepSeek R1).
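The first troubleshooting tip above (verifying the Node.js installation) can be sanity-checked with a short POSIX-shell snippet. This is a sketch, not part of the package: it only reports whether the required tools are on your PATH and, if so, their versions.

```shell
# Report whether node and npm are available, and their versions if so.
# Assumes a POSIX shell; does not install or change anything.
for tool in node npm; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: $("$tool" --version)"
  else
    echo "$tool: not found - install it before running the server"
  fi
done
```

If `node` reports a version below v18, upgrade it before installing the package, since the prerequisites above call for Node.js v18+.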