AIStack
MCP-first development environment with community tools
claude mcp add --transport stdio mjdevaccount-aistack-mcp python -m aistack_mcp \
  --env LOG_LEVEL="INFO (default) or DEBUG" \
  --env QDRANT_URL="http://localhost:6333 (default) or your Qdrant endpoint" \
  --env OLLAMA_HOME="Path to Ollama installation (optional if already on PATH)"
How to use
AIStack-MCP provides dual-mode orchestration for enterprise-grade multi-repo development. It enables safe single-repo isolation when you need strict boundaries, and seamless multi-repo coordination for cross-repo intelligence and project-wide context. The server ships with an interactive setup script, validation tooling, and a dev dashboard to monitor services, models, and vector data locally.

Core capabilities revolve around local-first AI, using Ollama for LLM inference and Qdrant for vector search, while keeping the option to leverage centralized tools when needed.

Typical workflows include switching between modes with a single command, validating configurations before deployment, and using the integrated code intelligence features to search, analyze, and generate across repositories. Use the quickstart wizard to bootstrap your environment, then run the validation tool to ensure your MCP config is zero-warn before running in production or local development.
How to install
Prerequisites:
- Python 3.8 or newer
- Pip (comes with Python)
- Optional: Docker for containerized components
- Ollama installed and accessible in PATH (for local LLM inference)
- Qdrant installed and running (vector store)
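The prerequisites above can be checked with a short preflight script before installing. This is a hypothetical sketch, not part of AIStack-MCP's tooling; the port 6333 default comes from the QDRANT_URL value in the install command.

```shell
# Hypothetical preflight check for the prerequisites listed above.
status=ok

# Python 3.8+ interpreter available?
command -v python3 >/dev/null 2>&1 || { echo "python3 not found"; status=missing; }

# Ollama on PATH? (optional if OLLAMA_HOME points at the install)
command -v ollama >/dev/null 2>&1 \
  || echo "ollama not on PATH (set OLLAMA_HOME instead)"

# Qdrant answers plain HTTP on its root endpoint when running.
curl -sf -o /dev/null "${QDRANT_URL:-http://localhost:6333}" \
  || echo "Qdrant not reachable (is the service running?)"

echo "preflight: $status"
```

Running this before step 1 surfaces the most common startup blocker (unreachable Ollama/Qdrant services) early.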
Step-by-step:
- Prepare environment
  - python3 -m venv venv
  - source venv/bin/activate   # on Unix/macOS
  - .\venv\Scripts\activate    # on Windows
- Install the MCP server package (Python)
  - pip install aistack-mcp
- Verify installation and configure
  - Ensure Ollama and Qdrant are running locally
  - Create or edit your configuration as needed (see mcp_config in this document)
- Run the MCP server
  - python -m aistack_mcp
- Optional: Use the quickstart and tooling
  - Run the interactive setup wizard: quickstart.ps1 (Windows) or its cross-platform equivalent if provided
  - Validate your MCP config: python -m aistack_mcp.tools.validate_mcp_config --strict
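The mcp_config referenced in the steps above typically follows the standard MCP client configuration shape. A minimal sketch, assuming the server name and env defaults from the install command at the top of this page; the OLLAMA_HOME path is a placeholder:

```json
{
  "mcpServers": {
    "mjdevaccount-aistack-mcp": {
      "command": "python",
      "args": ["-m", "aistack_mcp"],
      "env": {
        "LOG_LEVEL": "INFO",
        "QDRANT_URL": "http://localhost:6333",
        "OLLAMA_HOME": "/usr/local/ollama"
      }
    }
  }
}
```

Running the strict validator against this file before starting the server catches malformed entries before they become runtime failures.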
Notes:
- If you prefer containerized setup, AIStack-MCP also offers npm/PyPI/Docker server installers via the One-Command Installer described in the project docs.
Additional notes
Tips and common issues:
- Ensure Ollama and Qdrant services are reachable before starting the MCP server; misconfigurations here are a common startup blocker.
- Use the strict validation mode to catch misconfigurations early and keep your builds zero-warning in CI.
- For production deployments, consider running Ollama/Qdrant as separate services and point AIStack-MCP to their endpoints via environment variables.
- The environment variable values shown in mcp_config are descriptive placeholders; replace them with production-ready values appropriate for your environment.
- If you encounter authentication or permission issues in multi-repo setups, review workspace isolation and explicit permission configurations documented in the official guides.
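For the separate-services production setup mentioned above, the endpoints can be supplied through the same environment variables used by the install command. A sketch; the hostname and path below are placeholders, not real endpoints:

```shell
# Sketch: point AIStack-MCP at externally hosted Ollama/Qdrant services.
# "qdrant.internal" and "/opt/ollama" are placeholder values.
export QDRANT_URL="http://qdrant.internal:6333"
export OLLAMA_HOME="/opt/ollama"
export LOG_LEVEL="INFO"

# Then start the server against those endpoints:
# python -m aistack_mcp
```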
Related MCP Servers
augments
Comprehensive MCP server providing real-time framework documentation access for Claude Code with intelligent caching, multi-source integration, and context-aware assistance.
Archive-Agent
Find your files with natural language and ask questions.
LLaMa-Streamlit
AI assistant built with Streamlit, NVIDIA NIM (LLaMa 3.3:70B) / Ollama, and Model Control Protocol (MCP).
mcp-playground
A Streamlit-based chat app for LLMs with plug-and-play tool support via Model Context Protocol (MCP), powered by LangChain, LangGraph, and Docker.
apifox
Apifox MCP Server - lets AI assistants like Claude manage your Apifox projects through natural language, making it easy to create, update, and audit API endpoints
local-gateway
Aggregate multiple MCP servers into a single endpoint with web UI, OAuth 2.1, and profile-based tool management