nodespace-core
Local-first knowledge base with semantic search for AI coding assistants. Fewer tokens, faster context.
claude mcp add --transport http nodespaceai-nodespace-core http://localhost:3100/mcp
How to use
NodeSpace includes a built-in MCP server that starts automatically when you open the desktop app. Your AI tools connect to this endpoint locally, so they can query your project knowledge without reaching out to the cloud. To use it with an MCP client (for example Claude Code or Codex-style integrations), point the client's MCP configuration at http://localhost:3100/mcp. The server speaks JSON over HTTP at that endpoint, letting clients query your knowledge base entirely within your local environment.
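For clients that read an mcpServers-style JSON configuration file, an entry pointing at the local endpoint might look like the sketch below. The exact file location, schema, and key names vary by client (some use "type", others "transport"), and the server name "nodespace" is purely illustrative:

```json
{
  "mcpServers": {
    "nodespace": {
      "type": "http",
      "url": "http://localhost:3100/mcp"
    }
  }
}
```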
Once connected, the NodeSpace MCP integration provides persistent context across sessions, so you don't need to re-explain your codebase to each tool. The MCP setup runs entirely on your machine, keeping your data offline and under your control. If you use multiple AI tools, point each one at the same local MCP endpoint so they all share a unified context source.
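As a quick sanity check that the endpoint is up and speaking MCP, you can send it the standard JSON-RPC initialize request by hand. This is a minimal sketch assuming the generic MCP Streamable HTTP transport; the payload follows the MCP spec, not a NodeSpace-specific API:

```shell
# Standard MCP initialize request (generic JSON-RPC, not NodeSpace-specific).
REQ='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"manual-check","version":"0.0.1"}}}'

# Print the payload; uncomment the curl lines to POST it to the running app.
echo "$REQ"
# curl -s http://localhost:3100/mcp \
#   -H 'Content-Type: application/json' \
#   -H 'Accept: application/json, text/event-stream' \
#   -d "$REQ"
```

A successful response is a JSON-RPC result describing the server's capabilities; an immediate connection error usually just means the desktop app isn't running.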
How to install
Prerequisites:
- A supported operating system (macOS, Windows, Linux)
- Desktop installation of NodeSpace (downloadable from the project releases) or access to the source for building locally
Installation options:
- Install the desktop app (recommended):
- Download the NodeSpace desktop app from the official releases page
- Run the installer and follow prompts to complete installation
- Launch the app; the built-in MCP server starts automatically when the app is opened
- Build from source (advanced):
- Prerequisites: Bun 1.0+, Rust 1.80+ (see Bun and Rust install docs in the readme)
- Clone the repository: git clone https://github.com/NodeSpaceAI/nodespace-core
- Install dependencies and run the development build: cd nodespace-core && bun install && bun run tauri:dev
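Before building, it can help to confirm the toolchain is on PATH and recent enough. A small sketch (the version thresholds come from the prerequisites above; compare the printed versions against Bun 1.0+ and Rust 1.80+ yourself):

```shell
# Check that the build prerequisites are installed and print their versions.
command -v bun >/dev/null 2>&1 && bun --version || echo "bun not found"
command -v rustc >/dev/null 2>&1 && rustc --version || echo "rustc not found"
```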
Notes:
- The MCP server is embedded in the NodeSpace desktop app and does not require a separate deployment for standard use.
- There is no separate standalone server to deploy for local testing: start the desktop app and use its MCP endpoint at http://localhost:3100/mcp.
Additional notes
Tips and caveats:
- The MCP endpoint is local by default (http://localhost:3100/mcp). Ensure the desktop app is running to keep the MCP server active.
- The project describes this MCP setup as alpha/early development; features and data formats may change.
- No explicit environment variables are required for standard use; configuration is primarily through the MCP client settings.
- If you encounter connectivity issues, verify the desktop app is launched and firewall rules allow local network traffic to port 3100.
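The connectivity check in the last tip can be scripted. A minimal sketch using bash's built-in /dev/tcp pseudo-device (no extra tools required; 3100 is the default port from above):

```shell
# Report whether anything is listening on the default MCP port.
# Uses bash's /dev/tcp redirection, so plain POSIX sh falls through to "closed".
if (exec 3<>/dev/tcp/localhost/3100) 2>/dev/null; then
  echo "port 3100 open: MCP endpoint should be reachable"
else
  echo "port 3100 closed: launch the NodeSpace desktop app first"
fi
```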
Related MCP Servers
Overture
Overture is an open-source, locally running web interface delivered as an MCP (Model Context Protocol) server that visually maps out the execution plan of any AI coding agent as an interactive flowchart/graph before the agent begins writing code.
google-ai-mode
MCP server for free Google AI Mode search with citations. Query optimization, CAPTCHA handling, multi-agent support. Works with Claude Code, Cursor, Cline, Windsurf.
mcp-image
MCP server for AI image generation and editing with automatic prompt optimization and quality presets (fast/balanced/quality). Powered by Gemini (Nano Banana 2 & Pro).
sub-agents
Define task-specific AI sub-agents in Markdown for any MCP-compatible tool.
devtap
Bridge build/dev process output to AI coding sessions via MCP — supports Claude Code, Codex, OpenCode, Gemini CLI, and aider
statelessagent
Your AI forgets everything between sessions. SAME fixes that. Local-first, no API keys, single binary.