ai
Things related to artificial intelligence built on top of Nasdanika capabilities
claude mcp add --transport stdio nasdanika-ai docker run -i nasdanika-ai
How to use
The AI MCP server provides capabilities to work with AI components built on Nasdanika’s resource sets and models. It exposes tooling around embeddings, vector stores, semantic search, and chat-like interactions that can be used to semantically search documents, reason over model relationships, and generate descriptive narratives about model elements. Expect to interact with a semantic search HTTP interface and a CLI that can generate embeddings, manage a vector store, and serve a static frontend for chat interactions.
Tools available include:
- embeddings generation from text
- vector store management (build, update, and query)
- semantic search routes for integrated QA or chat workflows
- a simple chat UI component that can be served alongside the static site and semantic search endpoints
In practice you can load content into a vector store, build or update embeddings, and then perform semantic search that accounts for both semantic similarity and graph-based relationships, enabling contextual answers such as “who is a parent of Lea” or “show relationships within the family model.” The server emphasizes explainability by linking text descriptions to model relationships and metamodel-derived definitions, enabling richer Q&A and guided narrative generation alongside standard chat completions.
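As a sketch of the semantic-search workflow described above, a query might be posted to the server's HTTP interface like this. The `/search` route, port 8080, and JSON request shape are assumptions for illustration only; this README does not document the actual routes, so check the server's own route list before relying on them.

```shell
# Hypothetical semantic search query (route, port, and payload shape are
# assumed, not documented by this README). A graph-aware search could then
# answer questions such as "who is a parent of Lea".
curl -s -X POST http://localhost:8080/search \
  -H 'Content-Type: application/json' \
  -d '{"query": "who is a parent of Lea", "limit": 5}'
```

The response would be expected to reference matching model elements and their relationships, in line with the server's emphasis on explainability.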
How to install
Prerequisites:
- Docker installed and running on your host
- Access to the internet to pull the container image (nasdanika-ai)
Installation steps:
- Install Docker if you don’t have it:
- macOS: https://docs.docker.com/desktop/mac/
- Windows: https://docs.docker.com/desktop/windows/
- Linux: follow your distro’s guide (e.g., sudo apt-get install docker.io)
- Pull the AI MCP server image (or ensure it is available in your registry):
  docker pull nasdanika-ai
- Run the server (as specified in mcp_config):
  docker run -i nasdanika-ai
- Verify the server is reachable (default port exposure depends on the image; often 8080 or 80).
- Optional: customize environment variables for embeddings sources, vector store path, or API keys if the image supports them. For example:
  docker run -e EMBEDDINGS_SOURCE=OpenAI -e VECTORS_PATH=/data/vectors -p 8080:8080 nasdanika-ai
- If you prefer not to use Docker, check whether an alternative runtime (npx/node or Python) is available by consulting the project docs; this README does not specify a non-Docker runtime.
Note: The exact image name and port mappings may vary; adjust the docker run command to your environment and image tag as needed.
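The installation steps above can be sketched as a single verification sequence. The container name, port mapping, and image tag here are assumptions to be adjusted per the note above; the commands themselves are standard Docker CLI.

```shell
# Sketch: run the image detached with an assumed port mapping, then
# confirm the container is up and inspect its startup output.
docker run -d --name nasdanika-ai -p 8080:8080 nasdanika-ai

# The container should appear with status "Up":
docker ps --filter name=nasdanika-ai

# Watch startup messages for route availability and index loading:
docker logs nasdanika-ai
```

Note that the mcp_config in this listing uses `docker run -i` (stdio transport) rather than a detached container; the detached form is only useful when exercising the HTTP interface directly.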
Additional notes
Tips and considerations:
- If you already have a vector store index, you can hot-load or incrementally update it with new embeddings using the provided CLI (CLI commands are part of the AI MCP tooling, as described in the README).
- Ensure your embeddings source (OpenAI or Ollama) API keys are configured in the environment variables if the image expects them (e.g., OPENAI_API_KEY).
- The server integrates descriptive narrations with graph-based reasoning; for best results, provide rich text descriptions for model elements and ensure your resource sets include explicit relationships (parent/child, sibling, etc.).
- When debugging, check container logs for startup messages and route availability. If the semantic search routes aren’t responding, verify port exposure and that the vector store index is loaded properly.
- If you need to run a static chat UI alongside, expose an HTTP server for the frontend and ensure it can access the semantic search endpoints (the README mentions serving both a static site and semantic search routes).
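The last tip can be sketched as follows. The frontend directory, the use of nginx as the static server, and both port numbers are assumptions for illustration; any static file server would do.

```shell
# Serve a static chat frontend alongside the MCP server (sketch; paths,
# image, and ports are assumed). Mount the built frontend read-only into
# an nginx container:
docker run -d --name chat-ui -p 8081:80 \
  -v "$PWD/frontend:/usr/share/nginx/html:ro" nginx:alpine

# The UI at http://localhost:8081 must be able to reach the semantic
# search endpoints, e.g. a nasdanika-ai container exposed on port 8080.
```

If the UI and server run in separate containers, placing them on a shared Docker network (or exposing both on localhost, as here) keeps the semantic search endpoints reachable from the frontend.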
Related MCP Servers
deep-research
Use any LLMs (Large Language Models) for Deep Research. Support SSE API and MCP server.
drift
Codebase intelligence for AI. Detects patterns & conventions + remembers decisions across sessions. MCP server for any IDE. Offline CLI.
minecraft
A Minecraft MCP Server powered by the Mineflayer API. It allows AI assistants to control a Minecraft character in real time: building structures, exploring the world, and interacting with the game environment through natural language instructions.
wanaku
Wanaku MCP Router
ClueoMCP
🎭 The Personality Layer for LLMs: transform any MCP-compatible AI with rich, consistent personalities powered by Clueo's Big Five personality engine.
Archive-Agent
Find your files with natural language and ask questions.