gpt2099.nu
A Nushell cross.stream extension for interacting with LLMs and MCP servers
Example registration with the Claude CLI (adjust paths and values to your environment):

```
claude mcp add --transport stdio cablehead-gpt2099.nu node path/to/server.js \
  --env NPX_PACKAGE="gpt2099.nu or applicable entrypoint" \
  --env CROSS_STREAM_CONFIG="path/to/cross.stream/config.json"
```
How to use
gpt2099.nu is a Nushell-based MCP client that provides a scriptable interface for interacting with multiple AI providers through a single, consistent workflow. It uses cross.stream for event-driven context management and for editable conversation threads that persist across sessions. With this setup you can configure providers, manage your conversation history, and extend functionality by connecting to external MCP servers. In practice, you initialize the LLM command, enable your chosen providers, and run model calls directly from Nushell, while keeping the context window accessible and editable within the same workflow.
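That flow can be sketched as a short Nushell session. Only gpt init appears in this document's verification steps; the remaining lines are hypothetical placeholders, not actual gpt2099.nu subcommands — consult the repository docs for the real provider-setup and alias commands:

```nushell
# Initialize the gpt command in the current Nushell session
gpt init

# Hypothetical next steps (placeholders, not real subcommand names):
# 1. enable a provider and store its API key
# 2. define a model alias for quicker calls
# 3. send a prompt; the reply lands in the persistent
#    cross.stream context thread, where it stays editable
```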
How to install
Prerequisites:
- Nushell installed (compatible version as per the project docs)
- Access to cross.stream (see the cross.stream installation guide)
- Internet access to fetch necessary MCP components
Installation steps:
- Install Nushell following the official guide at https://www.nushell.sh/
- Install cross.stream per the project docs referenced in the repository: https://cablehead.github.io/xs/getting-started/installation/
- Obtain the gpt2099.nu MCP client repository (clone or download)
- If there is a server component, install dependencies via your preferred method (the README doesn’t declare a specific runtime; adapt to your environment)
- Prepare the environment by ensuring any required API keys or provider configurations are available (see the gpt2099.nu documentation for provider setup)
- Start the MCP server or client component as appropriate for your environment (examples below assume a node-based entrypoint; adapt as needed)
Example startup (adjust to your actual entrypoint):

```
# If this project provides a Node.js entrypoint
npm install
npm run start

# If using a Python/uvx-based entrypoint provided by the project
uvx run server.py
```
Post-install verification:
- Ensure the Nushell prompt loads the cross.stream environment
- Validate that you can initialize the MCP command (e.g., gpt init) and that provider configuration can be stored
- Test a simple call through the MCP to confirm end-to-end provider interaction
Additional notes
- This MCP client emphasizes scriptability within Nushell and persistent context threads; use cross.stream to manage and edit your conversation history.
- Ensure you enable your desired providers and set model aliases to streamline usage (as documented in the repository).
- If you encounter environment variable issues, confirm that API keys for providers (OpenAI, Anthropic, etc.) are correctly set and accessible by the running process.
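A quick way to rule out missing keys is a minimal POSIX shell check — the variable names below are common examples, not an exhaustive or authoritative list for gpt2099.nu:

```shell
# Report whether common provider API keys are visible to this process.
# Variable names are typical examples; adjust to the providers you enabled.
for var in ANTHROPIC_API_KEY OPENAI_API_KEY GEMINI_API_KEY; do
  if [ -n "$(printenv "$var")" ]; then
    echo "present: $var"
  else
    echo "missing: $var"
  fi
done
```

Run this from the same shell that launches the MCP server; a key exported in one session is not automatically visible in another.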
- For debugging, verify that the MCP server entrypoint and any cross.stream adapters are correctly wired to your Nushell environment and that the event stream hooks are properly initialized.
- If the server fails to start, check common issues like missing dependencies, incorrect paths to the entrypoint, or misconfigured environment variables. Refer to the provider and schema references in the repository for compatibility notes.
Related MCP Servers
SearChat
Search + Chat = SearChat (AI chat with search). Supports OpenAI/Anthropic/VertexAI/Gemini, DeepResearch, the SearXNG metasearch engine, and one-command Docker deployment.
openapi
OpenAPI definitions, converters and LLM function calling schema composer.
coplay-unity-plugin
Unity plugin for Coplay
claude-additional-models
Reduce Claude Desktop consumption by 10x - Integrate Google's Gemini or Z.ai's GLM-5 (744B params) with Claude via MCP for intelligent task delegation
DeepCo
A Chat Client for LLMs, written in Compose Multiplatform.
askimo
AI-native client app with a built-in knowledge base and multi-provider support (OpenAI, Gemini, Ollama, ...)