Everywhere
Context-aware AI assistant for your desktop. Ready to respond intelligently, seamlessly integrating multiple LLMs and MCP tools.
```shell
claude mcp add --transport stdio dearva-everywhere dotnet Everywhere.dll \
  --env PORT="Port to run the server on (default may be 5000 or as configured in appsettings)" \
  --env OPENAI_API_KEY="Your OpenAI API key (if using OpenAI as an LLM)" \
  --env AZURE_OPENAI_API_KEY="Azure OpenAI API key if using Azure-hosted models" \
  --env SOME_EXTERNAL_SERVICE_KEY="Placeholder for any external service keys if required"
```
How to use
Everywhere is an interactive AI assistant designed to provide context-aware help across your screen and applications. It exposes integrated MCP tools that let you run prompts and workflows directly within the UI, access the File System, Terminal, and Web Browser, and use multiple LLM providers through a unified interface. You can capture context (such as an error message or a webpage) and prompt Everywhere to analyze, summarize, translate, or draft content without leaving your current workflow.

The MCP integration makes it possible to compose multi-step actions, fetch data from external services, and render results in place within the app, enabling AI-assisted productivity without switching apps or copying data around. The server-side component coordinates LLM calls, tool invocations, and UI interactions to deliver a responsive, context-aware experience.
How to install
Prerequisites:
- .NET 7 or .NET 10 SDK and runtime installed on your system
- Access to the Everywhere repository (clone or download release)
- (Optional) External API keys for LLM providers if you plan to use non-default models
Installation steps:

1. Clone the repository or download the build artifacts from the release page:

   ```shell
   git clone https://github.com/DearVa/Everywhere.git
   cd Everywhere
   ```

2. Build the project (if you have the source code):

   ```shell
   dotnet build --configuration Release
   ```

3. Run the server:

   ```shell
   dotnet run --configuration Release
   ```

   or, if you already have the compiled DLL:

   ```shell
   dotnet Everywhere.dll
   ```

4. Configure environment variables as needed (see the mcp_config env section) and ensure the server port is accessible from your MCP client.

5. If you prefer Docker or other environments, adapt the commands to the target runtime (the README does not provide a Docker image name by default).
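The environment-variable configuration above can be guarded with a small preflight check before launching. A minimal sketch, assuming bash and the variable names from the mcp_config env section (adjust the key names to your provider):

```shell
# Preflight check: verify the required environment variables are set
# before running `dotnet Everywhere.dll`. The variable names below
# follow the mcp_config env section; adjust them to your setup.
require_env() {
  for name in "$@"; do
    # ${!name} is bash indirect expansion: the value of the variable
    # whose name is stored in $name.
    if [ -z "${!name:-}" ]; then
      echo "missing: $name" >&2
      return 1
    fi
  done
  echo "environment ok"
}

# Example usage (uncomment to launch after the check passes):
# require_env OPENAI_API_KEY PORT && dotnet Everywhere.dll
```

Failing fast here gives a readable error instead of an authentication failure deep inside an LLM call.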
Additional notes
Tips and common considerations:
- If you see authentication or API errors, verify your LLM provider keys (OPENAI_API_KEY or AZURE_OPENAI_API_KEY) are correctly set in the environment.
- Ensure the port used by the MCP server is open and not blocked by a firewall.
- If runtime errors occur, check that the required .NET runtime version matches the one used to build the server.
- Update the environment with any required external service keys before starting the server.
- The MCP integration supports multiple tools (Web Browser, File System, Terminal, etc.). Explore the UI to discover available commands and workflows.
- For troubleshooting, consult the project docs and issue tracker on GitHub for common setup pitfalls and platform-specific notes.
Related MCP Servers
lobehub
The ultimate space for work and life — to find, build, and collaborate with agent teammates that grow with you. We are taking the agent harness to the next level — enabling multi-agent collaboration, effortless agent team design, and introducing agents as the unit of work interaction.
AstrBot
Agentic IM chatbot infrastructure that integrates many IM platforms, LLMs, plugins, and AI features, and can be your openclaw alternative. ✨
archestra
Secure cloud-native MCP registry, gateway & orchestrator
Risuai
Make your own story. User-friendly software for LLM roleplaying
tuui
A desktop MCP client designed as a tool unitary utility integration, accelerating AI adoption through the Model Context Protocol (MCP) and enabling cross-vendor LLM API orchestration.
better-chatbot
Just a Better Chatbot. Powered by Agent & MCP & Workflows.