x-act
x-act is a library for composing AI assistants.
```
claude mcp add --transport stdio ox-ai-x-act npx -y ox-ai-x-act \
  --env PORT="Port to run the local server (default 3000)" \
  --env OX_AI_CACHE_DIR="Directory for local caches and models (optional)"
```
How to use
x-act is a Gen-AI library designed to run fully locally, providing tools to build, orchestrate, and serve AI-powered capabilities. It can be used as a CLI via npx or imported as a library into your own applications, letting you host AI models, pipelines, and components locally without relying on external services. The package aims to simplify composing Gen-AI workflows, managing components, and exposing APIs for integration with other local services.
To get started, install or run it via npx and explore the included tooling and APIs. Once running, you can use the built-in modules to create prompts, manage model selection, compose multi-step reasoning chains, and serve endpoints for development or testing. The library is designed to work offline with locally hosted models, with hooks to connect to remote services if needed.
How to install
Prerequisites:
- Node.js (14.x or newer) and npm/yarn installed on your system
- Basic familiarity with the command line
Installation steps:
- Install or run the package with npx (no global install needed):

```
npx -y ox-ai-x-act
```

- Alternatively, if you prefer to install locally for ongoing development, initialize a project and install the package:

```
mkdir my-genai-app
cd my-genai-app
npm init -y
npm install ox-ai-x-act
```

- Start the local server (if the package provides a server entry point) or import the library in your code as documented by the package:

```
node path/to/server.js
```

- Check the running service at http://localhost:3000 (or the port you configured).
Note: ensure your environment has sufficient memory, and if you plan to run large models locally, allocate appropriate CPU/GPU resources as required by those models.
Additional notes
Tips and common considerations:
- Environment variables: PORT sets the listening port (default 3000); OX_AI_CACHE_DIR points caches and model files at a local directory, which can improve performance on subsequent runs.
- If you encounter network-related issues with dependencies during first run, ensure your npm/yarn registry access is available and your network allows fetching packages.
- For production usage, consider configuring a reverse proxy and setting up proper TLS termination.
- Review the library's documentation for supported local models, supported runtimes, and any optional plugins or modules that extend capabilities.
- If you run into compatibility issues with Node.js versions, consult the project's compatibility matrix and consider using nvm to manage Node versions.
Related MCP Servers
mcp-redis
The official Redis MCP Server is a natural language interface designed for agentic applications to manage and search data in Redis efficiently.
mcp-odoo
A Model Context Protocol (MCP) server that enables AI assistants to securely interact with Odoo ERP systems through standardized resources and tools for data retrieval and manipulation.
mcp-logic
A fully functional AI logic calculator using Prover9/Mace4 via a Python-based Model Context Protocol (MCP) server; a tool for the Windows Claude app and other MCP clients.
apple-books
Apple Books MCP Server
mcpx-py
Python client library for https://mcp.run - call portable & secure tools for your AI Agents and Apps
packt-netops-ai-workshop
🔧 Build Intelligent Networks with AI