factorial
Open-source MCP server implementing the Model Context Protocol to connect AI agents with Factorial, the HR Management System.
claude mcp add --transport stdio ratek-20-factorial-mcp-server -- \
  docker run -i --rm -p 7000:7000 \
    -v factorial-mcp-server_cache:/app/data \
    -e EXIT_ON_EOF=true \
    --env-file <absolute-path-to-your-.env-file> \
    ghcr.io/ratek-20/factorial-mcp-server:latest --transport stdio
How to use
This MCP server exposes a set of tools that let AI agents interact with the Factorial HR system for time-off management. The server communicates over the STDIO transport, so you typically run it inside a Docker container or from a local environment and connect an AI client that speaks MCP over STDIO. The available tools cover authorization, employee information retrieval, and full lifecycle management of time-off requests: querying leave balances, submitting requests, updating them, and approving or deleting time-off records. To use the server, configure your AI client (Gemini CLI, Claude Code, or Copilot) with the provided mcpServers entry and point it at the server deployment. When you invoke a tool, the server handles OAuth2 authorization with Factorial, performs the requested operation, and returns a structured result that the AI agent can summarize in natural language.
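As a rough illustration, an mcpServers entry for this server might look like the following. This mirrors the Docker invocation from the installation steps; the server key ("factorial") and the env-file path are placeholders, and the exact config file location and schema depend on your client:

```json
{
  "mcpServers": {
    "factorial": {
      "command": "docker",
      "args": [
        "run", "-i", "--rm",
        "-p", "7000:7000",
        "-v", "factorial-mcp-server_cache:/app/data",
        "-e", "EXIT_ON_EOF=true",
        "--env-file", "<absolute-path-to-your-.env-file>",
        "ghcr.io/ratek-20/factorial-mcp-server:latest",
        "--transport", "stdio"
      ]
    }
  }
}
```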
How to install
Prerequisites:
- Docker installed and accessible on your machine
- Access to the Factorial OAuth2 credentials (Application ID and Secret)
- An environment file containing your OAuth2 credentials
Installation steps:
1. Prepare OAuth2 credentials
   - Create a file named .env with your OAuth2 details:
     OAUTH2_APPLICATION_ID=<your-actual-id>
     OAUTH2_APPLICATION_SECRET=<your-actual-secret>
2. Acquire the MCP server image
   - Pull the latest Factorial MCP server image (Docker):
     docker pull ghcr.io/ratek-20/factorial-mcp-server:latest
3. Run the MCP server (example using the Docker CLI)
   - Start the server container, exposing port 7000 and mounting a cache directory:
     docker run -i --rm -p 7000:7000 \
       -v factorial-mcp-server_cache:/app/data \
       -e EXIT_ON_EOF=true \
       --env-file <absolute-path-to-your-.env-file> \
       ghcr.io/ratek-20/factorial-mcp-server:latest \
       --transport stdio
4. Connect your AI client
   - Configure your AI client (Gemini CLI, Copilot, or Claude) with the mcpServers entry pointing to the running container, using the --transport stdio option as shown in the README examples.
Notes:
- Ensure the OAuth redirect URI in Factorial matches your deployment (the README specifies http://127.0.0.1:7000/oauth2-callback for local use).
- The cache directory (factorial-mcp-server_cache) is used to store GET responses to reduce API calls to Factorial.
- If running behind a proxy or in a VM, adjust port mappings accordingly.
Additional notes
Tips and common considerations:
- The server communicates via STDIO; ensure the client tool you use supports this transport and that the container has access to your OAuth2 credentials via the env-file.
- The cache is in-memory by default on the server side; for longer-lived deployments, consider persistent storage if the MCP server implementation exposes it.
- When configuring your AI client, include only the tools you actually plan to use to reduce cognitive load for the agent (see the list of available tools in the README).
- If you encounter OAuth2 authorization issues, verify that the Redirect URI configured in Factorial exactly matches the one the MCP server uses, and confirm the Application ID and Secret are correct.
- The server supports a range of time-off operations: get_available_vacation_days, get_leave_types, request_time_off, read_time_offs, approve_time_off, update_time_off, delete_time_off, get_current_employee, get_employee, and authorize.
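Under the hood, each of these tools is invoked as a standard MCP tools/call request (JSON-RPC 2.0 over STDIO). The sketch below shows the shape of such a message; the argument names for request_time_off are hypothetical — query the server's tools/list endpoint for the real schemas:

```python
import json

def make_tool_call(request_id: int, tool: str, arguments: dict) -> str:
    """Serialize an MCP tools/call request (JSON-RPC 2.0, sent over STDIO)."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical arguments -- check the schema returned by tools/list.
msg = make_tool_call(1, "request_time_off", {
    "leave_type_id": 7,
    "start_on": "2025-08-01",
    "finish_on": "2025-08-05",
})
print(msg)
```

Your AI client builds and sends these messages for you; the sketch is only meant to show what "invoking a tool" means at the protocol level.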
Related MCP Servers
nutrient-dws
A Model Context Protocol (MCP) server implementation that integrates with the Nutrient Document Web Service (DWS) Processor API, providing powerful PDF processing capabilities for AI assistants.
janee
Secrets management for AI agents via MCP • @janeesecure
unity
A Unity MCP server that allows MCP clients like Claude Desktop or Cursor to perform Unity Editor actions.
local-gateway
Aggregate multiple MCP servers into a single endpoint with web UI, OAuth 2.1, and profile-based tool management
memory
An MCP (Model Context Protocol) server providing long-term memory for LLMs
MCP-for-Hashing
A Model Context Protocol (MCP) server for calculating MD5 and SHA-256 hashes, complete with tools and guides for understanding and building MCP servers. Works with Claude Desktop & VSCode.