
llm-functions

Easily create LLM tools and agents using plain Bash/JavaScript/Python functions.

Installation
Run this command in your terminal to add the MCP server to Claude Code.
claude mcp add --transport stdio sigoden-llm-functions node mcp/server.js \
  --env PORT=8080

Set PORT to match your server's configuration; 8080 is only an example.

How to use

LLM Functions provides a framework for exposing tools and agents to language models via OpenAI's function-calling protocol. It supports tools written in Bash, JavaScript, and Python, and uses argc to automatically generate function declarations from code comments. You can assemble tools and agents, build a shared bin directory and declaration files, and then connect them to AIChat or your own MCP bridge. The system emphasizes discovering, listing, and invoking capabilities (tools) and orchestrating agents that call those tools as needed. Use the included build and check steps to verify your environment, then start your MCP server to serve declarations to clients that support the Model Context Protocol.
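As an illustration of the comment-driven approach, a Bash tool is a script whose argc comment tags (`@describe`, `@option`) are parsed into the tool's function declaration, and whose options arrive as `argc_*` variables. The sketch below follows the pattern shown in the project's README with a hypothetical `get_weather.sh` tool; the final `eval` line requires argc to be installed:

```shell
mkdir -p tools
cat > tools/get_weather.sh <<'EOF'
#!/usr/bin/env bash
set -e

# @describe Get the current weather for a city (hypothetical example tool)
# @option --city! The city to look up

# argc turns the comment tags above into the tool's JSON declaration;
# each option becomes an argc_* shell variable inside main().
main() {
    echo "Fetching weather for: $argc_city"
}

eval "$(argc --argc-eval "$0" "$@")"
EOF
chmod +x tools/get_weather.sh
```

After adding a tool script, list it per the repository's conventions and rerun the build step so its declaration is regenerated.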

To work with the server, you typically run the MCP server component (as configured in mcp_config) and then use your MCP bridge or AIChat integration to access the tools and agents. Tools are defined by comments in Bash, JavaScript, or Python scripts, and agents are defined with an index.yaml and accompanying tools, enabling sophisticated workflows through function calling. When integrated with AIChat, you can expose a functions_dir that makes all tools and agents available to LLMs for dynamic tool invocation.
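For agents, the `index.yaml` layout looks roughly like the following; the field names here follow our reading of the project's examples and should be treated as a sketch, so check the repository's agent templates before relying on them:

```yaml
# agents/todo/index.yaml -- a minimal, hypothetical agent definition
name: todo
description: An agent that manages a todo list
version: 0.1.0
instructions: >
  You are a todo manager. Use the provided tools to add, list,
  and remove todo items.
conversation_starters:
  - What can you do?
```

The agent's callable tools typically live alongside this file in the same agent directory; rebuild afterwards so the agent's declarations are regenerated.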

How to install

Prerequisites

  • Git
  • Node.js (and npm) or Python, depending on how you plan to run the server
  • jq (for JSON processing)
  • argc (CLI tooling for building and wiring tools/agents)

Installation steps (Node.js path)

  1. Install prerequisites and verify they are available (example checks):
# Node.js is assumed installed
node -v
npm -v
jq --version
  2. Clone the repository:
git clone https://github.com/sigoden/llm-functions.git
cd llm-functions
  3. Install dependencies (if a package.json exists at the project root or within the mcp path):
npm install
  4. Build tools/agents and verify:
npx argc build
npx argc check
  5. Run the MCP server (as configured in mcp_config):
node mcp/server.js
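Because the server speaks MCP over stdio, you can smoke-test it by piping JSON-RPC messages into it. The messages below are standard MCP requests rather than anything specific to this project, and the exact capabilities returned depend on the server:

```json
{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}
{"jsonrpc":"2.0","method":"notifications/initialized"}
{"jsonrpc":"2.0","id":2,"method":"tools/list"}
```

For example, save these lines to a file and run `node mcp/server.js < requests.jsonl`; a healthy server should answer the `tools/list` request with the declarations generated by the build step.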

Installation steps (Python path)

  1. Ensure Python is installed (3.8+)
  2. Install dependencies (if a requirements.txt exists):
python3 -m pip install -r requirements.txt
  3. Run any Python-based MCP server entrypoint if provided in the repo, or follow project-specific instructions in the README for starting the bridge/server.
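On the Python side, tools follow the same comment-driven idea: the function's docstring (including its Args section) is what the build step turns into the JSON declaration. The entry-point name `run` and the docstring convention below reflect our reading of the project's examples and are a sketch, not a guaranteed interface:

```python
# tools/list_files.py -- a hypothetical Python tool for llm-functions.
import os


def run(path: str):
    """List the files in a directory.

    Args:
        path: The directory to list
    """
    # The docstring above is parsed into the tool's declaration;
    # the body is ordinary Python. Real tools report results on stdout.
    names = sorted(os.listdir(path))
    print("\n".join(names))
    return names  # returned as well for easy testing
```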

Note: The project documentation mentions building with argc and integrating via AIChat. If you are using a prebuilt MCP bridge or a different entrypoint, adapt the start command to the script that launches the MCP server. If you rely on a containerized setup, you can adapt the commands to a docker run sequence as described in your deployment workflow.

Additional notes

Tips and common notes:

  • Prerequisites include argc and jq for tool generation and JSON processing. Ensure both are installed and available in your PATH.
  • The MCP integration relies on a functions directory (AIChat) or an MCP bridge to expose tools/agents to LLMs. Configure the AICHAT_FUNCTIONS_DIR environment variable if you want to link directly to AIChat.
  • When defining tools and agents, comments in code are used to auto-generate JSON declarations (tools.json). Keep your annotations up to date for consistent behavior.
  • If you encounter port conflicts, adjust the PORT environment variable in the mcp_config for your server.
  • For web-based or container deployments, you can adapt the mcp_config to docker, npx, or other runners as appropriate for your ecosystem.
  • If using AIChat, ensure your tools and agents appear in the functions_dir expected by AIChat and verify that the function calling interface is correctly wired between the LLM and your MCP backend.
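For reference, the generated declarations follow the standard OpenAI function-calling schema. A Bash tool annotated with an `@describe` comment and a required `--city` option would produce an entry shaped roughly like this (the schema is standard; the surrounding file layout is the project's own):

```json
{
  "name": "get_weather",
  "description": "Get the current weather for a city",
  "parameters": {
    "type": "object",
    "properties": {
      "city": {
        "type": "string",
        "description": "The city to look up"
      }
    },
    "required": ["city"]
  }
}
```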
