
AI-Learning

AI learning - more information in the project wiki

Installation
Run this command in your terminal to add the MCP server to Claude Code. Both environment variables are optional: AI_LEARNING_ROOT sets the root directory for AI-Learning data, and AI_LEARNING_DEBUG="true" enables verbose logs.

Command
claude mcp add --transport stdio jacobt2006-ai-learning \
  --env AI_LEARNING_ROOT="/path/to/data" \
  --env AI_LEARNING_DEBUG="true" \
  -- node server.js

How to use

AI-Learning is an MCP server that bridges an AI model with parts of your local environment in a controlled way. It lets the AI read and edit files, interact with Git repositories, access notes from apps like Obsidian, and safely run simple scripts or commands through a defined interface. The server exposes a set of tools that Claude Code, the Gemini CLI, or other MCP clients can call to perform tasks such as file I/O, repository management, note retrieval, and script execution, while keeping direct access to your system restricted behind the bridge.

Once running, you can invoke the server using your MCP client tooling to request specific actions. For example, you can ask it to fetch a file's contents, update a note, clone or pull a Git repository, or execute a small shell command. The server will return results or outputs that you can feed back to the AI for further reasoning or iteration. Use the provided tooling to manage permissions and scope (which folders or repos are accessible) to maintain safety boundaries while enabling useful automation and assistance.
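As a concrete sketch of such a request, the snippet below builds the JSON-RPC message an MCP client sends over stdio to invoke a tool. The "tools/call" method and one-JSON-object-per-line framing come from the Model Context Protocol; the tool name "read_file" and its arguments are hypothetical, since this page does not list the server's actual tool names.

```javascript
// Hypothetical MCP tool-call request. The "tools/call" method is part of the
// Model Context Protocol; the tool name "read_file" and its arguments are
// assumptions, not confirmed AI-Learning tool names.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "read_file",                    // hypothetical tool name
    arguments: { path: "notes/todo.md" }, // hypothetical arguments
  },
};

// The MCP stdio transport frames messages as one JSON object per line.
process.stdout.write(JSON.stringify(request) + "\n");
```

The server's reply arrives on its stdout the same way: a single JSON-RPC response object whose result you can feed back to the AI.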

How to install

Prerequisites:

  • Node.js (LTS) and npm installed on your machine
  • Basic familiarity with command line
  • Access to the internet to pull dependencies

Installation steps:

  1. Install Node.js and npm if you don’t have them.

  2. Clone or download the AI-Learning MCP server repository (or use npm if published).

  3. Install dependencies:

    • npm install
  4. Start the MCP server:

    • npm start
    • or node server.js
  5. Optional: configure environment variables before starting (see Additional notes). Example:

    • AI_LEARNING_ROOT=/path/to/data
    • AI_LEARNING_DEBUG=true
  6. Verify the server is running by checking its logs or by listing it from an MCP client (for example, by running claude mcp list in Claude Code).

If you prefer a one-line setup (as suggested by the repository’s Gemini CLI guidance), you can use the install script once Node.js is present.

Note: The exact startup command may vary depending on how you deploy the server (development vs. production). Adjust the start command and environment as needed.

Additional notes

Tips and caveats:

  • Start with a limited access configuration: restrict AI-Learning to specific directories or Git repositories to minimize risk.
  • Use the AI_LEARNING_ROOT environment variable to confine the working directory for file operations.
  • Enable verbose logging (AI_LEARNING_DEBUG=true) only for troubleshooting, then disable to reduce log noise.
  • Ensure you trust the AI’s outputs when performing file edits or script executions; consider implementing a review step before applying changes.
  • If you use Obsidian notes, keep the notes directory permissions tight to avoid unintended data exposure.
  • If you encounter port or binding issues, verify that the port used by the MCP server is open in your firewall and not already in use by another process.
  • Regularly update dependencies and monitor for security advisories related to any packages you install as part of the MCP server.
  • Check for compatibility notes specific to your OS (macOS/Linux/Windows) in the repository or release notes.
