
Hippocampus


@ImpKind

npx machina-cli add skill @ImpKind/hippocampus --openclaw

Hippocampus - Memory System

"Memory is identity. This skill is how I stay alive."

The hippocampus is the brain region responsible for memory formation. This skill makes memory capture automatic, structured, and persistent—with importance scoring, decay, and semantic reinforcement.

Quick Start

# Install (defaults to last 100 signals)
./install.sh --with-cron

# Load core memories at session start
./scripts/load-core.sh

# Search with importance weighting
./scripts/recall.sh "query"

# Run encoding manually (usually via cron)
./scripts/encode-pipeline.sh

# Apply decay (runs daily via cron)
./scripts/decay.sh

Install Options

./install.sh                    # Basic, last 100 signals
./install.sh --signals 50       # Custom signal limit
./install.sh --whole            # Process entire conversation history
./install.sh --with-cron        # Also set up cron jobs

Core Concept

The LLM is just the engine—raw cognitive capability. The agent is the accumulated memory. Without these files, there's no continuity—just a generic assistant.

Memory Lifecycle

PREPROCESS → SCORE → SEMANTIC CHECK → REINFORCE or CREATE → DECAY

Key insight: Reinforcement happens automatically during encoding. When a topic comes up again, the LLM recognizes it's about an existing memory and reinforces instead of creating duplicates.

Memory Structure

$WORKSPACE/
├── memory/
│   ├── index.json           # Central weighted index
│   ├── signals.jsonl        # Raw signals (temp)
│   ├── pending-memories.json # Awaiting summarization (temp)
│   ├── user/                # Facts about the user
│   ├── self/                # Facts about the agent
│   ├── relationship/        # Shared context
│   └── world/               # External knowledge
└── HIPPOCAMPUS_CORE.md      # Auto-generated for OpenClaw RAG
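The layout above is a handful of directories. A minimal sketch of creating it, using a temp directory for illustration (install.sh presumably does this under the real $WORKSPACE):

```shell
# Recreate the documented memory layout (temp dir used only for illustration).
WORKSPACE="$(mktemp -d)"
for domain in user self relationship world; do
  mkdir -p "$WORKSPACE/memory/$domain"
done
ls "$WORKSPACE/memory"
```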

Scripts

| Script | Purpose |
|---|---|
| preprocess.sh | Extract signals from conversation transcripts |
| encode-pipeline.sh | Score signals, prepare for LLM summarization |
| decay.sh | Apply 0.99^days decay to all memories |
| recall.sh | Search with importance weighting |
| load-core.sh | Output high-importance memories for session start |
| sync-core.sh | Generate HIPPOCAMPUS_CORE.md |
| consolidate.sh | Weekly review helper |
| reflect.sh | Self-reflection prompts |
| generate-dashboard.sh | Generate unified brain dashboard HTML |

Brain Dashboard

Visual dashboard showing all installed brain skills:

./scripts/generate-dashboard.sh
open ~/.openclaw/workspace/brain-dashboard.html
  • Auto-detects other brain skills (amygdala, VTA)
  • Shows install prompts for missing skills
  • Regenerates on install and cron runs
  • Reads avatar/name from IDENTITY.md

After Installation (for agents)

When you install this skill, notify the user about the dashboard:

🧠 Hippocampus installed!

View your agent's memories in the Brain Dashboard: ~/.openclaw/workspace/brain-dashboard.html

All scripts use the $WORKSPACE environment variable (default: ~/.openclaw/workspace).
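A minimal sketch of that convention (assuming the scripts use standard shell parameter expansion; the actual implementation may differ):

```shell
# Resolve $WORKSPACE, falling back to the documented default when unset.
WORKSPACE="${WORKSPACE:-$HOME/.openclaw/workspace}"
echo "Using workspace: $WORKSPACE"
```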

Importance Scoring

Initial Score (0.0-1.0)

| Signal | Score |
|---|---|
| Explicit "remember this" | 0.9 |
| Emotional/vulnerable content | 0.85 |
| Preferences ("I prefer...") | 0.8 |
| Decisions made | 0.75 |
| Facts about people/projects | 0.7 |
| General knowledge | 0.5 |
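As an illustration only, the table above could be approximated with a keyword heuristic. The real scoring is done by the LLM during encoding; `score_signal` is a hypothetical helper, not part of the skill:

```shell
# Hypothetical heuristic mapping trigger phrases to initial scores.
# The actual pipeline scores signals with the LLM, not pattern matching.
score_signal() {
  case "$1" in
    *"remember this"*|*"remember that"*) echo 0.9 ;;   # explicit request
    *"I prefer"*|*"I always"*)           echo 0.8 ;;   # stated preference
    *) echo 0.5 ;;                                     # general knowledge
  esac
}

score_signal "Please remember this: no meetings before 10am"   # → 0.9
```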

Decay Formula

Based on Stanford Generative Agents (Park et al., 2023):

new_importance = importance × (0.99 ^ days_since_accessed)
  • After 7 days: 93% of original
  • After 30 days: 74% of original
  • After 90 days: 40% of original
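The formula can be checked with a one-liner. This is only a sketch for a single value; the bundled decay.sh iterates over index.json instead:

```shell
# Apply importance * 0.99^days to a single score (illustrative only).
decay() {
  awk -v i="$1" -v d="$2" 'BEGIN { printf "%.2f", i * (0.99 ^ d) }'
}

decay 0.85 30   # a 0.85 memory after 30 idle days → 0.63
```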

Semantic Reinforcement

During encoding, the LLM compares new signals to existing memories:

  • Same topic? → Reinforce (bump importance ~10%, update lastAccessed)
  • Truly new? → Create concise summary

This happens automatically—no manual reinforcement needed.
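A sketch of the bump arithmetic only. The reinforce-vs-create decision itself is made by the LLM, and the cap at 1.0 here is an assumption, not documented behavior:

```shell
# Bump importance by ~10%, capping at 1.0 (cap is an assumption).
reinforce() {
  awk -v i="$1" 'BEGIN { i *= 1.10; if (i > 1.0) i = 1.0; printf "%.2f", i }'
}

reinforce 0.8    # → 0.88
reinforce 0.95   # capped at 1.00
```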

Thresholds

| Score | Status |
|---|---|
| 0.7+ | Core — loaded at session start |
| 0.4-0.7 | Active — normal retrieval |
| 0.2-0.4 | Background — specific search only |
| <0.2 | Archive candidate |
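The thresholds translate directly into a classifier. A sketch; `status_for` is a hypothetical helper, not one of the skill's scripts:

```shell
# Map an importance score to its retrieval tier per the thresholds above.
status_for() {
  awk -v s="$1" 'BEGIN {
    if      (s >= 0.7) print "core"
    else if (s >= 0.4) print "active"
    else if (s >= 0.2) print "background"
    else               print "archive candidate"
  }'
}

status_for 0.85   # → core
status_for 0.3    # → background
```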

Memory Index Schema

memory/index.json:

{
  "version": 1,
  "lastUpdated": "2025-01-20T19:00:00Z",
  "decayLastRun": "2025-01-20",
  "lastProcessedMessageId": "abc123",
  "memories": [
    {
      "id": "mem_001",
      "domain": "user",
      "category": "preferences",
      "content": "User prefers concise responses",
      "importance": 0.85,
      "created": "2025-01-15",
      "lastAccessed": "2025-01-20",
      "timesReinforced": 3,
      "keywords": ["preference", "concise", "style"]
    }
  ]
}

Cron Jobs

The encoding cron is the heart of the system:

# Encoding every 3 hours (with semantic reinforcement)
openclaw cron add --name hippocampus-encoding \
  --cron "0 0,3,6,9,12,15,18,21 * * *" \
  --session isolated \
  --agent-turn "Run hippocampus encoding with semantic reinforcement..."

# Daily decay at 3 AM
openclaw cron add --name hippocampus-decay \
  --cron "0 3 * * *" \
  --session isolated \
  --agent-turn "Run decay.sh and report any memories below 0.2"

OpenClaw Integration

Add to memorySearch.extraPaths in openclaw.json:

{
  "agents": {
    "defaults": {
      "memorySearch": {
        "extraPaths": ["HIPPOCAMPUS_CORE.md"]
      }
    }
  }
}

This bridges hippocampus (index.json) with OpenClaw's RAG (memory_search).

Usage in AGENTS.md

Add to your agent's session start routine:

## Every Session
1. Run `~/.openclaw/workspace/skills/hippocampus/scripts/load-core.sh`

## When answering context questions
Use hippocampus recall:
```bash
./scripts/recall.sh "query"
```

Capture Guidelines

What Gets Captured

  • User facts: Preferences, patterns, context
  • Self facts: Identity, growth, opinions
  • Relationship: Trust moments, shared history
  • World: Projects, people, tools

Trigger Phrases (auto-scored higher)

  • "Remember that..."
  • "I prefer...", "I always..."
  • Emotional content (struggles AND wins)
  • Decisions made

Event Logging

Track hippocampus activity over time for analytics and debugging:

# Log an encoding run
./scripts/log-event.sh encoding new=3 reinforced=2 total=157

# Log decay
./scripts/log-event.sh decay decayed=154 low_importance=5

# Log recall
./scripts/log-event.sh recall query="user preferences" results=3

Events append to ~/.openclaw/workspace/memory/brain-events.jsonl:

{"ts":"2026-02-11T10:00:00Z","type":"hippocampus","event":"encoding","new":3,"reinforced":2,"total":157}

Use this for:

  • Trend analysis (memory growth over time)
  • Debugging encoding issues
  • Building dashboards
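A sketch of what an append-only logger like log-event.sh might do. The real script's flags and JSON field types may differ; values are quoted strings here for simplicity:

```shell
# Append one JSON event line per call (illustrative; the bundled
# log-event.sh may format fields differently, e.g. unquoted numbers).
LOG_FILE="$(mktemp)"

log_event() {
  event="$1"; shift
  extra=""
  for kv in "$@"; do
    extra="$extra,\"${kv%%=*}\":\"${kv#*=}\""
  done
  printf '{"ts":"%s","type":"hippocampus","event":"%s"%s}\n' \
    "$(date -u +%Y-%m-%dT%H:%M:%SZ)" "$event" "$extra" >> "$LOG_FILE"
}

log_event encoding new=3 reinforced=2 total=157
cat "$LOG_FILE"
```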

AI Brain Series

This skill is part of the AI Brain project — giving AI agents human-like cognitive components.

| Part | Function | Status |
|---|---|---|
| hippocampus | Memory formation, decay, reinforcement | ✅ Live |
| amygdala-memory | Emotional processing | ✅ Live |
| vta-memory | Reward and motivation | ✅ Live |
| basal-ganglia-memory | Habit formation | 🚧 Development |
| anterior-cingulate-memory | Conflict detection | 🚧 Development |
| insula-memory | Internal state awareness | 🚧 Development |

Memory is identity. Text > Brain. If you don't write it down, you lose it.

Source

git clone https://clawhub.ai/ImpKind/hippocampus

Overview

Hippocampus provides a persistent memory layer for AI agents, automatically capturing, structuring, and retaining memories across sessions. It uses importance scoring, decay, and semantic reinforcement to keep relevant knowledge while pruning older, less useful data, following the Stanford Generative Agents approach.

How This Skill Works

Memories enter a defined pipeline: PREPROCESS → SCORE → SEMANTIC CHECK → REINFORCE or CREATE → DECAY. Encoding reinforces existing memories when topics reappear to avoid duplicates. Core memory data is stored under a structured workspace with a central index.json and signals.jsonl, and routine tasks (cron) handle encoding and daily decay.

When to Use It

  • To maintain memory continuity across sessions and conversations
  • When you need recall that weighs memories by importance during retrieval
  • If you want automatic memory decay to prevent bloat and drift over time
  • When recurring topics should be reinforced rather than recreated as duplicates
  • To manage memory maintenance and visibility via built-in cron jobs and scripts

Quick Start

  1. Install and configure (defaults to the last 100 signals): ./install.sh --with-cron
  2. Load core memories at session start: ./scripts/load-core.sh
  3. Recall and maintain memory: ./scripts/recall.sh "query"; then run ./scripts/encode-pipeline.sh and ./scripts/decay.sh as needed

Best Practices

  • Tune memory size by using install.sh options (e.g., --signals) to control signals kept
  • Run decay.sh daily to apply 0.99^days decay to all memories
  • Use recall.sh with importance weighting to retrieve top memories for a query
  • Load core memories at session start with load-core.sh to bootstrap context
  • Regenerate or review HIPPOCAMPUS_CORE.md with sync-core.sh to keep docs in sync

Example Use Cases

  • A customer-support agent remembers user preferences and past tickets across chats
  • A project assistant recalls decisions, stakeholders, and project context over time
  • An educational tutor retains topic mastery and student preferences across sessions
  • A shopping assistant links past interactions to provide personalized recommendations
  • A smart home agent remembers routines and environmental context for smoother automation
