The Hippocampus Protocol
Context Lifecycle Management for AI Agents
Don't store content. Store the index.
Retrieve when needed. Forget when not.
By Roman Godz & R2D2
Live Results
Real production data from R2D2's first memory compaction
Decay Configuration
Per-Type Decay Rates
Retention Floors
How It Works
Simple 4-step flow that mimics biological memory formation
Classify
Each message gets a type
(decision, user_intent, tool_result, ephemeral)
Score
Importance based on type + modifiers
(recency, size, references)
Decay
Exponential decay with per-type rates
retention = importance × e^(-λ × age)
Index
Three tiers
full keep (≥0.65), compressed (<0.65), sparse pointer (<0.25)
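The four steps above can be sketched as a small scoring function. The formula and the 0.65/0.25 tier thresholds come from this section; the per-type λ values are illustrative assumptions, not the protocol's actual defaults.

```typescript
// Sketch of the classify → score → decay → index flow.
// Per-type λ values below are assumed for illustration.
type EntryType = "decision" | "user_intent" | "tool_result" | "ephemeral";

const LAMBDA: Record<EntryType, number> = {
  decision: 0.02,    // decays slowly (assumed)
  user_intent: 0.05, // assumed
  tool_result: 0.2,  // assumed
  ephemeral: 0.5,    // assumed
};

// retention = importance × e^(-λ × age)
function retention(importance: number, type: EntryType, ageTurns: number): number {
  return importance * Math.exp(-LAMBDA[type] * ageTurns);
}

type Tier = "full" | "compressed" | "sparse_pointer";

// Three tiers from the step above: full keep (≥0.65),
// compressed (<0.65), sparse pointer (<0.25).
function tier(r: number): Tier {
  if (r >= 0.65) return "full";
  if (r >= 0.25) return "compressed";
  return "sparse_pointer";
}
```

For example, under the assumed λ = 0.2, a tool result with importance 0.9 falls below the 0.25 threshold after ten idle turns and collapses to a sparse pointer.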
Context Without Lifecycle
AI agents don't have a memory problem — they have a forgetting problem. The brain solved this with active decay. We can too.
Context Bloat
One browser snapshot: 52,000 tokens. One config schema: 285,000 tokens. Tool outputs accumulate until context overflows. 62% are never referenced again.
$2,250/month Wasted
At Claude Opus pricing, carrying 150K tokens of stale tool outputs costs $75/day. Multiply by sessions. Most of those tokens are dead weight.
Truncation Kills Memory
Summarization loses specificity. Sliding windows drop critical early context. Neither approach knows what's important — they just cut blindly.
Key Features
Sparse Indexing
Context holds pointers, not content. 85× average compression. A 200K token context becomes ~2,400 tokens of index entries.
Active Decay
Every entry loses strength over time (exponential decay curve). Critical items resist decay. Unused tool outputs fade naturally. No manual cleanup.
Pattern Completion
Decayed entries aren't deleted — they become 'silent engrams'. Re-fetch from source when needed. 89% retrieval success rate.
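Pattern completion might look like the sketch below: a decayed entry keeps only its source pointer, and content is re-fetched on demand. The `fetchers` registry and `recall` function are hypothetical names; real fetchers would be asynchronous.

```typescript
// Sketch: a decayed entry keeps a pointer; re-fetch on demand.
// `fetchers` and `recall` are hypothetical names for illustration.
interface Source {
  type: string;      // how to re-fetch, e.g. "file" or "browser"
  ref: string;       // external reference (path, URL, ...)
  retrievable: boolean;
}

type Fetcher = (ref: string) => string | undefined;

// Registry mapping source types to fetch functions (assumed).
const fetchers: Record<string, Fetcher> = {};

function recall(source: Source, cached?: string): string | undefined {
  if (cached !== undefined) return cached;   // still in context: no fetch needed
  if (!source.retrievable) return undefined; // silent engram, unrecoverable
  const fetch = fetchers[source.type];
  return fetch ? fetch(source.ref) : undefined; // pattern completion from source
}
```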
Priority Modifiers
Critical: no decay. High: slow decay. Normal: standard. Low: fast decay. Encoding depth and associations further modify rates.
Consolidation Cycles
Light (every turn): recalculate strengths. Deep (session end): compress to essentials. Archival (periodic): promote to long-term memory.
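The three cycles could be sketched as follows. The function bodies are illustrative assumptions about what each cycle touches, reusing the 0.65 full-keep threshold from the flow above.

```typescript
// Sketch of the three consolidation cycles; bodies are assumptions.
interface Entry { strength: number; decayRate: number; floor: number; }

// Light (every turn): recalculate strengths, respecting each floor.
function lightCycle(entries: Entry[]): void {
  for (const e of entries) {
    e.strength = Math.max(e.floor, e.strength - e.decayRate);
  }
}

// Deep (session end): keep only essentials (0.65 threshold assumed
// to match the full-keep tier above).
function deepCycle(entries: Entry[]): Entry[] {
  return entries.filter((e) => e.strength >= 0.65);
}

// Archival (periodic): promote survivors to long-term memory
// (neocortex.md in this architecture), sketched as a callback.
function archivalCycle(entries: Entry[], promote: (e: Entry) => void): void {
  entries.forEach(promote);
}
```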
Lifecycle Policies
Declarative rules for automatic management. 'Browser snapshots decay at 0.2/turn'. 'User instructions persist'. YAML-based configuration.
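The two example rules above might map to a policy file like this. The key names are hypothetical; only the 0.2/turn browser-snapshot rate and the persist-user-instructions rule come from the text.

```yaml
# Hypothetical lifecycle policy file; key names are illustrative.
policies:
  - match: { type: tool_result, tool: browser_snapshot }
    decay_rate: 0.2      # strength lost per turn (from the example above)
  - match: { type: user_intent }
    priority: critical   # "user instructions persist": no decay
    decay_rate: 0.0
```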
Part of Agent Brain Architecture
Four protocols inspired by neuroscience. Each handles a different aspect of agent cognition.
hippocampus.md manages context lifecycle • neocortex.md structures long-term memory
defrag.md consolidates during sleep • synapse.md shares across agents
Real Benchmarks
Measured on a production agent (R2D2) over 7 days.
| Approach | Context Size | Compactions/Day | Daily Cost | Info Loss |
|---|---|---|---|---|
| No management | 145K tokens | 12 | $43 | High (truncation) |
| Aggressive summarization | 40K tokens | 4 | $18 | Medium (lossy) |
| hippocampus.md ✦ | 18K tokens | 2 | $8 | Low (retrievable) |
Get Started
Implement context lifecycle management in your agent.
```typescript
// Context entry schema (TypeScript)
interface HippocampusEntry {
  id: string;
  type: "tool_result" | "message" | "state";
  source: {
    type: string;        // How to re-fetch
    ref: string;         // External reference
    retrievable: boolean;
  };
  summary: string;       // Human + LLM readable
  key_data?: Record<string, unknown>;
  strength: number;      // 0.0 - 1.0
  decay: {
    rate: number;        // Strength loss per turn
    floor: number;       // Minimum strength
    last_access: number; // Turn of last access
  };
  modifiers: {
    priority: "critical" | "high" | "normal" | "low";
    encoding: "manual" | "auto";
    associations: string[];
  };
}
```