Documentation

Learn how to use MetaMemory

Everything you need to give your AI agents persistent, intelligent memory.

Getting Started

Core Concepts

Memory Engine

The central orchestration layer that manages the full memory lifecycle: CRUD operations, automatic multi-vector embedding generation, emotional tagging, graph relationship creation, and episode tracking. Memories are indexed across four specialized vector spaces and retrieved through five parallel channels fused via weighted Reciprocal Rank Fusion.
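To make the fusion step concrete, here is a minimal sketch of weighted Reciprocal Rank Fusion over per-channel rankings. The function name, channel names, and the constant `k=60` are illustrative assumptions, not the MetaMemory API; the actual engine fuses five channels with learned weights.

```python
def weighted_rrf(rankings, weights, k=60):
    """Fuse per-channel memory rankings with weighted Reciprocal Rank Fusion.

    rankings: dict mapping channel name -> list of memory IDs, best first.
    weights:  dict mapping channel name -> channel weight.
    k:        RRF smoothing constant (60 is a common default; assumed here).
    """
    scores = {}
    for channel, ranking in rankings.items():
        w = weights[channel]
        for rank, mem_id in enumerate(ranking, start=1):
            # Each channel contributes w / (k + rank) to the memory's score.
            scores[mem_id] = scores.get(mem_id, 0.0) + w / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)
```

A memory ranked highly by several channels accumulates contributions from each, so it can outrank a memory that tops only one channel.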

Multi-Vector Encoding

Each memory is encoded into four specialized embedding spaces: semantic (1536d via OpenAI), emotional trajectory (132d capturing temporal emotion dynamics), process sequence (132d with positional encoding for action steps), and situational context (64d encoding task type, domain, complexity). Fusion weights α are learned per-context via gradient descent on retrieval feedback.
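A minimal sketch of how per-space similarities might be combined under learned fusion weights α, with a simple gradient-style update from retrieval feedback. The function names, the dictionary-of-vectors layout, and the plain renormalization step are assumptions for illustration; the real system learns α per context.

```python
import numpy as np

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def fused_score(query_vecs, memory_vecs, alpha):
    """Weighted sum of per-space cosine similarities.

    query_vecs / memory_vecs: dicts mapping space name -> embedding vector
    alpha: dict mapping space name -> fusion weight (assumed to sum to 1).
    """
    return sum(alpha[s] * cosine(query_vecs[s], memory_vecs[s]) for s in alpha)

def update_alpha(alpha, sims, reward, lr=0.01):
    """One illustrative gradient step on retrieval feedback.

    sims:   per-space similarities for the retrieved memory.
    reward: positive if the retrieval helped, negative otherwise.
    Weights are nudged toward spaces that contributed to useful retrievals,
    then renormalized to sum to 1 (a simplifying assumption).
    """
    updated = {s: alpha[s] + lr * reward * sims[s] for s in alpha}
    total = sum(updated.values())
    return {s: w / total for s, w in updated.items()}
```

Over many feedback rounds, a space whose similarities correlate with useful retrievals gains weight at the expense of the others.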

Adaptive Retrieval

A seven-layer self-improving system that continuously refines retrieval quality. Layers 1–2 use Thompson Sampling and Upper Confidence Bound (UCB) bandits for retrieval-strategy selection. Layers 3–4 apply gradient boosting (50 decision stumps, learning rate η=0.1) to predict retrieval effectiveness once 100+ feedback samples have accumulated. Layers 5–6 run Bayesian parameter optimization and apply LLM-discovered meta-memory rules. Layer 7 performs online drift detection with automatic rollback.
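The strategy-selection layer can be illustrated with a bare-bones Thompson Sampling bandit: each retrieval strategy keeps a Beta posterior over its success rate, and the strategy with the highest sampled draw is chosen. The class name, strategy names, and Beta(1, 1) priors are illustrative assumptions, not the shipped implementation.

```python
import random

class ThompsonSelector:
    """Thompson Sampling over retrieval strategies (illustrative sketch)."""

    def __init__(self, strategies):
        # Beta(1, 1) uniform prior on each strategy's success probability.
        self.stats = {s: [1, 1] for s in strategies}

    def select(self):
        # Sample a success rate from each posterior; pick the best draw.
        draws = {s: random.betavariate(a, b) for s, (a, b) in self.stats.items()}
        return max(draws, key=draws.get)

    def update(self, strategy, success):
        # Binary feedback: increment the Beta alpha on success, beta on failure.
        self.stats[strategy][0 if success else 1] += 1
```

Because selection samples from the posteriors rather than taking their means, under-explored strategies still get tried occasionally, which is what lets the layer keep adapting as retrieval conditions shift.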

Emotional Intelligence

Emotions are modeled as continuous trajectories rather than static labels, capturing the evolution of affect within an episode (e.g., frustration → insight → satisfaction). Each trajectory is encoded into a 132-dimensional embedding with volatility, trend, and range features. The emotional retrieval channel computes trajectory-level similarity, grounded in cognitive science research on affective memory.
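As a rough sketch of the trajectory features named above, the volatility, trend, and range of a per-turn valence series can be computed as follows. Representing the trajectory as a list of scalar valence scores, and defining trend as a least-squares slope, are simplifying assumptions; the full encoder produces a 132-dimensional embedding.

```python
import statistics

def trajectory_features(valence):
    """Summarize an episode's emotion trajectory (illustrative sketch).

    valence: per-turn emotion scores in [-1, 1], e.g. a frustration ->
    insight -> satisfaction episode rising from negative to positive.
    """
    diffs = [b - a for a, b in zip(valence, valence[1:])]
    n = len(valence)
    mean_x, mean_y = (n - 1) / 2, sum(valence) / n
    # Least-squares slope of valence over turn index, used here as "trend".
    slope = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(valence)) \
        / sum((x - mean_x) ** 2 for x in range(n))
    return {
        "volatility": statistics.pstdev(diffs),  # spread of turn-to-turn changes
        "trend": slope,                          # overall direction of affect
        "range": max(valence) - min(valence),    # emotional amplitude
    }
```

For a frustration-to-satisfaction episode like `[-0.6, -0.4, 0.3, 0.7]`, the trend is positive and the range is large, which is the kind of signal the trajectory-level similarity channel can match against.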

Guides

API Reference