# LangChain Integration

Drop-in `BaseMemory` adapter for LangChain agents and chains.

## Overview
MetaMemory provides a drop-in `BaseMemory` adapter for LangChain. Replace your existing `ConversationBufferMemory` import with the MetaMemory adapter, and your chains get persistent multi-vector memory with five-channel retrieval. Zero architecture changes required.
## Installation

```bash
npm install metamemory @langchain/core
```

## Basic Usage
```typescript
import { MetaMemoryAdapter } from 'metamemory/integrations/langchain';
import { ConversationChain } from 'langchain/chains';
import { ChatOpenAI } from '@langchain/openai';

const memory = new MetaMemoryAdapter({
  userId: 'agent-1',
  // All standard MemoryEngine options are supported
  similarityThreshold: 0.55,
});

const chain = new ConversationChain({
  llm: new ChatOpenAI({ modelName: 'gpt-4' }),
  memory,
});

const response = await chain.call({
  input: 'What happened with the Redis deployment last week?',
});
```

## What the Adapter Does
Behind the scenes, the adapter:
- Implements LangChain's `BaseMemory` interface
- Stores each conversation turn as a memory with automatic multi-vector encoding
- On retrieval, runs five-channel adaptive search and returns relevant context
- Tracks emotional state per turn when emotion data is available
- Groups conversation turns into episodes automatically
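The save/retrieve cycle behind those steps can be sketched with a self-contained toy. Everything below is illustrative, not MetaMemory's implementation: `ToyMemoryAdapter` is a hypothetical class that stores turns in a plain array and filters by naive substring match, standing in where the real adapter performs multi-vector encoding and five-channel adaptive search. The method names mirror LangChain's `BaseMemory` contract.

```typescript
interface Turn {
  input: string;
  output: string;
}

// Hypothetical stand-in for the adapter (illustration only).
class ToyMemoryAdapter {
  private turns: Turn[] = [];

  // Mirrors BaseMemory.saveContext: called after each turn to persist it.
  async saveContext(
    inputs: { input: string },
    outputs: { output: string },
  ): Promise<void> {
    this.turns.push({ input: inputs.input, output: outputs.output });
  }

  // Mirrors BaseMemory.loadMemoryVariables: called before each turn.
  // Naive substring match stands in for five-channel adaptive search.
  async loadMemoryVariables(values: { input: string }): Promise<{ history: string }> {
    const query = values.input.toLowerCase();
    const relevant = this.turns.filter(
      (t) =>
        t.input.toLowerCase().includes(query) ||
        t.output.toLowerCase().includes(query),
    );
    return {
      history: relevant
        .map((t) => `Human: ${t.input}\nAI: ${t.output}`)
        .join('\n'),
    };
  }
}
```

Because the chain only ever calls these two methods, the adapter can swap in arbitrarily sophisticated storage and retrieval without the chain noticing.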
## Migration from ConversationBufferMemory
Replace the import and initialization. No other code changes needed:
```typescript
// Before
import { ConversationBufferMemory } from 'langchain/memory';
const memory = new ConversationBufferMemory();

// After
import { MetaMemoryAdapter } from 'metamemory/integrations/langchain';
const memory = new MetaMemoryAdapter({ userId: 'agent-1' });
```
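The reason the swap needs no other changes is that chains talk to memory only through the `BaseMemory` methods, so any conforming implementation is interchangeable. The sketch below makes that concrete with a simplified local interface and a hypothetical `CountingMemory`; the method names mirror LangChain's `loadMemoryVariables`/`saveContext`, but nothing here is the real library.

```typescript
// Simplified local version of the BaseMemory contract (assumed names).
interface MemoryLike {
  loadMemoryVariables(values: Record<string, unknown>): Promise<Record<string, unknown>>;
  saveContext(
    inputs: Record<string, unknown>,
    outputs: Record<string, unknown>,
  ): Promise<void>;
}

// A chain-like consumer written only against the interface: it loads
// context, produces an output, and saves the turn back to memory.
async function runTurn(memory: MemoryLike, input: string): Promise<number> {
  const vars = await memory.loadMemoryVariables({ input });
  await memory.saveContext({ input }, { output: 'ok' });
  return vars.turns as number;
}

// Any implementation of the contract can be passed to runTurn unchanged —
// this is the property that makes the adapter swap drop-in.
class CountingMemory implements MemoryLike {
  private n = 0;
  async loadMemoryVariables() {
    return { turns: this.n };
  }
  async saveContext() {
    this.n += 1;
  }
}
```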