# Integrations
Memorer works with any AI framework. These guides show the recommended pattern for adding persistent memory to your existing applications.
## Available guides
| Framework | Description |
|---|---|
| OpenAI | Usage pattern with the OpenAI Python SDK |
| Anthropic | Usage pattern with the Anthropic Python SDK |
| LangChain | Usage pattern with the LangChain framework |
## How it works
The pattern is the same for any framework:
- **Before the LLM call**: use `recall()` to get relevant memories and inject them as system prompt context
- **After the LLM call**: use `remember()` to store the conversation exchange as new memories
```python
from memorer import Memorer

client = Memorer(api_key="mem_sk_...")
user = client.for_user("user-123")

# 1. Recall relevant memories
results = user.recall(user_message)

# 2. Inject into system prompt
system_prompt = f"You are a helpful assistant.\n\nUser context:\n{results.context}"

# 3. Call your LLM (OpenAI, Anthropic, etc.)
response = call_your_llm(system_prompt, user_message)

# 4. Remember the exchange
user.remember(f"User: {user_message}\nAssistant: {response}")
```
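As a concrete illustration, here is how the `call_your_llm` step might look with the OpenAI Python SDK. This is a minimal sketch: the model name and example message are placeholders, and the full walkthrough lives in the OpenAI guide.

```python
from memorer import Memorer
from openai import OpenAI  # reads OPENAI_API_KEY from the environment

memorer = Memorer(api_key="mem_sk_...")
openai_client = OpenAI()

user = memorer.for_user("user-123")
user_message = "What should I cook tonight?"  # example input

# 1. Recall relevant memories and build the system prompt
results = user.recall(user_message)
system_prompt = f"You are a helpful assistant.\n\nUser context:\n{results.context}"

# 2. Call the LLM with the memory-augmented prompt
completion = openai_client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_message},
    ],
)
answer = completion.choices[0].message.content

# 3. Remember the exchange for future conversations
user.remember(f"User: {user_message}\nAssistant: {answer}")
```

The same two calls wrap any other SDK: only the middle step (the LLM call itself) changes between frameworks.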