
Conversations

Most apps either stuff the entire chat history into the prompt — expensive, and you hit context limits fast — or lose context entirely between sessions. Conversations solve both problems: you get recent messages and relevant long-term memories in a single recall() call, without bloating your token usage.
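The cost difference is easy to see with a back-of-the-envelope calculation. The sketch below is illustrative only (it assumes the common ~4 characters per token rule of thumb, not a real tokenizer, and `approx_tokens` is a hypothetical helper, not part of this SDK):

```python
# Rough illustration: replaying the full history grows without bound,
# while a recall()-style fixed window stays flat.
def approx_tokens(text: str) -> int:
    # Assumption: ~4 characters per token (rule of thumb, not a tokenizer).
    return len(text) // 4

history = [f"turn {i}: some message about the user's day" for i in range(500)]

# Naive approach: every prompt carries the entire history.
full_cost = approx_tokens("\n".join(history))

# Windowed approach: only the 10 most recent messages plus retrieved memories.
window_cost = approx_tokens("\n".join(history[-10:]))

print(full_cost, window_cost)
```

The windowed cost stays constant as the conversation grows, which is the point of combining a recent-message window with semantic retrieval instead of replaying everything.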

Creating a conversation

```python
user = client.for_user("user-123")

# Using the shortcut (creates a new conversation)
conv = user.conversation()
print(conv.id)  # UUID for this session

# Or resume an existing one
conv = user.conversation("existing-conversation-id")
```

Adding messages

```python
# Add a user message
conv.add("user", "I just moved to Portland")

# Add an assistant message
conv.add("assistant", "Welcome to Portland! How do you like it?")
```

Messages are automatically processed — entities and relationships are extracted and stored in the knowledge graph.
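To make "entities and relationships are extracted" concrete, here is a toy illustration of the *shape* of what gets stored. This is not the SDK's pipeline (the real extractor is far more capable than a regex); `toy_extract` is a hypothetical function used only to show what a stored triple looks like:

```python
import re

# Conceptual illustration only: shows the (subject, relation, object)
# triples that end up in a knowledge graph, not the real extraction logic.
def toy_extract(role: str, content: str):
    """Pull a lives_in triple from one simple message pattern."""
    m = re.search(r"I just moved to (\w+)", content)
    if m and role == "user":
        return [("user", "lives_in", m.group(1))]
    return []

triples = toy_extract("user", "I just moved to Portland")
print(triples)  # [('user', 'lives_in', 'Portland')]
```

Once a fact like this is in the graph, later calls to `recall()` can surface it even after the original message has scrolled out of the recent-message window.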

Retrieving context

```python
# Get conversation context + long-term memories for your LLM prompt
result = conv.recall(
    "Where does the user live?",
    include_conversation_context=True,  # Include recent messages
    max_recent_messages=10,             # Limit recent messages
    top_k=10,                           # Semantic search results
)

# Combined context string for LLM
print(result.context)

# Recent messages
for msg in result.conversation_context:
    print(f"[{msg.role}]: {msg.content}")

# Semantic search results
for r in result.results:
    print(f"{r.content} (score: {r.relevance_score})")
```
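A common next step is to fold the recall output into a chat-completion prompt: context in the system message, recent turns replayed, then the new question. A minimal sketch; `RecallResult` and `Message` below are stand-in dataclasses mirroring the fields shown on this page, not the SDK's own types, and `build_prompt` is a hypothetical helper:

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    role: str
    content: str

@dataclass
class RecallResult:
    # Stand-in for the object returned by conv.recall();
    # field names mirror the ones used above.
    context: str
    conversation_context: list = field(default_factory=list)
    results: list = field(default_factory=list)

def build_prompt(result: RecallResult, question: str) -> list:
    """Turn a recall result into a chat-completion message list."""
    system = (
        "You are a helpful assistant. Use this context about the user:\n"
        f"{result.context}"
    )
    messages = [{"role": "system", "content": system}]
    # Replay the recent conversation turns, then append the new question.
    for msg in result.conversation_context:
        messages.append({"role": msg.role, "content": msg.content})
    messages.append({"role": "user", "content": question})
    return messages

result = RecallResult(
    context="User lives in Portland.",
    conversation_context=[Message("user", "I just moved to Portland")],
)
prompt = build_prompt(result, "Where does the user live?")
```

The resulting `prompt` list can be passed directly to most chat-completion APIs that accept `{"role": ..., "content": ...}` messages.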

Listing conversations

```python
conversations = user.conversations.list()
for conv in conversations.conversations:
    print(f"{conv.id} - {conv.created_at} ({conv.message_count} messages)")
```