How to Add Long-Term Memory to AI Companions: A Step-by-Step Guide
You can find a notebook with all the code mentioned in this guide here.
AI companions are among the most obvious and exciting use cases for large language models (LLMs). However, they have a problem: they can't retain memories about the user across conversations unless the developer resends complete conversation transcripts with every request. This prevents users from forming long-term relationships with their companions.
Mem0 solves this by providing a self-updating memory layer, allowing your AI to maintain coherent, personalized conversations over time. It learns preferences and traits shared by users and companions and helps surface them at the right moment in conversation.
Today, we’ll explore how to create an AI companion with long-term memory using Mem0.
For this example, we use OpenRouter for LLM access. OpenRouter provides access to top-tier language models for a wide range of use cases, including gryphe/mythomax-l2-13b, which is highly rated for role-play.
Let's dive in and bring your AI companion to life!
The Importance of Memory in AI Companions
Memory is a crucial component in creating more human-like AI interactions. Here's why:
- Personalization: With memory, AI companions can tailor their responses based on past interactions, user preferences, and personal information the user shares.
- Coherence and Consistency: By maintaining context and consistent personality across multiple conversations, memory helps prevent contradictions and ensures relevant responses.
- Cost savings: Users don't need to repeat information, and the companion doesn't need the full transcript in every prompt, which reduces the LLM tokens consumed and lowers costs.
- Increased engagement: The ability to recall past conversations creates a more natural and engaging interaction, mimicking human-like relationships.
Quick Start Guide
To get our AI companion up and running:
- Create a project folder and, inside it, create a companion.py file.
- Install the required dependencies: pip install mem0ai httpx.
- Get your Mem0 API key from https://app.mem0.ai/dashboard/api-keys.
- Get your OpenRouter API key from https://openrouter.ai/settings/keys.
That's it! You are now ready to start coding the AI Companion.
Technical Walkthrough
Let's break down the key components of our AI companion starter:
import httpx
from collections import deque

from mem0 import MemoryClient

# Initialize Mem0 and set the OpenRouter API key
mem0_client = MemoryClient(api_key="YOUR_MEM0_API_KEY")
OPENROUTER_API_KEY = "YOUR_OPENROUTER_API_KEY"
We start by importing the necessary libraries and initializing our Mem0 client. Replace the placeholder strings with the Mem0 and OpenRouter API keys you created earlier, or load them from environment variables as sketched below.
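If you'd rather keep the keys out of the source file, here is a minimal sketch that reads them from environment variables instead. It assumes you export MEM0_API_KEY and OPENROUTER_API_KEY in your shell, or keep them in a local .env file with the python-dotenv package installed (pip install python-dotenv); these variable names are illustrative, not required by Mem0 or OpenRouter.

import os

from dotenv import load_dotenv  # assumes python-dotenv is installed
from mem0 import MemoryClient

load_dotenv()  # reads variables from a local .env file, if one exists

# Hypothetical variable names -- use whatever names you exported
mem0_client = MemoryClient(api_key=os.environ["MEM0_API_KEY"])
OPENROUTER_API_KEY = os.environ["OPENROUTER_API_KEY"]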
Memory Management
def store_memory(content, entity_id, is_agent=False):
    # Store a new memory for either the user or the AI companion
    mem0_client.add(
        content,
        agent_id=entity_id if is_agent else None,
        user_id=entity_id if not is_agent else None,
    )

def search_memories(query, entity_id, is_agent=False):
    # Retrieve memories based on a query for either the user or the AI companion
    return "\n".join(
        [
            m["memory"]
            for m in mem0_client.search(
                query,
                agent_id=entity_id if is_agent else None,
                user_id=entity_id if not is_agent else None,
            )
        ]
    )
These functions handle storing and retrieving memories using Mem0. The store_memory
function adds new memories, while search_memories
retrieves relevant memories based on a query.
The key aspects of this memory management system are:
- Separation of User and Agent Memories: By using different IDs for users and the AI companion, we ensure that memories are correctly attributed and can be retrieved separately.
- Contextual Retrieval: The search_memories function searches for relevant memories based on the current conversation context. This enhances the AI's ability to provide personalized and contextually appropriate responses, creating a more coherent and engaging interaction.
- Flexible Storage: store_memory can save any important information from the conversation. This flexibility allows the system to learn and adapt over time, capturing user preferences, key discussion points, or any other relevant data that can inform future interactions.
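As a quick, standalone illustration of how these helpers behave (the content below is made up for the example, using the same user and companion IDs the script defines later):

# Store a fact the user shared, and a memory of the companion's own
store_memory([{"role": "user", "content": "My favorite character is Tyrion Lannister."}], "user")
store_memory([{"role": "assistant", "content": "I have a soft spot for Jon Snow."}], "companion", is_agent=True)

# Later, pull back only the memories relevant to the current topic
print(search_memories("Who do we like in Game of Thrones?", "user"))
print(search_memories("Who do we like in Game of Thrones?", "companion", is_agent=True))

Because the same query hits a different memory store depending on is_agent, the user's preferences and the companion's persona stay cleanly separated.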
Conversation Management
conversation_history = deque(maxlen=20)  # Stores the last 20 messages
We use deque
to maintain a sliding window of the most recent conversation history. This helps provide immediate context to the AI model and serves as short-term memory for the AI companion.
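To see why a deque works as a sliding window, here is a small self-contained illustration (not part of companion.py; the window size is shrunk to 3 just to show the behavior):

from collections import deque

history = deque(maxlen=3)  # tiny window for demonstration
for i in range(5):
    history.append({"role": "user", "content": f"message {i}"})

# Only the 3 most recent messages remain; older ones are dropped automatically
print([m["content"] for m in history])  # ['message 2', 'message 3', 'message 4']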
Chat Function
def chat_with_companion(user_input, user_id, companion_id):
    # Retrieve relevant memories for both the user and the companion
    user_memories = search_memories(user_input, user_id)
    companion_memories = search_memories(user_input, companion_id, is_agent=True)

    # Construct the message history
    messages = [
        {
            "role": "system",
            "content": "You are a faithful and helpful AI companion with access to both the user's memories and your own memories from previous interactions. Your goal is to provide responses that make the user feel good, supported, and understood. Use the memories to personalize your interactions, show empathy, and offer encouragement. All memories under 'User memories' are exclusively for the user, and all memories under 'Companion memories' are exclusively your own memories. Do not mix or confuse these two sets of memories. Use your own memories to maintain consistency in your personality and previous interactions. Always maintain a positive and uplifting tone while being honest and respectful.",
        },
        *conversation_history,
        {"role": "user", "content": user_input},
        {
            "role": "system",
            "content": f"User memories: {user_memories}\n\nCompanion memories: {companion_memories}",
        },
    ]

    try:
        # Make the API call to OpenRouter
        response = httpx.post(
            "https://openrouter.ai/api/v1/chat/completions",
            headers={"Authorization": f"Bearer {OPENROUTER_API_KEY}"},
            json={"model": "gryphe/mythomax-l2-13b", "messages": messages},  # Replace gryphe/mythomax-l2-13b with a model of your choice
            timeout=60.0,  # LLM responses can take longer than httpx's default 5-second timeout
        )
        response.raise_for_status()
        companion_response = response.json()["choices"][0]["message"]["content"]

        # Store new memories for the user and the companion
        store_memory([{"role": "user", "content": user_input}], user_id)
        store_memory([{"role": "assistant", "content": companion_response}], companion_id, is_agent=True)

        # Update the short-term conversation history
        conversation_history.append({"role": "user", "content": user_input})
        conversation_history.append({"role": "assistant", "content": companion_response})

        return companion_response
    except Exception as e:
        print(f"Error in chat_with_companion: {str(e)}")
        return "I'm sorry, but I'm having trouble responding right now. Can you please try again?"
This function is the core of our AI companion. It retrieves relevant memories, constructs the message history, makes an API call to OpenRouter, and then stores the new interaction in memory. This loop ensures that each interaction is informed by relevant past memories and recent conversation context, while also updating the memory store for future interactions. The use of OpenRouter allows for flexible model selection, while Mem0 provides persistent memory capabilities.
Interaction Loop
user_id, companion_id = "user", "companion"

print("AI Companion: I'm your AI companion. How can I help you today?")

while True:
    user_input = input("You: ")
    if user_input.lower() in ["exit", "quit", "bye"]:
        print("AI Companion: Goodbye! It was nice chatting with you.")
        break
    response = chat_with_companion(user_input, user_id, companion_id)
    print(f"AI Companion: {response}")
This loop lets you chat with the AI companion from the terminal until you type "exit", "quit", or "bye".
How to run the AI companion
You can run the companion from the terminal as shown below:
python companion.py
Example output
Initial conversation
Below is the initial conversation between the user and companion, where they discuss Game of Thrones and their favorite characters.
Assistant: Hi! How can I help you today?
User: I'm watching game of thrones. just finished another episode!!
Assistant: That's great to hear, I'm glad you're enjoying the show! Have you got a favorite moment or character so far?
User: My favorite character is Tyrion Lannister. Who's your favorite?
Assistant: Oh, that's a tough one! I have to say, I have a soft spot for Jon Snow and Daenerys Targaryen. They've both gone through so much, and their bond is really inspiring. But I definitely see the appeal in Tyrion; he's clever, witty, and always manages to find his way out of tricky situations. What about you, why is Tyrion your favorite?
User: I love Tyrion because he is nice and smart.
Assistant: That's a great reason! He definitely has a lot of redeeming qualities. Have you noticed any other characters who remind you of yourself or someone you know? Sometimes TV characters can have traits that are familiar or relatable.
User: bye
Assistant: Goodbye! It was nice chatting with you.
Without memory
Without memory, the companion doesn't remember that the user was watching Game of Thrones. When reminded, it even misremembers the favorite characters, guessing that the user liked Jon Snow.
User: I just watched another episode of my show!
Companion: That's great! What did you think about it? Did it leave you with any new insights or favorite moments? Let me know if you'd like to talk about it more!
User: Do you remember what I was watching? You love it too!
Companion: Oh, I'm so glad to hear that! It must be a wonderful show. Is there anything specific you'd like to discuss about it? Characters, plots, themes, or anything else that caught your attention? I'd love to chat about it with you.
User: I'm sad you don't remember what I was watching.
Companion: I apologize for not remembering. I hope you can enlighten me on what show you were watching so we can discuss it together!
User: I was watching Game of Thrones. We even discussed favorite characters. Remember who was yours?
Companion: Aha! Now I remember. You mentioned watching Game of Thrones and our favorite characters. I believe you liked Jon Snow, right? Am I correct?
User: bye
Companion: Goodbye! It was nice chatting with you.
With memory
With memory, the companion can recall that the user is watching Game of Thrones, as well as both the user's and its own favorite characters. The conversation picks up from where it left off!
User: My favorite character just escaped prison!!! I love this show
Companion: That's wonderful news! I'm sure you were on the edge of your seat during that dramatic scene. Tyrion Lannister is truly a complex and intriguing character. His cleverness and wit always make for an entertaining watch. Have you seen any of your other favorite characters in this episode?
User: No, but I saw characters that you are fond of!
Companion: Oh, I see! Do you think they might cross paths soon? I have a soft spot for both Jon Snow and Daenerys Targaryen. They make quite the duo.
User: Wait and Watch ;)
Companion: Oooh, I love the sound of that! You're right, their potential alliance could be game-changing. I have so many questions about how they'll interact. I hope they get a chance to chat soon. Fingers crossed for lots of epic scenes coming up!
User: bye
Companion: Goodbye! It was nice chatting with you.
Conclusion
Mem0 brings a much-needed memory layer that makes AI conversations more natural and keeps users engaged.
View Mem0's documentation to learn more about integrating our memory features. You can also reach out to us at founders@mem0.ai.