Mem0 provides an intelligent, adaptive memory layer for Large Language Models (LLMs), enhancing personalized AI experiences by retaining and utilizing contextual information across diverse applications. This memory capability is crucial for applications ranging from customer support and healthcare diagnostics to autonomous systems and personalized content recommendations, allowing AI to remember user preferences, adapt to individual needs, and continuously improve over time.
Note
The Mem0 repository now also includes the Embedchain project. We continue to maintain and support Embedchain ❤️. You can find the Embedchain codebase in the embedchain directory.
The Mem0 package can be installed directly with pip from the terminal:
pip install mem0ai
Mem0 supports various LLMs; for details, check out the Supported LLMs section of our docs. By default, Mem0 uses gpt-4o, and to use it you need to set your OpenAI API key as an environment variable:
import os
os.environ["OPENAI_API_KEY"] = "sk-xxx"
Now, you can simply initialize the memory.
from mem0 import Memory
m = Memory()
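If you want to use a different model or provider, you can initialize memory from a configuration instead. The snippet below is a minimal sketch of the llm config block described in the Supported LLMs docs; the provider, model, and parameters shown here are illustrative, so check the docs for the exact options your provider accepts.
from mem0 import Memory

# Illustrative LLM configuration; any provider listed under Supported LLMs can be used.
config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.1,
        }
    }
}

m = Memory.from_config(config)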
You can perform the following tasks on the memory:
- Add: add a new memory
- Update: update the memory for a given memory_id
- Search: fetch memories based on a query
- Get: return memories for a certain user/agent/session
- History: describe how a memory has changed over time for a specific memory_id
# 1. Add: Store a memory from any unstructured text
result = m.add("I am working on improving my tennis skills. Suggest some online courses.", user_id="alice", metadata={"category": "hobbies"})
# Created memory --> 'Improving her tennis skills.' and 'Looking for online suggestions.'
# 2. Update: update the memory
result = m.update(memory_id=<memory_id_1>, data="Likes to play tennis on weekends")  # replace <memory_id_1> with a real memory ID (see step 4)
# Updated memory --> 'Likes to play tennis on weekends.' and 'Looking for online suggestions.'
# 3. Search: search related memories
related_memories = m.search(query="What are Alice's hobbies?", user_id="alice")
# Retrieved memory --> 'Likes to play tennis on weekends'
# 4. Get all memories
all_memories = m.get_all()
memory_id = all_memories[0]["id"] # get a memory_id
# All memory items --> 'Likes to play tennis on weekends.' and 'Looking for online suggestions.'
# 5. Get memory history for a particular memory_id
history = m.history(memory_id=<memory_id_1>)
# Logs corresponding to memory_id_1 --> {'prev_value': 'Working on improving tennis skills and interested in online courses for tennis.', 'new_value': 'Likes to play tennis on weekends' }
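In addition to the five operations above, the Memory class also documents helpers for removing memories. The calls below are a sketch based on the API reference at docs.mem0.ai; verify the exact method names and signatures against your installed version.
# 6. Delete: remove a single memory, all memories for a user, or everything
m.delete(memory_id=memory_id)   # delete one memory by its ID
m.delete_all(user_id="alice")   # delete all memories stored for Alice
m.reset()                       # wipe the entire memory store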
Tip
If you are looking for a hosted version and don't want to set up the infrastructure yourself, check out the Mem0 Platform Docs to get started in minutes.
Core features of Mem0 include:
- Multi-Level Memory: User, Session, and AI Agent memory retention
- Adaptive Personalization: Continuous improvement based on interactions
- Developer-Friendly API: Simple integration into various applications
- Cross-Platform Consistency: Uniform behavior across devices
- Managed Service: Hassle-free hosted solution
For detailed usage instructions and API reference, visit our documentation at docs.mem0.ai.
For production environments, you can use Qdrant as a vector store:
from mem0 import Memory
config = {
    "vector_store": {
        "provider": "qdrant",
        "config": {
            "host": "localhost",
            "port": 6333,
        }
    },
}
m = Memory.from_config(config)
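The Qdrant-backed instance exposes the same API as the default one. The example below assumes a Qdrant server is already reachable at the configured host and port; memories are then stored in Qdrant rather than the default store.
# Same operations as before, now persisted in Qdrant
m.add("Prefers vegetarian restaurants", user_id="alice")
related = m.search(query="What food does Alice like?", user_id="alice")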
- Integration with various LLM providers
- Support for LLM frameworks
- Integration with AI Agents frameworks
- Customizable memory creation/update rules
- Hosted platform support
Join our Slack or Discord community for support and discussions. If you have any questions, feel free to reach out to us there.