Lyzr Cognis: Open-Source Memory Architecture for Conversational AI
Lyzr Cognis is a unified memory architecture for conversational AI agents that addresses the lack of persistent memory in LLM agents. It uses a multi-stage retrieval pipeline combining OpenSearch BM25 keyword matching with Matryoshka vector similarity search, merged via Reciprocal Rank Fusion. A context-aware ingestion pipeline retrieves existing memories before extraction, enabling intelligent version tracking. Temporal boosting improves time-sensitive queries, and a BGE-2 cross-encoder reranker refines the final results. Evaluated on the LoCoMo and LongMemEval benchmarks across eight answer generation models, the system achieves state-of-the-art performance. It is open-source and deployed in production.
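Reciprocal Rank Fusion merges the two rankings by summing, for each document, the reciprocal of its rank in each list. A minimal sketch follows; the constant `k = 60` comes from the original RRF paper and is an assumption here, not a documented Lyzr Cognis setting.

```python
# Sketch of Reciprocal Rank Fusion (RRF) over two ranked lists, e.g. a
# BM25 keyword ranking and a dense vector-similarity ranking.
def rrf_fuse(rankings, k=60):
    """rankings: list of ranked doc-id lists (best first). Returns fused order."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            # each list contributes 1 / (k + rank) to the document's score
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

bm25 = ["m1", "m2", "m3"]    # keyword (BM25) ranking
dense = ["m1", "m3", "m4"]   # vector-similarity ranking
print(rrf_fuse([bm25, dense]))  # m1 leads; m3, found by both, beats m2
```

Note how `m3`, which appears in both rankings, is fused above `m2`, which appears in only one: agreement between retrievers outweighs a single high rank.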
Key facts
- Lyzr Cognis is a unified memory architecture for conversational AI agents.
- It addresses the lack of persistent memory in LLM agents.
- Uses a multi-stage retrieval pipeline with OpenSearch BM25 and Matryoshka vector similarity search.
- Fusion via Reciprocal Rank Fusion.
- Context-aware ingestion pipeline retrieves existing memories before extraction.
- Enables intelligent version tracking that preserves full memory history.
- Temporal boosting enhances time-sensitive queries.
- BGE-2 cross-encoder reranker refines final result quality.
- Evaluated on LoCoMo and LongMemEval benchmarks across eight answer generation models.
- Achieves state-of-the-art performance on both benchmarks.
- System is open-source and deployed in production.
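One common way to realize temporal boosting is to scale each memory's retrieval score by an exponential recency decay; the sketch below assumes that multiplicative form and a 30-day half-life, neither of which is specified by the source.

```python
# Hypothetical temporal-boosting sketch: newer memories get a score bonus
# that decays exponentially with age. Half-life and the 1 + decay form
# are illustrative assumptions, not documented Lyzr Cognis parameters.

def temporal_boost(score, age_days, half_life_days=30.0):
    decay = 0.5 ** (age_days / half_life_days)   # 1.0 when fresh, -> 0 with age
    return score * (1.0 + decay)                 # up to 2x for brand-new memories

# Same base relevance, different ages: the recent memory ranks higher.
recent = temporal_boost(0.8, age_days=1)
old = temporal_boost(0.8, age_days=365)
print(recent > old)  # True
```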
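The context-aware ingestion step can be pictured as: before writing an extracted memory, look up any existing memory for the same key, and append a new version rather than overwriting, so the full history survives. The class and field names below are hypothetical illustrations, not the Lyzr Cognis API.

```python
# Illustrative sketch of context-aware ingestion with version tracking.
from datetime import datetime, timezone

class MemoryStore:
    def __init__(self):
        self._store = {}  # key -> list of version records, oldest first

    def ingest(self, key, value):
        versions = self._store.setdefault(key, [])
        # context-aware step: consult existing memory before extraction/write
        if versions and versions[-1]["value"] == value:
            return versions[-1]  # unchanged fact: no new version created
        record = {"value": value,
                  "version": len(versions) + 1,
                  "ts": datetime.now(timezone.utc)}
        versions.append(record)  # append, never overwrite: history preserved
        return record

    def latest(self, key):
        return self._store[key][-1]["value"]

    def history(self, key):
        return [v["value"] for v in self._store[key]]

store = MemoryStore()
store.ingest("user.city", "Paris")
store.ingest("user.city", "Berlin")  # supersedes Paris but keeps it in history
```

A query for `user.city` answers with the latest value, while time-scoped questions ("where did the user live before?") can still be answered from the retained versions.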
Entities
Institutions
- Lyzr
Technologies
- OpenSearch
- Matryoshka
- BGE-2
Benchmarks
- LoCoMo
- LongMemEval