Google Research Debuts Titans And MIRAS Memory Framework
Google Research has introduced two papers, Titans and MIRAS, proposing memory-driven sequence models for extremely long contexts. Titans builds a long-term memory module from a surprise metric, momentum, and adaptive forgetting, while MIRAS frames associative memory as four design choices. In evaluations, Titans scales beyond two million tokens and outperforms larger baselines, including GPT-4, on the BABILong benchmark.
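The surprise-plus-momentum-plus-forgetting idea described for Titans can be illustrated with a minimal sketch. This is not the paper's implementation: the function name, the use of a plain weight matrix as the memory, and the hyperparameter values are all assumptions for illustration; the gradient magnitude stands in for the surprise signal.

```python
import numpy as np

def titans_style_update(M, S, grad, lr=0.1, momentum=0.9, forget=0.05):
    """One memory-update step in the style described for Titans.

    M    -- memory parameters (here just a weight matrix)
    S    -- running surprise state (momentum over past gradients)
    grad -- gradient of an associative-memory loss w.r.t. M; its
            magnitude plays the role of the surprise metric
    lr, momentum, forget -- hypothetical hyperparameters, not taken
            from the paper
    """
    S = momentum * S - lr * grad   # accumulate surprise with momentum
    M = (1.0 - forget) * M + S     # adaptive forgetting, then write
    return M, S

# Toy usage: feed a few random gradients into an initially empty memory.
rng = np.random.default_rng(0)
M = np.zeros((4, 4))
S = np.zeros_like(M)
for _ in range(3):
    grad = rng.normal(size=(4, 4))  # stand-in for a real gradient
    M, S = titans_style_update(M, S, grad)
```

The key design point the summary highlights is that the memory is updated at inference time by how "surprising" the input is, while the forgetting term keeps the memory from saturating over millions of tokens.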


