From RAG to Memory: Non-Parametric Continual Learning for Large Language Models

Bernal Jiménez Gutiérrez, Yiheng Shu, Weijian Qi, Sizhe Zhou, Yu Su · February 20, 2025

Summary

HippoRAG 2 is a framework that brings retrieval-augmented generation (RAG) for large language models closer to human long-term memory, outperforming standard RAG pipelines and strong embedding models on factual, sense-making, and associative memory tasks. It builds on HippoRAG, which uses an LLM to construct a knowledge graph offline and the Personalized PageRank algorithm to retrieve over it, and enhances that recipe with deeper passage integration and more effective online use of the LLM. As a non-parametric continual learning method, it acquires new knowledge by updating its index rather than model weights, works flexibly with different retrievers and LLMs, and advances the goal of human-like non-parametric continual learning systems.
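To make the retrieval idea concrete, below is a minimal sketch, not the authors' implementation, of the HippoRAG-style loop the summary describes: an LLM extracts triples offline into a knowledge graph, and at query time Personalized PageRank is seeded on query-matched nodes to rank source passages. The triples, the passage mapping, and the exact-match seeding here are illustrative assumptions; HippoRAG-style systems match the query to graph nodes with embeddings, and HippoRAG 2 further integrates the passages themselves into the graph.

```python
# Minimal sketch of knowledge-graph retrieval with Personalized PageRank.
# Not the authors' code: triples, passages, and seeding are illustrative.
import networkx as nx

# Hypothetical triples an LLM might extract during offline indexing.
triples = [
    ("Thomas", "professor_at", "Stanford"),
    ("Stanford", "located_in", "California"),
    ("Thomas", "researches", "Alzheimer's"),
]

# Each graph node remembers which passage id(s) it was extracted from.
node_to_passages = {
    "Thomas": {0},
    "Stanford": {0, 1},
    "California": {1},
    "Alzheimer's": {2},
}

G = nx.Graph()
for subj, rel, obj in triples:
    G.add_edge(subj, obj, relation=rel)

# Query time: seed PPR on nodes the query mentions (exact match here,
# embedding similarity in practice), biasing the walk toward them.
seeds = {n: 1.0 for n in ("Thomas", "Alzheimer's")}
scores = nx.pagerank(G, alpha=0.85, personalization=seeds)

# Aggregate node scores into passage scores and rank passages.
passage_scores = {}
for node, score in scores.items():
    for pid in node_to_passages.get(node, ()):
        passage_scores[pid] = passage_scores.get(pid, 0.0) + score

ranking = sorted(passage_scores, key=passage_scores.get, reverse=True)
print("Passage ranking:", ranking)
```

Seeding the random walk on query nodes is what gives this family of methods its associative flavor: score mass flows along graph edges to multi-hop neighbors, so a passage connected only indirectly to the query can still rank highly, which a flat embedding search would miss.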
