**CogMem** is a cognitively inspired, memory-augmented LLM architecture that supports sustained iterative reasoning through structured, persistent memory. Introduced by Zhang et al. (2025), CogMem addresses the fundamental limitation that LLMs excel at single-turn reasoning but lose accuracy and coherence over extended multi-turn interactions due to reasoning bias, task drift, and hallucination.
| + | |||
| + | < | ||
| + | graph TD | ||
| + | A[User Input] --> B[Focus of Attention] | ||
| + | B --> C[Direct Access Memory] | ||
| + | C --> D[Long-Term Memory] | ||
| + | D -.-> | ||
| + | C -.-> | ||
| + | B --> E[LLM Reasoning] | ||
| + | E --> F[Response] | ||
| + | E --> | ||
| + | C --> | ||
| + | </ | ||
===== The Multi-Turn Reasoning Problem =====