Core Concepts
Reasoning
Memory & Retrieval
Agent Types
Design Patterns
Training & Alignment
Frameworks
Tools
Safety & Security
Evaluation
Meta
This page redirects to the main article. For the full treatment of State Space Models, their evolution from S4 through Mamba and Mamba-2, comparisons with Transformers, and hybrid architectures, see:
State Space Models (SSMs) are a family of sequence modeling architectures that process sequential data through a fixed-size hidden state updated via linear dynamics. Because the state does not grow with sequence length, they scale linearly, O(n), in the number of tokens, versus the quadratic O(n^2) cost of Transformer self-attention, making them compelling for long-context and resource-constrained applications. Key models include S4, Mamba, Mamba-2, Jamba, RWKV, and Griffin/RecurrentGemma.
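The fixed-size-state recurrence described above can be sketched as a toy linear SSM scan. This is an illustrative sketch only: the function name `ssm_scan` and the matrices A, B, C below are hypothetical stand-ins, not taken from S4, Mamba, or any specific model (real SSMs use structured/input-dependent parameterizations and parallel scans).

```python
import numpy as np

def ssm_scan(A, B, C, xs):
    """Toy linear SSM: h_t = A h_{t-1} + B x_t, y_t = C h_t.

    One pass over the sequence (O(n) in sequence length);
    the hidden state h stays fixed-size regardless of n.
    """
    h = np.zeros(A.shape[0])      # fixed-size hidden state
    ys = []
    for x in xs:                  # single linear-time scan
        h = A @ h + B @ x         # linear state update
        ys.append(C @ h)          # linear readout
    return np.stack(ys)

# Hypothetical dimensions for illustration.
rng = np.random.default_rng(0)
d_state, d_in, d_out, n = 4, 2, 3, 8
A = 0.9 * np.eye(d_state)                  # stable (contracting) dynamics
B = rng.standard_normal((d_state, d_in))
C = rng.standard_normal((d_out, d_state))
xs = rng.standard_normal((n, d_in))
ys = ssm_scan(A, B, C, xs)
print(ys.shape)  # (8, 3)
```

Note how memory cost is set by `d_state`, not by the sequence length `n`; doubling `n` doubles compute but leaves the state unchanged, which is the contrast with attention's quadratic token-to-token interaction.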