Langfuse is an open-source LLM observability platform that provides tracing, evaluation, prompt management, and cost tracking for production LLM applications. With over 24,000 GitHub stars and an MIT license, it has become the leading open-source choice for monitoring and debugging AI applications in production.
The v4 architecture shifted to an observations-first model: traces act as correlation IDs (much like a session_id) rather than top-level entities, and immutable spans are ingested via OpenTelemetry (OTel) protocols.
```mermaid
graph TB
    subgraph Apps["Instrumented Applications"]
        App1[Python App + SDK]
        App2[JS/TS App + SDK]
        App3[OpenTelemetry]
        App4[LiteLLM Gateway]
    end
    subgraph Ingestion["Ingestion Layer"]
        Queue[Redis + BullMQ]
        Batch[Micro-Batch Processor]
    end
    subgraph Storage["Storage Layer"]
        PG[(PostgreSQL - Transactional)]
        CH[(ClickHouse - Traces/Spans)]
    end
    subgraph Features["Feature Layer"]
        Trace[Trace Explorer]
        Eval[Evaluation Engine]
        Prompt[Prompt Manager]
        Cost[Cost Dashboard]
        Metrics[Metrics and Analytics]
    end
    subgraph UI["Web Dashboard"]
        Dashboard[Dashboard Views]
        Filters[Saved Filters]
        Graphs[Agent Graphs]
    end
    Apps --> Ingestion
    Queue --> Batch
    Batch --> Storage
    Storage --> Features
    Features --> UI
```
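The queue-then-micro-batch path in the diagram can be sketched as follows. This is an illustrative stand-in, not Langfuse's actual worker (which is built on Redis and BullMQ in TypeScript): events are buffered and flushed in bulk once the batch fills or a wait interval elapses, which is how high-volume span writes are amortized into efficient bulk inserts.

```python
import time

class MicroBatcher:
    """Buffers queued events and flushes them in batches, either when
    the batch is full or when the flush interval has elapsed."""

    def __init__(self, flush_fn, max_batch=100, max_wait_s=1.0):
        self.flush_fn = flush_fn
        self.max_batch = max_batch
        self.max_wait_s = max_wait_s
        self.buffer = []
        self.last_flush = time.monotonic()

    def enqueue(self, event: dict) -> None:
        self.buffer.append(event)
        if (len(self.buffer) >= self.max_batch
                or time.monotonic() - self.last_flush >= self.max_wait_s):
            self.flush()

    def flush(self) -> None:
        if self.buffer:
            self.flush_fn(self.buffer)  # e.g. one bulk INSERT into ClickHouse
            self.buffer = []
        self.last_flush = time.monotonic()

batches = []
batcher = MicroBatcher(batches.append, max_batch=3)
for i in range(7):
    batcher.enqueue({"span_id": f"s{i}"})
batcher.flush()  # drain the remainder
```

Batching like this matters for ClickHouse in particular, which performs far better with few large inserts than with many single-row writes.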
## Tracing Capabilities
Langfuse captures the full request lifecycle with rich detail:
- **LLM Operations** – Inputs, outputs, latency, token usage, model parameters