====== Chainlit ======
**Chainlit** is an open-source Python framework for building production-ready conversational AI applications(([[https://github.com/Chainlit/chainlit|Chainlit GitHub Repository]])). With over **12,000 stars** on GitHub, it lets developers go from a Python script to a polished chat interface in minutes — with real-time streaming, step-by-step observability, file uploads, authentication, and integrations with LangChain, LlamaIndex, and OpenAI.
Chainlit bridges the gap between AI backend logic and user-facing interfaces: its event-driven architecture of Python decorators removes the need for frontend development expertise while still delivering a professional chat experience(([[https://docs.chainlit.io/get-started/overview|Chainlit Official Documentation]])).
===== How It Works =====
Chainlit uses an **event-driven architecture** with Python decorators for chat lifecycle hooks.((DataCamp. "Chainlit Tutorial." [[https://www.datacamp.com/tutorial/chainlit|datacamp.com]])) The core decorators — ''@cl.on_chat_start'' and ''@cl.on_message'' — handle session initialization and message processing. The framework automatically renders a web-based chat UI with streaming support, message history, and step visualization.
A key feature is **Chain of Thought observability**: Chainlit visualizes intermediate AI steps (retrieval, reasoning, tool calls) in the UI, giving users transparency into how the AI reached its answer. This builds trust and aids debugging.
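Intermediate steps can also be surfaced explicitly with the ''@cl.step'' decorator; each decorated call appears as a collapsible step in the UI with its input and output. A minimal sketch — the step name and the dummy lookup logic are illustrative placeholders, not from any official example:

<code python>
import chainlit as cl

@cl.step(type="tool", name="retrieve_documents")
async def retrieve_documents(query: str) -> str:
    # Shown as its own Chain of Thought step in the UI;
    # a real app would query a vector store here (placeholder logic)
    return f"Top documents for: {query}"

@cl.on_message
async def main(message: cl.Message):
    # The step's input/output are visualized before the final answer
    context = await retrieve_documents(message.content)
    await cl.Message(content=f"Answer based on: {context}").send()
</code>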
===== Key Features =====
* **Real-time streaming** — Stream LLM responses token-by-token to the UI
* **Chain of Thought** — Visualize intermediate AI reasoning steps
* **Customizable UI** — Theming, welcome messages, buttons, images, file uploads
  * **Session management** — Persistent chat history via ''cl.user_session''
* **MCP support** — Model Context Protocol for dynamic tool integration
* **Authentication** — Built-in user auth and access control
* **Feedback collection** — Thumbs up/down for model improvement
* **Multi-modal** — Support for images, files, and data visualization
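The built-in authentication hooks from the list above can be sketched with a password callback; the hard-coded credentials below are placeholders for illustration — a real app would check against its own user store:

<code python>
from typing import Optional

import chainlit as cl

@cl.password_auth_callback
def auth_callback(username: str, password: str) -> Optional[cl.User]:
    # Placeholder check — replace with a lookup against your user store
    if (username, password) == ("admin", "admin"):
        return cl.User(identifier="admin", metadata={"role": "admin"})
    return None  # returning None rejects the login
</code>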
===== Installation and Usage =====
<code python>
# Install Chainlit:
#   pip install chainlit
import chainlit as cl
from openai import AsyncOpenAI

client = AsyncOpenAI()

@cl.on_chat_start
async def start():
    # Initialize an empty conversation history for this session
    cl.user_session.set("history", [])
    await cl.Message(content="Hello! How can I help you today?").send()

@cl.on_message
async def main(message: cl.Message):
    history = cl.user_session.get("history")
    history.append({"role": "user", "content": message.content})

    # Send an empty message, then stream tokens into it
    msg = cl.Message(content="")
    await msg.send()

    # Stream response from OpenAI
    stream = await client.chat.completions.create(
        model="gpt-4o",
        messages=history,
        stream=True,
    )
    async for chunk in stream:
        if chunk.choices[0].delta.content:
            await msg.stream_token(chunk.choices[0].delta.content)

    history.append({"role": "assistant", "content": msg.content})
    cl.user_session.set("history", history)
    await msg.update()

# Run with: chainlit run app.py
</code>
<code python>
# LangChain integration example
from langchain_openai import ChatOpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

@cl.on_message
async def langchain_handler(message: cl.Message):
    llm = ChatOpenAI(model="gpt-4o", streaming=True)
    prompt = PromptTemplate.from_template("Answer: {question}")
    chain = LLMChain(llm=llm, prompt=prompt)

    # The callback handler streams intermediate chain steps to the UI
    cb = cl.AsyncLangchainCallbackHandler()
    result = await chain.acall(message.content, callbacks=[cb])
    await cl.Message(content=result["text"]).send()
</code>
===== Architecture =====
<code mermaid>
%%{init: {'theme': 'dark'}}%%
graph TB
    User([User]) -->|Chat Message| WebUI[Chainlit Web UI]
    WebUI -->|WebSocket| Server[Chainlit Server]
    Server -->|@cl.on_chat_start| Init[Session Init]
    Server -->|@cl.on_message| Handler[Message Handler]
    Handler -->|API Call| LLM[LLM Provider]
    LLM -->|Stream Tokens| Handler
    Handler -->|cl.Message| WebUI
    Init -->|cl.user_session| Session[Session Store]
    Handler -->|Steps| COT[Chain of Thought]
    COT -->|Visualization| WebUI
    Handler -->|Callbacks| LC[LangChain]
    Handler -->|Callbacks| LI[LlamaIndex]
    Handler -->|MCP| Tools[External Tools]
    Config[config.toml] -->|Theme + Auth| Server
</code>
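The ''config.toml'' shown in the diagram lives under ''.chainlit/'' in the project directory and is generated on first run. A minimal sketch — the keys below are a small subset of the generated defaults, and the values are illustrative:

<code toml>
[project]
# Disable anonymous usage telemetry
enable_telemetry = false

[UI]
# Name shown in the chat header
name = "My Assistant"
</code>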
===== Comparison with Alternatives =====
^ Feature ^ Chainlit ^ Streamlit ^ Gradio ^ Flask/FastAPI ^
| Time to chat UI | Minutes((Chainlit Official Website. [[https://chainlit.io|chainlit.io]])) | Hours | Hours | Days |
| AI observability | Built-in | None | None | None |
| Streaming | Native | Limited | Limited | Manual |
| Frontend needed | No | No | No | Yes |
| Session management | Built-in | Basic | Basic | Manual |
===== See Also =====
* [[open_interpreter|Open Interpreter — Natural Language Computer Interface]]
* [[goose|Goose — AI Coding Agent by Block]]
* [[arize_phoenix|Arize Phoenix — AI Observability]]
===== References =====