====== Streamlit for AI ======

**Streamlit** is an open-source Python framework that turns Python scripts into interactive web applications. It is widely used for building AI and machine learning applications, offering chat UI components, session state management, and straightforward LLM integration without requiring frontend expertise.((source [[https://streamlit.io|Streamlit]]))

===== Overview =====

Streamlit scripts execute top to bottom like plain Python, but any widget interaction triggers an automatic rerun of the whole script, which is what makes apps reactive.((source [[https://docs.streamlit.io/get-started/fundamentals/main-concepts|Streamlit Main Concepts]]))

Key characteristics:

  * **Python-only** -- no HTML, CSS, or JavaScript required
  * **Reactive execution** -- the full script reruns on every user interaction
  * **Live reload** -- instant previews on code changes
  * **Rich ecosystem** -- integrates with Pandas, NumPy, Matplotlib, Plotly, TensorFlow, scikit-learn

===== Chat UI Components =====

Streamlit provides dedicated components for conversational AI applications:((source [[https://thirdeyedata.ai/technologies/streamlit-ui|Streamlit UI - ThirdEyeData]]))

  * **st.chat_input** -- renders a message box at the bottom of the app and returns the user's message string on submission
  * **st.chat_message** -- displays messages in chat bubbles styled as "user" or "assistant", supporting markdown, images, and multimedia
  * **st.status** -- shows processing status for long-running LLM calls
  * **st.write_stream** -- streams token-by-token output for real-time LLM responses

These components enable building ChatGPT-like interfaces with minimal code.

===== Session State =====

By default, Streamlit is stateless: every rerun starts the script from scratch.
''st.session_state'' persists data across interactions:((source [[https://docs.streamlit.io/get-started/fundamentals/main-concepts|Streamlit Main Concepts]]))

  * Mutable, dictionary-like object persisted across reruns
  * Essential for maintaining chat histories (''st.session_state.messages'')
  * Binds to widget state via the ''key'' parameter
  * Supports callbacks for complex state logic

===== LLM Integration =====

Streamlit pairs naturally with LLMs via their Python libraries:((source [[https://aiagentskit.com/blog/streamlit-ai-tutorial/|Streamlit AI Tutorial - AI Agents Kit]]))

  * **Direct API calls** -- OpenAI, Anthropic, Google, and Hugging Face APIs
  * **LangChain integration** -- chains, agents, and memory management
  * **LiteLLM** -- unified interface across multiple providers
  * **Local models** -- [[ollama|Ollama]] integration for private inference
  * **RAG patterns** -- file uploads combined with vector search for retrieval-augmented generation

Common patterns include parameter sliders for prompt tuning, file uploads for document QA, and real-time visualization of model outputs.
===== Deployment =====

  * **Streamlit Community Cloud** -- free GitHub-integrated hosting driven by ''requirements.txt''((source [[https://techifysolutions.com/blog/introduction-to-streamlit/|Streamlit Introduction - Techify]]))
  * **Docker** -- containerize and start with ''streamlit run app.py %%--%%server.port 8501''
  * **Self-hosted** -- any server that runs Python
  * **Snowflake** -- managed Streamlit in Snowflake environments with data write-back
  * **Cloud platforms** -- AWS, GCP, Azure, Heroku

===== Streamlit vs Gradio =====

^ Aspect ^ Streamlit ^ Gradio ^
| Focus | Data dashboards, full apps, reactive UIs | ML demos, shareable model interfaces |
| State | Robust ''st.session_state'' for multi-turn apps | Basic state via Blocks, less flexible |
| Layout | Columns, sidebars, themes, expanders | Rows and columns via the Blocks API |
| Deployment | Community Cloud, Docker, Snowflake | Hugging Face Spaces (free and easy) |
| Best for | Iterative AI tools, data exploration | Quick model demos, HF integration |

Both are popular, but Streamlit leads in data/AI prototyping, while [[gradio|Gradio]] excels at quickly sharing models on [[hugging_face|Hugging Face]].

===== See Also =====

  * [[gradio|Gradio]]
  * [[hugging_face|Hugging Face]]
  * [[vercel_ai_sdk|Vercel AI SDK]]

===== References =====