====== Open Interpreter ======

**Open Interpreter** is an open-source natural language computer interface that lets large language models run code locally on your machine.((https://github.com/OpenInterpreter/open-interpreter))

Unlike cloud-based code interpreters, which run in locked-down sandboxes, Open Interpreter executes code directly in your local environment, giving the model full access to the internet, the filesystem, and your installed packages.
===== How It Works =====

Open Interpreter splits into two core components: a **Core** execution engine and a **Terminal Interface**. The Core provides a real-time code execution environment in which an LLM controls the computer through an ''interpreter'' object, while the Terminal Interface wraps it with safety measures such as asking for confirmation before any code runs.

The system supports any LLM provider through LiteLLM, including OpenAI, Anthropic, local models via LM Studio, and dozens more. Conversations can be persisted, restored, and run asynchronously.
===== Key Features =====

  * **Local execution** — no sandbox restrictions; full access to the local system and the internet
  * **Multi-language support** — Python, JavaScript, Bash, and more
  * **GUI and vision control** — interact with the desktop GUI and analyze images
  * **LLM flexibility** — any provider via LiteLLM (OpenAI, Anthropic, local models)
  * **Chat persistence** — save and restore conversation sessions
  * **AGPL-3.0 licensed** — fully open-source with 100+ contributors((https://github.com/OpenInterpreter/open-interpreter))
===== Installation and Usage =====

<code python>
# Install Open Interpreter:
#   pip install open-interpreter

# Python API usage
from interpreter import interpreter

# Basic conversation (example prompt)
interpreter.chat("Plot AAPL and META's normalized stock prices")

# Configure the model (any LiteLLM model name works)
interpreter.model = "gpt-4"

# Use with Anthropic
interpreter.model = "claude-3-5-sonnet-20240620"
interpreter.chat("What files are on my desktop?")

# Local/offline usage against an OpenAI-compatible server;
# the endpoint below is LM Studio's default, the model name a placeholder
interpreter.offline = True
interpreter.llm.model = "openai/local-model"
interpreter.llm.api_base = "http://localhost:1234/v1"
interpreter.llm.context_window = 3000
interpreter.llm.max_tokens = 1000
interpreter.chat()  # start an interactive session
</code>
===== Architecture =====

<mermaid>
%%{init: {'theme': 'neutral'}}%%
graph TB
    User([User]) --> TI[Terminal Interface]
    TI --> LM[LiteLLM]
    TI --> Chat[Chat Persistence]
    LM -->|API Calls| OpenAI[OpenAI API]
    LM -->|API Calls| Anthropic[Anthropic API]
    LM -->|API Calls| Local[Local Models]
    TI -->|Code Request| Core[Core Engine]
    Core --> Py[Python]
    Core --> JS[JavaScript]
    Core --> Sh[Bash / Shell]
    Core --> GUI[GUI Control]
    Core -->|File Access| FS[Local Filesystem]
    Core --> Net[Internet]
</mermaid>
===== Use Cases =====

  * **Data analysis** — process CSV and Excel files or databases with natural language queries
  * **File management** — organize, rename, and convert files across directories
  * **Web automation** — control browsers, scrape data, interact with APIs
  * **Media editing** — resize images, convert formats, edit video metadata
  * **System administration** — monitor processes, manage services, configure systems
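
The file-management case shows what this looks like in practice: a prompt such as "rename all my .txt files to .md" would have the model write and run an ordinary script. A hypothetical example of such generated code (not actual library output):

<code python>
import tempfile
from pathlib import Path

def rename_extension(directory, old_ext, new_ext):
    """Rename every file with old_ext in directory to use new_ext."""
    renamed = []
    for path in Path(directory).glob(f"*{old_ext}"):
        target = path.with_suffix(new_ext)
        path.rename(target)
        renamed.append(target.name)
    return sorted(renamed)

# Demonstrate on a throwaway directory instead of real user files.
workdir = Path(tempfile.mkdtemp())
for name in ("notes.txt", "todo.txt"):
    (workdir / name).touch()
print(rename_extension(workdir, ".txt", ".md"))  # ['notes.md', 'todo.md']
</code>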
| - | |||
| - | |||
| - | ===== See Also ===== | ||
| - | |||
| - | * [[goose|Goose — AI Coding Agent by Block]] | ||
| - | * [[continue_dev|Continue.dev — Open-Source AI Code Assistant]] | ||
| - | * [[chainlit|Chainlit — Conversational AI Framework]] | ||
| - | |||
| - | ===== References ===== | ||