OpenCode is an open-source, terminal-native AI coding agent that provides intelligent code generation, refactoring, debugging, and documentation assistance. With over 129,000 GitHub stars and 800+ contributors, it has become one of the most popular open-source alternatives to proprietary coding assistants.
Multi-Provider Support – Connects to 75+ LLMs via Models.dev including OpenAI GPT-4, Anthropic Claude, Google Gemini, and local models via Ollama/LM Studio
Terminal-Native TUI – Beautiful terminal user interface optimized for power users with session management and real-time feedback
LSP Integration – Auto-loads Language Server Protocol servers (Pyright, rust-analyzer, TypeScript) for real-time code feedback
MCP Support – Model Context Protocol servers for extended context such as GitHub repository access
Plan and Build Modes – Plan mode for analysis/explanation, Build mode for code changes and edits
Multi-Session – Run parallel agents on the same project with shareable session links
GitHub Integration – Trigger via /opencode comments in issues and PRs, executing in GitHub Actions runners
Offline Mode – Full functionality with local models at zero API cost
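To make the multi-provider and offline-mode ideas above concrete, the sketch below shows one way a client could route between hosted and local providers. The class fields, provider names, and pricing figures are illustrative assumptions, not OpenCode's real configuration schema.

```python
from dataclasses import dataclass

# Hypothetical provider catalog; the fields and cost figures are
# made up for illustration, not OpenCode's actual config schema.
@dataclass
class Provider:
    name: str
    model: str
    cost_per_mtok: float  # USD per million input tokens (illustrative)
    local: bool = False

def pick_provider(providers, offline=False):
    """Offline mode restricts the pool to local providers (zero API
    cost); otherwise pick the cheapest remote provider."""
    pool = [p for p in providers if p.local == offline]
    if not pool:
        raise RuntimeError("no matching provider configured")
    return min(pool, key=lambda p: p.cost_per_mtok)

catalog = [
    Provider("anthropic", "claude-sonnet", 3.0),
    Provider("openai", "gpt-4o", 2.5),
    Provider("ollama", "deepseek-coder", 0.0, local=True),
]

print(pick_provider(catalog).name)                # -> openai
print(pick_provider(catalog, offline=True).name)  # -> ollama
```

Selecting on cost is just one possible policy; a real router would also weigh model quality, context-window size, and availability.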
Architecture
OpenCode operates as a modular, provider-agnostic system with several key components:
Core Agent Engine – Handles natural language processing via connected LLMs with context management
LSP Integration Layer – Auto-discovers and loads language-specific servers for real-time code feedback, improving LLM accuracy
MCP Client – Optional remote or local MCP servers for extended context
ACP Support – Agent Client Protocol for standardized editor/IDE communication
TUI Renderer – Terminal user interface with interactive chat, session management, and visualization
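The components above cooperate in a simple way: the LSP layer supplies live diagnostics that the core engine folds into the prompt it sends to the model. The sketch below illustrates that hand-off; the function and field names are assumptions for illustration, not OpenCode's actual interfaces.

```python
# Illustrative sketch of the LSP integration layer's role: fold live
# diagnostics into the prompt sent to the LLM. Names here are
# hypothetical, not OpenCode's real internals.
def build_prompt(user_request, diagnostics):
    """Prepend language-server diagnostics so the model sees
    real-time code feedback alongside the user's request."""
    lines = [f"{d['file']}:{d['line']}: {d['message']}" for d in diagnostics]
    context = "\n".join(lines) or "(no diagnostics)"
    return f"Current LSP diagnostics:\n{context}\n\nTask: {user_request}"

diags = [{"file": "api/routes.py", "line": 12,
          "message": "Argument missing for parameter 'db'"}]
print(build_prompt("fix the failing call", diags))
```

Feeding diagnostics back in this way is what lets the model correct type and syntax errors it would otherwise never see, which is why LSP integration improves LLM accuracy.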
Supported Models
Local/Free – Code Llama, DeepSeek Coder, StarCoder via Ollama or LM Studio
Subscriptions – Direct login for ChatGPT Plus/Pro, GitHub Copilot
Performance scales with model quality: GPT-4 and Claude reach roughly 80–90% of the quality of commercial tools, while open models reach 60–70%.
Code Example
# AGENTS.md configuration example for OpenCode
# Place in your project root to provide context to the agent
#
# Project: MyApp
# Tech Stack: Python 3.12, FastAPI, PostgreSQL
# Testing: pytest with coverage
# Conventions: type hints, PEP 8, docstrings required
#
# Usage:
#   pip install opencode
#   opencode                              # launches TUI
#   opencode "refactor to async/await"    # direct command

import subprocess
import json

def run_opencode_headless(prompt, model="claude-sonnet"):
    result = subprocess.run(
        ["opencode", "--headless", "--model", model, prompt],
        capture_output=True, text=True,
    )
    return json.loads(result.stdout)

output = run_opencode_headless("Add error handling to api/routes.py")
print(output["changes"])
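The headless call above trusts that the command succeeds and prints well-formed JSON. A slightly more defensive variant checks the exit status first; note that the `--headless` and `--model` flags are carried over from the example above and are assumed rather than verified against OpenCode's actual CLI.

```python
import json
import subprocess

def parse_opencode_output(returncode, stdout, stderr):
    """Validate the exit status before trusting stdout as JSON."""
    if returncode != 0:
        raise RuntimeError(f"opencode failed: {stderr.strip()}")
    try:
        return json.loads(stdout)
    except json.JSONDecodeError as exc:
        raise RuntimeError(f"unexpected non-JSON output: {stdout[:200]!r}") from exc

def run_opencode_checked(prompt, model="claude-sonnet", timeout=300):
    # Flags assumed from the example above, not verified against the real CLI.
    result = subprocess.run(
        ["opencode", "--headless", "--model", model, prompt],
        capture_output=True, text=True, timeout=timeout,
    )
    return parse_opencode_output(result.returncode, result.stdout, result.stderr)
```

Separating parsing from process execution also makes the error paths easy to unit-test without the `opencode` binary installed.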