opencode

AI coding agent, built for the terminal. https://opencode.ai
AI coding agent, built for the terminal.

Note: Version 0.1.x is a full rewrite, and we do not have proper documentation for it yet. We should have it out the week of June 17th, 2025.

[Screenshot: opencode terminal UI]

Installation

# YOLO
curl -fsSL https://opencode.ai/install | bash

# Package managers
npm i -g opencode-ai@latest        # or bun/pnpm/yarn
brew install sst/tap/opencode      # macOS
paru -S opencode-bin               # Arch Linux

Note: If you have a version older than 0.1.x installed, remove it before installing.

Providers

The recommended approach is to sign up for Claude Pro or Max, run opencode auth login, and select Anthropic. It's the most cost-effective way to use opencode.

opencode is powered by the provider list at Models.dev, so you can use opencode auth login to configure API keys for any provider you'd like to use. This is stored in ~/.local/share/opencode/auth.json.

$ opencode auth login

┌  Add credential
│
◆  Select provider
│  ● Anthropic (recommended)
│  ○ OpenAI
│  ○ Google
│  ○ Amazon Bedrock
│  ○ Azure
│  ○ DeepSeek
│  ○ Groq
│  ...
└

The Models.dev dataset is also used to detect common environment variables like OPENAI_API_KEY to autoload that provider.
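For example, assuming you have an OpenAI key, exporting the standard environment variable before launching is enough for the provider to be picked up (the key value below is a placeholder, not a real key):

```shell
# OPENAI_API_KEY is one of the common variables opencode matches
# against the Models.dev dataset to autoload a provider.
# The value below is a placeholder for illustration only.
export OPENAI_API_KEY="sk-example"

# Launching opencode in this shell would now have the OpenAI
# provider available without running `opencode auth login`.
```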

If there are additional providers you want to use, you can submit a PR to the Models.dev repo. If you're configuring a provider just for yourself, check out the Config section below.

Config

Config is optional and can be placed in the root of your repo or globally in ~/.config/opencode/config.json. It can be checked in and shared with your team.

{
  "$schema": "https://opencode.ai/config.json",
  "theme": "opencode",
  "model": "anthropic/claude-sonnet-4-20250514", // format is provider/model
  "autoshare": false,
  "autoupdate": true
}

Keybinds

You can configure custom keybinds; the values listed below are the defaults.

{
  "$schema": "https://opencode.ai/config.json",
  "keybinds": {
    "leader": "ctrl+x",
    "help": "<leader>h",
    "editor_open": "<leader>e",
    "session_new": "<leader>n",
    "session_list": "<leader>l",
    "session_share": "<leader>s",
    "session_interrupt": "esc",
    "session_compact": "<leader>c",
    "tool_details": "<leader>d",
    "model_list": "<leader>m",
    "theme_list": "<leader>t",
    "project_init": "<leader>i",
    "input_clear": "ctrl+c",
    "input_paste": "ctrl+v",
    "input_submit": "enter",
    "input_newline": "shift+enter,ctrl+j",
    "history_previous": "up",
    "history_next": "down",
    "messages_page_up": "pgup",
    "messages_page_down": "pgdown",
    "messages_half_page_up": "ctrl+alt+u",
    "messages_half_page_down": "ctrl+alt+d",
    "messages_previous": "ctrl+alt+k",
    "messages_next": "ctrl+alt+j",
    "messages_first": "ctrl+g",
    "messages_last": "ctrl+alt+g",
    "app_exit": "ctrl+c,<leader>q"
  }
}
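You don't need to copy the entire block to change a single binding. Assuming unlisted keybinds keep their defaults (standard config-merge behavior), a minimal override looks like:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "keybinds": {
    "leader": "ctrl+a"
  }
}
```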

MCP

opencode supports MCP (Model Context Protocol) servers. You can register local and remote servers under the mcp key in your config:

{
  "$schema": "https://opencode.ai/config.json",
  "mcp": {
    "localmcp": {
      "type": "local",
      "command": ["bun", "x", "my-mcp-command"],
      "environment": {
        "MY_ENV_VAR": "my_env_var_value"
      }
    },
    "remotemcp": {
      "type": "remote",
      "url": "https://my-mcp-server.com"
    }
  }
}

Providers

You can use opencode with any provider listed at Models.dev. Be sure to specify the npm package used to load the provider. Note that most popular providers are preloaded from Models.dev.

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama2": {}
      }
    }
  }
}

Contributing

To run opencode locally you need:

  • Bun
  • Golang 1.24.x

To run:

$ bun install
$ bun run packages/opencode/src/index.ts

FAQ

How do I use this with OpenRouter?

OpenRouter is not in the Models.dev database yet, but you can configure it manually.

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "openrouter": {
      "npm": "@openrouter/ai-sdk-provider",
      "name": "OpenRouter",
      "options": {},
      "models": {
        "anthropic/claude-3.5-sonnet": {
          "name": "Claude 3.5 Sonnet"
        }
      }
    }
  }
}

Then, to configure an API key, run opencode auth login and select "Other -> 'openrouter'".

How is this different than Claude Code?

It's very similar to Claude Code in terms of capability. Here are the key differences:

  • 100% open source
  • Not coupled to any provider. Although Anthropic is recommended, opencode can be used with OpenAI, Google, or even local models. As models evolve, the gaps between them will close and pricing will drop, so being provider-agnostic is important.
  • A focus on TUI. opencode is built by neovim users and the creators of terminal.shop; we are going to push the limits of what's possible in the terminal.
  • A client/server architecture. This allows, for example, opencode to run on your computer while you drive it remotely from a mobile app, meaning the TUI frontend is just one of the possible clients.

What about Windows support?

There are some minor problems blocking opencode from working on Windows. We are working on them now; for the time being, you'll need to use WSL.

What's the other repo?

The other confusingly named repo has no relation to this one. You can read the story behind it here.


Join our community: YouTube | X.com