docs: Add ollama configuration example

This commit is contained in:
a 2025-06-19 21:59:00 -04:00
parent 6674c6083a
commit 34d68b777a


@ -57,6 +57,30 @@ To configure a local model, specify the npm package to use and the `baseURL`.
}
```
#### Example: Ollama
Requirements:
- You must set up an API key for Ollama to appear as a provider.
- The model name must include the variant tag, e.g. `latest`, `7b`, etc.
- Each model's JSON value must be an empty object (`{}`).
```json title="opencode.json" {5,7}
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama3.2:latest": {}
      }
    }
  }
}
```
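As a quick sanity check on the requirements above, a short script can verify that each model name carries a variant tag and maps to an empty object. This is an illustrative sketch, not part of opencode itself:

```python
import json

# Minimal opencode.json fragment for Ollama (mirrors the example above).
config_text = """
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": { "baseURL": "http://localhost:11434/v1" },
      "models": { "llama3.2:latest": {} }
    }
  }
}
"""

config = json.loads(config_text)
models = config["provider"]["ollama"]["models"]

# Each model name must include a variant tag after the colon (e.g. ":latest"),
# and its value must be an empty object.
for name, value in models.items():
    assert ":" in name, f"model {name!r} is missing a variant tag"
    assert value == {}, f"model {name!r} must map to an empty object"

print("config OK:", list(models))
```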
---
## Select a model