docs: use ollama example

Dax Raad 2025-06-14 18:55:39 -04:00
parent fa1266263d
commit 2ea0399aa7
2 changed files with 12 additions and 15 deletions

@@ -82,17 +82,16 @@ You can use opencode with any provider listed at [here](https://ai-sdk.dev/provi
 ```json title="opencode.json"
 {
-  "$schema": "http://opencode.ai/config.json",
+  "$schema": "https://opencode.ai/config.json",
   "provider": {
     "@ai-sdk/openai-compatible": {
-      "name": "MySpecialProvider",
+      "name": "ollama",
       "options": {
-        "apiKey": "xxx",
-        "baseURL": "https://api.provider.com/v1"
+        "baseURL": "http://localhost:11434/v1"
       },
       "models": {
-        "my-model-name": {
-          "name": "My Model Name"
+        "llama2": {
+          "name": "llama2"
         }
       }
     }
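The `baseURL` in the new example is Ollama's OpenAI-compatible endpoint. As a rough sketch of the local setup the example assumes (an Ollama install with the `llama2` model pulled), not something covered by this diff:

```sh
# Assumed local setup for the example above (not part of this change):
# fetch the llama2 model, then start Ollama, which serves an
# OpenAI-compatible API at http://localhost:11434/v1
ollama pull llama2
ollama serve
```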
@@ -117,9 +116,9 @@ $ bun run src/index.ts
 ### FAQ
-#### How do I use this with OpenRouter
+#### How do I use this with OpenRouter?
-Theoretically you can use this with OpenRouter with config like this
+OpenRouter is not yet in the models.dev database but you can configure it manually.
 ```json title="opencode.json"
 {
@@ -139,5 +138,3 @@ Theoretically you can use this with OpenRouter with config like this
 }
 }
 ```
-However we are using [ai-sdk v5](https://ai-sdk.dev) which OpenRouter does not support yet. The moment they do this will work
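The manual OpenRouter config block referenced by this hunk is truncated above. As a hedged sketch only, mirroring the OpenRouter setup removed from opencode.json below (the provider package, key, and model name are taken from those removed lines, not from this hunk), it might look like:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "@openrouter/ai-sdk-provider": {
      "name": "OpenRouter",
      "options": {
        "apiKey": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
      },
      "models": {
        "anthropic/claude-3.5-sonnet": {
          "name": "claude-3.5-sonnet"
        }
      }
    }
  }
}
```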

@@ -1,14 +1,14 @@
 {
   "$schema": "https://opencode.ai/config.json",
   "provider": {
-    "@openrouter/ai-sdk-provider": {
-      "name": "OpenRouter",
+    "@ai-sdk/openai-compatible": {
+      "name": "ollama",
       "options": {
-        "apiKey": "sk-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
+        "baseURL": "http://localhost:11434/v1"
       },
       "models": {
-        "anthropic/claude-3.5-sonnet": {
-          "name": "claude-3.5-sonnet"
+        "llama2": {
+          "name": "llama2"
         }
       }
     }
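For reference, after this change the example opencode.json should read roughly as follows (the trailing closing braces fall outside the hunk's context and are reconstructed here):

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "@ai-sdk/openai-compatible": {
      "name": "ollama",
      "options": {
        "baseURL": "http://localhost:11434/v1"
      },
      "models": {
        "llama2": {
          "name": "llama2"
        }
      }
    }
  }
}
```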