E.g., something like:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "theme": "openai",
  "provider": {
    "vertexai": {
      "npm": "@ai-sdk/google-vertex",
      "options": {
        "project": "my-project-1",
        "location": "us-central1",
        "googleAuthOptions": {
          "keyFile": "creds.json"
        }
      },
      "models": {
        "gemini-2.5-pro": {},
        "gemini-2.0-flash": {},
        "gemini-2.5-flash": {},
        "claude-4.0-sonnet": {},
        "claude-4.0-opus": {}
      }
    }
  }
}
```
And actually this isn't fully accurate: @ai-sdk/google-vertex works for Gemini models, but @ai-sdk/google-vertex/anthropic is needed for Claude models hosted on Vertex.
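One way that split could look is two separate provider entries, one per SDK entry point. This is only a sketch: the provider keys `vertexai-gemini`/`vertexai-anthropic` are made up, and it assumes opencode's loader can resolve the `/anthropic` subpath export in the `npm` field, which I haven't verified:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "vertexai-gemini": {
      "npm": "@ai-sdk/google-vertex",
      "options": { "project": "my-project-1", "location": "us-central1" },
      "models": {
        "gemini-2.5-pro": {},
        "gemini-2.5-flash": {}
      }
    },
    "vertexai-anthropic": {
      "npm": "@ai-sdk/google-vertex/anthropic",
      "options": { "project": "my-project-1", "location": "us-central1" },
      "models": {
        "claude-4.0-sonnet": {},
        "claude-4.0-opus": {}
      }
    }
  }
}
```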
Alternatively, what I did for now is simply use my litellm proxy gateway, running in a separate process, with the following config:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "theme": "openai",
  "provider": {
    "ollama": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://localhost:4000/v1",
        "apiKey": "sk-demo-1234"
      },
      "models": {
        "gemini-2.5-pro": {},
        "gemini-2.0-flash": {},
        "gemini-2.5-flash": {},
        "claude-4.0-sonnet": {},
        "claude-4.0-opus": {},
        "claude-3.7-sonnet": {},
        "o4-mini-high": {},
        "o4-mini": {},
        "o3": {}
      }
    }
  }
}
```
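For anyone reproducing the proxy approach, the litellm side of this setup is roughly as below. This is a sketch: the project, location, and Vertex model identifiers are placeholders, so check litellm's Vertex AI docs for the exact `model` strings your deployment expects:

```yaml
# config.yaml for the litellm proxy, started with:
#   litellm --config config.yaml --port 4000
model_list:
  - model_name: gemini-2.5-pro          # name opencode sees via the OpenAI-compatible API
    litellm_params:
      model: vertex_ai/gemini-2.5-pro   # placeholder; use the identifier your Vertex project exposes
      vertex_project: my-project-1
      vertex_location: us-central1
  - model_name: claude-4.0-sonnet
    litellm_params:
      model: vertex_ai/claude-sonnet-4  # placeholder; Claude-on-Vertex IDs often carry a version suffix
      vertex_project: my-project-1
      vertex_location: us-central1
```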
The only problem with this is that you don't get token/cost tracking data. I think a fair number of people route models through a proxy like this, so it could be nice to have some way to flag that the model you're passing through is the "official" model from models.dev?