
Conversation

@johnsaigle commented Jun 20, 2025

Tested this locally and got Ollama to work via opencode run "test prompt".

The "Requirements" section describe the current behaviour of the program, but these might in fact be bugs, (see #227). I'm happy to update the PR if these instructions aren't correct.

@semidark commented Jul 1, 2025

If Opencode uses ai-sdk, as stated here in the documentation, shouldn’t we be able to use the Ollama API directly, without the need for the OpenAI API layer?

See: https://ai-sdk.dev/providers/community-providers/ollama

I tried it with the following example (see the ollama part of the config), but it is not working yet; only the ollama-v1 part of the config works.

{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "ollama-v1": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "http://ollama.lan:11434/v1"
      },
      "models": {
        "qwen3-32b:MAX": {
          "name": "Qwen3 32b - Max",
          "tool_call": true,
          "reasoning": true,
          "limit": {
            "context": 22000,
            "output": 8192
          }
        },
        "qwen3-14b:MAX": {
          "name": "Qwen3 14b - Max",
          "tool_call": true,
          "reasoning": true,
          "limit": {
            "context": 40960,
            "output": 8192
          }
        }
      }
    },
    "ollama": {
      "npm": "@ai-sdk/ollama-ai-provider",
      "options": {
        "baseURL": "http://ollama.lan:11434"
      },
      "models": {
        "qwen3-32b:MAX": {
          "name": "Qwen3 32b - Max"
        }
      }
    }
  }
}
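
One thing that might be worth checking: the ai-sdk community provider for Ollama appears to be published on npm as ollama-ai-provider (without the @ai-sdk scope), and it talks to Ollama's native API (default base URL ending in /api) rather than the /v1 OpenAI-compatible endpoint. If that's right, the second block would presumably need to look more like this (package name and /api suffix are taken from the provider's docs, untested here):

{
  "ollama": {
    "npm": "ollama-ai-provider",
    "options": {
      "baseURL": "http://ollama.lan:11434/api"
    },
    "models": {
      "qwen3-32b:MAX": {
        "name": "Qwen3 32b - Max"
      }
    }
  }
}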

@thdxr self-assigned this Jul 2, 2025
@johnsaigle (Author) commented

Closing this because it's probably very stale

@johnsaigle closed this Jul 15, 2025
m-pa pushed a commit to m-pa/opencode that referenced this pull request Dec 4, 2025