From 4c7fda58d2a135ea7615b91c6220c33acf699132 Mon Sep 17 00:00:00 2001
From: Tobias Fenster
Date: Sun, 1 Feb 2026 19:42:18 +0100
Subject: [PATCH 1/3] Document OpenCode integration for Docker Model Runner
 and WSL troubleshooting

Added OpenCode integration details and configuration instructions. Also
added a troubleshooting hint for WSL
---
 .../ai/model-runner/ide-integrations.md       | 36 +++++++++++++++++++
 1 file changed, 36 insertions(+)

diff --git a/content/manuals/ai/model-runner/ide-integrations.md b/content/manuals/ai/model-runner/ide-integrations.md
index dec313f6a641..cfa00899f507 100644
--- a/content/manuals/ai/model-runner/ide-integrations.md
+++ b/content/manuals/ai/model-runner/ide-integrations.md
@@ -224,6 +224,40 @@ response = llm.complete("Write a hello world function")
 print(response.text)
 ```
 
+## OpenCode
+
+[OpenCode](https://opencode.ai/) is an open-source coding assistant designed to integrate directly into developer workflows. It supports multiple model providers and exposes a flexible configuration system that makes it easy to switch between them.
+
+### Configuration
+
+1. Install OpenCode (s. [docs](https://opencode.ai/docs/#install))
+2. Reference DMR in your OpenCode configuration, either globally at `~/.config/opencode/opencode.json` or project specific with a `opencode.json` file in the root of your project
+   ```
+   {
+     "$schema": "https://opencode.ai/config.json",
+     "provider": {
+       "dmr": {
+         "npm": "@ai-sdk/openai-compatible",
+         "name": "Docker Model Runner",
+         "options": {
+           "baseURL": "http://localhost:12434/v1",
+         },
+         "models": {
+           "qwen-coder3": {
+             "name": "qwen-coder3"
+           },
+           "devstral-small-2": {
+             "name": "devstral-small-2"
+           }
+         }
+       }
+     }
+   }
+   ```
+3. Select the model you want in OpenCode
+
+Your can find more details in [this Docker Blog post](https://www.docker.com/blog/opencode-docker-model-runner-private-ai-coding/)
+
 ## Common issues
 
 ### "Connection refused" errors
@@ -240,6 +274,8 @@ print(response.text)
 
 3. Check if another service is using port 12434.
 
+4. If you run your tool in WSL and want to connect to DMR on the host via `localhost`, this might not directly work. Configuring WSL to use [mirrored networking](https://learn.microsoft.com/en-us/windows/wsl/networking#mirrored-mode-networking) can solve this.
+
 ### "Model not found" errors
 
 1. Verify the model is pulled:

From dfb6dd94f4ebf971491b67550de0a7eeb56620f6 Mon Sep 17 00:00:00 2001
From: Tobias Fenster
Date: Sun, 1 Feb 2026 20:02:09 +0100
Subject: [PATCH 2/3] Apply suggestions from code review

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
---
 .../manuals/ai/model-runner/ide-integrations.md | 14 +++++++-------
 1 file changed, 7 insertions(+), 7 deletions(-)

diff --git a/content/manuals/ai/model-runner/ide-integrations.md b/content/manuals/ai/model-runner/ide-integrations.md
index cfa00899f507..df539b80a9dc 100644
--- a/content/manuals/ai/model-runner/ide-integrations.md
+++ b/content/manuals/ai/model-runner/ide-integrations.md
@@ -230,7 +230,7 @@ print(response.text)
 
 ### Configuration
 
-1. Install OpenCode (s. [docs](https://opencode.ai/docs/#install))
+1. Install OpenCode (see [docs](https://opencode.ai/docs/#install))
 2. Reference DMR in your OpenCode configuration, either globally at `~/.config/opencode/opencode.json` or project specific with a `opencode.json` file in the root of your project
    ```
    {
@@ -240,14 +240,14 @@ print(response.text)
          "npm": "@ai-sdk/openai-compatible",
          "name": "Docker Model Runner",
          "options": {
-           "baseURL": "http://localhost:12434/v1",
+           "baseURL": "http://localhost:12434/v1"
          },
          "models": {
-           "qwen-coder3": {
-             "name": "qwen-coder3"
+           "ai/qwen2.5-coder": {
+             "name": "ai/qwen2.5-coder"
           },
-           "devstral-small-2": {
-             "name": "devstral-small-2"
+           "ai/llama3.2": {
+             "name": "ai/llama3.2"
           }
         }
       }
@@ -256,7 +256,7 @@ print(response.text)
    }
    ```
 3. Select the model you want in OpenCode
 
-Your can find more details in [this Docker Blog post](https://www.docker.com/blog/opencode-docker-model-runner-private-ai-coding/)
+You can find more details in [this Docker Blog post](https://www.docker.com/blog/opencode-docker-model-runner-private-ai-coding/)
 
 ## Common issues

From e4073322cc22b2fbc336ef54f85b69e1f646cee5 Mon Sep 17 00:00:00 2001
From: Tobias Fenster
Date: Sun, 1 Feb 2026 20:02:53 +0100
Subject: [PATCH 3/3] Specify JSON format in OpenCode configuration

Updated configuration section to specify JSON format for OpenCode
configuration.
---
 content/manuals/ai/model-runner/ide-integrations.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/content/manuals/ai/model-runner/ide-integrations.md b/content/manuals/ai/model-runner/ide-integrations.md
index df539b80a9dc..b4d726f4e580 100644
--- a/content/manuals/ai/model-runner/ide-integrations.md
+++ b/content/manuals/ai/model-runner/ide-integrations.md
@@ -232,7 +232,7 @@ print(response.text)
 
 1. Install OpenCode (see [docs](https://opencode.ai/docs/#install))
 2. Reference DMR in your OpenCode configuration, either globally at `~/.config/opencode/opencode.json` or project specific with a `opencode.json` file in the root of your project
-   ```
+   ```json
    {
      "$schema": "https://opencode.ai/config.json",
      "provider": {