4 changes: 2 additions & 2 deletions core/config/onboarding.ts
```diff
@@ -3,8 +3,8 @@ import { ConfigYaml } from "@continuedev/config-yaml";
 export const LOCAL_ONBOARDING_PROVIDER_TITLE = "Ollama";
 export const LOCAL_ONBOARDING_FIM_MODEL = "qwen2.5-coder:1.5b-base";
 export const LOCAL_ONBOARDING_FIM_TITLE = "Qwen2.5-Coder 1.5B";
-export const LOCAL_ONBOARDING_CHAT_MODEL = "llama3.1:8b";
-export const LOCAL_ONBOARDING_CHAT_TITLE = "Llama 3.1 8B";
+export const LOCAL_ONBOARDING_CHAT_MODEL = "qwen3:8b";
+export const LOCAL_ONBOARDING_CHAT_TITLE = "Qwen 3 8B";
 export const LOCAL_ONBOARDING_EMBEDDINGS_MODEL = "nomic-embed-text:latest";
 export const LOCAL_ONBOARDING_EMBEDDINGS_TITLE = "Nomic Embed";
```
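For context, these constants feed the default model entry that local onboarding writes into the user's config. A minimal sketch of how the new defaults might map to a config entry, with the constants inlined and `defaultChatModelEntry` a hypothetical helper (not the module's actual API):

```typescript
// Local-onboarding defaults after this change (inlined for the sketch).
const LOCAL_ONBOARDING_PROVIDER_TITLE = "Ollama";
const LOCAL_ONBOARDING_CHAT_MODEL = "qwen3:8b";
const LOCAL_ONBOARDING_CHAT_TITLE = "Qwen 3 8B";

// Hypothetical helper: build the chat-model entry onboarding might emit.
function defaultChatModelEntry() {
  return {
    name: LOCAL_ONBOARDING_CHAT_TITLE,
    provider: LOCAL_ONBOARDING_PROVIDER_TITLE.toLowerCase(),
    model: LOCAL_ONBOARDING_CHAT_MODEL,
  };
}

console.log(defaultChatModelEntry().model); // "qwen3:8b"
```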

2 changes: 2 additions & 0 deletions core/llm/toolSupport.ts
```diff
@@ -219,6 +219,8 @@ export const PROVIDER_TOOL_SUPPORT: Record<string, (model: string) => boolean> =
       "glm-5",
       "deepseek",
       "dolphin",
+      "gemma3",
+      "phi4",
     ].some((part) => modelName.toLowerCase().includes(part))
   ) {
     return true;
```
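The hunk above extends a substring-based allowlist: a model is treated as tool-capable if its lowercased name contains any listed fragment. A self-contained sketch of that check, with the list abridged to the fragments visible in the diff (the real map in `core/llm/toolSupport.ts` is larger):

```typescript
// Abridged fragment list from the diff; "gemma3" and "phi4" are the additions.
const TOOL_CAPABLE_PARTS = ["glm-5", "deepseek", "dolphin", "gemma3", "phi4"];

// A model supports tools if any fragment appears in its lowercased name.
function supportsTools(modelName: string): boolean {
  return TOOL_CAPABLE_PARTS.some((part) =>
    modelName.toLowerCase().includes(part),
  );
}

console.log(supportsTools("gemma3:4b")); // true
console.log(supportsTools("Phi4-mini")); // true (check is case-insensitive)
console.log(supportsTools("llama2"));    // false
```

Substring matching means every tag of a family (`gemma3:4b`, `gemma3:12b-it`, …) is covered by one entry, at the cost of matching any future model whose name happens to contain a fragment.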
20 changes: 10 additions & 10 deletions docs/customize/model-providers/top-level/ollama.mdx
```diff
@@ -4,13 +4,9 @@ slug: ../ollama
 sidebarTitle: "Ollama"
 ---

-<Tip>
-**Discover Ollama models [here](https://continue.dev/lmstudio)**
-</Tip>
+<Tip>**Discover Ollama models [here](https://continue.dev/lmstudio)**</Tip>

-<Info>
-Get started with [Ollama](https://ollama.com/download)
-</Info>
+<Info>Get started with [Ollama](https://ollama.com/download)</Info>

 ## Configuration

```
```diff
@@ -42,10 +38,12 @@ sidebarTitle: "Ollama"
   }
 ```
 </Tab>
+
 </Tabs>

 <Info>
-**Check out a more advanced configuration [here](https://continue.dev/ollama/qwen3-coder-30b?view=config)**
+**Check out a more advanced configuration
+[here](https://continue.dev/ollama/qwen3-coder-30b?view=config)**
 </Info>
```

## How to Configure Model Capabilities in Ollama
```diff
@@ -85,10 +83,12 @@ Ollama models usually have their capabilities auto-detected correctly. However,
   }
 ```
 </Tab>
+
 </Tabs>

 <Note>
-Many Ollama models support tool use by default. Vision models often also support image input
+Many Ollama models support tool use by default. Vision models often also
+support image input
 </Note>
```
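For reference, the capability override this docs section describes could look like the fragment below in `config.yaml`. Treat it as a sketch: the `capabilities` field with `tool_use` / `image_input` values follows Continue's config.yaml schema as documented, but verify against the current schema before relying on it.

```yaml
models:
  - name: Qwen 3 8B
    provider: ollama
    model: qwen3:8b
    capabilities:
      - tool_use
      - image_input
```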

## Troubleshooting
@@ -99,9 +99,9 @@ Continue may set a higher default context length than other Ollama tools, causin

```diff
 ```yaml title="config.yaml"
 models:
-  - name: Deepseek R1
+  - name: Qwen 3 8B
     provider: ollama
-    model: deepseek-r1:latest
+    model: qwen3:8b
     defaultCompletionOptions:
       contextLength: 2048
 ```
```