feat(databricks): update model YAMLs [bot] #367
Conversation
Cursor Bugbot has reviewed your changes and found 3 potential issues.
Bugbot Autofix is OFF. To automatically fix reported issues with cloud agents, enable autofix in the Cursor dashboard.
```diff
- max_input_tokens: 131072
- max_output_tokens: 131072
- max_tokens: 131072
  context_window: 128000
```
Token limit fields accidentally removed from gpt-oss-20b model
High Severity
The max_input_tokens, max_output_tokens, and max_tokens fields (previously 131072) were removed entirely instead of being updated. The limits section now only contains context_window: 128000. The sibling model databricks-gpt-oss-120b.yaml correctly retained all three fields (updated to 128000). Any system reading these token limits will get no value, likely causing errors or falling back to incorrect defaults.
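A minimal sketch of the corrected `limits` block, assuming the three fields should have been kept and updated to the new 128000 window (mirroring the sibling `databricks-gpt-oss-120b.yaml` rather than anything confirmed for this file):

```yaml
# Hypothetical corrected limits for databricks-gpt-oss-20b.yaml:
# keep all three token-limit fields and update them to 128000
# instead of deleting them outright.
limits:
  context_window: 128000
  max_input_tokens: 128000
  max_output_tokens: 128000
  max_tokens: 128000
```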
```yaml
  max_input_tokens: 200000
  max_output_tokens: 32000
  max_tokens: 32000
modalities:
```
Missing context_window in claude-opus-4 model YAML
Medium Severity
The context_window field was not added to this model's limits, while every other Claude model in this PR (claude-3-7-sonnet, claude-haiku-4-5, claude-opus-4-1, claude-opus-4-5, claude-sonnet-4, claude-sonnet-4-5) received context_window: 200000. Given max_input_tokens is 200000, this omission appears accidental.
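A sketch of the likely intended fix, assuming this model should match the other Claude entries in the PR (the `context_window: 200000` value is inferred from them, not from this file):

```yaml
# Hypothetical fix for databricks-claude-opus-4.yaml: add the
# context_window field that every other Claude model received,
# alongside the existing token limits.
limits:
  context_window: 200000
  max_input_tokens: 200000
  max_output_tokens: 32000
  max_tokens: 32000
```

The same one-line addition would resolve the `databricks-claude-sonnet-4-1` issue below, with that model's own output limits left unchanged.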
```yaml
  max_input_tokens: 200000
  max_output_tokens: 64000
  max_tokens: 64000
modalities:
```
Missing context_window in claude-sonnet-4-1 model YAML
Medium Severity
The context_window field was not added to this model's limits, while every other Claude model in this PR received context_window: 200000. The nearly identical databricks-claude-sonnet-4.yaml has context_window: 200000, making this omission inconsistent and likely accidental.


Auto-generated by poc-agent for provider databricks.

Note

Medium Risk

Mainly updates Databricks model metadata, but changes to `limits` (e.g., context windows and token caps) and deprecation flags can affect downstream model selection and request sizing.

Overview

Refreshes Databricks model YAML metadata by adding explicit `context_window` values, `modalities` (text/image and, for Gemini Pro, audio/video/pdf), and `thinking` support across many chat models.

Updates capability flags and message-role support (e.g., adds `system_messages`, `structured_output`, `tools`, and `messages.options` for `databricks-gpt-5`) and annotates models with lifecycle/documentation metadata (`isDeprecated`, `deprecationDate`, `sources`).

Adjusts some token limits to align with stated context windows (notably reducing `databricks-gpt-oss-120b` max tokens to `128000`).

Written by Cursor Bugbot for commit 284b5c5. This will update automatically on new commits. Configure here.