From 530133e9e3eb176fba11a97e15a0a1dd77077769 Mon Sep 17 00:00:00 2001
From: David Karlsson <35727626+dvdksn@users.noreply.github.com>
Date: Thu, 19 Mar 2026 11:32:28 +0000
Subject: [PATCH] compose: clarify cloud provider portability section

Remove vague references to unnamed "compatible cloud providers". Rewrite
the Cloud providers section to explain that the Compose models
specification enables portability for platforms that implement it.
Simplify Prerequisites to only mention Docker Model Runner.

Co-Authored-By: Claude Sonnet 4.6
---
 content/manuals/ai/compose/models-and-compose.md | 10 +++++-----
 1 file changed, 5 insertions(+), 5 deletions(-)

diff --git a/content/manuals/ai/compose/models-and-compose.md b/content/manuals/ai/compose/models-and-compose.md
index 6c601d27e5f2..3fb5d72c536a 100644
--- a/content/manuals/ai/compose/models-and-compose.md
+++ b/content/manuals/ai/compose/models-and-compose.md
@@ -16,8 +16,7 @@ Compose lets you define AI models as core components of your application, so you
 
 ## Prerequisites
 
 - Docker Compose v2.38 or later
-- A platform that supports Compose models such as Docker Model Runner (DMR) or compatible cloud providers.
-  If you are using DMR, see the [requirements](/manuals/ai/model-runner/_index.md#requirements).
+- A platform that supports Compose models such as [Docker Model Runner (DMR)](/manuals/ai/model-runner/_index.md#requirements).
 
 ## What are Compose models?
@@ -166,7 +165,7 @@ Docker Model Runner will:
 
 ### Cloud providers
 
-The same Compose file can run on cloud providers that support Compose models:
+The Compose models specification is designed to be portable. Platforms that implement the Compose specification can support the `models` top-level element, allowing the same Compose file to run on different infrastructure.
 
 Cloud-specific behavior can be configured using extension attributes (`x-*`):
 
@@ -173,20 +172,21 @@
 ```yaml
 services:
   chat-app:
     image: my-chat-app
     models:
       - llm
 
 models:
   llm:
     model: ai/smollm2
     x-cloud-options:
       - "cloud.instance-type=gpu-small"
       - "cloud.region=us-west-2"
 ```
 
-Cloud providers might:
+How a platform handles model definitions depends on its implementation. A platform might:
+
 - Use managed AI services instead of running models locally
-- Apply cloud-specific optimizations and scaling
+- Apply platform-specific optimizations and scaling
 - Provide additional monitoring and logging capabilities
 - Handle model versioning and updates automatically
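---

Note (not part of the patch): the portability point made in the rewritten section can be illustrated by a minimal standalone Compose file. This is a sketch, not taken from the docs page itself; the service name, image name, and `x-cloud-options` values are illustrative, with the model reference `ai/smollm2` borrowed from the example in the patch. Extension attributes (`x-*`) are defined by the Compose specification to be ignored by platforms that do not recognize them, which is what keeps the core file portable.

```yaml
# compose.yaml -- minimal sketch of the `models` top-level element.
# The portable core (services + models) runs anywhere Compose models
# are implemented, e.g. Docker Model Runner; the x-* keys are
# hypothetical cloud hints that other platforms simply ignore.
services:
  chat-app:
    image: my-chat-app        # illustrative application image
    models:
      - llm                   # binds the model below to this service

models:
  llm:
    model: ai/smollm2         # model reference, pulled by the platform
    x-cloud-options:          # extension attributes: platform-specific, ignorable
      - "cloud.instance-type=gpu-small"
```

With a file like this, `docker compose up` runs the model locally through Docker Model Runner, while a platform that implements the `models` element plus these extension keys could apply the cloud hints instead, without changes to the file.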