diff --git a/content/manuals/ai/compose/models-and-compose.md b/content/manuals/ai/compose/models-and-compose.md
index 6c601d27e5f2..3fb5d72c536a 100644
--- a/content/manuals/ai/compose/models-and-compose.md
+++ b/content/manuals/ai/compose/models-and-compose.md
@@ -16,8 +16,7 @@ Compose lets you define AI models as core components of your application, so you
 ## Prerequisites
 
 - Docker Compose v2.38 or later
-- A platform that supports Compose models such as Docker Model Runner (DMR) or compatible cloud providers.
-  If you are using DMR, see the [requirements](/manuals/ai/model-runner/_index.md#requirements).
+- A platform that supports Compose models such as [Docker Model Runner (DMR)](/manuals/ai/model-runner/_index.md#requirements).
 
 ## What are Compose models?
 
@@ -166,7 +165,7 @@ Docker Model Runner will:
 
 ### Cloud providers
 
-The same Compose file can run on cloud providers that support Compose models:
+The Compose models specification is designed to be portable. Platforms that implement the Compose specification can support the `models` top-level element, allowing the same Compose file to run on different infrastructure. Cloud-specific behavior can be configured using extension attributes (`x-*`):
 
 ```yaml
 services:
@@ -184,9 +183,10 @@ models:
     - "cloud.region=us-west-2"
 ```
 
-Cloud providers might:
+How a platform handles model definitions depends on its implementation. A platform might:
+
 - Use managed AI services instead of running models locally
-- Apply cloud-specific optimizations and scaling
+- Apply platform-specific optimizations and scaling
 - Provide additional monitoring and logging capabilities
 - Handle model versioning and updates automatically
 
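For review context, the wording this patch introduces could be illustrated by a Compose file along these lines. This is a minimal sketch, not part of the patch: the service name, image, model name, and the `x-cloud-options` key are hypothetical, chosen only to show a `models` top-level element carrying an `x-*` extension attribute like the `cloud.region` label the page's example uses.

```yaml
# Sketch only: names below are illustrative, not a fixed schema.
services:
  chat-app:
    image: my-chat-app        # hypothetical application image
    models:
      - llm                   # short syntax: reference to a model below

models:
  llm:
    model: ai/smollm2         # any model available to the runtime
    x-cloud-options:          # x-* extension attribute; a platform may
      - "cloud.region=us-west-2"   # interpret or ignore these labels
```

Per the Compose specification, `x-*` attributes are passed through unvalidated, so a file like this remains valid for DMR even if the labels are only meaningful to a particular cloud platform.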