
docs: add MLflow AI Gateway as LLM provider#5228

Draft

PattaraS wants to merge 1 commit into crewAIInc:main from PattaraS:add-mlflow-gateway-integration

Conversation


PattaraS commented Apr 2, 2026

Summary

Adds an MLflow AI Gateway Accordion entry to docs/en/concepts/llms.mdx, showing how to use it as an OpenAI-compatible LLM backend in CrewAI.

What is MLflow AI Gateway?

MLflow AI Gateway is a database-backed LLM proxy built into the MLflow tracking server (MLflow ≥ 3.0). It provides a unified OpenAI-compatible API across dozens of providers — OpenAI, Anthropic, Gemini, Mistral, Bedrock, Ollama, and more — with encrypted secrets management, fallback/retry, traffic splitting, and per-endpoint token budgets.
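Because the gateway exposes an OpenAI-compatible API, a chat request to one of its endpoints can be assembled like any OpenAI-style call. The sketch below builds such a request with the standard library only; the host, port, and URL path are illustrative assumptions, not details taken from this PR:

```python
import json

def build_chat_request(endpoint: str, prompt: str) -> tuple[str, bytes]:
    """Assemble an OpenAI-compatible chat completion request for a gateway
    endpoint. The URL layout below is an assumption for illustration."""
    url = "http://localhost:5000/v1/chat/completions"  # hypothetical gateway address
    body = json.dumps({
        "model": endpoint,  # the gateway endpoint name stands in for the model id
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return url, body

url, body = build_chat_request("chat", "Hello, gateway!")
```

Any OpenAI-compatible client can then POST this body to the gateway, which forwards it to whichever provider backs the endpoint.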

Changes

  • Single Accordion section added to docs/en/concepts/llms.mdx with:
    • Brief description of MLflow AI Gateway
    • Setup instructions (start MLflow server, create endpoint in UI)
    • Code example using LLM(model="openai/<endpoint-name>", base_url="...")
    • Note about LiteLLM dependency

Notes

  • Existing MLflow tracing documentation (docs/en/observability/mlflow.mdx) is unchanged — this PR covers the gateway as an LLM provider, which is a separate feature

This pull request was AI-assisted by Claude.

Adds an Accordion entry for MLflow AI Gateway in the LLMs concept page,
showing how to use it as an OpenAI-compatible LLM backend with base_url
pointing to the gateway server.
PattaraS marked this pull request as draft April 2, 2026 09:17
