
fix: forward base_url to instructor client for OpenAI-compatible endpoints #5264

Open
shivam2407 wants to merge 1 commit into crewAIInc:main from shivam2407:fix/instructor-base-url-5204

Conversation

@shivam2407

Summary

Fixes #5204

When using an OpenAI-compatible provider with a custom base_url (e.g. self-hosted vLLM, Ollama, or any non-OpenAI endpoint), InternalInstructor silently discards the base_url and sends structured output requests to api.openai.com instead of the configured endpoint.

Root cause

_create_instructor_client() extracts only model and provider from the LLM, then calls instructor.from_provider(f"{provider}/{model_string}"). The base_url is never forwarded, and instructor.from_provider() creates a default client pointing at api.openai.com/v1/.
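The lost-parameter path can be sketched roughly like this (a hypothetical, simplified stand-in for the pre-fix code, not the actual crewAI source):

```python
# Hypothetical sketch of the pre-fix flow; names and defaults are
# illustrative, not the actual internal_instructor.py implementation.
def create_instructor_client_buggy(llm) -> str:
    provider = getattr(llm, "provider", None) or "openai"
    model = getattr(llm, "model", "gpt-4o")
    # Only provider and model survive; llm.base_url is never read, so
    # instructor.from_provider() builds a default client pointing at
    # api.openai.com/v1/.
    return f"{provider}/{model}"


class FakeLLM:
    provider = "openai"
    model = "qwen2.5-72b"
    base_url = "http://localhost:8000/v1"  # silently dropped


print(create_instructor_client_buggy(FakeLLM()))  # → openai/qwen2.5-72b
```

Note that the returned provider string carries no trace of the configured endpoint, which is why the request ends up at the OpenAI default.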

Fix

Added _get_llm_client_kwargs() to extract base_url, api_base, and api_key from the LLM object. When base_url is present, the client factory constructs an explicit SDK client instead of going through from_provider():

| Provider | Client used | Key forwarded as |
| --- | --- | --- |
| `openai` | `OpenAI(base_url=...)` | `base_url` |
| `azure` / `azure_openai` | `AzureOpenAI(azure_endpoint=..., api_version=...)` | `azure_endpoint` |
| `anthropic` | `Anthropic(base_url=...)` | `base_url` |
| others | `from_provider()` fallback | `api_key` only |

Additional behaviors:

  • api_base is normalized to base_url; when both are set, base_url takes precedence
  • All guards use is not None (not truthiness) so empty strings are handled correctly
  • When no custom base_url is set, behavior is unchanged (from_provider() path)

Changes

| File | Change |
| --- | --- |
| internal_instructor.py | Added `_get_llm_client_kwargs()` and `_create_instructor_client_with_base_url()` with per-provider client construction. Updated `_create_instructor_client()` to route through the new path when `base_url` is present. |
| test_converter.py | 8 new tests + fixed 1 existing mock |

Test plan

  • 8 new tests pass (OpenAI/Anthropic/Azure forwarding, api_base normalization, collision precedence, string LLM, no-base_url default)
  • All 50 existing tests still pass
  • Manual testing with vLLM/Ollama endpoint + output_pydantic task
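The forwarding tests can be sketched like this (a hedged simplification using a mocked client class; the helper name is hypothetical and the real tests in test_converter.py mock the actual SDK constructors):

```python
# Sketch of a base_url-forwarding test with a mocked OpenAI class;
# build_openai_client is an illustrative helper, not the PR's code.
from unittest import mock


def build_openai_client(llm, openai_cls):
    """Construct a client via openai_cls, forwarding base_url when set."""
    base_url = getattr(llm, "base_url", None)
    api_key = getattr(llm, "api_key", None)
    if base_url is not None:
        return openai_cls(base_url=base_url, api_key=api_key)
    return openai_cls(api_key=api_key)


def test_base_url_forwarded():
    fake_openai = mock.Mock(name="OpenAI")
    llm = mock.Mock(base_url="http://localhost:8000/v1", api_key="sk-x")
    build_openai_client(llm, fake_openai)
    # The custom endpoint must reach the client constructor verbatim.
    fake_openai.assert_called_once_with(
        base_url="http://localhost:8000/v1", api_key="sk-x"
    )


test_base_url_forwarded()
print("ok")
```

The same pattern, with `azure_endpoint` in place of `base_url`, covers the Azure variant.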

fix: forward base_url to instructor client for OpenAI-compatible endpoints (crewAIInc#5204)

When using an OpenAI-compatible provider with a custom base_url (e.g.
self-hosted vLLM, Ollama), InternalInstructor silently discarded the
base_url and sent requests to api.openai.com instead.

Add _get_llm_client_kwargs() to extract base_url, api_base, and
api_key from the LLM object. When base_url is present, construct an
explicit SDK client instead of using instructor.from_provider() which
doesn't forward it:
- OpenAI: OpenAI(base_url=...)
- Azure: AzureOpenAI(azure_endpoint=..., api_version=...)
- Anthropic: Anthropic(base_url=...)

Also normalizes api_base to base_url (base_url takes precedence when
both are set).

Includes 8 tests covering OpenAI/Anthropic/Azure forwarding, api_base
normalization, collision precedence, string LLM fallback, and the
default no-base_url path.


Development

Successfully merging this pull request may close these issues.

[BUG] InternalInstructor discards base_url when creating instructor client — breaks OpenAI-compatible endpoints
