Releases: UiPath/uipath-llm-client-python
UiPath LLM Client [v1.9.5]
[1.9.5] - 2026-04-21
Added
- `utils.headers.UIPATH_DEFAULT_REQUEST_HEADERS` public constant — the single source of truth for built-in gateway request headers (`X-UiPath-LLMGateway-TimeoutSeconds=295`, `X-UiPath-LLMGateway-AllowFull4xxResponse=false`). `UiPathHttpxClient._default_headers` now references this constant, and the LangChain base client reuses the same constant for its `class_default_headers`.
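A minimal sketch of the single-source-of-truth pattern this entry describes. The constant name, header names/values, and the two consuming attributes come from the release notes; the module layout and use of `MappingProxyType` are illustrative assumptions:

```python
# Hypothetical sketch of utils/headers.py; header names and values are
# from the changelog, everything else is illustrative.
from types import MappingProxyType

# Read-only so no client can mutate the shared defaults in place.
UIPATH_DEFAULT_REQUEST_HEADERS = MappingProxyType({
    "X-UiPath-LLMGateway-TimeoutSeconds": "295",
    "X-UiPath-LLMGateway-AllowFull4xxResponse": "false",
})


class UiPathHttpxClient:
    # Core httpx client references the shared constant instead of
    # duplicating the header literals.
    _default_headers = UIPATH_DEFAULT_REQUEST_HEADERS


class UiPathBaseLLMClient:
    # LangChain base client reuses the very same object.
    class_default_headers = UIPATH_DEFAULT_REQUEST_HEADERS
```

Because both classes point at one object, a future change to the defaults cannot drift between the core and LangChain clients.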
UiPath LLM Client [v1.9.4]
[1.9.4] - 2026-04-21
Changed
- Bumped dependency floors to the latest released versions: `pydantic-settings>=2.14.0`, `uipath-platform>=0.1.32`, `google-genai>=1.73.1`, `anthropic>=0.96.0`, `litellm>=1.83.7`.
UiPath LangChain Client [langchain-v1.9.5]
[1.9.5] - 2026-04-21
Changed
- `UiPathBaseLLMClient.default_headers` is now additive: caller-supplied headers are merged on top of a class-level `class_default_headers` (timeout and `AllowFull4xxResponse` policy) instead of replacing them. User values still win on key collisions. Previously, passing any `default_headers={...}` caused the built-in defaults to be dropped from `self.default_headers` (though the core httpx client's class defaults kept them on the wire).
- `UiPathBaseLLMClient.class_default_headers` now points at the shared `uipath.llm_client.utils.headers.UIPATH_DEFAULT_REQUEST_HEADERS` constant (single source of truth with core's `UiPathHttpxClient._default_headers`).
- Minimum `uipath-llm-client` bumped to 1.9.5 for the shared `UIPATH_DEFAULT_REQUEST_HEADERS` constant.
UiPath LangChain Client [langchain-v1.9.4]
[1.9.4] - 2026-04-21
Changed
- Bumped dependency floors to the latest released versions: `langchain-openai>=1.1.15`, `langchain-google-genai>=4.2.2`, `langchain-anthropic>=1.4.1`, `anthropic[bedrock,vertex]>=0.96.0`, `langchain-aws[anthropic]>=1.4.4`, `langchain-azure-ai>=1.2.2`.
- Minimum `uipath-llm-client` bumped to 1.9.4 to match the core dependency-floor release.
UiPath LangChain Client [langchain-v1.9.3]
[1.9.3] - 2026-04-20
Changed
- `get_chat_model()` now defaults to the OpenAI Responses API (`ApiFlavor.RESPONSES`) when discovery does not specify a flavor for an OpenAI chat model. Explicit `api_flavor=` on the call and BYOM-discovered flavors still take precedence. The LiteLLM client still defaults to chat-completions for OpenAI because LiteLLM 1.83.x drops the injected httpx `client` when its acompletion→aresponses bridge fires, which breaks async auth against the UiPath gateway.
UiPath LLM Client [v1.9.2]
[1.9.2] - 2026-04-17
Changed
- `PlatformBaseSettings.build_auth_headers()` now uses the header-name constants from `uipath.platform.common.constants` (lowercase canonical form). HTTP header names are case-insensitive, so wire-level behavior is unchanged.
- `UIPATH_PROCESS_KEY` is now URL-encoded (`urllib.parse.quote(..., safe="")`) before being placed in `X-UiPath-ProcessKey`, matching the platform-wide convention.
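The encoding step can be shown in isolation. The header name and the `quote(..., safe="")` call are from the entry; the wrapper function is a hypothetical illustration:

```python
# Sketch of the URL-encoding step for the process-key header.
from urllib.parse import quote


def build_process_key_header(process_key: str) -> tuple[str, str]:
    # safe="" percent-encodes every reserved character (including "/"),
    # so arbitrary process keys are safe as a header value.
    return ("X-UiPath-ProcessKey", quote(process_key, safe=""))
```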
Added
- `HEADER_LICENSING_CONTEXT` header populated dynamically from `UiPathConfig.licensing_context` when set.
UiPath LLM Client [v1.9.1]
[1.9.1] - 2026-04-17
Added
- `utils.model_family.is_anthropic_model_name()` helper and `ANTHROPIC_MODEL_NAME_KEYWORDS` tuple — name-based Claude detection for BYOM deployments where discovery does not expose `modelFamily`.
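A plausible sketch of the helper. The helper and tuple names are from this entry; the keyword list is taken from the langchain-v1.9.1 notes (`anthropic`, `opus`, `sonnet`, `haiku`, `mythos` alongside `claude`), and the case-insensitive substring matching is an assumption:

```python
# Illustrative name-based Claude detection for BYOM deployments where
# discovery does not expose modelFamily.
ANTHROPIC_MODEL_NAME_KEYWORDS: tuple[str, ...] = (
    "claude", "anthropic", "opus", "sonnet", "haiku", "mythos",
)


def is_anthropic_model_name(model_name: str) -> bool:
    """True when the model name looks like a Claude-family model."""
    lowered = model_name.lower()
    return any(keyword in lowered for keyword in ANTHROPIC_MODEL_NAME_KEYWORDS)
```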
Fixed
- `UiPathLiteLLM` now detects Claude-family models by name when `modelFamily` is unavailable (BYOM), correctly routing Bedrock/Vertex provider selection and default flavors.
UiPath LLM Client [v1.9.0]
[1.9.0] - 2026-04-17
Added
- `ModelFamily` `StrEnum` constants (`OPENAI`, `GOOGLE_GEMINI`, `ANTHROPIC_CLAUDE`) for model-family matching
- `get_model_info()` on `UiPathBaseSettings` — centralized model lookup with filtering by name, vendor, and BYO connection ID
- Discovery cache on `get_available_models()` keyed by settings properties, with a `refresh` parameter to bypass it
Changed
- `get_available_models()` is now a concrete cached method on the base class; subclasses implement `_fetch_available_models()` instead
- `validate_byo_model()` is now a default no-op on the base class (only LLMGateway overrides it) and is called automatically inside `get_model_info()`
- LiteLLM client uses `get_model_info()` instead of duplicating model discovery logic
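The cached template-method shape described in this release can be sketched as follows. The names `get_available_models`, `_fetch_available_models`, `refresh`, and `UiPathBaseSettings` come from the notes; the cache-key choice and the `FakeSettings` subclass are illustrative assumptions:

```python
# Sketch: base class owns a discovery cache keyed by settings properties;
# subclasses only implement the raw fetch.
from abc import ABC, abstractmethod


class UiPathBaseSettings(ABC):
    def __init__(self, base_url: str) -> None:
        self.base_url = base_url
        self._model_cache: dict[tuple, list[str]] = {}

    def _cache_key(self) -> tuple:
        # Hypothetical key: the settings properties that affect discovery.
        return (self.base_url,)

    def get_available_models(self, refresh: bool = False) -> list[str]:
        key = self._cache_key()
        if refresh or key not in self._model_cache:
            self._model_cache[key] = self._fetch_available_models()
        return self._model_cache[key]

    @abstractmethod
    def _fetch_available_models(self) -> list[str]:
        """Subclasses perform the actual discovery call here."""


class FakeSettings(UiPathBaseSettings):
    """Test double counting fetches to show the cache working."""

    def __init__(self) -> None:
        super().__init__("https://example.invalid")
        self.fetch_calls = 0

    def _fetch_available_models(self) -> list[str]:
        self.fetch_calls += 1
        return ["model-a", "model-b"]
```

Repeated calls hit the cache; `refresh=True` forces a new fetch without subclasses needing any cache logic of their own.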
UiPath LangChain Client [langchain-v1.9.2]
[1.9.2] - 2026-04-17
Changed
- Breaking: captured gateway headers are now exposed on `AIMessage.response_metadata` under the `headers` key (previously `uipath_llmgateway_headers`). Update any consumers that read this key.
- Minimum `uipath-llm-client` bumped to 1.9.2 for the platform-headers refactor and licensing-context support.
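For the breaking rename above, a consumer-side migration shim might read the new key with a fallback to the old one. Both key names are from the entry; the helper itself is a hypothetical example, not part of the library:

```python
# Illustrative migration helper: prefer the new "headers" key, fall back
# to the pre-1.9.2 "uipath_llmgateway_headers" key for older messages.
def gateway_headers(response_metadata: dict) -> dict:
    return (
        response_metadata.get("headers")
        or response_metadata.get("uipath_llmgateway_headers")
        or {}
    )
```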
UiPath LangChain Client [langchain-v1.9.1]
[1.9.1] - 2026-04-17
Fixed
- Detect Anthropic-family models by additional name keywords (`anthropic`, `opus`, `sonnet`, `haiku`, `mythos`) alongside `claude` — applies to Bedrock INVOKE factory routing and the normalized client's empty tool-call content workaround. Uses the shared `is_anthropic_model_name()` helper from core 1.9.1.
Changed
- Minimum `uipath-llm-client` bumped to 1.9.1 for the shared `is_anthropic_model_name()` helper.