PR #385 (closing #378) made top_p optional and changed its default to None in LLMConfig. However, the fix only addressed the config layer — the API call layer in openevolve/llm/openai.py still unconditionally includes top_p in the request params.
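A minimal sketch of the pattern described above (a hypothetical `build_params` helper, not the exact code in openevolve/llm/openai.py): the key is always added, so a None default leaks into the request dict.

```python
# Hypothetical illustration of the bug pattern: top_p is placed into
# the request params unconditionally, even when it is None.
def build_params(model: str, temperature: float, top_p=None) -> dict:
    return {
        "model": model,
        "temperature": temperature,
        "top_p": top_p,  # after #385 this defaults to None
    }

params = build_params("gpt-4o", 0.7)
print(params)  # {'model': 'gpt-4o', 'temperature': 0.7, 'top_p': None}
```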
After #385, self.top_p defaults to None, so the unconditional inclusion puts "top_p": None into the params dict. Whether this causes an error depends on the downstream API client:
The official OpenAI SDK may strip None values internally (it uses NOT_GIVEN, not None, as its "omitted" sentinel)
AWS Bedrock, LiteLLM, and other OpenAI-compatible wrappers may serialize it as "top_p": null in the JSON body, triggering a 400 error
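A provider-agnostic remedy is to add the key only when the value was explicitly set, so strict backends never see "top_p": null. A minimal sketch of that fix (a hypothetical helper, not the merged patch):

```python
# Possible fix sketch: omit top_p from the request entirely when it
# is None, so no client ever has to strip or serialize a null value.
def build_params(model: str, temperature: float, top_p=None) -> dict:
    params = {"model": model, "temperature": temperature}
    if top_p is not None:
        params["top_p"] = top_p  # include the key only when set
    return params

print(build_params("gpt-4o", 0.7))           # no "top_p" key at all
print(build_params("gpt-4o", 0.7, top_p=0.9))  # "top_p": 0.9 included
```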
Reproduction
Configure OpenEvolve with an AWS Bedrock endpoint (or any OpenAI-compatible provider that does not strip null values)