feat(auth): support OAuth tokens and codex Responses routing #478

Merged
mikehostetler merged 2 commits into agentjido:main from l3wi:feat/oauth-auth-codex-routing
Mar 6, 2026

Conversation

@l3wi (Contributor) commented Mar 5, 2026

Summary

  • add ReqLLM.Auth for unified credential resolution with API key and OAuth token modes
  • add access_token + auth_mode provider options for OpenAI and Anthropic
  • wire OpenAI/Anthropic request + stream auth paths through unified credential resolver
  • normalize OpenAI model metadata in ReqLLM.model/1 so reasoning/codex models default to openai_responses protocol when missing wire metadata
  • add provider fallback routing in OpenAI for reasoning/codex families when wire metadata is absent
  • document OAuth token usage in OpenAI and Anthropic guides
  • add regression tests for OAuth auth handling and codex Responses routing
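The unified credential resolution described above can be sketched as follows. This is an illustrative sketch, not the PR's implementation: the option names `:access_token`, `:api_key`, and `:auth_mode` come from the PR description, while the module internals and the `:auto` default mode are assumptions.

```elixir
defmodule ReqLLM.AuthSketch do
  @moduledoc """
  Illustrative sketch of unified credential resolution with API key
  and OAuth token modes. Not the actual ReqLLM.Auth module.
  """

  @doc "Resolve credentials from provider options into a bearer credential."
  def resolve(opts) do
    token = Keyword.get(opts, :access_token)
    key = Keyword.get(opts, :api_key)

    case Keyword.get(opts, :auth_mode, :auto) do
      # Explicit API-key mode always resolves the API key, even when
      # an access_token is also present.
      :api_key when is_binary(key) -> {:ok, {:bearer, key}}
      :api_key -> {:error, :missing_api_key}
      # Explicit OAuth mode requires an access token.
      :oauth when is_binary(token) -> {:ok, {:bearer, token}}
      :oauth -> {:error, :missing_access_token}
      # Auto mode (assumed default): prefer the OAuth token, fall back
      # to the API key.
      :auto when is_binary(token) -> {:ok, {:bearer, token}}
      :auto when is_binary(key) -> {:ok, {:bearer, key}}
      _ -> {:error, :missing_credentials}
    end
  end
end
```

Under these assumptions, `auth_mode: :api_key` resolves the API key even when a (possibly stale) `access_token` is present in the options.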

Validation

  • mix format --check-formatted
  • mix compile --warnings-as-errors
  • mix quality
  • env -u GOOGLE_CLOUD_PROJECT mix test
  • env -u GOOGLE_CLOUD_PROJECT mix test test/providers/openai_test.exs test/providers/anthropic_test.exs test/req_llm/generation_test.exs test/req_llm_test.exs

Notes

  • I also ran mix mc "*:*" locally; it currently fails in this checkout due to missing fixture paths/baseline fixture coverage outside this change set (for example: test/support/fixtures/openroute/...).
  • Changed-provider test coverage included in this PR is passing.


@chatgpt-codex-connector (Bot) left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 8d51e4c0c8


Comment threads:

  • lib/req_llm/auth.ex (Outdated)
  • lib/req_llm/providers/openai.ex
@l3wi (Contributor, Author) commented Mar 5, 2026

Addressed both review points in follow-up commit 244d978.

  1. ReqLLM.Auth.resolve/2 now honors auth_mode when selecting credentials.

    • :api_key mode always resolves API key credentials.
    • :oauth mode requires access_token.
    • This prevents stale access_token values from overriding explicit API key mode.
  2. ReqLLM.Providers.OpenAI.get_api_type/1 now checks both atom-key and string-key wire protocol metadata before fallback routing.

    • Explicit string-key %{"wire" => %{"protocol" => "openai_chat"}} is now respected.
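The dual atom-/string-key metadata check and the family fallback can be sketched like this. Beyond `get_api_type/1`, the `"wire"`/`"protocol"` metadata keys, and the `openai_chat`/`openai_responses` protocol names taken from this thread, everything here (including the regex used to detect reasoning/codex model families) is an illustrative assumption, not the PR's code.

```elixir
defmodule OpenAIRoutingSketch do
  @moduledoc """
  Illustrative sketch of protocol routing; the real
  ReqLLM.Providers.OpenAI.get_api_type/1 may differ.
  """

  # Accept wire-protocol metadata under either atom or string keys.
  defp wire_protocol(metadata) do
    get_in(metadata, [:wire, :protocol]) ||
      get_in(metadata, ["wire", "protocol"])
  end

  @doc "Pick the API type for a model from its wire metadata, with fallback."
  def get_api_type(model) do
    case wire_protocol(Map.get(model, :metadata) || %{}) do
      "openai_responses" ->
        :responses

      "openai_chat" ->
        :chat

      nil ->
        # Fallback routing when wire metadata is absent: reasoning/codex
        # families default to the Responses API (regex is an assumption).
        if Map.get(model, :id, "") =~ ~r/^(o\d|codex)/,
          do: :responses,
          else: :chat

      _other ->
        :chat
    end
  end
end
```

Under these assumptions, a model carrying explicit string-key `%{"wire" => %{"protocol" => "openai_chat"}}` metadata routes to `:chat` (i.e. /chat/completions) even when its id matches a codex family, which is the behavior the regression test checks.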

Added regression tests:

  • OpenAI provider test: explicit string-key openai_chat protocol routes to /chat/completions
  • Generation test: auth_mode: :api_key uses API key even when access_token is present

Validation run:

  • mix quality
  • env -u GOOGLE_CLOUD_PROJECT mix test test/providers/openai_test.exs test/req_llm/generation_test.exs test/providers/anthropic_test.exs test/req_llm_test.exs

@mikehostetler mikehostetler merged commit 59bde23 into agentjido:main Mar 6, 2026
11 of 12 checks passed
