Add support for OpenAI-compatible API providers (e.g., MiniMax M2.7) #165

Open
octo-patch wants to merge 2 commits into VectifyAI:main from octo-patch:feature/add-openai-compatible-provider-support
Conversation

@octo-patch octo-patch commented Mar 15, 2026

Summary

  • Add support for OpenAI-compatible API providers via the OPENAI_BASE_URL env var and a --base-url CLI flag
  • Support OPENAI_API_KEY as an alternative to CHATGPT_API_KEY for broader compatibility
  • Add an --api-key CLI argument to override the API key at runtime
  • Document MiniMax (MiniMax-M2.7 with a 1M-token context window) as an example provider

Changes

pageindex/utils.py

  • Accept an optional base_url parameter in the API functions (ChatGPT_API, ChatGPT_API_with_finish_reason, ChatGPT_API_async)
  • Pass base_url to the openai.OpenAI() / openai.AsyncOpenAI() client when set
  • Read OPENAI_API_KEY as a fallback when CHATGPT_API_KEY is not set
  • Read OPENAI_BASE_URL from the environment

run_pageindex.py

  • Add --api-key and --base-url CLI arguments
  • Set the corresponding env vars so downstream code picks them up
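A minimal sketch of how run_pageindex.py might wire the two new flags into env vars; the function names and parser description here are hypothetical, only the flag names and env-var names come from the PR.

```python
import argparse
import os


def build_parser():
    # Flags per this PR; other pre-existing options are omitted from the sketch
    parser = argparse.ArgumentParser(description="PageIndex runner (sketch)")
    parser.add_argument("--api-key", dest="api_key",
                        help="override CHATGPT_API_KEY / OPENAI_API_KEY at runtime")
    parser.add_argument("--base-url", dest="base_url",
                        help="OpenAI-compatible endpoint, e.g. a MiniMax URL")
    return parser


def apply_overrides(args):
    # Exporting env vars lets pageindex/utils.py pick up the overrides
    # without changing any downstream function signatures
    if args.api_key:
        os.environ["CHATGPT_API_KEY"] = args.api_key
    if args.base_url:
        os.environ["OPENAI_BASE_URL"] = args.base_url
```

Routing the flags through env vars keeps the change minimal: any code already reading CHATGPT_API_KEY or OPENAI_BASE_URL works unmodified.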

README.md

  • Document the alternative API key env var (OPENAI_API_KEY)
  • Document the --api-key and --base-url CLI options
  • Add a MiniMax usage example with the MiniMax-M2.7 model
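The README example presumably looks something like this sketch; the endpoint URL and the --pdf_path option are assumptions based on the PR description, not verified against the actual README diff.

```shell
# Point the client at an OpenAI-compatible provider (URL is an assumption)
export OPENAI_API_KEY="your-minimax-api-key"
export OPENAI_BASE_URL="https://api.minimax.io/v1"

# Then run the indexer as usual (flags per this PR; --pdf_path is assumed):
#   python3 run_pageindex.py --pdf_path ./docs/report.pdf --model MiniMax-M2.7
```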

Test plan

  • Verified that the MiniMax-M2.7 model responds correctly via the MiniMax API
  • Existing OpenAI usage path is unchanged (no breaking changes)

octo-patch and others added 2 commits March 15, 2026 16:33
- Support OPENAI_BASE_URL env var and --base-url CLI flag to use
  alternative OpenAI-compatible API endpoints
- Support OPENAI_API_KEY as alternative to CHATGPT_API_KEY
- Add --api-key CLI flag to override API key
- Use temperature=0.01 for custom endpoints (some providers like
  MiniMax reject exactly 0)
- Document MiniMax and alternative provider usage in README
- Update documented model from MiniMax-M2.5 (204K) to MiniMax-M2.7 (1M context)
- Simplify temperature handling: MiniMax now accepts temperature=0 natively,
  removing the need for the 0.01 workaround with custom base URLs
Co-Authored-By: Octopus <liyuan851277048@icloud.com>
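The first commit's temperature workaround (later removed by the second commit) can be sketched as a small guard; the function name is illustrative, not the PR's actual code.

```python
def effective_temperature(base_url, requested=0.0):
    """Nudge temperature off exactly 0 for custom endpoints.

    Per the first commit, some OpenAI-compatible providers (e.g. MiniMax at
    the time) rejected temperature=0, so custom base URLs used 0.01 instead.
    The second commit removed this once MiniMax accepted 0 natively.
    """
    if base_url and requested == 0.0:
        return 0.01
    return requested
```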
@octo-patch octo-patch changed the title from "Add support for OpenAI-compatible API providers (e.g., MiniMax)" to "Add support for OpenAI-compatible API providers (e.g., MiniMax M2.7)" on Mar 18, 2026