feat: support gemini openai api #12883
Conversation
Pull request overview
This PR adds support for Google's Gemini OpenAI-compatible API to APISIX's AI proxy plugins. It follows the same implementation pattern as the recently added Azure OpenAI and AIMLAPI support.
Changes:

- Added a new driver for the `gemini-openai` provider that uses Google's OpenAI-compatible endpoint (see the sketch after this list)
- Updated all plugin schemas to include `gemini-openai` in the provider enum lists
- Added comprehensive test coverage with a new test file
- Updated documentation in both English and Chinese to reflect the new provider option
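Since the PR follows the same implementation pattern as the Azure OpenAI and AIMLAPI support, the new driver is most likely a thin wrapper over the shared OpenAI-compatible base driver. Below is a minimal sketch of what `apisix/plugins/ai-drivers/gemini-openai.lua` could look like, assuming the `openai-base.new` pattern used by the existing `openai` driver and the endpoint documented in Google's OpenAI-compatibility guide; it is not a verbatim copy of the PR's code.

```lua
-- Sketch of apisix/plugins/ai-drivers/gemini-openai.lua, assuming the
-- openai-base pattern used by the other OpenAI-compatible drivers.
-- Google's OpenAI-compatibility endpoint lives under /v1beta/openai/.
return require("apisix.plugins.ai-drivers.openai-base").new(
    {
        host = "generativelanguage.googleapis.com",
        path = "/v1beta/openai/chat/completions",
        port = 443,
    }
)
```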
Reviewed changes
Copilot reviewed 11 out of 11 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| `apisix/plugins/ai-drivers/gemini-openai.lua` | New driver implementation for Gemini's OpenAI-compatible API endpoint |
| `apisix/plugins/ai-drivers/schema.lua` | Added `gemini-openai` to provider schemas and compatibility checker |
| `apisix/plugins/ai-proxy/schema.lua` | Added `gemini-openai` to provider enums for both single and multi-instance schemas (sketched below) |
| `apisix/plugins/ai-request-rewrite.lua` | Added `gemini-openai` to provider enum list |
| `docs/en/latest/plugins/ai-proxy.md` | Added documentation for the `gemini-openai` provider with endpoint details |
| `docs/en/latest/plugins/ai-proxy-multi.md` | Added documentation for the `gemini-openai` provider with endpoint details |
| `docs/en/latest/plugins/ai-request-rewrite.md` | Added `gemini-openai` to the provider options list |
| `docs/zh/latest/plugins/ai-proxy.md` | Chinese documentation update for the `gemini-openai` provider |
| `docs/zh/latest/plugins/ai-proxy-multi.md` | Chinese documentation update for the `gemini-openai` provider |
| `docs/zh/latest/plugins/ai-request-rewrite.md` | Chinese documentation update for the `gemini-openai` provider |
| `t/plugin/ai-proxy-gemini-openai.t` | New test file with comprehensive test coverage for the `gemini-openai` provider |
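The schema updates listed above are presumably one-line enum additions. A hedged sketch of the provider enum in `apisix/plugins/ai-proxy/schema.lua` after the change follows; the surrounding field layout is an assumption, and only the new enum value comes from this PR.

```lua
-- Assumed shape of the provider field in apisix/plugins/ai-proxy/schema.lua;
-- the "gemini-openai" entry is what this PR adds.
provider = {
    type = "string",
    description = "Name of the LLM service provider.",
    enum = {
        "openai",
        "deepseek",
        "azure-openai",
        "aimlapi",
        "gemini-openai",   -- added by this PR
        "openai-compatible",
    },
},
```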
Comment on `t/plugin/ai-proxy-gemini-openai.t` (outdated):
| print "Hello, World!\n"; | ||
| print $resp; | ||
|
|
||
|
|
Copilot (AI) commented on Jan 11, 2026:
Debug print statements should be removed before merging. These lines appear to be left over from development and testing.
| print "Hello, World!\n"; | |
| print $resp; |
| "stream": true | ||
| }, | ||
| "override": { | ||
| "endpoint": "http://localhost:7737/v1/chat/completions" |
Copilot (AI) commented on Jan 11, 2026:
The endpoint configuration in TEST 3 uses port 7737, which differs from port 6724 used elsewhere in the test file. The mock server is only configured to listen on port 6724 (line 52), so TEST 4 will fail when it tries to connect to a non-existent server on port 7737.
| "endpoint": "http://localhost:7737/v1/chat/completions" | |
| "endpoint": "http://localhost:6724/v1/chat/completions" |
```diff
  | ------------------------- | ------------ | -------- | ------------------------------------------------------------------------------------ |
  | prompt   | Yes | String | The prompt send to LLM service. |
- | provider | Yes | String | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`. |
+ | provider | Yes | String | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi, gemini-openai, and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`. |
```
Copilot (AI) commented on Jan 11, 2026:
Spelling error: "deekseek" should be "deepseek" to match the correct provider name used throughout the codebase.
From the Chinese documentation (translated):

```diff
  | ------------------------- | ------------ | -------- | ------------------------------------------------------------------------------------ |
  | prompt   | Yes | String | The prompt sent to the LLM service. |
- | provider | Yes | String | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi, and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`. |
+ | provider | Yes | String | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi, gemini-openai, and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`. |
```
Copilot (AI) commented on Jan 11, 2026:
Spelling error: "deekseek" should be "deepseek" to match the correct provider name used throughout the codebase.
Description
Refer to the changes in #12565 to add support for Gemini. Also see Gemini's OpenAI-compatibility documentation: https://ai.google.dev/gemini-api/docs/openai
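For readers who want to sanity-check the upstream independently of APISIX, a direct request against the endpoint described in that doc looks roughly like the following. This is a sketch using lua-resty-http; the model name and the `GEMINI_API_KEY` environment variable are illustrative and not taken from this PR.

```lua
-- Hypothetical smoke test against Gemini's OpenAI-compatible endpoint.
-- Run with OpenResty's `resty` CLI; assumes GEMINI_API_KEY is visible
-- to the process (nginx normally requires an `env` directive for this).
local http = require("resty.http")
local cjson = require("cjson.safe")

local httpc = http.new()
local res, err = httpc:request_uri(
    "https://generativelanguage.googleapis.com/v1beta/openai/chat/completions",
    {
        method = "POST",
        headers = {
            ["Content-Type"] = "application/json",
            ["Authorization"] = "Bearer " .. (os.getenv("GEMINI_API_KEY") or ""),
        },
        body = cjson.encode({
            model = "gemini-2.0-flash",  -- illustrative model name
            messages = { { role = "user", content = "Say hello." } },
        }),
        ssl_verify = true,
    }
)

if not res then
    ngx.say("request failed: ", err)
    return
end
ngx.say(res.status, " ", res.body)
```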
Already tested in the real world:
Which issue(s) this PR fixes:
Fixes #
Checklist