Conversation

@SkyeYoung SkyeYoung commented Jan 9, 2026

Description

Following the changes in #12565, this adds support for Gemini.

See also the Gemini OpenAI-compatibility docs: https://ai.google.dev/gemini-api/docs/openai

Already tested against the real API:

```yaml
# config.yaml
deployment:
  role: data_plane
  role_data_plane:
    config_provider: yaml
routes:
  - id: 1
    uri: /hello
    plugins:
      ai-proxy:
        provider: gemini
        auth:
          header:
            Authorization: "Bearer xxx"
        options:
          model: gemini-2.0-flash
#END
```
```shell
ubuntu-20:~/apisix$ curl -X POST http://localhost:9080/anything \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello"}]}'
data: {"choices":[{"delta":{"content":"Hello! How can I help you today?","role":"assistant"},"finish_reason":"stop","index":0}],"created":1767957025,"id":"IeJgacrFI4OfqtsPuaDzuQ8","model":"gemini-2.5-flash","object":"chat.completion.chunk"}

data: [DONE]
```
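For comparison, the same provider should also be usable through the multi-instance plugin, since the review notes that both schemas were updated. A hypothetical sketch of an equivalent `ai-proxy-multi` route — the instance name and weight are illustrative and not taken from this PR:

```yaml
routes:
  - id: 1
    uri: /hello
    plugins:
      ai-proxy-multi:
        instances:
          - name: gemini-primary   # illustrative instance name
            provider: gemini
            weight: 1
            auth:
              header:
                Authorization: "Bearer xxx"
            options:
              model: gemini-2.0-flash
```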

Which issue(s) this PR fixes:

Fixes #

Checklist

  • I have explained the need for this PR and the problem it solves
  • I have explained the changes or the new features added to this PR
  • I have added tests corresponding to this change
  • I have updated the documentation to reflect this change
  • I have verified that this change is backward compatible (If not, please discuss on the APISIX mailing list first)

@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. enhancement New feature or request labels Jan 9, 2026
Copilot AI left a comment
Pull request overview

This PR adds support for Google's Gemini OpenAI-compatible API to APISIX's AI proxy plugins. It follows the same implementation pattern as the recently added Azure OpenAI and AIMLAPI support.

Changes:

  • Added a new driver for gemini-openai provider that uses Google's OpenAI-compatible endpoint
  • Updated all plugin schemas to include gemini-openai in the provider enum lists
  • Added comprehensive test coverage with a new test file
  • Updated documentation in both English and Chinese to reflect the new provider option

Reviewed changes

Copilot reviewed 11 out of 11 changed files in this pull request and generated 4 comments.

| File | Description |
| --- | --- |
| apisix/plugins/ai-drivers/gemini-openai.lua | New driver implementation for Gemini's OpenAI-compatible API endpoint |
| apisix/plugins/ai-drivers/schema.lua | Added gemini-openai to provider schemas and compatibility checker |
| apisix/plugins/ai-proxy/schema.lua | Added gemini-openai to provider enums for both single and multi-instance schemas |
| apisix/plugins/ai-request-rewrite.lua | Added gemini-openai to provider enum list |
| docs/en/latest/plugins/ai-proxy.md | Added documentation for gemini-openai provider with endpoint details |
| docs/en/latest/plugins/ai-proxy-multi.md | Added documentation for gemini-openai provider with endpoint details |
| docs/en/latest/plugins/ai-request-rewrite.md | Added gemini-openai to provider options list |
| docs/zh/latest/plugins/ai-proxy.md | Chinese documentation update for gemini-openai provider |
| docs/zh/latest/plugins/ai-proxy-multi.md | Chinese documentation update for gemini-openai provider |
| docs/zh/latest/plugins/ai-request-rewrite.md | Chinese documentation update for gemini-openai provider |
| t/plugin/ai-proxy-gemini-openai.t | New test file with comprehensive test coverage for the gemini-openai provider |
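Judging by the existing drivers (e.g. apisix/plugins/ai-drivers/openai.lua, which wraps a shared OpenAI-compatible base module), the new gemini-openai.lua is likely a thin wrapper that only pins the upstream host and path. A hypothetical sketch — the host/path values (taken from Google's OpenAI-compatibility docs) and the openai-base module are assumptions, not the actual file contents:

```lua
-- Hypothetical sketch of apisix/plugins/ai-drivers/gemini-openai.lua.
-- Assumes the shared openai-base module used by the other drivers;
-- endpoint per https://ai.google.dev/gemini-api/docs/openai
local base = require("apisix.plugins.ai-drivers.openai-base")

return base.new(
    {
        host = "generativelanguage.googleapis.com",
        path = "/v1beta/openai/chat/completions",
        port = 443
    }
)
```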


Comment on lines 31 to 34

```perl
print "Hello, World!\n";
print $resp;
```

Copilot AI Jan 11, 2026

Debug print statements should be removed before merging. These lines appear to be left over from development and testing.

Suggested change:

```diff
-print "Hello, World!\n";
-print $resp;
```
```json
    "stream": true
  },
  "override": {
    "endpoint": "http://localhost:7737/v1/chat/completions"
```

Copilot AI Jan 11, 2026

The endpoint configuration in TEST 3 uses port 7737, which differs from port 6724 used elsewhere in the test file. However, the mock server is only configured to listen on port 6724 (line 52), so TEST 4 will fail when it tries to connect to a non-existent server on port 7737.

Suggested change:

```diff
-"endpoint": "http://localhost:7737/v1/chat/completions"
+"endpoint": "http://localhost:6724/v1/chat/completions"
```
```diff
 | ------------------------- | ------------ | -------- | ------------------------------------------------------------------------------------ |
 | prompt | Yes | String | The prompt send to LLM service. |
-| provider | Yes | String | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`. |
+| provider | Yes | String | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi, gemini-openai, and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`. |
```

Copilot AI Jan 11, 2026

Spelling error: "deekseek" should be "deepseek" to match the correct provider name used throughout the codebase.
From the Chinese documentation (translated):

```diff
 | ------------------------- | ------------ | -------- | ------------------------------------------------------------------------------------ |
 | prompt | Yes | String | The prompt sent to the LLM service. |
-| provider | Yes | String | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi, and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`. |
+| provider | Yes | String | Name of the LLM service. Available options: openai, deekseek, azure-openai, aimlapi, gemini-openai, and openai-compatible. When `aimlapi` is selected, the plugin uses the OpenAI-compatible driver with a default endpoint of `https://api.aimlapi.com/v1/chat/completions`. |
```

Copilot AI Jan 11, 2026

Spelling error: "deekseek" should be "deepseek" to match the correct provider name used throughout the codebase.
@SkyeYoung SkyeYoung marked this pull request as draft January 12, 2026 06:09
@SkyeYoung SkyeYoung marked this pull request as ready for review January 22, 2026 03:31
@SkyeYoung SkyeYoung requested a review from nic-6443 January 22, 2026 03:32
@SkyeYoung SkyeYoung merged commit 03278a3 into apache:master Jan 23, 2026
26 checks passed
@SkyeYoung SkyeYoung deleted the young/feat/support-gemini-openai branch January 23, 2026 09:27