feat: add MiniMax AI provider support with M2.7 default model#1802

Open
octo-patch wants to merge 2 commits into CodePhiliaX:main from octo-patch:feature/add-minimax-provider
Conversation


@octo-patch octo-patch commented Mar 15, 2026

Summary

Add MiniMax as an AI provider option for intelligent SQL generation in Chat2DB. MiniMax offers OpenAI-compatible API endpoints, enabling seamless integration with the existing provider architecture.

Changes

Backend

  • New provider package (controller/ai/minimax/):
    • MiniMaxAIClient.java - Singleton factory for client management
    • MiniMaxAIStreamClient.java - OkHttp-based streaming client with Builder pattern
    • MiniMaxAIEventSourceListener.java - SSE event handler for streaming responses
    • MiniMaxChatCompletions.java - Response model for chat completions
  • ChatController.java - Added MINIMAXAI routing case for chat/SQL completion endpoints
  • ConfigController.java - Added MiniMax config refresh on settings update
  • AiSqlSourceEnum.java - Added MINIMAXAI enum value
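The enum registration above could look like the following minimal sketch. Only the `MINIMAXAI` value and the `AiSqlSourceEnum` name are confirmed by this PR; the sibling values and the lookup helper are illustrative assumptions about the existing provider pattern.

```java
// Hypothetical sketch of the AiSqlSourceEnum addition. Only MINIMAXAI is
// confirmed by the PR; the other constants and getByName are illustrative.
public enum AiSqlSourceEnum {
    OPENAI,
    AZUREAI,
    MINIMAXAI; // new value added by this PR

    /** Case-insensitive lookup used by a dispatch switch; returns null if unknown. */
    public static AiSqlSourceEnum getByName(String name) {
        for (AiSqlSourceEnum value : values()) {
            if (value.name().equalsIgnoreCase(name)) {
                return value;
            }
        }
        return null;
    }
}
```

A routing case in `ChatController` would then dispatch on the resolved enum value the same way the existing providers do.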

Frontend

  • ai.ts - Added MINIMAXAI to AIType enum
  • aiTypeConfig.ts - Added MiniMax AI form config with default model MiniMax-M2.7

Model Configuration

  • Default model: MiniMax-M2.7
  • API Base URL: https://api.minimax.io/v1/chat/completions
  • Users can override the model via the settings UI (e.g. MiniMax-M2.7-highspeed for low-latency scenarios)
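Since MiniMax exposes an OpenAI-compatible endpoint, a request to the base URL above can be built like this sketch. The URL comes from the PR; the payload shape follows the OpenAI-compatible chat completions format, and the API key, model, and class name are placeholders.

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class MiniMaxRequestSketch {
    // Base URL taken from the PR description. Payload shape is the
    // OpenAI-compatible chat completions format; key/model are placeholders.
    static final String ENDPOINT = "https://api.minimax.io/v1/chat/completions";

    public static HttpRequest build(String apiKey, String model, String prompt) {
        // stream:true requests SSE chunks, matching the streaming client in this PR
        String body = """
            {"model":"%s","stream":true,
             "messages":[{"role":"user","content":"%s"}]}""".formatted(model, prompt);
        return HttpRequest.newBuilder()
                .uri(URI.create(ENDPOINT))
                .header("Authorization", "Bearer " + apiKey)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }
}
```

The actual implementation in this PR uses OkHttp rather than `java.net.http`; the sketch only shows the endpoint and payload contract.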

Why

MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities, making it well-suited for SQL generation tasks.

Testing

  • Verified MiniMax appears as a provider option in the AI settings UI
  • Tested streaming chat completions with MiniMax API
  • Confirmed backward compatibility with existing providers
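The streaming behavior exercised above relies on SSE framing. A minimal sketch of the line handling an event listener like `MiniMaxAIEventSourceListener` might perform (assumed behavior: OpenAI-compatible streams send `data: {json}` lines and terminate with `data: [DONE]`; the class and method names here are illustrative):

```java
// Illustrative SSE line handling; class/method names are hypothetical.
public class SseLineParser {
    /**
     * Returns the raw JSON payload of an SSE data line, or null for
     * non-data lines and the end-of-stream sentinel.
     */
    public static String parse(String line) {
        if (line == null || !line.startsWith("data: ")) {
            return null; // blank lines, comments, or other fields: ignore
        }
        String payload = line.substring("data: ".length());
        if ("[DONE]".equals(payload)) {
            return null; // end-of-stream sentinel in OpenAI-compatible streams
        }
        return payload; // JSON delta chunk for the caller to deserialize
    }
}
```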

PR Bot added 2 commits March 15, 2026 20:03
Add MiniMax as an AI provider option for SQL generation. MiniMax provides
OpenAI-compatible API endpoints, making integration straightforward.

Changes:
- Backend: Add MiniMax provider with client, stream client, event listener,
  and response model classes following the existing provider pattern
- Backend: Register MiniMax in AiSqlSourceEnum, ChatController dispatch,
  and ConfigController save/load logic
- Frontend: Add MINIMAXAI to AIType enum, display name, and form config
  with default API host and model settings

Supported models:
- MiniMax-M2.5 (default) - 204K context window
- MiniMax-M2.5-highspeed - Same performance, faster

API Documentation:
- OpenAI Compatible: https://platform.minimax.io/docs/api-reference/text-openai-api

- Update default model from MiniMax-M2.5 to MiniMax-M2.7 in frontend config and backend stream client
- MiniMax-M2.7 is the latest flagship model with enhanced reasoning and coding capabilities
- Users can still manually specify any MiniMax model via the settings UI
@octo-patch octo-patch changed the title feat: add MiniMax AI provider support feat: add MiniMax AI provider support (M2.7 default) Mar 18, 2026
@octo-patch octo-patch changed the title feat: add MiniMax AI provider support (M2.7 default) feat: add MiniMax AI provider support with M2.7 default model Mar 18, 2026