
feat: add MiniMax as LLM provider#170

Open
octo-patch wants to merge 1 commit into shroominic:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Adds MiniMax AI as a fourth LLM provider option alongside OpenAI, Azure OpenAI, and Anthropic.

  • MINIMAX_API_KEY env var in CodeInterpreterAPISettings for auto-detection
  • MiniMax branch in _choose_llm() using ChatOpenAI with base_url=https://api.minimax.io/v1 (OpenAI-compatible API)
  • Default model: MiniMax-M2.5 when no MiniMax-specific model is configured; MiniMax-M2.5-highspeed is also supported for faster responses
  • Temperature clamping: values are clamped to the [0.01, 1.0] range required by the MiniMax API
  • Agent compatibility: returns a ChatOpenAI instance, so OpenAIFunctionsAgent is used automatically (function calling works)
  • README updated with MiniMax configuration instructions

Usage

```shell
export MINIMAX_API_KEY=your-minimax-api-key
export MODEL=MiniMax-M2.5  # or MiniMax-M2.5-highspeed
```

```python
from codeinterpreterapi import CodeInterpreterSession

with CodeInterpreterSession() as session:
    response = session.generate_response("Plot the bitcoin chart of 2023")
    response.show()
```

Changes

| File | Change |
| --- | --- |
| src/codeinterpreterapi/config.py | Add MINIMAX_API_KEY setting |
| src/codeinterpreterapi/session.py | Add MiniMax branch in _choose_llm() |
| README.md | Add MiniMax configuration docs |
| tests/test_minimax_provider.py | 12 unit tests (config, model selection, temp clamping, priority) |
| tests/test_minimax_integration.py | 3 integration tests (LLM creation, completion, session) |

Test plan

  • 12 unit tests pass (pytest tests/test_minimax_provider.py)
  • 3 integration tests pass (MINIMAX_API_KEY=... pytest tests/test_minimax_integration.py)
  • Existing tests unaffected
  • MiniMax returns ChatOpenAI instance for OpenAIFunctionsAgent compatibility
  • Temperature clamped to valid range
  • Provider priority maintained: Azure > OpenAI > Anthropic > MiniMax
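The provider-priority item in the test plan can be illustrated with a small sketch. The function and setting names here are hypothetical, chosen only to show the Azure > OpenAI > Anthropic > MiniMax ordering stated above; the real `_choose_llm()` logic is in the diff, not reproduced here.

```python
# Illustrative first-match priority check, mirroring the order stated in
# the test plan: Azure > OpenAI > Anthropic > MiniMax. Setting names are
# assumptions for the sketch, not the actual config fields.
def choose_provider(settings: dict) -> str:
    if settings.get("AZURE_API_KEY"):
        return "azure"
    if settings.get("OPENAI_API_KEY"):
        return "openai"
    if settings.get("ANTHROPIC_API_KEY"):
        return "anthropic"
    if settings.get("MINIMAX_API_KEY"):
        return "minimax"
    raise ValueError("No LLM provider API key configured")
```

Putting MiniMax last means setting MINIMAX_API_KEY never shadows an already-configured provider, so existing setups are unaffected.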

Add MiniMax AI (https://www.minimax.io/) as a fourth LLM provider option
alongside OpenAI, Azure OpenAI, and Anthropic. MiniMax uses an
OpenAI-compatible API, so it integrates via ChatOpenAI with a custom
base_url.

- Add MINIMAX_API_KEY setting to CodeInterpreterAPISettings
- Add MiniMax branch in _choose_llm() with temp clamping [0.01, 1.0]
- Default to MiniMax-M2.5 model when no MiniMax-specific model is set
- Update README with MiniMax configuration instructions
- Add 12 unit tests and 3 integration tests
