[ENHANCEMENT] For OpenAI Compatible endpoints provide a dropdown from which users can select among all the models that Roo has specific capabilities for #11674

@MarkErik

Description

Problem (one or two sentences)

Users running LLMs locally via llama.cpp miss out on the model-specific support Roo already has. For example, support for MiniMax M2.5 was just added, but only when the model is used through the MiniMax provider; the model itself is identical when served via llama.cpp.

Context (who is affected and when)

Anyone using models served by llama.cpp (or exo, mlx-openai-server, etc.) through the OpenAI Compatible endpoint misses out on the model-specific handling that makes those models work better with Roo.

Desired behavior (conceptual, not technical)

When the OpenAI Compatible provider is selected, show a second dropdown underneath it that lists all the models Roo has specific support for, so the user can select the entry that matches the model they are actually running.
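
A minimal sketch of what the lookup behind that dropdown could be, assuming hypothetical names throughout (`ModelInfo`, `knownModelProfiles`, and `UNIVERSAL_PROFILE` are illustrative, not Roo's actual API, and the capability values are placeholders):

```typescript
// Hypothetical sketch -- names and values are illustrative, not Roo's real code.

interface ModelInfo {
  contextWindow: number;
  supportsImages: boolean;
  supportsPromptCache: boolean;
  // ...other capability flags Roo tracks per model
}

// Fallback profile with conservative defaults, matching today's
// undifferentiated OpenAI Compatible behavior.
const UNIVERSAL_PROFILE: ModelInfo = {
  contextWindow: 128_000,
  supportsImages: false,
  supportsPromptCache: false,
};

// Capability profiles Roo already maintains for first-party providers,
// keyed by model ID (values here are placeholders).
const knownModelProfiles: Record<string, ModelInfo> = {
  "minimax-m2.5": {
    contextWindow: 200_000,
    supportsImages: false,
    supportsPromptCache: true,
  },
  // ...every other model Roo has specific support for
};

// Resolve the profile for whatever the user picked in the dropdown.
function resolveProfile(selectedModelId: string): ModelInfo {
  return knownModelProfiles[selectedModelId] ?? UNIVERSAL_PROFILE;
}
```

The point is that the base URL and API key stay exactly as they are for OpenAI Compatible today; only the capability profile applied on top of the connection changes.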

Constraints / preferences (optional)

No response

Request checklist

  • I've searched existing Issues and Discussions for duplicates
  • This describes a specific problem with clear context and impact

Roo Code Task Links (optional)

No response

Acceptance criteria (optional)

No response

Proposed approach (optional)

No response

Trade-offs / risks (optional)

For models the user may be running that aren't in Roo's supported list, provide a 'Universal' option in the dropdown, which is essentially the behavior we have today.
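
As a sketch of that fallback (same caveat: illustrative names, not Roo's real code), the dropdown entries could simply be the known profiles plus a catch-all 'Universal' item that resolves to today's generic behavior:

```typescript
interface DropdownEntry {
  id: string;
  label: string;
}

// Build the dropdown from whatever profiles Roo ships with, then append
// the catch-all entry. "universal" here is a hypothetical sentinel ID
// that resolveProfile() would map to UNIVERSAL_PROFILE.
function buildModelDropdown(knownIds: string[]): DropdownEntry[] {
  const entries: DropdownEntry[] = knownIds.map((id) => ({ id, label: id }));
  entries.push({ id: "universal", label: "Universal (generic OpenAI Compatible)" });
  return entries;
}

// Example: buildModelDropdown(Object.keys(knownModelProfiles))
```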
