
Conversation


devin-ai-integration bot commented on Jan 21, 2026

fix: improve LiteLLM fallback error message for non-native providers

Summary

Fixes #4262

When using non-native providers like Groq with models that require LiteLLM (e.g., groq/openai/gpt-oss-120b), the error message was confusing and contradictory:

Before:

LiteLLM is not available, falling back to LiteLLM
ImportError: Fallback to LiteLLM is not available

After:

Model 'groq/llama-3.1-70b-versatile' requires LiteLLM for inference but LiteLLM is not installed.
Please install it with: pip install 'crewai[litellm]' or pip install litellm

The new error message (see the sketch after this list):

  • Includes the specific model name that requires LiteLLM
  • Provides clear installation instructions
  • Removes the contradictory "falling back to LiteLLM" phrasing
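
For illustration, here is a minimal sketch of the kind of guard that produces this message. The helper name ensure_litellm_available is hypothetical, an assumption based on the behavior described above rather than the actual crewai internals:

```python
# Illustrative sketch only: ensure_litellm_available is a hypothetical helper,
# not the actual crewai code. It mirrors the behavior described in this PR.
def ensure_litellm_available(model: str) -> None:
    """Fail fast with an actionable error when a model needs LiteLLM but it is missing."""
    try:
        import litellm  # noqa: F401  # we only need to know the import succeeds
    except ImportError as e:
        raise ImportError(
            f"Model '{model}' requires LiteLLM for inference but LiteLLM is not installed. "
            "Please install it with: pip install 'crewai[litellm]' or pip install litellm"
        ) from e
```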

Review & Testing Checklist for Human

  • Verify the error message is clear and actionable when testing with a non-native provider model without LiteLLM installed
  • Confirm the two new tests adequately cover the error scenario

Test Plan

  1. In a fresh environment without LiteLLM installed, try: LLM(model="groq/llama-3.1-70b-versatile")
  2. Verify the error message includes the model name and installation instructions (a sketch of an automated check follows)
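
As a sketch of how this test plan could be automated, the pytest below simulates a missing litellm and asserts on the message. The import path crewai.llm.LLM and the exact assertions are assumptions drawn from this PR's description, not verified against the repository:

```python
# Hypothetical pytest for the error scenario. The import path and the exact
# message wording are assumptions drawn from this PR's description.
import builtins

import pytest


def test_missing_litellm_error_is_actionable(monkeypatch):
    from crewai.llm import LLM  # assumed import path

    real_import = builtins.__import__

    def fake_import(name, *args, **kwargs):
        # Simulate an environment where litellm is not installed.
        if name == "litellm":
            raise ImportError("No module named 'litellm'")
        return real_import(name, *args, **kwargs)

    monkeypatch.setattr(builtins, "__import__", fake_import)

    with pytest.raises(ImportError) as exc_info:
        LLM(model="groq/llama-3.1-70b-versatile")

    message = str(exc_info.value)
    assert "groq/llama-3.1-70b-versatile" in message
    assert "pip install" in message
```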


Commit message

Fixes #4262

When using non-native providers like Groq with models that require LiteLLM,
the error message was confusing: 'LiteLLM is not available, falling back to LiteLLM'.

This commit:
- Fixes the contradictory error message
- Provides a clear, actionable error message that includes:
  - The model name that requires LiteLLM
  - Instructions on how to install LiteLLM (pip install 'crewai[litellm]')
- Adds tests to verify the error message is helpful and includes the model name

Co-Authored-By: João <joao@crewai.com>
devin-ai-integration bot commented

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR. Add '(aside)' to your comment to have me ignore it.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring


devin-ai-integration bot left a comment


✅ Devin Review: No Issues Found

Devin Review analyzed this PR and found no potential bugs to report.

View in Devin Review to see 3 additional flags.




Linked issue (merging this pull request may close it):

[BUG] OpenAI dependency conflict + LiteLLM fallback failure when using Groq GPT-OSS 120B