chore(gen_ai): add auto-enablement for google genai and litellm #5295
Conversation
shellmayr commented Jan 12, 2026
- Auto-enable the LiteLLM integration
- Auto-enable the Google GenAI integration
```python
_INTEGRATION_DEACTIVATES = {
    "langchain": {"openai", "anthropic"},
    "litellm": {"openai", "anthropic"},
```
Bug: The litellm integration incorrectly deactivates the openai and anthropic integrations, which can lead to lost telemetry for direct SDK calls.
Severity: HIGH
🔍 Detailed Analysis
The litellm integration is being added to the _INTEGRATION_DEACTIVATES map, causing it to disable the openai and anthropic integrations. This is incorrect: unlike the langchain integration, the litellm integration uses callbacks and does not wrap the official OpenAI or Anthropic SDKs, so there is no risk of duplicate telemetry. If a user enables litellm while also making direct calls to the OpenAI or Anthropic SDKs, this change silently drops instrumentation for those direct calls.
💡 Suggested Fix
Remove the "litellm": {"openai", "anthropic"} entry from the _INTEGRATION_DEACTIVATES dictionary in sentry_sdk/integrations/__init__.py. The litellm integration's architecture does not require deactivating other AI integrations.
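To make the deactivation semantics concrete, here is a minimal sketch of how an auto-enablement pass might consult a map like `_INTEGRATION_DEACTIVATES`. The function name `resolve_auto_integrations` is illustrative, not the SDK's actual API, and the map below reflects the suggested fix (no `litellm` entry).

```python
# Illustrative model only -- not sentry_sdk's real setup code.
# An integration listed as a key deactivates the integrations in its
# value set, because enabling both would double-report telemetry.
_INTEGRATION_DEACTIVATES = {
    # langchain wraps the OpenAI/Anthropic SDKs itself, so the direct
    # integrations must be turned off to avoid duplicate spans.
    "langchain": {"openai", "anthropic"},
    # litellm is intentionally absent: it uses callbacks and does not
    # wrap the official SDKs, so direct SDK calls still need their
    # own integrations.
}


def resolve_auto_integrations(detected):
    """Return the detected integrations to enable, dropping any that
    another enabled integration declares it deactivates."""
    deactivated = set()
    for name in detected:
        deactivated |= _INTEGRATION_DEACTIVATES.get(name, set())
    return [name for name in detected if name not in deactivated]
```

Under this model, detecting `langchain` alongside `openai` enables only `langchain`, while detecting `litellm` alongside `openai` keeps both enabled.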
🤖 Prompt for AI Agent
Review the code at the location below. A potential bug has been identified by an AI agent. Verify whether this is a real issue. If it is, propose a fix; if not, explain why it is not valid.
Location: sentry_sdk/integrations/__init__.py#L171
Potential issue: the `litellm` entry in `_INTEGRATION_DEACTIVATES` disables the `openai` and `anthropic` integrations even though, unlike `langchain`, the `litellm` integration uses callbacks and does not wrap those SDKs; there is no duplicate-telemetry risk, and direct SDK calls would lose instrumentation.
```python
_INTEGRATION_DEACTIVATES = {
    "langchain": {"openai", "anthropic"},
    "litellm": {"openai", "anthropic"},
```
It would be nice to have tests for the new auto-deactivation.
Similar existing tests live in tests/test_ai_integration_deactivation.py.
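A minimal sketch of the kind of test the reviewer is asking for. It exercises a simplified local model of the deactivation logic rather than sentry_sdk's real setup code, so the helper `enabled_after_deactivation` and the behavior it encodes are assumptions, not the SDK's actual API.

```python
# Simplified stand-in for the SDK's deactivation map and resolution
# logic, used only to illustrate the shape of the suggested tests.
_INTEGRATION_DEACTIVATES = {
    "langchain": {"openai", "anthropic"},
}


def enabled_after_deactivation(detected):
    """Model of auto-enablement: drop integrations that any detected
    integration declares it deactivates."""
    deactivated = set()
    for name in detected:
        deactivated |= _INTEGRATION_DEACTIVATES.get(name, set())
    return {name for name in detected if name not in deactivated}


def test_langchain_deactivates_direct_sdk_integrations():
    enabled = enabled_after_deactivation({"langchain", "openai", "anthropic"})
    assert enabled == {"langchain"}


def test_litellm_leaves_direct_sdk_integrations_enabled():
    # litellm uses callbacks and does not wrap the official SDKs,
    # so the direct integrations should stay enabled.
    enabled = enabled_after_deactivation({"litellm", "openai"})
    assert enabled == {"litellm", "openai"}
```

Run with pytest; each test should pass against the model above.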
@shellmayr can you investigate why the
@alexander-alderman-webb yep - on it!