fix(openai): Attach response model with streamed Completions API #103

Triggered via pull request, February 26, 2026 16:25
Status: Success
Total duration: 3m 0s

warden.yml

on: pull_request

Annotations

2 warnings
Accessing `x.model` without a defensive check could raise AttributeError and break streaming: sentry_sdk/integrations/openai.py#L662
The new code accesses `x.model` directly without checking whether the attribute exists, and the call sits outside the `capture_internal_exceptions()` block. If a streaming chunk lacks a `model` attribute (which can happen with certain API responses or edge cases), the resulting `AttributeError` propagates into the user's application and breaks their streaming loop. The rest of this function uses defensive checks such as `hasattr(x, "choices")` before accessing attributes.
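A minimal sketch of the pattern the annotation recommends: guard the attribute access with `hasattr` and keep it inside `capture_internal_exceptions()` so a malformed chunk is skipped rather than crashing the caller's stream. The helper name, stub span, and span-data key here are illustrative, not the integration's actual code.

```python
from types import SimpleNamespace

from sentry_sdk.utils import capture_internal_exceptions


class _StubSpan:
    """Stand-in for an SDK span, just for this demo."""

    def set_data(self, key, value):
        print(f"{key} = {value}")


def _record_response_model(span, chunk):
    # Hypothetical helper; in sentry_sdk/integrations/openai.py this logic
    # lives inline in the streaming wrappers.
    with capture_internal_exceptions():
        # Mirror the hasattr(x, "choices") guards used elsewhere in the
        # function: a chunk without `model` is skipped instead of letting
        # AttributeError escape into the caller's streaming loop.
        if hasattr(chunk, "model"):
            # Span-data key is illustrative, not necessarily the SDK's constant.
            span.set_data("gen_ai.response.model", chunk.model)


_record_response_model(_StubSpan(), SimpleNamespace(model="gpt-4o-mini"))
_record_response_model(_StubSpan(), SimpleNamespace())  # no `model`: safely skipped
```

Either `hasattr` or `getattr(chunk, "model", None)` works; the point is that neither the attribute lookup nor anything downstream of it can raise into user code once it is wrapped in `capture_internal_exceptions()`.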
[CWR-SCT] Accessing `x.model` without a defensive check could raise AttributeError and break streaming (additional location): sentry_sdk/integrations/openai.py#L616
Same issue as the annotation above, at a second location in the same file: `x.model` is accessed directly, outside the `capture_internal_exceptions()` block and without a `hasattr` guard.