fix: support more dtypes in pad_nd function#8752
skdas20 wants to merge 1 commit into Project-MONAI:dev from
Conversation
📝 Walkthrough
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~3 minutes
🚥 Pre-merge checks: ❌ Failed checks (1 inconclusive)
✅ Passed checks (2 passed)
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
monai/transforms/croppad/functional.py (1)
73-109: ⚠️ Potential issue | 🟡 Minor
Add tests for newly supported dtypes in `pad_nd`.
The commit enables `pad_nd` to support `torch.bool`, `torch.int16`, `torch.int64`, and `torch.uint8` but provides no test coverage for these dtypes. The test infrastructure only generates float arrays via `get_arr()`. Add parameterized tests covering both the PyTorch-succeeds path and the PyTorch-fails → NumPy-fallback path for each new dtype.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@monai/transforms/croppad/functional.py` around lines 73 - 109, Add parameterized unit tests for pad_nd to cover the newly supported dtypes (torch.bool, torch.int16, torch.int64, torch.uint8) by generating inputs with get_arr(dtype=...) or equivalent and exercising both the PyTorch-success path (call pad_nd with a mode supported by _pt_pad such as "constant") and the PyTorch-fallback path (call pad_nd with a mode that forces _np_pad, e.g., "linear_ramp" or a mode/kwargs that triggers the NotImplementedError path). For each dtype and path, assert the output shape equals the expected padded shape and the output dtype (or numpy dtype for fallback) matches the input dtype, and include tests for tensor inputs (torch.Tensor) and numpy inputs to ensure device/array handling in pad_nd, referencing the pad_nd function and the underlying helpers _pt_pad and _np_pad to locate where behavior is validated.
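The requested coverage can be sketched without MONAI by letting `np.pad` stand in for the NumPy fallback path; `check_pad` is a hypothetical helper (the real tests would call `pad_nd` itself, via `get_arr`-style inputs, for both the PyTorch and NumPy paths), shown only to illustrate the shape and dtype assertions the review asks for:

```python
import numpy as np

# Hypothetical sketch of the requested parameterized tests. np.pad stands in
# for MONAI's NumPy fallback; real tests would exercise pad_nd on both the
# PyTorch-succeeds path and the PyTorch-fails -> NumPy-fallback path.
DTYPES = [np.bool_, np.int16, np.int64, np.uint8]

def check_pad(dtype, mode="constant"):
    arr = np.zeros((2, 3, 3), dtype=dtype)
    to_pad = ((0, 0), (1, 1), (2, 2))  # (before, after) widths per axis
    out = np.pad(arr, to_pad, mode=mode)
    assert out.shape == (2, 5, 7)   # padded shape matches the widths
    assert out.dtype == arr.dtype   # dtype must survive padding
    return out

results = [check_pad(dt) for dt in DTYPES]
```

The same pair of assertions, applied per dtype and per code path, is what the parameterized tests would need to pin down.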
🧹 Nitpick comments (2)
monai/transforms/croppad/functional.py (2)
102-106: `"value"` in the error keyword list is too broad.
The substring `"value"` matches the word in many RuntimeError messages unrelated to dtype/padding support (e.g., "value out of range", "expected value"), silently routing them to the numpy fallback instead of re-raising. With more dtypes now going through `_pt_pad`, this heuristic fires more frequently.
♻️ Tighten the keyword match

```diff
-        if isinstance(err, NotImplementedError) or any(
-            k in str(err) for k in ("supported", "unexpected keyword", "implemented", "value")
-        ):
+        if isinstance(err, NotImplementedError) or any(
+            k in str(err) for k in ("supported", "unexpected keyword", "implemented")
+        ) or (isinstance(err, ValueError) and "value" in str(err)):
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@monai/transforms/croppad/functional.py` around lines 102 - 106, The current except block in the padding fallback incorrectly treats any error message containing the substring "value" as a cue to call _np_pad, which is too broad; update the heuristic in the except (ValueError, TypeError, RuntimeError) handler (the code path that falls back from _pt_pad to _np_pad) to remove the generic "value" check and instead match narrower, relevant phrases such as "dtype", "unsupported", "not supported", or "unsupported dtype"/"unsupported padding", or rely only on NotImplementedError and explicit "supported"/"implemented"/"unexpected keyword" matches so that unrelated messages like "value out of range" are not swallowed and re-raised correctly. Ensure the change references the same except block and keeps calling _np_pad(img, pad_width=to_pad, mode=mode, **kwargs) only for genuinely unsupported padding/dtype errors.
97-109: `"value"` in the error keyword list can silently swallow unrelated errors.
The fallback condition at line 104 matches any error message containing `"value"`, a very common substring in PyTorch error strings (e.g., "padding value out of range", "invalid value"). With more dtypes now routing through `_pt_pad`, this heuristic fires more often, increasing the risk of masking a legitimate error and silently degrading to NumPy padding.
♻️ Suggested tightening

```diff
-        if isinstance(err, NotImplementedError) or any(
-            k in str(err) for k in ("supported", "unexpected keyword", "implemented", "value")
-        ):
+        if isinstance(err, NotImplementedError) or any(
+            k in str(err) for k in ("supported", "unexpected keyword", "implemented")
+        ) or "not implemented" in str(err).lower():
```

Or, more precisely, scope the `"value"` check to `ValueError` only:

```diff
-        if isinstance(err, NotImplementedError) or any(
-            k in str(err) for k in ("supported", "unexpected keyword", "implemented", "value")
-        ):
+        if isinstance(err, NotImplementedError) or any(
+            k in str(err) for k in ("supported", "unexpected keyword", "implemented")
+        ) or (isinstance(err, ValueError) and "value" in str(err)):
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@monai/transforms/croppad/functional.py` around lines 97 - 109, The fallback logic currently treats any error message containing "value" as safe to fallback to _np_pad, which can mask unrelated errors; update the except block so the "value" substring check only applies when err is a ValueError. Concretely, in the except handling around _np_pad/_pt_pad, change the conditional to something like: if isinstance(err, NotImplementedError) or (isinstance(err, ValueError) and any(k in str(err) for k in ("supported", "unexpected keyword", "implemented", "value"))): so that only ValueError instances are allowed to match "value" and then call _np_pad(img, pad_width=to_pad, mode=mode, **kwargs); otherwise re-raise the original error.
- Removes hardcoded dtype exclusion list from pad_nd()
- Enables torch.nn.functional.pad for bool and integer dtypes
- Try-except fallback to numpy still handles unsupported combinations
- Fixes Project-MONAI#7842

Signed-off-by: skdas20 <skdas5405@gmail.com>
85ed7a9 to 790cdaa (Compare)
🧹 Nitpick comments (1)
monai/transforms/croppad/functional.py (1)
102-106: Fragile error-message keyword matching now covers a wider surface.
The fallback from `_pt_pad` to `_np_pad` relies on the error message containing one of `("supported", "unexpected keyword", "implemented", "value")`. PyTorch does not support non-float dtypes for `reflect` and `replicate` modes, and their errors historically contain "not implemented for", so `"implemented"` matches. But this is brittle: any future PyTorch error message that doesn't contain one of these keywords will surface as a confusing re-raised `ValueError` instead of transparently falling back. Consider whether a broader catch (e.g., catch all `RuntimeError` and always fall back) would be safer for `_pt_pad` failures, given the explicit intent is "try PyTorch, fall back to NumPy."
♻️ Proposed safer fallback

```diff
-    except (ValueError, TypeError, RuntimeError) as err:
-        if isinstance(err, NotImplementedError) or any(
-            k in str(err) for k in ("supported", "unexpected keyword", "implemented", "value")
-        ):
-            return _np_pad(img, pad_width=to_pad, mode=mode, **kwargs)
-        raise ValueError(
-            f"{img.shape} {to_pad} {mode} {kwargs} {img.dtype} {img.device if isinstance(img, torch.Tensor) else None}"
-        ) from err
+    except (ValueError, TypeError, RuntimeError) as err:
+        try:
+            return _np_pad(img, pad_width=to_pad, mode=mode, **kwargs)
+        except Exception:
+            raise ValueError(
+                f"{img.shape} {to_pad} {mode} {kwargs} {img.dtype} {img.device if isinstance(img, torch.Tensor) else None}"
+            ) from err
```

Note: this also changes pre-existing behaviour, so validate carefully.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@monai/transforms/croppad/functional.py` around lines 102 - 106, The current fragile keyword check in the except block for _pt_pad should be replaced so that all RuntimeError failures from _pt_pad (and NotImplementedError) unconditionally fall back to _np_pad, while still re-raising other ValueError/TypeError cases; update the except block around the call to _pt_pad to catch (ValueError, TypeError, RuntimeError) as err and if isinstance(err, (NotImplementedError, RuntimeError)): return _np_pad(img, pad_width=to_pad, mode=mode, **kwargs) else: raise to preserve existing behavior for plain ValueError/TypeError.
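The "always fall back" shape proposed here can be sketched without MONAI by injecting the two padders; `pad_with_fallback` and `failing_pt_pad` are hypothetical stand-ins for `pad_nd`'s internals, with `np.pad` playing the role of `_np_pad`:

```python
import numpy as np

def pad_with_fallback(img, to_pad, mode, pt_pad, np_pad, **kwargs):
    """Try the torch-style padder first; on failure, always try the numpy one."""
    try:
        return pt_pad(img, to_pad, mode, **kwargs)
    except (ValueError, TypeError, RuntimeError, NotImplementedError) as err:
        try:
            return np_pad(img, pad_width=to_pad, mode=mode, **kwargs)
        except Exception:
            # only when both backends fail, surface the original error with context
            raise ValueError(f"{img.shape} {to_pad} {mode} {kwargs} {img.dtype}") from err

def failing_pt_pad(img, to_pad, mode, **kwargs):
    # simulate PyTorch rejecting a dtype/mode combination
    raise RuntimeError("\"reflection_pad2d\" not implemented for 'Bool'")

out = pad_with_fallback(
    np.zeros((3, 3), dtype=bool), ((1, 1), (1, 1)), "reflect", failing_pt_pad, np.pad
)
```

Here the simulated PyTorch failure is absorbed and the NumPy reflect pad succeeds regardless of the error message's wording, which is the behavioural change the comment flags for careful validation.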
- Removed hardcoded dtype exclusion list
- Now supports bool and integer dtypes with PyTorch's optimized padding
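A quick sketch of what this enables, assuming a recent PyTorch build (constant mode is used because it is the mode most broadly implemented across dtypes; reflect/replicate may still route to the NumPy fallback):

```python
import torch
import torch.nn.functional as F

# With the exclusion list removed, these dtypes can reach F.pad directly.
# Constant-mode padding preserves dtype.
outs = {}
for dtype in (torch.bool, torch.int16, torch.int64, torch.uint8):
    x = torch.zeros(2, 3, dtype=dtype)
    # pad the last dim by (1, 1) and the second-to-last by (2, 2)
    outs[dtype] = F.pad(x, (1, 1, 2, 2))
```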