fix: support more dtypes in pad_nd function#8752

Open
skdas20 wants to merge 1 commit into Project-MONAI:dev from skdas20:fix/pad_nd-dtype-support

Conversation


skdas20 commented Feb 22, 2026

Removed hardcoded dtype exclusion list
Now supports bool and integer dtypes with PyTorch's optimized padding

Contributor

coderabbitai bot commented Feb 22, 2026

📝 Walkthrough

The change updates pad_nd to remove a dtype-based guard when selecting PyTorch padding (_pt_pad). Previously, _pt_pad was used only if the mode was in {"constant", "reflect", "edge", "replicate", "wrap", "circular"} and the tensor dtype was not in {torch.int16, torch.int64, torch.bool, torch.uint8}. Now _pt_pad is chosen whenever the mode is in that set, regardless of dtype. No other behavior or public signatures were changed.
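In outline, the change moves the `_pt_pad`/`_np_pad` selection from a mode-and-dtype test to a mode-only test. The sketch below is a hypothetical, simplified reconstruction of that dispatch (the real `pad_nd` in `monai/transforms/croppad/functional.py` operates on tensors; dtype names stand in for `torch.dtype` objects here):

```python
# Hypothetical sketch of the dispatch change in pad_nd; not MONAI's actual code.
_PT_MODES = {"constant", "reflect", "edge", "replicate", "wrap", "circular"}
_OLD_EXCLUDED_DTYPES = {"int16", "int64", "bool", "uint8"}

def choose_backend_old(mode: str, dtype: str) -> str:
    # Before: PyTorch padding only for supported modes AND non-excluded dtypes.
    if mode in _PT_MODES and dtype not in _OLD_EXCLUDED_DTYPES:
        return "_pt_pad"
    return "_np_pad"

def choose_backend_new(mode: str, dtype: str) -> str:
    # After: mode alone decides; dtype failures are left to the try/except
    # fallback that wraps the actual _pt_pad call.
    if mode in _PT_MODES:
        return "_pt_pad"
    return "_np_pad"
```

For example, `choose_backend_old("constant", "bool")` routes to `"_np_pad"`, while `choose_backend_new("constant", "bool")` routes to `"_pt_pad"`.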

Estimated code review effort

🎯 1 (Trivial) | ⏱️ ~3 minutes

🚥 Pre-merge checks | ✅ 2 passed | ❌ 1 inconclusive

❌ Inconclusive checks (1)
  • Description check — ❓ Inconclusive. The description is minimal but captures the core change; it is missing required template sections such as the issue reference, a detailed description, and checklist items. Resolution: expand the description to include the issue number (appears to be #7842), add context on why broader dtype support matters, and complete the template checklist.

✅ Passed checks (2)
  • Title check — ✅ Passed. The title accurately describes the main change: removing the dtype exclusion list and enabling more dtypes in pad_nd.
  • Docstring coverage — ✅ Passed. Docstring coverage is 100.00%, above the required threshold of 80.00%.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


coderabbitai bot left a comment

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
monai/transforms/croppad/functional.py (1)

73-109: ⚠️ Potential issue | 🟡 Minor

Add tests for newly supported dtypes in pad_nd.

The commit enables pad_nd to support torch.bool, torch.int16, torch.int64, and torch.uint8 but provides no test coverage for these dtypes. The test infrastructure only generates float arrays via get_arr(). Add parameterized tests covering both the PyTorch-succeeds path and PyTorch-fails-→-NumPy-fallback path for each new dtype.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@monai/transforms/croppad/functional.py` around lines 73 - 109, Add
parameterized unit tests for pad_nd to cover the newly supported dtypes
(torch.bool, torch.int16, torch.int64, torch.uint8) by generating inputs with
get_arr(dtype=...) or equivalent and exercising both the PyTorch-success path
(call pad_nd with a mode supported by _pt_pad such as "constant") and the
PyTorch-fallback path (call pad_nd with a mode that forces _np_pad, e.g.,
"linear_ramp" or a mode/kwargs that triggers the NotImplementedError path). For
each dtype and path, assert the output shape equals the expected padded shape
and the output dtype (or numpy dtype for fallback) matches the input dtype, and
include tests for tensor inputs (torch.Tensor) and numpy inputs to ensure
device/array handling in pad_nd, referencing the pad_nd function and the
underlying helpers _pt_pad and _np_pad to locate where behavior is validated.
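The shape/dtype assertions the bot asks for can be sketched against `np.pad` alone, which is what the NumPy fallback path ultimately wraps. This is a hypothetical outline, not MONAI's test code: `get_arr`, `pad_nd`, `_pt_pad`, and `_np_pad` are the project's own helpers and are not reproduced here.

```python
import numpy as np

# Hypothetical sketch of the dtype-coverage checks the review requests,
# exercised against np.pad (the backend of the NumPy fallback path).
def check_pad_preserves_dtype(dtype) -> None:
    arr = np.zeros((2, 3), dtype=dtype)
    out = np.pad(arr, ((1, 1), (2, 2)), mode="constant")
    assert out.shape == (4, 7)     # each dim grows by the sum of its pad widths
    assert out.dtype == arr.dtype  # dtype must survive the round trip

# The newly supported dtypes from the PR, in NumPy terms:
for dt in (np.bool_, np.int16, np.int64, np.uint8):
    check_pad_preserves_dtype(dt)
```

A real MONAI test would parameterize over these dtypes and over both backends (a `_pt_pad`-supported mode such as "constant", and a mode like "linear_ramp" that forces `_np_pad`), for both `torch.Tensor` and NumPy inputs.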
🧹 Nitpick comments (2)
monai/transforms/croppad/functional.py (2)

102-106: "value" in the error keyword list is too broad.

The substring "value" matches the word in many RuntimeError messages unrelated to dtype/padding support (e.g., "value out of range", "expected value"), silently routing them to the numpy fallback instead of re-raising. With more dtypes now going through _pt_pad, this heuristic fires more frequently.

♻️ Tighten the keyword match
-        if isinstance(err, NotImplementedError) or any(
-            k in str(err) for k in ("supported", "unexpected keyword", "implemented", "value")
-        ):
+        if isinstance(err, NotImplementedError) or any(
+            k in str(err) for k in ("supported", "unexpected keyword", "implemented")
+        ) or (isinstance(err, ValueError) and "value" in str(err)):
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@monai/transforms/croppad/functional.py` around lines 102 - 106, The current
except block in the padding fallback incorrectly treats any error message
containing the substring "value" as a cue to call _np_pad, which is too broad;
update the heuristic in the except (ValueError, TypeError, RuntimeError) handler
(the code path that falls back from _pt_pad to _np_pad) to remove the generic
"value" check and instead match narrower, relevant phrases such as "dtype",
"unsupported", "not supported", or "unsupported dtype"/"unsupported padding", or
rely only on NotImplementedError and explicit
"supported"/"implemented"/"unexpected keyword" matches so that unrelated
messages like "value out of range" are not swallowed and re-raised correctly.
Ensure the change references the same except block and keeps calling
_np_pad(img, pad_width=to_pad, mode=mode, **kwargs) only for genuinely
unsupported padding/dtype errors.

97-109: "value" in the error keyword list can silently swallow unrelated errors.

The fallback condition at line 104 matches any error message containing "value" — a very common substring in PyTorch error strings (e.g., "padding value out of range", "invalid value", etc.). With more dtypes now routing through _pt_pad, this heuristic fires more often, increasing the risk of masking a legitimate error and silently degrading to NumPy padding.

♻️ Suggested tightening
-        if isinstance(err, NotImplementedError) or any(
-            k in str(err) for k in ("supported", "unexpected keyword", "implemented", "value")
-        ):
+        if isinstance(err, NotImplementedError) or any(
+            k in str(err) for k in ("supported", "unexpected keyword", "implemented")
+        ) or "not implemented" in str(err).lower():

Or, more precisely, scope the "value" check to ValueError only:

-        if isinstance(err, NotImplementedError) or any(
-            k in str(err) for k in ("supported", "unexpected keyword", "implemented", "value")
-        ):
+        if isinstance(err, NotImplementedError) or any(
+            k in str(err) for k in ("supported", "unexpected keyword", "implemented")
+        ) or (isinstance(err, ValueError) and "value" in str(err)):
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@monai/transforms/croppad/functional.py` around lines 97 - 109, The fallback
logic currently treats any error message containing "value" as safe to fallback
to _np_pad, which can mask unrelated errors; update the except block so the
"value" substring check only applies when err is a ValueError. Concretely, in
the except handling around _np_pad/_pt_pad, change the conditional to something
like: if isinstance(err, NotImplementedError) or (isinstance(err, ValueError)
and any(k in str(err) for k in ("supported", "unexpected keyword",
"implemented", "value"))): so that only ValueError instances are allowed to
match "value" and then call _np_pad(img, pad_width=to_pad, mode=mode, **kwargs);
otherwise re-raise the original error.

- Removes hardcoded dtype exclusion list from pad_nd()
- Enables torch.nn.functional.pad for bool and integer dtypes
- Try-except fallback to numpy still handles unsupported combinations
- Fixes Project-MONAI#7842

Signed-off-by: skdas20 <skdas5405@gmail.com>
skdas20 force-pushed the fix/pad_nd-dtype-support branch from 85ed7a9 to 790cdaa on February 22, 2026 at 19:56
coderabbitai bot left a comment

🧹 Nitpick comments (1)
monai/transforms/croppad/functional.py (1)

102-106: Fragile error-message keyword matching now covers a wider surface.

The fallback from _pt_pad to _np_pad relies on the error message containing one of ("supported", "unexpected keyword", "implemented", "value"). PyTorch does not support non-float dtypes for reflect and replicate modes, and their errors historically contain "not implemented for", so "implemented" matches. But this is brittle — any future PyTorch error message that doesn't contain one of these keywords will surface as a confusing re-raised ValueError instead of transparently falling back. Consider whether a broader catch (e.g., catch all RuntimeError and always fall back) would be safer for _pt_pad failures, given the explicit intent is "try PyTorch, fall back to NumPy."

♻️ Proposed safer fallback
-    except (ValueError, TypeError, RuntimeError) as err:
-        if isinstance(err, NotImplementedError) or any(
-            k in str(err) for k in ("supported", "unexpected keyword", "implemented", "value")
-        ):
-            return _np_pad(img, pad_width=to_pad, mode=mode, **kwargs)
-        raise ValueError(
-            f"{img.shape} {to_pad} {mode} {kwargs} {img.dtype} {img.device if isinstance(img, torch.Tensor) else None}"
-        ) from err
+    except (ValueError, TypeError, RuntimeError) as err:
+        try:
+            return _np_pad(img, pad_width=to_pad, mode=mode, **kwargs)
+        except Exception:
+            raise ValueError(
+                f"{img.shape} {to_pad} {mode} {kwargs} {img.dtype} {img.device if isinstance(img, torch.Tensor) else None}"
+            ) from err

Note: this also changes pre-existing behaviour, so validate carefully.
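The "try PyTorch, fall back to NumPy on any failure" shape proposed above can be sketched with stand-in functions. The `_pt_pad`/`_np_pad` below are hypothetical stubs operating on lists, not MONAI's tensor implementations:

```python
# Hypothetical sketch of the broader try/except fallback proposed above.
# _pt_pad and _np_pad are stand-ins, not MONAI's implementations.
def _pt_pad(x, mode):
    if mode != "constant":
        # Mimics PyTorch rejecting an unsupported mode/dtype combination.
        raise RuntimeError(f'"{mode}" not implemented for this dtype')
    return x + ["pt-padded"]

def _np_pad(x, mode):
    return x + ["np-padded"]

def pad(x, mode):
    try:
        return _pt_pad(x, mode)
    except (ValueError, TypeError, RuntimeError) as err:
        try:
            # Always attempt the NumPy fallback; only raise if it also fails.
            return _np_pad(x, mode)
        except Exception:
            raise ValueError(f"padding failed for mode={mode!r}") from err

print(pad([], "constant"))  # ['pt-padded']
print(pad([], "reflect"))   # ['np-padded']
```

Note that `NotImplementedError` is a subclass of `RuntimeError`, so this shape still catches the case the original keyword check handled explicitly.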

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@monai/transforms/croppad/functional.py` around lines 102 - 106, The current
fragile keyword check in the except block for _pt_pad should be replaced so that
all RuntimeError failures from _pt_pad (and NotImplementedError) unconditionally
fall back to _np_pad, while still re-raising other ValueError/TypeError cases;
update the except block around the call to _pt_pad to catch (ValueError,
TypeError, RuntimeError) as err and if isinstance(err, (NotImplementedError,
RuntimeError)): return _np_pad(img, pad_width=to_pad, mode=mode, **kwargs) else:
raise to preserve existing behavior for plain ValueError/TypeError.

ericspod (Member) commented:

Hi @skdas20, we have a previous PR #8672 that includes this fix. I am looking at that one now, as we've left it sitting for too long; sorry if this was a fix you needed.
