feat(autojac): Add tangents to backward and jac #562
Conversation
Force-pushed from bb135d6 to ff0cb23
@claude Review this.
Claude finished @ValerianRey's task in 1m 14s

Code Review

Summary
This PR adds a new `jac_tensors` parameter to `backward`, allowing custom Jacobians to be backpropagated instead of the identity matrix.

Issues Found

1. 🐛 Bug in Example 2 Documentation (Critical)
Location: the second example in the `backward` docstring. The example claims to demonstrate using `jac_tensors`, but never actually passes it:

>>> J1 = torch.tensor([1.0, 0.0])
>>> J2 = torch.tensor([0.0, 1.0])
>>>
>>> backward([y1, y2])  # ❌ Should be: backward([y1, y2], jac_tensors=[J1, J2])

The variables `J1` and `J2` are defined but never used.
Impact: users following this example won't understand how to actually use `jac_tensors`.

2. 📝 Typo in Example 3 Documentation
Location: the third example in the `backward` docstring. A `>` is missing from one doctest prompt:

>>> from torchjd.autojac import backward
>>  # ❌ Should be: >>>

3. 🧹 Incomplete Comment
Location: the validation code.

# correspond to the shape of the corresponding element in tensors_

This comment is a sentence fragment. It should probably be:

# Check that jac_tensors correspond to the shape of the corresponding element in tensors_

4. 🔍 Removed Test Could Be Valuable
Location: the test suite. The PR removes an existing test.
Recommendation: consider whether the transform composition logic still needs validation, even if the helper is gone.

Positive Observations
✅ Excellent test coverage: comprehensive tests were added.
✅ Documentation examples are tested: all three docstring examples have corresponding tests.
✅ Type hints are correct: properly used throughout.
✅ Mathematical documentation is clear: the explanation of the matrix product is easy to follow.
✅ Error handling: proper validation with informative error messages.

Minor Suggestions

Adherence to Project Standards

Recommendation
Request changes to fix the critical bug in Example 2. The other issues are minor but should be addressed for code quality.
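For concreteness, here is a runnable sketch of the corrected Example 2. Only the import line and the final call come from the review above; `x`, `y1` and `y2` are assumed setup, since the full doctest is not visible in this thread.

```python
import torch
from torchjd.autojac import backward  # import shown in this PR's doctest

# Assumed setup: one parameter tensor and two scalar outputs.
x = torch.tensor([1.0, 2.0], requires_grad=True)
y1 = (x ** 2).sum()
y2 = (x ** 3).sum()

# Rows of the 2x2 identity matrix, one per output tensor. Each has shape (m,)
# with m = 2 rows in the resulting Jacobian: the scalar shape of y1 and y2
# plus one leading dimension.
J1 = torch.tensor([1.0, 0.0])
J2 = torch.tensor([0.0, 1.0])

# The corrected call from the review: jac_tensors is actually passed. Since
# J1 and J2 stack to the identity, this is equivalent to the default None.
backward([y1, y2], jac_tensors=[J1, J2])
```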
@PierreQuinton I think claude caught a few interesting things. I'll make a proper review later.
@claude I think we are supposed to add an entry to the changelog for this, as it changes the public interface (backward is a public function). Could you suggest one?

Claude finished @PierreQuinton's task in 35s

Suggesting Changelog Entry
Add a jac_outputs parameter to the jac function, similar to how jac_tensors was added to backward. The parameter allows users to backpropagate custom Jacobians instead of the identity matrix, enabling chained Jacobian computations via the chain rule.

- Add jac_outputs parameter after inputs in jac signature
- Extract _create_jac_outputs_dict helper function
- Refactor _create_transform to only create Jac transform
- Add third example showing chain rule usage with jac_outputs
- Add comprehensive tests for jac_outputs functionality

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
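The chain-rule mechanism this commit message refers to can be illustrated with stock PyTorch, independently of the torchjd API. Everything below (names, values) is invented for illustration; it relies only on `torch.autograd.grad` and its `grad_outputs` parameter, the single-row analogue of what `jac_outputs` provides.

```python
import torch

# To get the Jacobian of z w.r.t. x through an intermediate y, backpropagate
# each row of dz/dy as a custom "initial Jacobian" instead of the identity.
x = torch.tensor([1.0, 2.0], requires_grad=True)
y = x ** 2                                 # dy/dx = diag(2x) = diag([2, 4])
dz_dy = torch.tensor([[1.0, 1.0],          # row 1: gradient of z1 = y.sum()
                      [3.0, 3.0]])         # row 2: gradient of z2 = (3 * y).sum()

rows = []
for r in dz_dy:  # one vector-Jacobian product per row of the outer Jacobian
    (g,) = torch.autograd.grad(y, x, grad_outputs=r, retain_graph=True)
    rows.append(g)
dz_dx = torch.stack(rows)

print(dz_dx)  # tensor([[2., 4.], [6., 12.]]) == dz_dy @ diag([2, 4])
```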
This reverts commit cdb2a31.
Introduce check_consistent_first_dimension in _utils.py to validate that all Jacobians have the same first dimension. This function is now used by backward, jac, and jac_to_grad, replacing duplicate validation logic. Error messages now include the specific parameter name (jac_tensors, jac_outputs, or tensors.jac) to help users identify the source of dimension mismatches. Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
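A sketch of what such a shared validator could look like. The function name comes from the commit message above, but the signature and body are assumptions, not the actual code in _utils.py:

```python
from collections.abc import Sequence
from torch import Tensor

def check_consistent_first_dimension(jacobians: Sequence[Tensor], param_name: str) -> None:
    # Hypothetical reconstruction: verify that every Jacobian shares the same
    # first (row) dimension, naming the offending parameter (jac_tensors,
    # jac_outputs or tensors.jac) as the commit message describes.
    first_dims = {j.shape[0] for j in jacobians}
    if len(first_dims) > 1:
        raise ValueError(
            f"All elements of `{param_name}` must have the same first dimension, "
            f"but found first dimensions {sorted(first_dims)}."
        )
```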
Validates that jac_outputs and jac_tensors have the same length as their corresponding outputs/tensors before processing, providing clear error messages with actual lengths. Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
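In the same hypothetical style, the length check described here could look like this; the helper name and signature are invented, and only the behavior (a clear error reporting both actual lengths) comes from the commit message:

```python
from collections.abc import Sequence
from torch import Tensor

def check_matching_length(
    jacobians: Sequence[Tensor], tensors: Sequence[Tensor], param_name: str
) -> None:
    # Hypothetical: fail early, with both actual lengths in the message.
    if len(jacobians) != len(tensors):
        raise ValueError(
            f"`{param_name}` must have the same length as its corresponding "
            f"tensors ({len(tensors)}), but has length {len(jacobians)}."
        )
```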
@claude please review again.
Claude finished @ValerianRey's task in 1m 40s

Code Review

Summary
This PR successfully adds a `jac_tensors` parameter to `backward` and a `jac_outputs` parameter to `jac`.

Status of Previous Review Issues
✅ Fixed: Example 2 Bug - the second example in the `backward` docstring now passes `jac_tensors`.
✅ Fixed: Typo in Example 3 - no longer present (Example 3 doesn't exist in the current version).
✅ Fixed: Incomplete Comment - the comment issue has been resolved. The validation logic is now clearer.
✅ Addressed: Test Removal - the concern about the removed test has been addressed.

Current Issues

1. 📝 Missing Documentation: Consistent First Dimension Requirement
Issue: the documentation for `jac_tensors` doesn't state that all its elements must have the same first dimension.
Current text (lines 30-33):

:param jac_tensors: The initial Jacobians to backpropagate, analog to the `grad_tensors`
parameter of `torch.autograd.backward`. If provided, it must have the same structure as
``tensors`` and each tensor in ``jac_tensors`` must match the shape of the corresponding
tensor in ``tensors``, with an extra leading dimension representing the number of rows of
the resulting Jacobian (e.g. the number of losses). If ``None``, defaults to the identity
matrix. In this case, the standard Jacobian of ``tensors`` is computed, with one row for
each value in the ``tensors``.

Suggested fix: add a sentence clarifying that the first dimension must be consistent:

:param jac_tensors: The initial Jacobians to backpropagate, analog to the `grad_tensors`
parameter of `torch.autograd.backward`. If provided, it must have the same structure as
``tensors`` and each tensor in ``jac_tensors`` must match the shape of the corresponding
tensor in ``tensors``, with an extra leading dimension representing the number of rows of
the resulting Jacobian (e.g. the number of losses). All tensors in ``jac_tensors`` must
have the same first dimension. If ``None``, defaults to the identity matrix. In this case,
the standard Jacobian of ``tensors`` is computed, with one row for each value in the
``tensors``.

Same issue in the documentation of ``jac_outputs`` in ``jac``.
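To make the shape contract in this parameter description concrete, here is a small sketch. The tensor shapes are invented; only the rule (one extra leading dimension of a shared size m) comes from the docstring above.

```python
import torch

# Two hypothetical differentiated tensors.
t1 = torch.ones(3)       # shape (3,)
t2 = torch.ones(2, 2)    # shape (2, 2)

m = 4  # number of rows of the resulting Jacobian (e.g. the number of losses)

# Each element of jac_tensors has its tensor's shape plus a leading dimension
# of size m, and that first dimension must be the same for all elements.
jac_tensors = [torch.ones(m, 3), torch.ones(m, 2, 2)]

assert all(j.shape == (m, *t.shape) for j, t in zip(jac_tensors, [t1, t2]))
```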
2. 🔍 Minor: Confusing Example 2 Description
Issue: the description says "we explicitly specify the identity":

J1 = torch.tensor([1.0, 0.0])
J2 = torch.tensor([0.0, 1.0])

These aren't technically "the identity" - they're the rows of an identity matrix. When used together, they form an identity transformation, but individually they're not identity matrices. The phrasing could be clearer.

Suggested improvement:

This is the same example as before, except that we explicitly specify ``jac_tensors`` as
the rows of the identity matrix (which is equivalent to using the default ``None``).
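The reviewer's point can be checked numerically; this trivial sketch uses nothing torchjd-specific:

```python
import torch

J1 = torch.tensor([1.0, 0.0])  # a row of the identity, not an identity matrix
J2 = torch.tensor([0.0, 1.0])

# Only stacked together do the rows form the 2x2 identity transformation.
assert torch.equal(torch.stack([J1, J2]), torch.eye(2))
```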
3. ✅ Validation Tests Are Excellent
The PR adds comprehensive validation with helpful error messages. All three validations have corresponding unit tests.

4. ✅ Test Coverage Is Comprehensive
The PR includes excellent test coverage.

5. ✅ Changelog Entries Are Good
The changelog entries (lines 16-20) accurately describe the new functionality and correctly reference the analogous PyTorch functions.

Code Quality Observations
✅ Type hints: properly used throughout.
✅ Consistent API: both `backward` and `jac` follow the same pattern.
✅ Mathematical correctness: the documented matrix product is correct.
✅ Error messages: clear and informative, specifying parameter names and shapes.
✅ Helper functions: shared validation helpers replace duplicated logic.

Recommendations

Verdict
Request minor changes to add the missing documentation about the consistent first dimension requirement. Otherwise, this is high-quality work with excellent test coverage.
