# [NVBUG 5801937] Disable dq_only by default #777
## Conversation
**Important:** Review skipped. Auto-incremental reviews are disabled on this repository. Please check the settings in the CodeRabbit UI; this status message can also be disabled there.

📝 **Walkthrough:** The default value of the `dq_only` flag in ModelOpt ONNX quantization changes, so full Q/DQ (Quantize/Dequantize) node pairs are now inserted by default instead of Dequantize-only nodes (a conceptual sketch of the two insertion styles follows this comment).
Estimated code review effort: 🎯 1 (Trivial) | ⏱️ ~2 minutes. 🚥 Pre-merge checks: ✅ 3 of 3 passed.
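To make the walkthrough concrete, here is a minimal, purely conceptual sketch of the two insertion styles the `dq_only` flag toggles. It uses plain ONNX helper calls, not ModelOpt internals, and all tensor and node names are illustrative.

```python
# Conceptual sketch (not ModelOpt code): the two insertion styles toggled by dq_only.
# With dq_only enabled, a pre-quantized weight is wired through a single
# DequantizeLinear; with dq_only disabled (the new default), a full
# QuantizeLinear -> DequantizeLinear pair is inserted instead.
from onnx import helper

# DQ-only style: quantized weight -> DequantizeLinear -> consumer
dq_only_node = helper.make_node(
    "DequantizeLinear",
    inputs=["weight_quantized", "weight_scale", "weight_zero_point"],
    outputs=["weight_dequantized"],
    name="weight_dq",
)

# Q/DQ style: float tensor -> QuantizeLinear -> DequantizeLinear -> consumer
q_node = helper.make_node(
    "QuantizeLinear",
    inputs=["weight_fp32", "weight_scale", "weight_zero_point"],
    outputs=["weight_q"],
    name="weight_quantize",
)
dq_node = helper.make_node(
    "DequantizeLinear",
    inputs=["weight_q", "weight_scale", "weight_zero_point"],
    outputs=["weight_qdq"],
    name="weight_dequantize",
)
```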
Signed-off-by: ajrasane <131806219+ajrasane@users.noreply.github.com>
Force-pushed from b1cd3f6 to 43cc715.
**Codecov Report:** ✅ All modified and coverable lines are covered by tests.

Additional details and impacted files:

| Coverage Diff | main | #777 | +/- |
| --- | --- | --- | --- |
| Coverage | 74.62% | 74.19% | -0.44% |
| Files | 192 | 192 | |
| Lines | 18989 | 18977 | -12 |
| Hits | 14171 | 14080 | -91 |
| Misses | 4818 | 4897 | +79 |

☔ View full report in Codecov by Sentry.
## What does this PR do?

**Type of change:** Bug fix

**Overview:** Disable the `dq_only` flag by default in ModelOpt ONNX quantization (see the usage sketch after this description).

## Testing

Able to build and run a model with the ModelOpt ONNX Python CLI.

- **Make sure you read and follow the [Contributor guidelines](https://github.com/NVIDIA/Model-Optimizer/blob/main/CONTRIBUTING.md)** and your commits are signed.
- **Is this change backward compatible?**: No; `dq_only` is now set to False by default.
- **Did you write any new necessary tests?**: No
- **Did you add or update any necessary documentation?**: No
- **Did you update the [Changelog](https://github.com/NVIDIA/Model-Optimizer/blob/main/CHANGELOG.rst)?**: No

## Summary by CodeRabbit

### Release Notes

- **Chores**
  - Updated quantization default behavior: Q/DQ (Quantize/Dequantize) nodes are now added by default instead of only Dequantize nodes.

Signed-off-by: ajrasane <131806219+ajrasane@users.noreply.github.com>
Signed-off-by: Jingyu Xin <jingyux@nvidia.com>
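Since the default has flipped, callers who relied on the old behavior would need to request it explicitly. The hedged sketch below illustrates this: the `dq_only` keyword name comes from this PR, while the surrounding `quantize()` entry point and its other parameter names (`onnx_path`, `quantize_mode`, `output_path`) and the file paths are assumptions that should be checked against the installed ModelOpt version.

```python
# Sketch only: parameter names other than dq_only are assumptions and may
# differ between ModelOpt versions; verify against the installed API.
from modelopt.onnx.quantization import quantize

quantize(
    onnx_path="model.onnx",          # illustrative input model path
    quantize_mode="int8",            # assumed quantization-mode argument
    output_path="model.quant.onnx",  # illustrative output path
    dq_only=True,                    # explicitly opt back into DQ-only insertion
)
```

Leaving `dq_only` unset now produces full Q/DQ pairs, which is the behavior this PR makes the default.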