
[https://nvbugs/5845769][fix] B300(sm103) support on VLMs#11274

Merged
MartinMarciniszyn merged 2 commits into NVIDIA:release/1.2 from yechank-nvidia:qwen3vl_b300
Feb 5, 2026

Conversation

@yechank-nvidia
Collaborator

@yechank-nvidia yechank-nvidia commented Feb 4, 2026

Summary by CodeRabbit

  • Chores
    • Improved GPU kernel compatibility for additional hardware configurations.

Signed-off-by: yechank <161688079+yechank-nvidia@users.noreply.github.com>
@yechank-nvidia yechank-nvidia self-assigned this Feb 4, 2026
@yechank-nvidia yechank-nvidia requested a review from a team as a code owner February 4, 2026 10:20
@yechank-nvidia yechank-nvidia changed the title from "B300 support on VLMs" to "[https://nvbugs/5845769][fix] B300 support on VLMs" Feb 4, 2026
@coderabbitai
Contributor

coderabbitai bot commented Feb 4, 2026

📝 Walkthrough

Walkthrough

Adds a conditional SM value remapping in getXMMAKernelsV2 where kSM_103 is remapped to kSM_100 before kernel retrieval, mirroring the existing kSM_121 to kSM_120 mapping. The change adjusts SM selection without altering function signatures or the surrounding control flow.

Changes

Cohort / File(s): SM Kernel Remapping — cpp/tensorrt_llm/kernels/contextFusedMultiHeadAttention/fused_multihead_attention_v2.cpp
Summary: Added a conditional to remap kSM_103 to kSM_100 in the getXMMAKernelsV2 function, following the existing SM architecture remapping pattern.
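
For illustration, here is a minimal, self-contained sketch of the remapping pattern the walkthrough describes. The constant values, the namespace, and the helper name remapSmForKernelLookup are assumptions made for this sketch; the PR page does not show the actual getXMMAKernelsV2 code, only that kSM_103 is mapped onto kSM_100 the same way kSM_121 is already mapped onto kSM_120.

```cpp
// Hypothetical, self-contained sketch of the SM remapping described in this PR.
// It does not reproduce the real getXMMAKernelsV2; the constant values and the
// helper below are illustrative assumptions.
#include <cstdint>

namespace sketch
{
constexpr uint32_t kSM_100 = 100;
constexpr uint32_t kSM_103 = 103;
constexpr uint32_t kSM_120 = 120;
constexpr uint32_t kSM_121 = 121;

// Map an SM value onto the SM whose prebuilt kernels it reuses, before lookup.
inline uint32_t remapSmForKernelLookup(uint32_t sm)
{
    if (sm == kSM_121)
    {
        // Existing pattern: sm121 reuses the sm120 kernels.
        sm = kSM_120;
    }
    if (sm == kSM_103)
    {
        // Added by this PR: B300 (sm103) reuses the sm100 kernels.
        sm = kSM_100;
    }
    return sm;
}
} // namespace sketch
```

The practical effect inside getXMMAKernelsV2 is that a request for sm103 kernels is served from the sm100 kernel set, mirroring how sm121 requests are already served from the sm120 set.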

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

🚥 Pre-merge checks: ✅ 1 passed | ❌ 2 failed

❌ Failed checks (2 warnings)

Docstring Coverage ⚠️ Warning — Docstring coverage is 0.00%, which is below the required threshold of 80.00%. Resolution: write docstrings for the functions that are missing them to satisfy the coverage threshold.

Description check ⚠️ Warning — The PR description is empty except for the template comments; it lacks the required sections, including a description of the issue/solution, test coverage information, and the PR checklist. Resolution: add a meaningful description explaining the B300 support changes, list the relevant test coverage, and complete the PR checklist items to meet repository standards.

✅ Passed checks (1)

Title check ✅ Passed — The title follows the required template format with the [https://nvbugs/5845769][fix] prefix and clearly describes the change: adding B300 (sm103) support for VLMs.


Signed-off-by: yechank <161688079+yechank-nvidia@users.noreply.github.com>
@yechank-nvidia
Collaborator Author

/bot run

@tensorrt-cicd
Collaborator

PR_Github #34758 [ run ] triggered by Bot. Commit: 30a8c0c

@tensorrt-cicd
Collaborator

PR_Github #34788 [ run ] triggered by Bot. Commit: 30a8c0c

@tensorrt-cicd
Collaborator

PR_Github #34788 [ run ] completed with state SUCCESS. Commit: 30a8c0c
/LLM/release-1.2/L0_MergeRequest_PR pipeline #318 completed with status: 'FAILURE'

⚠️ Action Required:

  • Please check the failed tests and fix your PR
  • If you cannot view the failures, ask the CI triggerer to share details
  • Once fixed, request an NVIDIA team member to trigger CI again

@yechank-nvidia
Collaborator Author

/bot run

@tensorrt-cicd
Collaborator

PR_Github #34864 [ run ] triggered by Bot. Commit: 30a8c0c

@yechank-nvidia yechank-nvidia added the Multimodal label (Label for issues & PRs regarding Multimodal related objects) Feb 5, 2026
@tensorrt-cicd
Collaborator

PR_Github #34864 [ run ] completed with state SUCCESS. Commit: 30a8c0c
/LLM/release-1.2/L0_MergeRequest_PR pipeline #327 completed with status: 'FAILURE'

⚠️ Action Required:

  • Please check the failed tests and fix your PR
  • If you cannot view the failures, ask the CI triggerer to share details
  • Once fixed, request an NVIDIA team member to trigger CI again

@yechank-nvidia
Collaborator Author

/bot run

@tensorrt-cicd
Collaborator

PR_Github #34898 [ run ] triggered by Bot. Commit: 30a8c0c

@tensorrt-cicd
Collaborator

PR_Github #34898 [ run ] completed with state SUCCESS. Commit: 30a8c0c
/LLM/release-1.2/L0_MergeRequest_PR pipeline #330 completed with status: 'SUCCESS'

@yechank-nvidia yechank-nvidia changed the title from "[https://nvbugs/5845769][fix] B300 support on VLMs" to "[https://nvbugs/5845769][fix] B300(sm103) support on VLMs" Feb 5, 2026
@MartinMarciniszyn MartinMarciniszyn merged commit baa2abf into NVIDIA:release/1.2 Feb 5, 2026
7 checks passed

Labels

Multimodal — Label for issues & PRs regarding Multimodal related objects
