Fix Republish Released Docker Images job #36954
Conversation
Assigning reviewers: R: @damccorm for label python. The PR bot will only process comments in the main thread (not review comments).
```yaml
nvidia-nvjitlink-cu12:
  license: "https://raw.githubusercontent.com/NVIDIA/cccl/refs/heads/main/LICENSE"
nvidia-nvtx-cu12:
  license: "https://raw.githubusercontent.com/NVIDIA/cccl/refs/heads/main/LICENSE"
```
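For context, a minimal sketch of how such a pinned-URL mapping can be consumed, assuming the flat `package: {license: url}` structure shown above (illustrative only; Beam's actual license_scripts logic differs in detail):

```python
# Illustrative sketch, not Beam's actual script: fetch the manually
# pinned license for each package in dep_urls_py.yaml. The real tooling
# pulls most licenses from package metadata automatically and uses a
# file like this only for packages it cannot resolve.
import urllib.request

import yaml  # PyYAML

with open("sdks/python/container/license_scripts/dep_urls_py.yaml") as f:
    dep_urls = yaml.safe_load(f)

for package, meta in dep_urls.items():
    with urllib.request.urlopen(meta["license"]) as resp:
        text = resp.read().decode("utf-8", errors="replace")
    print(f"{package}: fetched {len(text)} bytes of license text")
```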
Why do we need these? If we're pulling in NVIDIA licenses at any point, something has gone wrong since all released artifacts need to be Apache license compliant
gpu_image_requirements.txt contains NVIDIA dependencies, and the job fails with this error:

RuntimeError: Could not retrieve licences for packages ['nvidia-cublas-cu12', 'nvidia-cuda-cupti-cu12', 'nvidia-cuda-nvrtc-cu12', 'nvidia-cuda-runtime-cu12', 'nvidia-cudnn-cu12', 'nvidia-cufft-cu12', 'nvidia-cufile-cu12', 'nvidia-curand-cu12', 'nvidia-cusolver-cu12', 'nvidia-cusparse-cu12', 'nvidia-nccl-cu12', 'nvidia-nvjitlink-cu12', 'nvidia-nvtx-cu12', 'tokenizers', 'triton'] in Python3.12 environment. These licenses were not able to be pulled automatically. Please search code source of the dependencies on the internet and add urls to RAW license file at sdks/python/container/license_scripts/dep_urls_py.yaml for each missing license and rerun the test.
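For reference, a quick way to see what license metadata these packages actually expose on PyPI (a minimal standalone sketch, not part of Beam's license_scripts):

```python
# Minimal sketch: query PyPI's JSON API for a package's license
# metadata, to see why the automatic license pull fails for it.
import json
import urllib.request

def pypi_license_info(package: str) -> dict:
    """Return license-related fields from a package's PyPI metadata."""
    url = f"https://pypi.org/pypi/{package}/json"
    with urllib.request.urlopen(url) as resp:
        info = json.load(resp)["info"]
    return {
        "license": info.get("license"),
        "license_classifiers": [
            c for c in info.get("classifiers", []) if c.startswith("License ::")
        ],
    }

print(pypi_license_info("nvidia-nvtx-cu12"))
```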
How should we proceed?
Where are we using gpu_image_requirements.txt? Right now, we should just be pushing the requirements file, not using it beyond that.
I see, it's because the release-2.69.0-postrelease branch had NVIDIA requirements in base_image_requirements.txt, and by default the workflow runs against that branch.

I've updated the PR; I think that should be enough to fix the workflow. Please review.
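For context, a hypothetical sketch of how a release branch can end up as the default target of a manually triggered workflow (the input name and default below are illustrative, not copied from the actual Beam workflow file):

```yaml
# Illustrative only: a workflow_dispatch input whose default points at
# the release branch, so manual runs target it unless overridden.
on:
  workflow_dispatch:
    inputs:
      release_branch:
        description: "Branch to republish Docker images from"
        required: true
        default: "release-2.69.0-postrelease"
```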
damccorm left a comment:

> Where are we using gpu_image_requirements.txt? Right now, we should just be pushing the requirements file, not using it beyond that.

Thanks!
Fixes #33834
Thank you for your contribution! Follow this checklist to help us incorporate your contribution quickly and easily:

- Mention the appropriate issue in your description (for example: `addresses #123`), if applicable. This will automatically add a link to the pull request in the issue. If you would like the issue to automatically close on merging the pull request, comment `fixes #<ISSUE NUMBER>` instead.
- Update `CHANGES.md` with noteworthy changes.

See the Contributor Guide for more tips on how to make the review process smoother.
To check the build health, please visit https://github.com/apache/beam/blob/master/.test-infra/BUILD_STATUS.md
GitHub Actions Tests Status (on master branch)
See CI.md for more information about GitHub Actions CI or the workflows README to see a list of phrases to trigger workflows.