ci: externalize release workflow helper scripts #128
nobodyiam merged 5 commits into apolloconfig:main from
Conversation
📝 Walkthrough

Adds CI automation for Maven releases: four new Python scripts to extract uploaded artifact URLs, resolve OSSRH staging repository context, orchestrate/poll Sonatype publishing, and write a release summary; plus new and updated GitHub Actions workflows to run and coordinate these steps.

Changes
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant GHA as GitHub Actions
    participant Maven as Maven
    participant Extract as Extract Script
    participant Resolve as Resolve Script
    participant Publish as Sonatype Publish Script
    participant OSSRH as OSSRH Staging API
    participant Portal as Sonatype Portal
    GHA->>Maven: run mvn clean deploy (logs -> maven-deploy.log)
    GHA->>Extract: run release_extract_upload_context.py
    Extract-->>GHA: deploy-artifacts.json + counts -> GITHUB_OUTPUT
    GHA->>Resolve: run release_resolve_repository_context.py
    Resolve->>OSSRH: query staging repos (auth)
    OSSRH-->>Resolve: candidate repos
    Resolve-->>GHA: repository-context.json -> GITHUB_OUTPUT
    GHA->>Publish: run sonatype_publish.py (repo_key/creds)
    Publish->>OSSRH: locate/list repository
    Publish->>Portal: trigger publish / poll deployment status
    Portal-->>Publish: deployment state updates
    Publish-->>GHA: final outputs & summary
```
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 5
🧹 Nitpick comments (5)
.github/workflows/sonatype-publish.yml (1)
34-37: No validation of `mode` input value.

The `mode` input accepts free text, but only `portal_api`, `automatic`, and `user_managed` are meaningful. Consider using a `type: choice` input to constrain it.

Proposed fix:

```diff
       mode:
         description: 'manual upload mode: portal_api|automatic|user_managed'
         required: true
-        default: 'portal_api'
+        default: 'portal_api'
+        type: choice
+        options:
+          - portal_api
+          - automatic
+          - user_managed
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/workflows/sonatype-publish.yml around lines 34 - 37, Replace the free-text "mode" input with a choice-constrained input so only the allowed values are accepted: change the "mode" input declaration (the "mode" field) to use type: choice and specify choices: ['portal_api','automatic','user_managed'] and keep the existing default 'portal_api'; this will validate CI input and prevent invalid values for the mode parameter used elsewhere.

.github/scripts/release_extract_upload_context.py (2)
35-42: Redundant `.jar.asc`/`.pom.asc` exclusion checks.

A URL ending with `.jar.asc` will never pass `url.endswith(".jar")`, so the `not url.endswith(".jar.asc")` guard is always `True` (same for `.pom`/`.pom.asc`). Harmless but unnecessary.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/release_extract_upload_context.py around lines 35 - 42, The list comprehensions that build jar_urls and pom_urls include redundant exclusion checks (not url.endswith(".jar.asc") and not url.endswith(".pom.asc")) because a string that ends with ".jar.asc" cannot also end with ".jar"; update the comprehensions that generate jar_urls and pom_urls (which iterate over deduped_urls) to only check url.endswith(".jar") and url.endswith(".pom") respectively, removing the unnecessary .asc guards so the behavior is preserved but the expressions are simplified.
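The redundancy is easy to confirm in isolation: `str.endswith` tests only the suffix, so the two guards can never disagree. A minimal stand-alone illustration (the URL values below are made up; this is not the script's actual comprehension):

```python
# A URL ending in ".jar.asc" cannot simultaneously end in ".jar",
# so the extra ".jar.asc" exclusion never changes the result set.
urls = [
    "https://repo.example/com/demo/demo-1.0.jar",
    "https://repo.example/com/demo/demo-1.0.jar.asc",
]

with_guard = [u for u in urls if u.endswith(".jar") and not u.endswith(".jar.asc")]
without_guard = [u for u in urls if u.endswith(".jar")]

assert with_guard == without_guard == ["https://repo.example/com/demo/demo-1.0.jar"]
```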
12-17: `write_output` is duplicated across three scripts.

The identical helper appears in `release_extract_upload_context.py`, `release_resolve_repository_context.py`, and `sonatype_publish.py`. Consider extracting it into a shared module (e.g., `.github/scripts/_ci_utils.py`) to reduce maintenance surface.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/release_extract_upload_context.py around lines 12 - 17, The helper function write_output is duplicated; extract it into a single shared module (e.g., create _ci_utils.py) that defines def write_output(key: str, value: str) -> None with the same logic (read GITHUB_OUTPUT env, return if empty, append f"{key}={value}\n" using UTF-8), then replace the local definitions in release_extract_upload_context.py, release_resolve_repository_context.py, and sonatype_publish.py with a single import from that module (e.g., from _ci_utils import write_output), leaving call sites unchanged and ensuring imports and packaging are updated so the scripts can find the new module.

.github/scripts/release_resolve_repository_context.py (2)
18-18: `request_json` signature differs from `sonatype_publish.py`.

This version hardcodes `method="GET"` and omits the `method` parameter, while `sonatype_publish.py` accepts `method` as an argument. If the shared `write_output` is extracted (as suggested earlier), consider also unifying `request_json` into a shared module with a consistent interface.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/release_resolve_repository_context.py at line 18, The request_json function currently lacks a method parameter and hardcodes method="GET", causing an inconsistent API with the version in sonatype_publish.py; update request_json to accept a method: str (default "GET") parameter and use it when building the request, and consider moving request_json into the shared module alongside the extracted write_output so both callers (request_json and the consumer in sonatype_publish.py) use the same unified function signature and behavior.
91-94: Network failures skip `search_candidates` tracking.

When `status is None` (network error), the loop continues without appending to `search_candidates`. This means the diagnostics context file won't reflect that a search was attempted but failed, making debugging harder.

Proposed fix:

```diff
             status, payload = request_json(url, headers)
             if status is None:
                 last_error = payload.get("error", "unknown error")
+                context["search_candidates"].append(
+                    {"state": state, "ip": ip, "count": 0, "error": last_error}
+                )
                 continue
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/release_resolve_repository_context.py around lines 91 - 94, When request_json(url, headers) returns status is None we currently continue without recording the attempt; modify the loop so that before continuing you append a diagnostic entry to search_candidates (include the url, status=None and an error message from payload.get("error","unknown error")) and update last_error accordingly, then continue—this ensures request_json failures are tracked; refer to the variables request_json, status, payload, search_candidates, and last_error to locate and update the logic.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/scripts/release_resolve_repository_context.py:
- Around line 18-34: The request_json function currently assumes a 2xx response
body is valid JSON and calls json.loads(body), which will raise JSONDecodeError
and crash when the OSSRH API returns non-JSON; update request_json to catch
json.JSONDecodeError (or ValueError) around json.loads(body) in the success path
(after reading body) and return response.status with a fallback payload such as
{"raw": body} (mirroring the behavior used in sonatype_publish.request_json) so
non-JSON responses are handled gracefully.
In @.github/scripts/sonatype_publish.py:
- Around line 166-171: The error message builds "HTTP None" when publish_status
is None; update the logic around publish_status and publish_payload to avoid
calling str(publish_status) when None and instead prefer
publish_payload['error'] or a safe status string (e.g., "HTTP unknown") when
publish_status is missing; change the construction of reason in the block that
checks publish_status (use publish_payload.get('error') first and only append a
formatted HTTP status when publish_status is not None) so you never format None
into the message.
- Around line 109-132: The upload failure path currently sets reason based on
upload_status/upload_payload but then continues into the repository-listing and
polling logic, which can overwrite reason or pick up stale deployment_id; modify
the flow in the function that calls request_json for upload (using variables
upload_status, upload_payload, reason, repository_key, deployment_id, list_url)
to short-circuit when a definitive upload error exists—i.e., after setting
reason when upload_status is None or >=400, immediately skip the repository
listing/polling block (guard it with if not reason or return/raise) so the
original upload failure is preserved and no further polling occurs.
In @.github/workflows/release.yml:
- Around line 42-52: Replace the deprecated GH Action versions: change
actions/checkout@v2 to actions/checkout@v4 and actions/setup-java@v1 to
actions/setup-java@v4; for setup-java@v4 add the required distribution input
(e.g., set distribution: temurin or corretto) alongside the existing
java-version and credential inputs (server-id, server-username, server-password,
gpg-private-key, gpg-passphrase) so the workflow uses supported action runtimes
and avoids failures on newer GitHub-hosted runners.
In @.github/workflows/sonatype-publish.yml:
- Around line 57-72: The Sonatype publish summary step currently injects step
outputs directly into the run script (e.g., `${{ steps.publish.outputs.reason
}}`), which risks shell injection; change the "Sonatype publish summary" step to
pass each output into environment variables (e.g., PUBLISH_RESULT: ${{
steps.publish.outputs.result }}, PUBLISH_REASON: ${{
steps.publish.outputs.reason }}, PUBLISH_REPO_KEY, PUBLISH_DEPLOYMENT_ID,
PUBLISH_FINAL_STATE, PUBLISH_DEPLOYMENT_URL) and then reference those env vars
inside the run block (e.g., echo "- result: $PUBLISH_RESULT") using proper
quoting and safe checks like if [ -n "$PUBLISH_REASON" ]; then ... to avoid
evaluating contents as shell code.
Codecov Report

✅ All modified and coverable lines are covered by tests.

Additional details and impacted files

```
@@             Coverage Diff             @@
##               main     #128      +/-  ##
============================================
+ Coverage     68.68%   71.01%   +2.32%
- Complexity     1503     1634     +131
============================================
  Files           212      224      +12
  Lines          6396     6727     +331
  Branches        647      679      +32
============================================
+ Hits           4393     4777     +384
+ Misses         1673     1600      -73
- Partials        330      350      +20
```

☔ View full report in Codecov by Sentry.
41ce491 to bd0375f
Actionable comments posted: 3
🧹 Nitpick comments (2)
.github/scripts/release_extract_upload_context.py (2)
25-30: `write_output` is copy-pasted verbatim across all four scripts

An identical `write_output` implementation exists in `release_resolve_repository_context.py`, `sonatype_publish.py`, and `release_write_summary.py`. Consider extracting it to a shared `.github/scripts/_shared.py` or `github_actions_utils.py` module and importing it in each script, reducing four copies to one.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/release_extract_upload_context.py around lines 25 - 30, The write_output function is duplicated across multiple scripts (write_output in release_extract_upload_context.py, release_resolve_repository_context.py, sonatype_publish.py, release_write_summary.py); extract this helper into a new shared module (e.g., .github/scripts/_shared.py or github_actions_utils.py) containing the write_output(key: str, value: str) -> None implementation, update each script to import write_output from that module, and remove the local duplicate definitions so the four scripts call the single shared write_output implementation.
43-43: Redundant `rstrip(":")`

The regex pattern `(\S+):` captures non-whitespace characters up to (but not including) the literal `:`, so `target_repo` already has no trailing colon. The `rstrip(":")` call is a no-op and can be removed.

♻️ Proposed cleanup

```diff
     for target_repo, url in pattern.findall(log_text):
-        normalized = target_repo.rstrip(":")
-        if normalized == repository_name:
+        if target_repo == repository_name:
             uploaded_urls.append(url)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/release_extract_upload_context.py at line 43, Remove the redundant rstrip call: the regex already strips the colon so set normalized directly from target_repo (replace the expression that assigns normalized using target_repo.rstrip(":") with a direct assignment using target_repo), and run tests/linters to confirm no behavior relies on trailing-colon trimming elsewhere; look for the assignment to the variable named normalized where target_repo is used and update it accordingly.
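The no-op can be checked with a stand-in pattern. The `Uploaded to` line format and the full pattern below are illustrative inventions; only the `(\S+):` capture mirrors the reviewed code:

```python
import re

# Illustrative pattern: the (\S+) group must stop before the literal ":",
# so the captured repository id never carries a trailing colon.
pattern = re.compile(r"Uploaded to (\S+): (\S+)")
log_text = "[INFO] Uploaded to ossrh: https://oss.example.org/repo/demo-1.0.jar (12 kB)"

for target_repo, url in pattern.findall(log_text):
    assert target_repo == "ossrh"                  # no trailing colon captured
    assert target_repo.rstrip(":") == target_repo  # rstrip(":") is a no-op here
```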
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/scripts/release_resolve_repository_context.py:
- Around line 107-140: The code silently ignores HTTP error responses from
request_json; update the loop handling after calling request_json(url, headers)
to treat non-None status that is not successful as an error: extract an error
string from payload (e.g., payload.get("error") or include the HTTP
status/message), set last_error to that value, and append it into
context["search_candidates"] (include the status and error fields instead of
leaving status/count empty). Ensure the logic that sets context["reason"] at the
end will see last_error so that HTTP errors (401/4xx/5xx) are reported instead
of returning "No staging repository key found"; reference the request_json call,
last_error variable, the context["search_candidates"] append site, and the final
selected/last_error usage to locate where to change the handling.
In @.github/scripts/sonatype_publish.py:
- Around line 106-121: The auto-detection loop for repository_key uses "_,
payload = request_json(...)" and ignores the HTTP status, so 4xx/5xx responses
become silent and lead to the misleading "No staging repository key found"
reason; update the loop that builds search_url (using OSSRH_BASE and namespace)
to inspect the HTTP status returned by request_json (or modify request_json to
return/raise on non-2xx), and on non-success responses include the status and
error body in the logged/returned message (or raise) instead of proceeding as if
no repositories were found; ensure repository_key remains unset only when a
successful 200 search returns no repositories and mirror the same error-handling
pattern used in release_resolve_repository_context.py.
In @.github/workflows/sonatype-publish.yml:
- Around line 62-84: The "Sonatype publish summary" job step should be
configured to always run so it executes even if the `publish` step fails; add an
`if: always()` condition to the step definition (the step with name "Sonatype
publish summary") directly under its name so the existing env and run block
remain unchanged and the summary is appended to `GITHUB_STEP_SUMMARY` regardless
of the exit status of `sonatype_publish.py`.
Actionable comments posted: 1
🧹 Nitpick comments (2)
.github/scripts/release_resolve_repository_context.py (1)
42-48: Minor inconsistency in `HTTPError` body handling vs. sibling script.

In `sonatype_publish.py` the body is read outside the try-block before being decoded, so a partial-read I/O error is surfaced rather than silently eaten, and a non-JSON body is preserved as `{"raw": body}`. Here `error.read()` is inside the same `try`, meaning any read-level exception collapses to `{"error": "HTTP {code}"}` and non-JSON error bodies are lost entirely.

♻️ Align with `sonatype_publish.py`:

```diff
-    except urllib.error.HTTPError as error:
-        try:
-            payload = json.loads(error.read().decode("utf-8"))
-        except Exception:  # noqa: BLE001
-            payload = {"error": f"HTTP {error.code}"}
-        payload.setdefault("error", f"HTTP {error.code}")
-        return error.code, payload
+    except urllib.error.HTTPError as error:
+        body = error.read().decode("utf-8")
+        try:
+            payload = json.loads(body) if body else {}
+        except json.JSONDecodeError:
+            payload = {"raw": body}
+        payload.setdefault("error", f"HTTP {error.code}")
+        return error.code, payload
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/release_resolve_repository_context.py around lines 42 - 48, Read and decode the HTTPError body before the JSON-parsing try-block so any read/IO exceptions propagate and non-JSON bodies are preserved; specifically, call body = error.read().decode("utf-8") before the try, then attempt payload = json.loads(body) inside a try/except that only catches JSON decode errors (e.g., ValueError) and sets payload = {"raw": body} on failure, then payload.setdefault("error", f"HTTP {error.code}") and return error.code, payload — keep using the existing error and payload variables to locate the code to change.

.github/scripts/sonatype_publish.py (1)
67-72: `write_output` is duplicated across three scripts.

The function body is identical in `sonatype_publish.py:67-72`, `release_resolve_repository_context.py:53-58`, and `release_extract_upload_context.py:24-29`. A shared `_github_utils.py` (or similar) would eliminate the duplication.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/sonatype_publish.py around lines 67 - 72, The write_output function is duplicated across modules; extract it into a single shared helper module (e.g., create a new module _github_utils with a write_output(key: str, value: str) -> None) and replace the local definitions in sonatype_publish, release_resolve_repository_context, and release_extract_upload_context to import and call the shared write_output. Ensure the new helper uses os.environ.get("GITHUB_OUTPUT", "").strip() and opens the path with encoding="utf-8" in append mode to preserve current behavior and return early when the env var is empty.
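Under the behavior the prompt describes, the shared helper might look like this sketch (the module name and docstring wording are assumptions; the logic follows the description above):

```python
import os


def write_output(key: str, value: str) -> None:
    """Append a key=value pair to the file named by GITHUB_OUTPUT, if set.

    GitHub Actions reads this file to expose step outputs; when the variable
    is absent or blank (e.g., local runs), the call is a silent no-op.
    """
    output_path = os.environ.get("GITHUB_OUTPUT", "").strip()
    if not output_path:
        return
    with open(output_path, "a", encoding="utf-8") as handle:
        handle.write(f"{key}={value}\n")
```

Each script would then `from _github_utils import write_output` instead of redefining it.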
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/scripts/sonatype_publish.py:
- Around line 146-171: The GET that lists repositories currently ignores its
HTTP status by using "_, list_payload = request_json(...)" so transient failures
or auth issues are swallowed and later reported as "No portal deployment id
found"; update the call to capture and check the status (e.g., "list_status,
list_payload = request_json('GET', list_url, headers)"), and if list_status is
None or >= 400 set/raise a clear error reason that includes list_status and any
error from list_payload before proceeding to parse repositories; ensure this
logic touches the variables request_json, list_url, list_status/list_payload,
repositories, and deployment_id so failures are surfaced instead of returning an
empty repositories list.
Actionable comments posted: 1
🧹 Nitpick comments (2)
.github/scripts/sonatype_publish.py (2)
186-188: Prefer `time.monotonic()` over `time.time()` for deadline tracking.

`time.time()` is wall-clock time and can jump forward or backward due to NTP adjustments or daylight-saving transitions, potentially causing the polling loop to expire early or run longer than `timeout_minutes`. `time.monotonic()` is the correct choice for measuring elapsed time.

♻️ Proposed refactor:

```diff
-    deadline = time.time() + timeout_minutes * 60
+    deadline = time.monotonic() + timeout_minutes * 60
-    while time.time() <= deadline:
+    while time.monotonic() <= deadline:
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/sonatype_publish.py around lines 186 - 188, Replace wall-clock timing with monotonic timing: change the deadline computation that currently uses time.time() (the assignment to deadline = time.time() + timeout_minutes * 60) and the polling loop condition while time.time() <= deadline to use time.monotonic() instead so elapsed-time measurement is immune to system clock changes; keep the same timeout_minutes arithmetic and the existing variable names (deadline, timeout_minutes) and loop structure.
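The monotonic deadline pattern can be sketched independently of the Sonatype specifics. The function name, state names, and `check` callback below are hypothetical; only the `time.monotonic()` deadline structure mirrors the suggestion above:

```python
import time


def poll_until_terminal(check, timeout_seconds: float, interval_seconds: float) -> str:
    """Poll check() until it reports a terminal state or the deadline passes.

    time.monotonic() is immune to wall-clock jumps (NTP, DST), so the loop
    runs for at most roughly timeout_seconds of real elapsed time.
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() <= deadline:
        state = check()
        if state in ("PUBLISHED", "FAILED"):
            return state
        time.sleep(interval_seconds)
    return "timed_out"
```

In the workflow scripts, `check` would wrap the deployment-status request and `timeout_seconds` would come from `timeout_minutes * 60`.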
37-39: Ruff S310: `urllib.request.urlopen` accepts `file://` and custom URL schemes.

All current call sites in `main()` construct URLs from hardcoded HTTPS constants (`OSSRH_BASE`, `PORTAL_BASE`), so there is no practical risk in the current code. However, as a public helper, `request_json` does not validate the scheme, meaning any future caller could inadvertently pass an untrusted URL. A lightweight scheme guard would address Ruff S310 and future-proof the helper.

♻️ Proposed hardening:

```diff
 def request_json(
     method: str,
     url: str,
     headers: dict[str, str],
 ) -> tuple[int | None, dict[str, Any]]:
+    if not url.startswith(("https://", "http://")):
+        return None, {"error": f"Rejected non-HTTP URL scheme: {url[:30]}"}
     request = urllib.request.Request(url=url, method=method, headers=headers)
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In @.github/scripts/sonatype_publish.py around lines 37 - 39, The helper request_json currently calls urllib.request.urlopen without validating URL schemes (triggers Ruff S310); update request_json to parse the passed url (use urllib.parse.urlparse) and enforce an allowlist of safe schemes (e.g., "http" and "https") before constructing the Request and calling urllib.request.urlopen, raising a ValueError (or similar) for disallowed schemes; keep the existing HTTPS constants (OSSRH_BASE, PORTAL_BASE) usage unchanged but add this lightweight guard so future callers cannot pass file:// or custom schemes.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In @.github/scripts/sonatype_publish.py:
- Around line 185-241: The poll loop discards the HTTP status by doing "_,
status_payload = request_json(...)" so persistent non-2xx responses become
final_state == "unknown" and the code later hides the real error; change that
call to capture the status (e.g., status, status_payload = request_json(...))
and when status is None or status >= 400 set reason to include the HTTP status
and any error message from status_payload (similar to the existing publish API
handling), then break the loop. Update the branch that assigns final_state via
extract_deployment_state(status_payload) to run only after a successful status
(2xx) response so failures set reason immediately; reference request_json,
extract_deployment_state, final_state, reason, result, publish_triggered,
publish_status, and publish_payload in your fix.
What's the purpose of this PR
This PR makes the Apollo Java release automation workflows easier to maintain by extracting inline Python blocks from GitHub Actions YAML into dedicated script files under `.github/scripts`. It keeps the existing behavior for both `snapshots` and `releases`, while making the release context extraction and Sonatype publish logic reusable and testable.

Which issue(s) this PR fixes:
Fixes # N/A (release task)
Brief changelog
- `.github/scripts/release_extract_upload_context.py` to parse `maven-deploy.log` and emit upload URL context.
- `.github/scripts/release_resolve_repository_context.py` to resolve `repository_key`/`portal_deployment_id` for `releases` only.
- `.github/scripts/release_write_summary.py` to publish the workflow summary from generated context JSON files.
- `.github/scripts/sonatype_publish.py` to encapsulate the Sonatype upload/poll/publish flow.
- `.github/workflows/release.yml` to call script files instead of inline Python.
- `.github/workflows/sonatype-publish.yml` switched to script-based execution.
- When `repository=snapshots`, repository context is marked `not_applicable` and does not require staging repository resolution.

Follow this checklist to help us incorporate your contribution quickly and easily:
- Run `mvn clean test` to make sure this pull request doesn't break anything.
- Update the `CHANGES` log.

Validation performed in this branch:
- `python3 -m py_compile .github/scripts/release_extract_upload_context.py .github/scripts/release_resolve_repository_context.py .github/scripts/release_write_summary.py .github/scripts/sonatype_publish.py`
- `python3 - <<'PY' ... yaml.safe_load(...) ... PY` for `.github/workflows/release.yml` and `.github/workflows/sonatype-publish.yml`

Summary by CodeRabbit
New Features
Chores