
fix: lower PyTorch minimum to 2.8.0 for vLLM compatibility#53

Merged
abrichr merged 1 commit into main from fix/pytorch-version-compat
Mar 4, 2026

Conversation

@abrichr (Member) commented Mar 4, 2026

Summary

  • Lower PyTorch minimum version from >=2.9.1 to >=2.8.0 in pyproject.toml
  • Enables installing openadapt-ml alongside vLLM 0.11.0 (which pins torch==2.8.0)
  • GPU E2E validation on AWS g5.xlarge (A10G) confirmed the full stack works with PyTorch 2.8.0+cu128 (openadapt-evals PR #87)
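The change itself is a one-line edit to the dependency specifier in pyproject.toml. A minimal sketch of what the relaxed constraint looks like (the surrounding table layout is illustrative; only the torch specifier comes from this PR):

```toml
# pyproject.toml (sketch; only the torch constraint reflects this PR)
[project]
dependencies = [
    "torch>=2.8.0",  # was "torch>=2.9.1"; vLLM 0.11.0 pins torch==2.8.0
]
```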

Test plan

  • CI tests pass
  • uv sync resolves cleanly with the updated constraint
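The conflict this PR resolves can be illustrated with a minimal version-comparison sketch (this is not pip's or uv's actual resolver, just the `>=` semantics for simple X.Y.Z versions): vLLM 0.11.0 pins `torch==2.8.0`, which fails the old `>=2.9.1` floor but satisfies the new `>=2.8.0` one.

```python
# Illustrative sketch of why torch==2.8.0 (pinned by vLLM 0.11.0)
# conflicts with ">=2.9.1" but resolves under ">=2.8.0".

def parse(version: str) -> tuple[int, ...]:
    """Parse a simple X.Y.Z version string into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def satisfies(version: str, minimum: str) -> bool:
    """True if `version` meets a '>=minimum'-style constraint."""
    return parse(version) >= parse(minimum)

vllm_pin = "2.8.0"  # torch version pinned by vLLM 0.11.0

print(satisfies(vllm_pin, "2.9.1"))  # old floor -> False (unresolvable)
print(satisfies(vllm_pin, "2.8.0"))  # new floor -> True (resolvable)
```

Tuple comparison gives lexicographic ordering, which matches how a plain three-part version floor behaves; real specifiers (pre-releases, local versions) need the `packaging` library.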

🤖 Generated with Claude Code

vLLM 0.11.0 pins torch==2.8.0. The GPU E2E validation (openadapt-evals
PR #87) confirmed the full ML stack works with PyTorch 2.8.0+cu128.
The previous >=2.9.1 constraint prevented installing openadapt-ml
alongside vLLM in the same environment.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@abrichr abrichr merged commit c0bc069 into main Mar 4, 2026
4 checks passed
