
feat: removing vllm backend#781

Open
avinash2692 wants to merge 4 commits into main from feat/780-remove-vllm

Conversation

Member

@avinash2692 avinash2692 commented Apr 2, 2026

Remove vLLM backend to unblock transformers v5 upgrade

Type of PR

  • Bug Fix
  • New Feature
  • Documentation
  • Other

Description

vLLM pins an older version of transformers as a hard dependency, which
blocks us from upgrading to transformers v5. Since vLLM is the only thing
holding us back, the cleanest path forward is to remove it from the project
entirely rather than waiting for vLLM to catch up.

Changes in this PR:

  • Deleted LocalVLLMBackend and the mellea/backends/vllm module
  • Removed the [vllm] optional dependency and dropped it from [all] / [backends]
  • Replaced the subprocess-based vLLM integration test with a lightweight mock
    OpenAI-compatible server fixture
  • Removed all vLLM references from docs, README, and nav config
  • Regenerated uv.lock — the transformers version constraint is now freed

Out of scope: The transformers v5 upgrade itself is being done in #418.

Testing

  • Tests added to the respective file if code was changed
  • New code has 100% coverage if code was added
  • Ensure existing tests and GitHub automation pass (a maintainer will kick off the automation once the rest of the PR is populated)

@avinash2692 avinash2692 changed the title first pass at removing vllm feat: removing vllm backend Apr 2, 2026
@github-actions github-actions bot added the enhancement New feature or request label Apr 2, 2026
@github-actions
Contributor

github-actions bot commented Apr 2, 2026

The PR description has been updated. Please fill out the template for your PR to be reviewed.

@nrfulton nrfulton self-requested a review April 2, 2026 22:07
@avinash2692 avinash2692 marked this pull request as ready for review April 3, 2026 21:13
@avinash2692 avinash2692 requested a review from a team as a code owner April 3, 2026 21:13
Contributor

@markstur markstur left a comment


helpers/server_type.py is_vllm_server_with_structured_output() is still there (and its tests), but it looks like that is for openai-vllm which we keep. Intentional, right?

Contributor

@jakelorocco jakelorocco left a comment


seems good to me; a few small nits

can you please confirm that the openai_vllm tests still pass to make sure we didn't remove anything necessary?

Comment on lines -40 to -43
# There's also some risk that a group could unintentionally fully import the dependencies of
# another group (without explicitly listing it). That will lead to false positives here on
# the should_fail side. This happens with vllm (which imports all the parts of hf) so we just
# exclude the hf import statements from that test.
Contributor


Can we please leave this comment in the tests? I'm okay if we want to explicitly remove the vllm stuff at the end.

Contributor


Perhaps we should name this something backend agnostic so that future changes don't require a change in the script name?

@jakelorocco
Contributor

helpers/server_type.py is_vllm_server_with_structured_output() is still there (and its tests), but it looks like that is for openai-vllm which we keep. Intentional, right?

Yes; we want to remove our vLLM Backend but not the vLLM compatibility with the OpenAI Backend.
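Since vLLM-server compatibility is kept on the OpenAI backend, a probe like is_vllm_server_with_structured_output() still has a job. A minimal, hypothetical sketch of that kind of check, under the assumption (ours, not the repo's) that a vLLM server advertises itself via the `owned_by` field of its `GET /v1/models` listing:

```python
# Illustrative helper in the spirit of helpers/server_type.py's
# is_vllm_server_with_structured_output(); the real detection logic may differ.
def payload_indicates_vllm(models_payload: dict) -> bool:
    """Heuristic over a parsed OpenAI-style GET /v1/models response.

    Assumption (illustrative): a vLLM server marks each model entry's
    `owned_by` field with a "vllm"-flavored value.
    """
    return any(
        "vllm" in str(model.get("owned_by", "")).lower()
        for model in models_payload.get("data", [])
    )
```

The point of keeping such a probe is that the backend removal does not change how an already-running vLLM server looks over the OpenAI-compatible wire protocol.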


Development

Successfully merging this pull request may close these issues.

  • Remove vllm backend and dependency
  • Remove mellea.backends.vllm
