
InternVL 3.5 fails to load with Transformers v5 due to meta tensor initialization (RuntimeError: Tensor.item() cannot be called on meta tensors) #1254

@yalabri20

Description


Checklist

  • 1. I have searched related issues but cannot get the expected help.
  • 2. The bug has not been fixed in the latest version.
  • 3. Please note that if the bug-related issue you submitted lacks corresponding environment info and a minimal reproducible demo, it will be challenging for us to reproduce and resolve the issue, reducing the likelihood of receiving feedback.

Describe the bug

What is happening?

InternVL 3.5 crashes with:

RuntimeError: Tensor.item() cannot be called on meta tensors

It happens when loading the model or when calling model.chat.

It is reproducible on both CPU and GPU.


Where it happens

- Inside modeling_intern_vit.py

- During vision backbone initialization

- Specifically when .item() is called on tensors created during init

Why it happens

- Transformers ≥5.x may initialize models on the meta device

- InternVL performs scalar extraction (.item()) before weights are materialized

- This creates a compatibility issue between the model's remote code and the new initialization path
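The root cause can be demonstrated without InternVL at all. A minimal sketch, assuming only PyTorch ≥ 2.0 (where the torch.device context manager routes factory functions to the given device):

```python
import torch

# Factory functions called under a meta-device context produce meta tensors,
# which carry shape/dtype metadata but no actual data.
with torch.device("meta"):
    rates = torch.linspace(0, 0.1, steps=12)

# Scalar extraction fails, because there is no data to extract --
# this is the same error InternVL's init hits.
try:
    dpr = [x.item() for x in rates]
except RuntimeError as e:
    print(type(e).__name__, "-", e)

# The identical computation on a real device works fine.
dpr = [x.item() for x in torch.linspace(0, 0.1, steps=12, device="cpu")]
print(len(dpr))
```

This is why the crash is device-independent: the failure happens during initialization, before any weights ever reach the CPU or GPU.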

One important data point

The same setup was still working for me one week ago.

Reproduction

1. Install torch and transformers ≥5.x

2. Load OpenGVLab/InternVL3_5-1B using AutoModel.from_pretrained

3. Call model.chat (or even just initialize the model)

4. Observe the crash

Environment

OS: Google Colab (Ubuntu)

Python: 3.12

torch: 2.5.1+cu124

transformers: 5.1.0

InternVL: InternVL3_5-1B

CUDA: True

Error traceback

File "modeling_intern_vit.py", line XXX
dpr = [x.item() for x in torch.linspace(...)]
RuntimeError: Tensor.item() cannot be called on meta tensors
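One possible local workaround is to replace the scalar extraction in modeling_intern_vit.py with a pure-Python computation that never touches tensors during init. This is only a sketch, not the upstream fix; the helper name drop_path_rates and its signature are my assumptions, modeled on the failing line in the traceback:

```python
# Hypothetical replacement for:
#   dpr = [x.item() for x in torch.linspace(0, drop_path_rate, num_layers)]
# Plain arithmetic is safe under meta-device initialization because no
# tensor data is ever read.
def drop_path_rates(drop_path_rate: float, num_layers: int) -> list[float]:
    """Linearly spaced stochastic-depth rates, mirroring torch.linspace."""
    if num_layers == 1:
        # torch.linspace(0, d, steps=1) returns its start value, 0.0
        return [0.0]
    step = drop_path_rate / (num_layers - 1)
    return [i * step for i in range(num_layers)]
```

Until the model's remote code is updated for Transformers 5.x, pinning transformers to the previous major version (which worked a week ago) is the simpler stopgap.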
