[ernie-image] use concrete Mistral3Model / Ministral3ForCausalLM types #13687
akshan-main wants to merge 1 commit into huggingface:main
Conversation
thanks @akshan-main. From what I understand, this is not the best fix, since with the fallback (as it was before) it will still not work because the models are not present in
ohh I just thought it was an issue with CI on our end, not something you need to fix |
force-pushed from 750ee11 to 51321f1
@yiyixuxu kept just the type change
can you try updating the minimum here https://github.com/huggingface/diffusers/blob/main/setup.py#L143 cc @sayakpaul @DN6
@yiyixuxu done
Hmm, I think:

```python
if is_transformers_version("<", "5.0.0"):
    raise
```

This way, I don't think we have to update the
force-pushed from 41cdfdb to aa7f34f
@sayakpaul @yiyixuxu applied it, lmk if I need to revert |
What does this PR do?
Follow-up to #13663 (per @asomoza's review there): replace the `AutoModel`/`AutoModelForCausalLM` references for `text_encoder` and `pe` with the concrete `Mistral3Model`/`Ministral3ForCausalLM`, so the loaded checkpoint matches the declared types.

To avoid breaking installs with older `transformers` (`Ministral3ForCausalLM` is only in 5.0+, `Mistral3Model` is in 4.50+):

- `TYPE_CHECKING` + string annotations on `__init__`, so the imports never run at module load.
- `_get_signature_types` uses `get_type_hints` and falls back gracefully when a name can't be resolved.
- `encoders.py`: a `try`/`except` import that aliases the concrete class when available and falls back to `AutoModel`/`AutoModelForCausalLM` otherwise. The `ComponentSpec` then references the alias.

So users on recent `transformers` get the warning silenced; users on older `transformers` keep working as before.
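The `encoders.py` fallback in the last bullet can be sketched with a small resolver. The `resolve_class` helper and the demo below are illustrative; the PR's actual code imports the classes directly from `transformers`:

```python
# Illustrative import-alias fallback: prefer a concrete class if the
# installed library exposes it, otherwise fall back to a generic one.
# resolve_class is a made-up helper; the class names in the comment
# below are the ones from the PR description.
import importlib


def resolve_class(module_name: str, preferred: str, fallback: str):
    """Return module.preferred if it exists, else module.fallback."""
    mod = importlib.import_module(module_name)
    return getattr(mod, preferred, getattr(mod, fallback))


# In encoders.py this would look roughly like (not executed here, since
# it needs transformers installed):
# Ministral3ForCausalLM = resolve_class(
#     "transformers", "Ministral3ForCausalLM", "AutoModelForCausalLM"
# )

# Self-contained demo with a stdlib module: the preferred name does not
# exist in collections, so we get the fallback class.
cls = resolve_class("collections", "DoesNotExist", "OrderedDict")
```

Because the resolution happens once at import time, the `ComponentSpec` can reference the alias without caring which `transformers` version is installed.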
Before submitting
Who can review?
@yiyixuxu @asomoza