Error while trying to Deploy this model

#9 opened by naturalll111

How do I resolve this?

  File "/app/huggingface_inference_toolkit/utils.py", line 252, in get_pipeline
    hf_pipeline = pipeline(task=task, model=model_dir, device=device, **kwargs)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/pipelines/__init__.py", line 849, in pipeline
    config = AutoConfig.from_pretrained(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/models/auto/configuration_auto.py", line 1073, in from_pretrained
    raise ValueError(
ValueError: The checkpoint you are trying to load has model type `qwen3` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.

You can update Transformers with the command `pip install --upgrade transformers`. If this does not work, and the checkpoint is very new, then there may not be a release version that supports this model yet. In this case, you can get the most up-to-date code by installing Transformers from source with the command `pip install git+https://github.com/huggingface/transformers.git`

Application startup failed. Exiting.

Same problem. Have you resolved it? I have already upgraded to transformers>=4.40.
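
For reference, it is worth checking which transformers version the deployment runtime is actually using, since the inference image may pin an older release than the one you upgraded locally. A minimal check:

    # Print the transformers version the deployment runtime actually imports.
    # A locally upgraded package does not help if the container pins its own copy.
    import transformers

    print(transformers.__version__)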

I also met this problem. To resolve it, upgrade transformers to the latest version. (Officially: "The code for Qwen3 has been in the latest Hugging Face transformers and we advise you to use the latest version of transformers. With transformers<4.51.0, you will encounter the following error.")
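
A minimal sketch to verify the fix after upgrading, assuming transformers >= 4.51.0 is installed; the model id below is just an example Qwen3 checkpoint, so substitute the one you are actually deploying:

    # Sketch: confirm the runtime can resolve model type `qwen3` before deploying.
    # Assumes transformers >= 4.51.0; "Qwen/Qwen3-0.6B" is an example model id,
    # not necessarily the checkpoint from this repository.
    from packaging import version
    import transformers

    if version.parse(transformers.__version__) < version.parse("4.51.0"):
        raise RuntimeError(
            f"transformers {transformers.__version__} is too old for model type `qwen3`"
        )

    from transformers import pipeline

    # If this constructs without a ValueError, the architecture is recognized.
    pipe = pipeline("text-generation", model="Qwen/Qwen3-0.6B")
    print(pipe("Hello", max_new_tokens=16)[0]["generated_text"])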
