A compatibility issue with the BigBirdForMaskedLM model and the Hugging Face Transformers library?

#1
by Berylite - opened

I get the following warning when running the CodonTransformer model (`model = BigBirdForMaskedLM.from_pretrained("adibvafa/CodonTransformer").to(device)`):

BigBirdForMaskedLM has generative capabilities, as prepare_inputs_for_generation is explicitly defined. However, it doesn't directly inherit from GenerationMixin. From 👉v4.50👈 onwards, PreTrainedModel will NOT inherit from GenerationMixin, and this model will lose the ability to call generate and other related functions.

  • If you're using trust_remote_code=True, you can get rid of this warning by loading the model with an auto class. See https://huggingface.co/docs/transformers/en/model_doc/auto#auto-classes
  • If you are the owner of the model architecture code, please modify your model class such that it inherits from GenerationMixin (after PreTrainedModel, otherwise you'll get an exception).
  • If you are not the owner of the model architecture class, please contact the model code owner to update it.
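For reference, the fix the warning describes can be sketched as a small wrapper class on the user side (the class name `CodonTransformerForMaskedLM` below is hypothetical, chosen just for illustration; the proper fix belongs in the model architecture code itself):

```python
from transformers import BigBirdForMaskedLM, GenerationMixin

# Hypothetical wrapper (illustrative name): list GenerationMixin AFTER the
# PreTrainedModel-derived class, as the warning instructs, so that
# .generate() and related methods keep working from Transformers v4.50 on.
class CodonTransformerForMaskedLM(BigBirdForMaskedLM, GenerationMixin):
    pass

# Usage (downloads the checkpoint, so shown commented out):
# model = CodonTransformerForMaskedLM.from_pretrained(
#     "adibvafa/CodonTransformer"
# ).to(device)
```

Since CodonTransformer is a masked-LM checkpoint, this only matters if you actually call `generate`; ordinary masked-token inference is unaffected by the deprecation.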
Owner

Thank you for opening this! Indeed, when Transformers eventually drops this support, I'll fix it.

adibvafa changed discussion status to closed
