Model Loading Error: "data did not match any variant of untagged enum ModelWrapper" when loading MedFound-Llama3-8B-finetuned
When trying to load the medicalai/MedFound-Llama3-8B-finetuned model, I hit a deserialization error:
Steps to reproduce
- Load the model with the following code:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "medicalai/MedFound-Llama3-8B-finetuned"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
    torch_dtype=torch.float16,
    trust_remote_code=True,
)
```

The error occurs at the model-loading stage.
I reinstalled my environment, but the error persists. It looks like a file problem, yet I downloaded the files directly, so nothing should be missing. What is going wrong?
```
Traceback (most recent call last):
  File "/data1/sunqian/data/mimic_cxr/split_files/Up_obs_val.py", line 9, in <module>
    tokenizer = AutoTokenizer.from_pretrained(model_path)
  File "/data1/anaconda3/envs/sunqian/lib/python3.9/site-packages/transformers/models/auto/tokenization_auto.py", line 814, in from_pretrained
    return tokenizer_class.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/data1/anaconda3/envs/sunqian/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2029, in from_pretrained
    return cls._from_pretrained(
  File "/data1/anaconda3/envs/sunqian/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2261, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/data1/anaconda3/envs/sunqian/lib/python3.9/site-packages/transformers/tokenization_utils_fast.py", line 111, in __init__
    fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
Exception: data did not match any variant of untagged enum ModelWrapper at line 1251008 column 3
```
These are all the files in my model directory:

```
config.json
generation_config.json
model-00001-of-00005.safetensors
model-00002-of-00005.safetensors
model-00003-of-00005.safetensors
model-00004-of-00005.safetensors
model-00005-of-00005.safetensors
model.safetensors.index.json
README.md
special_tokens_map.json
tokenizer_config.json
tokenizer.json
```
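For context, this particular exception comes from the Rust `tokenizers` library failing to parse tokenizer.json, and similar reports usually involve an installed `tokenizers`/`transformers` version that is too old for the format the file was saved in, rather than a missing file. A small diagnostic sketch (my own addition, not from the original report) to record the installed versions when reproducing:

```python
# Diagnostic sketch: print the versions of the packages involved, so the
# environment can be compared against the model's requirements. Similar
# "untagged enum ModelWrapper" reports point at an outdated `tokenizers`
# package that cannot parse a newer tokenizer.json format (assumption).
from importlib.metadata import version, PackageNotFoundError

def installed_version(pkg):
    """Return the installed version string for pkg, or None if absent."""
    try:
        return version(pkg)
    except PackageNotFoundError:
        return None

for pkg in ("transformers", "tokenizers", "torch"):
    print(pkg, installed_version(pkg) or "not installed")
```

If the versions are old, upgrading `transformers` (which pulls in a compatible `tokenizers`) would be the first thing to try.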