Text model not compatible with llama.cpp

#5
by didgimon96 - opened

I downloaded the illustrious_v130_fixed_clip_l_fp32-f16 and illustrious_v130_fixed_clip_g_fp32-f16 GGUF text encoders. I am using the illustrious_v130-q8_0 model with the illustrious_v130_vae_fp32-f16 VAE. When I run it, it tells me that the text encoders aren't compatible with llama.cpp.

didgimon96 changed discussion title from Text encoder not compatible with llama.cpp to Text model not compatible with llama.cpp
Owner

You should use the gguf node (PyPI | repo | pack) instead of the comfyui-gguf node.
