qingy2024/GRMR-V3-Q4B-GGUF
GRMR-V3-Q4B-GGUF · 34.1 GB · 1 contributor · History: 4 commits
Latest commit: qingy2024 · Update README.md · 3e8a0b2 (verified) · 6 months ago
File                       Size       Commit message                        Last updated
.gitattributes             2.17 kB    Upload folder using huggingface_hub   6 months ago
GRMR-V3-Q4B-FP16.gguf      8.05 GB    Upload folder using huggingface_hub   6 months ago
GRMR-V3-Q4B-Q2_K.gguf      1.67 GB    Upload folder using huggingface_hub   6 months ago
GRMR-V3-Q4B-Q3_K_L.gguf    2.24 GB    Upload folder using huggingface_hub   6 months ago
GRMR-V3-Q4B-Q3_K_M.gguf    2.08 GB    Upload folder using huggingface_hub   6 months ago
GRMR-V3-Q4B-Q3_K_S.gguf    1.89 GB    Upload folder using huggingface_hub   6 months ago
GRMR-V3-Q4B-Q4_K_M.gguf    2.5 GB     Upload folder using huggingface_hub   6 months ago
GRMR-V3-Q4B-Q4_K_S.gguf    2.38 GB    Upload folder using huggingface_hub   6 months ago
GRMR-V3-Q4B-Q5_K_M.gguf    2.89 GB    Upload folder using huggingface_hub   6 months ago
GRMR-V3-Q4B-Q5_K_S.gguf    2.82 GB    Upload folder using huggingface_hub   6 months ago
GRMR-V3-Q4B-Q6_K.gguf      3.31 GB    Upload folder using huggingface_hub   6 months ago
GRMR-V3-Q4B-Q8_0.gguf      4.28 GB    Upload folder using huggingface_hub   6 months ago
README.md                  511 Bytes  Update README.md                      6 months ago