Komodo-7B Sunda Lemes β€” Q4_K_M (GGUF)

Quantized GGUF (Q4_K_M) version of Yellow-AI-NLP/komodo-7b-base.

Details

  • Format: GGUF
  • Quantization: Q4_K_M
  • Languages: Indonesian & Sundanese (lemes, the polite register)

Usage with llama.cpp

llama-cli -m komodo7b.Q4_K_M.gguf -e -p "Judul: Rek ka mana?\nBalas:"
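The same model can also be driven from Python via the llama-cpp-python bindings. A minimal sketch, assuming the GGUF file sits in the working directory; `build_prompt` is a hypothetical helper that just mirrors the "Judul: ... / Balas:" prompt template from the llama-cli example above:

```python
def build_prompt(title: str) -> str:
    # Hypothetical helper: wraps a title in the "Judul: ... / Balas:"
    # template used in the llama-cli example above.
    return f"Judul: {title}\nBalas:"

# Loading and sampling with llama-cpp-python (pip install llama-cpp-python).
# Commented out here because it needs the multi-GB GGUF file on disk:
# from llama_cpp import Llama
# llm = Llama(model_path="komodo7b.Q4_K_M.gguf", n_ctx=2048)
# out = llm(build_prompt("Rek ka mana?"), max_tokens=64)
# print(out["choices"][0]["text"])

print(repr(build_prompt("Rek ka mana?")))
```

The helper keeps the literal `\n` handling in one place, which is the same issue the `-e` escape flag addresses on the llama-cli command line.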
  • Model size: 7B params
  • Architecture: llama

Model tree: SutanRifkyt/komodo7b-sunda-lemess-gguf, quantized from Yellow-AI-NLP/komodo-7b-base.