PrunaAI/codellama-CodeLlama-13b-Python-hf-AWQ-4bit-smashed
Tags: Text Generation, Transformers, Safetensors, llama, pruna-ai, text-generation-inference, 4-bit precision, awq
generation_config.json (revision 91a277e)
sharpenb: Upload folder using huggingface_hub (#1), commit 99530f1 (verified), about 1 year ago
{
  "_from_model_config": true,
  "bos_token_id": 1,
  "do_sample": true,
  "eos_token_id": 2,
  "transformers_version": "4.40.0"
}
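
The config above enables sampling and pins the Llama BOS/EOS token ids; generate() picks it up automatically from the repo. A minimal sketch of how it is typically consumed with transformers follows; the prompt, max_new_tokens value, and device_map setting are illustrative assumptions, and loading the AWQ weights is assumed to require an AWQ backend (e.g. autoawq) and a CUDA device.

from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

repo_id = "PrunaAI/codellama-CodeLlama-13b-Python-hf-AWQ-4bit-smashed"

# Tokenizer and 4-bit AWQ model; device_map="auto" is an illustrative choice
# and assumes the accelerate package is installed.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, device_map="auto")

# generation_config.json is applied by generate() automatically, but it can
# also be loaded explicitly to inspect or override its fields.
gen_config = GenerationConfig.from_pretrained(repo_id)
print(gen_config)  # do_sample=True, bos_token_id=1, eos_token_id=2

# Hypothetical prompt for this Python code model.
inputs = tokenizer("def fibonacci(n):", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, generation_config=gen_config, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))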