chatglm-6b-MNN / llm_config.json
{
  "hidden_size": 4096,
  "layer_nums": 28,
  "attention_mask": "glm",
  "key_value_shape": [2, 0, 1, 32, 128],
  "prompt_template": "%s[gMASK]<sop>",
  "is_visual": false,
  "is_single": true
}
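A minimal sketch of how this config could be consumed. The field interpretations in the comments are assumptions based on common MNN-LLM conventions (they are not documented in this file itself): `key_value_shape` appears to encode the per-layer KV-cache layout, with 32 heads of dimension 128, and `prompt_template` is a printf-style format string wrapping the user query in ChatGLM's `[gMASK]<sop>` markers.

```python
import json

# llm_config.json from this repo, reproduced inline for a self-contained example.
CONFIG_JSON = """
{
  "hidden_size": 4096,
  "layer_nums": 28,
  "attention_mask": "glm",
  "key_value_shape": [2, 0, 1, 32, 128],
  "prompt_template": "%s[gMASK]<sop>",
  "is_visual": false,
  "is_single": true
}
"""

cfg = json.loads(CONFIG_JSON)

# Assumed KV-cache layout: [k/v, seq_len (0 = dynamic), batch, heads, head_dim].
num_heads = cfg["key_value_shape"][3]  # 32 (assumed position of head count)
head_dim = cfg["key_value_shape"][4]   # 128 (assumed position of head dim)

# Sanity check: heads * head_dim should equal the hidden size.
assert num_heads * head_dim == cfg["hidden_size"]

# The prompt template substitutes the user query for %s.
prompt = cfg["prompt_template"] % "Hello"
print(prompt)  # Hello[gMASK]<sop>
```

Note that `"layer_nums": 28` matches ChatGLM-6B's transformer depth, and `"is_visual": false` marks this as a text-only model; both comments are inferences from the model family, not from this file.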