model: update model card and specs
README.md CHANGED
@@ -1,5 +1,5 @@
 ---
-license:
+license: other
 pipeline_tag: text-generation
 language:
 - en
@@ -23,15 +23,16 @@ FeatherQwen is a 72B parameter language model created through a merge of Qwen2-7
 ## Technical Specifications
 
 ### Architecture
+- `Qwen2ForCasualLM`
 - Models: Qwen2-72B-Instruct (base), calme2.1-72b, magnum-72b-v1
-- Merged layers:
+- Merged layers: 80
 - Total tensors: 1,043
 
 ### Tensor Distribution
-- Attention layers:
-- MLP layers:
-- Layer norms:
-- Miscellaneous (embeddings, output):
+- Attention layers: 560 files
+- MLP layers: 240 files
+- Layer norms: 160 files
+- Miscellaneous (embeddings, output): 83 files
 
 ### Merging
 Custom script utilizing safetensors library.
@@ -47,8 +48,11 @@ model = AutoModelForCausalLM.from_pretrained("leafspark/FeatherQwen2-72B-v0.1",
 device_map="auto",
 torch_dtype=torch.float16)
 tokenizer = AutoTokenizer.from_pretrained("leafspark/FeatherQwen2-72B-v0.1")
-
 ```
+### GGUFs
+
+Find them here: [leafspark/FeatherQwen2-72B-v0.1-GGUF](https://huggingface.co/leafspark/FeatherQwen2-72B-v0.1-GGUF)
+
 
 ### Hardware Requirements
 - Minimum ~140GB of storage