Transformers
GGUF
English
programming
code generation
code
coding
coder
chat
brainstorm
qwen
qwen3
qwencoder
brainstorm 20x
creative
all use cases
Jan-V1
horror
science fiction
fantasy
Star Trek
The Next Generation
TNG
Philip K. Dick
Deckard
finetune
thinking
reasoning
unsloth
Mixture of Experts
mixture of experts
Merge
conversational
auto-patch README.md
README.md (CHANGED)
@@ -59,7 +59,7 @@ static quants of https://huggingface.co/DavidAU/Qwen3-2x6B-TNG-Deckard-Alpha-III
 
 ***For a convenient overview and download list, visit our [model page for this model](https://hf.tst.eu/model#Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF).***
 
-weighted/imatrix quants
+weighted/imatrix quants are available at https://huggingface.co/mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-i1-GGUF
 ## Usage
 
 If you are unsure how to use GGUF files, refer to one of [TheBloke's
@@ -76,9 +76,11 @@ more details, including on how to concatenate multi-part files.
 | [GGUF](https://huggingface.co/mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF/resolve/main/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B.Q3_K_S.gguf) | Q3_K_S | 4.7 | |
 | [GGUF](https://huggingface.co/mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF/resolve/main/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B.Q3_K_M.gguf) | Q3_K_M | 5.2 | lower quality |
 | [GGUF](https://huggingface.co/mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF/resolve/main/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B.Q3_K_L.gguf) | Q3_K_L | 5.6 | |
+| [GGUF](https://huggingface.co/mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF/resolve/main/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B.IQ4_XS.gguf) | IQ4_XS | 5.8 | |
 | [GGUF](https://huggingface.co/mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF/resolve/main/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B.Q4_K_S.gguf) | Q4_K_S | 6.1 | fast, recommended |
 | [GGUF](https://huggingface.co/mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF/resolve/main/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B.Q4_K_M.gguf) | Q4_K_M | 6.4 | fast, recommended |
 | [GGUF](https://huggingface.co/mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF/resolve/main/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B.Q5_K_S.gguf) | Q5_K_S | 7.3 | |
+| [GGUF](https://huggingface.co/mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF/resolve/main/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B.Q5_K_M.gguf) | Q5_K_M | 7.5 | |
 | [GGUF](https://huggingface.co/mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF/resolve/main/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B.Q6_K.gguf) | Q6_K | 8.7 | very good quality |
 | [GGUF](https://huggingface.co/mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF/resolve/main/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B.Q8_0.gguf) | Q8_0 | 11.2 | fast, best quality |
 | [GGUF](https://huggingface.co/mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF/resolve/main/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B.f16.gguf) | f16 | 21.0 | 16 bpw, overkill |
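The quant filenames in the table above can be fetched directly from the repo. As a minimal sketch not taken from the original README, assuming `huggingface_hub` and `llama-cpp-python` are installed, one of the "fast, recommended" quants could be downloaded and loaded like this; the choice of Q4_K_S, the context size, and the prompt are illustrative assumptions only:

```python
# Minimal sketch: download one GGUF quant listed above and load it with
# llama-cpp-python. The repo id and filename come from the quant table;
# the Q4_K_S choice and all parameters below are assumptions, not part
# of the original README.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="mradermacher/Qwen3-2x6B-TNG-Deckard-Alpha-III-12B-GGUF",
    filename="Qwen3-2x6B-TNG-Deckard-Alpha-III-12B.Q4_K_S.gguf",
)

llm = Llama(model_path=model_path, n_ctx=4096)  # context length is an arbitrary example
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a short TNG-style captain's log entry."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```

Any other filename from the table can be substituted for the `filename` argument; larger quants such as Q8_0 trade download size and memory for quality.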