---
base_model:
- deepseek-ai/DeepSeek-R1
library_name: transformers
tags:
- reasoning
- R1
- 1M
- fast
- Deca
- Deca-AI
- Deca-2
- Qwen
license: apache-2.0
---

The Deca 2 family of models, now generally available, is built on cutting-edge architectures such as DeepSeek R1, LLaMA 3, and Qwen 2, delivering strong performance. With a focus on speed and efficiency, Deca 2 sets new standards for text generation. It also ships with a **1 million**-token context window.

As more capabilities are added, Deca 2 will evolve into a more powerful, any-to-any model. While it is focused on text generation for now, its foundation is designed to scale, with more advanced functionality to come.
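
Since the card lists `transformers` as the library, here is a minimal generation sketch. The repository id `deca-ai/deca-2` below is a placeholder, not a confirmed Hub path; substitute the actual Deca 2 checkpoint name.

```python
# Minimal text-generation sketch with the transformers library.
# "deca-ai/deca-2" is a placeholder repo id -- replace with the real checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deca-ai/deca-2"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Summarize the benefits of a 1M-token context window."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```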
**3/3 Release:**
* Updated weights with better experts
* Made Deca 2 Mini generally available

**2/14 Release:**
* Enhanced instruction following