---
library_name: transformers
pipeline_tag: text-generation
license: apache-2.0
---
This is an 800M-parameter model pre-trained with [QuEST](https://arxiv.org/abs/2502.05003) on 80B C4 tokens in 2:4-sparse INT4 format.
The code to verify that this model works in INT4 can be found [here](https://github.com/IST-DASLab/QuEST/blob/main/src/HadamardFourEightTesting.ipynb).
GitHub repository: [IST-DASLab/QuEST](https://github.com/IST-DASLab/QuEST)
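
Since the card lists `library_name: transformers` and `pipeline_tag: text-generation`, the checkpoint can presumably be loaded with the standard `transformers` API. The sketch below only shows generic loading and generation; running with true 2:4-sparse INT4 kernels is handled by the QuEST repository (see the verification notebook above). The repository id used here is a placeholder, not this model's confirmed Hub path.

```python
# Minimal loading sketch, assuming a standard transformers checkpoint.
# NOTE: the repo id below is a placeholder; substitute this model's actual Hub path.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "ISTA-DASLab/QuEST-800M-INT4"  # placeholder, not the confirmed repo name

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Simple greedy generation from a short prompt.
inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```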