# Mistral-7B (fraQtl Compressed)

14.48 GB → 9.84 GB. Near-zero quality loss.
| Metric | Value |
|---|---|
| Source model | mistralai/Mistral-7B-Instruct-v0.2 |
| Original size | 14.48 GB |
| Compressed size | 9.84 GB |
| Perplexity delta (WikiText-2) | +0.35 |
| NIAH (needle-in-a-haystack) retrieval | 3/3 preserved |
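As a quick sanity check on the size figures in the table, the reduction and compression ratio follow directly from the two numbers:

```python
# Derive reduction and ratio from the sizes reported in the table above.
original_gb = 14.48
compressed_gb = 9.84

reduction = 1 - compressed_gb / original_gb   # fraction of bytes saved
ratio = original_gb / compressed_gb           # compression factor

print(f"size reduction: {reduction:.1%}")     # ~32.0%
print(f"compression ratio: {ratio:.2f}x")     # ~1.47x
```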
## What is this?

Mistral-7B-Instruct compressed with fraQtl. Same architecture, smaller files, near-zero quality loss.
## Try it live

fraQtl Demo: this model running with KV-cache compression on top.
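To illustrate why KV-cache compression matters for this model, here is a back-of-the-envelope estimate of the uncompressed fp16 cache size, using Mistral-7B's published architecture (32 layers, 8 KV heads via grouped-query attention, head dim 128). This is a rough sketch of the memory pressure involved, not a description of fraQtl's actual compression scheme:

```python
# Rough fp16 KV-cache size for Mistral-7B-Instruct-v0.2.
# Architecture constants from the model config: 32 layers,
# 8 KV heads (grouped-query attention), head dim 128.
n_layers, n_kv_heads, head_dim, bytes_per_elem = 32, 8, 128, 2

# Both K and V are cached per token, across all layers.
kv_bytes_per_token = 2 * n_layers * n_kv_heads * head_dim * bytes_per_elem
print(kv_bytes_per_token)       # 131072 bytes = 128 KiB per token

# At the model's 32k-token context window:
cache_gib = kv_bytes_per_token * 32_768 / 2**30
print(f"{cache_gib:.0f} GiB")   # 4 GiB of cache at full context
```

At full context the cache alone approaches the size of the compressed weights, which is the overhead the demo's KV-cache compression targets.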
## Access

Weights are gated. For access or integration support: contact@fraqtl.ai