Update README.md

README.md (CHANGED)

@@ -10,6 +10,87 @@ base_model: OuteAI/Lite-Oute-1-300M-Instruct
This model was converted to OpenVINO from [`OuteAI/Lite-Oute-1-300M-Instruct`](https://huggingface.co/OuteAI/Lite-Oute-1-300M-Instruct) using [optimum-intel](https://github.com/huggingface/optimum-intel)
via the [export](https://huggingface.co/spaces/echarlaix/openvino-export) space.
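Once exported, a checkpoint like this is normally loaded through optimum-intel rather than plain transformers. The snippet below is a minimal sketch, not taken from the original card; the repository id is a placeholder for wherever this converted model is hosted (the installation step for optimum-intel is covered further down the card).

```python
# Minimal sketch: load an OpenVINO-exported causal LM with optimum-intel.
from transformers import AutoTokenizer
from optimum.intel import OVModelForCausalLM

# Placeholder id: substitute the actual repository name of this converted model.
model_id = "your-namespace/Lite-Oute-1-300M-Instruct-openvino"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("OpenVINO is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```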

# Lite-Oute-1-300M-Instruct

Lite-Oute-1-300M-Instruct is a Lite series model based on the Mistral architecture, comprising approximately 300 million parameters. <br>
This model aims to improve upon our previous 150M version by increasing the model size and training on a more refined dataset. The primary goal of this 300 million parameter model is to offer enhanced performance while still maintaining efficiency for deployment on a variety of devices. <br>
With its larger size, it should provide improved context retention and coherence; however, users should note that, as a compact model, it still has limitations compared to larger language models. <br>
The model was trained on 30 billion tokens with a context length of 4096.

## Available versions:
<a href="https://huggingface.co/OuteAI/Lite-Oute-1-300M-Instruct">Lite-Oute-1-300M-Instruct</a> <br>
<a href="https://huggingface.co/OuteAI/Lite-Oute-1-300M-Instruct-GGUF">Lite-Oute-1-300M-Instruct-GGUF</a> <br>
<a href="https://huggingface.co/OuteAI/Lite-Oute-1-300M">Lite-Oute-1-300M</a> <br>
<a href="https://huggingface.co/OuteAI/Lite-Oute-1-300M-GGUF">Lite-Oute-1-300M-GGUF</a> <br>

## Chat format
> [!IMPORTANT]
> This model uses the **ChatML** template. Ensure you use the correct template:

```
<|im_start|>system
[System message]<|im_end|>
<|im_start|>user
[Your question or message]<|im_end|>
<|im_start|>assistant
[The model's response]<|im_end|>
```
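As an illustration (not part of the original card), the prompt above can be rendered with the tokenizer's `apply_chat_template`, assuming the bundled tokenizer ships a ChatML chat template; otherwise the string can be assembled by hand in exactly the format shown.

```python
# Illustrative sketch: build a ChatML prompt for this model.
from transformers import AutoTokenizer

# Base repository; swap in the converted repository id if you use the OpenVINO export.
tokenizer = AutoTokenizer.from_pretrained("OuteAI/Lite-Oute-1-300M-Instruct")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize what OpenVINO is in one sentence."},
]

# If the tokenizer defines a ChatML chat_template, this renders the turns as shown above,
# ending with "<|im_start|>assistant\n" so the model continues as the assistant.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
```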

## Benchmarks:
<table style="text-align: left;">
    <tr>
        <th>Benchmark</th>
        <th>5-shot</th>
        <th>0-shot</th>
    </tr>
    <tr>
        <td>ARC Challenge</td>
        <td>26.37</td>
        <td>26.02</td>
    </tr>
    <tr>
        <td>ARC Easy</td>
        <td>51.43</td>
        <td>49.79</td>
    </tr>
    <tr>
        <td>CommonsenseQA</td>
        <td>20.72</td>
        <td>20.31</td>
    </tr>
    <tr>
        <td>HellaSWAG</td>
        <td>34.93</td>
        <td>34.50</td>
    </tr>
    <tr>
        <td>MMLU</td>
        <td>25.87</td>
        <td>24.00</td>
    </tr>
    <tr>
        <td>OpenBookQA</td>
        <td>31.40</td>
        <td>32.20</td>
    </tr>
    <tr>
        <td>PIQA</td>
        <td>65.07</td>
        <td>65.40</td>
    </tr>
    <tr>
        <td>Winogrande</td>
        <td>52.01</td>
        <td>53.75</td>
    </tr>
</table>
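The card does not state how these scores were obtained. Few-shot and zero-shot numbers like these are commonly computed with EleutherAI's lm-evaluation-harness; the sketch below is a plausible setup under that assumption, not the authors' actual evaluation code.

```python
# Hypothetical reproduction sketch using lm-evaluation-harness (pip install lm-eval).
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=OuteAI/Lite-Oute-1-300M-Instruct",
    tasks=["arc_challenge", "arc_easy", "hellaswag"],  # a subset of the table above
    num_fewshot=0,  # set to 5 for the 5-shot column
)
print(results["results"])
```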

First make sure you have optimum-intel installed:

```bash