---
language:
- en
- bn
tags:
- translation
license: cc-by-4.0
datasets:
- quickmt/quickmt-train.bn-en
model-index:
- name: quickmt-bn-en
results:
- task:
name: Translation ben-eng
type: translation
args: ben-eng
dataset:
name: flores101-devtest
type: flores_101
args: ben_Beng eng_Latn devtest
metrics:
- name: BLEU
type: bleu
value: 32.91
- name: CHRF
type: chrf
value: 59.69
- name: COMET
type: comet
value: 86.99
---
# `quickmt-bn-en` Neural Machine Translation Model
`quickmt-bn-en` is a reasonably fast and reasonably accurate neural machine translation model for translation from `bn` into `en`.
## Model Information
* Trained using [`eole`](https://github.com/eole-nlp/eole)
* 185M parameter transformer 'big' with 8 encoder layers and 2 decoder layers
* 50k joint Sentencepiece vocabulary
* Exported for fast inference to [CTranslate2](https://github.com/OpenNMT/CTranslate2) format
* Training data: https://huggingface.co/datasets/quickmt/quickmt-train.bn-en/tree/main
See the `eole` model configuration in this repository for further details, and `eole-model` for the raw `eole` (PyTorch) model.
## Usage with `quickmt`
First, install the Nvidia CUDA toolkit if you want to do GPU inference.
Next, install the `quickmt` python library and download the model:
```bash
git clone https://github.com/quickmt/quickmt.git
pip install ./quickmt/
quickmt-model-download quickmt/quickmt-bn-en ./quickmt-bn-en
```
Finally use the model in python:
```python
from quickmt import Translator
# Auto-detects GPU, set to "cpu" to force CPU inference
t = Translator("./quickmt-bn-en/", device="auto")
# Translate - set beam size to 1 for faster speed (but lower quality)
sample_text = 'হেলিফ্যাক্সে ডালহৌসি বিশ্ববিদ্যালয়ের মেডিসিন বিভাগের প্রফেসর ডঃ ইহুড আর, নোভা স্কটিয়া এবং কানাডিয়ান ডায়াবেটিস এসোসিয়েশনের ক্লিনিক্যাল ও বৈজ্ঞানিক বিভাগের চেয়ার, আগে থেকেই সতর্ক করে দিয়েছিলেন যে গবেষণা এখনও তার শুরুর দিকে রয়েছে।'
t(sample_text, beam_size=5)
```
> 'Dr. Ehud R, professor of medicine at Dalhousie University in Helfax, chair of the clinical and scientific department of Nova Scotia and the Canadian Diabetes Association, warned in advance that the research was still in its early stages.'
```python
# Get alternative translations by sampling
# You can pass any CTranslate2 `translate_batch` arguments
t([sample_text], sampling_temperature=1.2, beam_size=1, sampling_topk=50, sampling_topp=0.9)
```
> 'Dr. Youud Ar, the professor of medicine at Dalhousie University, at Helphax and chair of the clinical and scientific departments at Nova Scotia and the Canadian Diabetes Association, pre-ordered warned the research is still early.'
The model is in `ctranslate2` format, and the tokenizers are `sentencepiece`, so you can use `ctranslate2` directly instead of through `quickmt`. It is also possible to use this model with e.g. [LibreTranslate](https://libretranslate.com/), which also uses `ctranslate2` and `sentencepiece`.
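A minimal sketch of direct `ctranslate2` usage is below. The tokenizer file name `spm.model` is an assumption; check the files that `quickmt-model-download` places in the model directory.

```python
import ctranslate2
import sentencepiece as spm

# Load the CTranslate2 model; "auto" selects a GPU if one is available
translator = ctranslate2.Translator("./quickmt-bn-en/", device="auto")
# Joint Sentencepiece model (file name is an assumption - see note above)
sp = spm.SentencePieceProcessor(model_file="./quickmt-bn-en/spm.model")

sample_text = 'হেলিফ্যাক্সে ডালহৌসি বিশ্ববিদ্যালয়ের মেডিসিন বিভাগের প্রফেসর ডঃ ইহুড আর, নোভা স্কটিয়া এবং কানাডিয়ান ডায়াবেটিস এসোসিয়েশনের ক্লিনিক্যাল ও বৈজ্ঞানিক বিভাগের চেয়ার, আগে থেকেই সতর্ক করে দিয়েছিলেন যে গবেষণা এখনও তার শুরুর দিকে রয়েছে।'

# Tokenize, translate, and detokenize
src_tokens = sp.encode(sample_text, out_type=str)
results = translator.translate_batch([src_tokens], beam_size=5)
print(sp.decode(results[0].hypotheses[0]))
```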
## Metrics
`bleu` and `chrf2` are calculated with [sacrebleu](https://github.com/mjpost/sacrebleu) on the [Flores200 `devtest` test set](https://huggingface.co/datasets/facebook/flores) ("ben_Beng"->"eng_Latn"). `comet22` is calculated with the [`comet`](https://github.com/Unbabel/COMET) library and the [default model](https://huggingface.co/Unbabel/wmt22-comet-da). "Time (s)" is the time in seconds to translate the flores-devtest dataset (1012 sentences) on an RTX 4070s GPU with batch size 32 (a larger batch size would be faster).
| | bleu | chrf2 | comet22 | Time (s) |
|:---------------------------------|-------:|--------:|----------:|-----------:|
| quickmt/quickmt-bn-en | 32.91 | 59.69 | 86.99 | 1.37 |
| Helsinki-NLP/opus-mt-bn-en | 17.89 | 45.94 | 78.62 | 3.45 |
| facebook/nllb-200-distilled-600M | 33.51 | 59.73 | 87.48 | 21.01 |
| facebook/nllb-200-distilled-1.3B | 36.4 | 62.18 | 88.61 | 36.62 |
| facebook/m2m100_418M | 23.84 | 52.67 | 82.94 | 20.86 |
| facebook/m2m100_1.2B | 27.26 | 54.86 | 84.28 | 36.28 | |
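These scores can be reproduced with a sketch like the following, assuming `srcs`, `hyps`, and `refs` are lists holding the flores devtest source sentences, model translations, and reference translations (loading them is not shown):

```python
import sacrebleu
from comet import download_model, load_from_checkpoint

# BLEU and chrF2 (chrF with beta=2 is the sacrebleu default)
bleu = sacrebleu.corpus_bleu(hyps, [refs])
chrf = sacrebleu.corpus_chrf(hyps, [refs])
print(f"BLEU: {bleu.score:.2f}  chrF2: {chrf.score:.2f}")

# COMET with the default wmt22-comet-da model; system_score is on a
# 0-1 scale (the table reports it multiplied by 100)
comet_model = load_from_checkpoint(download_model("Unbabel/wmt22-comet-da"))
data = [{"src": s, "mt": h, "ref": r} for s, h, r in zip(srcs, hyps, refs)]
print(f"COMET: {100 * comet_model.predict(data, batch_size=32).system_score:.2f}")
```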