modelId (string, 5 to 139 chars) | author (string, 2 to 42 chars) | last_modified (timestamp[us, tz=UTC], 2020-02-15 11:33:14 to 2025-09-11 12:33:28) | downloads (int64, 0 to 223M) | likes (int64, 0 to 11.7k) | library_name (string, 555 classes) | tags (list, 1 to 4.05k entries) | pipeline_tag (string, 55 classes) | createdAt (timestamp[us, tz=UTC], 2022-03-02 23:29:04 to 2025-09-11 12:33:10) | card (string, 11 to 1.01M chars)
---|---|---|---|---|---|---|---|---|---
gsarti/it5-small-informal-to-formal | gsarti | 2022-03-09T07:47:36Z | 18 | 0 | transformers | ["transformers", "pytorch", "tf", "jax", "t5", "text2text-generation", "italian", "sequence-to-sequence", "style-transfer", "formality-style-transfer", "it", "dataset:yahoo/xformal_it", "arxiv:2203.03759", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us"] | text2text-generation | 2022-03-02T23:29:05Z |
---
language:
- it
license: apache-2.0
tags:
- italian
- sequence-to-sequence
- style-transfer
- formality-style-transfer
datasets:
- yahoo/xformal_it
widget:
- text: "maronn qualcuno mi spieg' CHECCOSA SUCCEDE?!?!"
- text: "wellaaaaaaa, ma fraté sei proprio troppo simpatiko, grazieeee!!"
- text: "nn capisco xke tt i ragazzi lo fanno"
- text: "IT5 è SUPERMEGA BRAVISSIMO a capire tt il vernacolo italiano!!!"
metrics:
- rouge
- bertscore
model-index:
- name: it5-small-informal-to-formal
results:
- task:
type: formality-style-transfer
name: "Informal-to-formal Style Transfer"
dataset:
type: xformal_it
name: "XFORMAL (Italian Subset)"
metrics:
- type: rouge1
value: 0.646
name: "Avg. Test Rouge1"
- type: rouge2
value: 0.451
name: "Avg. Test Rouge2"
- type: rougeL
value: 0.628
name: "Avg. Test RougeL"
- type: bertscore
value: 0.702
name: "Avg. Test BERTScore"
args:
- model_type: "dbmdz/bert-base-italian-xxl-uncased"
- lang: "it"
- num_layers: 10
- rescale_with_baseline: True
- baseline_path: "bertscore_baseline_ita.tsv"
co2_eq_emissions:
emissions: "8g"
source: "Google Cloud Platform Carbon Footprint"
training_type: "fine-tuning"
geographical_location: "Eemshaven, Netherlands, Europe"
hardware_used: "1 TPU v3-8 VM"
---
# IT5 Small for Informal-to-formal Style Transfer 🧐
This repository contains the checkpoint for the [IT5 Small](https://huggingface.co/gsarti/it5-small) model fine-tuned on Informal-to-formal style transfer on the Italian subset of the XFORMAL dataset as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org/abs/2203.03759) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
A comprehensive overview of other released materials is provided in the [gsarti/it5](https://github.com/gsarti/it5) repository. Refer to the paper for additional details concerning the reported scores and the evaluation approach.
## Using the model
Model checkpoints are available for use in TensorFlow, PyTorch, and JAX. They can be used directly with pipelines:
```python
from transformers import pipeline
i2f = pipeline("text2text-generation", model='it5/it5-small-informal-to-formal')
i2f("nn capisco xke tt i ragazzi lo fanno")
# >>> [{"generated_text": "non comprendo perché tutti i ragazzi agiscono così"}]
```
or loaded using autoclasses:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("it5/it5-small-informal-to-formal")
model = AutoModelForSeq2SeqLM.from_pretrained("it5/it5-small-informal-to-formal")
```
If you use this model in your research, please cite our work as:
```bibtex
@article{sarti-nissim-2022-it5,
title={{IT5}: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
author={Sarti, Gabriele and Nissim, Malvina},
journal={ArXiv preprint 2203.03759},
url={https://arxiv.org/abs/2203.03759},
year={2022},
month={mar}
}
```
|
gsarti/it5-base-formal-to-informal | gsarti | 2022-03-09T07:45:49Z | 4 | 0 | transformers | ["transformers", "pytorch", "tf", "jax", "tensorboard", "t5", "text2text-generation", "italian", "sequence-to-sequence", "style-transfer", "formality-style-transfer", "it", "dataset:yahoo/xformal_it", "arxiv:2203.03759", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us"] | text2text-generation | 2022-03-02T23:29:05Z |
---
language:
- it
license: apache-2.0
tags:
- italian
- sequence-to-sequence
- style-transfer
- formality-style-transfer
datasets:
- yahoo/xformal_it
widget:
- text: "Questa performance è a dir poco spiacevole."
- text: "In attesa di un Suo cortese riscontro, Le auguriamo un piacevole proseguimento di giornata."
- text: "Questa visione mi procura una goduria indescrivibile."
- text: "qualora ciò possa interessarti, ti pregherei di contattarmi."
metrics:
- rouge
- bertscore
model-index:
- name: it5-base-formal-to-informal
results:
- task:
type: formality-style-transfer
name: "Formal-to-informal Style Transfer"
dataset:
type: xformal_it
name: "XFORMAL (Italian Subset)"
metrics:
- type: rouge1
value: 0.652
name: "Avg. Test Rouge1"
- type: rouge2
value: 0.446
name: "Avg. Test Rouge2"
- type: rougeL
value: 0.632
name: "Avg. Test RougeL"
- type: bertscore
value: 0.665
name: "Avg. Test BERTScore"
args:
- model_type: "dbmdz/bert-base-italian-xxl-uncased"
- lang: "it"
- num_layers: 10
- rescale_with_baseline: True
- baseline_path: "bertscore_baseline_ita.tsv"
co2_eq_emissions:
emissions: "17g"
source: "Google Cloud Platform Carbon Footprint"
training_type: "fine-tuning"
geographical_location: "Eemshaven, Netherlands, Europe"
hardware_used: "1 TPU v3-8 VM"
---
# IT5 Base for Formal-to-informal Style Transfer 🤗
This repository contains the checkpoint for the [IT5 Base](https://huggingface.co/gsarti/it5-base) model fine-tuned on Formal-to-informal style transfer on the Italian subset of the XFORMAL dataset as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org/abs/2203.03759) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
A comprehensive overview of other released materials is provided in the [gsarti/it5](https://github.com/gsarti/it5) repository. Refer to the paper for additional details concerning the reported scores and the evaluation approach.
## Using the model
Model checkpoints are available for use in TensorFlow, PyTorch, and JAX. They can be used directly with pipelines:
```python
from transformers import pipeline
f2i = pipeline("text2text-generation", model='it5/it5-base-formal-to-informal')
f2i("Vi ringrazio infinitamente per vostra disponibilità")
# >>> [{"generated_text": "e grazie per la vostra disponibilità!"}]
```
or loaded using autoclasses:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("it5/it5-base-formal-to-informal")
model = AutoModelForSeq2SeqLM.from_pretrained("it5/it5-base-formal-to-informal")
```
If you use this model in your research, please cite our work as:
```bibtex
@article{sarti-nissim-2022-it5,
title={{IT5}: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
author={Sarti, Gabriele and Nissim, Malvina},
journal={ArXiv preprint 2203.03759},
url={https://arxiv.org/abs/2203.03759},
year={2022},
month={mar}
}
```
|
gsarti/mt5-small-formal-to-informal | gsarti | 2022-03-09T07:44:42Z | 4 | 0 | transformers | ["transformers", "pytorch", "tf", "jax", "tensorboard", "mt5", "text2text-generation", "italian", "sequence-to-sequence", "style-transfer", "formality-style-transfer", "it", "dataset:yahoo/xformal_it", "arxiv:2203.03759", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us"] | text2text-generation | 2022-03-02T23:29:05Z |
---
language:
- it
license: apache-2.0
tags:
- italian
- sequence-to-sequence
- style-transfer
- formality-style-transfer
datasets:
- yahoo/xformal_it
widget:
- text: "Questa performance è a dir poco spiacevole."
- text: "In attesa di un Suo cortese riscontro, Le auguriamo un piacevole proseguimento di giornata."
- text: "Questa visione mi procura una goduria indescrivibile."
- text: "qualora ciò possa interessarti, ti pregherei di contattarmi."
metrics:
- rouge
- bertscore
model-index:
- name: mt5-small-formal-to-informal
results:
- task:
type: formality-style-transfer
name: "Formal-to-informal Style Transfer"
dataset:
type: xformal_it
name: "XFORMAL (Italian Subset)"
metrics:
- type: rouge1
value: 0.651
name: "Avg. Test Rouge1"
- type: rouge2
value: 0.450
name: "Avg. Test Rouge2"
- type: rougeL
value: 0.631
name: "Avg. Test RougeL"
- type: bertscore
value: 0.666
name: "Avg. Test BERTScore"
args:
- model_type: "dbmdz/bert-base-italian-xxl-uncased"
- lang: "it"
- num_layers: 10
- rescale_with_baseline: True
- baseline_path: "bertscore_baseline_ita.tsv"
co2_eq_emissions:
emissions: "17g"
source: "Google Cloud Platform Carbon Footprint"
training_type: "fine-tuning"
geographical_location: "Eemshaven, Netherlands, Europe"
hardware_used: "1 TPU v3-8 VM"
---
# mT5 Small for Formal-to-informal Style Transfer 🤗
This repository contains the checkpoint for the [mT5 Small](https://huggingface.co/google/mt5-small) model fine-tuned on Formal-to-informal style transfer on the Italian subset of the XFORMAL dataset as part of the experiments of the paper [IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation](https://arxiv.org/abs/2203.03759) by [Gabriele Sarti](https://gsarti.com) and [Malvina Nissim](https://malvinanissim.github.io).
A comprehensive overview of other released materials is provided in the [gsarti/it5](https://github.com/gsarti/it5) repository. Refer to the paper for additional details concerning the reported scores and the evaluation approach.
## Using the model
Model checkpoints are available for use in TensorFlow, PyTorch, and JAX. They can be used directly with pipelines:
```python
from transformers import pipeline
f2i = pipeline("text2text-generation", model='it5/mt5-small-formal-to-informal')
f2i("Vi ringrazio infinitamente per vostra disponibilità")
# >>> [{"generated_text": "e grazie per la vostra disponibilità!"}]
```
or loaded using autoclasses:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("it5/mt5-small-formal-to-informal")
model = AutoModelForSeq2SeqLM.from_pretrained("it5/mt5-small-formal-to-informal")
```
If you use this model in your research, please cite our work as:
```bibtex
@article{sarti-nissim-2022-it5,
title={{IT5}: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation},
author={Sarti, Gabriele and Nissim, Malvina},
journal={ArXiv preprint 2203.03759},
url={https://arxiv.org/abs/2203.03759},
year={2022},
month={mar}
}
```
|
M-Quan/wav2vec2-demo | M-Quan | 2022-03-09T06:20:54Z | 4 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "wav2vec2", "automatic-speech-recognition", "generated_from_trainer", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2022-03-09T01:26:39Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: wav2vec2-demo
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-demo
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4239
- Wer: 0.3508
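The card provides no usage snippet. A minimal inference sketch with the `automatic-speech-recognition` pipeline might look like the following (the audio path is illustrative, and a 16 kHz mono input is assumed):
```python
from transformers import pipeline

# Load the fine-tuned checkpoint into an ASR pipeline.
asr = pipeline("automatic-speech-recognition", model="M-Quan/wav2vec2-demo")

# Transcribe a local 16 kHz mono WAV file (path is illustrative).
print(asr("sample.wav")["text"])
```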
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 20
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.4093 | 4.0 | 500 | 1.2405 | 0.8685 |
| 0.5597 | 8.0 | 1000 | 0.4538 | 0.4437 |
| 0.2113 | 12.0 | 1500 | 0.4106 | 0.3749 |
| 0.1188 | 16.0 | 2000 | 0.4609 | 0.3775 |
| 0.0776 | 20.0 | 2500 | 0.4239 | 0.3508 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.10.3
|
nickmuchi/vit-base-xray-pneumonia | nickmuchi | 2022-03-09T05:43:35Z | 153 | 4 | transformers | ["transformers", "pytorch", "tensorboard", "vit", "image-classification", "generated_from_trainer", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us"] | image-classification | 2022-03-09T02:04:50Z |
---
license: apache-2.0
tags:
- image-classification
- generated_from_trainer
datasets:
- chest xrays
widget:
- src: https://drive.google.com/uc?id=1yqnhD4Wjt4Y_NGLtijTGGaaw9GL497kQ
example_title: PNEUMONIA
- src: https://drive.google.com/uc?id=1xjcIEDb8kuSd4wF44gCEgsc0PfRvs53m
example_title: NORMAL
metrics:
- accuracy
model-index:
- name: vit-base-xray-pneumonia
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-xray-pneumonia
This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the [chest-xray-pneumonia](https://www.kaggle.com/paultimothymooney/chest-xray-pneumonia) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3387
- Accuracy: 0.9006
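No usage snippet is given. A minimal sketch with the image-classification pipeline could look like this (the image path is illustrative):
```python
from transformers import pipeline

# Load the fine-tuned ViT checkpoint into an image-classification pipeline.
classifier = pipeline("image-classification", model="nickmuchi/vit-base-xray-pneumonia")

# Classify a chest X-ray image (path is illustrative).
print(classifier("chest_xray.jpg"))  # e.g. [{'label': 'PNEUMONIA', 'score': ...}, ...]
```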
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.1233 | 0.31 | 100 | 1.1662 | 0.6651 |
| 0.0868 | 0.61 | 200 | 0.3387 | 0.9006 |
| 0.1387 | 0.92 | 300 | 0.5297 | 0.8237 |
| 0.1264 | 1.23 | 400 | 0.4566 | 0.8590 |
| 0.0829 | 1.53 | 500 | 0.6832 | 0.8285 |
| 0.0734 | 1.84 | 600 | 0.4886 | 0.8157 |
| 0.0132 | 2.15 | 700 | 1.3639 | 0.7292 |
| 0.0877 | 2.45 | 800 | 0.5258 | 0.8846 |
| 0.0516 | 2.76 | 900 | 0.8772 | 0.8013 |
| 0.0637 | 3.07 | 1000 | 0.4947 | 0.8558 |
| 0.0022 | 3.37 | 1100 | 1.0062 | 0.8045 |
| 0.0555 | 3.68 | 1200 | 0.7822 | 0.8285 |
| 0.0405 | 3.99 | 1300 | 1.9288 | 0.6779 |
| 0.0012 | 4.29 | 1400 | 1.2153 | 0.7981 |
| 0.0034 | 4.6 | 1500 | 1.8931 | 0.7308 |
| 0.0339 | 4.91 | 1600 | 0.9071 | 0.8590 |
| 0.0013 | 5.21 | 1700 | 1.6266 | 0.7580 |
| 0.0373 | 5.52 | 1800 | 1.5252 | 0.7676 |
| 0.001 | 5.83 | 1900 | 1.2748 | 0.7869 |
| 0.0005 | 6.13 | 2000 | 1.2103 | 0.8061 |
| 0.0004 | 6.44 | 2100 | 1.3133 | 0.7981 |
| 0.0004 | 6.75 | 2200 | 1.2200 | 0.8045 |
| 0.0004 | 7.06 | 2300 | 1.2834 | 0.7933 |
| 0.0004 | 7.36 | 2400 | 1.3080 | 0.7949 |
| 0.0003 | 7.67 | 2500 | 1.3814 | 0.7917 |
| 0.0004 | 7.98 | 2600 | 1.2853 | 0.7965 |
| 0.0003 | 8.28 | 2700 | 1.3644 | 0.7933 |
| 0.0003 | 8.59 | 2800 | 1.3137 | 0.8013 |
| 0.0003 | 8.9 | 2900 | 1.3507 | 0.7997 |
| 0.0003 | 9.2 | 3000 | 1.3751 | 0.7997 |
| 0.0003 | 9.51 | 3100 | 1.3884 | 0.7981 |
| 0.0003 | 9.82 | 3200 | 1.3831 | 0.7997 |
## Example Images
#### Pneumonia Chest X-Ray

#### Normal Chest X-Ray

### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
|
megagonlabs/cocosum-cont-self | megagonlabs | 2022-03-09T05:18:29Z | 0 | 0 | null | ["license:bsd-3-clause", "region:us"] | null | 2022-03-07T23:27:29Z |
---
license: bsd-3-clause
---
See the original GitHub repo for more details [here](https://github.com/megagonlabs/cocosum).
|
megagonlabs/cocosum-comm-self | megagonlabs | 2022-03-09T05:18:23Z | 0 | 0 | null | ["license:bsd-3-clause", "region:us"] | null | 2022-03-07T23:31:25Z |
---
license: bsd-3-clause
---
See the original GitHub repo for more details [here](https://github.com/megagonlabs/cocosum).
|
megagonlabs/cocosum-comm-few | megagonlabs | 2022-03-09T05:18:11Z | 0 | 0 | null | ["license:bsd-3-clause", "region:us"] | null | 2022-03-07T23:32:02Z |
---
license: bsd-3-clause
---
See the original GitHub repo for more details [here](https://github.com/megagonlabs/cocosum).
|
aaraki/distilbert-base-uncased-finetuned-cola | aaraki | 2022-03-09T02:08:47Z | 5 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "distilbert", "text-classification", "generated_from_trainer", "dataset:glue", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us"] | text-classification | 2022-03-09T01:56:17Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: distilbert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.40967417350821667
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-cola
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5026
- Matthews Correlation: 0.4097
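No usage example is included. A minimal sketch for CoLA-style acceptability scoring could look like this (label names depend on the exported config, so they are shown generically):
```python
from transformers import pipeline

# Load the fine-tuned checkpoint into a text-classification pipeline.
classifier = pipeline("text-classification", model="aaraki/distilbert-base-uncased-finetuned-cola")

# Score a sentence for linguistic acceptability (CoLA).
print(classifier("The book was read by the student."))  # e.g. [{'label': 'LABEL_1', 'score': ...}]
```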
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.5335 | 1.0 | 535 | 0.5026 | 0.4097 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
|
KoichiYasuoka/roberta-base-ukrainian | KoichiYasuoka | 2022-03-08T23:33:19Z | 22 | 2 | transformers | ["transformers", "pytorch", "roberta", "fill-mask", "ukrainian", "masked-lm", "ubertext", "uk", "license:cc-by-sa-4.0", "autotrain_compatible", "endpoints_compatible", "region:us"] | fill-mask | 2022-03-08T23:25:41Z |
---
language:
- "uk"
tags:
- "ukrainian"
- "masked-lm"
- "ubertext"
license: "cc-by-sa-4.0"
pipeline_tag: "fill-mask"
mask_token: "[MASK]"
---
# roberta-base-ukrainian
## Model Description
This is a RoBERTa model pre-trained on [Корпус UberText](https://lang.org.ua/uk/corpora/#anchor4). You can fine-tune `roberta-base-ukrainian` for downstream tasks, such as [POS-tagging](https://huggingface.co/KoichiYasuoka/roberta-base-ukrainian-upos), dependency-parsing, and so on.
## How to Use
```py
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained("KoichiYasuoka/roberta-base-ukrainian")
model = AutoModelForMaskedLM.from_pretrained("KoichiYasuoka/roberta-base-ukrainian")
```
|
sanchit-gandhi/wav2vec2-2-rnd | sanchit-gandhi | 2022-03-08T22:30:32Z | 3 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "speech-encoder-decoder", "automatic-speech-recognition", "generated_from_trainer", "dataset:librispeech_asr", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2022-03-06T15:38:27Z |
---
tags:
- generated_from_trainer
datasets:
- librispeech_asr
model-index:
- name: ''
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-2-rnd
This model was trained from scratch on the librispeech_asr dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9599
- Wer: 0.1442
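The card has no usage section. A minimal sketch, assuming a transformers version whose ASR pipeline supports speech-encoder-decoder checkpoints and an illustrative 16 kHz mono audio file, might be:
```python
from transformers import pipeline

# Load the speech-encoder-decoder checkpoint into an ASR pipeline.
asr = pipeline("automatic-speech-recognition", model="sanchit-gandhi/wav2vec2-2-rnd")

# Transcribe a 16 kHz mono audio file (path is illustrative).
print(asr("sample.wav")["text"])
```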
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 20.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 6.1431 | 1.68 | 1500 | 6.0870 | 1.4277 |
| 5.498 | 3.36 | 3000 | 5.5505 | 1.6318 |
| 3.575 | 5.04 | 4500 | 3.7856 | 0.6683 |
| 1.7532 | 6.73 | 6000 | 2.4603 | 0.3576 |
| 1.6379 | 8.41 | 7500 | 1.8847 | 0.2932 |
| 1.3145 | 10.09 | 9000 | 1.5027 | 0.2222 |
| 0.8389 | 11.77 | 10500 | 1.2637 | 0.1855 |
| 0.9239 | 13.45 | 12000 | 1.1424 | 0.1683 |
| 0.6666 | 15.13 | 13500 | 1.0562 | 0.1593 |
| 0.5258 | 16.82 | 15000 | 0.9911 | 0.1489 |
| 0.4733 | 18.5 | 16500 | 0.9599 | 0.1442 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu113
- Datasets 1.18.3
- Tokenizers 0.11.0
|
huggingartists/lady-gaga | huggingartists | 2022-03-08T20:28:19Z | 5 | 0 | transformers | ["transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingartists", "lyrics", "lm-head", "causal-lm", "en", "dataset:huggingartists/lady-gaga", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us"] | text-generation | 2022-03-02T23:29:05Z |
---
language: en
datasets:
- huggingartists/lady-gaga
tags:
- huggingartists
- lyrics
- lm-head
- causal-lm
widget:
- text: "I am"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/e7e76c378cb43b4b1ff03947d5c0481a.400x400x1.jpg')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lady Gaga</div>
<a href="https://genius.com/artists/lady-gaga">
<div style="text-align: center; font-size: 14px;">@lady-gaga</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lady Gaga.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/lady-gaga) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lady-gaga")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/17c0d4ej/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lady Gaga's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2j7yp9qd) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2j7yp9qd/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lady-gaga')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lady-gaga")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lady-gaga")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
OrfeasTsk/bert-base-uncased-finetuned-squadv2 | OrfeasTsk | 2022-03-08T18:35:59Z | 7 | 0 | transformers | ["transformers", "pytorch", "bert", "question-answering", "endpoints_compatible", "region:us"] | question-answering | 2022-03-07T14:48:36Z |
The fine-tuning hyperparameters reported on the card:
```python
{'max_seq_length': 384,
 'batch_size': 8,
 'learning_rate': {'val': 5e-5, 'scheduler': 'Linear'},
 'max_clip_norm': None,
 'epochs': 2}
```
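The card lists only these hyperparameters. A minimal extractive-QA sketch (question and context are illustrative) might look like:
```python
from transformers import pipeline

# Load the fine-tuned checkpoint into a question-answering pipeline.
qa = pipeline("question-answering", model="OrfeasTsk/bert-base-uncased-finetuned-squadv2")

# Extractive QA over a short context (both strings are illustrative).
result = qa(question="Where is the Eiffel Tower?", context="The Eiffel Tower is located in Paris.")
print(result["answer"], result["score"])
```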
|
OrfeasTsk/bert-base-uncased-finetuned-triviaqa-large-batch | OrfeasTsk | 2022-03-08T18:35:17Z | 6 | 0 | transformers | ["transformers", "pytorch", "bert", "question-answering", "endpoints_compatible", "region:us"] | question-answering | 2022-03-07T20:17:41Z |
The fine-tuning hyperparameters reported on the card:
```python
{'max_seq_length': 384,
 'batch_size': 24,
 'learning_rate': {'val': 3e-5, 'scheduler': 'Linear'},
 'max_clip_norm': None,
 'epochs': 2}
```
|
OrfeasTsk/bert-base-uncased-finetuned-squadv2-large-batch | OrfeasTsk | 2022-03-08T18:34:54Z | 5 | 0 | transformers | ["transformers", "pytorch", "bert", "question-answering", "endpoints_compatible", "region:us"] | question-answering | 2022-03-08T18:23:33Z |
The fine-tuning hyperparameters reported on the card:
```python
{'max_seq_length': 384,
 'batch_size': 24,
 'learning_rate': {'val': 3e-5, 'scheduler': 'Linear'},
 'max_clip_norm': None,
 'epochs': 2}
```
|
z-uo/bert-qasper | z-uo | 2022-03-08T18:31:21Z | 68 | 0 | transformers | ["transformers", "pytorch", "bert", "question-answering", "en", "dataset:z-uo/qasper-squad", "endpoints_compatible", "region:us"] | question-answering | 2022-03-08T09:28:02Z |
---
language: en
datasets:
- z-uo/qasper-squad
---
# bert-base for QA with qasper
Trained from bert-base-uncased.
How to use from Python:
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
# Load model with pipeline
model_name = "z-uo/bert-qasper"
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
# Get predictions
QA_input = {
'question': 'what they propose?',
'context': "In this paper, we provide an innovative contribution in the research domain dedicated to crop mapping by exploiting the of Sentinel-2 satellite images time series, with the specific aim to extract information on 'where and when' crops are grown. The final goal is to set up a workflow able to reliably identify (classify) the different crops that are grown in a given area by exploiting an end-to-end (3+2)D convolutional neural network (CNN) for semantic segmentation. The method also has the ambition to provide information, at pixel level, regarding the period in which a given crop is cultivated during the season. To this end, we propose a solution called Class Activation Interval (CAI) which allows us to interpret, for each pixel, the reasoning made by CNN in the classification determining in which time interval, of the input time series, the class is likely to be present or not. Our experiments, using a public domain dataset, show that the approach is able to accurately detect crop classes with an overall accuracy of about 93% and that the network can detect discriminatory time intervals in which crop is cultivated. These results have twofold importance: (i) demonstrate the ability of the network to correctly interpret the investigated physical process (i.e., bare soil condition, plant growth, senescence and harvesting according to specific cultivated variety) and (ii) provide further information to the end-user (e.g., the presence of crops and its temporal dynamics)."
}
res = nlp(QA_input)
# Load model & tokenizer without pipeline
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
|
Ameer05/bart-large-cnn-samsum-rescom-finetuned-resume-summarizer-10-epoch-tweak-lr-8-100-1 | Ameer05 | 2022-03-08T16:43:01Z | 9 | 1 | transformers | ["transformers", "pytorch", "tensorboard", "bart", "text2text-generation", "summarization", "generated_from_trainer", "autotrain_compatible", "endpoints_compatible", "region:us"] | summarization | 2022-03-08T08:57:44Z |
---
tags:
- summarization
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-large-cnn-samsum-rescom-finetuned-resume-summarizer-10-epoch-tweak-lr-8-100-1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bart-large-cnn-samsum-rescom-finetuned-resume-summarizer-10-epoch-tweak-lr-8-100-1
This model is a fine-tuned version of [Ameer05/model-token-repo](https://huggingface.co/Ameer05/model-token-repo) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.6315
- Rouge1: 61.441
- Rouge2: 52.9403
- Rougel: 58.3426
- Rougelsum: 60.8249
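The card does not show how to run the model. A minimal summarization sketch (the input text and generation lengths are illustrative) could be:
```python
from transformers import pipeline

# Load the fine-tuned BART checkpoint into a summarization pipeline.
summarizer = pipeline("summarization", model="Ameer05/bart-large-cnn-samsum-rescom-finetuned-resume-summarizer-10-epoch-tweak-lr-8-100-1")

# Summarize a resume-style text (input is illustrative).
text = "Experienced software engineer with ten years of backend development, leading teams and shipping distributed systems."
print(summarizer(text, max_length=60, min_length=10)[0]["summary_text"])
```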
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| No log | 0.91 | 5 | 2.0139 | 53.4301 | 46.6698 | 50.644 | 53.3985 |
| No log | 1.91 | 10 | 1.6309 | 61.4629 | 53.8884 | 59.0867 | 60.8823 |
| No log | 2.91 | 15 | 1.5379 | 61.2938 | 53.7208 | 59.0644 | 60.7381 |
| No log | 3.91 | 20 | 1.4470 | 63.2667 | 55.9273 | 60.5112 | 62.7538 |
| 1.5454 | 4.91 | 25 | 1.4353 | 62.7166 | 54.8328 | 60.0101 | 62.1378 |
| 1.5454 | 5.91 | 30 | 1.4411 | 59.7469 | 51.9068 | 57.036 | 58.9474 |
| 1.5454 | 6.91 | 35 | 1.5195 | 64.152 | 57.1447 | 61.362 | 63.5951 |
| 1.5454 | 7.91 | 40 | 1.6174 | 60.1464 | 51.5654 | 57.1676 | 59.4405 |
| 0.5429 | 8.91 | 45 | 1.7451 | 61.9696 | 53.6421 | 58.5884 | 61.3286 |
| 0.5429 | 9.91 | 50 | 1.9081 | 60.3296 | 52.3052 | 57.6518 | 59.7854 |
| 0.5429 | 10.91 | 55 | 1.9721 | 61.5597 | 51.9027 | 57.1184 | 60.6717 |
| 0.5429 | 11.91 | 60 | 2.0471 | 61.2222 | 53.9475 | 58.725 | 60.6668 |
| 0.5429 | 12.91 | 65 | 2.1422 | 60.1915 | 52.0627 | 56.9955 | 59.438 |
| 0.1506 | 13.91 | 70 | 2.1542 | 61.6915 | 53.045 | 58.1727 | 60.8765 |
| 0.1506 | 14.91 | 75 | 2.1885 | 59.8069 | 51.6543 | 56.8112 | 59.2055 |
| 0.1506 | 15.91 | 80 | 2.3146 | 61.695 | 53.2666 | 57.9003 | 61.1108 |
| 0.1506 | 16.91 | 85 | 2.3147 | 60.4482 | 52.1694 | 57.0649 | 59.7882 |
| 0.0452 | 17.91 | 90 | 2.1731 | 60.0259 | 51.5046 | 56.7399 | 59.2955 |
| 0.0452 | 18.91 | 95 | 2.2690 | 60.0534 | 52.4819 | 57.1631 | 59.5056 |
| 0.0452 | 19.91 | 100 | 2.2990 | 58.0737 | 48.8098 | 54.5684 | 57.3187 |
| 0.0452 | 20.91 | 105 | 2.2704 | 61.8982 | 53.9077 | 58.6909 | 61.4252 |
| 0.0267 | 21.91 | 110 | 2.3012 | 62.0174 | 53.5427 | 58.5278 | 61.1921 |
| 0.0267 | 22.91 | 115 | 2.3569 | 61.6327 | 53.7387 | 58.8908 | 61.1623 |
| 0.0267 | 23.91 | 120 | 2.3579 | 60.228 | 52.3747 | 58.1448 | 59.7322 |
| 0.0267 | 24.91 | 125 | 2.3389 | 60.4902 | 51.7935 | 57.0689 | 59.7132 |
| 0.0267 | 25.91 | 130 | 2.3168 | 58.8469 | 50.3181 | 55.7386 | 58.3598 |
| 0.0211 | 26.91 | 135 | 2.4147 | 59.4225 | 50.8405 | 56.503 | 58.7221 |
| 0.0211 | 27.91 | 140 | 2.3631 | 59.7489 | 51.2137 | 57.3204 | 59.3348 |
| 0.0211 | 28.91 | 145 | 2.3850 | 60.1718 | 51.4176 | 57.2152 | 59.5157 |
| 0.0211 | 29.91 | 150 | 2.4610 | 60.1433 | 51.433 | 56.6256 | 59.3265 |
| 0.0175 | 30.91 | 155 | 2.4400 | 58.8345 | 49.7031 | 55.3079 | 57.9236 |
| 0.0175 | 31.91 | 160 | 2.4506 | 59.209 | 50.1626 | 55.6451 | 58.5791 |
| 0.0175 | 32.91 | 165 | 2.4316 | 59.7713 | 50.8999 | 56.4235 | 58.9845 |
| 0.0175 | 33.91 | 170 | 2.2781 | 60.1822 | 51.9435 | 57.4586 | 59.6766 |
| 0.0175 | 34.91 | 175 | 2.3849 | 58.2328 | 49.2106 | 55.1516 | 57.5072 |
| 0.0141 | 35.91 | 180 | 2.4872 | 58.4916 | 50.3345 | 55.5991 | 58.1131 |
| 0.0141 | 36.91 | 185 | 2.4883 | 59.0957 | 49.76 | 55.3567 | 58.076 |
| 0.0141 | 37.91 | 190 | 2.4327 | 58.091 | 48.8628 | 54.8678 | 57.5406 |
| 0.0141 | 38.91 | 195 | 2.4998 | 57.7428 | 48.7366 | 54.2166 | 56.7643 |
| 0.0089 | 39.91 | 200 | 2.4107 | 60.1662 | 51.9832 | 57.1372 | 59.6989 |
| 0.0089 | 40.91 | 205 | 2.4700 | 58.2159 | 49.3934 | 54.9265 | 57.4126 |
| 0.0089 | 41.91 | 210 | 2.4833 | 58.7434 | 49.6619 | 55.5239 | 57.9562 |
| 0.0089 | 42.91 | 215 | 2.4703 | 60.2984 | 51.3168 | 56.9082 | 59.3958 |
| 0.0062 | 43.91 | 220 | 2.5306 | 60.5455 | 52.1189 | 57.3213 | 60.0232 |
| 0.0062 | 44.91 | 225 | 2.5181 | 60.2149 | 51.2187 | 56.1935 | 59.3471 |
| 0.0062 | 45.91 | 230 | 2.4871 | 59.8013 | 51.6114 | 56.0911 | 59.0902 |
| 0.0062 | 46.91 | 235 | 2.4811 | 58.0271 | 48.9441 | 54.3108 | 57.3647 |
| 0.0062 | 47.91 | 240 | 2.5290 | 62.5087 | 54.6149 | 59.638 | 62.0455 |
| 0.0072 | 48.91 | 245 | 2.5194 | 58.7193 | 49.9679 | 55.6517 | 58.1569 |
| 0.0072 | 49.91 | 250 | 2.5708 | 58.4626 | 49.5257 | 54.5032 | 58.1413 |
| 0.0072 | 50.91 | 255 | 2.6449 | 58.446 | 49.4625 | 55.1092 | 58.03 |
| 0.0072 | 51.91 | 260 | 2.5592 | 58.859 | 49.4398 | 55.1503 | 57.9663 |
| 0.0056 | 52.91 | 265 | 2.5086 | 59.7322 | 51.3051 | 56.5401 | 59.2726 |
| 0.0056 | 53.91 | 270 | 2.4846 | 57.8603 | 48.2408 | 54.3847 | 57.115 |
| 0.0056 | 54.91 | 275 | 2.5509 | 58.9506 | 50.045 | 55.6658 | 58.3618 |
| 0.0056 | 55.91 | 280 | 2.5032 | 60.2524 | 51.8167 | 56.98 | 59.7506 |
| 0.0056 | 56.91 | 285 | 2.5012 | 60.0596 | 51.4924 | 56.7181 | 59.5037 |
| 0.0054 | 57.91 | 290 | 2.5176 | 61.0622 | 52.6235 | 57.9317 | 60.5036 |
| 0.0054 | 58.91 | 295 | 2.5024 | 62.9246 | 54.8544 | 59.9824 | 62.5584 |
| 0.0054 | 59.91 | 300 | 2.5687 | 62.2602 | 53.9673 | 58.9862 | 61.5837 |
| 0.0054 | 60.91 | 305 | 2.5890 | 62.5706 | 54.227 | 59.2032 | 62.125 |
| 0.0036 | 61.91 | 310 | 2.5454 | 62.1565 | 53.2585 | 58.7169 | 61.3943 |
| 0.0036 | 62.91 | 315 | 2.5629 | 62.8292 | 54.6781 | 59.9889 | 62.254 |
| 0.0036 | 63.91 | 320 | 2.5581 | 58.8394 | 50.4421 | 56.0742 | 58.1945 |
| 0.0036 | 64.91 | 325 | 2.5532 | 59.5814 | 51.1335 | 56.5841 | 59.196 |
| 0.0031 | 65.91 | 330 | 2.5826 | 59.0485 | 50.3992 | 55.5283 | 58.3757 |
| 0.0031 | 66.91 | 335 | 2.5815 | 61.4832 | 52.7977 | 57.7351 | 60.9888 |
| 0.0031 | 67.91 | 340 | 2.5865 | 61.7836 | 53.6797 | 58.6743 | 61.3765 |
| 0.0031 | 68.91 | 345 | 2.6007 | 61.2253 | 52.8781 | 57.7006 | 60.717 |
| 0.0031 | 69.91 | 350 | 2.6210 | 60.717 | 52.4933 | 57.5089 | 60.4196 |
| 0.0035 | 70.91 | 355 | 2.6169 | 61.3491 | 53.3932 | 58.2288 | 60.8793 |
| 0.0035 | 71.91 | 360 | 2.6025 | 62.0101 | 54.0289 | 59.0822 | 61.7202 |
| 0.0035 | 72.91 | 365 | 2.5705 | 61.2227 | 52.9937 | 58.2493 | 60.6631 |
| 0.0035 | 73.91 | 370 | 2.5623 | 59.1718 | 50.7827 | 56.1851 | 58.7118 |
| 0.002 | 74.91 | 375 | 2.5536 | 58.4201 | 49.6923 | 55.0398 | 57.7707 |
| 0.002 | 75.91 | 380 | 2.5478 | 60.2307 | 51.7503 | 57.3173 | 59.692 |
| 0.002 | 76.91 | 385 | 2.6039 | 58.7637 | 49.741 | 55.5341 | 58.0784 |
| 0.002 | 77.91 | 390 | 2.6371 | 59.3929 | 50.6444 | 55.9887 | 58.813 |
| 0.002 | 78.91 | 395 | 2.6238 | 59.0572 | 50.605 | 55.6631 | 58.4366 |
| 0.0019 | 79.91 | 400 | 2.5783 | 57.9852 | 49.2588 | 54.822 | 57.4643 |
| 0.0019 | 80.91 | 405 | 2.5982 | 58.0218 | 49.1651 | 54.9876 | 57.4066 |
| 0.0019 | 81.91 | 410 | 2.6141 | 60.3133 | 51.5723 | 56.9476 | 59.715 |
| 0.0019 | 82.91 | 415 | 2.5904 | 60.8199 | 51.8956 | 58.406 | 60.323 |
| 0.0017 | 83.91 | 420 | 2.5718 | 60.3449 | 51.1433 | 57.6984 | 59.7513 |
| 0.0017 | 84.91 | 425 | 2.5737 | 60.151 | 51.1986 | 57.3376 | 59.378 |
| 0.0017 | 85.91 | 430 | 2.5807 | 60.9273 | 52.2469 | 58.2038 | 60.1642 |
| 0.0017 | 86.91 | 435 | 2.5900 | 60.1846 | 51.6144 | 57.5407 | 59.5109 |
| 0.0011 | 87.91 | 440 | 2.6066 | 62.0776 | 53.6022 | 59.157 | 61.6201 |
| 0.0011 | 88.91 | 445 | 2.6231 | 61.8822 | 53.5232 | 58.965 | 61.401 |
| 0.0011 | 89.91 | 450 | 2.6273 | 60.3358 | 51.9941 | 57.3823 | 59.7729 |
| 0.0011 | 90.91 | 455 | 2.6194 | 60.0196 | 51.6134 | 57.1357 | 59.4594 |
| 0.0011 | 91.91 | 460 | 2.6118 | 60.6898 | 52.1328 | 57.3076 | 60.0351 |
| 0.0015 | 92.91 | 465 | 2.6032 | 61.2119 | 52.5034 | 57.8098 | 60.6634 |
| 0.0015 | 93.91 | 470 | 2.6040 | 61.4812 | 52.8197 | 57.9668 | 60.8767 |
| 0.0015 | 94.91 | 475 | 2.6158 | 61.4046 | 52.8905 | 57.8958 | 60.804 |
| 0.0015 | 95.91 | 480 | 2.6280 | 62.1764 | 53.8521 | 58.8608 | 61.6138 |
| 0.0012 | 96.91 | 485 | 2.6304 | 62.2028 | 53.8967 | 58.8976 | 61.6409 |
| 0.0012 | 97.91 | 490 | 2.6328 | 61.7371 | 53.3908 | 58.4107 | 61.1382 |
| 0.0012 | 98.91 | 495 | 2.6331 | 61.441 | 52.9403 | 58.3426 | 60.8249 |
| 0.0012 | 99.91 | 500 | 2.6315 | 61.441 | 52.9403 | 58.3426 | 60.8249 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.9.1
- Datasets 1.18.4
- Tokenizers 0.10.3
|
Rawat29/distilroberta-base-finetuned-wikitext2 | Rawat29 | 2022-03-08T16:19:47Z | 4 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "roberta", "fill-mask", "generated_from_trainer", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us"] | fill-mask | 2022-03-08T15:23:19Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: distilroberta-base-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilroberta-base-finetuned-wikitext2
This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.8512
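No usage example is given. A minimal fill-mask sketch (the prompt is illustrative) might be:
```python
from transformers import pipeline

# Load the fine-tuned checkpoint into a fill-mask pipeline.
fill = pipeline("fill-mask", model="Rawat29/distilroberta-base-finetuned-wikitext2")

# RoBERTa-style models use <mask> as the mask token.
print(fill("The capital of France is <mask>."))
```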
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.084 | 1.0 | 2406 | 1.9229 |
| 1.9999 | 2.0 | 4812 | 1.8832 |
| 1.9616 | 3.0 | 7218 | 1.8173 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
|
Rayyan/M | Rayyan | 2022-03-08T15:59:51Z | 0 | 0 | null | ["license:cc-by-nc-sa-4.0", "region:us"] | null | 2022-03-08T15:59:51Z |
---
license: cc-by-nc-sa-4.0
---
|
HuggingAlex1247/distilbert-base-german-europeana-cased-germeval_14 | HuggingAlex1247 | 2022-03-08T15:58:06Z | 7 | 0 | transformers | ["transformers", "tf", "tensorboard", "distilbert", "token-classification", "de", "dataset:germeval_14", "license:mit", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us"] | token-classification | 2022-03-08T15:23:57Z |
---
language:
- de
license: mit
datasets:
- germeval_14
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: distilbert-base-german-europeana-cased-germeval_14
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: germeval_14
type: germeval_14
args: default
metrics:
- name: precision
type: precision
value: 0.7437097717963721
- name: recall
type: recall
value: 0.7571485305798252
- name: f1
type: f1
value: 0.7503689855357669
- name: accuracy
type: accuracy
value: 0.9541357066185896
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-german-europeana-cased-germeval_14
This model is a fine-tuned version of [dbmdz/distilbert-base-german-europeana-cased](https://huggingface.co/dbmdz/distilbert-base-german-europeana-cased) on the germeval_14 dataset.
It achieves the following results on the evaluation set:
- precision: 0.7437
- recall: 0.7571
- f1: 0.7504
- accuracy: 0.9541
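The card has no usage section. Since only TensorFlow weights are tagged on the repo, a minimal NER sketch would request the TF backend explicitly (the example sentence is illustrative):
```python
from transformers import pipeline

# Only TF weights appear in the repo tags, so request the TensorFlow backend.
ner = pipeline("token-classification",
               model="HuggingAlex1247/distilbert-base-german-europeana-cased-germeval_14",
               framework="tf", aggregation_strategy="simple")

print(ner("Angela Merkel besuchte im Sommer Berlin."))
```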
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- num_train_epochs: 10
- train_batch_size: 8
- eval_batch_size: 8
- learning_rate: 1e-05
- weight_decay_rate: 0.01
- num_warmup_steps: 0
- fp16: True
### Framework versions
- Transformers 4.17.0
- Datasets 1.18.0
- Tokenizers 0.11.6
|
mmgyorke/vit-world-landmarks | mmgyorke | 2022-03-08T14:58:47Z | 72 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "vit", "image-classification", "huggingpics", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us"] | image-classification | 2022-03-08T14:40:15Z |
---
tags:
- image-classification
- pytorch
- huggingpics
metrics:
- accuracy
model-index:
- name: vit-world-landmarks
results:
- task:
name: Image Classification
type: image-classification
metrics:
- name: Accuracy
type: accuracy
value: 1.0
---
# vit-world-landmarks
Autogenerated by HuggingPics🤗🖼️
Create your own image classifier for **anything** by running [the demo on Google Colab](https://colab.research.google.com/github/nateraw/huggingpics/blob/main/HuggingPics.ipynb).
Report any issues with the demo at the [github repo](https://github.com/nateraw/huggingpics).
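A minimal inference sketch for this classifier (the image path is illustrative) could be:
```python
from transformers import pipeline

# Load the HuggingPics-trained ViT landmark classifier.
classifier = pipeline("image-classification", model="mmgyorke/vit-world-landmarks")

# Classify a landmark photo (path is illustrative).
print(classifier("landmark.jpg"))
```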
## Example Images
#### arc de triomphe

#### big ben

#### la sagrada familia

#### leaning tower of pisa

#### taj mahal

|
AlekseyKorshuk/bert-finetuned-ner | AlekseyKorshuk | 2022-03-08T14:27:56Z | 5 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "bert", "token-classification", "generated_from_trainer", "dataset:wnut_17", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us"] | token-classification | 2022-03-08T12:40:05Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- wnut_17
model-index:
- name: bert-finetuned-ner
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-ner
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the wnut_17 dataset.
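The card does not show how to run inference. A minimal NER sketch (the example sentence is illustrative) might be:
```python
from transformers import pipeline

# Group sub-token predictions into whole entities.
ner = pipeline("token-classification", model="AlekseyKorshuk/bert-finetuned-ner",
               aggregation_strategy="simple")

print(ner("Empire State Building is in New York."))
```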
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 425 | 0.3961 | 0.5707 | 0.2847 | 0.3799 | 0.9058 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
|
Noricum/wav2vec2-large-xlsr-53-german | Noricum | 2022-03-08T13:44:49Z | 5 | 0 | transformers | ["transformers", "pytorch", "jax", "wav2vec2", "automatic-speech-recognition", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2022-03-02T23:29:04Z |
# Wav2vec2 German Model
This model is a fine-tuned version of wav2vec2-large-xlsr-53 on the German CommonVoice dataset.
It achieves an 11.26% WER on the full test set.
It was basically trained with the code provided by [Max Idahl](https://huggingface.co/maxidl/wav2vec2-large-xlsr-german), with small adjustments to data preprocessing and training parameters.
You can use it to transcribe your own files with the code below. Note that your input file must be a single-channel *.wav encoded at 16 kHz. To convert an audio file with ffmpeg, use: `ffmpeg -i input.wav -ar 16000 -ac 1 output.wav`. Transcription is very memory-intensive (around 10 GB per 10 seconds of audio). If the script ends with "Killed", the Python interpreter ran out of memory; in that case, try a shorter audio file.
```python
# !pip3 install transformers torch soundfile
import soundfile as sf
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Tokenizer
# load pretrained model
tokenizer = Wav2Vec2Tokenizer.from_pretrained("Noricum/wav2vec2-large-xlsr-53-german")
model = Wav2Vec2ForCTC.from_pretrained("Noricum/wav2vec2-large-xlsr-53-german")
#load audio
audio_input, _ = sf.read("/path/to/your/audio.wav")
# transcribe
input_values = tokenizer(audio_input, return_tensors="pt").input_values
logits = model(input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)
transcription = tokenizer.batch_decode(predicted_ids)[0]
print(str(transcription))
```
To evaluate the model on the full CommonVoice test dataset, run this script:
```python
import re
import torch
import torchaudio
from datasets import load_dataset, load_metric
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor
test_dataset = load_dataset("common_voice", "de", split="test") # use "test[:1%]" for 1% sample
wer = load_metric("wer")
processor = Wav2Vec2Processor.from_pretrained("Noricum/wav2vec2-large-xlsr-53-german")
model = Wav2Vec2ForCTC.from_pretrained("Noricum/wav2vec2-large-xlsr-53-german")
model.to("cuda")
chars_to_ignore_regex = '[\,\?\.\!\-\;\:\"\“]'
resampler = torchaudio.transforms.Resample(48_000, 16_000)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def speech_file_to_array_fn(batch):
batch["sentence"] = re.sub(chars_to_ignore_regex, '', batch["sentence"]).lower()
speech_array, sampling_rate = torchaudio.load(batch["path"])
batch["speech"] = resampler(speech_array).squeeze().numpy()
return batch
test_dataset = test_dataset.map(speech_file_to_array_fn)
# Preprocessing the datasets.
# We need to read the audio files as arrays
def evaluate(batch):
inputs = processor(batch["speech"], sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
logits = model(inputs.input_values.to("cuda"), attention_mask=inputs.attention_mask.to("cuda")).logits
pred_ids = torch.argmax(logits, dim=-1)
batch["pred_strings"] = processor.batch_decode(pred_ids)
return batch
result = test_dataset.map(evaluate, batched=True, batch_size=4) # batch_size=8 -> requires ~14.5GB GPU memory
# Chunked version, see https://discuss.huggingface.co/t/spanish-asr-fine-tuning-wav2vec2/4586/5:
import jiwer
def chunked_wer(targets, predictions, chunk_size=None):
if chunk_size is None: return jiwer.wer(targets, predictions)
start = 0
end = chunk_size
H, S, D, I = 0, 0, 0, 0
while start < len(targets):
chunk_metrics = jiwer.compute_measures(targets[start:end], predictions[start:end])
H = H + chunk_metrics["hits"]
S = S + chunk_metrics["substitutions"]
D = D + chunk_metrics["deletions"]
I = I + chunk_metrics["insertions"]
start += chunk_size
end += chunk_size
return float(S + D + I) / float(H + S + D)
print("Total (chunk_size=1000), WER: {:2f}".format(100 * chunked_wer(result["pred_strings"], result["sentence"], chunk_size=1000)))
```
Output: Total (chunk_size=1000), WER: 11.256522
|
huggingtweets/feufillet-greatestquotes-hostagekiller | huggingtweets | 2022-03-08T13:28:29Z | 3 | 0 | transformers | ["transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us"] | text-generation | 2022-03-08T13:26:49Z |
---
language: en
thumbnail: http://www.huggingtweets.com/feufillet-greatestquotes-hostagekiller/1646746104400/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1197820815636672513/JSCZmPDf_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1473236995497500675/FtwXDZld_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/378800000520968918/d38fd96468e9ba14c1f9f022eb0c4e61_400x400.png')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">sexy.funny.cute.pix & HUSSY2K. & Great Minds Quotes</div>
<div style="text-align: center; font-size: 14px;">@feufillet-greatestquotes-hostagekiller</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from sexy.funny.cute.pix & HUSSY2K. & Great Minds Quotes.
| Data | sexy.funny.cute.pix | HUSSY2K. | Great Minds Quotes |
| --- | --- | --- | --- |
| Tweets downloaded | 3091 | 3191 | 3200 |
| Retweets | 149 | 865 | 0 |
| Short tweets | 576 | 374 | 2 |
| Tweets kept | 2366 | 1952 | 3198 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3afdee2s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @feufillet-greatestquotes-hostagekiller's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/25fcmxer) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/25fcmxer/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/feufillet-greatestquotes-hostagekiller')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
|
alirezafarashah/wav2vec2-base-ks | alirezafarashah | 2022-03-08T11:41:09Z | 4 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "wav2vec2", "audio-classification", "generated_from_trainer", "dataset:superb", "license:apache-2.0", "endpoints_compatible", "region:us"] | audio-classification | 2022-03-08T11:33:12Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- superb
metrics:
- accuracy
model-index:
- name: wav2vec2-base-ks
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-ks
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the superb dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0982
- Accuracy: 0.9825
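No usage example is provided. A minimal keyword-spotting sketch (the clip path is illustrative, and a short 16 kHz mono recording is assumed) might be:
```python
from transformers import pipeline

# Keyword-spotting (Speech Commands) audio classifier.
clf = pipeline("audio-classification", model="alirezafarashah/wav2vec2-base-ks")

# Classify a short 16 kHz clip (path is illustrative).
print(clf("keyword.wav"))
```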
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Accuracy | Validation Loss |
|:-------------:|:-----:|:----:|:--------:|:---------------:|
| 0.8465 | 1.0 | 399 | 0.8179 | 0.7516 |
| 0.2962 | 2.0 | 798 | 0.9771 | 0.2077 |
| 0.1891 | 3.0 | 1197 | 0.9819 | 0.1195 |
| 0.19 | 4.0 | 1596 | 0.9825 | 0.0982 |
| 0.1685 | 5.0 | 1995 | 0.9825 | 0.0952 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.14.0
- Tokenizers 0.10.3
|
StivenLancheros/biobert-base-cased-v1.2-finetuned-ner-Concat_CRAFT_es | StivenLancheros | 2022-03-08T10:57:12Z | 4 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "bert", "token-classification", "generated_from_trainer", "autotrain_compatible", "endpoints_compatible", "region:us"] | token-classification | 2022-03-08T09:29:54Z |
---
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: biobert-base-cased-v1.2-finetuned-ner-Concat_CRAFT_es
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# biobert-base-cased-v1.2-finetuned-ner-Concat_CRAFT_es
This model is a fine-tuned version of [dmis-lab/biobert-base-cased-v1.2](https://huggingface.co/dmis-lab/biobert-base-cased-v1.2) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2079
- Precision: 0.8487
- Recall: 0.8443
- F1: 0.8465
- Accuracy: 0.9693
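The card does not include a usage snippet. A minimal sketch that loads the checkpoint explicitly and runs biomedical NER (the Spanish example sentence is illustrative) could be:
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_name = "StivenLancheros/biobert-base-cased-v1.2-finetuned-ner-Concat_CRAFT_es"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)

# Group sub-token predictions into whole entities.
ner = pipeline("token-classification", model=model, tokenizer=tokenizer,
               aggregation_strategy="simple")
print(ner("El gen TP53 se expresa en células epiteliales."))
```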
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.0698 | 1.0 | 2719 | 0.1463 | 0.8132 | 0.8233 | 0.8182 | 0.9643 |
| 0.0321 | 2.0 | 5438 | 0.1612 | 0.8321 | 0.8463 | 0.8392 | 0.9681 |
| 0.0154 | 3.0 | 8157 | 0.1832 | 0.8448 | 0.8404 | 0.8426 | 0.9683 |
| 0.0058 | 4.0 | 10876 | 0.2079 | 0.8487 | 0.8443 | 0.8465 | 0.9693 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
|
SGrannemann/bert-finetuned-ner | SGrannemann | 2022-03-08T10:25:07Z | 4 | 0 | transformers | ["transformers", "tf", "bert", "token-classification", "generated_from_keras_callback", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us"] | token-classification | 2022-03-08T08:47:55Z |
---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: bert-finetuned-ner
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# bert-finetuned-ner
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0225
- Validation Loss: 0.0519
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 2631, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.0226 | 0.0519 | 0 |
| 0.0229 | 0.0519 | 1 |
| 0.0225 | 0.0519 | 2 |
### Framework versions
- Transformers 4.17.0
- TensorFlow 2.8.0
- Datasets 1.18.4
- Tokenizers 0.11.6
|
akshaychaudhary/distilbert-base-uncased-finetuned-devops1-ner
|
akshaychaudhary
| 2022-03-08T09:58:20Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"token-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2022-03-08T09:29:06Z |
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-devops1-ner
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-devops1-ner
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9870
- Precision: 0.0572
- Recall: 0.2689
- F1: 0.0944
- Accuracy: 0.7842
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 72 | 0.6027 | 0.0484 | 0.2269 | 0.0798 | 0.7861 |
| No log | 2.0 | 144 | 0.8631 | 0.0573 | 0.2857 | 0.0955 | 0.7771 |
| No log | 3.0 | 216 | 0.9870 | 0.0572 | 0.2689 | 0.0944 | 0.7842 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
|
brad1141/bert-finetuned-ner
|
brad1141
| 2022-03-08T09:31:59Z | 5 | 1 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"longformer",
"token-classification",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2022-03-02T23:29:05Z |
---
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: bert-finetuned-ner
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-ner
This model is a fine-tuned version of [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6434
- Precision: 0.8589
- Recall: 0.8686
- F1: 0.8637
- Accuracy: 0.8324
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
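For reference, the hyperparameters above map roughly onto `transformers.TrainingArguments`. A minimal sketch, not the exact training script:
```python
from transformers import TrainingArguments

# hypothetical reconstruction of the settings listed above
args = TrainingArguments(
    output_dir="bert-finetuned-ner",
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    gradient_accumulation_steps=8,  # 1 sample/step x 8 = total train batch size of 8
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5,
)
```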
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.615 | 1.0 | 1741 | 0.6111 | 0.8200 | 0.8652 | 0.8420 | 0.8046 |
| 0.4795 | 2.0 | 3482 | 0.5366 | 0.8456 | 0.8803 | 0.8626 | 0.8301 |
| 0.3705 | 3.0 | 5223 | 0.5412 | 0.8527 | 0.8786 | 0.8655 | 0.8339 |
| 0.2749 | 4.0 | 6964 | 0.5906 | 0.8559 | 0.8711 | 0.8634 | 0.8316 |
| 0.2049 | 5.0 | 8705 | 0.6434 | 0.8589 | 0.8686 | 0.8637 | 0.8324 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
|
vneralla/xlrs-53-finnish
|
vneralla
| 2022-03-08T08:59:34Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"jax",
"wav2vec2",
"pretraining",
"speech",
"automatic-speech-recognition",
"multilingual",
"dataset:common_voice",
"arxiv:2006.13979",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-02T23:29:05Z |
---
language: multilingual
datasets:
- common_voice
tags:
- speech
- automatic-speech-recognition
license: apache-2.0
---
# Wav2Vec2-XLSR-53
[Facebook's XLSR-Wav2Vec2](https://ai.facebook.com/blog/wav2vec-20-learning-the-structure-of-speech-from-raw-audio/)
The base model was pretrained on 16 kHz sampled speech audio. When using the model, make sure that your speech input is also sampled at 16 kHz. Note that this model should be fine-tuned on a downstream task, like Automatic Speech Recognition. Check out [this blog](https://huggingface.co/blog/fine-tune-wav2vec2-english) for more information.
[Paper](https://arxiv.org/abs/2006.13979)
Authors: Alexis Conneau, Alexei Baevski, Ronan Collobert, Abdelrahman Mohamed, Michael Auli
**Abstract**
This paper presents XLSR which learns cross-lingual speech representations by pretraining a single model from the raw waveform of speech in multiple languages. We build on wav2vec 2.0 which is trained by solving a contrastive task over masked latent speech representations and jointly learns a quantization of the latents shared across languages. The resulting model is fine-tuned on labeled data and experiments show that cross-lingual pretraining significantly outperforms monolingual pretraining. On the CommonVoice benchmark, XLSR shows a relative phoneme error rate reduction of 72% compared to the best known results. On BABEL, our approach improves word error rate by 16% relative compared to a comparable system. Our approach enables a single multilingual speech recognition model which is competitive to strong individual models. Analysis shows that the latent discrete speech representations are shared across languages with increased sharing for related languages. We hope to catalyze research in low-resource speech understanding by releasing XLSR-53, a large model pretrained in 53 languages.
The original model can be found under https://github.com/pytorch/fairseq/tree/master/examples/wav2vec#wav2vec-20.
# Usage
See [this notebook](https://colab.research.google.com/github/patrickvonplaten/notebooks/blob/master/Fine_Tune_XLSR_Wav2Vec2_on_Turkish_ASR_with_%F0%9F%A4%97_Transformers.ipynb) for more information on how to fine-tune the model.
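As a quick sanity check before fine-tuning, the pretrained encoder can be loaded directly to extract latent speech representations. A minimal sketch, assuming this repo hosts the standard XLSR-53 weights together with a feature-extractor config:
```python
import numpy as np
import torch
from transformers import Wav2Vec2FeatureExtractor, Wav2Vec2Model

extractor = Wav2Vec2FeatureExtractor.from_pretrained("vneralla/xlrs-53-finnish")
model = Wav2Vec2Model.from_pretrained("vneralla/xlrs-53-finnish")

waveform = np.zeros(16000, dtype=np.float32)  # stand-in for one second of 16 kHz audio
inputs = extractor(waveform, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    hidden_states = model(**inputs).last_hidden_state  # (1, frames, hidden_size)
```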

|
Ameer05/bart-large-cnn-samsum-rescom-finetuned-resume-summarizer-10-epoch-tweak-lr-8-10-1
|
Ameer05
| 2022-03-08T08:48:11Z | 12 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bart",
"text2text-generation",
"summarization",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
summarization
| 2022-03-08T08:28:45Z |
---
tags:
- summarization
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-large-cnn-samsum-rescom-finetuned-resume-summarizer-10-epoch-tweak-lr-8-10-1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bart-large-cnn-samsum-rescom-finetuned-resume-summarizer-10-epoch-tweak-lr-8-10-1
This model is a fine-tuned version of [Ameer05/model-token-repo](https://huggingface.co/Ameer05/model-token-repo) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4855
- Rouge1: 58.3832
- Rouge2: 49.9973
- Rougel: 55.3055
- Rougelsum: 57.7139
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| No log | 0.91 | 5 | 2.0183 | 52.3098 | 45.5304 | 49.2759 | 51.7456 |
| No log | 1.91 | 10 | 1.6564 | 61.815 | 53.9035 | 58.4243 | 60.784 |
| No log | 2.91 | 15 | 1.5330 | 61.3032 | 54.12 | 58.9152 | 60.7178 |
| No log | 3.91 | 20 | 1.4539 | 63.3012 | 56.2987 | 61.0907 | 62.5217 |
| 1.5646 | 4.91 | 25 | 1.4578 | 62.4815 | 55.1453 | 60.3921 | 61.6067 |
| 1.5646 | 5.91 | 30 | 1.4284 | 61.5347 | 54.1271 | 58.8474 | 60.5427 |
| 1.5646 | 6.91 | 35 | 1.4467 | 61.5081 | 53.8512 | 59.2782 | 60.6928 |
| 1.5646 | 7.91 | 40 | 1.4653 | 59.5349 | 51.8208 | 56.5996 | 58.8211 |
| 0.6692 | 8.91 | 45 | 1.4740 | 57.2917 | 49.5416 | 54.8409 | 56.6276 |
| 0.6692 | 9.91 | 50 | 1.4855 | 58.3832 | 49.9973 | 55.3055 | 57.7139 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.9.1
- Datasets 1.18.4
- Tokenizers 0.10.3
|
SGrannemann/distilbert-base-uncased-finetuned-imdb
|
SGrannemann
| 2022-03-08T08:36:19Z | 3 | 0 |
transformers
|
[
"transformers",
"tf",
"distilbert",
"fill-mask",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2022-03-08T07:43:47Z |
---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: distilbert-base-uncased-finetuned-imdb
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-imdb
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'WarmUp', 'config': {'initial_learning_rate': 2e-05, 'decay_schedule_fn': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': -688, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, '__passive_serialization__': True}, 'warmup_steps': 1000, 'power': 1.0, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
### Framework versions
- Transformers 4.17.0
- TensorFlow 2.8.0
- Tokenizers 0.11.6
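## How to use
A hedged fill-mask sketch; `framework="tf"` is used because only TensorFlow weights are tagged for this repo, and the example sentence is illustrative:
```python
from transformers import pipeline

unmasker = pipeline(
    "fill-mask",
    model="SGrannemann/distilbert-base-uncased-finetuned-imdb",
    framework="tf",
)
print(unmasker("This movie was absolutely [MASK]."))
```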
|
Narsil/totallysafe
|
Narsil
| 2022-03-08T08:23:38Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tf",
"gpt2",
"text-generation",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2022-03-07T22:13:40Z |
---
pipeline_tag: text-generation
---
This is a totally safe and groundbreaking model.
GPT-3 performance with a model under 10 MB.
|
cammy/bart-large-cnn-10k-pad-early-lit
|
cammy
| 2022-03-08T08:20:01Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bart",
"text2text-generation",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-08T05:46:12Z |
---
license: mit
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-large-cnn-10k-pad-early-lit
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bart-large-cnn-10k-pad-early-lit
This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3758
- Rouge1: 27.7351
- Rouge2: 13.1664
- Rougel: 21.6559
- Rougelsum: 24.648
- Gen Len: 69.343
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 0.2516 | 1.0 | 9998 | 0.3540 | 28.1151 | 13.3875 | 22.1496 | 25.1745 | 66.578 |
| 0.1747 | 2.0 | 19996 | 0.3758 | 27.7351 | 13.1664 | 21.6559 | 24.648 | 69.343 |
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.2
- Datasets 1.18.3
- Tokenizers 0.11.0
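## How to use
A minimal summarization sketch; the generation settings below are illustrative, not the values used during training:
```python
from transformers import pipeline

summarizer = pipeline("summarization", model="cammy/bart-large-cnn-10k-pad-early-lit")

article = "..."  # replace with the document to summarize
print(summarizer(article, max_length=100, min_length=30, do_sample=False)[0]["summary_text"])
```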
|
huggingtweets/betonkoepfin-littlehorney-plusbibi1
|
huggingtweets
| 2022-03-08T07:46:04Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"huggingtweets",
"en",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2022-03-08T07:44:50Z |
---
language: en
thumbnail: http://www.huggingtweets.com/betonkoepfin-littlehorney-plusbibi1/1646725560421/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1386970823681052680/oA_4HBKl_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1425205160578588673/LBMG1HOO_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1500892464772751365/6uhqt-Jx_400x400.jpg')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bibi und Anna & Betty S. & Vanny_Bunny™</div>
<div style="text-align: center; font-size: 14px;">@betonkoepfin-littlehorney-plusbibi1</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Bibi und Anna & Betty S. & Vanny_Bunny™.
| Data | Bibi und Anna | Betty S. | Vanny_Bunny™ |
| --- | --- | --- | --- |
| Tweets downloaded | 1818 | 3243 | 3185 |
| Retweets | 9 | 213 | 494 |
| Short tweets | 341 | 552 | 339 |
| Tweets kept | 1468 | 2478 | 2352 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3nxb6yoh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @betonkoepfin-littlehorney-plusbibi1's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/365gy60z) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/365gy60z/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/betonkoepfin-littlehorney-plusbibi1')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
|
zhiweitong/dpr-ctx_encoder-single-nq-base
|
zhiweitong
| 2022-03-08T07:28:29Z | 5 | 0 |
transformers
|
[
"transformers",
"pytorch",
"dpr",
"en",
"dataset:wiki_dpr",
"dataset:natural_questions",
"endpoints_compatible",
"region:us"
] | null | 2022-03-07T12:19:50Z |
---
language: en
datasets:
- wiki_dpr
- natural_questions
---
# dpr-ctx_encoder-single-nq-base
This encoder is used together with [zhiweitong/dpr-answer_encoder-single-nq-base](https://huggingface.co/zhiweitong/dpr-answer_encoder-single-nq-base).
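A minimal sketch of encoding a passage, assuming the checkpoint follows the standard DPR layout in `transformers`:
```python
from transformers import DPRContextEncoder, DPRContextEncoderTokenizer

tokenizer = DPRContextEncoderTokenizer.from_pretrained("zhiweitong/dpr-ctx_encoder-single-nq-base")
model = DPRContextEncoder.from_pretrained("zhiweitong/dpr-ctx_encoder-single-nq-base")

inputs = tokenizer("Paris is the capital of France.", return_tensors="pt")
embedding = model(**inputs).pooler_output  # dense passage vector for retrieval
```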
|
MikhailGalperin/distilbert-base-uncased-finetuned-ner
|
MikhailGalperin
| 2022-03-08T06:49:43Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"token-classification",
"generated_from_trainer",
"dataset:conll2003",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2022-03-07T20:29:52Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- conll2003
model-index:
- name: distilbert-base-uncased-finetuned-ner
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-ner
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the conll2003 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
|
Ameer05/bart-large-cnn-samsum-rescom-finetuned-resume-summarizer-10-epoch
|
Ameer05
| 2022-03-08T05:53:14Z | 9 | 1 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bart",
"text2text-generation",
"summarization",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
summarization
| 2022-03-08T05:33:06Z |
---
tags:
- summarization
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-large-cnn-samsum-rescom-finetuned-resume-summarizer-10-epoch
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bart-large-cnn-samsum-rescom-finetuned-resume-summarizer-10-epoch
This model is a fine-tuned version of [Ameer05/model-token-repo](https://huggingface.co/Ameer05/model-token-repo) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5216
- Rouge1: 59.5791
- Rouge2: 51.3273
- Rougel: 56.9984
- Rougelsum: 59.1424
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5.6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| No log | 0.91 | 5 | 2.0124 | 53.776 | 46.7427 | 50.7565 | 53.5502 |
| No log | 1.91 | 10 | 1.6353 | 61.8019 | 53.8614 | 58.9744 | 61.339 |
| No log | 2.91 | 15 | 1.5321 | 59.7045 | 51.5968 | 57.0823 | 59.2417 |
| No log | 3.91 | 20 | 1.4569 | 62.4379 | 54.5464 | 59.9202 | 61.9242 |
| 1.5608 | 4.91 | 25 | 1.4613 | 63.3808 | 55.8818 | 61.432 | 63.0208 |
| 1.5608 | 5.91 | 30 | 1.4321 | 59.6761 | 50.9812 | 56.7977 | 59.1214 |
| 1.5608 | 6.91 | 35 | 1.4753 | 62.6439 | 54.7158 | 60.3831 | 62.1046 |
| 1.5608 | 7.91 | 40 | 1.4783 | 60.2735 | 52.7462 | 57.77 | 59.9725 |
| 0.6428 | 8.91 | 45 | 1.4974 | 62.8691 | 54.9062 | 60.3496 | 62.5132 |
| 0.6428 | 9.91 | 50 | 1.5216 | 59.5791 | 51.3273 | 56.9984 | 59.1424 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.9.1
- Datasets 1.18.4
- Tokenizers 0.10.3
|
jiobiala24/wav2vec2-base-cv
|
jiobiala24
| 2022-03-08T05:42:48Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-08T00:03:37Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- common_voice
model-index:
- name: wav2vec2-base-cv
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-cv
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1562
- Wer: 0.3804
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 4.563 | 3.18 | 500 | 2.9826 | 1.0 |
| 2.0012 | 6.37 | 1000 | 0.9528 | 0.5354 |
| 0.4841 | 9.55 | 1500 | 0.8838 | 0.4325 |
| 0.2748 | 12.74 | 2000 | 0.9437 | 0.4130 |
| 0.1881 | 15.92 | 2500 | 0.9603 | 0.4005 |
| 0.1426 | 19.11 | 3000 | 1.0605 | 0.3955 |
| 0.1134 | 22.29 | 3500 | 1.0733 | 0.3897 |
| 0.0963 | 25.48 | 4000 | 1.1387 | 0.3835 |
| 0.0829 | 28.66 | 4500 | 1.1562 | 0.3804 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.13.3
- Tokenizers 0.10.3
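## How to use
A minimal greedy-decoding sketch, assuming the processor was saved alongside the model weights; the silent input array is a stand-in for real speech:
```python
import numpy as np
import torch
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

processor = Wav2Vec2Processor.from_pretrained("jiobiala24/wav2vec2-base-cv")
model = Wav2Vec2ForCTC.from_pretrained("jiobiala24/wav2vec2-base-cv")

speech = np.zeros(16000, dtype=np.float32)  # one second of 16 kHz audio
inputs = processor(speech, sampling_rate=16000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))  # greedy CTC decoding
```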
|
willcai/wav2vec2-large-xls-r-300m-tr-colab
|
willcai
| 2022-03-08T03:06:32Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-05T22:48:59Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- common_voice
model-index:
- name: wav2vec2-large-xls-r-300m-tr-colab
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-tr-colab
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4121
- Wer: 0.3112
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 4.1868 | 1.83 | 400 | 0.9812 | 0.8398 |
| 0.691 | 3.67 | 800 | 0.5571 | 0.6298 |
| 0.3555 | 5.5 | 1200 | 0.4676 | 0.4779 |
| 0.2451 | 7.34 | 1600 | 0.4572 | 0.4541 |
| 0.1844 | 9.17 | 2000 | 0.4743 | 0.4389 |
| 0.1541 | 11.01 | 2400 | 0.4583 | 0.4300 |
| 0.1277 | 12.84 | 2800 | 0.4565 | 0.3950 |
| 0.1122 | 14.68 | 3200 | 0.4761 | 0.4087 |
| 0.0975 | 16.51 | 3600 | 0.4654 | 0.3786 |
| 0.0861 | 18.35 | 4000 | 0.4503 | 0.3667 |
| 0.0775 | 20.18 | 4400 | 0.4600 | 0.3581 |
| 0.0666 | 22.02 | 4800 | 0.4350 | 0.3504 |
| 0.0627 | 23.85 | 5200 | 0.4211 | 0.3349 |
| 0.0558 | 25.69 | 5600 | 0.4390 | 0.3333 |
| 0.0459 | 27.52 | 6000 | 0.4218 | 0.3185 |
| 0.0439 | 29.36 | 6400 | 0.4121 | 0.3112 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.2+cu102
- Datasets 1.18.3
- Tokenizers 0.10.3
|
gayanin/t5-small-paraphrasing-mlm
|
gayanin
| 2022-03-08T01:54:54Z | 10 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-07T21:54:14Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: t5-small-paraphrasing-mlm
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# t5-small-paraphrasing-mlm
This model is a fine-tuned version of [gayanin/t5-small-paraphrase-pubmed](https://huggingface.co/gayanin/t5-small-paraphrase-pubmed) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7030
- Rouge2 Precision: 0.6576
- Rouge2 Recall: 0.4712
- Rouge2 Fmeasure: 0.532
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge2 Precision | Rouge2 Recall | Rouge2 Fmeasure |
|:-------------:|:-----:|:------:|:---------------:|:----------------:|:-------------:|:---------------:|
| 0.9215 | 1.0 | 13833 | 0.8050 | 0.6352 | 0.454 | 0.5131 |
| 0.855 | 2.0 | 27666 | 0.7679 | 0.6411 | 0.4589 | 0.5184 |
| 0.8387 | 3.0 | 41499 | 0.7464 | 0.6464 | 0.4626 | 0.5226 |
| 0.8267 | 4.0 | 55332 | 0.7315 | 0.6513 | 0.4671 | 0.5273 |
| 0.7879 | 5.0 | 69165 | 0.7217 | 0.6534 | 0.4687 | 0.529 |
| 0.7738 | 6.0 | 82998 | 0.7142 | 0.6548 | 0.4688 | 0.5295 |
| 0.7793 | 7.0 | 96831 | 0.7094 | 0.6553 | 0.4694 | 0.53 |
| 0.7654 | 8.0 | 110664 | 0.7056 | 0.6573 | 0.4704 | 0.5313 |
| 0.7675 | 9.0 | 124497 | 0.7036 | 0.6577 | 0.4712 | 0.532 |
| 0.7662 | 10.0 | 138330 | 0.7030 | 0.6576 | 0.4712 | 0.532 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
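## How to use
A minimal generation sketch; the card does not say whether a task prefix is expected, so the plain input below is an assumption:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("gayanin/t5-small-paraphrasing-mlm")
model = AutoModelForSeq2SeqLM.from_pretrained("gayanin/t5-small-paraphrasing-mlm")

text = "The patient was administered the drug intravenously."  # illustrative input
outputs = model.generate(**tokenizer(text, return_tensors="pt"), max_length=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```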
|
huggingtweets/lilbratmia-littlehorney-plusbibi1
|
huggingtweets
| 2022-03-07T21:45:31Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"huggingtweets",
"en",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2022-03-07T21:35:06Z |
---
language: en
thumbnail: http://www.huggingtweets.com/lilbratmia-littlehorney-plusbibi1/1646689525715/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1386970823681052680/oA_4HBKl_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1500892464772751365/6uhqt-Jx_400x400.jpg')">
</div>
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1483439308166123530/vKFDbs48_400x400.jpg')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Bibi und Anna & Vanny_Bunny™ & 💞 Mia 💞</div>
<div style="text-align: center; font-size: 14px;">@lilbratmia-littlehorney-plusbibi1</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from Bibi und Anna & Vanny_Bunny™ & 💞 Mia 💞.
| Data | Bibi und Anna | Vanny_Bunny™ | 💞 Mia 💞 |
| --- | --- | --- | --- |
| Tweets downloaded | 1818 | 3230 | 3247 |
| Retweets | 9 | 503 | 134 |
| Short tweets | 341 | 343 | 1189 |
| Tweets kept | 1468 | 2384 | 1924 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/hm55g9hx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @lilbratmia-littlehorney-plusbibi1's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3dezdv7k) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3dezdv7k/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/lilbratmia-littlehorney-plusbibi1')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
|
gayanin/bart-paraphrasing-mlm
|
gayanin
| 2022-03-07T21:40:56Z | 6 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bart",
"text2text-generation",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-07T14:50:28Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: bart-paraphrasing-mlm
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bart-paraphrasing-mlm
This model is a fine-tuned version of [gayanin/bart-paraphrase-pubmed-1.1](https://huggingface.co/gayanin/bart-paraphrase-pubmed-1.1) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5510
- Rouge2 Precision: 0.7148
- Rouge2 Recall: 0.5223
- Rouge2 Fmeasure: 0.5866
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge2 Precision | Rouge2 Recall | Rouge2 Fmeasure |
|:-------------:|:-----:|:-----:|:---------------:|:----------------:|:-------------:|:---------------:|
| 0.6799 | 1.0 | 13833 | 0.5982 | 0.7016 | 0.5122 | 0.5756 |
| 0.5894 | 2.0 | 27666 | 0.5663 | 0.7093 | 0.5193 | 0.583 |
| 0.5329 | 3.0 | 41499 | 0.5540 | 0.7129 | 0.5212 | 0.5853 |
| 0.4953 | 4.0 | 55332 | 0.5510 | 0.7148 | 0.5223 | 0.5866 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
|
Manauu17/roberta_sentiments_es
|
Manauu17
| 2022-03-07T20:10:33Z | 4 | 2 |
transformers
|
[
"transformers",
"pytorch",
"tf",
"roberta",
"text-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2022-03-03T13:50:56Z |
# roberta_sentiments_es, a Sentiment Analysis model for Spanish sentences
This is a roBERTa-base model trained on ~58M tweets and fine-tuned for sentiment analysis. The model currently supports Spanish sentences.
## Example of classification
```python
from transformers import AutoModelForSequenceClassification
from transformers import TFAutoModelForSequenceClassification
from transformers import AutoTokenizer
import numpy as np
import pandas as pd
from scipy.special import softmax

MODEL = 'Manauu17/roberta_sentiments_es_en'
tokenizer = AutoTokenizer.from_pretrained(MODEL)

# Assumed label mapping, inferred from the output columns shown below
labels_dict = {0: 'Negative', 1: 'Neutral', 2: 'Positive'}

# PyTorch
model = AutoModelForSequenceClassification.from_pretrained(MODEL)
text = ['@usuario siempre es bueno la opinión de un playo',
        'Bendito año el que me espera']
encoded_input = tokenizer(text, return_tensors='pt', padding=True, truncation=True)
output = model(**encoded_input)
scores = output[0].detach().numpy()

# TensorFlow
model = TFAutoModelForSequenceClassification.from_pretrained(MODEL)
text = ['La guerra no es buena para nadie.', 'Espero que mi jefe me de mañana libre']
encoded_input = tokenizer(text, return_tensors='tf', padding=True, truncation=True)
output = model(encoded_input)
scores = output[0].numpy()

# Results
def get_scores(model_output, labels_dict):
    # softmax over the class axis turns each row of logits into probabilities
    scores = softmax(model_output, axis=1)
    return pd.DataFrame(scores, columns=list(labels_dict.values()))
```
Output:
```
# PyTorch
get_scores(scores, labels_dict).style.highlight_max(axis=1, color="green")
Negative Neutral Positive
0 0.000607 0.004851 0.906596
1 0.079812 0.006650 0.001484
# TensorFlow
get_scores(scores, labels_dict).style.highlight_max(axis=1, color="green")
Negative Neutral Positive
0 0.017030 0.008920 0.000667
1 0.000260 0.001695 0.971429
```
|
espnet/Karthik_DSTC2_asr_train_asr_wav2vec_transformer
|
espnet
| 2022-03-07T19:38:16Z | 1 | 0 |
espnet
|
[
"espnet",
"tensorboard",
"audio",
"automatic-speech-recognition",
"en",
"dataset:sinhala",
"arxiv:1804.00015",
"license:cc-by-4.0",
"region:us"
] |
automatic-speech-recognition
| 2022-03-07T16:09:26Z |
---
tags:
- espnet
- audio
- automatic-speech-recognition
language: en
datasets:
- sinhala
license: cc-by-4.0
---
## ESPnet2 ASR pretrained model
### `espnet/Karthik_DSTC2_asr_train_asr_wav2vec_transformer`
This model was trained by Karthik using the DSTC2/asr1 recipe in [espnet](https://github.com/espnet/espnet/).
### Demo: How to use in ESPnet2
```python
# coming soon
```
### Citing ESPnet
```BibTex
@inproceedings{watanabe2018espnet,
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson {Enrique Yalta Soplin} and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
title={{ESPnet}: End-to-End Speech Processing Toolkit},
year={2018},
booktitle={Proceedings of Interspeech},
pages={2207--2211},
doi={10.21437/Interspeech.2018-1456},
url={http://dx.doi.org/10.21437/Interspeech.2018-1456}
}
```
or arXiv:
```bibtex
@misc{watanabe2018espnet,
title={ESPnet: End-to-End Speech Processing Toolkit},
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Enrique Yalta Soplin and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
year={2018},
eprint={1804.00015},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
pyf98/librispeech_conformer
|
pyf98
| 2022-03-07T18:33:17Z | 4 | 0 |
espnet
|
[
"espnet",
"audio",
"automatic-speech-recognition",
"en",
"dataset:librispeech",
"arxiv:1804.00015",
"license:cc-by-4.0",
"region:us"
] |
automatic-speech-recognition
| 2022-03-07T18:16:05Z |
---
tags:
- espnet
- audio
- automatic-speech-recognition
language: en
datasets:
- librispeech
license: cc-by-4.0
---
## ESPnet2 ASR model
### `pyf98/librispeech_conformer`
This model was trained by Yifan Peng using the librispeech recipe in [espnet](https://github.com/espnet/espnet/).
### Demo: How to use in ESPnet2
```bash
cd espnet
git checkout c3569453a408fd4ff4173d9c1d2062c88d1fc060
pip install -e .
cd egs2/librispeech/asr1
./run.sh --skip_data_prep false --skip_train true --download_model pyf98/librispeech_conformer
```
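For inference from Python rather than the recipe, the usual ESPnet2 pattern applies. A minimal sketch, assuming `espnet_model_zoo` is installed so the checkpoint can be pulled from the Hub; the decoding weights mirror one of the configurations in the RESULTS section:
```python
import soundfile
from espnet2.bin.asr_inference import Speech2Text

speech2text = Speech2Text.from_pretrained(
    "pyf98/librispeech_conformer",
    ctc_weight=0.3,  # illustrative; matches the beam60_ctc0.3 rows below
    beam_size=60,
)

speech, rate = soundfile.read("speech.wav")  # 16 kHz audio expected
text, *_ = speech2text(speech)[0]
print(text)
```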
<!-- Generated by scripts/utils/show_asr_result.sh -->
# RESULTS
## Environments
- date: `Mon Mar 7 12:26:10 EST 2022`
- python version: `3.9.7 (default, Sep 16 2021, 13:09:58) [GCC 7.5.0]`
- espnet version: `espnet 0.10.7a1`
- pytorch version: `pytorch 1.10.1`
- Git hash: `c3569453a408fd4ff4173d9c1d2062c88d1fc060`
- Commit date: `Sun Mar 6 23:58:36 2022 -0500`
## asr_train_asr_conformer8_raw_en_bpe5000_sp
### WER
|dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err|
|---|---|---|---|---|---|---|---|---|
|beam60_ctc0.2/dev_clean|2703|54402|98.0|1.8|0.2|0.2|2.2|27.2|
|beam60_ctc0.2/dev_other|2864|50948|95.1|4.4|0.5|0.5|5.4|43.3|
|beam60_ctc0.2/test_clean|2620|52576|97.9|1.9|0.2|0.3|2.4|28.8|
|beam60_ctc0.2/test_other|2939|52343|95.2|4.3|0.5|0.6|5.4|45.5|
|beam60_ctc0.2_lm0.6/dev_clean|2703|54402|98.3|1.4|0.3|0.2|1.9|23.7|
|beam60_ctc0.2_lm0.6/dev_other|2864|50948|96.2|3.3|0.4|0.4|4.2|37.2|
|beam60_ctc0.2_lm0.6/test_clean|2620|52576|98.2|1.5|0.3|0.2|2.0|24.3|
|beam60_ctc0.2_lm0.6/test_other|2939|52343|96.1|3.3|0.6|0.4|4.4|39.9|
|beam60_ctc0.3/dev_clean|2703|54402|98.1|1.8|0.2|0.2|2.1|27.3|
|beam60_ctc0.3/dev_other|2864|50948|95.2|4.4|0.4|0.5|5.4|43.7|
|beam60_ctc0.3/test_clean|2620|52576|97.9|1.9|0.2|0.3|2.3|29.0|
|beam60_ctc0.3/test_other|2939|52343|95.2|4.3|0.4|0.6|5.4|45.7|
|beam60_ctc0.3_lm0.6/dev_clean|2703|54402|98.4|1.4|0.2|0.2|1.8|23.5|
|beam60_ctc0.3_lm0.6/dev_other|2864|50948|96.2|3.4|0.4|0.4|4.1|37.4|
|beam60_ctc0.3_lm0.6/test_clean|2620|52576|98.3|1.5|0.2|0.2|1.9|24.1|
|beam60_ctc0.3_lm0.6/test_other|2939|52343|96.2|3.3|0.5|0.5|4.3|39.9|
### CER
|dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err|
|---|---|---|---|---|---|---|---|---|
|beam60_ctc0.2/dev_clean|2703|288456|99.4|0.3|0.3|0.2|0.8|27.2|
|beam60_ctc0.2/dev_other|2864|265951|98.1|1.1|0.8|0.6|2.5|43.3|
|beam60_ctc0.2/test_clean|2620|281530|99.4|0.3|0.3|0.2|0.8|28.8|
|beam60_ctc0.2/test_other|2939|272758|98.3|1.0|0.7|0.6|2.3|45.5|
|beam60_ctc0.2_lm0.6/dev_clean|2703|288456|99.4|0.3|0.3|0.2|0.8|23.7|
|beam60_ctc0.2_lm0.6/dev_other|2864|265951|98.4|0.9|0.7|0.5|2.1|37.2|
|beam60_ctc0.2_lm0.6/test_clean|2620|281530|99.4|0.2|0.4|0.2|0.8|24.3|
|beam60_ctc0.2_lm0.6/test_other|2939|272758|98.5|0.8|0.8|0.5|2.0|39.9|
|beam60_ctc0.3/dev_clean|2703|288456|99.5|0.3|0.2|0.2|0.7|27.3|
|beam60_ctc0.3/dev_other|2864|265951|98.2|1.1|0.7|0.6|2.4|43.7|
|beam60_ctc0.3/test_clean|2620|281530|99.4|0.3|0.3|0.2|0.8|29.0|
|beam60_ctc0.3/test_other|2939|272758|98.4|0.9|0.7|0.6|2.2|45.7|
|beam60_ctc0.3_lm0.6/dev_clean|2703|288456|99.5|0.2|0.2|0.2|0.7|23.5|
|beam60_ctc0.3_lm0.6/dev_other|2864|265951|98.5|0.9|0.7|0.5|2.0|37.4|
|beam60_ctc0.3_lm0.6/test_clean|2620|281530|99.5|0.2|0.3|0.2|0.7|24.1|
|beam60_ctc0.3_lm0.6/test_other|2939|272758|98.6|0.7|0.7|0.5|1.9|39.9|
### TER
|dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err|
|---|---|---|---|---|---|---|---|---|
|beam60_ctc0.2/dev_clean|2703|68010|97.5|1.8|0.7|0.3|2.9|27.2|
|beam60_ctc0.2/dev_other|2864|63110|94.1|4.4|1.6|0.9|6.8|43.3|
|beam60_ctc0.2/test_clean|2620|65818|97.4|1.8|0.8|0.3|2.9|28.8|
|beam60_ctc0.2/test_other|2939|65101|94.1|4.1|1.8|0.8|6.7|45.5|
|beam60_ctc0.2_lm0.6/dev_clean|2703|68010|97.8|1.4|0.8|0.3|2.5|23.7|
|beam60_ctc0.2_lm0.6/dev_other|2864|63110|95.1|3.5|1.5|0.7|5.6|37.2|
|beam60_ctc0.2_lm0.6/test_clean|2620|65818|97.6|1.5|0.9|0.3|2.7|24.3|
|beam60_ctc0.2_lm0.6/test_other|2939|65101|95.0|3.2|1.8|0.6|5.6|39.9|
|beam60_ctc0.3/dev_clean|2703|68010|97.6|1.8|0.7|0.3|2.8|27.3|
|beam60_ctc0.3/dev_other|2864|63110|94.1|4.4|1.5|0.9|6.8|43.7|
|beam60_ctc0.3/test_clean|2620|65818|97.4|1.8|0.7|0.3|2.9|29.0|
|beam60_ctc0.3/test_other|2939|65101|94.2|4.1|1.7|0.8|6.6|45.7|
|beam60_ctc0.3_lm0.6/dev_clean|2703|68010|97.9|1.5|0.7|0.3|2.4|23.5|
|beam60_ctc0.3_lm0.6/dev_other|2864|63110|95.1|3.5|1.4|0.6|5.6|37.4|
|beam60_ctc0.3_lm0.6/test_clean|2620|65818|97.7|1.5|0.8|0.3|2.5|24.1|
|beam60_ctc0.3_lm0.6/test_other|2939|65101|95.1|3.2|1.7|0.6|5.5|39.9|
## ASR config
<details><summary>expand</summary>
```
config: conf/tuning/train_asr_conformer8.yaml
print_config: false
log_level: INFO
dry_run: false
iterator_type: sequence
output_dir: exp/asr_train_asr_conformer8_raw_en_bpe5000_sp
ngpu: 1
seed: 0
num_workers: 4
num_att_plot: 3
dist_backend: nccl
dist_init_method: env://
dist_world_size: 3
dist_rank: 0
local_rank: 0
dist_master_addr: localhost
dist_master_port: 59673
dist_launcher: null
multiprocessing_distributed: true
unused_parameters: false
sharded_ddp: false
cudnn_enabled: true
cudnn_benchmark: false
cudnn_deterministic: true
collect_stats: false
write_collected_feats: false
max_epoch: 50
patience: null
val_scheduler_criterion:
- valid
- loss
early_stopping_criterion:
- valid
- loss
- min
best_model_criterion:
- - valid
- acc
- max
keep_nbest_models: 10
nbest_averaging_interval: 0
grad_clip: 5.0
grad_clip_type: 2.0
grad_noise: false
accum_grad: 4
no_forward_run: false
resume: true
train_dtype: float32
use_amp: true
log_interval: null
use_matplotlib: true
use_tensorboard: true
use_wandb: false
wandb_project: null
wandb_id: null
wandb_entity: null
wandb_name: null
wandb_model_log_interval: -1
detect_anomaly: false
pretrain_path: null
init_param: []
ignore_init_mismatch: false
freeze_param: []
num_iters_per_epoch: null
batch_size: 20
valid_batch_size: null
batch_bins: 35000000
valid_batch_bins: null
train_shape_file:
- exp/asr_stats_raw_en_bpe5000_sp/train/speech_shape
- exp/asr_stats_raw_en_bpe5000_sp/train/text_shape.bpe
valid_shape_file:
- exp/asr_stats_raw_en_bpe5000_sp/valid/speech_shape
- exp/asr_stats_raw_en_bpe5000_sp/valid/text_shape.bpe
batch_type: numel
valid_batch_type: null
fold_length:
- 80000
- 150
sort_in_batch: descending
sort_batch: descending
multiple_iterator: false
chunk_length: 500
chunk_shift_ratio: 0.5
num_cache_chunks: 1024
train_data_path_and_name_and_type:
- - dump/raw/train_960_sp/wav.scp
- speech
- sound
- - dump/raw/train_960_sp/text
- text
- text
valid_data_path_and_name_and_type:
- - dump/raw/dev/wav.scp
- speech
- sound
- - dump/raw/dev/text
- text
- text
allow_variable_data_keys: false
max_cache_size: 0.0
max_cache_fd: 32
valid_max_cache_size: null
optim: adam
optim_conf:
lr: 0.0025
weight_decay: 1.0e-06
scheduler: warmuplr
scheduler_conf:
warmup_steps: 40000
token_list:
- <blank>
- <unk>
- ▁THE
- S
- ▁AND
- ▁OF
- ▁TO
- ▁A
- ▁IN
- ▁I
- ▁HE
- ▁THAT
- ▁WAS
- ED
- ▁IT
- ''''
- ▁HIS
- ING
- ▁YOU
- ▁WITH
- ▁FOR
- ▁HAD
- T
- ▁AS
- ▁HER
- ▁IS
- ▁BE
- ▁BUT
- ▁NOT
- ▁SHE
- D
- ▁AT
- ▁ON
- LY
- ▁HIM
- ▁THEY
- ▁ALL
- ▁HAVE
- ▁BY
- ▁SO
- ▁THIS
- ▁MY
- ▁WHICH
- ▁ME
- ▁SAID
- ▁FROM
- ▁ONE
- Y
- E
- ▁WERE
- ▁WE
- ▁NO
- N
- ▁THERE
- ▁OR
- ER
- ▁AN
- ▁WHEN
- ▁ARE
- ▁THEIR
- ▁WOULD
- ▁IF
- ▁WHAT
- ▁THEM
- ▁WHO
- ▁OUT
- M
- ▁DO
- ▁WILL
- ▁UP
- ▁BEEN
- P
- R
- ▁MAN
- ▁THEN
- ▁COULD
- ▁MORE
- C
- ▁INTO
- ▁NOW
- ▁VERY
- ▁YOUR
- ▁SOME
- ▁LITTLE
- ES
- ▁TIME
- RE
- ▁CAN
- ▁LIKE
- LL
- ▁ABOUT
- ▁HAS
- ▁THAN
- ▁DID
- ▁UPON
- ▁OVER
- IN
- ▁ANY
- ▁WELL
- ▁ONLY
- B
- ▁SEE
- ▁GOOD
- ▁OTHER
- ▁TWO
- L
- ▁KNOW
- ▁GO
- ▁DOWN
- ▁BEFORE
- A
- AL
- ▁OUR
- ▁OLD
- ▁SHOULD
- ▁MADE
- ▁AFTER
- ▁GREAT
- ▁DAY
- ▁MUST
- ▁COME
- ▁HOW
- ▁SUCH
- ▁CAME
- LE
- ▁WHERE
- ▁US
- ▁NEVER
- ▁THESE
- ▁MUCH
- ▁DE
- ▁MISTER
- ▁WAY
- G
- ▁S
- ▁MAY
- ATION
- ▁LONG
- OR
- ▁AM
- ▁FIRST
- ▁BACK
- ▁OWN
- ▁RE
- ▁AGAIN
- ▁SAY
- ▁MEN
- ▁WENT
- ▁HIMSELF
- ▁HERE
- NESS
- ▁THINK
- V
- IC
- ▁EVEN
- ▁THOUGHT
- ▁HAND
- ▁JUST
- ▁O
- ▁UN
- VE
- ION
- ▁ITS
- 'ON'
- ▁MAKE
- ▁MIGHT
- ▁TOO
- K
- ▁AWAY
- ▁LIFE
- TH
- ▁WITHOUT
- ST
- ▁THROUGH
- ▁MOST
- ▁TAKE
- ▁DON
- ▁EVERY
- F
- O
- ▁SHALL
- ▁THOSE
- ▁EYES
- AR
- ▁STILL
- ▁LAST
- ▁HOUSE
- ▁HEAD
- ABLE
- ▁NOTHING
- ▁NIGHT
- ITY
- ▁LET
- ▁MANY
- ▁OFF
- ▁BEING
- ▁FOUND
- ▁WHILE
- EN
- ▁SAW
- ▁GET
- ▁PEOPLE
- ▁FACE
- ▁YOUNG
- CH
- ▁UNDER
- ▁ONCE
- ▁TELL
- AN
- ▁THREE
- ▁PLACE
- ▁ROOM
- ▁YET
- ▁SAME
- IL
- US
- U
- ▁FATHER
- ▁RIGHT
- EL
- ▁THOUGH
- ▁ANOTHER
- LI
- RI
- ▁HEART
- IT
- ▁PUT
- ▁TOOK
- ▁GIVE
- ▁EVER
- ▁E
- ▁PART
- ▁WORK
- ERS
- ▁LOOK
- ▁NEW
- ▁KING
- ▁MISSUS
- ▁SIR
- ▁LOVE
- ▁MIND
- ▁LOOKED
- W
- RY
- ▁ASKED
- ▁LEFT
- ET
- ▁LIGHT
- CK
- ▁DOOR
- ▁MOMENT
- RO
- ▁WORLD
- ▁THINGS
- ▁HOME
- UL
- ▁THING
- LA
- ▁WHY
- ▁MOTHER
- ▁ALWAYS
- ▁FAR
- FUL
- ▁WATER
- CE
- IVE
- UR
- ▁HEARD
- ▁SOMETHING
- ▁SEEMED
- I
- LO
- ▁BECAUSE
- OL
- ▁END
- ▁TOLD
- ▁CON
- ▁YES
- ▁GOING
- ▁GOT
- RA
- IR
- ▁WOMAN
- ▁GOD
- EST
- TED
- ▁FIND
- ▁KNEW
- ▁SOON
- ▁EACH
- ▁SIDE
- H
- TON
- MENT
- ▁OH
- NE
- Z
- LING
- ▁AGAINST
- TER
- ▁NAME
- ▁MISS
- ▁QUITE
- ▁WANT
- ▁YEARS
- ▁FEW
- ▁BETTER
- ENT
- ▁HALF
- ▁DONE
- ▁ALSO
- ▁BEGAN
- ▁HAVING
- ▁ENOUGH
- IS
- ▁LADY
- ▁WHOLE
- LESS
- ▁BOTH
- ▁SEEN
- ▁SET
- ▁WHITE
- ▁COURSE
- IES
- ▁VOICE
- ▁CALLED
- ▁D
- ▁EX
- ATE
- ▁TURNED
- ▁GAVE
- ▁C
- ▁POOR
- MAN
- UT
- NA
- ▁DEAR
- ISH
- ▁GIRL
- ▁MORNING
- ▁BETWEEN
- LED
- ▁NOR
- IA
- ▁AMONG
- MA
- ▁
- ▁SMALL
- ▁REST
- ▁WHOM
- ▁FELT
- ▁HANDS
- ▁MYSELF
- ▁HIGH
- ▁M
- ▁HOWEVER
- ▁HERSELF
- ▁P
- CO
- ▁STOOD
- ID
- ▁KIND
- ▁HUNDRED
- AS
- ▁ROUND
- ▁ALMOST
- TY
- ▁SINCE
- ▁G
- AM
- ▁LA
- SE
- ▁BOY
- ▁MA
- ▁PERHAPS
- ▁WORDS
- ATED
- ▁HO
- X
- ▁MO
- ▁SAT
- ▁REPLIED
- ▁FOUR
- ▁ANYTHING
- ▁TILL
- ▁UNTIL
- ▁BLACK
- TION
- ▁CRIED
- RU
- TE
- ▁FACT
- ▁HELP
- ▁NEXT
- ▁LOOKING
- ▁DOES
- ▁FRIEND
- ▁LAY
- ANCE
- ▁POWER
- ▁BROUGHT
- VER
- ▁FIRE
- ▁KEEP
- PO
- FF
- ▁COUNTRY
- ▁SEA
- ▁WORD
- ▁CAR
- ▁DAYS
- ▁TOGETHER
- ▁IMP
- ▁REASON
- KE
- ▁INDEED
- TING
- ▁MATTER
- ▁FULL
- ▁TEN
- TIC
- ▁LAND
- ▁RATHER
- ▁AIR
- ▁HOPE
- ▁DA
- ▁OPEN
- ▁FEET
- ▁EN
- ▁FIVE
- ▁POINT
- ▁CO
- OM
- ▁LARGE
- ▁B
- ▁CL
- ME
- ▁GONE
- ▁CHILD
- INE
- GG
- ▁BEST
- ▁DIS
- UM
- ▁HARD
- ▁LORD
- OUS
- ▁WIFE
- ▁SURE
- ▁FORM
- DE
- ▁DEATH
- ANT
- ▁NATURE
- ▁BA
- ▁CARE
- ▁BELIEVE
- PP
- ▁NEAR
- ▁RO
- ▁RED
- ▁WAR
- IE
- ▁SPEAK
- ▁FEAR
- ▁CASE
- ▁TAKEN
- ▁ALONG
- ▁CANNOT
- ▁HEAR
- ▁THEMSELVES
- CI
- ▁PRESENT
- AD
- ▁MASTER
- ▁SON
- ▁THUS
- ▁LI
- ▁LESS
- ▁SUN
- ▁TRUE
- IM
- IOUS
- ▁THOUSAND
- ▁MONEY
- ▁W
- ▁BEHIND
- ▁CHILDREN
- ▁DOCTOR
- AC
- ▁TWENTY
- ▁WISH
- ▁SOUND
- ▁WHOSE
- ▁LEAVE
- ▁ANSWERED
- ▁THOU
- ▁DUR
- ▁HA
- ▁CERTAIN
- ▁PO
- ▁PASSED
- GE
- TO
- ▁ARM
- ▁LO
- ▁STATE
- ▁ALONE
- TA
- ▁SHOW
- ▁NEED
- ▁LIVE
- ND
- ▁DEAD
- ENCE
- ▁STRONG
- ▁PRE
- ▁TI
- ▁GROUND
- SH
- TI
- ▁SHORT
- IAN
- UN
- ▁PRO
- ▁HORSE
- MI
- ▁PRINCE
- ARD
- ▁FELL
- ▁ORDER
- ▁CALL
- AT
- ▁GIVEN
- ▁DARK
- ▁THEREFORE
- ▁CLOSE
- ▁BODY
- ▁OTHERS
- ▁SENT
- ▁SECOND
- ▁OFTEN
- ▁CA
- ▁MANNER
- MO
- NI
- ▁BRING
- ▁QUESTION
- ▁HOUR
- ▁BO
- AGE
- ▁ST
- ▁TURN
- ▁TABLE
- ▁GENERAL
- ▁EARTH
- ▁BED
- ▁REALLY
- ▁SIX
- 'NO'
- IST
- ▁BECOME
- ▁USE
- ▁READ
- ▁SE
- ▁VI
- ▁COMING
- ▁EVERYTHING
- ▁EM
- ▁ABOVE
- ▁EVENING
- ▁BEAUTIFUL
- ▁FEEL
- ▁RAN
- ▁LEAST
- ▁LAW
- ▁ALREADY
- ▁MEAN
- ▁ROSE
- WARD
- ▁ITSELF
- ▁SOUL
- ▁SUDDENLY
- ▁AROUND
- RED
- ▁ANSWER
- ICAL
- ▁RA
- ▁WIND
- ▁FINE
- ▁WON
- ▁WHETHER
- ▁KNOWN
- BER
- NG
- ▁TA
- ▁CAPTAIN
- ▁EYE
- ▁PERSON
- ▁WOMEN
- ▁SORT
- ▁ASK
- ▁BROTHER
- ▁USED
- ▁HELD
- ▁BIG
- ▁RETURNED
- ▁STRANGE
- ▁BU
- ▁PER
- ▁FREE
- ▁EITHER
- ▁WITHIN
- ▁DOUBT
- ▁YEAR
- ▁CLEAR
- ▁SIGHT
- ▁GRA
- ▁LOST
- ▁KEPT
- ▁F
- PE
- ▁BAR
- ▁TOWN
- ▁SLEEP
- ARY
- ▁HAIR
- ▁FRIENDS
- ▁DREAM
- ▁FELLOW
- PER
- ▁DEEP
- QUE
- ▁BECAME
- ▁REAL
- ▁PAST
- ▁MAKING
- RING
- ▁COMP
- ▁ACT
- ▁BAD
- HO
- STER
- ▁YE
- ▁MEANS
- ▁RUN
- MEN
- ▁DAUGHTER
- ▁SENSE
- ▁CITY
- ▁SOMETIMES
- ▁TOWARDS
- ▁ROAD
- ▁SP
- ▁LU
- ▁READY
- ▁FOOT
- ▁COLD
- ▁SA
- ▁LETTER
- ▁ELSE
- ▁MAR
- ▁STA
- BE
- ▁TRUTH
- ▁LE
- BO
- ▁BUSINESS
- CHE
- ▁JOHN
- ▁SUBJECT
- ▁COURT
- ▁IDEA
- ILY
- ▁RIVER
- ATING
- ▁FAMILY
- HE
- ▁DIDN
- ▁GLAD
- ▁SEVERAL
- IAL
- ▁UNDERSTAND
- ▁SC
- ▁POSSIBLE
- ▁DIFFERENT
- ▁RETURN
- ▁ARMS
- ▁LOW
- ▁HOLD
- ▁TALK
- ▁RU
- ▁WINDOW
- ▁INTEREST
- ▁SISTER
- SON
- ▁SH
- ▁BLOOD
- ▁SAYS
- ▁CAP
- ▁DI
- ▁HUMAN
- ▁CAUSE
- NCE
- ▁THANK
- ▁LATE
- GO
- ▁CUT
- ▁ACROSS
- ▁STORY
- NT
- ▁COUNT
- ▁ABLE
- DY
- LEY
- ▁NUMBER
- ▁STAND
- ▁CHURCH
- ▁THY
- ▁SUPPOSE
- LES
- BLE
- OP
- ▁EFFECT
- BY
- ▁K
- ▁NA
- ▁SPOKE
- ▁MET
- ▁GREEN
- ▁HUSBAND
- ▁RESPECT
- ▁PA
- ▁FOLLOWED
- ▁REMEMBER
- ▁LONGER
- ▁AGE
- ▁TAKING
- ▁LINE
- ▁SEEM
- ▁HAPPY
- LAND
- EM
- ▁STAY
- ▁PLAY
- ▁COMMON
- ▁GA
- ▁BOOK
- ▁TIMES
- ▁OBJECT
- ▁SEVEN
- QUI
- DO
- UND
- ▁FL
- ▁PRETTY
- ▁FAIR
- WAY
- ▁WOOD
- ▁REACHED
- ▁APPEARED
- ▁SWEET
- ▁FALL
- BA
- ▁PASS
- ▁SIGN
- ▁TREE
- IONS
- ▁GARDEN
- ▁ILL
- ▁ART
- ▁REMAIN
- ▁OPENED
- ▁BRIGHT
- ▁STREET
- ▁TROUBLE
- ▁PAIN
- ▁CONTINUED
- ▁SCHOOL
- OUR
- ▁CARRIED
- ▁SAYING
- HA
- ▁CHANGE
- ▁FOLLOW
- ▁GOLD
- ▁SW
- ▁FEELING
- ▁COMMAND
- ▁BEAR
- ▁CERTAINLY
- ▁BLUE
- ▁NE
- CA
- ▁WILD
- ▁ACCOUNT
- ▁OUGHT
- UD
- ▁T
- ▁BREATH
- ▁WANTED
- ▁RI
- ▁HEAVEN
- ▁PURPOSE
- ▁CHARACTER
- ▁RICH
- ▁PE
- ▁DRESS
- OS
- FA
- ▁TH
- ▁ENGLISH
- ▁CHANCE
- ▁SHIP
- ▁VIEW
- ▁TOWARD
- AK
- ▁JOY
- ▁JA
- ▁HAR
- ▁NEITHER
- ▁FORCE
- ▁UNCLE
- DER
- ▁PLAN
- ▁PRINCESS
- DI
- ▁CHIEF
- ▁HAT
- ▁LIVED
- ▁AB
- ▁VISIT
- ▁MOR
- TEN
- ▁WALL
- UC
- ▁MINE
- ▁PLEASURE
- ▁SMILE
- ▁FRONT
- ▁HU
- ▁DEAL
- OW
- ▁FURTHER
- GED
- ▁TRIED
- DA
- VA
- ▁NONE
- ▁ENTERED
- ▁QUEEN
- ▁PAY
- ▁EL
- ▁EXCEPT
- ▁SHA
- ▁FORWARD
- ▁EIGHT
- ▁ADDED
- ▁PUBLIC
- ▁EIGHTEEN
- ▁STAR
- ▁HAPPENED
- ▁LED
- ▁WALKED
- ▁ALTHOUGH
- ▁LATER
- ▁SPIRIT
- ▁WALK
- ▁BIT
- ▁MEET
- LIN
- ▁FI
- LT
- ▁MOUTH
- ▁WAIT
- ▁HOURS
- ▁LIVING
- ▁YOURSELF
- ▁FAST
- ▁CHA
- ▁HALL
- ▁BEYOND
- ▁BOAT
- ▁SECRET
- ENS
- ▁CHAIR
- RN
- ▁RECEIVED
- ▁CAT
- RESS
- ▁DESIRE
- ▁GENTLEMAN
- UGH
- ▁LAID
- EVER
- ▁OCCASION
- ▁WONDER
- ▁GU
- ▁PARTY
- DEN
- ▁FISH
- ▁SEND
- ▁NEARLY
- ▁TRY
- CON
- ▁SEEMS
- RS
- ▁BELL
- ▁BRA
- ▁SILENCE
- IG
- ▁GUARD
- ▁DIE
- ▁DOING
- ▁TU
- ▁COR
- ▁EARLY
- ▁BANK
- ▁FIGURE
- IF
- ▁ENGLAND
- ▁MARY
- ▁AFRAID
- LER
- ▁FO
- ▁WATCH
- ▁FA
- ▁VA
- ▁GRE
- ▁AUNT
- PED
- ▁SERVICE
- ▁JE
- ▁PEN
- ▁MINUTES
- ▁PAN
- ▁TREES
- NED
- ▁GLASS
- ▁TONE
- ▁PLEASE
- ▁FORTH
- ▁CROSS
- ▁EXCLAIMED
- ▁DREW
- ▁EAT
- ▁AH
- ▁GRAVE
- ▁CUR
- PA
- URE
- CENT
- ▁MILES
- ▁SOFT
- ▁AGO
- ▁POSITION
- ▁WARM
- ▁LENGTH
- ▁NECESSARY
- ▁THINKING
- ▁PICTURE
- ▁PI
- SHIP
- IBLE
- ▁HEAVY
- ▁ATTENTION
- ▁DOG
- ABLY
- ▁STANDING
- ▁NATURAL
- ▁APPEAR
- OV
- ▁CAUGHT
- VO
- ISM
- ▁SPRING
- ▁EXPERIENCE
- ▁PAT
- OT
- ▁STOPPED
- ▁REGARD
- ▁HARDLY
- ▁SELF
- ▁STRENGTH
- ▁GREW
- ▁KNIGHT
- ▁OPINION
- ▁WIDE
- ▁INSTEAD
- ▁SOUTH
- ▁TRANS
- ▁CORNER
- ▁LEARN
- ▁ISLAND
- ▁MI
- ▁THIRD
- ▁STE
- ▁STRAIGHT
- ▁TEA
- ▁BOUND
- ▁SEEING
- ▁JU
- ▁DINNER
- ▁BEAUTY
- ▁PEACE
- AH
- ▁REP
- ▁SILENT
- ▁CRE
- ALLY
- RIC
- ▁STEP
- ▁VER
- ▁JO
- GER
- ▁SITTING
- ▁THIRTY
- ▁SAVE
- ENED
- ▁GLANCE
- ▁REACH
- ▁ACTION
- ▁SAL
- ▁SAD
- ▁STONE
- ITIES
- ▁FRENCH
- ▁STRUCK
- ▁PAPER
- ▁WHATEVER
- ▁SUB
- ▁DISTANCE
- ▁WRONG
- ▁KNOWLEDGE
- ▁SAFE
- ▁SNOW
- ▁MUSIC
- ▁FIFTY
- RON
- ▁ATTEMPT
- ▁GOVERNMENT
- TU
- ▁CROWD
- ▁BESIDES
- ▁LOVED
- ▁BOX
- ▁DIRECTION
- ▁TRAIN
- ▁NORTH
- ▁THICK
- ▁GETTING
- AV
- ▁FLOOR
- ▁COMPANY
- ▁BLOW
- ▁PLAIN
- TRO
- ▁BESIDE
- ▁ROCK
- ▁IMMEDIATELY
- FI
- ▁SHADOW
- ▁SIT
- ORS
- ILE
- ▁DRINK
- ▁SPOT
- ▁DANGER
- ▁AL
- ▁SAINT
- ▁SLOWLY
- ▁PALACE
- IER
- ▁RESULT
- ▁PETER
- ▁FOREST
- ▁BELONG
- ▁SU
- ▁PAR
- RIS
- ▁TEARS
- ▁APPEARANCE
- ▁GATE
- BU
- ITION
- ▁QUICKLY
- ▁QUIET
- ▁LONDON
- ▁START
- ▁BROWN
- TRA
- KIN
- ▁CONSIDER
- ▁BATTLE
- ▁ANNE
- ▁PIECE
- ▁DIED
- ▁SUCCESS
- ▁LIPS
- ▁FILLED
- ▁FORGET
- ▁POST
- IFIED
- ▁MARGARET
- ▁FOOD
- HAM
- ▁PLEASANT
- ▁FE
- ▁EXPRESSION
- ▁POCKET
- ▁FRESH
- ▁WEAR
- TRI
- ▁BROKEN
- ▁LAUGHED
- GING
- ▁FOLLOWING
- WN
- IP
- ▁TOUCH
- ▁YOUTH
- ATIVE
- ▁LEG
- ▁WEEK
- ▁REMAINED
- ▁EASY
- NER
- RK
- ▁ENTER
- ▁FIGHT
- ▁PLACED
- ▁TRAVEL
- ▁SIMPLE
- ▁GIRLS
- ▁WAITING
- ▁STOP
- ▁WAVE
- AU
- ▁WISE
- ▁CAMP
- TURE
- UB
- ▁VE
- ▁OFFICE
- ▁GRAND
- ▁FIT
- ▁JUDGE
- UP
- MENTS
- ▁QUICK
- HI
- ▁FLO
- RIES
- VAL
- ▁COMFORT
- ▁PARTICULAR
- ▁STARTED
- ▁SUIT
- ▁NI
- ▁PALE
- ▁IMPOSSIBLE
- ▁HOT
- ▁CONVERSATION
- ▁SCENE
- ▁BOYS
- ▁WIN
- ▁BRE
- ▁SOCIETY
- ▁OUTSIDE
- ▁WRITE
- ▁EFFORT
- ▁TALKING
- ▁FORTUNE
- ▁NINE
- ▁WA
- ▁SINGLE
- ▁RULE
- ▁PORT
- ▁WINTER
- ▁CAST
- ▁CRA
- ▁HAPPEN
- ▁CRO
- ▁SHUT
- NING
- ▁GUN
- ▁NOBLE
- ▁BEGIN
- ▁PATH
- ▁SKY
- ▁WONDERFUL
- ▁SUDDEN
- ▁ARMY
- ▁CHE
- ▁WORTH
- ▁MOUNTAIN
- ▁MIN
- AG
- ▁FLU
- ▁GRACE
- ▁CHAPTER
- ▁BELOW
- ▁RING
- ▁TURNING
- ▁IRON
- ▁TOP
- ▁AFTERNOON
- ORY
- ▁EVIL
- ▁TRUST
- ▁BOW
- ▁TRI
- ▁SAIL
- ▁CONTENT
- ▁HORSES
- ITE
- ▁SILVER
- AP
- ▁LAD
- ▁RUNNING
- ▁HILL
- ▁BEGINNING
- ▁MAD
- ▁HABIT
- GRA
- ▁CLOTHES
- ▁MORROW
- ▁CRY
- ▁FASHION
- ▁PRESENCE
- ▁Z
- FE
- ▁ARRIVED
- ▁QUARTER
- ▁PERFECT
- ▁WO
- ▁TRA
- ▁USUAL
- ▁NECK
- ▁MARRIED
- ▁SEAT
- ▁WI
- ▁GAR
- ▁SAND
- ▁SHORE
- ▁GIVING
- NY
- ▁PROBABLY
- ▁MINUTE
- ▁EXPECT
- ▁DU
- ▁SHOT
- ▁INSTANT
- ▁DEGREE
- ▁COLOR
- ▁WEST
- RT
- ▁MARCH
- ▁BIRD
- ▁SHOWED
- ▁GREATER
- ▁SERIOUS
- ▁CARRY
- ▁COVERED
- ▁FORMER
- ▁LOUD
- ▁MOVED
- ▁MASS
- ▁SEEK
- ▁CHO
- GEN
- ▁ROMAN
- IB
- ▁MOON
- ▁BOARD
- ▁STREAM
- ▁EASILY
- ▁WISHED
- ▁SEARCH
- ▁COULDN
- ▁MONTHS
- ▁SICK
- LIE
- ▁DUTY
- ▁TWELVE
- ▁FAINT
- ▁STRANGER
- ▁SURPRISE
- ▁KILL
- ▁LEAVING
- ▁JOURNEY
- ▁SCARCELY
- ▁RAISED
- ▁SPEAKING
- ▁TERRIBLE
- ▁TOM
- ▁FIELD
- ▁GAME
- ▁QUA
- ▁PROMISE
- ▁LIE
- ▁CONDITION
- ▁TRO
- ▁PERSONAL
- ▁TALL
- ▁STICK
- ▁THREW
- ▁MARRY
- ▁VAN
- ▁BURN
- ▁ACCORDING
- ▁RISE
- ▁ATTACK
- ▁SWORD
- ▁GUESS
- ▁THOUGHTS
- ▁THIN
- ▁THROW
- ▁CALM
- SIDE
- ▁VILLAGE
- ▁DEN
- ▁ANXIOUS
- ▁MER
- GI
- ▁EXPECTED
- ▁BALL
- ▁ESPECIALLY
- ▁CHARGE
- ▁MEASURE
- ISE
- ▁NICE
- ▁TRYING
- ▁ALLOW
- ▁SHARP
- ▁BREAD
- ▁HONOUR
- ▁HONOR
- ▁ENTIRELY
- ▁BILL
- ▁BRI
- ▁WRITTEN
- ▁AR
- ▁BROKE
- ▁KILLED
- ▁MARK
- ▁VEN
- ▁LADIES
- ▁LEARNED
- ▁FLOWERS
- PLE
- ▁FORTY
- ▁OFFER
- ▁HAPPINESS
- ▁PRAY
- ▁CLASS
- ▁FER
- ▁PRINCIPLE
- GU
- ▁BOOKS
- ▁SHAPE
- ▁SUMMER
- ▁JACK
- ▁DRAW
- ▁GOLDEN
- ▁DECIDED
- ▁LEAD
- ▁UNLESS
- ▁HARM
- ▁LISTEN
- HER
- ▁SHOOK
- ▁INFLUENCE
- ▁PERFECTLY
- ▁MARRIAGE
- ▁BROAD
- ▁ESCAPE
- ▁STATES
- ▁MIDDLE
- ▁PLANT
- ▁MIL
- ▁MOVEMENT
- ▁NOISE
- ▁ENEMY
- ▁HISTORY
- ▁BREAK
- ROUS
- ▁UNDERSTOOD
- ▁LATTER
- FER
- ▁COMES
- ▁MERELY
- ▁SIMPLY
- WI
- ▁IMAGINE
- ▁LOWER
- ▁CONDUCT
- ▁BORN
- WA
- ▁YARD
- ▁KA
- ▁CLOSED
- ▁NOTE
- GA
- ▁STRA
- RAN
- ▁EXIST
- EV
- ▁SPEECH
- ▁BITTER
- JO
- ▁MAKES
- ▁GRASS
- ▁REPLY
- ▁CHANGED
- ▁MON
- ▁LYING
- ▁DANCE
- ▁FINALLY
- ▁AMERICAN
- ▁ENJOY
- ▁CONTAIN
- ▁MEANT
- USE
- ▁OBSERVED
- THER
- ▁LAUGH
- ▁AFTERWARDS
- ▁BEAT
- ▁RACE
- ▁EQUAL
- ▁RAIN
- PS
- ▁STEPS
- ▁BENEATH
- ▁TAIL
- ▁TASTE
- IO
- EY
- ▁CHAR
- ▁GE
- GN
- TIN
- ▁GROW
- ▁TE
- IANS
- ▁MOVE
- ▁REPEATED
- ▁DRIVE
- TUR
- ▁SI
- CLOCK
- ▁BRAVE
- ▁MADAME
- ▁LOT
- ▁CASTLE
- ▁HI
- AND
- ▁FUTURE
- ▁RELATION
- ▁SORRY
- ▁HEALTH
- ▁DICK
- ▁R
- ▁BUILDING
- ▁EDGE
- ▁BLESS
- ▁SPITE
- WE
- ▁MIS
- ▁PRISONER
- ▁ALLOWED
- ▁PH
- ▁CATCH
- MER
- ETH
- ▁COAT
- ▁COMPLETE
- ▁WOULDN
- ▁CREATURE
- ▁YELLOW
- ▁IMPORTANT
- ▁ADD
- ▁PASSING
- ▁DARKNESS
- ▁CARRIAGE
- ▁MILL
- ▁FIFTEEN
- NCY
- ▁HUNG
- ▁OB
- ▁PLEASED
- ▁SPREAD
- ▁CURIOUS
- ▁WORSE
- ▁CIRCUMSTANCES
- ▁GI
- LAR
- ▁CAL
- ▁HY
- ▁MERE
- ▁JANE
- ▁EAST
- BI
- ▁CUP
- ▁BLIND
- ▁PASSION
- ▁DISCOVERED
- ▁NOTICE
- ▁REPORT
- ▁SPACE
- ▁PRESENTLY
- ▁SORROW
- ▁PACK
- ▁DIN
- CY
- ▁DRY
- ▁ANCIENT
- ▁DRESSED
- ▁COVER
- ▁VO
- ▁EXISTENCE
- ▁EXACTLY
- ▁BEAST
- ▁PROPER
- ▁DROPPED
- ▁CLEAN
- ▁COLOUR
- ▁HOST
- ▁CHAMBER
- ▁FAITH
- LET
- ▁DETERMINED
- ▁PRIEST
- ▁STORM
- ▁SKIN
- ▁DARE
- ▁PERSONS
- ▁PICK
- ▁NARROW
- ▁SUPPORT
- ▁PRIVATE
- ▁SMILED
- ▁COUSIN
- ▁DRAWING
- ▁ATTEND
- ▁COOK
- ▁PREVENT
- ▁VARIOUS
- ▁BLA
- ▁FIXED
- ▁WEAK
- THE
- ▁HOLE
- ▁BOTTOM
- ▁NOBODY
- ADE
- ▁LEGS
- ITCH
- ▁INDIVIDUAL
- ▁EARS
- LIKE
- ▁ADVANTAGE
- ▁FRANCE
- ▁BON
- ▁WINE
- ▁LIVES
- OD
- ▁WALLS
- ▁TIRED
- ▁SHOP
- ▁ANIMAL
- ▁CRU
- ▁WROTE
- ▁ROYAL
- ▁CONSIDERED
- ▁MORAL
- ▁COMPANION
- ▁LOSE
- ▁ISN
- ▁BAG
- ▁LAKE
- ▁INTER
- ▁COM
- ▁LETTERS
- ▁LUCK
- ▁EAR
- ▁GERMAN
- ▁PET
- ▁SAKE
- ▁DROP
- ▁PAID
- ▁BREAKFAST
- ▁LABOR
- ▁DESERT
- ▁DECLARED
- ▁HUM
- ▁STUDY
- ▁INSTANCE
- ONE
- ▁SOMEWHAT
- ▁CLOTH
- ▁SPECIAL
- ▁COLONEL
- ▁SONG
- ▁MAIN
- ▁VALUE
- ▁PROUD
- ▁EXPRESS
- ▁NATION
- ▁HANDSOME
- ▁CONFESS
- ▁PU
- ▁PASSAGE
- ▁PERIOD
- ▁CUSTOM
- ▁HURT
- ▁SHOULDER
- ▁CHRIST
- ZA
- ▁RECEIVE
- ▁DIFFICULT
- ▁DEPEND
- ▁MEETING
- ▁CHI
- ▁GEN
- LIGHT
- ▁BELIEVED
- ▁SOCIAL
- ▁DIFFICULTY
- ▁GREATEST
- ▁DRAWN
- ▁GRANT
- ▁BIRDS
- ▁ANGRY
- ▁HEAT
- UFF
- ▁DUE
- ▁PLACES
- ▁SIN
- ▁COURAGE
- ▁EVIDENTLY
- ▁GENTLE
- ▁CRUEL
- ▁GEORGE
- ▁GRI
- ▁SERVANT
- ▁U
- ▁PURE
- OOK
- ▁KNOWS
- ▁KNOWING
- LF
- ▁WRITING
- ▁REMEMBERED
- ▁CU
- ▁HOLDING
- ▁TENDER
- ▁QUI
- ▁BURST
- ▁SURELY
- IGN
- ▁VALLEY
- ▁FU
- ▁BUTTER
- ▁SPOKEN
- ▁STORE
- ▁DISC
- ▁CHRISTIAN
- ▁PARIS
- ▁HENRY
- ▁FINISHED
- ▁PROVE
- ▁FOOL
- ▁SOLDIERS
- ▁LANGUAGE
- ▁INSIDE
- ▁BAN
- ▁FALLEN
- ROW
- ▁MAL
- ▁BABY
- ▁SITUATION
- ▁WATCHED
- ANS
- ▁RUIN
- ▁GENTLEMEN
- ▁FRO
- ▁FANCY
- ▁ACCEPT
- ▁SEASON
- ▁OURSELVES
- ▁SAN
- ▁SPEED
- IZED
- ▁COOL
- ▁SERVE
- ▁VESSEL
- ▁WILLIAM
- ▁OBLIGED
- ▁GROUP
- FORM
- ▁GOES
- UOUS
- ▁LEAVES
- ▁PECULIAR
- ▁NEWS
- ▁VAIN
- ▁EVERYBODY
- ▁PIN
- UG
- ▁FORGOTTEN
- ▁FRA
- GAN
- ▁CAREFULLY
- ▁FLASH
- UCH
- ▁FUR
- ▁MURDER
- ▁DELIGHT
- ▁WAITED
- ▁RENDER
- ▁PROPERTY
- ▁NOTICED
- ▁ROLL
- ▁KNOCK
- ▁EARNEST
- KI
- ▁HONEST
- ▁PROMISED
- ▁BAL
- AW
- ▁WALKING
- ANG
- ▁SQUARE
- ▁QUIETLY
- ▁CLOUD
- WOOD
- ▁FORMED
- ▁HIGHER
- ▁BUILT
- ▁FATE
- ▁TEACH
- MY
- ▁FALSE
- ▁YORK
- ▁DUST
- ▁CLIMB
- ▁FOND
- ▁GROWN
- ▁DESCEND
- ▁RAG
- ▁FRUIT
- ▁GENERALLY
- ▁OFFERED
- ▁ER
- ▁NURSE
- POSE
- ▁SPENT
- ▁JOIN
- ▁STATION
- ▁MEANING
- ▁SMOKE
- HOOD
- ▁ROUGH
- JU
- ▁LIKELY
- ▁SURFACE
- ▁KE
- ▁MONTH
- ▁POSSESSION
- ▁TONGUE
- ▁DUKE
- ▁NOSE
- ▁LAUGHING
- ▁WEATHER
- ▁WHISPERED
- ▁SYSTEM
- ▁LAWS
- DDLE
- ▁TOUCHED
- ▁TRADE
- LD
- ▁SURPRISED
- RIN
- ▁ARCH
- ▁WEALTH
- FOR
- ▁TEMPER
- ▁FRANK
- ▁GAL
- ▁BARE
- ▁OPPORTUNITY
- ▁CLAIM
- ▁ANIMALS
- ▁REV
- ▁COST
- ▁WASH
- ZE
- ▁CORN
- ▁OPPOSITE
- ▁POLICE
- ▁IDEAS
- LON
- ▁KEY
- ▁READING
- ▁COLLECT
- CHED
- ▁H
- ▁CROWN
- ▁TAR
- ▁SWIFT
- ▁SHOULDERS
- ▁ICE
- ▁GRAY
- ▁SHARE
- ▁PREPARED
- ▁GRO
- ▁UND
- ▁TER
- ▁EMPTY
- CING
- ▁SMILING
- ▁AVOID
- ▁DIFFERENCE
- ▁EXPLAIN
- ▁POUR
- ▁ATTRACT
- ▁OPENING
- ▁WHEEL
- ▁MATERIAL
- ▁BREAST
- ▁SUFFERING
- ▁DISTINCT
- ▁BOOT
- ▁ROW
- ▁FINGERS
- HAN
- ▁ALTOGETHER
- ▁FAT
- ▁PAPA
- ▁BRAIN
- ▁ASLEEP
- ▁GREY
- ▁SUM
- ▁GAS
- ▁WINDOWS
- ▁ALIVE
- ▁PROCEED
- ▁FLOWER
- ▁LEAP
- ▁PUR
- ▁PIECES
- ▁ALTER
- ▁MEMORY
- IENT
- ▁FILL
- ▁CLO
- ▁THROWN
- ▁KINGDOM
- ▁RODE
- IUS
- ▁MAID
- ▁DIM
- ▁BAND
- ▁VIRTUE
- ▁DISH
- ▁GUEST
- ▁LOSS
- ▁CAUSED
- ▁MOTION
- ▁POT
- ▁MILLION
- ▁FAULT
- ▁LOVELY
- ▁HERO
- PPING
- ▁UNITED
- ▁SPI
- SOME
- BRA
- ▁MOUNTAINS
- ▁NU
- ▁SATISFIED
- ▁DOLLARS
- ▁LOVER
- ▁CONCEAL
- ▁VAST
- ▁PULL
- ▁HATH
- ▁RUSH
- ▁J
- ▁DESPAIR
- EX
- ▁HEIGHT
- ▁CE
- ▁BENT
- ▁PITY
- ▁RISING
- ATH
- ▁PRIDE
- ▁HURRY
- KA
- ▁SETTLED
- ▁JUSTICE
- ▁LIFTED
- PEN
- ▁SOLDIER
- ▁FINDING
- ▁REMARK
- ▁REGULAR
- ▁STRUGGLE
- ▁MACHINE
- ▁SING
- ▁HURRIED
- ▁SUFFICIENT
- ▁REPRESENT
- ▁DOUBLE
- ▁ALARM
- ▁SUPPER
- ▁DREADFUL
- ▁FORE
- ATOR
- ▁STOCK
- ▁TIN
- ▁EXAMPLE
- ▁ROOF
- ▁FLOW
- ▁SUPPOSED
- ▁PRESERV
- ▁L
- ▁LISTENED
- OC
- ▁STO
- ▁SECURE
- ▁FRIGHTENED
- ▁DISTURB
- ▁EMOTION
- ▁SERVANTS
- ▁YO
- ▁BUY
- ▁FORCED
- ▁KITCHEN
- ▁TERROR
- ▁STAIRS
- ▁SIXTY
- KER
- ▁ORDINARY
- ▁DIRECTLY
- ▁HEADS
- ▁METHOD
- ▁FORGIVE
- ▁AWFUL
- ▁REFLECT
- ▁GREATLY
- ▁TALKED
- ▁RIDE
- STONE
- ▁FAVOUR
- ▁WELCOME
- ▁SEIZED
- OU
- ▁CONTROL
- ▁ORDERED
- ▁ANGEL
- ▁USUALLY
- ▁POET
- ▁BOLD
- LINE
- ▁ADVENTURE
- ▁WATCHING
- ▁FOLK
- ▁MISTRESS
- IZE
- ▁GROWING
- ▁CAVE
- ▁EVIDENCE
- ▁FINGER
- ▁SEVENTEEN
- ▁MOVING
- EOUS
- ▁DOESN
- ▁COW
- ▁TYPE
- ▁BOIL
- ▁TALE
- ▁DELIVER
- ▁FARM
- ▁MONSIEUR
- ▁GATHERED
- ▁FEELINGS
- ▁RATE
- ▁REMARKED
- ▁PUTTING
- ▁MAT
- ▁CONTRARY
- ▁CRIME
- ▁PLA
- ▁COL
- ▁NEARER
- TES
- ▁CIVIL
- ▁SHAME
- ▁LOOSE
- ▁DISCOVER
- ▁FLAT
- ▁TWICE
- ▁FAIL
- VIS
- ▁UNC
- EA
- ▁EUROPE
- ▁PATIENT
- ▁UNTO
- ▁SUFFER
- ▁PAIR
- ▁TREASURE
- OSE
- ▁EAGER
- ▁FLY
- ▁N
- ▁VAL
- ▁DAN
- ▁SALT
- ▁BORE
- BBE
- ▁ARTHUR
- ▁AFFAIRS
- ▁SLOW
- ▁CONSIST
- ▁DEVIL
- LAN
- ▁AFFECTION
- ▁ENGAGED
- ▁KISS
- ▁YA
- ▁OFFICER
- IFICATION
- ▁LAMP
- ▁PARTS
- HEN
- ▁MILK
- ▁PROCESS
- ▁GIFT
- ▁PULLED
- ▁HID
- ▁RAY
- ▁EXCELLENT
- ▁IMPRESSION
- ▁AUTHORITY
- ▁PROVED
- ▁TELLING
- TTE
- ▁TOWER
- ▁CONSEQUENCE
- ▁FAVOR
- ▁FLEW
- ▁CHARLES
- ISTS
- ▁ADDRESS
- ▁FAMILIAR
- ▁LIMIT
- ▁CONFIDENCE
- ▁RARE
- ▁WEEKS
- ▁WOODS
- ▁INTENTION
- ▁DIRECT
- ▁PERFORM
- ▁SOLEMN
- ▁DISTANT
- ▁IMAGE
- ▁PRESIDENT
- ▁FIRM
- ▁INDIAN
- ▁RANK
- ▁LIKED
- ▁AGREE
- ▁HOUSES
- ▁WIL
- ▁MATTERS
- ▁PRISON
- ▁MODE
- ▁MAJOR
- ▁WORKING
- ▁SLIP
- ▁WEIGHT
- ▁AWARE
- ▁BUSY
- ▁LOOKS
- ▁WOUND
- ▁THOR
- ▁BATH
- ▁EXERCISE
- ▁SIMILAR
- ▁WORE
- ▁AMOUNT
- ▁QUESTIONS
- ▁VIOLENT
- ▁EXCUSE
- ▁ASIDE
- ▁TUR
- ▁DULL
- OF
- ▁EMPEROR
- ▁NEVERTHELESS
- ▁SHOUT
- ▁EXPLAINED
- ▁SIZE
- ▁ACCOMPLISH
- FORD
- CAN
- ▁MISTAKE
- ▁INSTANTLY
- ▁SMOOTH
- ▁STRIKE
- ▁BOB
- ISED
- ▁HORROR
- ▁SCIENCE
- ▁PROTEST
- ▁MANAGE
- ▁OBEY
- ▁NECESSITY
- ▁SPLENDID
- ▁PRESS
- ▁INTERESTING
- ▁RELIGION
- ▁UNKNOWN
- ▁FIERCE
- ▁DISAPPEARED
- ▁HOLY
- ▁HATE
- ▁PLAYED
- ▁LIN
- ▁NATURALLY
- ▁DROVE
- ▁LOUIS
- TIES
- ▁BRAND
- INESS
- RIE
- ▁SHOOT
- ▁CONSENT
- ▁SEATED
- ▁LINES
- GUE
- ▁AGREED
- ▁CIRCLE
- ▁STIR
- ▁STREETS
- ▁TASK
- ▁RID
- ▁PRODUCED
- ▁ACCIDENT
- ▁WITNESS
- ▁LIBERTY
- ▁DETAIL
- ▁MINISTER
- ▁POWERFUL
- ▁SAVAGE
- ▁SIXTEEN
- ▁PRETEND
- ▁COAST
- ▁SQU
- ▁UTTER
- ▁NAMED
- ▁CLEVER
- ▁ADMIT
- ▁COUPLE
- ▁WICKED
- ▁MESSAGE
- ▁TEMPLE
- ▁STONES
- ▁YESTERDAY
- ▁HILLS
- DAY
- ▁SLIGHT
- ▁DIAMOND
- ▁POSSIBLY
- ▁AFFAIR
- ▁ORIGINAL
- ▁HEARING
- ▁WORTHY
- ▁SELL
- NEY
- ICK
- ▁COTTAGE
- ▁SACRIFICE
- ▁PROGRESS
- ▁SHOCK
- ▁DESIGN
- ▁SOUGHT
- ▁PIT
- ▁SUNDAY
- ▁OTHERWISE
- ▁CABIN
- ▁PRAYER
- ▁DWELL
- ▁GAIN
- ▁BRIDGE
- ▁PARTICULARLY
- ▁YIELD
- ▁TREAT
- RIGHT
- ▁OAK
- ▁ROPE
- WIN
- ▁ORDERS
- ▁SUSPECT
- ▁EDWARD
- AB
- ▁ELEVEN
- ▁TEETH
- ▁OCCURRED
- DDING
- ▁AMERICA
- ▁FALLING
- ▁LION
- ▁DEPART
- ▁KEEPING
- ▁DEMAND
- ▁PAUSED
- ▁CEASED
- INA
- ▁FUN
- ▁CHEER
- ▁PARDON
- ▁NATIVE
- LUS
- LOW
- ▁DOGS
- ▁REQUIRED
- ILITY
- ▁ELECT
- ▁ENTERTAIN
- ITUDE
- ▁HUGE
- ▁CARRYING
- ▁BLU
- ▁INSIST
- ▁SATISFACTION
- ▁HUNT
- ▁COUNTENANCE
- ▁UPPER
- ▁MAIDEN
- ▁FAILED
- ▁JAMES
- ▁FOREIGN
- ▁GATHER
- ▁TEST
- BOARD
- ▁TERMS
- ▁SILK
- ▁BEG
- ▁BROTHERS
- ▁PAGE
- ▁KNEES
- ▁SHOWN
- ▁PROFESSOR
- ▁MIGHTY
- ▁DEFI
- ▁CHARM
- ▁REQUIRE
- ▁LOG
- MORE
- ▁PROOF
- ▁POSSESSED
- ▁SOFTLY
- ▁UNFORTUNATE
- ▁PRICE
- ▁SEVERE
- ▁SINGING
- ▁STAGE
- ▁FREEDOM
- ▁SHOUTED
- ▁FARTHER
- ▁MAJESTY
- ▁PREVIOUS
- ▁GUIDE
- ▁MATCH
- ▁CHEST
- ▁INTENDED
- ▁BI
- ▁EXCITEMENT
- ▁OFFICERS
- ▁SUR
- ▁SHAKE
- ▁SENTIMENT
- ▁GENTLY
- ▁SUCCEEDED
- ▁MENTION
- ▁LOCK
- ▁ACQUAINTANCE
- ▁IMAGINATION
- ▁PHYSICAL
- ▁LEADING
- ▁SLAVE
- ▁CART
- ▁POINTED
- ▁STEAM
- ▁SHADE
- ▁PIPE
- ▁BASE
- ▁INVENT
- ▁ALAS
- ▁WORKED
- ▁REGRET
- ▁BUR
- ▁FAITHFUL
- ▁MENTIONED
- ▁RECORD
- ▁COMPLAIN
- ▁SUPERIOR
- ▁BAY
- ▁PAL
- EMENT
- UE
- ▁SEVENTY
- ▁HOTEL
- ▁SHEEP
- ▁MEAL
- ▁ADVICE
- ▁HIDDEN
- ▁DEMANDED
- ▁CONSCIOUS
- ▁BROW
- ▁POSSESS
- ▁FOURTH
- ▁EVENTS
- ▁FRI
- ▁PRAISE
- ▁ADVANCED
- ▁RESOLVED
- ▁STUFF
- ▁CHEERFUL
- ▁BIRTH
- ▁GRIEF
- ▁AFFORD
- ▁FAIRY
- ▁WAKE
- ▁SIDES
- ▁SUBSTANCE
- ▁ARTICLE
- ▁LEVEL
- ▁MIST
- ▁JOINED
- ▁PRACTICAL
- ▁CLEARLY
- ▁TRACE
- ▁AWAKE
- ▁OBSERVE
- ▁BASKET
- ▁LACK
- VILLE
- ▁SPIRITS
- ▁EXCITED
- ▁ABANDON
- ▁SHINING
- ▁FULLY
- ▁CALLING
- ▁CONSIDERABLE
- ▁SPRANG
- ▁MILE
- ▁DOZEN
- ▁PEA
- ▁DANGEROUS
- ▁WIT
- ▁JEW
- ▁POUNDS
- ▁FOX
- ▁INFORMATION
- ▁LIES
- ▁DECK
- NNY
- ▁PAUL
- ▁STARS
- ▁ANGER
- ▁SETTLE
- ▁WILLING
- ▁ADAM
- ▁FACES
- ▁SMITH
- ▁IMPORTANCE
- ▁STRAIN
- WAR
- ▁SAM
- ▁FEATHER
- ▁SERVED
- ▁AUTHOR
- ▁PERCEIVED
- ▁FLAME
- ▁DIVINE
- ▁TRAIL
- ▁ANYBODY
- ▁SIGH
- ▁DELICATE
- KY
- ▁FOLD
- ▁HAVEN
- ▁DESIRED
- ▁CURIOSITY
- ▁PRACTICE
- ▁CONSIDERATION
- ▁ABSOLUTELY
- ▁CITIZEN
- ▁BOTTLE
- ▁INTERESTED
- ▁MEAT
- ▁OCCUPIED
- ▁CHOOSE
- ▁THROAT
- ETTE
- ▁CANDLE
- ▁DAWN
- ▁PROTECT
- ▁SENTENCE
- IED
- ▁ROCKS
- ▁PORTION
- ▁APPARENTLY
- ▁PRESENTED
- ▁TIGHT
- ▁ACTUALLY
- ▁DYING
- ▁HAM
- ▁DAILY
- ▁SUFFERED
- ▁POLITICAL
- ▁BODIES
- ▁MODERN
- ▁COMPLETELY
- ▁SOONER
- TAN
- ▁PROP
- ▁ADVANCE
- ▁REFUSED
- ▁FARMER
- ▁POLITE
- ▁THUNDER
- ▁BRIEF
- ▁ELSIE
- ▁SAILOR
- ▁SUGGESTED
- ▁PLATE
- ▁AID
- ▁FLESH
- ▁WEEP
- ▁BUCK
- ▁ANTI
- ▁OCEAN
- ▁SPEND
- WELL
- ▁ODD
- ▁GOVERNOR
- ▁ENTRANCE
- ▁SUSPICION
- ▁STEPPED
- ▁RAPIDLY
- ▁CHECK
- ▁HIDE
- ▁FLIGHT
- ▁CLUB
- ▁ENTIRE
- ▁INDIANS
- ASH
- ▁CAPITAL
- ▁MAMMA
- HAR
- ▁CORRECT
- ▁CRACK
- ▁SENSATION
- ▁WORST
- ▁PACE
- ▁MIDST
- ▁AUGUST
- ▁PROPORTION
- ▁INNOCENT
- LINESS
- ▁REGARDED
- ▁DRIVEN
- ORD
- ▁HASTE
- ▁EDUCATION
- ▁EMPLOY
- ▁TRULY
- ▁INSTRUMENT
- ▁MAG
- ▁FRAME
- ▁FOOLISH
- ▁TAUGHT
- ▁HANG
- ▁ARGUMENT
- ▁NINETEEN
- ▁ELDER
- ▁NAY
- ▁NEEDED
- ▁NEIGHBOR
- ▁INSTRUCT
- ▁PAPERS
- ▁REWARD
- ▁EQUALLY
- ▁FIELDS
- ▁DIG
- HIN
- ▁CONDITIONS
- JA
- ▁SPAR
- ▁REQUEST
- ▁WORN
- ▁REMARKABLE
- ▁LOAD
- ▁WORSHIP
- ▁PARK
- ▁KI
- ▁INTERRUPTED
- ▁SKILL
- ▁TERM
- LAC
- ▁CRITIC
- ▁DISTRESS
- ▁BELIEF
- ▁STERN
- IGHT
- ▁TRACK
- ▁HUNTING
- ▁JEWEL
- ▁GRADUALLY
- ▁GLOW
- ▁RUSHED
- ▁MENTAL
- ▁VISITOR
- ▁PICKED
- ▁BEHOLD
- ▁EXPRESSED
- ▁RUB
- ▁SKI
- ARTAGNAN
- ▁MOREOVER
- ▁OPERATION
- ▁CAREFUL
- ▁KEEN
- ▁ASSERT
- ▁WANDER
- ▁ENEMIES
- ▁MYSTERIOUS
- ▁DEPTH
- ▁PREFER
- ▁CROSSED
- ▁CHARMING
- ▁DREAD
- ▁FLOUR
- ▁ROBIN
- ▁TRE
- ▁RELIEF
- ▁INQUIRED
- ▁APPLE
- ▁HENCE
- ▁WINGS
- ▁CHOICE
- ▁JUD
- OO
- ▁SPECIES
- ▁DELIGHTED
- IUM
- ▁RAPID
- ▁APPEAL
- ▁FAMOUS
- ▁USEFUL
- ▁HELEN
- ▁NEWSPAPER
- ▁PLENTY
- ▁BEARING
- ▁NERVOUS
- ▁PARA
- ▁URGE
- ▁ROAR
- ▁WOUNDED
- ▁CHAIN
- ▁PRODUCE
- ▁REFLECTION
- ▁MERCHANT
- ▁QUARREL
- ▁GLORY
- ▁BEGUN
- ▁BARON
- CUS
- ▁QUEER
- ▁MIX
- ▁GAZE
- ▁WHISPER
- ▁BURIED
- ▁DIV
- ▁CARD
- ▁FREQUENTLY
- ▁TIP
- ▁KNEE
- ▁REGION
- ▁ROOT
- ▁LEST
- ▁JEALOUS
- CTOR
- ▁SAVED
- ▁ASKING
- ▁TRIP
- QUA
- ▁UNION
- HY
- ▁COMPANIONS
- ▁SHIPS
- ▁HALE
- ▁APPROACHED
- ▁HARRY
- ▁DRUNK
- ▁ARRIVAL
- ▁SLEPT
- ▁FURNISH
- HEAD
- ▁PIG
- ▁ABSENCE
- ▁PHIL
- ▁HEAP
- ▁SHOES
- ▁CONSCIOUSNESS
- ▁KINDLY
- ▁EVIDENT
- ▁SCAR
- ▁DETERMIN
- ▁GRASP
- ▁STEAL
- ▁OWE
- ▁KNIFE
- ▁PRECIOUS
- ▁ELEMENT
- ▁PROCEEDED
- ▁FEVER
- ▁LEADER
- ▁RISK
- ▁EASE
- ▁GRIM
- ▁MOUNT
- ▁MEANWHILE
- ▁CENTURY
- OON
- ▁JUDGMENT
- ▁AROSE
- ▁VISION
- ▁SPARE
- ▁EXTREME
- ▁CONSTANT
- ▁OBSERVATION
- ▁THRUST
- ▁DELAY
- ▁CENT
- ▁INCLUD
- ▁LIFT
- ▁ADMIRE
- ▁ISSUE
- ▁FRIENDSHIP
- ▁LESSON
- ▁PRINCIPAL
- ▁MOURN
- ▁ACCEPTED
- ▁BURNING
- ▁CAPABLE
- ▁EXTRAORDINARY
- ▁SANG
- ▁REMOVED
- ▁HOPED
- ▁HORN
- ▁ALICE
- ▁MUD
- ▁APARTMENT
- ▁FIGHTING
- ▁BLAME
- ▁TREMBLING
- ▁SOMEBODY
- ▁ANYONE
- ▁BRIDE
- ▁READER
- ▁ROB
- ▁EVERYWHERE
- ▁LABOUR
- ▁RECALL
- ▁BULL
- ▁HIT
- ▁COUNCIL
- ▁POPULAR
- ▁CHAP
- ▁TRIAL
- ▁DUN
- ▁WISHES
- ▁BRILLIANT
- ▁ASSURED
- ▁FORGOT
- ▁CONTINUE
- ▁ACKNOWLEDG
- ▁RETREAT
- ▁INCREASED
- ▁CONTEMPT
- ▁GRANDFATHER
- ▁SYMPATHY
- ▁GHOST
- ▁STRETCHED
- ▁CREATURES
- ▁CAB
- ▁HIND
- ▁PLAYING
- ▁MISERABLE
- ▁MEMBERS
- ▁KINDNESS
- ▁HIGHEST
- ▁PRIM
- ▁KISSED
- ▁DESERVE
- ▁HUT
- ▁BEGGED
- ▁EIGHTY
- ▁CLOSELY
- ▁WONDERED
- ▁MILITARY
- ▁REMIND
- ▁ACCORDINGLY
- ▁LARGER
- ▁MAINTAIN
- ▁ENGINE
- ▁MOTIVE
- ▁DESTROY
- ▁STRIP
- ▁HANS
- ▁AHEAD
- ▁INFINITE
- ▁PROMPT
- ▁INFORMED
- TTLE
- ▁PEER
- ▁PRESSED
- ▁TRAP
- ▁SOMEWHERE
- ▁BOUGHT
- ▁VISIBLE
- ▁ASHAMED
- ▁TEAR
- ▁NEIGHBOUR
- ▁CONSTITUTION
- ▁INTELLIGENCE
- ▁PROFESSION
- ▁HUNGRY
- RIDGE
- ▁SMELL
- ▁STORIES
- ▁LISTENING
- ▁APPROACH
- ▁STRING
- ▁EXPLANATION
- ▁IMMENSE
- ▁RELIGIOUS
- ▁THROUGHOUT
- ▁HOLLOW
- ▁AWAIT
- ▁FLYING
- ▁SCREAM
- ▁ACTIVE
- ▁RUM
- ▁PRODUCT
- ▁UNHAPPY
- ▁VAGUE
- ARIES
- ▁ELIZABETH
- ▁STUPID
- ▁DIGNITY
- ▁ISABEL
- GAR
- ▁BRO
- ▁PITCH
- ▁COMRADE
- ▁STIFF
- ▁RECKON
- ▁SOLD
- ▁SPARK
- ▁STRO
- ▁CRYING
- ▁MAGIC
- ▁REPEAT
- PORT
- ▁MARKED
- ▁COMFORTABLE
- ▁PROJECT
- ▁BECOMING
- ▁PARENTS
- ▁SHELTER
- ▁STOLE
- ▁HINT
- ▁NEST
- ▁TRICK
- ▁THOROUGHLY
- ▁HOSPITAL
- ▁WEAPON
- ▁ROME
- ▁STYLE
- ▁ADMITTED
- ▁SAFETY
- FIELD
- ▁UNDERSTANDING
- ▁TREMBLE
- ▁PRINT
- ▁SLAVES
- ▁WEARY
- ▁ARTIST
- ▁CREDIT
- BURG
- ▁CONCLUSION
- ▁SELDOM
- ▁UNUSUAL
- ▁CLOUDS
- ▁UNABLE
- ▁GAY
- ▁HANGING
- ▁SCR
- ▁BOWED
- ▁DAVID
- ▁VOL
- ▁PUSHED
- ▁ESCAPED
- MOND
- ▁WARN
- ▁BETRAY
- ▁EGGS
- ▁PLAINLY
- ▁EXHIBIT
- ▁DISPLAY
- ▁MEMBER
- ▁GRIN
- ▁PROSPECT
- ▁BRUSH
- ▁BID
- ▁SUCCESSFUL
- ▁EXTENT
- ▁PERSUADE
- ▁MID
- ▁MOOD
- ▁ARRANGED
- ▁UNIVERSAL
- ▁JIM
- ▁SIGNAL
- ▁WHILST
- ▁PHILIP
- ▁WOLF
- RATE
- ▁EAGERLY
- ▁BILLY
- ▁RETURNING
- ▁CONSCIENCE
- ▁FORTUNATE
- ▁FEMALE
- ▁GLEAM
- ▁HASTILY
- ▁PROVIDED
- ▁OBTAIN
- ▁INSTINCT
- ▁CONCERNED
- ▁CONCERNING
- ▁SOMEHOW
- ▁PINK
- ▁RAGE
- ▁ACCUSTOMED
- ▁UNCONSCIOUS
- ▁ADVISE
- ▁BRANCHES
- ▁TINY
- ▁REFUSE
- ▁BISHOP
- ▁SUPPLY
- ▁PEASANT
- ▁LAWYER
- ▁WASTE
- ▁CONNECTION
- ▁DEVELOP
- ▁CORRESPOND
- ▁PLUM
- ▁NODDED
- ▁SLIPPED
- ▁EU
- ▁CONSTANTLY
- CUM
- MMED
- ▁FAIRLY
- HOUSE
- ▁KIT
- ▁RANG
- ▁FEATURES
- ▁PAUSE
- ▁PAINFUL
- ▁JOE
- ▁WHENCE
- ▁LAUGHTER
- ▁COACH
- ▁CHRISTMAS
- ▁EATING
- ▁WHOLLY
- ▁APART
- ▁SUPER
- ▁REVOLUTION
- ▁LONELY
- ▁CHEEKS
- ▁THRONE
- ▁CREW
- ▁ATTAIN
- ▁ESTABLISHED
- TIME
- ▁DASH
- ▁FRIENDLY
- ▁OPERA
- ▁EARL
- ▁EXHAUST
- ▁CLIFF
- ▁REVEAL
- ▁ADOPT
- ▁CENTRE
- ▁MERRY
- ▁SYLVIA
- ▁IDEAL
- ▁MISFORTUNE
- ▁FEAST
- ▁ARAB
- ▁NUT
- ▁FETCH
- ▁FOUGHT
- ▁PILE
- ▁SETTING
- ▁SOURCE
- ▁PERSIST
- ▁MERCY
- ▁BARK
- ▁LUC
- ▁DEEPLY
- ▁COMPARE
- ▁ATTITUDE
- ▁ENDURE
- ▁DELIGHTFUL
- ▁BEARD
- ▁PATIENCE
- ▁LOCAL
- ▁UTTERED
- ▁VICTORY
- ▁TREATED
- ▁SEPARATE
- ▁WAG
- ▁DRAGG
- ▁TITLE
- ▁TROOPS
- ▁TRIUMPH
- ▁REAR
- ▁GAINED
- ▁SINK
- ▁DEFEND
- ▁TIED
- ▁FLED
- ▁DARED
- ▁INCREASE
- ▁POND
- ▁CONQUER
- ▁FOREHEAD
- ▁FAN
- ▁ANXIETY
- ▁ENCOUNTER
- ▁SEX
- ▁HALT
- ▁SANK
- ▁CHEEK
- ▁HUMBLE
- ▁WRITER
- ▁EMPLOYED
- ▁DISTINGUISHED
- ▁RAISE
- ▁WHIP
- ▁GIANT
- ▁RANGE
- ▁OBTAINED
- ▁FLAG
- ▁MAC
- ▁JUMPED
- ▁DISCOVERY
- ▁NATIONAL
- ▁COMMISSION
- ▁POSITIVE
- ▁LOVING
- ▁EXACT
- ▁MURMURED
- ▁GAZED
- ▁REFER
- ▁COLLEGE
- ▁ENCOURAGE
- ▁NOVEL
- ▁CLOCK
- ▁MORTAL
- ▁ROLLED
- ▁RAT
- IZING
- ▁GUILTY
- ▁VICTOR
- WORTH
- ▁PRA
- ▁APPROACHING
- ▁RELATIVE
- ▁ESTATE
- ▁UGLY
- ▁METAL
- ▁ROBERT
- ▁TENT
- ▁ADMIRATION
- ▁FOURTEEN
- ▁BARBAR
- ▁WITCH
- ELLA
- ▁CAKE
- ▁SHONE
- ▁MANAGED
- ▁VOLUME
- ▁GREEK
- ▁DANCING
- ▁WRETCHED
- ▁CONDEMN
- ▁MAGNIFICENT
- ▁CONSULT
- J
- ▁ORGAN
- ▁FLEET
- ▁ARRANGEMENT
- ▁INCIDENT
- ▁MISERY
- ▁ARROW
- ▁STROKE
- ▁ASSIST
- ▁BUILD
- ▁SUCCEED
- ▁DESPERATE
- ▁WIDOW
- UDE
- ▁MARKET
- ▁WISDOM
- ▁PRECISE
- ▁CURRENT
- ▁SPOIL
- ▁BADE
- ▁WOODEN
- ▁RESIST
- ▁OBVIOUS
- ▁SENSIBLE
- FALL
- ▁ADDRESSED
- ▁GIL
- ▁COUNSEL
- ▁PURCHASE
- ▁SELECT
- ▁USELESS
- ▁STARED
- ▁ARREST
- ▁POISON
- ▁FIN
- ▁SWALLOW
- ▁BLOCK
- ▁SLID
- ▁NINETY
- ▁SPORT
- ▁PROVIDE
- ▁ANNA
- ▁LAMB
- ▁INTERVAL
- ▁JUMP
- ▁DESCRIBED
- ▁STRIKING
- ▁PROVISION
- ▁PROPOSED
- ▁MELANCHOLY
- ▁WARRIOR
- ▁SUGGEST
- ▁DEPARTURE
- ▁BURDEN
- ▁LIMB
- ▁TROUBLED
- ▁MEADOW
- ▁SACRED
- ▁SOLID
- ▁TRU
- ▁LUCY
- ▁RECOVER
- ▁ENERGY
- ▁POWDER
- ▁RESUMED
- ▁INTENSE
- ▁BRITISH
- ▁STRAW
- ▁AGREEABLE
- ▁EVERYONE
- ▁CONCERN
- ▁VOYAGE
- ▁SOUTHERN
- ▁BOSOM
- ▁UTTERLY
- ▁FEED
- ▁ESSENTIAL
- ▁CONFINE
- ▁HOUSEHOLD
- ▁EXTREMELY
- ▁WONDERING
- ▁LIST
- ▁PINE
- PHA
- ▁EXPERIMENT
- ▁JOSEPH
- ▁MYSTERY
- ▁RESTORE
- ▁BLUSH
- FOLD
- ▁CHOSEN
- ▁INTELLECT
- ▁CURTAIN
- OLOGY
- ▁MOUNTED
- ▁LAP
- ▁EPI
- ▁PUNISH
- ▁WEDDING
- ▁RECOGNIZED
- ▁DRIFT
- ▁PREPARATION
- ▁RESOLUTION
- ▁OPPRESS
- ▁FIX
- ▁VICTIM
- OGRAPH
- ▁SUMMON
- ▁JULIA
- ▁FLOOD
- ▁WAL
- ULATION
- ▁SLIGHTLY
- ▁LODGE
- ▁WIRE
- ▁CONFUSION
- ▁UNEXPECTED
- ▁CONCEIVE
- ▁PRIZE
- ▁JESUS
- ▁ADDITION
- ▁RUDE
- ▁FATAL
- ▁CARELESS
- ▁PATCH
- ▁KO
- ▁CATHERINE
- ▁PARLIAMENT
- ▁PROFOUND
- ▁ALOUD
- ▁RELIEVE
- ▁PUSH
- ABILITY
- ▁ACCOMPANIED
- ▁SOVEREIGN
- ▁SINGULAR
- ▁ECHO
- ▁COMPOSED
- ▁SHAKING
- ATORY
- ▁ASSISTANCE
- ▁TEACHER
- ▁HORRIBLE
- ▁STRICT
- ▁VERSE
- ▁PUNISHMENT
- ▁GOWN
- ▁MISTAKEN
- ▁VARI
- ▁SWEPT
- ▁GESTURE
- ▁BUSH
- ▁STEEL
- ▁AFFECTED
- ▁DIRECTED
- ▁SURROUNDED
- ▁ABSURD
- ▁SUGAR
- ▁SCRAP
- ▁IMMEDIATE
- ▁SADDLE
- ▁TY
- ▁ARISE
- ▁SIGHED
- ▁EXCHANGE
- ▁IMPATIENT
- ▁SNAP
- ▁EMBRACE
- ▁DISEASE
- ▁PROFIT
- ▁RIDING
- ▁RECOVERED
- ▁GOVERN
- ▁STRETCH
- ▁CONVINCED
- ▁LEANING
- ▁DOMESTIC
- ▁COMPLEX
- ▁MANIFEST
- ▁INDULGE
- ▁GENIUS
- ▁AGENT
- ▁VEIL
- ▁DESCRIPTION
- ▁INCLINED
- ▁DECEIVE
- ▁DARLING
- ▁REIGN
- HU
- ▁ENORMOUS
- ▁RESTRAIN
- ▁DUTIES
- BURY
- TTERED
- ▁POLE
- ▁ENABLE
- ▁EXCEPTION
- ▁INTIMATE
- ▁COUNTESS
- ▁TRIBE
- ▁HANDKERCHIEF
- ▁MIDNIGHT
- ▁PROBLEM
- ▁TRAMP
- ▁OIL
- CAST
- ▁CRUSH
- ▁DISCUSS
- ▁RAM
- ▁TROT
- ▁UNRE
- ▁WHIRL
- ▁LOCKED
- ▁HORIZON
- ▁OFFICIAL
- ▁SCHEME
- ▁DROWN
- ▁PIERRE
- ▁PERMITTED
- ▁CONNECTED
- ▁ASSURE
- ▁COCK
- ▁UTMOST
- ▁DEVOTED
- ▁RELI
- ▁SUFFICIENTLY
- ▁INTELLECTUAL
- ▁CARPET
- ▁OBJECTION
- ▁AFTERWARD
- ▁REALITY
- ▁NEGRO
- ▁RETAIN
- ▁ASCEND
- ▁CEASE
- ▁KATE
- ▁MARVEL
- KO
- ▁BOND
- MOST
- ▁COAL
- GATE
- ▁IGNORANT
- ▁BREAKING
- ▁TWIN
- ▁ASTONISHMENT
- ▁COFFEE
- ▁JAR
- ▁CITIES
- ▁ORIGIN
- ▁EXECUT
- ▁FINAL
- ▁INHABITANTS
- ▁STABLE
- ▁CHIN
- ▁PARTIES
- ▁PLUNGE
- ▁GENEROUS
- ▁DESCRIBE
- ▁ANNOUNCED
- ▁MERIT
- ▁REVERE
- ▁ERE
- ACIOUS
- ZI
- ▁DISAPPOINT
- ▁SUGGESTION
- ▁DOUBTLESS
- ▁TRUNK
- ▁STAMP
- ▁JOB
- ▁APPOINTED
- ▁DIVIDED
- ▁ACQUAINTED
- CHI
- ▁ABSOLUTE
- ▁FEARFUL
- ▁PRIVILEGE
- ▁CRAFT
- ▁STEEP
- ▁HUNTER
- ▁FORBID
- ▁MODEST
- ▁ENDEAVOUR
- ▁SWEEP
- ▁BEHELD
- ▁ABSORB
- ▁CONSTRUCT
- ▁EMPIRE
- ▁EXPEDITION
- ▁ERECT
- ▁OFFEND
- ▁INTEND
- ▁PERMIT
- ▁DESTROYED
- ▁CONTRACT
- ▁THIRST
- ▁WAGON
- ▁EVA
- ▁GLOOM
- ▁ATMOSPHERE
- ▁RESERVE
- ▁VOTE
- ▁GER
- ▁NONSENSE
- ▁PREVAIL
- ▁QUALITY
- ▁CLASP
- ▁CONCLUDED
- ▁RAP
- ▁KATY
- ▁ETERNAL
- ▁MUTTERED
- ▁NEGLECT
- ▁SQUIRE
- ▁CREEP
- LOCK
- ▁ELECTRIC
- ▁HAY
- ▁EXPENSE
- ▁SCORN
- ▁RETIRED
- ▁STOUT
- ▁MURMUR
- ▁SHARPLY
- ▁DISTRICT
- ▁LEAF
- ▁FAILURE
- WICK
- ▁JEAN
- ▁NUMEROUS
- ▁INFANT
- ▁REALIZED
- ▁TRAVELLER
- ▁HUNGER
- ▁JUNE
- ▁MUN
- ▁RECOMMEND
- ▁CREP
- ZZLE
- ▁RICHARD
- WORK
- ▁MONTE
- ▁PREACH
- ▁PALM
- AVI
- ▁ANYWHERE
- ▁DISPOSITION
- ▁MIRROR
- ▁VENTURE
- ▁POUND
- ▁CIGAR
- ▁INVITED
- ▁BENCH
- ▁PROTECTION
- ▁BENEFIT
- ▁THOMAS
- ▁CLERK
- ▁REPROACH
- ▁UNIFORM
- ▁GENERATION
- ▁SEAL
- ▁COMPASS
- ▁WARNING
- ▁EXTENDED
- ▁DIFFICULTIES
- ▁MAYBE
- ▁GROAN
- ▁AFFECT
- ▁COMB
- ▁EARN
- ▁WESTERN
- ▁IDLE
- ▁SCORE
- ▁TAP
- ▁ASTONISHED
- ▁INTRODUCED
- ▁LEISURE
- ▁LIEUTENANT
- ▁VIOLENCE
- ▁FIRMLY
- ▁MONSTER
- ▁UR
- ▁PROPERLY
- ▁TWIST
- ▁PIRATE
- ▁ROBBER
- ▁BATTER
- ▁WEPT
- ▁LEANED
- ▁FOG
- ▁ORNAMENT
- ▁ANDREW
- ▁BUSHES
- ▁REPUBLIC
- ▁CONFIDENT
- ▁LEAN
- ▁DART
- ▁STOOP
- ▁CURL
- ▁COUNTER
- ▁NORTHERN
- ▁PEARL
- ▁NEAREST
- ▁FRANCIS
- ▁WANDERING
- ▁FREQUENT
- ▁STARTLED
- ▁STATEMENT
- ▁OCCUR
- ▁BLOOM
- ▁NERVE
- ▁INSPECT
- ▁INDUCE
- ▁FLATTER
- ▁DATE
- ▁AMBITION
- ▁SLOPE
- ▁MALE
- ▁MADAM
- ▁MONK
- ▁RENT
- ▁CONFIRM
- ▁INVESTIGAT
- ▁RABBIT
- ▁REGIMENT
- ▁SUBMIT
- ▁SPELL
- ▁FURIOUS
- ▁RAIL
- ▁BESTOW
- ▁RALPH
- ▁SCATTERED
- ▁COMPELLED
- ▁THREAD
- ▁CHILL
- ▁DENY
- ▁PRONOUNC
- ▁MANKIND
- ▁CATTLE
- ▁EXECUTION
- ▁REBEL
- ▁SUPREME
- ▁VALUABLE
- ▁LIKEWISE
- ▁CONVEY
- ▁TIDE
- ▁GLOOMY
- ▁COIN
- ▁ACTUAL
- ▁TAX
- ▁PROVINCE
- ▁GRATEFUL
- ▁SPIRITUAL
- ▁VANISHED
- ▁DIANA
- ▁HAUNT
- ▁DRAGON
- ▁CRAWL
- ▁CHINA
- ▁GRATITUDE
- ▁NEAT
- ▁FINISH
- ▁INTENT
- ▁FRIGHT
- ▁EMBARRASS
- ▁THIRTEEN
- ▁RUTH
- ▁SLIGHTEST
- ▁DEVELOPMENT
- ▁INTERVIEW
- ▁SPECTACLE
- ▁BROOK
- VIE
- ▁WEAKNESS
- ▁AUDIENCE
- ▁CONSEQUENTLY
- ▁ABROAD
- ▁ASPECT
- ▁PAINTED
- ▁RELEASE
- ▁INSULT
- ▁SOOTH
- ▁DISAPPOINTMENT
- ▁EMERG
- ▁BRIG
- ▁ESTEEM
- ▁INVITATION
- ▁PASSENGER
- ▁PUBLISH
- ▁PIANO
- ▁IRISH
- ▁DESK
- ▁BEATEN
- ▁FIFTH
- ▁IMPULSE
- ▁SWEAR
- ▁EATEN
- ▁PURPLE
- ▁COMMITTED
- ▁COUNTRIES
- ▁PERCEIVE
- ISON
- ▁CELEBRAT
- ▁GRANDMOTHER
- ▁SHUDDER
- ▁SUNSHINE
- ▁SPANISH
- ▁HITHERTO
- ▁MARILLA
- ▁SNAKE
- ▁MOCK
- ▁INTERFERE
- ▁WALTER
- ▁AMID
- ▁MARBLE
- ▁MISSION
- TERIOR
- ▁DRIVING
- ▁FURNITURE
- ▁STEADY
- ▁CIRCUMSTANCE
- ▁INTERPRET
- ▁ENCHANT
- ▁ERROR
- ▁CONVICTION
- ▁HELPLESS
- ▁MEDICINE
- ▁QUALITIES
- ▁ITALIAN
- ▁HASTENED
- ▁OCCASIONALLY
- ▁PURSUED
- ▁HESITATED
- ▁INDEPENDENT
- ▁OLIVER
- ▁LINGER
- UX
- ▁EXAMINED
- ▁REPENT
- ▁PHYSICIAN
- ▁CHASE
- ▁BELOVED
- ▁ATTACHED
- ▁FLORENCE
- ▁HONEY
- ▁MOUSE
- ▁CRIES
- ▁BAKE
- ▁POEM
- ▁DESTRUCTION
- ▁FULFIL
- ▁MESSENGER
- ▁TRISTRAM
- ▁FANCIED
- ▁EXCESS
- ▁CURSE
- ▁CHU
- ▁QUANTITY
- ▁THORNTON
- ▁CREATED
- ▁CONTINUALLY
- ▁LIGHTNING
- ▁BORNE
- ▁TOTAL
- ▁DISPOSED
- ▁RIFLE
- ▁POLLY
- ▁GOAT
- ▁BACKWARD
- ▁VIRGINIA
- ▁KICK
- ▁PERIL
- ▁QUO
- ▁GLORIOUS
- ▁MULTITUDE
- ▁LEATHER
- ▁ABSENT
- ▁DEMON
- ▁DEBT
- ▁TORTURE
- ▁ACCORD
- ▁MATE
- ▁CATHOLIC
- ▁PILL
- ▁LIBRARY
- ▁PURSUIT
- ▁SHIRT
- ▁DEAREST
- ▁COLLAR
- ▁BEACH
- ▁ROBE
- ▁DECLARE
- ▁BRANCH
- ▁TEMPT
- ▁STEADILY
- ▁DISGUST
- ▁SILLY
- ▁ARRIVE
- ▁DRANK
- ▁LEVI
- ▁COMMUNICAT
- ▁RACHEL
- ▁WASHINGTON
- ▁RESIGN
- ▁MEANTIME
- ▁LACE
- ▁ENGAGEMENT
- ▁QUIVER
- ▁SEPARATED
- ▁DISCUSSION
- ▁VENTURED
- ▁SURROUNDING
- ▁POLISH
- ▁NAIL
- ▁SWELL
- ▁JOKE
- ▁LINCOLN
- ▁STUDENT
- ▁GLITTER
- ▁RUSSIAN
- ▁READILY
- ▁CHRIS
- ▁POVERTY
- ▁DISGRACE
- ▁CHEESE
- ▁HEAVILY
- ▁SCALE
- ▁STAFF
- ▁ENTREAT
- ▁FAREWELL
- ▁LUNCH
- ▁PEEP
- ▁MULE
- ▁SOMEONE
- ▁DISAPPEAR
- ▁DECISION
- ▁PISTOL
- ▁PUN
- ▁SPUR
- ▁ASSUMED
- ▁EXTEND
- ▁ENTHUSIASM
- ▁DEFINITE
- ▁UNDERTAKE
- ▁COMMITTEE
- ▁SIMON
- ▁FENCE
- ▁APPLIED
- ▁RELATED
- ▁VICE
- ▁UNPLEASANT
- ▁PROBABLE
- ▁PROCURE
- ▁FROWN
- ▁CLOAK
- ▁HUMANITY
- ▁FAMILIES
- ▁PHILOSOPHER
- ▁DWARF
- ▁OVERCOME
- ▁DEFEAT
- ▁FASTENED
- ▁MARSH
- ▁CLASSES
- ▁TOMB
- ▁GRACIOUS
- ▁REMOTE
- ▁CELL
- ▁SHRIEK
- ▁RESCUE
- ▁POOL
- ▁ORGANIZ
- ▁CHOSE
- ▁CUTTING
- ▁COWARD
- ▁BORDER
- ▁DIRTY
- ▁MONKEY
- ▁HOOK
- ▁CHUCK
- ▁EMILY
- ▁JEST
- ▁PLAC
- ▁WEIGH
- ▁ASSOCIATE
- ▁GLIMPSE
- ▁STUCK
- ▁BOLT
- ▁MURDERER
- ▁PONY
- ▁DISTINGUISH
- ▁INSTITUTION
- ▁CUNNING
- ▁COMPLIMENT
- ▁APPETITE
- ▁REPUTATION
- ▁FEEBLE
- ▁KIN
- ▁SERIES
- ▁GRACEFUL
- ▁PLATFORM
- ▁BREEZE
- ▁PHRASE
- ▁CLAY
- MONT
- ▁RATTL
- ▁OPPOSITION
- ▁LANE
- ▁BOAST
- ▁GROWTH
- ▁INCLINATION
- ▁BEHAVE
- ▁SUSAN
- ▁DISTINCTION
- ▁DISLIKE
- ▁NICHOLAS
- ▁SATISFY
- ▁DRAMA
- ▁ELBOW
- ▁GAZING
- ▁CONSUM
- ▁SPIN
- ▁OATH
- ▁CHANNEL
- ▁CHARACTERISTIC
- ▁SPEAR
- ▁SLAIN
- ▁SAUCE
- ▁FROG
- ▁CONCEPTION
- ▁TIMID
- ▁ZEAL
- ▁APPARENT
- SHIRE
- ▁CENTER
- ▁VARIETY
- ▁DUSK
- ▁APT
- ▁COLUMN
- ▁REVENGE
- ▁RIVAL
- ▁IMITAT
- ▁PASSIONATE
- ▁SELFISH
- ▁NORMAN
- ▁REPAIR
- ▁THRILL
- ▁TREATMENT
- ▁ROSA
- ▁MARTIN
- ▁INDIFFERENT
- ▁THITHER
- ▁GALLANT
- ▁PEPPER
- ▁RECOLLECT
- ▁VINE
- ▁SCARCE
- ▁SHIELD
- ▁MINGLED
- CLOSE
- ▁HARSH
- ▁BRICK
- ▁HUMOR
- ▁MISCHIEF
- ▁TREMENDOUS
- ▁FUNCTION
- ▁SMART
- ▁SULTAN
- ▁DISMISS
- ▁THREATENED
- ▁CHEAP
- ▁FLOCK
- ▁ENDEAVOR
- ▁WHISK
- ▁ITALY
- ▁WAIST
- ▁FLUTTER
- ▁SMOKING
- ▁MONARCH
- ▁AFRICA
- ▁ACCUSE
- ▁HERBERT
- ▁REFRESH
- ▁REJOICE
- ▁PILLOW
- ▁EXPECTATION
- ▁POETRY
- ▁HOPELESS
- ▁PERISH
- ▁PHILOSOPHY
- ▁WHISTLE
- ▁BERNARD
- ▁LAMENT
- ▁IMPROVE
- ▁SUP
- ▁PERPLEX
- ▁FOUNTAIN
- ▁LEAGUE
- ▁DESPISE
- ▁IGNORANCE
- ▁REFERENCE
- ▁DUCK
- ▁GROVE
- ▁PURSE
- ▁PARTNER
- ▁PROPHET
- ▁SHIVER
- ▁NEIGHBOURHOOD
- ▁REPRESENTATIVE
- SAIL
- ▁WIP
- ▁ACQUIRED
- ▁CHIMNEY
- ▁DOCTRINE
- ▁MAXIM
- ▁ANGLE
- ▁MAJORITY
- ▁AUTUMN
- ▁CONFUSED
- ▁CRISTO
- ▁ACHIEVE
- ▁DISGUISE
- ▁REDUCED
- ▁EARLIER
- ▁THEATRE
- ▁DECIDE
- MINATED
- OLOGICAL
- ▁OCCUPATION
- ▁VIGOROUS
- ▁CONTINENT
- ▁DECLINE
- ▁COMMUNITY
- ▁MOTIONLESS
- ▁HATRED
- ▁COMMUNICATION
- ▁BOWL
- ▁COMMENT
- ▁APPROVE
- ▁CEREMONY
- ▁CRIMINAL
- ▁SCIENTIFIC
- ▁DUCHESS
- ▁VIVID
- ▁SHIFT
- ▁AVAIL
- ▁DAMP
- ▁JOHNSON
- ▁SLENDER
- ▁CONTRAST
- ▁AMUSEMENT
- ▁PLOT
- ▁LYN
- ▁ASSOCIATION
- ▁SNATCH
- ▁UNCERTAIN
- ▁PRESSURE
- ▁PERCH
- ▁APPLY
- ▁PLANET
- ▁NOTWITHSTANDING
- ▁SWUNG
- ▁STIRRED
- ▁ATTENDANT
- ▁ENJOYMENT
- ▁WORRY
- ▁ALBERT
- ▁NAKED
- ▁TALENT
- ▁MARIAN
- ▁REFORM
- ▁DELIBERATE
- ▁INTELLIGENT
- ▁SENSITIVE
- ▁YONDER
- ▁PUPIL
- ▁FRIGHTFUL
- ▁DOUBTFUL
- ▁STANDARD
- ▁MAGISTRATE
- ▁SHEPHERD
- ▁STOMACH
- ▁DEPOSIT
- ▁RENEW
- ▁HEDGE
- ▁FRANCS
- ▁POSSIBILITY
- ▁RESEMBLE
- ▁FATIGUE
- ▁PORTRAIT
- ▁FAVORITE
- ▁CREAM
- ▁BURG
- ▁SECRETARY
- ▁DIVERS
- ▁ACTIVITY
- ▁SPECULAT
- ▁HUMOUR
- ▁FITTED
- ▁EXTERNAL
- ▁CETERA
- ▁WRAPPED
- ▁WHIT
- ▁FRED
- ▁EXAMINATION
- ▁LODGING
- ▁OWING
- ▁JAW
- ▁CROW
- ▁BALANCE
- ▁PUFF
- ▁TENDERNESS
- ▁PORTHOS
- ▁ANCHOR
- ▁INTERRUPT
- ▁NECESSARILY
- ▁PERPETUAL
- ▁AGONY
- ▁POPE
- ▁SCHOLAR
- ▁SCOTLAND
- ▁SUPPRESS
- ▁WRATH
- ▁WRECK
- ▁EXCEED
- ▁PERFECTION
- ▁INDIA
- ▁TRADITION
- ▁SECTION
- ▁EASTERN
- ▁DOORWAY
- ▁WIVES
- ▁CONVENTION
- ▁ANNOUNC
- ▁EGYPT
- ▁CONTRADICT
- ▁SCRATCH
- ▁CENTRAL
- ▁GLOVE
- ▁WAX
- ▁PREPARE
- ▁ACCOMPANY
- ▁INCREASING
- ▁LIBERAL
- ▁RAISING
- ▁ORANGE
- ▁SHOE
- ▁ATTRIBUTE
- ▁LITERATURE
- ▁PUZZLED
- ▁WITHDRAW
- ▁WHITHER
- ▁HAWK
- ▁MOONLIGHT
- ▁EXAMINE
- ▁HAPPILY
- ▁PRECEDE
- ▁DETECTIVE
- ▁INCHES
- ▁SOLITARY
- ▁DUTCH
- ▁NAPOLEON
- ▁UNEASY
- ▁CARDINAL
- ▁BLEW
- ▁FOWL
- ▁DECORAT
- ▁CHILDHOOD
- ▁TORMENT
- ▁LOSING
- ▁PERMISSION
- ▁BLANK
- ▁UPSTAIRS
- ▁CAPACITY
- ▁TRIFLE
- ▁FOLLY
- ▁RECOGNIZE
- ▁REMOVE
- ▁VENGEANCE
- ▁ENTERPRISE
- ▁BEDROOM
- ▁ANYHOW
- ▁INQUIRY
- ▁ASHES
- ▁DRAG
- ▁HUSH
- ▁AWKWARD
- ▁SATURDAY
- ▁GENUINE
- ▁SURVIV
- ▁SKIRT
- ▁AFFECTIONATE
- ▁TANG
- ▁MUTUAL
- ▁DISPUTE
- ▁EAGLE
- ▁INCOME
- ▁BIND
- ▁FAME
- ▁IMPROVEMENT
- ROVING
- ▁DIFFER
- ▁AWOKE
- ▁SLEEVE
- ▁SOLITUDE
- ▁FAVOURITE
- JI
- ▁DETECT
- ▁COMPREHEND
- ▁PREPARING
- ▁SERPENT
- ▁SUMMIT
- ▁KNOT
- ▁KNIT
- ▁COPY
- ▁STOPPING
- ▁FADED
- ▁HIDEOUS
- ▁JULIE
- STEAD
- ▁SHINE
- ▁CONFLICT
- ▁PROPOSITION
- ▁REFUGE
- ▁GALLERY
- ▁BUNDLE
- ▁AXE
- ▁SLAVERY
- ▁MASK
- ▁ALYOSHA
- ▁LADDER
- ▁DEPARTMENT
- ▁DISCHARGE
- ▁DEPRESS
- ▁GALLOP
- ▁SCARLET
- ▁KITTY
- ▁RECEIVING
- ▁SURRENDER
- ▁SUSTAIN
- ▁TWILIGHT
- ▁CONGRESS
- ▁IRELAND
- ▁FUNNY
- ▁LEND
- ▁CONSTITUTE
- ▁FUNERAL
- ▁CRYSTAL
- ▁SPAIN
- ▁EXCEEDINGLY
- ▁DAMN
- ▁COMMUN
- ▁CIVILIZATION
- ▁PREJUDICE
- ▁PORCH
- ▁ASSISTANT
- ▁INDUSTRY
- ▁TUMBLE
- ▁DEFENCE
- ▁HITHER
- ▁SMOT
- ▁COLONI
- ▁AMAZEMENT
- ▁MARGUERITE
- ▁MIRACLE
- ▁INHERIT
- ▁BEGGAR
- ▁ENVELOPE
- ▁INDIGNATION
- ▁NATASHA
- ▁PROPOSAL
- ▁FRAGMENT
- ▁ROUSED
- ▁ROAST
- ENCIES
- ▁COMMENCED
- ▁RESOURCE
- ▁POPULATION
- ▁QUOTH
- ▁PURSUE
- ▁EDUCAT
- ▁AFFLICT
- ▁CONTACT
- ▁CRIMSON
- ▁DIVISION
- ▁DISORDER
- ▁COPPER
- ▁SOLICIT
- ▁MODERATE
- ▁DRUM
- ▁SWIM
- ▁SALUTE
- ▁ASSUME
- ▁MUSCLE
- ▁OVERWHELM
- ▁SHAKESPEARE
- ▁STRUGGLING
- ▁TRANQUIL
- ▁CHICKEN
- ▁TREAD
- ▁CLAW
- ▁BIBLE
- ▁RIDGE
- ▁THREAT
- ▁VELVET
- ▁EXPOSED
- ▁IDIOT
- ▁BARREL
- ▁PENNY
- ▁TEMPTATION
- ▁DANGLARS
- ▁CENTURIES
- ▁DISTRIBUT
- ▁REJECT
- ▁RETORTED
- ▁CONCENTRAT
- ▁CORDIAL
- ▁MOTOR
- ▁CANNON
- KEEP
- ▁WRETCH
- ▁ASSURANCE
- ▁THIEF
- ▁SURVEY
- ▁VITAL
- ▁RAILWAY
- ▁JACKSON
- ▁CRASH
- ▁GROWL
- ▁COMBAT
- ▁RECOLLECTION
- ▁SECURITY
- ▁JACOB
- ▁CLUTCH
- ▁BLANKET
- ▁NANCY
- ▁CELLAR
- ▁CONVENIENT
- ▁INDIGNANT
- ▁COARSE
- ▁WORM
- ▁SCREEN
- ▁TRANSPORT
- ▁BULLET
- ▁APPRECIATE
- ▁DEVOTION
- ▁INVISIBLE
- ▁DRIED
- ▁MIXTURE
- ▁CANDID
- ▁PERFORMANCE
- ▁RIPE
- ▁EXQUISITE
- ▁BARGAIN
- ▁TOBACCO
- ▁LOYAL
- ▁MOULD
- ▁ATTENTIVE
- ▁DOROTHY
- ▁BRUTE
- ▁ESTABLISHMENT
- ▁ABILITY
- ▁INHABIT
- ▁OBSCURE
- ▁BORROW
- ▁ESSENCE
- ▁DISMAY
- ▁FLEE
- ▁BLADE
- ▁PLUCK
- ▁COFFIN
- ▁SUNSET
- ▁STEPHEN
- ▁ECONOMIC
- ▁HOLIDAY
- ▁MECHANICAL
- ▁COTTON
- ▁AWAKENED
- ▁SEIZE
- ▁RIDICULOUS
- ▁SANCHO
- ▁HESITATION
- ▁CORPSE
- ▁SAVING
- HOLD
- FOOT
- ▁ELDEST
- ▁DESPITE
- ▁EDITH
- ▁CHERISH
- ▁RESISTANCE
- ▁WILSON
- ▁ARGUE
- ▁INQUIRE
- ▁APPREHENSION
- ▁AVENUE
- ▁DRAKE
- ▁PROPOSE
- HURST
- ▁INFERIOR
- ▁STAIRCASE
- ▁WHEREFORE
- ▁CARLYLE
- ▁COUCH
- ▁ROUTE
- ▁POLITICS
- ▁TOMORROW
- ▁THRONG
- ▁NAUGHT
- ▁SUNLIGHT
- ▁INDIFFERENCE
- ▁OBEDIENCE
- ▁RECEPTION
- ▁VEGETABLE
- ▁IMPERFECT
- ▁RESIDENCE
- ▁TURKEY
- ▁VIOLET
- ▁SARAH
- ▁ALTAR
- ▁GRIEVE
- ▁JERK
- ▁ENSU
- ▁MAGICIAN
- ▁BLOSSOM
- ▁LANTERN
- ▁RESOLUTE
- ▁THOUGHTFULLY
- ▁FORTNIGHT
- ▁TRUMPET
- ▁VALJEAN
- ▁UNWILLING
- ▁LECTURE
- ▁WHEREUPON
- ▁HOLLAND
- ▁CHANGING
- ▁CREEK
- ▁SLICE
- ▁NORMAL
- ▁ANNIE
- ▁ACCENT
- ▁FREDERICK
- ▁DISAGREEABLE
- ▁RUBBED
- ▁DUMB
- ▁ESTABLISH
- ▁IMPORT
- ▁AFFIRM
- ▁MATTHEW
- ▁BRISK
- ▁CONVERT
- ▁BENDING
- ▁IVAN
- ▁MADEMOISELLE
- ▁MICHAEL
- ▁EASIER
- ▁JONES
- ▁FACING
- ▁EXCELLENCY
- ▁LITERARY
- ▁GOSSIP
- ▁DEVOUR
- ▁STAGGER
- ▁PENCIL
- ▁AVERAGE
- ▁HAMMER
- ▁TRIUMPHANT
- ▁PREFERRED
- ▁APPLICATION
- ▁OCCUPY
- ▁AUTHORITIES
- BURN
- ▁ASCERTAIN
- ▁CORRIDOR
- ▁DELICIOUS
- ▁PRACTISE
- ▁UNIVERSE
- ▁SHILLING
- ▁CONTEST
- ▁ASHORE
- ▁COMMIT
- ▁ADMINISTRATION
- ▁STUDIED
- ▁RIGID
- ▁ADORN
- ▁ELSEWHERE
- ▁INNOCENCE
- ▁JOURNAL
- ▁LANDSCAPE
- ▁TELEGRAPH
- ▁ANGRILY
- ▁CAMPAIGN
- ▁UNJUST
- ▁CHALLENGE
- ▁TORRENT
- ▁RELATE
- ▁ASSEMBLED
- ▁IMPRESSED
- ▁CANOE
- ▁CONCLUD
- ▁QUIXOTE
- ▁SATISFACTORY
- ▁NIECE
- ▁DEAF
- ▁RAFT
- ▁JIMMY
- ▁GLID
- ▁REGULAT
- ▁CHATTER
- ▁GLACIER
- ▁ENVY
- ▁STATUE
- ▁BOSTON
- ▁RICHMOND
- ▁DENIED
- ▁FANNY
- ▁SOLOMON
- ▁VULGAR
- ▁STALK
- ▁REPLACE
- ▁SPOON
- ▁BASIN
- ▁FEATURE
- ▁CONVICT
- ▁ARCHITECT
- ▁ADMIRAL
- ▁RIBBON
- ▁PERMANENT
- ▁APRIL
- ▁JOLLY
- ▁NEIGHBORHOOD
- ▁IMPART
- BOROUGH
- CAMP
- ▁HORRID
- ▁IMMORTAL
- ▁PRUDENCE
- ▁SPANIARD
- ▁SUPPOSING
- ▁TELEPHONE
- ▁TEMPERATURE
- ▁PENETRATE
- ▁OYSTER
- ▁APPOINTMENT
- ▁EGYPTIAN
- ▁DWELT
- ▁NEPHEW
- ▁RAILROAD
- ▁SEPTEMBER
- ▁DEVICE
- ▁WHEAT
- ▁GILBERT
- ▁ELEGANT
- ▁ADVERTISE
- ▁RATIONAL
- ▁TURTLE
- ▁BROOD
- ▁ASSEMBLY
- ▁CULTIVATE
- ▁EDITOR
- ▁SPECIMEN
- ▁UNDOUBTEDLY
- ▁WHALE
- ▁DROPPING
- ▁BALLOON
- ▁MEDICAL
- COMB
- ▁COMPOSITION
- ▁FOOTSTEPS
- ▁LAUNCELOT
- ▁DISCOURSE
- ▁ERRAND
- ▁CONVERSE
- ▁ADVANCING
- ▁DOWNSTAIRS
- ▁TUMULT
- ▁CORRUPT
- ▁SUFFICE
- ▁ANGUISH
- ▁SHAGGY
- ▁RETIRE
- ▁TIMBER
- ▁BLAZE
- ▁ABSTRACT
- ▁EMBROIDER
- ▁PHOTOGRAPH
- ▁PROSPERITY
- ▁TERRIBLY
- ▁TERRITORY
- ▁THRESHOLD
- ▁PAVEMENT
- ▁INJURED
- ▁LIMP
- ▁AGITATION
- ▁RASCAL
- ▁PRESUME
- ▁OBSERVING
- ▁OBSTACLE
- ▁SIMPLICITY
- ▁SLUMBER
- ▁SUPPLIED
- ▁COMBINATION
- ▁DRAIN
- ▁WILDERNESS
- ▁BELIEVING
- ▁VILLAIN
- ▁RECKLESS
- ▁INJURY
- ▁CLAPP
- ▁FRIDAY
- ▁HERCULES
- ▁KENNEDY
- ▁SYMPTOM
- ▁SLEDGE
- ▁CEILING
- ▁LEMON
- ▁PLAGUE
- ▁MONDAY
- ▁CANVAS
- ▁IMPATIENCE
- ▁UNCOMFORTABLE
- ▁ACCESS
- ▁FROZEN
- ▁SENATOR
- ▁FRANZ
- ▁SWIMMING
- ▁BARRIER
- ▁ADJUST
- ▁COMPARISON
- ▁PROCLAIM
- ▁WRINKL
- ▁OVERLOOK
- ▁MITYA
- ▁GUILT
- ▁PERCEPTION
- ▁PRECAUTION
- ▁SPECTATOR
- ▁SURPRISING
- ▁DISTRACT
- ▁DISDAIN
- ▁BONNET
- ▁MAGNET
- ▁PROFESS
- ▁CONFOUND
- ▁NARRATIVE
- ▁STRUCTURE
- ▁SKETCH
- ▁ULTIMATE
- ▁GLOBE
- ▁INSECT
- FICIENCY
- ▁ORCHARD
- ▁AMIABLE
- ▁DESCENT
- ▁INDEPENDENCE
- ▁MANUFACTURE
- ▁SPRINKLE
- ▁NIGHTINGALE
- ▁CUSHION
- ▁EMINENT
- ▁SCOTT
- ▁ARRAY
- ▁COSETTE
- ▁WAVING
- ▁EXTRACT
- ▁IRREGULAR
- ▁PERSECUT
- ▁DERIVED
- ▁WITHDREW
- ▁CAUTION
- ▁SUSPICIOUS
- ▁MEMORIES
- ▁NOWHERE
- ▁SUBTLE
- ▁THOROUGH
- Q
- ▁APPROPRIATE
- ▁SLAUGHTER
- ▁YOURSELVES
- ▁THUMB
- ▁TWAS
- ▁ABODE
- ▁BIDDING
- ▁CONSPICUOUS
- ▁REBECCA
- ▁SERGEANT
- ▁APRON
- ▁ANTICIPATE
- ▁DISCIPLINE
- ▁GLANCING
- ▁PILGRIM
- ▁SULLEN
- ▁CONTRIBUTE
- ▁PRAIRIE
- ▁CARVED
- ▁COMMERCE
- ▁EXCLAMATION
- ▁MUSCULAR
- ▁NOVEMBER
- ▁PHENOMENA
- ▁SYMBOL
- ▁UMBRELLA
- ▁DIMINISH
- ▁PARLOUR
- ▁THREATENING
- ▁STUMP
- ▁EXTENSIVE
- ▁PLEASING
- ▁REMEMBRANCE
- ▁COMBINED
- ▁SHERIFF
- ▁SHAFT
- ▁LAURA
- ▁INTERCOURSE
- ▁STRICKEN
- ▁SUPPLIES
- ▁LANDLORD
- ▁SHRINK
- ▁PRICK
- ▁CAESAR
- ▁DRUG
- ▁BEWILDERED
- ▁NAUTILUS
- ▁BRUTAL
- ▁COMMERCIAL
- ▁MAGGIE
- ▁SPHERE
- ▁VIRGIN
- ▁BRETHREN
- ▁DESTINY
- ▁POLICY
- ▁TERRIFIED
- ▁HOUSEKEEPER
- ▁CRAZY
- ▁ARDENT
- ▁DISCERN
- ▁WRAP
- ▁MARQUIS
- ▁RUSSIA
- MOUTH
- ▁BRITAIN
- ▁HARBOUR
- ▁CONCERT
- ▁DONKEY
- ▁DAMAGE
- ▁SLIM
- ABOUT
- ▁LUXURY
- ▁MONSTROUS
- ▁TENDENCY
- ▁PARADISE
- ▁CULTURE
- ▁JULIUS
- ▁RAOUL
- ▁REMEDY
- ▁DECAY
- ▁SCOLD
- ▁SPLIT
- ▁ASSAULT
- ▁DECEMBER
- ▁MOSCOW
- ▁EXPLORE
- ▁TROUSERS
- ▁WRIST
- PIECE
- ▁MUSKET
- ▁VALENTINE
- ▁TYRANT
- ▁ABRAHAM
- ▁MEDIUM
- ▁ARTIFICIAL
- ▁FACULTY
- ▁OBLIGATION
- ▁RESEMBLANCE
- ▁INQUIRIES
- ▁DETAIN
- ▁SWARM
- ▁PLEDGE
- ▁ADMIRABLE
- ▁DEFECT
- ▁SUPERINTEND
- ▁PATRIOT
- ▁CLUNG
- ▁DISMAL
- ▁RECIT
- ▁IGNOR
- ▁AMELIA
- ▁JUSTIFY
- ▁ELEPHANT
- ▁ESTIMATE
- ▁KNELT
- ▁SERVING
- ▁WHIM
- ▁SHRILL
- ▁STUDIO
- ▁TEXT
- ▁ALEXANDER
- ▁WROUGHT
- ▁ABUNDANT
- ▁SITUATED
- ▁REGAIN
- ▁FIERY
- ▁SNEER
- ▁SWEAT
- ▁GLARE
- ▁NIGH
- ▁ESCORT
- ▁INEVITABLE
- ▁PSMITH
- ▁RELUCTANT
- ▁PRECEDING
- ▁RESORT
- ▁OUTRAGE
- ▁AMBASSADOR
- ▁CONSOLATION
- ▁RECOGNITION
- ▁REMORSE
- ▁BEHALF
- ▁FORMIDABLE
- ▁GRAVITY
- ▁DIVIDE
- ▁CONFRONT
- ▁GIGANTIC
- ▁OCTOBER
- ▁FLANK
- ▁SLEW
- ▁CLARA
- ▁FILM
- ▁BULK
- ▁POMP
- ▁ELEANOR
- ▁EMPHASIS
- ▁JAPANESE
- ▁CAVALRY
- ▁EXCLUSIVE
- ▁PERFUME
- ▁BRONZE
- ▁FEDERAL
- ▁LIQUID
- ▁RUBBING
- ▁OVEN
- DOLPH
- ▁CONVULS
- ▁DEPRIVED
- ▁RESPONSIBILITY
- ▁SIGNIFICANT
- ▁WAISTCOAT
- ▁CLUSTER
- ▁MARTHA
- ▁REVERSE
- ▁ATTORNEY
- ▁DROOP
- ▁SKILFUL
- ▁HABITUAL
- ▁PUMP
- ▁INTERVEN
- ▁OWL
- ▁CONJECTURE
- ▁FANTASTIC
- ▁RESPONSIBLE
- ▁DESTINED
- ▁DOCUMENT
- ▁THEREUPON
- ▁GODDESS
- ▁PACIFIC
- ▁WARRANT
- ▁COSTUME
- ▁BRIDLE
- ▁CALIFORNIA
- ▁DEMOCRATIC
- ▁EUSTACE
- ▁SQUIRREL
- ▁UNCOMMON
- ▁MARVELLOUS
- ▁PLOUGH
- ▁TRAGEDY
- ▁VAULT
- ▁HESITATE
- ▁REFRAIN
- ▁ADMIRING
- ▁CORPORAL
- ▁ENTITLED
- ▁SHREWD
- ▁SQUEEZ
- ▁ACCURATE
- ▁TEMPEST
- ▁MONUMENT
- ▁SIEGE
- ▁CHINESE
- ▁RAVEN
- ▁LOUNG
- ▁ASSASSIN
- ▁INFLICT
- ▁AGITATED
- ▁DESIRABLE
- ▁EARLIEST
- ▁LAUNCH
- ▁PILOT
- ▁PULSE
- ▁MUTE
- LEIGH
- ▁LIQUOR
- ▁SCARECROW
- ▁SKULL
- ▁DESOLATE
- ▁SUBLIME
- ▁SERENE
- ▁RECESS
- ▁WAKING
- ▁CHARLOTTE
- ▁CIRCULAR
- ▁INJUSTICE
- ▁PINOCCHIO
- ▁PRISCILLA
- ▁THYSELF
- ▁OCCURRENCE
- ▁CASUAL
- ▁FRANTIC
- ▁LEGEND
- ▁FERTIL
- ▁BACKGROUND
- ▁DELICACY
- ▁ESTRALLA
- ▁MANUSCRIPT
- ▁RESPONSE
- ▁UNIVERSITY
- ▁WOLVES
- ▁SCANDAL
- ▁STUMBLE
- ▁HOARSE
- ▁BODILY
- ▁CONVENT
- ▁EXAMINING
- ▁INCAPABLE
- ▁PERCEIVING
- ▁PHILADELPHIA
- ▁SUBSEQUENT
- ▁THIEVES
- ▁ACCUMULAT
- ▁DAMSEL
- ▁SCOTCH
- ▁UNDERNEATH
- ▁NOBILITY
- ▁SMASH
- ▁REVOLT
- ▁ENGAGE
- ▁CATHEDRAL
- ▁CHAMPION
- ▁DESPATCH
- ▁ETERNITY
- ▁JANUARY
- ▁PLEADED
- ▁PROBABILITY
- ▁JIMMIE
- ▁PARALLEL
- ▁FISHERMAN
- ▁JERRY
- ▁SWORE
- ▁DRAUGHT
- ▁OPPONENT
- ▁PRIMITIVE
- ▁SIGNIFICANCE
- ▁SUBSTANTIAL
- ▁AMAZED
- ▁DUNBAR
- ▁COMMEND
- ▁CONTEMPLATE
- ▁TESTIMONY
- ▁IMPERIAL
- ▁ADAPT
- ▁JUICE
- ▁CALAMIT
- CULAR
- ▁CHATEAU
- ▁PHOENIX
- ▁PRUDENT
- ▁SOLUTION
- ▁VILLEFORT
- ▁REACTION
- ▁RELAX
- ▁YU
- ▁PROHIBIT
- ▁DISTRUST
- ▁PLUNDER
- ▁WELFARE
- ▁NAVIGAT
- ▁PARLOR
- ▁LAZY
- ▁DETACH
- OMETER
- ▁PRIV
- ▁DISCOURAGE
- ▁OBSTINATE
- ▁REJOICING
- ▁SERMON
- ▁VEHICLE
- ▁FANCIES
- ▁ENLIGHTEN
- ▁ACUTE
- ▁ILLUSION
- ▁ANTHEA
- ▁MARTIAN
- ▁EXCITE
- ▁GENEROSITY
- OLOGIST
- ▁AMAZING
- ▁UNWORTHY
- ▁INTERNAL
- ▁INCENSE
- ▁VIBRAT
- ▁ADHERE
- ROACH
- ▁FEBRUARY
- ▁MEXICAN
- ▁POTATOES
- ▁INCESSANT
- ▁INTERPOSED
- ▁PARCEL
- ▁VEXED
- ▁PROMOTE
- MIDST
- ▁ARISTOCRAT
- ▁CYRIL
- ▁EMBARK
- ▁ABUNDANCE
- ▁LITERALLY
- ▁SURGEON
- ▁TERRACE
- ▁ATLANTIC
- ▁MARTYR
- ▁SPECK
- ▁SENATE
- ▁LOAF
- ▁ADMINISTER
- ▁APPREHEND
- ▁SUBDUED
- ▁TEMPORARY
- ▁DOMINION
- ▁ELABORATE
- ▁DIGNIFIED
- ▁ELIZA
- ▁SPLASH
- ▁CONSEIL
- ▁DEXTER
- ▁UNSEEN
- ▁TRAGIC
- VOCATION
- ▁GRATIFY
- ▁BACHELOR
- ▁DEFENSE
- ▁EXCURSION
- ▁FACULTIES
- ▁PROPRIETOR
- ▁SYMPATHETIC
- ▁UNNECESSARY
- ▁RADIANT
- ▁VACANT
- ▁OUNCE
- ▁SCREW
- ▁PHENOMENON
- ▁PROMINENT
- ▁WORRIED
- ▁STUDIES
- ▁CLIMATE
- ▁KEITH
- ▁ARAMIS
- ▁BLISS
- ▁CONTINUAL
- ▁SURPASS
- ▁HEBREW
- ▁IDENTITY
- ▁PROVOKE
- ▁TEMPERAMENT
- ▁CHARIOT
- ▁HARBOR
- ▁NINTH
- ▁PRIOR
- ▁DESIROUS
- ▁JERUSALEM
- ▁UNDERTAKING
- ▁EDISON
- ▁MIRTH
- ▁SCOUT
- ▁APPARATUS
- ▁ILLUSTRATION
- ▁INTELLIGIBLE
- ▁INVARIABLY
- ▁PIERCED
- ▁REVIEW
- ▁FLICKER
- ▁HAZARD
- ▁REVELATION
- ▁DIXON
- ▁EXCITING
- ▁GOSPEL
- ▁CONSTANCE
- ▁OVERTAKE
- ▁GUINEA
- ▁ALADDIN
- ▁CHICAGO
- ▁TULLIVER
- ▁HAMILTON
- ▁GARRISON
- ▁DISCIPLE
- ▁INTENSITY
- ▁TRAITOR
- ▁CHANCELLOR
- ▁PROVERB
- ▁DAGGER
- ▁FORESEE
- ▁CONFIDE
- ▁GLIMMER
- ▁CHAUVELIN
- ▁ILLUSTRATE
- ▁VOLUNTEER
- ▁JUNGLE
- ▁STREAK
- ▁SUNRISE
- ▁DISSOLV
- ▁QUEST
- ▁AWHILE
- ▁FELICITY
- ▁LEGISLATURE
- ▁LEONORA
- ▁MAGAZINE
- ▁PITIFUL
- ▁COLONY
- ▁SHAWL
- ▁ARRIVING
- ▁FUNDAMENTAL
- ▁CARPENTER
- ▁OVERFLOW
- ▁EXPAND
- ▁HARVEST
- ▁FEMININE
- ▁INNUMERABLE
- ▁SCRAMBLE
- ▁TWENTIETH
- ▁TRIFLING
- ▁GHASTL
- ▁CONQUEST
- ▁DANIEL
- ▁FACILIT
- ▁FORSAKE
- ▁BEHAVIOUR
- ▁GORGEOUS
- ▁PRODUCING
- ▁HAPPIER
- ▁PROMISING
- ▁RAINBOW
- ▁INSTINCTIVELY
- ▁DECREE
- ▁EYEBROWS
- ▁IRRESISTIBLE
- ▁PHARAOH
- ▁SCROOGE
- ▁UNNATURAL
- ▁CRUMBS
- ▁REFINED
- ▁DREARY
- ▁TRENCH
- ▁CONVINCE
- ▁FRINGE
- ▁EXTREMITY
- ▁INTIMACY
- ▁SCOUNDREL
- ▁SUFFRAGE
- ▁UNEASINESS
- ▁BARRICADE
- ▁CIRCULAT
- ▁SAMUEL
- ▁BRUCE
- ▁DARCY
- <sos/eos>
init: null
input_size: null
ctc_conf:
dropout_rate: 0.0
ctc_type: builtin
reduce: true
ignore_nan_grad: true
joint_net_conf: null
model_conf:
ctc_weight: 0.3
lsm_weight: 0.1
length_normalized_loss: false
use_preprocessor: true
token_type: bpe
bpemodel: data/en_token_list/bpe_unigram5000/bpe.model
non_linguistic_symbols: null
cleaner: null
g2p: null
speech_volume_normalize: null
rir_scp: null
rir_apply_prob: 1.0
noise_scp: null
noise_apply_prob: 1.0
noise_db_range: '13_15'
frontend: default
frontend_conf:
n_fft: 512
hop_length: 256
fs: 16k
specaug: specaug
specaug_conf:
apply_time_warp: true
time_warp_window: 5
time_warp_mode: bicubic
apply_freq_mask: true
freq_mask_width_range:
- 0
- 27
num_freq_mask: 2
apply_time_mask: true
time_mask_width_ratio_range:
- 0.0
- 0.05
num_time_mask: 10
normalize: global_mvn
normalize_conf:
stats_file: exp/asr_stats_raw_en_bpe5000_sp/train/feats_stats.npz
preencoder: null
preencoder_conf: {}
encoder: conformer
encoder_conf:
output_size: 512
attention_heads: 8
linear_units: 2048
num_blocks: 12
dropout_rate: 0.1
positional_dropout_rate: 0.1
attention_dropout_rate: 0.1
input_layer: conv2d
normalize_before: true
macaron_style: true
rel_pos_type: latest
pos_enc_layer_type: rel_pos
selfattention_layer_type: rel_selfattn
activation_type: swish
use_cnn_module: true
cnn_module_kernel: 31
postencoder: null
postencoder_conf: {}
decoder: transformer
decoder_conf:
attention_heads: 8
linear_units: 2048
num_blocks: 6
dropout_rate: 0.1
positional_dropout_rate: 0.1
self_attention_dropout_rate: 0.1
src_attention_dropout_rate: 0.1
required:
- output_dir
- token_list
version: 0.10.7a1
distributed: true
```
</details>
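Given the Conformer encoder and Transformer decoder specified in the configuration above, decoding can be sketched with the ESPnet2 inference API. The snippet below is a minimal sketch, not taken from this card: `<model_tag>` is a placeholder for the checkpoint's actual Hub ID, and the decode options are assumptions that mirror `model_conf`.

```python
# Inference sketch (assumptions: espnet_model_zoo is installed; <model_tag>
# is a placeholder for this checkpoint's actual model ID).
import soundfile
from espnet2.bin.asr_inference import Speech2Text

speech2text = Speech2Text.from_pretrained(
    "<model_tag>",
    ctc_weight=0.3,  # mirrors ctc_weight in model_conf above
    beam_size=10,    # illustrative choice, not from the config
)

speech, rate = soundfile.read("speech.wav")  # 16 kHz mono, matching frontend_conf fs: 16k
text, tokens, token_ids, hyp = speech2text(speech)[0]  # best hypothesis
print(text)
```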
### Citing ESPnet
```bibtex
@inproceedings{watanabe2018espnet,
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
title={{ESPnet}: End-to-End Speech Processing Toolkit},
year={2018},
booktitle={Proceedings of Interspeech},
pages={2207--2211},
doi={10.21437/Interspeech.2018-1456},
url={http://dx.doi.org/10.21437/Interspeech.2018-1456}
}
```
or arXiv:
```bibtex
@misc{watanabe2018espnet,
title={ESPnet: End-to-End Speech Processing Toolkit},
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
year={2018},
eprint={1804.00015},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
Kuray107/librispeech-semi-supervised-without-LM
|
Kuray107
| 2022-03-07T17:14:04Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-07T03:31:57Z |
---
tags:
- generated_from_trainer
model-index:
- name: librispeech-semi-supervised-without-LM
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# librispeech-semi-supervised-without-LM
This model was trained from scratch on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1837
- Wer: 0.0580
## Model description
More information needed
## Intended uses & limitations
More information needed
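Pending details from the author, the sketch below shows one plausible way to run the checkpoint; it assumes a standard wav2vec2 CTC head and 16 kHz input, neither of which is confirmed by this card.

```python
# Minimal ASR usage sketch (an assumption, not from the original card):
# a wav2vec2 checkpoint normally works with the ASR pipeline directly.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="Kuray107/librispeech-semi-supervised-without-LM",
)
result = asr("speech.wav")  # hypothetical 16 kHz mono file, as in LibriSpeech
print(result["text"])
```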
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 15
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.0565 | 0.56 | 1000 | 0.1354 | 0.0641 |
| 0.0548 | 1.12 | 2000 | 0.1320 | 0.0628 |
| 0.0478 | 1.68 | 3000 | 0.1247 | 0.0612 |
| 0.0451 | 2.24 | 4000 | 0.1256 | 0.0613 |
| 0.0401 | 2.8 | 5000 | 0.1269 | 0.0606 |
| 0.035 | 3.36 | 6000 | 0.1370 | 0.0595 |
| 0.0344 | 3.92 | 7000 | 0.1280 | 0.0589 |
| 0.031 | 4.48 | 8000 | 0.1350 | 0.0589 |
| 0.031 | 5.04 | 9000 | 0.1418 | 0.0614 |
| 0.0278 | 5.61 | 10000 | 0.1382 | 0.0604 |
| 0.0272 | 6.17 | 11000 | 0.1502 | 0.0615 |
| 0.0246 | 6.73 | 12000 | 0.1443 | 0.0609 |
| 0.0233 | 7.29 | 13000 | 0.1548 | 0.0589 |
| 0.0224 | 7.85 | 14000 | 0.1547 | 0.0599 |
| 0.0202 | 8.41 | 15000 | 0.1570 | 0.0590 |
| 0.0199 | 8.97 | 16000 | 0.1564 | 0.0594 |
| 0.0186 | 9.53 | 17000 | 0.1598 | 0.0595 |
| 0.0187 | 10.09 | 18000 | 0.1657 | 0.0585 |
| 0.017 | 10.65 | 19000 | 0.1690 | 0.0584 |
| 0.016 | 11.21 | 20000 | 0.1689 | 0.0588 |
| 0.0156 | 11.77 | 21000 | 0.1745 | 0.0585 |
| 0.0151 | 12.33 | 22000 | 0.1777 | 0.0583 |
| 0.0144 | 12.89 | 23000 | 0.1778 | 0.0590 |
| 0.0142 | 13.45 | 24000 | 0.1803 | 0.0585 |
| 0.0137 | 14.01 | 25000 | 0.1796 | 0.0581 |
| 0.0132 | 14.57 | 26000 | 0.1837 | 0.0580 |
### Framework versions
- Transformers 4.14.1
- Pytorch 1.10.2
- Datasets 1.18.2
- Tokenizers 0.10.3
|
Kevincp560/distilbart-cnn-12-3-finetuned-pubmed
|
Kevincp560
| 2022-03-07T15:55:27Z | 5 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bart",
"text2text-generation",
"generated_from_trainer",
"dataset:pub_med_summarization_dataset",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-07T10:26:20Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- pub_med_summarization_dataset
metrics:
- rouge
model-index:
- name: distilbart-cnn-12-3-finetuned-pubmed
results:
- task:
name: Sequence-to-sequence Language Modeling
type: text2text-generation
dataset:
name: pub_med_summarization_dataset
type: pub_med_summarization_dataset
args: document
metrics:
- name: Rouge1
type: rouge
value: 40.5642
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbart-cnn-12-3-finetuned-pubmed
This model is a fine-tuned version of [sshleifer/distilbart-cnn-12-3](https://huggingface.co/sshleifer/distilbart-cnn-12-3) on the pub_med_summarization_dataset dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1743
- Rouge1: 40.5642
- Rouge2: 16.9812
- Rougel: 25.3449
- Rougelsum: 36.46
- Gen Len: 141.95
## Model description
More information needed
## Intended uses & limitations
More information needed
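Absent official guidance, a hedged summarization sketch follows; it assumes the standard summarization pipeline applies to this BART checkpoint, and the length limits are guesses informed by the Gen Len of roughly 142 reported above.

```python
# Sketch only: standard summarization pipeline usage; length settings are
# illustrative guesses, not values from the original card.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="Kevincp560/distilbart-cnn-12-3-finetuned-pubmed",
)
article = "BACKGROUND: ..."  # placeholder PubMed-style article text
print(summarizer(article, max_length=160, min_length=60)[0]["summary_text"])
```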
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:--------:|
| 2.469 | 1.0 | 4000 | 2.2956 | 38.3713 | 15.2594 | 23.6734 | 34.1634 | 141.707 |
| 2.2527 | 2.0 | 8000 | 2.1994 | 39.5939 | 16.2376 | 24.6363 | 35.5106 | 141.831 |
| 2.0669 | 3.0 | 12000 | 2.1780 | 40.078 | 16.6705 | 25.1119 | 35.9605 | 141.8475 |
| 1.9275 | 4.0 | 16000 | 2.1669 | 40.0825 | 16.6169 | 24.9702 | 36.0191 | 141.928 |
| 1.8102 | 5.0 | 20000 | 2.1743 | 40.5642 | 16.9812 | 25.3449 | 36.46 | 141.95 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.6
|
kenjis2542/mt5-small-finetuned-5k-th-to-en
|
kenjis2542
| 2022-03-07T14:11:40Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"mt5",
"text2text-generation",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-07T12:49:31Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: mt5-small-finetuned-5k-th-to-en
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# mt5-small-finetuned-5k-th-to-en
This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
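As a placeholder until usage is documented, the following sketch assumes the model translates Thai to English as a plain text2text task with no task prefix; both assumptions are unverified.

```python
# Hedged sketch: mT5 fine-tuned for Thai-to-English; prompt format is assumed.
from transformers import pipeline

translator = pipeline(
    "text2text-generation",
    model="kenjis2542/mt5-small-finetuned-5k-th-to-en",
)
print(translator("สวัสดีครับ")[0]["generated_text"])  # expected: an English greeting
```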
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
- mixed_precision_training: Native AMP
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6
|
abhishek/autonlp-swahili-sentiment-615517563
|
abhishek
| 2022-03-07T12:54:03Z | 16 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bert",
"text-classification",
"autonlp",
"unk",
"dataset:abhishek/autonlp-data-swahili-sentiment",
"co2_eq_emissions",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2022-03-07T12:52:44Z |
---
tags: autonlp
language: unk
widget:
- text: "I love AutoNLP 🤗"
datasets:
- abhishek/autonlp-data-swahili-sentiment
co2_eq_emissions: 1.9057858628956459
---
# Model Trained Using AutoNLP
- Problem type: Multi-class Classification
- Model ID: 615517563
- CO2 Emissions (in grams): 1.9057858628956459
## Validation Metrics
- Loss: 0.6990908980369568
- Accuracy: 0.695364238410596
- Macro F1: 0.6088819062581828
- Micro F1: 0.695364238410596
- Weighted F1: 0.677326207350606
- Macro Precision: 0.6945099492363175
- Micro Precision: 0.695364238410596
- Weighted Precision: 0.6938596845881614
- Macro Recall: 0.5738408020723632
- Micro Recall: 0.695364238410596
- Weighted Recall: 0.695364238410596
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/abhishek/autonlp-swahili-sentiment-615517563
```
Or Python API:
```
from transformers import AutoModelForSequenceClassification, AutoTokenizer
model = AutoModelForSequenceClassification.from_pretrained("abhishek/autonlp-swahili-sentiment-615517563", use_auth_token=True)
tokenizer = AutoTokenizer.from_pretrained("abhishek/autonlp-swahili-sentiment-615517563", use_auth_token=True)
inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model(**inputs)
```
|
Splend1dchan/byt5small-glue-mprc
|
Splend1dchan
| 2022-03-07T11:14:07Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-07T10:51:18Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: byt5small-glue-mprc
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# byt5small-glue-mprc
This model is a fine-tuned version of [google/byt5-small](https://huggingface.co/google/byt5-small) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
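Since the card does not document usage, here is a sketch under stated assumptions: the model is queried with the T5-style GLUE MRPC prompt convention; the actual fine-tuning prompt format is unknown.

```python
# Sketch only; the fine-tuning prompt format is not documented, so the
# T5 GLUE convention ("mrpc sentence1: ... sentence2: ...") is assumed.
from transformers import pipeline

clf = pipeline("text2text-generation", model="Splend1dchan/byt5small-glue-mprc")
prompt = "mrpc sentence1: He agreed to the plan. sentence2: He said yes to the plan."
print(clf(prompt)[0]["generated_text"])  # e.g. "equivalent" / "not_equivalent"
```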
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- distributed_type: tpu
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
### Framework versions
- Transformers 4.18.0.dev0
- Pytorch 1.6.0a0+bf2bbd9
- Datasets 1.12.1
- Tokenizers 0.11.6
|
cammy/bart-large-cnn-1000-pad-early-lit
|
cammy
| 2022-03-07T10:56:33Z | 5 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bart",
"text2text-generation",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-07T10:31:23Z |
---
license: mit
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-large-cnn-1000-pad-early-lit
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bart-large-cnn-1000-pad-early-lit
This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4800
- Rouge1: 28.4538
- Rouge2: 13.5656
- Rougel: 22.2066
- Rougelsum: 25.3361
- Gen Len: 66.53
## Model description
More information needed
## Intended uses & limitations
More information needed
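In lieu of author-provided usage, a direct-load sketch follows; it assumes standard BART seq2seq generation, and the generation settings are illustrative only.

```python
# Hedged sketch: load and summarize with the seq2seq auto classes.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "cammy/bart-large-cnn-1000-pad-early-lit"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("A long article to summarize ...", return_tensors="pt", truncation=True)
ids = model.generate(**inputs, num_beams=4, max_length=128)  # illustrative settings
print(tokenizer.decode(ids[0], skip_special_tokens=True))
```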
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 0.1556 | 1.0 | 1000 | 0.4383 | 29.1275 | 14.1415 | 22.5802 | 26.37 | 65.93 |
| 0.0853 | 2.0 | 2000 | 0.4800 | 28.4538 | 13.5656 | 22.2066 | 25.3361 | 66.53 |
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.2
- Datasets 1.18.3
- Tokenizers 0.11.0
|
spy24/autonlp-parrot_paraphrasing-615317556
|
spy24
| 2022-03-07T09:36:20Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"t5",
"text2text-generation",
"autonlp",
"unk",
"dataset:spy24/autonlp-data-parrot_paraphrasing",
"co2_eq_emissions",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-07T09:35:01Z |
---
tags: autonlp
language: unk
widget:
- text: "I love AutoNLP 🤗"
datasets:
- spy24/autonlp-data-parrot_paraphrasing
co2_eq_emissions: 0.8335491678002559
---
# Model Trained Using AutoNLP
- Problem type: Summarization
- Model ID: 615317556
- CO2 Emissions (in grams): 0.8335491678002559
## Validation Metrics
- Loss: 0.0001514342293376103
- Rouge1: 100.0
- Rouge2: 51.4451
- RougeL: 100.0
- RougeLsum: 100.0
- Gen Len: 4.104
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_HUGGINGFACE_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/spy24/autonlp-parrot_paraphrasing-615317556
```
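Or, mirroring the Python pattern used by other AutoNLP cards in this collection (an addition, not from the original card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "spy24/autonlp-parrot_paraphrasing-615317556"
tokenizer = AutoTokenizer.from_pretrained(model_id, use_auth_token=True)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id, use_auth_token=True)

inputs = tokenizer("I love AutoNLP", return_tensors="pt")
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```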
|
spy24/autonlp-optimized-paraphrasing-615217541
|
spy24
| 2022-03-07T08:56:14Z | 5 | 0 |
transformers
|
[
"transformers",
"pytorch",
"t5",
"text2text-generation",
"autonlp",
"unk",
"dataset:spy24/autonlp-data-optimized-paraphrasing",
"co2_eq_emissions",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-07T08:54:32Z |
---
tags: autonlp
language: unk
widget:
- text: "I love AutoNLP 🤗"
datasets:
- spy24/autonlp-data-optimized-paraphrasing
co2_eq_emissions: 1.166696812121839
---
# Model Trained Using AutoNLP
- Problem type: Summarization
- Model ID: 615217541
- CO2 Emissions (in grams): 1.166696812121839
## Validation Metrics
- Loss: 0.00019549368880689144
- Rouge1: 100.0
- Rouge2: 51.4451
- RougeL: 100.0
- RougeLsum: 100.0
- Gen Len: 4.104
## Usage
You can use cURL to access this model:
```
$ curl -X POST -H "Authorization: Bearer YOUR_HUGGINGFACE_API_KEY" -H "Content-Type: application/json" -d '{"inputs": "I love AutoNLP"}' https://api-inference.huggingface.co/models/spy24/autonlp-optimized-paraphrasing-615217541
```
|
cammy/bart-large-cnn-finetuned-weaksup-1000-pad-early-new1
|
cammy
| 2022-03-07T06:18:16Z | 5 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bart",
"text2text-generation",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-07T06:01:26Z |
---
license: mit
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-large-cnn-finetuned-weaksup-1000-pad-early-new1
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bart-large-cnn-finetuned-weaksup-1000-pad-early-new1
This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4948
- Rouge1: 28.1465
- Rouge2: 13.4076
- Rougel: 22.2763
- Rougelsum: 25.2087
- Gen Len: 68.58
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 0.156 | 1.0 | 1000 | 0.4377 | 27.8782 | 13.1274 | 21.2329 | 24.6465 | 66.25 |
| 0.0843 | 2.0 | 2000 | 0.4948 | 28.1465 | 13.4076 | 22.2763 | 25.2087 | 68.58 |
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.2
- Datasets 1.18.3
- Tokenizers 0.11.0
|
Splend1dchan/byt5small-squad-5000
|
Splend1dchan
| 2022-03-07T04:39:29Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"t5",
"text2text-generation",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-07T04:25:56Z |
ByT5-small trained on SQuAD for 5,000 steps, with input length 512 and output length 256.
The tokenizer is the standard ByT5 byte-level tokenizer.
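A minimal inference sketch (not part of the original description); the `question: ... context: ...` prompt format is an assumption about how the model was fine-tuned:
```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# Hypothetical usage sketch; the prompt format below is an assumption.
model_name = "Splend1dchan/byt5small-squad-5000"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

text = "question: Where is the Eiffel Tower? context: The Eiffel Tower is in Paris."
inputs = tokenizer(text, return_tensors="pt", max_length=512, truncation=True)
outputs = model.generate(**inputs, max_length=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```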
|
meame2010/rare-puppers
|
meame2010
| 2022-03-07T00:03:06Z | 68 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"vit",
"image-classification",
"huggingpics",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
image-classification
| 2022-03-07T00:02:57Z |
---
tags:
- image-classification
- pytorch
- huggingpics
metrics:
- accuracy
model-index:
- name: rare-puppers
results:
- task:
name: Image Classification
type: image-classification
metrics:
- name: Accuracy
type: accuracy
value: 0.644444465637207
---
# rare-puppers
Autogenerated by HuggingPics🤗🖼️
Create your own image classifier for **anything** by running [the demo on Google Colab](https://colab.research.google.com/github/nateraw/huggingpics/blob/main/HuggingPics.ipynb).
Report any issues with the demo at the [github repo](https://github.com/nateraw/huggingpics).
## Example Images
#### dog drinking water

#### dog eating food

#### dog playing toy

#### dog sleeping

|
PhilSad/GPT-J6B-Guided-SCP
|
PhilSad
| 2022-03-06T22:52:07Z | 9 | 2 |
transformers
|
[
"transformers",
"pytorch",
"gptj",
"text-generation",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2022-03-02T23:29:04Z |
An attempt at guided text generation to replace GPT-3 for [This SCP Does Not Exist](https://www.thisscpdoesnotexist.ml).
Work in progress.
Fine-tuned on a dataset of 1,700 automatically generated samples from the [official SCP wiki](https://scp-wiki.wikidot.com/).
Example input:
```
Prompt: SCP-9741 is a pair of jeans that looks really cool ### Generation: Item #: SCP-9741\nObject Class: Safe\nSpecial Containment Procedures:
```
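A minimal generation sketch following the prompt format above (an illustration, not the site's production code); loading GPT-J 6B in full precision needs roughly 24 GB of memory:
```python
from transformers import pipeline

# Illustrative only: stock text-generation pipeline, default decoding.
generator = pipeline("text-generation", model="PhilSad/GPT-J6B-Guided-SCP")
prompt = (
    "Prompt: SCP-9741 is a pair of jeans that looks really cool "
    "### Generation: Item #: SCP-9741\nObject Class: Safe\n"
    "Special Containment Procedures:"
)
print(generator(prompt, max_length=200)[0]["generated_text"])
```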
# Acknowledgment
This work was made possible thanks to the TPU Research Cloud program by Google.
|
Kevincp560/distilbart-cnn-12-6-finetuned-pubmed
|
Kevincp560
| 2022-03-06T22:33:03Z | 4 | 1 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bart",
"text2text-generation",
"generated_from_trainer",
"dataset:pub_med_summarization_dataset",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-06T16:25:29Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- pub_med_summarization_dataset
metrics:
- rouge
model-index:
- name: distilbart-cnn-12-6-finetuned-pubmed
results:
- task:
name: Sequence-to-sequence Language Modeling
type: text2text-generation
dataset:
name: pub_med_summarization_dataset
type: pub_med_summarization_dataset
args: document
metrics:
- name: Rouge1
type: rouge
value: 40.0985
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbart-cnn-12-6-finetuned-pubmed
This model is a fine-tuned version of [sshleifer/distilbart-cnn-12-6](https://huggingface.co/sshleifer/distilbart-cnn-12-6) on the pub_med_summarization_dataset dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9895
- Rouge1: 40.0985
- Rouge2: 16.5016
- Rougel: 24.8319
- Rougelsum: 36.0775
- Gen Len: 141.884
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:--------:|
| 2.1709 | 1.0 | 4000 | 2.0257 | 38.1012 | 15.112 | 23.4064 | 33.9373 | 141.9195 |
| 1.9495 | 2.0 | 8000 | 1.9593 | 39.529 | 16.1693 | 24.487 | 35.5238 | 141.9785 |
| 1.756 | 3.0 | 12000 | 1.9488 | 39.9623 | 16.5799 | 24.949 | 35.9194 | 141.8855 |
| 1.6032 | 4.0 | 16000 | 1.9732 | 39.672 | 16.1994 | 24.5996 | 35.7021 | 141.921 |
| 1.4817 | 5.0 | 20000 | 1.9895 | 40.0985 | 16.5016 | 24.8319 | 36.0775 | 141.884 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.6
|
Ayham/roberta_ernie_summarization_cnn_dailymail
|
Ayham
| 2022-03-06T22:01:31Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"encoder-decoder",
"text2text-generation",
"generated_from_trainer",
"dataset:cnn_dailymail",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-06T14:27:11Z |
---
tags:
- generated_from_trainer
datasets:
- cnn_dailymail
model-index:
- name: roberta_ernie_summarization_cnn_dailymail
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta_ernie_summarization_cnn_dailymail
This model is a fine-tuned version of [](https://huggingface.co/) on the cnn_dailymail dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 3.0
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.12.0.dev0
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.10.3
|
osanseviero/xlm-roberta-base-finetuned-panx-de-fr
|
osanseviero
| 2022-03-06T21:30:10Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"xlm-roberta",
"token-classification",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2022-03-06T20:35:13Z |
---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: xlm-roberta-base-finetuned-panx-de-fr
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-panx-de-fr
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1754
- F1: 0.8616
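A minimal usage sketch (not in the auto-generated card), assuming the checkpoint is a PAN-X German/French NER tagger served through the stock token-classification pipeline:
```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="osanseviero/xlm-roberta-base-finetuned-panx-de-fr",
    aggregation_strategy="simple",  # merge word pieces into entity spans
)
print(ner("Angela Merkel besuchte Paris im Jahr 2020."))
```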
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 12
- eval_batch_size: 12
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.2815 | 1.0 | 1430 | 0.2079 | 0.8067 |
| 0.1521 | 2.0 | 2860 | 0.1759 | 0.8525 |
| 0.093 | 3.0 | 4290 | 0.1754 | 0.8616 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.1
- Datasets 1.18.0
- Tokenizers 0.10.3
|
cammy/bart-large-cnn-finetuned-weaksup-1000-pad-early-new
|
cammy
| 2022-03-06T17:51:08Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bart",
"text2text-generation",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-06T16:33:39Z |
---
license: mit
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-large-cnn-finetuned-weaksup-1000-pad-early-new
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bart-large-cnn-finetuned-weaksup-1000-pad-early-new
This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4896
- Rouge1: 29.4505
- Rouge2: 14.4038
- Rougel: 23.1757
- Rougelsum: 26.3813
- Gen Len: 66.55
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 0.154 | 1.0 | 1000 | 0.4255 | 27.2971 | 12.4331 | 20.851 | 23.9583 | 66.64 |
| 0.0806 | 2.0 | 2000 | 0.4896 | 29.4505 | 14.4038 | 23.1757 | 26.3813 | 66.55 |
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.2
- Datasets 1.18.3
- Tokenizers 0.11.0
|
Kuray107/swbd-5percent-supervised
|
Kuray107
| 2022-03-06T16:14:11Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-05T15:36:19Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: swbd-5percent-supervised
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swbd-5percent-supervised
This model is a fine-tuned version of [facebook/wav2vec2-large-lv60](https://huggingface.co/facebook/wav2vec2-large-lv60) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6970
- Wer: 0.1352
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 6.8534 | 0.64 | 1000 | 2.9535 | 1.0 |
| 1.8605 | 1.28 | 2000 | 0.7878 | 0.3719 |
| 0.9862 | 1.92 | 3000 | 0.5906 | 0.2684 |
| 0.8405 | 2.56 | 4000 | 0.5555 | 0.2151 |
| 0.6972 | 3.2 | 5000 | 0.5905 | 0.1992 |
| 0.6033 | 3.84 | 6000 | 0.4867 | 0.1781 |
| 0.5393 | 4.48 | 7000 | 0.5447 | 0.1805 |
| 0.529 | 5.12 | 8000 | 0.5398 | 0.1746 |
| 0.5072 | 5.77 | 9000 | 0.5093 | 0.1706 |
| 0.4331 | 6.41 | 10000 | 0.4990 | 0.1627 |
| 0.4837 | 7.05 | 11000 | 0.5319 | 0.1634 |
| 0.3867 | 7.69 | 12000 | 0.4866 | 0.1595 |
| 0.345 | 8.33 | 13000 | 0.5202 | 0.1582 |
| 0.372 | 8.97 | 14000 | 0.5396 | 0.1547 |
| 0.355 | 9.61 | 15000 | 0.5992 | 0.1493 |
| 0.3258 | 10.25 | 16000 | 0.5247 | 0.1527 |
| 0.3327 | 10.89 | 17000 | 0.5664 | 0.1512 |
| 0.3422 | 11.53 | 18000 | 0.5819 | 0.1456 |
| 0.2815 | 12.17 | 19000 | 0.5692 | 0.1453 |
| 0.2719 | 12.81 | 20000 | 0.5012 | 0.1476 |
| 0.2838 | 13.45 | 21000 | 0.5286 | 0.1454 |
| 0.2418 | 14.09 | 22000 | 0.6238 | 0.1486 |
| 0.2412 | 14.73 | 23000 | 0.5889 | 0.1456 |
| 0.2227 | 15.37 | 24000 | 0.5901 | 0.1459 |
| 0.2129 | 16.02 | 25000 | 0.5959 | 0.1454 |
| 0.2071 | 16.66 | 26000 | 0.6259 | 0.1427 |
| 0.2185 | 17.3 | 27000 | 0.6581 | 0.1437 |
| 0.1982 | 17.94 | 28000 | 0.6194 | 0.1411 |
| 0.1928 | 18.58 | 29000 | 0.5940 | 0.1409 |
| 0.1885 | 19.22 | 30000 | 0.6733 | 0.1417 |
| 0.1835 | 19.86 | 31000 | 0.6363 | 0.1393 |
| 0.1756 | 20.5 | 32000 | 0.6675 | 0.1382 |
| 0.1776 | 21.14 | 33000 | 0.6147 | 0.1407 |
| 0.1758 | 21.78 | 34000 | 0.6405 | 0.1420 |
| 0.1645 | 22.42 | 35000 | 0.6999 | 0.1401 |
| 0.1631 | 23.06 | 36000 | 0.6224 | 0.1385 |
| 0.1494 | 23.7 | 37000 | 0.6639 | 0.1374 |
| 0.1472 | 24.34 | 38000 | 0.6471 | 0.1373 |
| 0.1514 | 24.98 | 39000 | 0.6570 | 0.1395 |
| 0.1527 | 25.62 | 40000 | 0.6876 | 0.1375 |
| 0.1514 | 26.27 | 41000 | 0.6835 | 0.1376 |
| 0.1344 | 26.91 | 42000 | 0.6987 | 0.1372 |
| 0.1267 | 27.55 | 43000 | 0.7026 | 0.1362 |
| 0.1384 | 28.19 | 44000 | 0.7021 | 0.1366 |
| 0.1264 | 28.83 | 45000 | 0.7016 | 0.1355 |
| 0.1227 | 29.47 | 46000 | 0.6970 | 0.1352 |
### Framework versions
- Transformers 4.14.1
- Pytorch 1.10.2
- Datasets 1.18.2
- Tokenizers 0.10.3
|
crabz/distil-slovakbert-ner
|
crabz
| 2022-03-06T12:40:16Z | 5 | 1 |
transformers
|
[
"transformers",
"pytorch",
"roberta",
"token-classification",
"generated_from_trainer",
"dataset:wikiann",
"autotrain_compatible",
"region:us"
] |
token-classification
| 2022-03-06T12:17:02Z |
---
tags:
- generated_from_trainer
datasets:
- wikiann
inference: false
model-index:
- name: distil-slovakbert-ner
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distil-slovakbert-ner
This model is a fine-tuned version of [crabz/distil-slovakbert](https://huggingface.co/crabz/distil-slovakbert) on the wikiann sk dataset.
- F1: 0.9307
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10.0
### Training results
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.0+cu113
- Datasets 1.15.1
- Tokenizers 0.11.0
|
crabz/slovakbert-upos
|
crabz
| 2022-03-06T12:31:41Z | 5 | 0 |
transformers
|
[
"transformers",
"pytorch",
"roberta",
"token-classification",
"license:mit",
"autotrain_compatible",
"region:us"
] |
token-classification
| 2022-03-05T16:47:22Z |
---
license: mit
inference: false
---
|
AG/pretraining
|
AG
| 2022-03-06T12:27:50Z | 17 | 0 |
transformers
|
[
"transformers",
"pytorch",
"roberta",
"feature-extraction",
"endpoints_compatible",
"region:us"
] |
feature-extraction
| 2022-03-02T23:29:04Z |
Pre-trained on the clus_ chapter only.
|
Kuray107/librispeech-100h-supervised
|
Kuray107
| 2022-03-06T08:07:22Z | 13 | 0 |
transformers
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-02T23:29:04Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: librispeech-100h-supervised
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# librispeech-100h-supervised
This model is a fine-tuned version of [facebook/wav2vec2-large-lv60](https://huggingface.co/facebook/wav2vec2-large-lv60) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0955
- Wer: 0.0345
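A minimal transcription sketch (not part of the auto-generated card); it assumes the repository ships a processor and that input audio is 16 kHz mono:
```python
import numpy as np
import torch
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_name = "Kuray107/librispeech-100h-supervised"
processor = Wav2Vec2Processor.from_pretrained(model_name)
model = Wav2Vec2ForCTC.from_pretrained(model_name)

speech = np.zeros(16_000, dtype=np.float32)  # stand-in for 1 s of 16 kHz audio
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
print(processor.batch_decode(torch.argmax(logits, dim=-1)))
```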
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 24
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 15
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 4.8277 | 0.42 | 500 | 2.9071 | 1.0 |
| 2.0261 | 0.84 | 1000 | 0.3060 | 0.2496 |
| 0.2181 | 1.26 | 1500 | 0.1172 | 0.0873 |
| 0.1255 | 1.68 | 2000 | 0.0894 | 0.0637 |
| 0.0971 | 2.1 | 2500 | 0.0821 | 0.0560 |
| 0.078 | 2.52 | 3000 | 0.0751 | 0.0500 |
| 0.0706 | 2.94 | 3500 | 0.0721 | 0.0456 |
| 0.0609 | 3.36 | 4000 | 0.0755 | 0.0464 |
| 0.0572 | 3.78 | 4500 | 0.0705 | 0.0431 |
| 0.0528 | 4.2 | 5000 | 0.0715 | 0.0423 |
| 0.0481 | 4.62 | 5500 | 0.0691 | 0.0403 |
| 0.0471 | 5.04 | 6000 | 0.0743 | 0.0401 |
| 0.0412 | 5.46 | 6500 | 0.0757 | 0.0399 |
| 0.0416 | 5.88 | 7000 | 0.0688 | 0.0378 |
| 0.0391 | 6.3 | 7500 | 0.0704 | 0.0383 |
| 0.0367 | 6.72 | 8000 | 0.0742 | 0.0387 |
| 0.0349 | 7.14 | 8500 | 0.0732 | 0.0388 |
| 0.033 | 7.56 | 9000 | 0.0719 | 0.0374 |
| 0.0327 | 7.98 | 9500 | 0.0750 | 0.0369 |
| 0.0292 | 8.4 | 10000 | 0.0734 | 0.0368 |
| 0.0303 | 8.82 | 10500 | 0.0733 | 0.0365 |
| 0.0283 | 9.24 | 11000 | 0.0766 | 0.0357 |
| 0.0269 | 9.66 | 11500 | 0.0761 | 0.0350 |
| 0.0268 | 10.08 | 12000 | 0.0802 | 0.0359 |
| 0.0245 | 10.42 | 12500 | 0.0758 | 0.0354 |
| 0.023 | 10.84 | 13000 | 0.0775 | 0.0349 |
| 0.0186 | 11.26 | 13500 | 0.0817 | 0.0355 |
| 0.0176 | 11.68 | 14000 | 0.0853 | 0.0354 |
| 0.0163 | 12.1 | 14500 | 0.0880 | 0.0347 |
| 0.0156 | 12.52 | 15000 | 0.0864 | 0.0357 |
| 0.0141 | 12.94 | 15500 | 0.0897 | 0.0355 |
| 0.0134 | 13.36 | 16000 | 0.0915 | 0.0349 |
| 0.013 | 13.78 | 16500 | 0.0928 | 0.0350 |
| 0.0097 | 13.42 | 17000 | 0.0955 | 0.0345 |
### Framework versions
- Transformers 4.14.1
- Pytorch 1.10.2
- Datasets 1.18.2
- Tokenizers 0.10.3
|
clisi2000/distilbert-base-uncased-finetuned-emotion
|
clisi2000
| 2022-03-06T07:09:00Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:emotion",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2022-03-05T04:03:14Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- emotion
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-finetuned-emotion
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: emotion
type: emotion
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.9245
- name: F1
type: f1
value: 0.9246284188099615
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2183
- Accuracy: 0.9245
- F1: 0.9246
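A minimal usage sketch (not in the auto-generated card), assuming the standard text-classification pipeline:
```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="clisi2000/distilbert-base-uncased-finetuned-emotion",
)
print(classifier("I am so happy today!"))  # e.g. [{'label': 'joy', 'score': ...}]
```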
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.8174 | 1.0 | 250 | 0.3166 | 0.905 | 0.9023 |
| 0.2534 | 2.0 | 500 | 0.2183 | 0.9245 | 0.9246 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.2+cpu
- Datasets 1.16.1
- Tokenizers 0.10.1
|
Kuray107/librispeech-5h-supervised
|
Kuray107
| 2022-03-06T06:43:53Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-05T23:00:11Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: librispeech-5h-supervised
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# librispeech-5h-supervised
This model is a fine-tuned version of [facebook/wav2vec2-large-lv60](https://huggingface.co/facebook/wav2vec2-large-lv60) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2041
- Wer: 0.0624
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 100
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.7758 | 11.11 | 1000 | 0.3120 | 0.2337 |
| 0.1238 | 22.22 | 2000 | 0.1651 | 0.0826 |
| 0.0383 | 33.33 | 3000 | 0.1667 | 0.0712 |
| 0.023 | 44.44 | 4000 | 0.1893 | 0.0685 |
| 0.0166 | 55.56 | 5000 | 0.2008 | 0.0666 |
| 0.0131 | 66.67 | 6000 | 0.1942 | 0.0639 |
| 0.0106 | 77.78 | 7000 | 0.1979 | 0.0628 |
| 0.0091 | 88.89 | 8000 | 0.2027 | 0.0628 |
| 0.008 | 100.0 | 9000 | 0.2041 | 0.0624 |
### Framework versions
- Transformers 4.14.1
- Pytorch 1.10.2
- Datasets 1.18.2
- Tokenizers 0.10.3
|
huggingtweets/ragnar_furup
|
huggingtweets
| 2022-03-05T18:34:56Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"gpt2",
"text-generation",
"huggingtweets",
"en",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2022-03-05T18:34:14Z |
---
language: en
thumbnail: http://www.huggingtweets.com/ragnar_furup/1646505291174/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1500138558765608969/Qgc4pMtC_400x400.jpg')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
<div
style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">R4 G4.mp3🌻</div>
<div style="text-align: center; font-size: 14px;">@ragnar_furup</div>
</div>
I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).
Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!
## How does it work?
The model uses the following pipeline.

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).
## Training data
The model was trained on tweets from R4 G4.mp3🌻.
| Data | R4 G4.mp3🌻 |
| --- | --- |
| Tweets downloaded | 1695 |
| Retweets | 889 |
| Short tweets | 104 |
| Tweets kept | 702 |
[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3eum19q4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ragnar_furup's tweets.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/30kqu5u4) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/30kqu5u4/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingtweets/ragnar_furup')
generator("My dream is", num_return_sequences=5)
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Boris Dayma*
[](https://twitter.com/intent/follow?screen_name=borisdayma)
For more details, visit the project repository.
[](https://github.com/borisdayma/huggingtweets)
|
nimrah/my-wav2vec2-base-timit-demo-colab-my
|
nimrah
| 2022-03-05T17:06:37Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-05T15:19:10Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: my-wav2vec2-base-timit-demo-colab-my
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my-wav2vec2-base-timit-demo-colab-my
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5569
- Wer: 0.3481
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.4083 | 4.0 | 500 | 1.0932 | 0.7510 |
| 0.5536 | 8.0 | 1000 | 0.4965 | 0.4819 |
| 0.2242 | 12.0 | 1500 | 0.4779 | 0.4077 |
| 0.1249 | 16.0 | 2000 | 0.4921 | 0.4006 |
| 0.0844 | 20.0 | 2500 | 0.4809 | 0.3753 |
| 0.0613 | 24.0 | 3000 | 0.5307 | 0.3680 |
| 0.0459 | 28.0 | 3500 | 0.5569 | 0.3481 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.10.3
|
batterydata/batterybert-cased
|
batterydata
| 2022-03-05T16:20:02Z | 106 | 1 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"fill-mask",
"exbert",
"en",
"dataset:batterypapers",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2022-03-02T23:29:05Z |
---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- batterypapers
---
# BatteryBERT-cased model
Pretrained model on a large corpus of battery research papers using a masked language modeling (MLM) objective, starting with the [bert-base-cased](https://huggingface.co/bert-base-cased) weights. It was introduced in
[this paper](paper_link) and first released in
[this repository](https://github.com/ShuHuang/batterybert). This model is case-sensitive: it makes a difference between english and English.
## Model description
BatteryBERT is a transformers model pretrained on a large corpus of battery research papers in a self-supervised fashion, starting with the [bert-base-cased](https://huggingface.co/bert-base-cased) weights. This means
it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model
randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict
the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one
after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to
learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the BERT model as inputs.
## Training data
The BatteryBERT model was pretrained on the full text of battery papers only, after being initialized from the [bert-base-cased](https://huggingface.co/bert-base-cased) weights. The paper corpus contains a total of 400,366 battery research papers published between 2000 and June 2021 by the Royal Society of Chemistry (RSC), Elsevier, and Springer. The list of DOIs can be found on [GitHub](https://github.com/ShuHuang/batterybert/blob/main/corpus.txt).
## Training procedure
### Preprocessing
The texts are tokenized using WordPiece with a vocabulary size of 28,996 (no lowercasing, since the model is cased). The inputs of the model are
then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
The details of the masking procedure for each sentence are the following (see the sketch after this list):
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
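The same 80/10/10 scheme is implemented by the stock Hugging Face data collator; the snippet below is an illustration, not the authors' exact pretraining code:
```python
from transformers import BertTokenizer, DataCollatorForLanguageModeling

tokenizer = BertTokenizer.from_pretrained("batterydata/batterybert-cased")
collator = DataCollatorForLanguageModeling(
    tokenizer=tokenizer, mlm=True, mlm_probability=0.15
)
batch = collator([tokenizer("Lithium-ion cells degrade over time.")])
print(batch["input_ids"])  # some tokens replaced by [MASK] or random ids
print(batch["labels"])     # -100 everywhere except the selected positions
```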
### Pretraining
The model was trained on 8 NVIDIA DGX A100 GPUs for 1,000,000 steps with a batch size of 256. The sequence length was limited to 512 tokens. The optimizer used is Adam with a learning rate of 2e-5, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.
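The warmup-then-linear-decay schedule can be reproduced with the stock transformers helper; a sketch under the step counts quoted above (the one-parameter model and the use of AdamW are stand-ins):
```python
import torch
from transformers import get_linear_schedule_with_warmup

params = [torch.nn.Parameter(torch.zeros(1))]  # stand-in for model parameters
optimizer = torch.optim.AdamW(params, lr=2e-5, weight_decay=0.01)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=10_000, num_training_steps=1_000_000
)
optimizer.step()
scheduler.step()  # lr rises linearly for 10k steps, then decays to zero
```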
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
See the [model hub](https://huggingface.co/models?filter=batterybert) to look for fine-tuned versions on a task that
interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at models like GPT-2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='batterydata/batterybert-cased')
>>> unmasker("Hello I'm a [MASK] model.")
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('batterydata/batterybert-cased')
model = BertModel.from_pretrained('batterydata/batterybert-cased')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('batterydata/batterybert-cased')
model = TFBertModel.from_pretrained('batterydata/batterybert-cased')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
## Evaluation results
Final loss: 0.9609.
## Authors
Shu Huang: `sh2009 [at] cam.ac.uk`
Jacqueline Cole: `jmc61 [at] cam.ac.uk`
## Citation
BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement
|
batterydata/batterybert-uncased
|
batterydata
| 2022-03-05T16:18:02Z | 15 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"fill-mask",
"exbert",
"en",
"dataset:batterypapers",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2022-03-02T23:29:05Z |
---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- batterypapers
---
# BatteryBERT-uncased model
Pretrained model on a large corpus of battery research papers using a masked language modeling (MLM) objective, starting with the [bert-base-uncased](https://huggingface.co/bert-base-uncased) weights. It was introduced in
[this paper](paper_link) and first released in
[this repository](https://github.com/ShuHuang/batterybert). This model is uncased: it does not make a difference
between english and English.
## Model description
BatteryBERT is a transformers model pretrained on a large corpus of battery research papers in a self-supervised fashion, starting with the [bert-base-uncased](https://huggingface.co/bert-base-uncased) weights. This means
it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model
randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict
the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one
after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to
learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the BERT model as inputs.
## Training data
The BatteryBERT model was pretrained on the full text of battery papers only, after being initialized from the [bert-base-uncased](https://huggingface.co/bert-base-uncased) weights. The paper corpus contains a total of 400,366 battery research papers published between 2000 and June 2021 by the Royal Society of Chemistry (RSC), Elsevier, and Springer. The list of DOIs can be found on [GitHub](https://github.com/ShuHuang/batterybert/blob/main/corpus.txt).
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,522. The inputs of the model are
then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
### Pretraining
The model was trained on 8 NVIDIA DGX A100 GPUs for 1,000,000 steps with a batch size of 256. The sequence length was limited to 512 tokens. The optimizer used is Adam with a learning rate of 2e-5, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
See the [model hub](https://huggingface.co/models?filter=batterybert) to look for fine-tuned versions on a task that
interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at models like GPT-2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='batterydata/batterybert-uncased')
>>> unmasker("Hello I'm a [MASK] model.")
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('batterydata/batterybert-uncased')
model = BertModel.from_pretrained('batterydata/batterybert-uncased')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('batterydata/batterybert-uncased')
model = TFBertModel.from_pretrained('batterydata/batterybert-uncased')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
## Evaluation results
Final loss: 1.0317.
## Authors
Shu Huang: `sh2009 [at] cam.ac.uk`
Jacqueline Cole: `jmc61 [at] cam.ac.uk`
## Citation
BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement
|
batterydata/batteryscibert-uncased
|
batterydata
| 2022-03-05T16:14:28Z | 9 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"fill-mask",
"exbert",
"en",
"dataset:batterypapers",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2022-03-02T23:29:05Z |
---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- batterypapers
---
# BatterySciBERT-uncased model
Pretrained model on a large corpus of battery research papers using a masked language modeling (MLM) objective, starting with the [SciBERT-uncased](https://huggingface.co/allenai/scibert_scivocab_uncased) weights. It was introduced in
[this paper](paper_link) and first released in
[this repository](https://github.com/ShuHuang/batterybert). This model is uncased: it does not make a difference
between english and English.
## Model description
BatterySciBERT is a transformers model pretrained on a large corpus of battery research papers in a self-supervised fashion, starting with the [SciBERT-uncased](https://huggingface.co/allenai/scibert_scivocab_uncased) weights. This means
it was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts.
More precisely, it was pretrained with the Masked language modeling (MLM) objective. Taking a sentence, the model
randomly masks 15% of the words in the input, then runs the entire masked sentence through the model and has to predict
the masked words. This is different from traditional recurrent neural networks (RNNs) that usually see the words one
after the other, or from autoregressive models like GPT which internally mask the future tokens. It allows the model to
learn a bidirectional representation of the sentence.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the BERT model as inputs.
## Training data
The BatterySciBERT model was pretrained on the full text of battery papers only, after being initialized from the [SciBERT-uncased](https://huggingface.co/allenai/scibert_scivocab_uncased) weights. The paper corpus contains a total of 400,366 battery research papers published between 2000 and June 2021 by the Royal Society of Chemistry (RSC), Elsevier, and Springer. The list of DOIs can be found on [GitHub](https://github.com/ShuHuang/batterybert/blob/main/corpus.txt).
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 31,090. The inputs of the model are
then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
The details of the masking procedure for each sentence are the following:
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
### Pretraining
The model was trained on 8 NVIDIA DGX A100 GPUs for 1,000,000 steps with a batch size of 256. The sequence length was limited to 512 tokens. The optimizer used is Adam with a learning rate of 2e-5, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.
## Intended uses & limitations
You can use the raw model for masked language modeling, but it's mostly intended to be fine-tuned on a downstream task.
See the [model hub](https://huggingface.co/models?filter=batterybert) to look for fine-tuned versions on a task that
interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at models like GPT-2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='batterydata/batteryscibert-uncased')
>>> unmasker("Hello I'm a [MASK] model.")
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('batterydata/batteryscibert-uncased')
model = BertModel.from_pretrained('batterydata/batteryscibert-uncased')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('batterydata/batteryscibert-uncased')
model = TFBertModel.from_pretrained('batterydata/batteryscibert-uncased')
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
## Evaluation results
Final loss: 1.095.
## Authors
Shu Huang: `sh2009 [at] cam.ac.uk`
Jacqueline Cole: `jmc61 [at] cam.ac.uk`
## Citation
BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement
|
batterydata/batteryscibert-uncased-abstract
|
batterydata
| 2022-03-05T14:54:59Z | 25 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bert",
"text-classification",
"Text Classification",
"en",
"dataset:batterydata/paper-abstracts",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2022-03-02T23:29:05Z |
---
language: en
tags: Text Classification
license: apache-2.0
datasets:
- batterydata/paper-abstracts
metrics: glue
---
# BatterySciBERT-uncased for Battery Abstract Classification
**Language model:** batteryscibert-uncased
**Language:** English
**Downstream-task:** Text Classification
**Training data:** training\_data.csv
**Eval data:** val\_data.csv
**Code:** See [example](https://github.com/ShuHuang/batterybert)
**Infrastructure**: 8x DGX A100
## Hyperparameters
```
batch_size = 32
n_epochs = 14
base_LM_model = "batteryscibert-uncased"
learning_rate = 2e-5
```
## Performance
```
"Validation accuracy": 97.12,
"Test accuracy": 97.47,
```
## Usage
### In Transformers
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline
model_name = "batterydata/batteryscibert-uncased-abstract"
# a) Get predictions
nlp = pipeline('text-classification', model=model_name, tokenizer=model_name)
text = 'The typical non-aqueous electrolyte for commercial Li-ion cells is a solution of LiPF6 in linear and cyclic carbonates.'
res = nlp(text)
# b) Load model & tokenizer
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
## Authors
Shu Huang: `sh2009 [at] cam.ac.uk`
Jacqueline Cole: `jmc61 [at] cam.ac.uk`
## Citation
BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement
|
batterydata/batteryonlybert-cased-abstract
|
batterydata
| 2022-03-05T14:54:53Z | 5 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bert",
"text-classification",
"Text Classification",
"en",
"dataset:batterydata/paper-abstracts",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2022-03-02T23:29:05Z |
---
language: en
tags: Text Classification
license: apache-2.0
datasets:
- batterydata/paper-abstracts
metrics: glue
---
# BatteryOnlyBERT-cased for Battery Abstract Classification
**Language model:** batteryonlybert-cased
**Language:** English
**Downstream-task:** Text Classification
**Training data:** training\_data.csv
**Eval data:** val\_data.csv
**Code:** See [example](https://github.com/ShuHuang/batterybert)
**Infrastructure**: 8x DGX A100
## Hyperparameters
```
batch_size = 32
n_epochs = 14
base_LM_model = "batteryonlybert-cased"
learning_rate = 2e-5
```
## Performance
```
"Validation accuracy": 97.33,
"Test accuracy": 97.34,
```
## Usage
### In Transformers
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline
model_name = "batterydata/batteryonlybert-cased-abstract"
# a) Get predictions
nlp = pipeline('text-classification', model=model_name, tokenizer=model_name)
text = 'The typical non-aqueous electrolyte for commercial Li-ion cells is a solution of LiPF6 in linear and cyclic carbonates.'
res = nlp(text)
# b) Load model & tokenizer
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
## Authors
Shu Huang: `sh2009 [at] cam.ac.uk`
Jacqueline Cole: `jmc61 [at] cam.ac.uk`
## Citation
BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement
|
batterydata/batteryscibert-cased-abstract
|
batterydata
| 2022-03-05T14:54:32Z | 8 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bert",
"text-classification",
"Text Classification",
"en",
"dataset:batterydata/paper-abstracts",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2022-03-02T23:29:05Z |
---
language: en
tags: Text Classification
license: apache-2.0
datasets:
- batterydata/paper-abstracts
metrics: glue
---
# BatterySciBERT-cased for Battery Abstract Classification
**Language model:** batteryscibert-cased
**Language:** English
**Downstream-task:** Text Classification
**Training data:** training\_data.csv
**Eval data:** val\_data.csv
**Code:** See [example](https://github.com/ShuHuang/batterybert)
**Infrastructure**: 8x DGX A100
## Hyperparameters
```
batch_size = 32
n_epochs = 11
base_LM_model = "batteryscibert-cased"
learning_rate = 2e-5
```
## Performance
```
"Validation accuracy": 97.06,
"Test accuracy": 97.19,
```
## Usage
### In Transformers
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline
model_name = "batterydata/batteryscibert-cased-abstract"
# a) Get predictions
nlp = pipeline('text-classification', model=model_name, tokenizer=model_name)
text = 'The typical non-aqueous electrolyte for commercial Li-ion cells is a solution of LiPF6 in linear and cyclic carbonates.'
res = nlp(text)
# b) Load model & tokenizer
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
## Authors
Shu Huang: `sh2009 [at] cam.ac.uk`
Jacqueline Cole: `jmc61 [at] cam.ac.uk`
## Citation
BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement
|
batterydata/bert-base-cased-abstract
|
batterydata
| 2022-03-05T14:42:16Z | 7 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bert",
"text-classification",
"Text Classification",
"en",
"dataset:batterydata/paper-abstracts",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2022-03-02T23:29:05Z |
---
language: en
tags: Text Classification
license: apache-2.0
datasets:
- batterydata/paper-abstracts
metrics: glue
---
# BERT-base-cased for Battery Abstract Classification
**Language model:** bert-base-cased
**Language:** English
**Downstream-task:** Text Classification
**Training data:** training\_data.csv
**Eval data:** val\_data.csv
**Code:** See [example](https://github.com/ShuHuang/batterybert)
**Infrastructure**: 8x DGX A100
## Hyperparameters
```
batch_size = 32
n_epochs = 15
base_LM_model = "bert-base-cased"
learning_rate = 2e-5
```
## Performance
```
"Validation accuracy": 96.84,
"Test accuracy": 96.83,
```
## Usage
### In Transformers
```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline
model_name = "batterydata/bert-base-cased-abstract"
# a) Get predictions
nlp = pipeline('text-classification', model=model_name, tokenizer=model_name)
text = 'The typical non-aqueous electrolyte for commercial Li-ion cells is a solution of LiPF6 in linear and cyclic carbonates.'
res = nlp(text)
# b) Load model & tokenizer
model = AutoModelForSequenceClassification.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
## Authors
Shu Huang: `sh2009 [at] cam.ac.uk`
Jacqueline Cole: `jmc61 [at] cam.ac.uk`
## Citation
BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement
|
batterydata/batterybert-cased-squad-v1
|
batterydata
| 2022-03-05T13:50:54Z | 5,999 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"question answering",
"en",
"dataset:squad",
"dataset:batterydata/battery-device-data-qa",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2022-03-02T23:29:05Z |
---
language: en
tags: question answering
license: apache-2.0
datasets:
- squad
- batterydata/battery-device-data-qa
metrics: squad
---
# BatteryBERT-cased for QA
**Language model:** batterybert-cased
**Language:** English
**Downstream-task:** Extractive QA
**Training data:** SQuAD v1
**Eval data:** SQuAD v1
**Code:** See [example](https://github.com/ShuHuang/batterybert)
**Infrastructure**: 8x DGX A100
## Hyperparameters
```
batch_size = 16
n_epochs = 4
base_LM_model = "batterybert-cased"
max_seq_len = 386
learning_rate = 2e-5
doc_stride = 128
max_query_length = 64
```
## Performance
Evaluated on the SQuAD v1.0 dev set.
```
"exact": 81.54,
"f1": 89.16,
```
Evaluated on the battery device dataset.
```
"precision": 70.74,
"recall": 84.19,
```
## Usage
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "batterydata/batterybert-cased-squad-v1"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
'question': 'What is the electrolyte?',
'context': 'The typical non-aqueous electrolyte for commercial Li-ion cells is a solution of LiPF6 in linear and cyclic carbonates.'
}
res = nlp(QA_input)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
## Authors
Shu Huang: `sh2009 [at] cam.ac.uk`
Jacqueline Cole: `jmc61 [at] cam.ac.uk`
## Citation
BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement
|
naam/xlm-roberta-base-finetuned-panx-de
|
naam
| 2022-03-05T13:48:33Z | 7 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"xlm-roberta",
"token-classification",
"generated_from_trainer",
"dataset:xtreme",
"license:mit",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2022-03-05T13:36:41Z |
---
license: mit
tags:
- generated_from_trainer
datasets:
- xtreme
metrics:
- f1
model-index:
- name: xlm-roberta-base-finetuned-panx-de
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: xtreme
type: xtreme
args: PAN-X.de
metrics:
- name: F1
type: f1
value: 0.8594910162670748
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-panx-de
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the xtreme dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1348
- F1: 0.8595
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.2556 | 1.0 | 525 | 0.1629 | 0.8218 |
| 0.1309 | 2.0 | 1050 | 0.1378 | 0.8522 |
| 0.0812 | 3.0 | 1575 | 0.1348 | 0.8595 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.9.1
- Datasets 1.16.1
- Tokenizers 0.10.3
|
nielsr/segformer-b0-finetuned-segments-sidewalk
|
nielsr
| 2022-03-05T09:39:11Z | 9 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"segformer",
"vision",
"image-segmentation",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
image-segmentation
| 2022-03-05T08:17:45Z |
---
license: apache-2.0
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-sidewalk
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-finetuned-segments-sidewalk
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5679
- Miou: 0.2769
- Macc: 0.3331
- Overall Accuracy: 0.8424
- Per Category Iou: [nan, 0.7174911859423314, 0.8790751054409742, 0.6065232798410057, 0.6975274018055722, 0.3486407385349508, nan, 0.40093167116703843, 0.28779837903852556, 0.0, 0.7870339041746186, 0.0, 0.0, 0.0, 0.0, 0.1464360606454247, 0.0, 0.0, 0.6770283275082656, 0.0, 0.338555175257431, 0.14697310016578427, 0.0, nan, 0.0, 0.27163002251763635, 0.0, 0.0, 0.8257437911843676, 0.7169333376341568, 0.9108105550493353, 0.0, 0.0, 0.1016801552778885, 0.0]
- Per Category Accuracy: [nan, 0.9199960254104915, 0.9327745517652714, 0.7304629327758765, 0.7378309547498484, 0.45295941407150275, nan, 0.5188608021128075, 0.5327441812670195, 0.0, 0.9353764765979435, 0.0, 0.0, 0.0, 0.0, 0.1588525415198792, 0.0, 0.0, 0.9238854794385364, 0.0, 0.4400394213522207, 0.15130051149615126, 0.0, nan, 0.0, 0.3570096986572905, 0.0, 0.0, 0.9359897980968498, 0.8570458108260572, 0.9549583230619891, 0.0, 0.0, 0.11786971668879294, 0.0]
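A minimal inference sketch (not part of the auto-generated card); it assumes the standard Segformer classes and that the repository ships a feature extractor (otherwise load one from `nvidia/mit-b0`):
```python
import torch
from PIL import Image
from transformers import SegformerFeatureExtractor, SegformerForSemanticSegmentation

model_name = "nielsr/segformer-b0-finetuned-segments-sidewalk"
extractor = SegformerFeatureExtractor.from_pretrained(model_name)
model = SegformerForSemanticSegmentation.from_pretrained(model_name)

image = Image.new("RGB", (512, 512))  # stand-in for a real street scene
inputs = extractor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)
pred = logits.argmax(dim=1)          # per-pixel class indices
```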
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Acc | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:----------------:|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|
| 1.357 | 1.0 | 400 | 1.0006 | 0.1632 | 0.2069 | 0.7524 | [nan, 0.5642795884663824, 0.7491853309192827, 0.0, 0.40589649630192104, 0.02723606910696284, nan, 0.0002207740938439576, 0.0, 0.0, 0.6632462867093903, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.5671699281129761, 0.0, 0.0009207911027492868, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.7507253434892517, 0.6157793573905029, 0.8774768871968204, 0.0, 0.0, 0.0, 0.0] | [nan, 0.6839993330882016, 0.9786792586618772, 0.0, 0.4818162160949784, 0.02785198456498826, nan, 0.00022133459131411787, 0.0, 0.0, 0.9043689536433023, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8606078323791991, 0.0, 0.0009210330367246509, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.895198618615298, 0.8549807032886052, 0.9328734839751688, 0.0, 0.0, 0.0, 0.0] |
| 1.6346 | 2.0 | 800 | 0.7856 | 0.1903 | 0.2334 | 0.7917 | [nan, 0.6276046255936906, 0.8379492348238635, 0.0, 0.5220035981992285, 0.19441920935217594, nan, 0.16135703555333, 0.0, 0.0, 0.7357165628674137, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.567598980063164, 0.0, 0.07867871139133086, 0.0, 0.0, nan, 0.0, 0.02123705398363847, 0.0, 0.0, 0.7917172051343153, 0.6589515948064048, 0.8916684207946344, 0.0, 0.0, 0.00013685918191589503, 0.0] | [nan, 0.8610263337355926, 0.9499345560017969, 0.0, 0.5908796687797819, 0.2144081438468206, nan, 0.1813236746419022, 0.0, 0.0, 0.8825551027577866, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.9239907140298015, 0.0, 0.08495225520298297, 0.0, 0.0, nan, 0.0, 0.021302829364985724, 0.0, 0.0, 0.9258397010509258, 0.8834861376443207, 0.9489131468773239, 0.0, 0.0, 0.0001372777815910495, 0.0] |
| 0.659 | 3.0 | 1200 | 0.6798 | 0.2215 | 0.2687 | 0.8107 | [nan, 0.6728474586764454, 0.8404607924530816, 0.21147709475332813, 0.5407350347311378, 0.23535489130104167, nan, 0.3087159264982809, 0.0060319580742948155, 0.0, 0.7331305064022374, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.6378031991744924, 0.0, 0.35289337122777764, 6.24997656258789e-05, 0.0, nan, 0.0, 0.14698390926256938, 0.0, 0.0, 0.8019042204623998, 0.669283249725758, 0.8928145424856038, 0.0, 0.0, 0.03847722460691187, 0.0] | [nan, 0.866012011452706, 0.9627112260298595, 0.21236715482371135, 0.5645869262075475, 0.2750610095322395, nan, 0.3857655597748765, 0.0060319580742948155, 0.0, 0.939196440844118, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.8380282443529743, 0.0, 0.5749902063170915, 6.256068386334744e-05, 0.0, nan, 0.0, 0.1605725590139305, 0.0, 0.0, 0.9212803460870584, 0.8870298583701837, 0.959700359744241, 0.0, 0.0, 0.04453994364914478, 0.0] |
| 0.5481 | 4.0 | 1600 | 0.5999 | 0.2522 | 0.2998 | 0.8312 | [nan, 0.7078353465279917, 0.8661728761172196, 0.3857324719136883, 0.6338278880825696, 0.3440050078187208, nan, 0.35980405625532347, 0.23875867241702606, 0.0, 0.773703347865372, 0.0, 0.0, 0.0, 0.0, 0.0004931363471679884, 0.0, 0.0, 0.6554146448850521, 0.0, 0.367673493717809, 0.03089804641909161, 0.0, nan, 0.0, 0.21529017459808872, 0.0, 0.0, 0.818951849158376, 0.7007504838794707, 0.9053929635423027, 0.0, 0.0, 0.06626212301200333, 0.0] | [nan, 0.8955207784307155, 0.9536263694097721, 0.39712577675621036, 0.6989299616008556, 0.4248959179453637, nan, 0.42984959564233455, 0.26168627652468784, 0.0, 0.9055166364779607, 0.0, 0.0, 0.0, 0.0, 0.0004932058379466533, 0.0, 0.0, 0.8632164276000204, 0.0, 0.6365580872107307, 0.031401709658368616, 0.0, nan, 0.0, 0.2497286263775161, 0.0, 0.0, 0.9296676429517725, 0.8858954297713482, 0.9555756265860916, 0.0, 0.0, 0.0750792276952902, 0.0] |
| 0.7855 | 5.0 | 2000 | 0.5679 | 0.2769 | 0.3331 | 0.8424 | [nan, 0.7174911859423314, 0.8790751054409742, 0.6065232798410057, 0.6975274018055722, 0.3486407385349508, nan, 0.40093167116703843, 0.28779837903852556, 0.0, 0.7870339041746186, 0.0, 0.0, 0.0, 0.0, 0.1464360606454247, 0.0, 0.0, 0.6770283275082656, 0.0, 0.338555175257431, 0.14697310016578427, 0.0, nan, 0.0, 0.27163002251763635, 0.0, 0.0, 0.8257437911843676, 0.7169333376341568, 0.9108105550493353, 0.0, 0.0, 0.1016801552778885, 0.0] | [nan, 0.9199960254104915, 0.9327745517652714, 0.7304629327758765, 0.7378309547498484, 0.45295941407150275, nan, 0.5188608021128075, 0.5327441812670195, 0.0, 0.9353764765979435, 0.0, 0.0, 0.0, 0.0, 0.1588525415198792, 0.0, 0.0, 0.9238854794385364, 0.0, 0.4400394213522207, 0.15130051149615126, 0.0, nan, 0.0, 0.3570096986572905, 0.0, 0.0, 0.9359897980968498, 0.8570458108260572, 0.9549583230619891, 0.0, 0.0, 0.11786971668879294, 0.0] |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.6
|
espnet/YushiUeda_swbd_sentiment_asr_train_asr_conformer
|
espnet
| 2022-03-04T20:49:03Z | 0 | 1 |
espnet
|
[
"espnet",
"audio",
"automatic-speech-recognition",
"en",
"dataset:swbd_sentiment",
"arxiv:1804.00015",
"license:cc-by-4.0",
"region:us"
] |
automatic-speech-recognition
| 2022-03-04T20:47:02Z |
---
tags:
- espnet
- audio
- automatic-speech-recognition
language: en
datasets:
- swbd_sentiment
license: cc-by-4.0
---
## ESPnet2 ASR model
### `espnet/YushiUeda_swbd_sentiment_asr_train_asr_conformer`
This model was trained by YushiUeda using the swbd_sentiment recipe in [espnet](https://github.com/espnet/espnet/).
### Demo: How to use in ESPnet2
```bash
cd espnet
git checkout e5c0e0dbdab7e56ea9bf0a852bac10a1d99acf64
pip install -e .
cd egs2/swbd_sentiment/asr1
./run.sh --skip_data_prep false --skip_train true --download_model espnet/YushiUeda_swbd_sentiment_asr_train_asr_conformer
```
<!-- Generated by scripts/utils/show_asr_result.sh -->
# RESULTS
## Environments
- date: `Thu Mar 3 21:34:18 EST 2022`
- python version: `3.7.11 (default, Jul 27 2021, 14:32:16) [GCC 7.5.0]`
- espnet version: `espnet 0.10.7a1`
- pytorch version: `pytorch 1.9.0+cu102`
- Git hash: `3b53aedc654fd30a828689c2139a1e130adac077`
- Commit date: `Fri Feb 25 00:13:16 2022 -0500`
## Conformer-based encoder and Transformer-based decoder with spectral augmentation, jointly predicting the transcript and sentiment label
- ASR config: [conf/tuning/train_asr_conformer.yaml](conf/tuning/train_asr_conformer.yaml)
- token_type: word
- labels: Positive, Neutral, Negative
|dataset|Snt|Sentiment Classification Macro F1 (%)|Weighted F1 (%)|Micro F1 (%)|
|---|---|---|---|---|
|decode_asr_asr_model_valid.acc.ave_10best/valid|2415|61.0|65.0|65.6|
|decode_asr_asr_model_valid.acc.ave_10best/test|2438|61.4|64.4|64.6|
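Note that with `token_type: word`, the sentiment labels (Positive, Neutral, Negative) are ordinary entries in the token list below, so the model emits the label inline with the recognized words. A minimal post-processing sketch (the helper name is ours, not part of the recipe):

```python
SENTIMENTS = {"Positive", "Neutral", "Negative"}

def split_hypothesis(hyp: str):
    """Separate the predicted sentiment token from the transcript words."""
    words = hyp.split()
    label = next((w for w in words if w in SENTIMENTS), None)
    transcript = " ".join(w for w in words if w not in SENTIMENTS)
    return label, transcript

print(split_hypothesis("Positive i really enjoyed that"))
# ('Positive', 'i really enjoyed that')
```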
## ASR config
<details><summary>expand</summary>
```
config: conf/train_asr.yaml
print_config: false
log_level: INFO
dry_run: false
iterator_type: sequence
output_dir: exp/asr_train_asr_raw_en_word
ngpu: 1
seed: 0
num_workers: 1
num_att_plot: 3
dist_backend: nccl
dist_init_method: env://
dist_world_size: 4
dist_rank: 0
local_rank: 0
dist_master_addr: localhost
dist_master_port: 42025
dist_launcher: null
multiprocessing_distributed: true
unused_parameters: false
sharded_ddp: false
cudnn_enabled: true
cudnn_benchmark: false
cudnn_deterministic: true
collect_stats: false
write_collected_feats: false
max_epoch: 50
patience: null
val_scheduler_criterion:
- valid
- loss
early_stopping_criterion:
- valid
- loss
- min
best_model_criterion:
- - valid
- acc
- max
keep_nbest_models: 10
nbest_averaging_interval: 0
grad_clip: 5.0
grad_clip_type: 2.0
grad_noise: false
accum_grad: 3
no_forward_run: false
resume: true
train_dtype: float32
use_amp: false
log_interval: null
use_matplotlib: true
use_tensorboard: true
use_wandb: false
wandb_project: null
wandb_id: null
wandb_entity: null
wandb_name: null
wandb_model_log_interval: -1
detect_anomaly: false
pretrain_path: null
init_param: []
ignore_init_mismatch: false
freeze_param: []
num_iters_per_epoch: null
batch_size: 20
valid_batch_size: null
batch_bins: 40000000
valid_batch_bins: null
train_shape_file:
- exp/asr_stats_raw_en_word/train/speech_shape
- exp/asr_stats_raw_en_word/train/text_shape.word
valid_shape_file:
- exp/asr_stats_raw_en_word/valid/speech_shape
- exp/asr_stats_raw_en_word/valid/text_shape.word
batch_type: numel
valid_batch_type: null
fold_length:
- 80000
- 150
sort_in_batch: descending
sort_batch: descending
multiple_iterator: false
chunk_length: 500
chunk_shift_ratio: 0.5
num_cache_chunks: 1024
train_data_path_and_name_and_type:
- - dump/raw/train/wav.scp
- speech
- sound
- - dump/raw/train/text
- text
- text
valid_data_path_and_name_and_type:
- - dump/raw/dev/wav.scp
- speech
- sound
- - dump/raw/dev/text
- text
- text
allow_variable_data_keys: false
max_cache_size: 0.0
max_cache_fd: 32
valid_max_cache_size: null
optim: adam
optim_conf:
lr: 0.0025
scheduler: warmuplr
scheduler_conf:
warmup_steps: 40000
token_list:
- <blank>
- <unk>
- i
- and
- the
- you
- that
- it
- a
- Neutral
- to
- uh
- '''s'
- of
- know
- Positive
- they
- in
- we
- '''t'
- have
- but
- so
- was
- like
- Negative
- yeah
- is
- just
- um
- well
- do
- for
- think
- don
- there
- or
- 'on'
- '''re'
- my
- what
- really
- be
- with
- not
- if
- are
- one
- he
- '''ve'
- because
- '''m'
- about
- all
- get
- can
- had
- out
- at
- them
- when
- this
- as
- oh
- lot
- up
- people
- some
- then
- would
- go
- right
- mean
- now
- time
- kind
- got
- going
- good
- she
- things
- more
- were
- from
- something
- been
- 'no'
- see
- me
- too
- an
- your
- much
- little
- guess
- how
- where
- our
- very
- here
- their
- thing
- two
- '''ll'
- other
- did
- years
- work
- even
- has
- any
- way
- probably
- those
- could
- say
- real
- back
- '''d'
- year
- down
- home
- than
- want
- didn
- into
- pretty
- okay
- who
- take
- huh
- school
- said
- make
- over
- kids
- never
- always
- put
- by
- her
- stuff
- went
- doing
- three
- these
- 'yes'
- which
- around
- only
- big
- maybe
- 'off'
- anything
- day
- t
- sure
- actually
- come
- money
- him
- different
- everything
- still
- used
- many
- five
- will
- sort
- nice
- us
- last
- his
- thought
- every
- most
- getting
- first
- feel
- bit
- need
- children
- same
- course
- also
- new
- care
- family
- hum
- long
- through
- before
- use
- done
- should
- house
- old
- let
- does
- car
- being
- problem
- doesn
- four
- seems
- though
- pay
- look
- whole
- great
- husband
- haven
- try
- live
- trying
- ever
- why
- read
- better
- find
- far
- keep
- ago
- sometimes
- watch
- interesting
- quite
- area
- hard
- talking
- else
- another
- part
- bad
- having
- twenty
- whatever
- place
- couple
- usually
- 'true'
- high
- texas
- seen
- fact
- s
- enough
- after
- own
- college
- while
- country
- hundred
- somebody
- few
- either
- times
- week
- away
- gonna
- type
- job
- six
- dollars
- tell
- might
- remember
- again
- came
- give
- started
- start
- ten
- made
- play
- able
- dallas
- enjoy
- working
- once
- c
- someone
- life
- least
- v
- everybody
- since
- fun
- both
- talk
- wouldn
- ones
- news
- anyway
- wasn
- person
- heard
- believe
- am
- th
- buy
- may
- point
- call
- night
- y
- almost
- bye
- isn
- system
- wanted
- called
- took
- state
- wife
- child
- half
- women
- goes
- next
- yet
- especially
- love
- looking
- parents
- gone
- such
- gets
- understand
- together
- movie
- until
- w
- days
- end
- saying
- idea
- saw
- music
- mother
- thirty
- couldn
- makes
- stay
- change
- m
- basically
- wonderful
- problems
- guy
- worked
- spend
- help
- lived
- credit
- whether
- seem
- eight
- n
- best
- world
- run
- hear
- bought
- young
- each
- months
- seven
- places
- supposed
- city
- matter
- coming
- exactly
- d
- small
- summer
- comes
- certain
- company
- less
- thinking
- won
- during
- b
- thousand
- agree
- show
- daughter
- sounds
- myself
- funny
- water
- o
- month
- dog
- fifty
- paper
- gotten
- found
- taking
- today
- certainly
- boy
- friends
- number
- mine
- program
- food
- son
- p
- older
- name
- air
- movies
- government
- moved
- schools
- outside
- deal
- close
- tried
- paying
- eat
- drive
- hours
- nine
- rather
- cars
- crime
- important
- war
- living
- between
- business
- anymore
- reason
- weeks
- public
- vote
- situation
- recently
- nothing
- easy
- sit
- pick
- taxes
- turn
- full
- percent
- making
- friend
- book
- happen
- minutes
- middle
- town
- watching
- paid
- eighty
- tax
- several
- listen
- set
- talked
- north
- takes
- reading
- definitely
- law
- jury
- kinds
- married
- u
- enjoyed
- says
- without
- works
- learn
- everyone
- drug
- major
- side
- cost
- room
- education
- morning
- computer
- involved
- mostly
- aren
- health
- l
- anybody
- along
- amount
- man
- against
- weather
- often
- under
- age
- forty
- insurance
- favorite
- hope
- card
- must
- happened
- lives
- left
- drugs
- expensive
- american
- miles
- yourself
- hour
- already
- plano
- cards
- decided
- large
- difference
- ahead
- fifteen
- camping
- told
- although
- second
- r
- woman
- twelve
- knew
- guys
- cut
- neat
- fish
- mind
- wrong
- unless
- sense
- instead
- leave
- wear
- class
- hand
- top
- walk
- bring
- past
- f
- running
- e
- absolutely
- weekend
- line
- books
- question
- team
- wish
- exercise
- interested
- areas
- baby
- states
- liked
- somewhere
- father
- experience
- phone
- case
- men
- lots
- cat
- society
- taken
- changed
- game
- worth
- seventy
- gun
- h
- wonder
- hit
- group
- service
- kept
- shows
- gosh
- early
- interest
- trouble
- control
- themselves
- ha
- finally
- using
- god
- dad
- cook
- hot
- difficult
- nursing
- front
- terms
- growing
- late
- kid
- looked
- felt
- rain
- teach
- tend
- realize
- weren
- sixty
- except
- needs
- social
- budget
- figure
- recycling
- lake
- wanna
- looks
- wh
- forth
- mom
- concerned
- south
- grew
- topic
- ways
- death
- christmas
- regular
- wait
- imagine
- television
- east
- trees
- check
- fairly
- hate
- general
- catch
- dinner
- built
- ready
- fine
- sister
- story
- playing
- starting
- homes
- office
- awful
- radio
- needed
- companies
- changes
- programs
- fishing
- nineteen
- ask
- tough
- cans
- easier
- yard
- cold
- ought
- street
- later
- door
- wants
- students
- national
- space
- across
- brother
- free
- local
- tha
- level
- happens
- sitting
- newspaper
- move
- countries
- store
- subject
- girl
- beautiful
- turned
- soon
- income
- putting
- church
- university
- dress
- information
- lately
- degree
- york
- vacation
- pollution
- totally
- winter
- america
- ah
- ours
- cats
- spent
- happy
- played
- consider
- cases
- spring
- california
- longer
- teacher
- oil
- send
- lost
- sports
- garden
- teachers
- families
- particular
- buying
- amazing
- likes
- football
- united
- teaching
- hey
- benefits
- brought
- gave
- party
- worry
- throw
- testing
- given
- bunch
- near
- nobody
- community
- driving
- open
- personal
- sell
- force
- chance
- wow
- test
- baseball
- within
- biggest
- quality
- building
- example
- seeing
- power
- afford
- support
- caught
- inside
- plan
- seemed
- ninety
- younger
- learned
- generation
- charge
- punishment
- rest
- dogs
- become
- clean
- short
- privacy
- g
- calls
- plus
- particularly
- decide
- terrible
- twice
- fall
- extra
- period
- choice
- hold
- ended
- hadn
- main
- guilty
- depends
- save
- excellent
- price
- strange
- feeling
- size
- trial
- military
- boys
- per
- bet
- judge
- parts
- noticed
- anywhere
- fan
- head
- center
- glad
- clothes
- rate
- stop
- eleven
- white
- stand
- suppose
- guns
- grade
- watched
- bigger
- scary
- issue
- special
- dollar
- green
- its
- jobs
- means
- black
- worse
- knows
- plastic
- low
- spending
- picked
- golf
- gas
- single
- neighborhood
- necessarily
- alone
- cooking
- newspapers
- pull
- fast
- completely
- road
- student
- crimes
- houses
- paint
- medical
- learning
- fair
- restaurant
- miss
- lawn
- giving
- washington
- doctor
- word
- killed
- recycle
- light
- cash
- visit
- familiar
- grass
- itself
- season
- chicken
- rid
- president
- stayed
- normally
- whenever
- machine
- graduate
- eighteen
- capital
- shouldn
- virginia
- private
- field
- magazines
- kill
- market
- apartment
- anyone
- waiting
- asked
- classes
- break
- crazy
- helps
- aware
- sunday
- hm
- speak
- term
- sound
- property
- sad
- comfortable
- waste
- channel
- evening
- cover
- heavy
- carry
- everyday
- systems
- gives
- wa
- answer
- higher
- unfortunately
- minute
- future
- serious
- snow
- available
- smaller
- handle
- ground
- behind
- huge
- west
- plant
- allowed
- wind
- peace
- costs
- cause
- serve
- rent
- lucky
- gee
- build
- english
- telling
- lose
- individual
- gardening
- busy
- order
- raised
- basic
- basis
- rock
- training
- happening
- opinion
- heart
- follow
- mainly
- history
- walking
- ye
- average
- towards
- houston
- games
- travel
- decision
- environment
- respect
- list
- hopefully
- grow
- others
- sorry
- san
- taught
- weight
- bags
- hurt
- finding
- attention
- hasn
- computers
- raise
- aerobics
- quick
- shot
- personally
- bedroom
- similar
- loved
- sixties
- park
- helping
- feet
- industry
- write
- generally
- weird
- record
- benefit
- pool
- mail
- pennsylvania
- glass
- notice
- calling
- process
- land
- originally
- richardson
- cities
- afraid
- utah
- entire
- colorado
- ball
- boat
- grandmother
- possible
- folks
- helped
- strong
- keeping
- bill
- keeps
- thank
- camp
- third
- types
- eventually
- obviously
- yesterday
- apparently
- instance
- pet
- central
- club
- flowers
- trash
- trip
- classical
- europe
- changing
- perhaps
- self
- color
- foot
- video
- based
- station
- saturday
- french
- normal
- fire
- '''clock'
- issues
- starts
- piece
- hobby
- quit
- prison
- parent
- oldest
- bush
- coverage
- police
- forget
- girls
- occasionally
- bank
- shape
- beginning
- moving
- sent
- vietnam
- nights
- current
- salary
- himself
- stories
- mountains
- aluminum
- luck
- invasion
- tape
- florida
- bed
- laws
- research
- mess
- hoping
- players
- tired
- thirteen
- magazine
- expect
- sleep
- words
- language
- push
- position
- hobbies
- background
- plants
- inches
- easily
- stopped
- murder
- shoot
- maryland
- hardly
- bills
- attitude
- pro
- civil
- sometime
- human
- wanting
- goodness
- security
- doctors
- kitchen
- somehow
- penalty
- county
- eating
- simply
- die
- bike
- reunion
- project
- typical
- j
- however
- total
- mexico
- base
- economy
- restaurants
- responsibility
- jail
- lower
- died
- tested
- safe
- voting
- elderly
- sh
- listening
- sudden
- numbers
- career
- stick
- born
- wondering
- poor
- painting
- active
- professional
- supposedly
- li
- lady
- reasons
- cool
- sixteen
- yep
- excuse
- horrible
- political
- red
- science
- federal
- besides
- shop
- opportunity
- ride
- planning
- degrees
- writing
- mexican
- engineering
- surprised
- bother
- share
- graduated
- account
- financial
- hands
- activities
- seventies
- step
- thanks
- bag
- role
- england
- limit
- willing
- hospital
- view
- band
- teams
- tonight
- groups
- advantage
- heat
- department
- turns
- tree
- telephone
- became
- brand
- criminal
- blue
- dry
- warm
- weekends
- grown
- stores
- rights
- garbage
- junior
- everywhere
- prices
- metric
- ran
- equipment
- till
- cross
- considered
- track
- moment
- figured
- americans
- met
- worst
- ridiculous
- grocery
- yours
- neighbor
- piano
- sold
- cowboys
- selling
- savings
- grandchildren
- nowadays
- add
- plays
- conversation
- lunch
- straight
- sentence
- floor
- dead
- fourteen
- meet
- ideas
- foods
- israel
- fix
- ourselves
- swimming
- upset
- sign
- sewing
- wood
- recipe
- van
- upon
- standard
- box
- win
- wall
- offer
- products
- otherwise
- pounds
- stations
- ex
- staying
- drop
- body
- carolina
- sales
- meal
- ice
- basketball
- mixed
- careful
- possibly
- sick
- farm
- retired
- compared
- western
- hearing
- finished
- separate
- mentioned
- soviet
- truck
- river
- defense
- oklahoma
- harder
- k
- re
- stuck
- cable
- trade
- favor
- positive
- related
- smoke
- effect
- various
- bottom
- awhile
- kindergarten
- beat
- court
- beach
- baltimore
- choose
- allow
- brown
- hang
- known
- sorts
- bathroom
- scared
- popular
- extremely
- politics
- hair
- policy
- wha
- saint
- covered
- ca
- sisters
- boston
- lakes
- forever
- fight
- downtown
- visa
- sauce
- garage
- lines
- suit
- whereas
- speech
- direction
- animals
- corps
- fit
- majority
- chinese
- dark
- painted
- milk
- concern
- dump
- nature
- safety
- shoes
- star
- questions
- switch
- clear
- trips
- management
- beyond
- depending
- sing
- iraq
- pressure
- cute
- runs
- windows
- salad
- board
- chicago
- population
- legal
- super
- '''all'
- puts
- slow
- pets
- forward
- thousands
- style
- debt
- becoming
- mo
- pop
- violent
- italian
- earlier
- cheap
- weapons
- coast
- austin
- traveling
- passed
- x
- speaking
- points
- prefer
- threat
- further
- master
- table
- broken
- random
- row
- northern
- simple
- appreciate
- district
- train
- continue
- rangers
- pittsburgh
- truth
- value
- quickly
- raising
- pass
- tennis
- flower
- bass
- engine
- becomes
- variety
- jeans
- exciting
- organization
- spread
- sat
- incredible
- somewhat
- loan
- engineer
- doubt
- southern
- monday
- backyard
- forced
- papers
- express
- saving
- owned
- recent
- toward
- fortunate
- liberal
- shopping
- rough
- brothers
- worried
- meals
- scouts
- vacations
- hunting
- lawyers
- wisconsin
- bucks
- act
- voice
- helpful
- wide
- retirement
- cannot
- picture
- picking
- suspect
- spare
- held
- election
- study
- report
- begin
- antonio
- drove
- opposed
- league
- ju
- se
- solution
- closer
- character
- finish
- knowing
- million
- common
- services
- thinks
- player
- violence
- wrote
- highway
- reasonable
- afternoon
- series
- developed
- effort
- christian
- fantastic
- saved
- seventeen
- barbecue
- sun
- conditioning
- ohio
- babies
- arlington
- hole
- visited
- rural
- herself
- knowledge
- kn
- plans
- instruments
- above
- border
- bible
- losing
- china
- events
- leaving
- written
- taste
- friday
- schedule
- anytime
- showed
- aspect
- range
- earth
- rice
- broke
- tent
- excited
- roles
- situations
- rooms
- spot
- laid
- duty
- bottles
- russia
- fighting
- pound
- letter
- convenient
- thi
- storm
- original
- wild
- showing
- percentage
- required
- grandparents
- extent
- economic
- voted
- canada
- trust
- healthy
- dealing
- face
- hired
- discuss
- larger
- pleased
- eye
- constantly
- perfect
- stupid
- square
- mix
- meat
- semester
- necessary
- mandatory
- burning
- fly
- mothers
- aids
- checked
- bedrooms
- fresh
- advice
- tomatoes
- treat
- sale
- ford
- japanese
- burn
- correct
- limited
- sleeping
- actual
- ends
- female
- hundreds
- feelings
- impact
- leaves
- section
- lay
- provide
- planted
- factor
- fill
- rich
- deep
- someplace
- drives
- circumstances
- honda
- jersey
- smoking
- feels
- fifties
- access
- doors
- pattern
- names
- payment
- facilities
- automatic
- boxes
- hi
- pictures
- versus
- ability
- edge
- politicians
- amazed
- boss
- union
- neighbors
- distance
- prime
- article
- mistake
- grades
- bread
- bothers
- jeez
- rented
- fourth
- alcohol
- gulf
- catfish
- license
- shooting
- touch
- asking
- realized
- require
- natural
- expenses
- purchase
- energy
- talks
- colors
- smart
- considering
- lessons
- tremendous
- participate
- ages
- missed
- quiet
- cheaper
- cents
- payments
- iron
- frightening
- forgot
- cheese
- daughters
- lawyer
- creek
- dental
- seat
- humid
- belt
- michigan
- extended
- flat
- driver
- foreign
- stays
- adults
- songs
- due
- wet
- double
- stress
- desert
- drink
- material
- equal
- deterrent
- machines
- eastern
- boring
- apart
- vegetables
- recipes
- unusual
- responsible
- hire
- garland
- ho
- dangerous
- loans
- colleges
- served
- prisons
- recycled
- cousins
- gorgeous
- member
- values
- fell
- fund
- metal
- wolves
- technology
- form
- enjoyable
- entertainment
- successful
- juries
- brings
- likely
- convicted
- appeal
- minimum
- opposite
- sport
- complete
- smell
- gallon
- lord
- employees
- centers
- alive
- blow
- meant
- cutting
- relatives
- bus
- commit
- none
- jus
- holding
- sand
- swing
- courses
- ski
- breed
- heck
- casual
- blood
- admit
- join
- fi
- draw
- upper
- bell
- youngest
- traffic
- protect
- tends
- medicine
- strongly
- committed
- opinions
- brick
- sides
- congress
- gasoline
- regularly
- plenty
- collect
- williams
- tickets
- perspective
- damage
- present
- bowl
- kidding
- employee
- tests
- loves
- round
- nations
- german
- roof
- august
- october
- disney
- pieces
- solid
- knock
- facts
- concept
- specific
- option
- jump
- stage
- block
- items
- murders
- breaks
- dirty
- shirts
- package
- pair
- pants
- data
- opera
- standing
- roll
- count
- action
- physical
- differently
- teenagers
- checks
- replace
- independent
- neither
- tuition
- eyes
- theater
- educational
- bins
- animal
- reports
- senior
- window
- curious
- de
- argument
- june
- date
- extreme
- innocent
- december
- germany
- salt
- et
- cetera
- tomorrow
- educated
- clubs
- bird
- sons
- journal
- visiting
- pulled
- letting
- tech
- fixed
- el
- shorts
- assume
- message
- primarily
- signs
- cuts
- john
- jazz
- balance
- un
- walked
- shirt
- dropped
- latin
- feed
- influence
- wondered
- adult
- aid
- inner
- elementary
- negative
- swim
- projects
- raleigh
- practically
- grand
- nearly
- turning
- cleaning
- fort
- recommend
- ate
- skiing
- rules
- yellow
- cruise
- impressed
- address
- labor
- dish
- highly
- repair
- prior
- fee
- terribly
- experiences
- lead
- accept
- mart
- immediately
- portion
- nicer
- seafood
- fault
- disease
- truly
- wearing
- male
- dances
- closed
- product
- expected
- caused
- tapes
- relaxing
- culture
- technical
- criminals
- sentencing
- summertime
- indiana
- killing
- encourage
- housing
- practice
- ups
- stitch
- compare
- sentenced
- freedom
- belong
- purpose
- throwing
- crafts
- pushing
- sweet
- decent
- sew
- campus
- carpet
- channels
- repairs
- preschool
- please
- minnesota
- activity
- naturally
- cooked
- quarterback
- wise
- satisfied
- cadillac
- streets
- businesses
- honest
- automatically
- routine
- coach
- arm
- driven
- dishes
- mornings
- contact
- mall
- deficit
- humidity
- location
- fortunately
- atmosphere
- corporate
- meeting
- improvement
- engineers
- network
- dressed
- mcdonald
- spanish
- catholic
- organizations
- hill
- model
- fifth
- elected
- articles
- expecting
- seriously
- volunteer
- handy
- riding
- threw
- ooh
- trend
- ba
- arts
- thursday
- uncle
- relationship
- members
- throughout
- buffalo
- solve
- pain
- auto
- cholesterol
- planned
- prepared
- presented
- staff
- choices
- march
- filled
- overall
- discipline
- justice
- weights
- mile
- unit
- bringing
- beef
- camped
- wal
- mow
- microwave
- weapon
- inch
- rule
- traveled
- subscribe
- proper
- di
- classic
- software
- pays
- complex
- missing
- shepherd
- pleasure
- st
- cream
- expense
- automobile
- hers
- orleans
- king
- philosophy
- singing
- eighties
- enjoys
- democratic
- significant
- chore
- ev
- combination
- patterns
- disappointed
- republican
- media
- pre
- sesame
- fixing
- seconds
- passing
- daily
- trek
- signed
- raining
- accident
- scale
- interests
- route
- ma
- whoever
- reach
- judges
- evidence
- european
- seasons
- supporting
- dirt
- loose
- france
- cancer
- planting
- iowa
- increase
- hospitals
- maintain
- odd
- pregnant
- math
- press
- agency
- shrimp
- beer
- key
- puppy
- sending
- hardest
- tr
- wi
- return
- corner
- suits
- dakota
- al
- immediate
- possibility
- hooked
- song
- stadium
- frame
- dig
- navy
- comedy
- annual
- fear
- island
- exercising
- fancy
- fat
- enjoying
- motivated
- design
- affect
- investment
- recall
- co
- luxury
- trim
- flexible
- international
- furniture
- potatoes
- wou
- fellow
- breakfast
- bath
- trucks
- uses
- onto
- beans
- apple
- alabama
- records
- musical
- tie
- setting
- offs
- michael
- bugs
- freeze
- anyhow
- properly
- underneath
- dining
- aside
- quarter
- kentucky
- skills
- parole
- parks
- nation
- complain
- wine
- summers
- fans
- golden
- unanimous
- shift
- warranty
- plastics
- rates
- rains
- charged
- lincoln
- decisions
- checking
- gray
- laugh
- hills
- commercial
- recognize
- quote
- receive
- recording
- illegal
- generations
- advance
- motor
- outdoor
- lab
- honestly
- rap
- oriented
- match
- art
- fiction
- manage
- flip
- appropriate
- strict
- mad
- mental
- hung
- adds
- mileage
- bicycle
- thoroughly
- elections
- deserve
- indian
- according
- latest
- bu
- ta
- vehicle
- holidays
- july
- junk
- emergency
- convinced
- graduating
- kick
- including
- teenage
- ceiling
- valley
- victim
- ocean
- hell
- steel
- rainy
- noise
- marvelous
- drunk
- studying
- mountain
- hood
- greatest
- facility
- generate
- desk
- improve
- tells
- sex
- results
- si
- manager
- goal
- teenager
- concert
- copy
- africa
- paycheck
- woods
- lubbock
- sentences
- prevent
- impossible
- split
- faster
- speed
- thin
- chose
- monthly
- stands
- turkey
- repeat
- japan
- financially
- lights
- page
- pulling
- explain
- potential
- rape
- wash
- minor
- thrown
- professor
- pan
- vegetable
- fried
- onions
- roommate
- effects
- wire
- shame
- individuals
- sweat
- scene
- yards
- whose
- thoughts
- draft
- useful
- welfare
- organized
- communities
- realistic
- directly
- print
- printer
- purchased
- aunt
- prepare
- millions
- challenge
- twins
- badly
- thick
- pure
- bar
- roads
- missouri
- tall
- library
- added
- sam
- marriage
- gardens
- lesser
- views
- understanding
- prove
- deer
- delicious
- containers
- depend
- denver
- favorites
- tear
- site
- code
- winds
- parties
- relatively
- opened
- falling
- fascinating
- forties
- options
- sharing
- attached
- owner
- version
- modern
- standpoint
- eaten
- fully
- neck
- trials
- knee
- uncomfortable
- temperature
- chemical
- processing
- fruit
- lovely
- bothered
- pot
- causes
- rea
- diet
- theory
- conflict
- earn
- disagree
- exposed
- administration
- breaking
- buildings
- fence
- shocked
- retire
- wedding
- ch
- dust
- acid
- pushed
- blame
- contract
- carried
- nurse
- overseas
- texan
- fuel
- whe
- vehicles
- increased
- necessity
- plate
- hitting
- reduce
- blocks
- hide
- silly
- length
- writer
- film
- development
- refrigerator
- engines
- louis
- relate
- citizens
- dorm
- began
- hawaii
- january
- wheel
- gourmet
- shots
- bushes
- theirs
- outrageous
- sea
- hook
- conscious
- videos
- mastercard
- suburb
- chevy
- tiny
- mowing
- bulbs
- flag
- detroit
- brakes
- charges
- retriever
- towns
- contribute
- arms
- slacks
- definite
- difficulty
- produce
- cultures
- cou
- discovered
- whatnot
- philadelphia
- ou
- electronic
- strictly
- tendency
- mister
- regard
- con
- approach
- friendly
- handled
- governor
- louisiana
- urban
- develop
- pardon
- construction
- classroom
- personality
- currently
- tour
- apply
- memory
- francisco
- affected
- complicated
- risk
- shock
- roses
- movement
- tied
- teaches
- nuts
- halfway
- softball
- masters
- causing
- cake
- unbelievable
- cast
- characters
- actor
- association
- wallpaper
- habit
- blowing
- expert
- screen
- bake
- dessert
- tents
- minneapolis
- tin
- wars
- steps
- structure
- motivation
- buddy
- minds
- wound
- coat
- holes
- covers
- shell
- tries
- undergraduate
- springs
- banks
- kuwait
- kansas
- established
- dozen
- steak
- following
- massachusetts
- jewish
- affects
- hotel
- sight
- tight
- birthday
- statement
- weeds
- consumer
- understood
- tastes
- cartoons
- apartments
- cares
- settled
- september
- letters
- atlanta
- newer
- guarantee
- citizen
- occasion
- attorneys
- tom
- levels
- sweaters
- tires
- direct
- wagon
- remarkable
- result
- shower
- hello
- commercials
- cassette
- forms
- standards
- james
- native
- falls
- comment
- peers
- wore
- pleasant
- mid
- region
- essentially
- differences
- fitness
- symphony
- finger
- ad
- sounded
- joined
- trained
- toyota
- motors
- aspects
- candidate
- votes
- hunt
- electronics
- charging
- registered
- ed
- electric
- bite
- gifts
- manufacturing
- farmers
- participating
- legislation
- los
- angeles
- ticket
- survive
- catching
- eliminate
- ryan
- luckily
- teeth
- ill
- hated
- offices
- file
- hassle
- universal
- entertain
- roast
- traditional
- entertaining
- crisis
- officer
- saudi
- participated
- profession
- gue
- soap
- johnson
- task
- dumb
- gain
- broad
- surgery
- dressing
- condition
- tex
- grill
- camper
- note
- managed
- increasing
- rained
- parking
- wake
- mistakes
- pitch
- cucumbers
- prescription
- shut
- forgotten
- conditions
- rehabilitation
- gold
- waited
- substitute
- lift
- crowd
- gym
- tools
- divorced
- practical
- avoid
- spray
- seats
- severe
- litter
- trunk
- programming
- soft
- discover
- cs
- zero
- firm
- army
- post
- rarely
- virtually
- suddenly
- relative
- technically
- frustrating
- nursery
- checkbook
- rolls
- colored
- division
- jack
- districts
- guitar
- leaders
- permanent
- puerto
- su
- ultimately
- race
- biking
- statistics
- accepted
- hussein
- steal
- shown
- menu
- pension
- youth
- pride
- create
- knit
- walks
- guide
- fry
- til
- requirements
- reporting
- networks
- chain
- soil
- jumped
- hysterical
- target
- wasting
- horse
- buses
- dear
- butter
- thanksgiving
- instrument
- cared
- unemployment
- switchboard
- vice
- morals
- focus
- beds
- wednesday
- george
- principal
- non
- scores
- grandfather
- qualified
- burned
- courts
- cousin
- proud
- ham
- hits
- literally
- transferred
- institution
- debts
- collection
- weed
- cigarettes
- homework
- corruption
- clarion
- purposes
- improved
- applied
- closet
- corn
- tomato
- lasagna
- pickup
- collecting
- immigration
- sooner
- resources
- largest
- hurting
- soccer
- treated
- shore
- bored
- abuse
- mayor
- continental
- professionals
- verdict
- carrying
- button
- drinking
- dying
- reliable
- transportation
- subjects
- fees
- unfortunate
- evenings
- craft
- scout
- languages
- scratch
- sears
- thirties
- solutions
- sherman
- stack
- funds
- skirt
- fed
- correctly
- listened
- clothing
- serving
- supervisor
- mark
- materials
- lewisville
- below
- chemicals
- era
- incentive
- coffee
- offered
- interior
- determine
- sets
- alternative
- instructor
- dance
- saddam
- discussion
- joke
- boating
- fabulous
- ship
- funding
- groceries
- entirely
- sitter
- communications
- democrat
- cafeteria
- corporation
- squash
- peppers
- nor
- pour
- flour
- waco
- controls
- argentina
- flying
- coal
- nuclear
- february
- saturdays
- phoenix
- electrical
- wage
- laying
- effective
- robin
- wealthy
- hampshire
- concerns
- hall
- figures
- rochester
- agreement
- pages
- bitty
- cowboy
- dealers
- features
- argue
- commitment
- hanging
- policeman
- critical
- user
- dried
- strip
- pie
- balls
- eggs
- among
- lifting
- phase
- desire
- final
- jogging
- bless
- attack
- taxed
- acres
- april
- oven
- pack
- claim
- gorbachev
- wherever
- troops
- illinois
- industries
- trailer
- grab
- pitching
- nineties
- ranch
- ti
- mortgage
- mill
- sue
- register
- attorney
- alike
- adopted
- tournament
- involvement
- silver
- perfectly
- slightly
- meetings
- primary
- sixth
- employer
- survey
- indoor
- partly
- addition
- nervous
- georgia
- recreation
- internal
- rise
- schooling
- previous
- mood
- stolen
- birds
- director
- named
- mustang
- mystery
- upstairs
- goods
- reunions
- perform
- reality
- hurry
- scattered
- environmental
- limits
- cleaned
- tons
- concrete
- belts
- cabin
- rolling
- review
- invaded
- invade
- obvious
- requires
- typically
- religious
- religion
- opportunities
- intelligent
- peter
- album
- drawing
- trumpet
- stock
- household
- customer
- kay
- cotton
- tennessee
- specifically
- lowest
- moon
- reputation
- honor
- secretary
- rico
- assumed
- realizing
- attitudes
- rat
- vegetarian
- occurred
- practicing
- promote
- adding
- designed
- delivered
- nah
- category
- disk
- exact
- pilot
- costing
- brake
- mercedes
- pr
- abortion
- texans
- moral
- capable
- applications
- beneficial
- flavor
- drain
- reporter
- clock
- aggravating
- politically
- governments
- clearly
- designing
- burden
- laughed
- topics
- chunk
- spots
- streams
- efficient
- slowly
- arkansas
- discussed
- conservative
- flute
- choir
- sugar
- answering
- lists
- babysitter
- impression
- lets
- david
- forces
- thumb
- cop
- creative
- dip
- switched
- pine
- content
- aerobic
- conversations
- touched
- candidates
- legitimate
- assistant
- annoying
- finance
- vietnamese
- husbands
- storms
- pump
- lawns
- patio
- roots
- russian
- plot
- mouth
- amounts
- suffering
- headlines
- hunter
- acre
- ties
- measure
- la
- trout
- guidelines
- bonus
- emotional
- cow
- unique
- providing
- encouraged
- positions
- barely
- criteria
- olds
- tradition
- scares
- workers
- iran
- toys
- tornado
- moves
- ton
- recyclable
- crowded
- ladies
- melt
- crack
- finances
- score
- crawfish
- transmission
- purple
- mavericks
- eve
- babysitting
- committing
- maintenance
- exposure
- cassettes
- socially
- reagan
- soup
- hiking
- athlete
- cheesecake
- grandson
- skunk
- addison
- skied
- realistically
- profit
- emissions
- skirts
- heels
- awards
- silence
- lambs
- whatsoever
- lotus
- offering
- unquote
- forest
- phones
- miniature
- medium
- grandma
- goo
- finishing
- judicial
- penalties
- ki
- hose
- hungry
- success
- monitor
- application
- pink
- depressing
- supper
- bureaucracy
- status
- territory
- mississippi
- exercises
- preference
- peo
- packages
- broadcast
- doctorate
- scholarship
- grows
- lean
- anxious
- core
- voluntary
- minority
- couples
- ears
- crochet
- selected
- voters
- democrats
- authority
- airport
- horror
- fox
- sub
- professors
- legs
- stir
- celery
- eats
- chocolate
- cup
- asleep
- studies
- afterwards
- slip
- lap
- connection
- individually
- dependent
- foundation
- worthwhile
- fields
- freedoms
- giants
- stars
- kittens
- vet
- balanced
- homeless
- birth
- mu
- campaign
- empty
- scenes
- heads
- kicked
- messed
- arabia
- greatly
- bob
- talent
- nurses
- strike
- reached
- dedicated
- suggested
- guard
- basement
- laughing
- communication
- ghost
- abused
- token
- plane
- beating
- former
- films
- fought
- failed
- lesson
- lo
- walls
- sink
- girlfriend
- accused
- hurts
- loud
- gang
- consistent
- stereo
- fa
- struggling
- interview
- employment
- borrowed
- spoiled
- tub
- tea
- mex
- lemon
- bin
- evidently
- grant
- tremendously
- cartons
- opening
- mi
- skin
- seed
- acceptable
- filter
- golly
- sits
- coke
- followed
- basics
- psychology
- operate
- owns
- freezing
- nissan
- te
- accidents
- settle
- leader
- poverty
- dr
- masking
- fiancee
- jugs
- landfill
- heavily
- lie
- trends
- interstate
- competitive
- arguments
- weigh
- competition
- surprising
- temporary
- inclined
- overnight
- priority
- darn
- honey
- roy
- accurate
- rocks
- babysit
- priced
- twin
- le
- ban
- athletes
- lack
- pond
- muscles
- connecticut
- anyways
- pacific
- owners
- freon
- responsibilities
- toxic
- permit
- closely
- pitched
- dresses
- scenery
- kevin
- costner
- greater
- enemy
- granted
- welcome
- define
- advertising
- salesman
- reverse
- ideal
- locked
- directions
- object
- figuring
- frequently
- boot
- therefore
- jails
- murdered
- purdue
- received
- led
- picks
- include
- democracy
- studied
- fond
- climate
- alaska
- sake
- avid
- healthier
- fired
- connected
- stealing
- chances
- humane
- supported
- enjoyment
- penny
- turtles
- encouraging
- ea
- marketing
- garlic
- broccoli
- potato
- suburbs
- formal
- rush
- concentrate
- woodworking
- leaf
- cent
- automobiles
- ozone
- devices
- source
- comedies
- landing
- semi
- agent
- string
- precious
- ugly
- phenomenal
- hilarious
- winning
- doe
- mobile
- farther
- chili
- landscape
- path
- someday
- complaining
- sky
- load
- baked
- stove
- bend
- en
- command
- decides
- attacks
- wished
- ac
- yearly
- weekly
- indeed
- brief
- mike
- dealer
- emergencies
- event
- charlotte
- slapstick
- purely
- included
- unfair
- meaning
- injuries
- vermont
- cornstarch
- egg
- worrying
- wrap
- buff
- advertisements
- plain
- chores
- mention
- allows
- novels
- bases
- billion
- protected
- workout
- cancel
- daddy
- outdoors
- novel
- bruce
- awfully
- constant
- spends
- accent
- deductions
- dealt
- informed
- tournaments
- snake
- penn
- sox
- tho
- root
- rip
- combat
- polls
- sundays
- blank
- frozen
- assistance
- ads
- hiring
- drivers
- recession
- convert
- alternate
- dryer
- lightning
- gr
- chair
- emotionally
- angry
- mature
- treatment
- lousy
- seventh
- ninth
- deck
- printed
- answers
- jumping
- mentality
- popcorn
- shade
- oaks
- reasonably
- budgeting
- controlled
- british
- unreal
- mini
- performance
- tip
- ge
- handgun
- toy
- skip
- armed
- fleas
- redo
- deposit
- goldfish
- childhood
- removed
- surprises
- dodge
- consulting
- sacrifice
- placed
- sailing
- classics
- bottle
- secretaries
- diesel
- liter
- chosen
- boats
- returned
- item
- november
- adoption
- fewer
- pizza
- feature
- nebraska
- cafe
- alzheimer
- agreed
- choosing
- council
- bermuda
- suspense
- satisfaction
- winters
- headed
- murphy
- customers
- habits
- norm
- loss
- bec
- crawl
- exist
- attractive
- wor
- leg
- selection
- prob
- sources
- audience
- styles
- davis
- borrow
- goals
- determined
- accounts
- pat
- vs
- whi
- advantages
- diapers
- pin
- models
- queen
- sticks
- mesquite
- canal
- incredibly
- feeding
- importance
- salvador
- fathers
- regardless
- translation
- frustrated
- bond
- structured
- counting
- factors
- economical
- involves
- radical
- depressed
- universities
- shall
- tank
- jesus
- counselor
- proposal
- allowing
- pocket
- airplane
- gangs
- saints
- consideration
- dolls
- horses
- spouse
- midwest
- fashioned
- screw
- curriculum
- oakland
- candy
- blanket
- backpack
- industrial
- smog
- canyon
- elect
- backed
- bear
- comfort
- economically
- warmer
- sunny
- exhausted
- afternoons
- ranger
- worries
- orange
- physically
- experiment
- famous
- copies
- cardboard
- pa
- demand
- polluted
- tail
- compatible
- wordperfect
- drag
- float
- carter
- presidential
- dug
- israelis
- relations
- arab
- rings
- estate
- salaries
- recognition
- headline
- nowhere
- ratings
- asia
- ei
- lifestyle
- tenth
- preparing
- cookies
- fifteenth
- bait
- experienced
- defendant
- surprise
- cocaine
- reminds
- liquid
- destroy
- century
- admire
- rare
- tuned
- schwartzkopf
- reduced
- cruel
- cheers
- picnic
- accounting
- pace
- jane
- tune
- knees
- holy
- owe
- pepper
- worms
- bricks
- mound
- additional
- flow
- tended
- refuse
- landfills
- stance
- cry
- dumping
- memories
- anyplace
- geared
- arrangements
- depth
- tuesday
- raw
- neighborhoods
- policemen
- net
- located
- trail
- edition
- purchases
- injury
- beliefs
- statements
- sin
- cultural
- shorter
- guilt
- 'false'
- economics
- enormous
- lifetime
- advanced
- adopt
- mechanical
- liters
- dream
- bachelor
- nasty
- scare
- laundry
- strikes
- quilt
- chlorine
- shed
- whom
- ds
- convince
- courtroom
- volleyball
- domestic
- stomach
- concerts
- stepfather
- typewriter
- clouds
- rating
- gifted
- generals
- clip
- screwed
- australia
- maine
- quarters
- chrysler
- oldsmobile
- pistol
- membership
- seldom
- supply
- tornadoes
- hu
- oth
- porch
- persian
- lakers
- tarpley
- seattle
- thrilled
- boards
- brian
- roughly
- paints
- attic
- ceilings
- baths
- pig
- killer
- pros
- paris
- brooks
- dealership
- developing
- islands
- kennedy
- ending
- ratio
- created
- separated
- lasts
- wives
- jean
- spaghetti
- village
- biased
- operating
- enid
- crappie
- employers
- conference
- tuna
- tole
- pollutants
- jones
- handling
- emission
- vary
- initially
- finds
- obligation
- select
- carefully
- barrier
- strangest
- spaniel
- blues
- comparison
- attend
- focused
- ver
- blacks
- jurors
- floors
- spell
- wears
- heel
- wooden
- assistants
- accustomed
- mild
- bands
- bang
- alrighty
- campbell
- tours
- panama
- believes
- corrupt
- cocoa
- interestingly
- makeup
- communism
- etcetera
- historical
- heating
- hispanic
- bilingual
- ultimate
- bicycling
- elsewhere
- scientific
- combine
- ar
- consequences
- gal
- cure
- grader
- corporations
- stitching
- grief
- leading
- graphics
- regards
- rank
- personalities
- mission
- whiz
- voter
- controlling
- believed
- minded
- kyle
- author
- certified
- shelter
- historically
- protecting
- fits
- carrots
- knitting
- professionally
- specialty
- jars
- needlework
- robert
- regarding
- billions
- rental
- nolan
- ruined
- searching
- taco
- mama
- relationships
- exchange
- highways
- handicapped
- scouting
- discouraging
- dropping
- electricity
- stacks
- catalytic
- muffler
- pipe
- error
- compete
- cajun
- haul
- discussing
- kurds
- anti
- orchestra
- needle
- ireland
- investments
- dramatically
- drawback
- raises
- growth
- definition
- guatemala
- receiving
- reported
- aikman
- shoulder
- banking
- highest
- jimmy
- jim
- cardinals
- jamaica
- magic
- convictions
- usage
- hamburgers
- sporting
- muscle
- sophisticated
- element
- occur
- designated
- depression
- covering
- tooth
- filling
- sharp
- strawberry
- relax
- advise
- enter
- throat
- instances
- allowance
- stronger
- debate
- literature
- shelves
- remove
- advertised
- progress
- smith
- richard
- raped
- offense
- detail
- christians
- tore
- accomplish
- released
- loaning
- bright
- intense
- dies
- peas
- steaks
- spicy
- conditioner
- convenience
- drought
- cups
- nee
- russians
- yeltsin
- thirds
- acting
- northwest
- freeway
- curbside
- corpus
- publicized
- mets
- memorial
- onion
- garages
- employed
- lazy
- wrestling
- crab
- loaded
- stationary
- coupons
- ripped
- balances
- convict
- loving
- represent
- judgment
- pork
- wasted
- selecting
- recover
- divide
- civic
- builds
- quicker
- translate
- churches
- slice
- discount
- swear
- nap
- centered
- vitamins
- planes
- contractor
- drastically
- elaborate
- continued
- decline
- uncles
- utilities
- camera
- musicians
- musician
- condominium
- augustine
- tolerant
- southwest
- counselors
- mirrors
- communicate
- worker
- medication
- powerful
- manure
- replaced
- redone
- shotgun
- memphis
- turtle
- supreme
- owning
- cycle
- jay
- airline
- sir
- method
- mayonnaise
- execution
- plea
- mower
- buttons
- campaigns
- log
- quarterbacks
- hamburger
- arizona
- ignore
- bred
- indianapolis
- envelope
- conversion
- hail
- flooding
- spanked
- fluid
- bay
- leather
- italy
- locations
- blew
- extensive
- traded
- transition
- kilometers
- robbing
- kills
- cadillacs
- randomly
- institute
- triangle
- mercury
- volvo
- dan
- leads
- pe
- rome
- attraction
- aunts
- latex
- texoma
- rabbit
- audi
- methodist
- basements
- tee
- clarinet
- walker
- massive
- stroke
- leak
- sites
- deals
- lined
- embarrassed
- slab
- officially
- behavior
- examples
- witness
- wishes
- unlisted
- terminal
- modem
- poodle
- weighs
- paul
- subscription
- chapter
- likewise
- documents
- shoe
- miserable
- jacket
- lax
- varies
- peach
- blows
- disco
- suicide
- bo
- downhill
- profitable
- twenties
- official
- pressures
- image
- monies
- absentee
- senate
- ethnic
- involve
- proven
- offenders
- afghans
- borders
- peaceful
- ab
- blown
- lock
- adequate
- scholarships
- offers
- bat
- injection
- useless
- revolution
- mormon
- enforce
- cosby
- preapproved
- fortune
- messing
- promised
- sum
- frankly
- damn
- gravy
- boil
- remembered
- consuming
- metropolitan
- gift
- seeds
- factories
- layer
- costly
- usual
- cooler
- daytime
- appearance
- sufficient
- balcony
- chasing
- chest
- las
- plumbing
- farming
- becau
- cleaner
- packed
- cried
- lover
- indians
- racial
- occasional
- rivers
- pollute
- locally
- contribution
- presentations
- laser
- represented
- guests
- apples
- hank
- closest
- oak
- missionaries
- rob
- mailing
- ring
- bias
- newsweek
- nicely
- tables
- zone
- faith
- cheapest
- excuses
- fail
- administrator
- baylor
- sued
- emotions
- appeared
- notes
- tying
- nail
- shake
- comp
- entry
- peer
- sore
- sticky
- pudding
- knowledgeable
- haze
- mass
- stressed
- academy
- considerably
- rowlett
- shortly
- nose
- ordered
- crying
- handed
- wages
- input
- praying
- warfare
- accomplished
- woke
- regulation
- equivalent
- bankrupt
- jog
- ell
- ri
- appeals
- extraordinary
- metroplex
- absolute
- conclusion
- accountable
- glory
- pray
- prisoners
- bomb
- destroyed
- testament
- pu
- suggest
- polish
- principle
- gardener
- beets
- behave
- periods
- shrubs
- sprinkler
- fajitas
- describe
- release
- motorcycle
- bound
- styrofoam
- valuable
- tolerate
- attempt
- jordan
- exists
- screaming
- stump
- breathing
- selfish
- dick
- blonde
- maximum
- max
- secret
- holds
- landscaping
- reads
- prevalent
- galveston
- weirdest
- joy
- nationwide
- soda
- coin
- dukakis
- steam
- embarrassing
- plates
- incorporate
- deductible
- machinery
- categories
- funded
- chairs
- recommended
- handicap
- bowling
- meantime
- accord
- tyler
- mosquitoes
- booklet
- coaches
- syria
- dinners
- holiday
- baltic
- priorities
- recognized
- wipe
- longest
- suburban
- delayed
- backgrounds
- varied
- eighth
- den
- coats
- theme
- nicest
- penney
- adjust
- hou
- toilet
- bullet
- rapidly
- capabilities
- hilly
- container
- layoff
- watches
- jewelry
- maker
- infant
- resent
- blade
- watering
- wildlife
- decorating
- fabric
- leadership
- privilege
- exotic
- loop
- seasoning
- chopped
- retiring
- backseat
- par
- leukemia
- ammunition
- barrel
- pontiac
- mazda
- expressway
- administer
- unions
- function
- stopping
- organize
- parenting
- schedules
- slept
- wheels
- resource
- competing
- sees
- careers
- pits
- carpeting
- legislature
- functional
- divorce
- bridge
- transfer
- needlepoint
- cookbook
- breast
- published
- portland
- throws
- counts
- larry
- louisville
- com
- glued
- tube
- slide
- protective
- felony
- dursban
- renting
- rebuild
- london
- shingles
- lea
- stink
- puppies
- schnauzer
- steering
- plugs
- mechanic
- worn
- inflation
- diving
- stretch
- purse
- introduced
- stripped
- occupied
- siamese
- controversy
- buick
- religiously
- allergic
- edges
- sail
- nancy
- biographies
- nonfiction
- thunderstorms
- intend
- educate
- nerve
- recordings
- concentration
- steve
- academic
- freshman
- sophomore
- neutered
- ponds
- disgusting
- narrow
- comparing
- associate
- adjusted
- cottage
- foster
- rake
- outstanding
- appreciated
- malpractice
- thankful
- personnel
- selective
- administrative
- comparable
- pier
- contributing
- cart
- explore
- commits
- affair
- cleveland
- glasses
- downstairs
- details
- backpacking
- blackberries
- alternator
- antilock
- peeves
- chris
- billy
- henry
- smooth
- polluting
- sweats
- fever
- sweater
- wyoming
- filmed
- guts
- respond
- theories
- database
- culturally
- threatened
- tears
- messages
- ear
- bark
- grandpa
- versions
- lee
- wave
- analysis
- gear
- comments
- colorful
- photography
- victims
- resolution
- stiff
- brazil
- minister
- interpret
- hero
- lebanon
- declare
- heritage
- escape
- columbia
- prescriptions
- assumption
- berkeley
- combined
- traditionally
- relaxation
- entering
- regulate
- consciousness
- react
- sexual
- proved
- booze
- cloth
- herald
- instructors
- vested
- consultant
- taxpayer
- lethal
- restricted
- pub
- directed
- frequent
- tempted
- hat
- treadmill
- abilene
- hates
- skinny
- turnout
- bouncing
- wayne
- beforehand
- deserves
- ninja
- expand
- probation
- eliminated
- yogurt
- powder
- boyfriend
- blankets
- alarm
- vacuum
- chop
- strips
- ruin
- knots
- bits
- rogers
- guessing
- addicted
- pitcher
- fingers
- rascal
- whip
- ag
- vegas
- response
- advocate
- donate
- proposed
- emphasis
- transit
- carpool
- map
- sheets
- punch
- calories
- strenuous
- laboratory
- resolve
- serves
- drum
- compact
- tigon
- initial
- moms
- identify
- respected
- vision
- visits
- eagle
- summary
- illustrated
- dial
- extraordinarily
- intelligence
- stages
- troy
- injured
- increases
- joints
- dayton
- mary
- deduct
- administrators
- pressing
- contest
- arguing
- marked
- seek
- gross
- roberts
- mentally
- session
- failing
- occasions
- videotape
- clever
- jerry
- mutant
- warning
- intellectual
- approve
- declared
- hallway
- edging
- pressed
- strawberries
- nieces
- sour
- homemade
- trick
- mixture
- solar
- inspection
- global
- winner
- drawn
- trace
- sympathetic
- managing
- anchors
- sulphur
- chuck
- overcrowded
- stole
- dean
- steven
- bi
- thursdays
- appear
- collapse
- dome
- flex
- stressful
- ok
- paroled
- apt
- patient
- injustice
- farmer
- socialized
- snap
- clay
- wintertime
- beaches
- touching
- curb
- clippings
- flowerbeds
- toes
- buffer
- hardware
- republic
- battle
- heading
- units
- shadow
- yankees
- rounded
- immigrant
- diseases
- caesar
- saves
- nephews
- slowed
- grounds
- snakes
- abilities
- missiles
- nova
- pen
- digging
- drew
- pools
- strung
- port
- sticking
- orioles
- hopes
- ov
- fertilizer
- railroad
- rub
- robberies
- theft
- tourist
- sta
- stood
- eligible
- freshwater
- saltwater
- shark
- fool
- commute
- deciding
- fam
- terrific
- catalogs
- froze
- ethic
- controversial
- crossed
- georgetown
- soy
- hoi
- pasta
- dreams
- painful
- filthy
- innocence
- leaning
- cleared
- feasible
- perception
- lottery
- parochial
- announced
- ll
- gallons
- kindercare
- behavioral
- classrooms
- merchandise
- washer
- refrigerators
- tinker
- supplies
- stimulation
- alert
- furthest
- cease
- reward
- biology
- starter
- prairie
- drill
- johnny
- experiments
- exercised
- paneling
- tougher
- strain
- noisy
- instill
- housework
- gap
- auditor
- dot
- maternity
- butler
- amarillo
- mulch
- actions
- lawsuits
- senators
- anniversary
- bonding
- leisure
- fertilize
- dragging
- decorated
- statewide
- format
- skeptical
- pad
- mode
- justify
- budgets
- seniors
- chief
- efforts
- hispanics
- drastic
- frost
- layoffs
- temperatures
- airlines
- hoses
- safer
- nails
- salads
- clients
- vans
- surely
- pulls
- operation
- sells
- bikes
- unable
- permanently
- slight
- rifle
- impulse
- manual
- handguns
- gauge
- someth
- youngsters
- karate
- hotels
- demanding
- wool
- warnings
- sanctions
- attract
- mysteries
- tenths
- pots
- neglected
- sliced
- leagues
- bulls
- celtics
- struggle
- qualify
- bars
- lucked
- cliff
- cabins
- relaxed
- gates
- oregon
- loads
- crystal
- fumes
- previews
- floating
- reviews
- peaks
- poorer
- matters
- continues
- costa
- geographic
- earthquake
- intrigued
- ain
- albums
- singapore
- proof
- bulb
- spayed
- fr
- skating
- robbery
- sector
- horn
- drafting
- premeditated
- frustration
- radiator
- boundaries
- bureau
- belonged
- nephew
- officers
- serger
- seam
- choral
- dating
- genuine
- requirement
- gradually
- asians
- establish
- effectively
- reel
- ra
- steady
- produces
- switzerland
- calm
- anthony
- suzuki
- plymouth
- sized
- thread
- centimeters
- recorder
- signal
- brands
- resolved
- converted
- dumped
- spur
- trap
- yell
- smarter
- humanities
- amherst
- sheriff
- safely
- completed
- equally
- labs
- foam
- sociology
- entertained
- lobster
- title
- recommendation
- residential
- vicious
- lease
- outer
- honesty
- switching
- freezer
- tollway
- heavier
- bahamas
- sperry
- rollers
- mowed
- cougar
- chi
- crooks
- lips
- remodeled
- cocker
- eigh
- syndrome
- overweight
- titles
- lettuce
- gather
- span
- greenville
- drip
- senator
- dam
- zip
- lexus
- peninsula
- counseling
- grapevine
- parental
- branch
- travels
- atlantic
- screening
- thr
- veterans
- substance
- golfers
- golfer
- manually
- carbon
- disposition
- harrison
- putt
- disability
- marry
- infants
- engaged
- braves
- mums
- provo
- boots
- commercialized
- replacing
- moisture
- assign
- router
- saws
- translators
- alleviate
- acquainted
- caring
- incinerator
- receipt
- scrub
- setup
- hazardous
- wardrobe
- jackets
- blouses
- suspenseful
- graphic
- gary
- monitoring
- hacker
- india
- desirable
- invite
- reaction
- fantasy
- shocking
- recorded
- addresses
- rig
- instructions
- faced
- advances
- paperwork
- tongue
- cha
- accommodate
- motion
- performed
- composer
- horrendous
- beatles
- crop
- applying
- budgeted
- coda
- seminars
- challenging
- righty
- cave
- dragged
- conscientious
- lenient
- warehouse
- managers
- windy
- allergies
- flu
- inordinately
- cinderella
- shoulders
- progressive
- cam
- colonial
- nicaragua
- exception
- translations
- scream
- independence
- cope
- economies
- tropical
- consequently
- difficulties
- plead
- disturbed
- correlation
- movements
- athletic
- stoned
- invested
- coincidence
- analyze
- chip
- miracle
- fif
- kee
- inmates
- external
- civilian
- trapped
- ghetto
- amenities
- clutch
- disposable
- makers
- pursue
- organ
- blast
- pluses
- racquetball
- lobbyists
- republicans
- outskirts
- carpenter
- buck
- predict
- backwards
- wok
- sweets
- ugh
- tablespoon
- singer
- shops
- singers
- stockings
- mirror
- crocheting
- zucchini
- voices
- pockets
- exhaust
- oxides
- victimized
- cynical
- colder
- castle
- listed
- deliberately
- spoken
- adventure
- repeats
- imagination
- viewing
- bench
- catcher
- bull
- corners
- dustin
- hoffman
- kmart
- concerning
- bulk
- accepting
- eerie
- na
- properties
- lying
- sturdy
- logic
- dated
- slick
- separating
- talented
- raiders
- device
- macintosh
- statistical
- sausage
- italians
- canoe
- thrill
- honeymoon
- arabs
- defending
- stability
- pops
- musicals
- sends
- asks
- ringing
- versa
- opens
- offhand
- dana
- envision
- philosophical
- charity
- volunteering
- commentaries
- informal
- commentary
- viewpoint
- independently
- sections
- nope
- firmly
- forcing
- flags
- gathered
- gett
- neil
- jagged
- awakening
- julia
- beside
- initiated
- pole
- kidnapping
- witnesses
- handles
- panel
- refined
- portions
- moments
- accessible
- hollywood
- norman
- assets
- tire
- pursued
- factory
- au
- romance
- fuels
- presentation
- closets
- hips
- rated
- publish
- protestant
- females
- crowds
- poorly
- identified
- buys
- stuffed
- chamber
- brass
- arrest
- productive
- ticks
- earned
- prisoner
- reimbursement
- spiritual
- z
- pronounce
- riskier
- protection
- consistently
- endless
- charles
- rebellion
- pacifist
- curse
- unto
- spirit
- barbara
- bombs
- tearing
- struck
- heaven
- theaters
- northeast
- licensed
- reducing
- peoples
- lithuania
- damaged
- bacon
- worm
- bug
- sprays
- bloom
- rye
- leasing
- nightmare
- beautifully
- washing
- nurseries
- neglect
- mixes
- frying
- guacamole
- disc
- populated
- cooperation
- bundle
- nickel
- rely
- insulation
- powers
- soldiers
- leery
- iraqi
- germans
- safest
- appears
- whoa
- republics
- participation
- reference
- disgusted
- hauling
- permitted
- orientals
- excluded
- stone
- sack
- crush
- fills
- crap
- fisher
- leap
- interact
- publicity
- brooklyn
- idiot
- easter
- vines
- extensively
- fou
- extras
- shootings
- knife
- outcome
- pensacola
- fished
- interviews
- disappointing
- overworked
- speedy
- apathy
- juror
- ann
- appointed
- spite
- ballot
- counter
- appetite
- technician
- complaints
- begins
- reaching
- referred
- influences
- swayed
- award
- slips
- stranded
- bankruptcy
- users
- socialize
- boom
- secondary
- captured
- backward
- intellectually
- bean
- measured
- remind
- bolt
- swung
- dryers
- extension
- hooks
- trinity
- lasting
- hatred
- snack
- altogether
- heal
- restore
- restored
- deeper
- strength
- link
- graders
- noticeable
- lowering
- preferred
- remarkably
- baroque
- barry
- townhouse
- fertilizing
- decade
- slower
- pl
- hop
- creates
- alternatives
- gains
- operated
- forgetting
- detector
- deliberate
- cycling
- legally
- bridges
- prize
- adolescents
- gamut
- slant
- fascinated
- baskets
- glue
- collector
- accountant
- rides
- def
- remote
- professions
- suggesting
- crafty
- remembers
- bears
- identical
- burns
- basket
- believer
- document
- korea
- lasted
- meatballs
- waist
- rear
- stretching
- fold
- kroger
- linoleum
- angle
- wo
- diverse
- buyer
- bullets
- banning
- bargain
- breeding
- humor
- evil
- q
- illness
- peop
- oldsmobiles
- fiance
- bodied
- educating
- showers
- mud
- connect
- bothering
- rebuilding
- kuwaiti
- possibilities
- overcast
- cloudy
- hurricanes
- forecast
- ru
- therapist
- scott
- rugs
- angel
- wheat
- editor
- caretaker
- liking
- kiss
- inevitably
- chat
- unhappy
- comfortably
- litt
- variation
- protest
- fences
- samples
- messy
- affectionate
- disabled
- barking
- production
- kelly
- corvette
- fanatic
- towel
- firing
- coaching
- presents
- burglar
- overcrowding
- lane
- imprisonment
- arrested
- asian
- wrecked
- beauty
- olympics
- conviction
- playground
- garth
- rs
- jam
- literary
- cre
- execute
- cartoon
- nearby
- fundamental
- ribbon
- bobby
- montessori
- sofa
- fetched
- rolled
- sewed
- starters
- crocheted
- liberties
- nintendo
- majoring
- associated
- threatening
- freezes
- traction
- perspectives
- southeast
- carp
- advertise
- pint
- merit
- durham
- meryl
- snowed
- advisors
- terrorism
- sectors
- joint
- terrain
- citizenship
- melted
- ounces
- ounce
- keys
- races
- smokers
- sensible
- bradshaw
- hip
- af
- richmond
- sen
- readily
- consistency
- canned
- enforcement
- contracts
- cons
- differ
- suffer
- tool
- specialist
- flies
- confidence
- esteem
- ironing
- inexpensive
- slots
- buffet
- cuisine
- congressman
- persuaded
- minorities
- stranger
- brush
- coastline
- blind
- cape
- dow
- partially
- calcium
- vast
- abroad
- museum
- physician
- physicians
- redid
- erie
- cooperative
- survival
- har
- exac
- intentionally
- affecting
- urine
- grandkids
- agricultural
- beam
- display
- constitution
- capitol
- ordinary
- babysat
- aggressive
- journalism
- grad
- tia
- olive
- collin
- casserole
- cakes
- operas
- accents
- almo
- oprah
- tiles
- tile
- trillions
- struggled
- tips
- tulsa
- museums
- sailboat
- perch
- styling
- seville
- rotten
- ken
- dentist
- maverick
- medicare
- douglas
- leased
- insane
- madison
- dock
- subdivision
- pouring
- wooded
- departments
- airplanes
- pilots
- premium
- ol
- liberty
- malls
- fossil
- produced
- bumper
- purchasing
- gentleman
- tribe
- wordstar
- rinse
- santa
- broth
- thomas
- addressed
- unconsciously
- enchiladas
- slickers
- rib
- lawry
- housekeeping
- opener
- doll
- sierra
- nuskin
- legend
- ruben
- batteries
- drywall
- disturbing
- relief
- devastating
- confined
- strides
- incineration
- drums
- cement
- leaked
- presently
- semiconductor
- firms
- foremost
- hoods
- sample
- client
- update
- predominantly
- gory
- dancing
- inherent
- harmed
- sneak
- invisible
- obligated
- invariably
- supervisors
- dentists
- chew
- randy
- understandable
- springer
- artist
- stardom
- taylor
- synthesis
- adapt
- pla
- labeled
- label
- attended
- manuals
- stephen
- stimulating
- improvements
- veterinarian
- serial
- wrongly
- preschoolers
- conditioned
- detailed
- unload
- highs
- collar
- identification
- stones
- zoo
- owens
- sandinistas
- greedy
- kings
- roosevelt
- bananas
- tempting
- lessened
- performances
- greek
- plots
- sean
- statehood
- quo
- assuming
- significantly
- woul
- ve
- occurring
- stringent
- troubled
- resistance
- regional
- disastrous
- practices
- alternates
- approved
- believing
- joe
- iraqis
- habitual
- bone
- dope
- threaten
- inventory
- bibs
- tasted
- afghan
- quilts
- riot
- earning
- backup
- christ
- begun
- guaranteed
- beats
- monetary
- ne
- involving
- punishable
- instantly
- hog
- logistics
- joining
- tutor
- doggone
- hats
- remodeling
- allen
- cabinets
- motivate
- inspired
- computerized
- pers
- extremes
- willingness
- excitement
- jacobs
- architect
- lump
- shared
- evaluate
- exclusive
- expanded
- tablespoons
- ginger
- peanuts
- sang
- choirs
- finals
- aggravated
- okra
- ruled
- landmark
- restrictions
- smack
- investing
- drier
- hotter
- orlando
- adventures
- scrap
- battery
- timing
- boeing
- alcoholic
- sullivan
- continuing
- ukraine
- adjustments
- astros
- claws
- declawed
- rushed
- stray
- void
- chase
- messes
- procedures
- underwear
- skill
- politician
- mitch
- caddo
- prizes
- lids
- files
- tra
- questioned
- wolf
- thunder
- howl
- buffaloes
- honduras
- wealth
- contributes
- wider
- soak
- installed
- converter
- authorities
- visible
- ash
- suspected
- agencies
- mouse
- printout
- producing
- unix
- blueberry
- hike
- overly
- baker
- assault
- restraint
- enj
- danny
- couch
- arnold
- ridge
- gene
- clo
- unemployed
- ahold
- dislike
- equality
- mistaken
- aged
- quoted
- harsh
- realizes
- upstate
- expend
- brinkley
- complaint
- slanted
- restricting
- halls
- wheelchair
- supervised
- terry
- monstrous
- drawbacks
- fights
- learns
- fallen
- challenged
- rewarding
- mailed
- snowing
- ni
- wreck
- amongst
- misery
- schwarzenegger
- goofy
- entered
- rationale
- prosecutor
- excused
- bare
- lawsuit
- audio
- teti
- eh
- lacking
- memorable
- wisdom
- succeed
- jokes
- frenchman
- liability
- workmen
- executives
- marijuana
- surface
- lengths
- fondue
- cheddar
- watermelon
- saucepan
- lukewarm
- cookbooks
- collected
- saran
- hollow
- warming
- spa
- bathing
- incur
- institutions
- freshmen
- sinking
- description
- graduates
- nelson
- commerce
- recruiting
- homemaker
- cri
- ankle
- install
- sympathy
- burnt
- episode
- awesome
- scandal
- grasp
- multiple
- fonda
- tolerance
- enforced
- lighter
- enemies
- gentle
- avoided
- approaches
- sheep
- grace
- reserve
- claimed
- abusing
- borrowing
- servants
- stops
- moist
- ass
- kin
- trimmed
- varieties
- experimenting
- mashed
- foo
- barbecued
- barbecues
- marinate
- manages
- sacks
- giant
- pact
- confused
- stepping
- seams
- michener
- blooming
- stewart
- tim
- rebel
- grammar
- yankee
- restriction
- biblical
- paychecks
- request
- stable
- diego
- lush
- ga
- limb
- flooded
- strokes
- animated
- muddy
- sharks
- quantum
- partners
- deedee
- formula
- subtle
- solved
- tow
- bounds
- rooting
- championship
- toronto
- ontario
- cabbage
- cantaloupe
- siding
- twist
- sirens
- reminded
- affluent
- bee
- captain
- tackle
- advancement
- isolated
- destroying
- foggy
- regulating
- cigarette
- linguistics
- canadian
- payless
- cashways
- bucket
- cereal
- maxed
- rally
- richards
- convention
- everytime
- mar
- dairy
- doubts
- pursuing
- flight
- crew
- oops
- misses
- amazingly
- punished
- suited
- flexibility
- rehabilitate
- deduction
- debit
- executive
- requested
- implemented
- disadvantage
- shoddy
- naive
- moscow
- marcos
- shoots
- blessed
- cad
- noon
- formed
- bargains
- circuit
- dissertation
- serviceable
- roughing
- cots
- condo
- poles
- locks
- ob
- hearts
- passover
- seder
- catholics
- attacking
- syrian
- bagels
- affairs
- iranian
- ideals
- dividend
- voluntarily
- devote
- performing
- pipes
- arteriosclerosis
- nonexistent
- torn
- outfits
- prejudice
- invited
- remembering
- remedial
- certification
- textured
- insides
- tone
- tornados
- exxon
- brain
- photographer
- audit
- mainframe
- jet
- upgraded
- baghdad
- scheduled
- receptacles
- continual
- potentially
- prestige
- perceived
- trivial
- broader
- sided
- claims
- adjustment
- tread
- richland
- discouraged
- stepdaughter
- sacrificed
- possession
- castroville
- timer
- shady
- lehrer
- editorial
- embroidery
- envelopes
- continuous
- typing
- claude
- aging
- attending
- trainable
- watered
- composition
- dis
- disabilities
- intentions
- inter
- gay
- facing
- interviewed
- seasonal
- patch
- peculiar
- rec
- brilliant
- invest
- payday
- buddies
- wiped
- indoors
- fiddle
- inspect
- peel
- hors
- impress
- ridden
- objects
- surprisingly
- servicemen
- teeny
- equitable
- tier
- stair
- targets
- knocked
- accuracy
- impressive
- cycles
- writers
- rehabilitated
- fleet
- drops
- quarts
- peeve
- sa
- pregnancy
- meets
- campsite
- specialized
- indicated
- beings
- obnoxious
- stereotype
- communist
- sway
- soviets
- monetarily
- circle
- blah
- carnival
- outs
- indication
- gigantic
- ownership
- feeds
- latch
- pansies
- cau
- screened
- references
- tabs
- steamed
- blueberries
- desserts
- sandwich
- slices
- mba
- describing
- duke
- mechanics
- secorski
- financing
- punishments
- whack
- addiction
- '7'
- specials
- climbing
- shells
- spectrum
- ins
- ants
- painter
- painters
- noises
- rats
- sequel
- rocky
- stallone
- pai
- exterior
- afterward
- greasy
- builders
- intervention
- solving
- appliances
- fu
- hesitant
- incorrectly
- lizards
- bats
- evils
- refugees
- permission
- dive
- instituted
- parked
- landry
- scope
- eagles
- cows
- orders
- tokyo
- subway
- remorse
- heinous
- manufacturer
- occupation
- neal
- brushes
- manhattan
- stud
- leftover
- coll
- rifles
- shelf
- robbed
- temporarily
- inconvenient
- limitations
- spelling
- precise
- commodore
- specifications
- belief
- aggravates
- nev
- bites
- knox
- overheard
- rows
- frederick
- pointed
- stu
- rusty
- reelected
- loses
- pretend
- symptoms
- biography
- destroys
- delicate
- speakers
- happier
- grub
- raiser
- petroleum
- menial
- jeff
- blink
- recommending
- diner
- streep
- copper
- explosives
- disappear
- cosmopolitan
- swimmer
- vogue
- felon
- converting
- bolts
- ross
- ro
- reject
- outfit
- automotive
- mexicans
- envious
- risking
- shifts
- cylinder
- gaining
- tragic
- expressing
- expression
- chilly
- yorker
- dall
- deny
- bonuses
- lucrative
- congressmen
- portray
- needing
- scallops
- susan
- protein
- gained
- baking
- academically
- kenyon
- admissions
- sciences
- provides
- preparation
- logical
- cage
- owed
- devastated
- despite
- pillsbury
- surrounding
- prosecution
- liable
- limitation
- writes
- follows
- nash
- paso
- juice
- reusable
- procedure
- vegetation
- bach
- delivery
- rapes
- thou
- contemporary
- brookhaven
- heater
- curiosity
- fuse
- assembly
- limestone
- danger
- ferry
- ducks
- pilgrimage
- annoyance
- seniority
- ben
- partner
- executed
- healing
- darker
- diff
- routes
- touring
- footage
- abandoned
- retain
- warped
- leslie
- mockingbird
- tricky
- steep
- overwhelming
- killers
- calendar
- faculty
- bingo
- fog
- rationing
- visas
- awareness
- howard
- repairing
- bathrooms
- upside
- symbol
- conception
- veteran
- daylight
- babysitters
- valentine
- ideally
- driveway
- digest
- danielle
- severely
- confident
- idaho
- searched
- appointment
- givers
- pappasito
- dillard
- expertise
- tasty
- publisher
- reruns
- soaps
- repaired
- theatre
- cedar
- mainstream
- refer
- tina
- secure
- rockets
- loo
- contacts
- carpooling
- appalachian
- adventurous
- hostages
- fatal
- patients
- '2'
- sunfish
- donated
- shepherds
- joey
- treats
- researcher
- unnecessary
- stucco
- payroll
- scan
- conductors
- versed
- midway
- beard
- princess
- naked
- custom
- mount
- marshmallows
- mommy
- committee
- allegedly
- tap
- woodstock
- routinely
- rod
- tuesdays
- patterned
- czar
- donald
- booked
- intent
- granddaughter
- chips
- sedan
- discounts
- inn
- dent
- crib
- deliver
- schutzhund
- alsatian
- refused
- nola
- grapes
- marinated
- maxima
- oahu
- conferences
- newly
- kauai
- maui
- hunters
- concentrated
- bakery
- hay
- sleeve
- niro
- builder
- curtain
- spain
- crust
- intriguing
- reimbursed
- licenses
- physics
- reaches
- donahue
- cruises
- nassau
- olives
- lodge
- grandsons
- acoustics
- waves
- uniforms
- fancier
- mesa
- dalmatians
- soapdish
- mushroom
- milwaukee
- violin
- harpsichord
- rumor
- disneyworld
- thinner
- carolyn
- risque
- saxophone
- jodie
- hopkins
- credibility
- barbies
- motel
- wendy
- broncos
- chico
- troop
- warranties
- picky
- aberdeen
- solicitors
- autumn
- nevada
- marlin
- operations
- exhibit
- shuttle
- wycliffe
- sheltie
- particulates
- colombo
- duties
- burner
- hometown
- permits
- contributions
- astronomical
- attire
- blazer
- critics
- omaha
- disturbs
- politeness
- polite
- presumably
- conscience
- canceled
- respects
- norms
- rang
- solicitations
- gossipy
- obtained
- frequency
- turf
- soliciting
- medications
- chow
- smiling
- leash
- acts
- gin
- dispute
- reactions
- intimidated
- alm
- inundated
- switches
- influenced
- rhythm
- sim
- mus
- jimi
- hendrix
- pitiful
- promise
- simon
- qualities
- achieve
- unexpected
- alw
- loaned
- quota
- holler
- leeway
- pains
- wing
- coordinated
- spelled
- skid
- counsel
- violation
- actu
- modeling
- lyrics
- oldies
- phil
- collins
- criticize
- suggestions
- petting
- farms
- exit
- determination
- preservation
- ted
- teddy
- underclass
- considerable
- watcher
- gathering
- sexually
- justified
- territories
- capita
- carefree
- taxing
- weak
- territorial
- resist
- attempts
- craze
- uni
- subscribed
- tractors
- regulated
- cal
- organic
- weaponry
- tanks
- offender
- cured
- slave
- foul
- flipping
- shades
- acclimated
- squares
- tapped
- jerusalem
- fearful
- interrupt
- interrupted
- erase
- monterey
- jose
- ram
- supplement
- standardized
- overtime
- amazes
- circumstance
- summons
- conservation
- indestructible
- littlest
- missionary
- wrapped
- ellen
- toyotas
- preferences
- rag
- straw
- wallpapering
- hoe
- vo
- tubes
- dulles
- incoming
- eldorado
- coun
- tenure
- evaluation
- assigned
- flatter
- chickens
- curry
- overextended
- compl
- housewife
- simmer
- yarn
- demo
- ensemble
- bas
- transmissions
- frivolous
- sessions
- grind
- ranges
- quits
- disconnected
- substances
- etched
- notion
- redeeming
- grabbing
- scrape
- por
- funniest
- rotted
- harvest
- adaptations
- mining
- incaviglia
- excess
- exhibition
- da
- nightmares
- biscuits
- echoes
- actress
- believable
- drafted
- truman
- snider
- extend
- planet
- packing
- dumpsters
- awakenings
- deniro
- actors
- ser
- garp
- attacked
- ralph
- rapid
- agreements
- forests
- polluters
- penalize
- undergrad
- output
- sensational
- failure
- fattening
- catered
- brownies
- crock
- downy
- delta
- cooled
- duplicate
- clearing
- pheasant
- genuinely
- capability
- shield
- agenda
- coup
- briefly
- context
- governors
- irish
- reserved
- collectors
- ole
- antique
- eights
- irate
- noticing
- solo
- shipped
- dramatic
- grateful
- segments
- updates
- trite
- platter
- inc
- incidences
- estimate
- walter
- cronkite
- mold
- efficiency
- spouses
- widely
- redskins
- lynn
- deaths
- observe
- educators
- nother
- visual
- graded
- objectives
- principals
- passes
- poli
- interaction
- prescribed
- breakthrough
- fake
- fears
- web
- housewives
- awake
- reservations
- suggestion
- genre
- innovative
- umbrella
- annoyed
- myth
- proportion
- generational
- exams
- gung
- essential
- pushers
- cathy
- sassafras
- dye
- barn
- outlets
- hollering
- dents
- scratches
- layers
- swiss
- cauliflower
- trays
- pans
- boiling
- vanilla
- custard
- unsweetened
- spoon
- freons
- officials
- disaster
- contributor
- analyzing
- respiratory
- powered
- desired
- trainer
- butt
- psychological
- majors
- staggering
- hamilton
- tracy
- protesting
- prejudices
- dale
- willie
- summoned
- questionnaire
- skipped
- bail
- hebert
- mangione
- breeze
- fairer
- regulations
- seriousness
- darkness
- remem
- judith
- dedicate
- owes
- domino
- insured
- backing
- risks
- devalued
- magnitude
- taped
- breakdown
- beep
- murderers
- murderer
- insanity
- slap
- wrist
- merry
- reinstated
- atrocities
- prayer
- premature
- pushes
- offend
- ridiculously
- bind
- identity
- bombed
- keepers
- deducted
- offset
- owing
- giveaway
- immigrants
- seeking
- insects
- daffodils
- bud
- dandelions
- plagued
- tiller
- trie
- plum
- fescue
- dries
- greenbelt
- cracks
- smokey
- megahertz
- samna
- proficient
- poison
- reused
- mash
- heights
- lone
- vicksburg
- handful
- futuristic
- patrick
- foggiest
- soldier
- buckets
- tot
- immigrate
- render
- fab
- principles
- payoff
- incinerators
- smelled
- ozarks
- disappeared
- tad
- tiers
- glance
- enlightening
- nashville
- fellows
- communicated
- catalog
- insight
- spoke
- flounder
- padre
- aransas
- dingy
- marriages
- becky
- squeezed
- triple
- caribbean
- bees
- lilac
- overhead
- static
- lumber
- juan
- irresponsible
- bold
- carmel
- smarts
- surf
- snappers
- snapper
- described
- aetna
- medi
- irving
- provided
- wells
- romania
- resort
- affords
- printing
- seminar
- thaw
- payoffs
- persuade
- judeo
- litigious
- opponent
- underdog
- equate
- fred
- divided
- separately
- turnover
- descent
- filet
- sole
- jerk
- therapy
- companions
- dresser
- explained
- hush
- agrees
- aff
- drama
- at&t
- modest
- bef
- prep
- vocational
- col
- inevitable
- atomic
- disadvantages
- distracted
- measurement
- arrogant
- clientele
- jelly
- biting
- acceptance
- fir
- overdue
- optima
- suckers
- honored
- chevrolet
- taurus
- recreational
- campers
- shines
- holly
- mattresses
- elastic
- hectic
- volunteered
- heartbreaking
- bargaining
- forgive
- adamant
- moderates
- egypt
- muslims
- palestinians
- poem
- naps
- demonstrations
- restless
- underlying
- dissatisfied
- proposing
- upbringing
- outlook
- quilting
- amish
- acreage
- eyed
- motivates
- vitamin
- drilled
- extensions
- quantities
- carson
- doses
- experimented
- chlorinated
- rode
- nationalities
- exam
- memorize
- readers
- scales
- grain
- matching
- explains
- semigloss
- marks
- experiencing
- upbeat
- connections
- dah
- seated
- alley
- uncertainty
- hoot
- itemize
- processors
- portable
- hewlett
- rival
- rugged
- decks
- printers
- obsolete
- quitting
- approximately
- martin
- achieved
- tact
- disappointment
- trusting
- corrected
- opted
- perjured
- barred
- script
- ironic
- witnessed
- answered
- dependents
- mobility
- preventative
- lung
- carrier
- filed
- pissed
- offensive
- opinionated
- textbooks
- forbid
- advertisement
- cordless
- porcelain
- sandy
- tracks
- amateur
- sings
- contraceptives
- luxuries
- continually
- perennials
- arriving
- bows
- ribbons
- designs
- bunny
- ink
- canvas
- crewel
- decorations
- victorian
- stiffen
- uncommon
- compensate
- typed
- correcting
- frustrations
- acted
- rumors
- lebanese
- newsmen
- chemistry
- tw
- literacy
- jackson
- macho
- hint
- cer
- cutbacks
- slogan
- preserving
- trigger
- greenhouse
- plattsburgh
- digital
- sane
- boost
- vacationing
- stationed
- slope
- attach
- starving
- distant
- mideast
- bureaucratic
- bearing
- nightline
- eng
- centuries
- decking
- crawling
- buds
- vine
- chops
- guest
- sucks
- tails
- '''oeuvres'
- cooks
- elegant
- crumbs
- crunchy
- bouillon
- 20/20
- cord
- irritated
- luggage
- climates
- richer
- civilized
- israeli
- jazzercise
- ego
- exer
- leaned
- firearm
- firearms
- twirling
- edited
- dribble
- accidental
- resale
- trading
- strangely
- cutlass
- semesters
- recipients
- recipient
- pathetic
- import
- partnership
- ambition
- disciplined
- prenatal
- peru
- thir
- filters
- tourists
- canadians
- panamanians
- initiate
- concentrating
- cellular
- awkward
- aw
- sanitation
- kuwaitis
- accomplishment
- defend
- amy
- sunshine
- hurricane
- flood
- muggy
- royals
- pitchers
- nat
- indicator
- lineup
- knives
- publishing
- laptop
- search
- significance
- chains
- jonathan
- petunias
- blooms
- stitches
- fruits
- righ
- opportune
- tang
- inspiring
- incomes
- ferraro
- isaiah
- alma
- mater
- dominant
- greed
- hud
- pit
- bounced
- installation
- stinking
- forgets
- morally
- millionaire
- observer
- restrict
- ancestors
- kitchenette
- neatest
- miniskirts
- grandmothers
- feminine
- marching
- bizarre
- overboard
- gu
- neon
- tints
- condominiums
- walt
- crummy
- flake
- woodwork
- widespread
- worldwide
- bow
- contrast
- vocal
- removing
- passive
- colonies
- bury
- presence
- quietly
- whichever
- vacant
- equity
- litters
- fin
- aquarium
- commands
- anticipate
- resulted
- ranches
- repentance
- mas
- olympic
- wicked
- climbed
- stretched
- explaining
- wayside
- combinations
- carpets
- str
- tickled
- tinted
- carmakers
- sporty
- miata
- authentic
- demands
- parkway
- gabriel
- shannon
- patriot
- mansion
- alan
- blessing
- catnip
- bombay
- himmy
- champion
- gloves
- devon
- curly
- mice
- associations
- haired
- qualifications
- attracted
- irritating
- cops
- irks
- ron
- relation
- germantown
- hondas
- skins
- errands
- pigs
- substituting
- spoil
- butts
- experts
- markets
- hong
- kong
- tens
- conflicts
- bangladesh
- prevention
- barrels
- lily
- humongous
- azaleas
- fielder
- cubs
- pri
- aft
- kinder
- callers
- capone
- arsenio
- flatliners
- scheduling
- threads
- bedspread
- lobby
- mckinney
- spaced
- ethical
- expenditures
- recovery
- sitters
- reader
- authors
- scraping
- backlash
- estes
- sensitive
- taxpayers
- fisherman
- soul
- lures
- hea
- propose
- reinforcement
- exempt
- pendulum
- applies
- flea
- skilled
- petty
- brochures
- bussed
- african
- glen
- godfather
- sooners
- hump
- summit
- strengthen
- meaningful
- steamer
- sprinkle
- skillet
- teflon
- passion
- increasingly
- privileges
- constitutional
- thousandths
- motorcycles
- eighths
- annoys
- horizon
- tooling
- essence
- decimal
- inherited
- fifths
- sweatshirts
- blouse
- programmer
- fashions
- taiwan
- keyboard
- unpopular
- plumber
- sucker
- transporting
- indifferent
- shallow
- undo
- seeming
- kilograms
- dates
- propaganda
- confidently
- badge
- clipper
- steelers
- temperament
- scoring
- warren
- proving
- arthritis
- revenue
- scheme
- os
- wholeheartedly
- unknown
- capacity
- noodles
- instincts
- lecture
- stanford
- unlike
- academics
- cannon
- instinct
- stereotypical
- mac
- firepower
- mug
- antenna
- denton
- psych
- hamsters
- smelling
- expenditure
- dec
- diploma
- radioactive
- packaging
- detect
- stream
- particles
- cattle
- creeks
- alaskan
- roam
- booster
- contagious
- scientist
- wednesdays
- shopper
- species
- tribes
- underpaid
- ambience
- texture
- enthralled
- mel
- presidents
- consultants
- persons
- sweaty
- speaker
- subsidy
- lies
- ano
- offenses
- housekeeper
- hottest
- firewheel
- salisbury
- hams
- locking
- prosecuting
- gettysburg
- arena
- openness
- duplex
- fords
- carburetor
- cap
- notch
- overlap
- dash
- vegetarians
- cleanliness
- vegan
- bodies
- utilize
- coo
- hens
- ballpark
- kicking
- getaway
- des
- vitelle
- a&m
- oriental
- yellowstone
- lion
- rio
- grande
- marble
- jealous
- ruins
- objecting
- fireman
- malicious
- compensation
- executing
- falsely
- statistic
- meanwhile
- storing
- internship
- cooper
- clinic
- cardiovascular
- rotate
- picturesque
- biggie
- killeen
- purebred
- virus
- affection
- caravan
- storage
- libber
- heated
- shrubbery
- supportive
- unacceptable
- appalled
- reimburse
- explorer
- middlekauff
- stiffer
- disneyland
- amusement
- solely
- lafayette
- allies
- liars
- masses
- majored
- discriminated
- valid
- lonely
- smile
- consists
- lisa
- floods
- historian
- societies
- eater
- rewiring
- praised
- openly
- logically
- nest
- pap
- supporter
- runner
- moth
- devastate
- mediocre
- excel
- insist
- halloween
- toning
- dramas
- shakespeare
- multimillionaire
- supervise
- imports
- inferior
- wallet
- dwell
- po
- iguana
- br
- twentieth
- assertive
- chewing
- freelance
- reputable
- avenues
- smoothly
- avenue
- classify
- spices
- tort
- riots
- methods
- textbook
- sprayed
- wiring
- busting
- minimal
- youngster
- manner
- fringe
- beeper
- pill
- spraying
- heavens
- splitting
- maturity
- cues
- nineteenth
- velcro
- cole
- codependency
- losses
- worlds
- representation
- roller
- maternal
- franchise
- bones
- quickie
- resorts
- inept
- tossed
- superior
- enthusiastic
- stripper
- eth
- shotguns
- vital
- mutual
- laura
- lotion
- accumulate
- dime
- unfinished
- toned
- treatments
- rust
- instruction
- productivity
- wherewithal
- indigent
- employ
- medicaid
- desperately
- equipped
- alto
- jerker
- christopher
- reeves
- climb
- mastercards
- beaver
- champions
- pines
- berries
- dutch
- shou
- cathedral
- constructed
- rainfall
- chased
- tossing
- peonies
- hardy
- divorces
- drank
- tan
- sunburn
- interfere
- fo
- custody
- bottoms
- guidance
- flew
- jar
- eisenhower
- bitter
- motivational
- presidency
- leaps
- noriega
- tunnel
- anger
- roger
- mis
- universe
- bargained
- interviewing
- potluck
- trump
- hyacinths
- purply
- mugged
- paroling
- int
- avon
- spectator
- deeply
- amou
- crepe
- pile
- toll
- dependable
- cavalier
- squish
- drinks
- census
- pell
- vienna
- waitresses
- ultra
- regency
- progressing
- retrievers
- prompt
- brisket
- reliability
- graveyard
- submit
- reception
- watercolor
- jan
- shanghai
- effected
- micro
- satisfying
- preston
- broiled
- violated
- appealed
- martha
- melodies
- speaks
- squad
- cutback
- texasville
- breathe
- homemakers
- dreyfuss
- spit
- presumed
- cra
- coordination
- irons
- perry
- stepmother
- ambulance
- deteriorated
- bunk
- flan
- vinegar
- pies
- happiest
- wheeling
- geriatric
- cockapoo
- rabbits
- ignored
- earnings
- pencil
- taller
- glorified
- sch
- eyre
- sung
- madam
- butterfly
- puccini
- canoeing
- receptive
- jackie
- gymnastics
- im
- steadily
- ronald
- brownwood
- temple
- substantial
- les
- broadway
- orthodontic
- verge
- orthopedic
- silverton
- drafter
- drawings
- unbiased
- equals
- secretarial
- overturned
- thelma
- louise
- tacky
- chipped
- sledding
- ambulatory
- reluctantly
- adequately
- cheryl
- hearty
- skim
- thai
- lunches
- molestation
- releasing
- sketch
- subscriptions
- upright
- paddle
- appliance
- tops
- pant
- gail
- centralized
- claus
- earns
- coit
- orchestras
- breasts
- chill
- punk
- '101'
- rebate
- perkins
- fluffy
- parker
- coppell
- bleeding
- pittosporum
- thumper
- carney
- trailers
- eager
- signature
- whoops
- discovery
- macaroni
- golfing
- superbowl
- tease
- includes
- desperate
- entitled
- dill
- suing
- semiautomatic
- cuddle
- legislate
- hubbard
- screams
- competitiveness
- mechanically
- jesuit
- duh
- haiti
- constituents
- ordering
- striped
- bonham
- donna
- du
- nist
- sheet
- sergeant
- rebuilt
- spy
- thorough
- fame
- hydrocarbons
- nitrogen
- ville
- manufacturers
- mats
- algebra
- glossy
- pathology
- towncar
- missions
- mat
- gut
- precaution
- kenosha
- pianos
- commissioners
- exemptions
- daytona
- holder
- gloss
- exploring
- hatchback
- abuses
- royalty
- rehearsals
- meg
- boise
- barbie
- radial
- lathe
- distributor
- parakeets
- chimney
- telecom
- bran
- piedmont
- howse
- duncanville
- admitted
- warriors
- marketplace
- dunn
- bradstreet
- vivaldi
- boutique
- decorative
- volume
- honeywell
- quicken
- strengthened
- quantity
- hinge
- cumbersome
- qua
- transport
- makings
- seal
- entitle
- opacity
- abouts
- forum
- ductwork
- shave
- interchange
- ber
- scruffy
- critic
- trivia
- sharon
- invitation
- astounded
- effectiveness
- insulted
- conspiracy
- paranoia
- surmise
- latches
- invading
- knocking
- ritual
- introducing
- click
- occurrences
- summed
- absenteeism
- errand
- discrimination
- improving
- uncertain
- suspicious
- detectors
- hammer
- royalties
- hideous
- militant
- objections
- absurd
- frampton
- performer
- eclectic
- listener
- ravi
- shankar
- spreadsheet
- dedication
- mardi
- gras
- straps
- convincing
- carl
- casually
- horrifying
- litigation
- retention
- dusty
- regulars
- texteller
- stripe
- tipped
- pastel
- pallet
- patent
- spin
- coul
- southbend
- variable
- intended
- workplace
- inputs
- toured
- reich
- genesis
- bottomed
- shoul
- devoted
- detriment
- manipulating
- softly
- alleged
- accuse
- exploiting
- cuba
- starve
- hun
- ashamed
- connery
- dwarf
- favors
- freer
- imposed
- demanded
- natives
- representative
- undoubtedly
- abou
- melting
- clinging
- quebec
- mountaineering
- implies
- fads
- institutes
- newsletter
- orientation
- meditation
- desks
- laborers
- keyed
- enc
- incorporated
- predominant
- intending
- trafficking
- aghast
- frito
- artistic
- kits
- pinks
- kit
- lilly
- greens
- stocking
- selections
- chapel
- percentile
- stabilized
- illegally
- errors
- nasa
- quaint
- mem
- supplemental
- applaud
- competitors
- generous
- repayment
- celebrated
- negatives
- ind
- privately
- brutal
- hoped
- slim
- administrating
- latter
- nickname
- customs
- defeating
- gadgets
- bluegrass
- pizzas
- anderson
- predominately
- standings
- moore
- pennant
- pirates
- appraised
- overpriced
- longevity
- satisfy
- resell
- editing
- availability
- prohibit
- janitors
- endurance
- mutually
- supervisory
- quotas
- swampers
- laborer
- happ
- mushrooms
- consisted
- terr
- siren
- alarms
- jamaican
- knitted
- granny
- moderate
- carpentry
- candle
- contributors
- ai
- comply
- helicopter
- sting
- nitrous
- chemist
- unseasonable
- ust
- nostalgic
- calligraphy
- tidbits
- mcgyver
- inventing
- baling
- washers
- junkyard
- portraying
- invented
- attempting
- innings
- ke
- weaned
- meows
- docile
- traumatic
- secretive
- daisy
- hype
- mimic
- predicting
- fictional
- swamp
- margin
- teasing
- crosses
- dang
- dumpster
- openings
- recycles
- imaginable
- folded
- straightened
- reminding
- settlement
- beaten
- ramifications
- margaret
- thatcher
- gandhi
- volcanos
- rhode
- residue
- pitted
- comeback
- nader
- volcano
- indicates
- previously
- regulatory
- arrows
- zoom
- calculate
- yugo
- pricing
- dos
- pastor
- sauces
- coleman
- sacramento
- backpacked
- undeveloped
- opposition
- negotiate
- factions
- refreshing
- reveal
- occupy
- responding
- tunes
- jigs
- instrumental
- mickey
- wills
- nickelodeon
- fl
- shenandoah
- flimsy
- programmers
- mentioning
- irritates
- aspen
- contel
- demonstrated
- surrogacy
- crass
- nurturing
- donation
- auction
- shelters
- bedridden
- gals
- '''am'
- factual
- nightly
- chancellor
- gaps
- newscaster
- excerpts
- rises
- choi
- assisted
- deteriorate
- sponsor
- caretakers
- supplemented
- possessions
- signing
- sectioned
- zones
- vikings
- hart
- educator
- beg
- initiative
- administrations
- maj
- sabbatical
- minuscule
- referring
- hourly
- gardened
- remotely
- shack
- broaden
- ivy
- couches
- careless
- anybo
- oreo
- twisted
- actresses
- kenny
- columbus
- disrupted
- mistrial
- chooses
- confession
- placing
- inception
- insure
- burglars
- jacques
- lewis
- chagrin
- ame
- preferably
- loudly
- epileptic
- aftermath
- snob
- broadened
- expectations
- swore
- amphetamines
- endangering
- hassles
- splotches
- scratching
- dread
- hardwood
- toothbrush
- proclaimed
- nicks
- breads
- chunks
- quart
- slender
- blender
- thickens
- thickened
- thicken
- cooling
- leaded
- endorse
- caprice
- converters
- arguable
- lit
- meteorological
- circulation
- lungs
- focal
- volkswagen
- pinned
- fulfilling
- obligations
- belonging
- wealthier
- adulthood
- functioning
- monster
- wandering
- ropes
- appreciation
- confess
- tolerances
- pete
- arnett
- sporadically
- impartial
- diversity
- affiliate
- cutesy
- beeped
- moody
- wonderfully
- vowed
- booklets
- recruit
- courthouse
- strangled
- testify
- neurotic
- crooked
- bracelet
- instructed
- whereabouts
- bracket
- koontz
- bachman
- letterman
- hologram
- pitches
- speculative
- deregulation
- teapot
- vaguely
- hoover
- pennies
- nickels
- investors
- holders
- asphalt
- charts
- kathy
- walkman
- simmons
- rapists
- manson
- repealed
- thousandth
- pac
- kingdoms
- ruler
- scriptural
- elses
- discernment
- walters
- wiley
- communists
- assaulted
- compensated
- medicines
- rude
- returns
- indebted
- deli
- strings
- crabgrass
- slimy
- tempered
- standby
- surgeon
- pruning
- undertaking
- irrigation
- leafy
- remain
- flowering
- chick
- lem
- humus
- barbe
- stoves
- flame
- grease
- tortillas
- turkeys
- smoked
- hickories
- spreadsheets
- specs
- montana
- hazards
- crash
- burlap
- coupon
- subtract
- compost
- branches
- heed
- staunch
- withstand
- buffers
- scuds
- provinces
- merely
- demilitarize
- confusing
- sucked
- incomprehensible
- disarm
- socialism
- boris
- nationality
- nut
- sabine
- consequence
- wade
- camps
- kingsley
- centennial
- canton
- dinky
- proclamation
- mason
- dixon
- seller
- avalon
- chilling
- wits
- characteristics
- tuberculosis
- wafer
- linear
- mismanaged
- outraged
- breyiana
- demos
- boggles
- contaminated
- refineries
- desires
- delaware
- caves
- fading
- anythi
- pantry
- crushers
- hallways
- casualties
- magnified
- tones
- questionable
- andy
- creatures
- extends
- fork
- spills
- degrading
- spark
- probab
- hints
- stereotypes
- romanticize
- thugs
- beaumont
- predictions
- barring
- substantially
- separates
- zealous
- farmhouse
- pumpkins
- planter
- creosote
- landlord
- brushing
- rose
- cantaloupes
- cubic
- wary
- youths
- hostilities
- judging
- burlington
- confronted
- slit
- divisions
- rash
- monterrey
- objective
- hamper
- grouper
- oysters
- tiring
- canals
- grabs
- grabbed
- dogfish
- antibiotics
- commuting
- deprived
- clinics
- infections
- enrolled
- rigid
- fined
- mills
- deceiving
- surroundings
- paths
- motive
- motivations
- upwards
- bundled
- doubling
- financed
- integrity
- benefitted
- perceive
- unfairness
- wiser
- segment
- vengeful
- pitifully
- massively
- respon
- represents
- speeches
- slapped
- inflammatory
- atrocious
- blitz
- zoning
- wholesaler
- turnovers
- argentine
- microwaves
- waxed
- flakes
- purplish
- cubes
- sherry
- argentinean
- sausages
- breaded
- publications
- thesis
- disgruntled
- cries
- replaces
- belongings
- roaches
- overhaul
- uniform
- discretionary
- emotion
- hence
- fines
- documentary
- dealings
- declaring
- dire
- squirrelly
- miscellaneous
- nd
- deposited
- scurried
- skaggs
- endangerment
- assumes
- endanger
- endangered
- accidentally
- suspicion
- continents
- ingrained
- confuse
- trans
- centimeter
- measurements
- peanut
- kindercares
- alphabet
- scold
- inappropriate
- trauma
- weath
- predictable
- inversions
- threesome
- novice
- rut
- yo
- delightful
- ferrari
- resembled
- satellite
- bathed
- jacuzzi
- wings
- fastest
- ant
- kitchens
- dented
- refresher
- kosher
- knishes
- mea
- unstable
- relevant
- americanized
- hugged
- scam
- apologize
- hug
- shiite
- poss
- wheth
- countrymen
- wom
- implementing
- decreasing
- finland
- selfishness
- benefited
- mil
- flunk
- canning
- zinc
- processed
- bogged
- distributed
- moderately
- companion
- organs
- sally
- petite
- isometrics
- ingestation
- plight
- surrounded
- directing
- coed
- subbing
- calculator
- behaved
- versatile
- applicable
- depot
- spackling
- creamy
- similarly
- formative
- contacting
- aptitude
- sounding
- upkeep
- cellar
- rents
- complexes
- nanny
- prefabs
- enou
- scoot
- emulate
- guru
- auditors
- packard
- matrix
- transparencies
- outdated
- advisor
- panhandle
- piling
- shredded
- pessimism
- racism
- destined
- fronts
- hippie
- texaco
- pennzoil
- miscarriage
- rational
- testimony
- testifying
- paralegal
- priors
- aggravate
- enlightened
- niceties
- flop
- horrified
- absence
- taxation
- flabbergasted
- gracious
- flops
- certificate
- explanation
- univer
- dustbuster
- plated
- bowls
- patty
- womb
- soothing
- repetitious
- wilder
- eleventh
- painless
- necessities
- harm
- magnolias
- raking
- underground
- grasses
- blend
- macneil
- jennings
- informative
- bureaus
- comics
- mourning
- lace
- weave
- lacy
- draping
- batting
- anticipating
- splurge
- deci
- typist
- damme
- bland
- widow
- dummies
- caan
- rescuers
- submarine
- studio
- survived
- einstein
- stepson
- literate
- honors
- lifesaver
- framing
- hindsight
- incidents
- outsiders
- jesse
- complains
- threatens
- entrepreneur
- achievement
- clue
- sights
- transplant
- glamorous
- uncontrollable
- constitute
- denial
- champlain
- resume
- technicians
- fad
- timid
- macon
- hous
- espec
- contacted
- liquor
- repairman
- popped
- radishes
- turnips
- loam
- intensive
- attachment
- pickles
- unfairly
- seasonings
- paralyzed
- spinal
- discrete
- seatbelt
- arrow
- reuse
- collects
- dorms
- perimeter
- orthopedist
- freak
- diane
- diver
- limping
- tights
- casts
- nautilus
- cushion
- singled
- tighter
- lonesome
- naw
- everyb
- imitate
- oscars
- booth
- demographic
- judgments
- texins
- crest
- demonstrator
- reps
- partying
- tracking
- perpetuate
- manpower
- coincide
- cl
- soreness
- nighttime
- evacuated
- winnebago
- benefiting
- incidence
- abundance
- creature
- aim
- shah
- felons
- unseasonably
- comparisons
- waning
- surviving
- diplomacy
- eliminating
- processes
- righteous
- filtered
- launch
- unmet
- strife
- ray
- blatant
- fax
- proactive
- buil
- treaty
- bully
- repay
- swallow
- evolve
- tug
- skewed
- intersection
- trampoline
- downs
- cy
- swept
- streak
- averages
- catches
- tigers
- strategy
- bayless
- advised
- brunt
- rooted
- dseg
- documentation
- floppy
- disks
- hus
- touchy
- linda
- rossa
- teen
- boo
- livingston
- seagull
- wro
- midland
- odessa
- practiced
- fur
- contra
- haunt
- resentment
- laughable
- arises
- browns
- topping
- toast
- mustard
- cucumber
- bonanza
- meta
- rearing
- robinson
- cylinders
- akeem
- dominate
- reselling
- jap
- wichita
- galen
- amrein
- snacks
- elephant
- transferring
- fare
- veterinarians
- wonders
- developer
- breathed
- limiting
- cookouts
- individuality
- frills
- fluctuates
- tastefully
- smashed
- organizing
- dare
- reform
- bri
- gate
- felonies
- ima
- racist
- gripe
- gar
- width
- spreader
- lightly
- freshly
- arthur
- waterfront
- movers
- frames
- enamel
- spun
- descendants
- favorable
- intervening
- advancing
- frightened
- revolting
- upsetting
- acquired
- creeps
- kitten
- teacup
- frustrates
- cheaply
- brunch
- crook
- mock
- primaries
- workday
- chows
- guinea
- harming
- bellies
- rubbed
- terrified
- louder
- lid
- collie
- mechanism
- inspected
- cheated
- fingernails
- uninformed
- disinterested
- honduran
- rica
- tourism
- enabled
- policies
- engrossed
- virgo
- elder
- ricans
- rican
- loaner
- revival
- christianity
- revered
- pyramid
- birthdays
- disciplinarian
- nutri
- stairs
- elevator
- powerhouse
- alway
- rehearse
- patriots
- photo
- guards
- congested
- incarcerating
- foreground
- snatched
- astro
- minivan
- subaru
- ticking
- rack
- upgrade
- retail
- campgrounds
- bearable
- dipper
- addict
- sportsmanship
- describes
- strasbourg
- missile
- bounce
- goll
- humiliating
- chauffeur
- valet
- condemning
- airs
- tithe
- blessings
- foley
- croak
- critters
- turkish
- himalayan
- patches
- paws
- lanky
- hillside
- communicating
- swam
- supervision
- stephanie
- keel
- tuba
- nerves
- turntable
- dual
- processor
- edit
- layout
- preventing
- overloaded
- mentions
- sevren
- montgomery
- piddly
- compressor
- prelude
- impractical
- wharf
- colts
- seahawks
- winners
- champs
- expansion
- attendance
- kites
- strangers
- tasting
- arrangement
- rewards
- interfering
- inhumane
- overtaken
- underwater
- intention
- philippines
- tag
- quarterly
- incentives
- justification
- sorting
- insurmountable
- forestry
- trails
- emphasized
- obtain
- cubicles
- advent
- op
- accurately
- orchids
- dodgers
- brat
- petrified
- circular
- terrifies
- niece
- laughs
- exc
- negate
- rejected
- lawlessness
- founded
- crippled
- perpetrators
- breath
- intake
- valleys
- pencils
- abreast
- ethics
- scandalous
- churchill
- dickens
- withstood
- mindless
- pi
- sincerely
- whew
- spreading
- petersburg
- finest
- southwestern
- cincinnati
- roaring
- perpetual
- lhasa
- scuba
- pampered
- dinosaur
- fires
- ventured
- dooming
- plunked
- cooperated
- adjusting
- decades
- valued
- downstream
- lure
- bumble
- wasp
- squirrels
- popularity
- isolation
- disciplining
- spank
- isolate
- handicraft
- dough
- ornaments
- empties
- posted
- ruining
- kurdish
- roseanne
- matthew
- brando
- levinson
- follower
- marino
- keystone
- cunningham
- tactics
- granada
- cuban
- salinas
- terrorist
- buried
- hyundee
- helicopters
- stepper
- pillow
- staring
- aqua
- blisters
- rubber
- trashed
- dwindling
- cooker
- cherry
- blackening
- gumbo
- portuguese
- ribs
- ya
- jumbo
- initiatives
- revolt
- obliged
- argues
- constrained
- fools
- indoctrinated
- millimeters
- fractions
- fittings
- wrench
- header
- screws
- progressively
- pullover
- smokes
- sw
- othe
- designer
- foolish
- puzzled
- warned
- cab
- tractor
- sixes
- diesels
- injector
- asylum
- governmental
- antiwar
- translated
- soapbox
- usable
- antimetric
- sweden
- midnight
- plains
- collapsible
- helper
- motivator
- huff
- phenomena
- temper
- miami
- cyclical
- oilers
- stallworth
- swan
- oppose
- decisive
- wrath
- constituency
- nuggets
- meatless
- ingredients
- hostess
- soybeans
- proteins
- belton
- pennsyl
- lsats
- als
- sev
- abcs
- especiall
- affordable
- carpools
- symbolic
- scenario
- gunfire
- outlaw
- abiding
- restrictive
- concealed
- sp
- deterrence
- weighed
- objection
- misusing
- impose
- crackdown
- dawn
- liners
- gerbils
- mutts
- counted
- eel
- tiniest
- debated
- symptom
- furnish
- nonsense
- handicrafts
- awarding
- topsy
- turvy
- worldly
- sparked
- reg
- flours
- dublin
- bulldozers
- overflow
- posters
- chained
- tabby
- rampant
- girlfriends
- inadequate
- '8088'
- monitors
- respectable
- secondly
- binary
- calibrated
- qualification
- brackets
- rescue
- passport
- mou
- alcoholics
- returning
- laurie
- clout
- grilled
- buffets
- brunches
- woodland
- colo
- prix
- seagal
- starred
- premise
- preoccupation
- belly
- millimeter
- darndest
- assembled
- hauled
- fertilizers
- prohibited
- facets
- denied
- loaf
- dawned
- boulders
- marbles
- duck
- shish
- odor
- boneless
- scrambled
- armenian
- consume
- punishing
- devil
- suffered
- agreeing
- enforcing
- burglaries
- rationalize
- busiest
- airy
- wires
- compartment
- soldered
- restrain
- overeat
- pastas
- minerals
- accepts
- supplements
- toledo
- oriole
- steeper
- moines
- bleachers
- collapsed
- herbs
- sill
- appleseed
- pecans
- wes
- enterprise
- bulletin
- electrician
- terminology
- gaithersburg
- valedictorian
- pushy
- seemingly
- rockies
- carries
- yells
- breezed
- solicit
- coworkers
- alright
- humans
- bust
- holdup
- underst
- convicting
- restoring
- ankles
- landscaped
- sal
- continuance
- pensions
- allergy
- baxter
- ceo
- homa
- rallies
- anaerobic
- improves
- ls
- adverse
- hunk
- pulse
- resting
- mirrored
- fireplace
- tucked
- condos
- abandon
- dennis
- distributing
- refuses
- glove
- pricey
- passenger
- lowered
- questioning
- dummy
- mans
- occupations
- norma
- techniques
- karen
- spotted
- incompetent
- exper
- priest
- kindergartners
- conform
- creativity
- manners
- mannerisms
- establishment
- norfork
- farthest
- charleston
- hairs
- follicles
- rehab
- fro
- weddings
- graduation
- med
- saudis
- thieves
- chaos
- promotion
- unconditional
- offspring
- quotes
- dumps
- bluebonnets
- absorb
- es
- flash
- medina
- salty
- beirut
- penalized
- lining
- faucets
- repainting
- arrange
- tripping
- ingest
- ingesting
- arteries
- reacts
- framers
- framed
- viable
- supports
- viewpoints
- delay
- nevertheless
- allocation
- infrastructure
- expended
- restock
- twen
- spider
- marigolds
- impatiens
- replacement
- teased
- bacillus
- gypsy
- toddlers
- recommendations
- skits
- attachments
- slacked
- contributed
- bombarded
- mrs
- cleaver
- senses
- romantic
- illiterate
- paced
- ridged
- totaled
- hesitate
- technologies
- stacked
- renters
- counties
- citibank
- scams
- swayze
- clyde
- drummer
- scratched
- demographics
- companionship
- dependency
- everyth
- prospective
- pairs
- unsupervised
- morton
- lu
- offended
- drinker
- measures
- lions
- arapaho
- drool
- yuppie
- cheat
- reinforced
- fashion
- defrosting
- pilaf
- mixing
- mushy
- korean
- auxiliary
- curriculums
- kathleen
- accordingly
- residency
- sportswise
- blitzer
- fanny
- treadmills
- cinema
- dripping
- shorted
- enlarge
- valves
- shingle
- fixtures
- detached
- stigma
- pioneers
- households
- beepers
- bulky
- vibrates
- hepatitis
- freed
- expectation
- boyfriends
- homeowners
- existence
- anguish
- charming
- weathered
- leveled
- wallpapered
- conserving
- diagnosed
- inspiration
- alerted
- swimmers
- extracurricular
- loser
- sats
- barber
- verses
- robber
- dachshunds
- spaniels
- anthropology
- presses
- clerical
- forthcoming
- homecoming
- famil
- familiarized
- virgin
- qui
- divine
- skates
- cot
- shove
- nannies
- objectivity
- digressing
- ordinarily
- weirder
- revolved
- hatchery
- intimate
- calendars
- decoration
- passage
- continuity
- percentages
- cavaliers
- ewing
- highlights
- patience
- bethesda
- beijing
- pooling
- restful
- pends
- dells
- starring
- rage
- terminator
- twists
- treble
- mackerel
- pike
- stung
- fleetwood
- displayed
- freaks
- backs
- buicks
- convertible
- vintage
- setter
- feathers
- conducted
- ethically
- patrol
- kidnapped
- pun
- exceedingly
- albany
- syracuse
- rapist
- investigation
- pamper
- waits
- assistantship
- newlyweds
- hopping
- annually
- journals
- figurines
- sanded
- 4h
- refinish
- hormones
- lip
- fender
- sparingly
- lime
- sands
- upscale
- gum
- rips
- shreds
- sponge
- mate
- averaged
- harvard
- successfully
- approaching
- nutrition
- conductor
- cringe
- mcneil
- criticism
- palo
- columns
- candles
- psycho
- deadly
- uneasy
- robocop
- molly
- savage
- resented
- retrospect
- juggling
- density
- crucial
- oft
- lame
- assaulting
- pleading
- psychiatrist
- psychiatrists
- psychotics
- assaults
- sponsors
- rainier
- snowy
- immune
- tawakoni
- cones
- fearless
- enclosed
- roofs
- sizes
- cei
- furnace
- ambitious
- poking
- fountains
- latitude
- underpass
- hiding
- petals
- slows
- oscar
- durant
- alo
- notorious
- settles
- smoker
- sponsored
- educations
- ele
- approached
- proponent
- thus
- endeavor
- wri
- fingerprints
- slipped
- fingerprinted
- astounding
- intervals
- contracted
- dea
- imm
- soaking
- visitors
- rug
- daddies
- conformist
- revolutionary
- kramer
- celebration
- feeder
- nets
- minnow
- burping
- purina
- parade
- compound
- pursuit
- refuted
- refute
- turnouts
- vi
- relates
- regain
- moats
- staubach
- encountered
- unrealistic
- landon
- portrayed
- josey
- clint
- jot
- baptist
- reflection
- damages
- shortage
- clerks
- doubled
- smallest
- pavilion
- fuses
- alter
- sensing
- bandit
- theatres
- ellison
- activist
- photographs
- hyacinth
- hollies
- spike
- perennial
- gomphrena
- repeating
- minimize
- ornamental
- happiness
- acquire
- congratulations
- simpler
- circles
- wham
- forgiving
- detrimental
- immature
- maple
- myrtles
- screwing
- disguise
- formatting
- paragraph
- voyager
- crank
- pepsi
- mcmahon
- racking
- recharged
- seabrook
- nucleus
- billed
- mints
- adaptation
- crown
- lunchtime
- celebrate
- incident
- shreveport
- limbo
- diaper
- chassis
- bent
- soapies
- bichon
- frise
- personable
- rin
- tervurien
- latchkey
- considerations
- sunroom
- rambler
- sandstone
- beltway
- adored
- surrendering
- cooperate
- allah
- sakes
- stirring
- pineapple
- oatmeal
- casseroles
- bronze
- catherine
- nissans
- escort
- trusted
- insurances
- provider
- postal
- recourse
- invades
- complained
- susceptible
- newhart
- comedians
- contrary
- bart
- simpson
- morocco
- continent
- ripping
- photos
- reef
- melbourne
- squirrel
- agents
- hockey
- christi
- diverted
- pea
- fiasco
- liver
- caution
- expediency
- misplaced
- technicalities
- technicality
- ruffle
- conducive
- sandwiches
- vendors
- pins
- ligaments
- beethoven
- mozart
- softer
- banned
- regime
- liberalization
- civics
- dart
- wasteful
- wounded
- mcmurtry
- trashy
- grou
- grouchy
- projectionist
- subtitles
- intuitive
- footnotes
- footnote
- operator
- lands
- appetizers
- premed
- specialize
- matinee
- cocoon
- alien
- maintained
- sharif
- oddly
- exceed
- incapacitated
- images
- dangerfield
- stacking
- leftovers
- catering
- scooped
- amelia
- anyth
- wolfe
- myths
- haggard
- phonetics
- relearning
- wheelers
- transaction
- checkup
- reserves
- cranky
- measuring
- coating
- cognitive
- jour
- austen
- reviewed
- attracts
- grandchild
- congealed
- soprano
- canoed
- cancun
- bummer
- teenaged
- manhood
- ostracized
- liken
- pear
- daytimes
- ransom
- sightseeing
- gubernatorial
- robb
- receipts
- gambling
- sedentary
- tortilla
- picante
- grated
- jell
- timely
- subjected
- athletics
- bathe
- commercially
- accordion
- miserables
- milkman
- travis
- phantom
- lloyd
- listens
- illnesses
- diligent
- invaluable
- scotland
- jaw
- periodically
- durango
- jeep
- destin
- jetty
- draftsman
- roman
- recognizes
- regarded
- mediation
- crises
- bystander
- awe
- prac
- gannan
- valerie
- addicts
- sayings
- possi
- restrooms
- festival
- alpine
- uneven
- sleds
- knob
- mows
- mulched
- presbyterian
- willingly
- littler
- strategies
- rapport
- walnut
- impersonal
- hack
- cheerful
- emily
- dell
- preschools
- pediatrician
- dane
- tangent
- backfire
- ethiopian
- venison
- fries
- waitress
- waiter
- attentive
- adventuresome
- heyday
- bernie
- dra
- assortment
- piled
- veal
- evident
- unleaded
- ambivalent
- clothe
- rehabilitating
- confessed
- amendment
- xeros
- quartet
- technique
- carols
- mechanisms
- decompose
- murray
- sorted
- dimes
- crusher
- renewed
- prostate
- antigen
- fourths
- smells
- spinner
- baits
- fisherwoman
- imitation
- sticker
- sn
- pantsuit
- pantsuits
- enthusiasm
- begging
- fitting
- harold
- taft
- milder
- gimmicks
- hemorrhaging
- mennonite
- sealer
- premier
- landed
- suites
- invalid
- invalids
- labels
- frugal
- substituted
- legacy
- reside
- partial
- yuck
- balloting
- sibling
- colds
- discontinued
- primitive
- tulips
- hazard
- codes
- zenith
- ques
- slides
- purity
- richie
- bushel
- wines
- napa
- ronnie
- whittle
- satire
- monotonous
- menus
- frankenstein
- blazing
- saddles
- grants
- hitler
- paintings
- specimen
- fussing
- presume
- pollu
- decorate
- kindergartner
- arguably
- cradle
- grave
- fluff
- swings
- queens
- beltline
- thrus
- aerosol
- corny
- fridays
- camry
- elway
- moneys
- exponentially
- crawls
- grieve
- greg
- foresee
- uninsured
- noses
- rudman
- accountability
- proportionally
- gruesome
- couscous
- repercussions
- wimpy
- shortened
- befitting
- nece
- asset
- flushed
- dressy
- slack
- sl
- tro
- bidness
- apiece
- smokeys
- sur
- outlawed
- legislating
- creating
- activated
- steinbeck
- grizzly
- encounters
- doubting
- doug
- ranked
- sierras
- rai
- tempe
- yelling
- explored
- bogey
- burgled
- plop
- pee
- ay
- handyman
- tighten
- loopholes
- withhold
- advantageous
- bueno
- librarian
- coma
- seasick
- minnows
- seas
- fore
- calico
- yaupon
- labrador
- wax
- scalp
- salsa
- hidden
- continuously
- hibiscus
- wetter
- mitsubishi
- '90210'
- nicole
- matlock
- charlene
- beverly
- shred
- pierre
- recognizing
- cinematography
- invasions
- premises
- '911'
- sitcoms
- misbehaving
- faces
- censor
- morality
- jumps
- finite
- infinite
- whining
- panels
- resurfaced
- cimarron
- jeopardizing
- retirees
- ladder
- investigative
- catastrophes
- existed
- halogen
- sulfur
- combustion
- hitch
- moynihan
- skillman
- lynch
- chil
- amnesty
- abstinence
- crayon
- detest
- ph
- allante
- peppy
- saddle
- inca
- dub
- regiment
- twisters
- toe
- prone
- adjustable
- conspired
- premiums
- reasonableness
- parkland
- losers
- witt
- greave
- wins
- dilemma
- reallowed
- implement
- unsmashed
- crazies
- fabricating
- sampling
- steele
- youn
- upsets
- magnetic
- resonance
- sober
- molesting
- boar
- constraints
- betcha
- severity
- entitlements
- reductions
- defaults
- blackman
- manned
- dealerships
- purrs
- feeders
- frontier
- jetsons
- nearest
- trough
- sli
- howatch
- birmingham
- disregard
- darned
- greenery
- tahoe
- skidding
- surveyors
- tracer
- '486'
- measles
- crunch
- burger
- cameroon
- scoutmaster
- sitcom
- seato
- colony
- nato
- disbanded
- arrive
- uncooked
- overdone
- yummy
- bendix
- pontiacs
- hattiesburg
- bir
- boa
- constrictor
- parrot
- overspending
- coughing
- julio
- misuse
- sniff
- milan
- anchoring
- tedious
- stragglers
- tobogganing
- baggy
- reduction
- hewett
- scaffolds
- excessive
- rep
- disappoints
- nairobi
- safari
- wesley
- hospice
- theoretically
- mishap
- electoral
- stew
- hardaway
- dioxide
- vapor
- aye
- pickings
- legitimately
- sails
- bisquick
- lopsided
- boarding
- freezers
- genealogy
- stash
- proliferates
- brokers
- patterson
- subsidized
- amway
- nonpolluting
- bicycles
- bullheads
- nikki
- jig
- stroll
- ogden
- puzzles
- combo
- airless
- scroll
- dolphin
- torpedo
- malamute
- trillion
- ludicrous
- payers
- column
- dumbbells
- controllers
- harrisville
- specialties
- virtue
- accrued
- transfusion
- refund
- pup
- patron
- parenthesis
- earmarked
- greatful
- striper
- senegalese
- perks
- parkinson
- industrialized
- truer
- dispose
- mega
- tonnage
- scrubber
- ammonia
- compounds
- acids
- thickness
- pronto
- finalization
- utmost
- cognizitive
- scarves
- uns
- unseasonal
- sleeves
- sweatpants
- corduroy
- compliments
- skorts
- nominated
- dud
- recurring
- fami
- overreact
- terror
- cohill
- cohi
- drivel
- eldon
- housepainter
- extracts
- overtly
- uncontrolled
- pirated
- ominous
- thief
- westerner
- lunatic
- violate
- socia
- jehovah
- mormons
- intrusive
- solicited
- invasive
- soli
- intruded
- defining
- surmised
- incorrect
- unsolicited
- nonsol
- unconscious
- cli
- sequence
- peddling
- harassment
- generated
- lois
- intimidating
- rver
- greeting
- stake
- mitzi
- yip
- ranging
- soaked
- rhyme
- ruckus
- parallels
- cov
- hooker
- absolu
- phenomenon
- brazilian
- listenable
- elec
- acoustic
- interchangeably
- folk
- arranger
- sitar
- muted
- existing
- tally
- slush
- stocks
- expired
- pleasures
- albridge
- slogans
- outlooks
- haggerty
- spookier
- pecially
- airways
- focusing
- taj
- mahals
- prolongs
- whim
- deserved
- prevents
- mopping
- odds
- unair
- facial
- beards
- skids
- repack
- buttoned
- starched
- suspenders
- reorganization
- cruddy
- reall
- notre
- dame
- explosion
- untypically
- accumulation
- flatlands
- zeppelin
- floyd
- brash
- bump
- bohemian
- rhapsody
- pumped
- siskel
- ebert
- thumbs
- travolta
- quee
- tokens
- divi
- showbiz
- admission
- scyene
- inexpensively
- sao
- paulo
- usefulness
- spheres
- spaniards
- rulers
- conquistadors
- socialistic
- horribly
- dishonor
- defenses
- sabotaged
- peasant
- exploitation
- exerts
- export
- broadcasting
- ruddy
- minist
- wr
- ler
- interpretations
- histories
- copes
- indicate
- resident
- fledged
- barefoot
- pejorative
- unrest
- citizenry
- ignorance
- ult
- constitutionally
- creole
- prohibitions
- strengths
- cuisines
- throes
- reassess
- functionally
- fractiousness
- faddish
- wellness
- biweekly
- dispensed
- distinctions
- dev
- fizzled
- acupuncture
- gestalt
- irony
- cert
- vigorous
- carbohydrates
- kinesiology
- calc
- calculated
- calisthenics
- myerson
- frantic
- astonishing
- mortars
- formulated
- sociopathic
- pronounced
- unfit
- mouthed
- transcribing
- customized
- anne
- glenn
- improvise
- concentrates
- password
- verbal
- rowing
- lution
- rower
- transforms
- markov
- naval
- postgraduate
- civilians
- mainline
- respondent
- unders
- allergist
- smorgasbord
- compensatory
- profile
- bonds
- deducting
- disproportionate
- brutally
- commuted
- delays
- electrocution
- determent
- deter
- dubious
- internally
- organiz
- coordinating
- scandals
- kisha
- knight
- pullman
- exacerbate
- clutches
- pads
- benz
- absorbed
- keyboards
- spaghettis
- lasagnas
- hor
- horseback
- dabbled
- banjo
- druther
- stre
- farts
- polly
- followers
- inspir
- booths
- commutiv
- billboards
- bartman
- simpsons
- debbie
- nigh
- appraisers
- onward
- ease
- folds
- performs
- tenured
- microcomputer
- comprehensive
- rigamarole
- teachable
- specially
- spicier
- tofu
- pistachios
- pistachio
- bumped
- curried
- saute
- gigs
- perse
- ow
- conventions
- slippers
- teller
- alterations
- utilitarian
- knickknacks
- sconces
- jalapeno
- almanac
- concluding
- warms
- shutting
- piloting
- spectacle
- lobbyist
- legislators
- individ
- unbelieving
- justifiable
- nucle
- kilowatt
- washes
- stinging
- swelter
- lively
- eureka
- rentals
- inspires
- glider
- welder
- treks
- '747'
- mindlessly
- pacifier
- reme
- destructed
- milton
- berle
- stepchild
- tumultuous
- regions
- siberia
- oppression
- attentions
- hopely
- catchers
- gladly
- unheard
- babe
- ruth
- thru
- lovingest
- cosmo
- pellet
- tod
- lovey
- dovey
- kneading
- trimming
- bonzo
- poindexter
- felix
- tortoise
- possessive
- bedtime
- rendering
- jessica
- tandy
- warmth
- manhunt
- manhunter
- dysfunction
- slay
- toothpicks
- outwardly
- awfulness
- wonderfulness
- lapses
- telecommunications
- profits
- waivers
- earners
- physicals
- subsist
- lodges
- moss
- footing
- alumi
- defrays
- defray
- unfold
- walmart
- discourages
- catatonic
- discovers
- buzzards
- pal
- imagined
- slaughter
- earthquakes
- robby
- graze
- indira
- observed
- attleboro
- freeways
- jets
- swinging
- kerosene
- eah
- boilerhouse
- powerhouses
- belch
- kodak
- smokestack
- phosphorous
- grenades
- photograph
- overstated
- environmentalists
- claiming
- automakers
- soot
- particulate
- meter
- tailpipe
- devise
- mufflers
- resumes
- graph
- erased
- simplified
- anduille
- doughnuts
- cobbler
- fudge
- fiber
- sloughs
- rafting
- potty
- packs
- noth
- outfitter
- headwaters
- damper
- hostage
- rhetoric
- rolm
- engi
- sheer
- estimated
- doctrine
- turks
- cheering
- reconcile
- divisive
- unprecedented
- authorize
- frontal
- sununu
- commend
- scud
- lefty
- frizzell
- galway
- harpist
- bagpipes
- whistle
- violins
- instrumentals
- rooney
- dancer
- entertainer
- eddy
- smiley
- burnette
- raspy
- playboys
- ernest
- tubbs
- rector
- scratchy
- opry
- stadler
- autry
- anymo
- vegetate
- fri
- relly
- complication
- eith
- demolishing
- stereos
- annoy
- troubleshooting
- initials
- conversed
- sexes
- consist
- childbearing
- storly
- var
- biological
- urges
- encumbered
- heirs
- characterized
- acquaintances
- terming
- emerging
- marathon
- idear
- discrepancies
- overview
- encapsulated
- introductory
- glamour
- updated
- airspace
- huntley
- analyst
- paragraphs
- noontime
- dose
- spee
- fastened
- wander
- aides
- debilitated
- arboretum
- maid
- tackles
- spinning
- irvin
- overwork
- reinjuring
- scab
- revamped
- metcalf
- smuggled
- investigated
- rehi
- renamed
- psychologists
- ration
- modalities
- learner
- kinesthetic
- gladewater
- baccalaureate
- unle
- commentator
- golsome
- superintendent
- adminis
- scarce
- overachievers
- overachiever
- beeps
- expre
- phoe
- easiest
- horizons
- hurtling
- brothers'
- clips
- madly
- fetish
- luring
- costuming
- remarked
- thriller
- distinguished
- terrorized
- branching
- vito
- flicks
- bawled
- toughest
- venue
- disrup
- sequestered
- entrapment
- displeasure
- waive
- bungling
- caricature
- bloodless
- comic
- functions
- thrash
- fixes
- climactic
- joseph
- reborn
- targeted
- hypercritical
- fart
- gags
- slapsti
- funniness
- gag
- retreading
- tec
- preemployment
- brazen
- wisened
- ventilated
- motorola
- tack
- orangish
- feat
- brighter
- coloring
- haphazard
- baseboards
- edger
- granary
- stocked
- formulas
- perfectionist
- tasks
- freehand
- gratin
- banana
- dissipate
- thickening
- globs
- rubbery
- blenders
- cools
- favoring
- nestle
- quik
- groedy
- whisk
- beater
- melon
- baler
- cond
- octane
- generating
- volt
- v8s
- repellent
- erupted
- meteorologists
- chernobyl
- tracers
- smoky
- array
- fiero
- undisciplined
- jacuzzis
- abdominals
- thighs
- mattered
- alienated
- suffocating
- choke
- differing
- grads
- quirks
- academies
- cadets
- espouse
- anglo
- saxon
- inveterate
- switcher
- dave
- wylie
- pumping
- weatherman
- hansen
- gordon
- lightfoot
- winston
- headphones
- toweling
- investigator
- tailing
- socialite
- extradited
- levy
- uplifting
- interpreting
- jur
- gui
- overcrowd
- connects
- businessmen
- sente
- penned
- duff
- penal
- beca
- litigating
- respo
- spiritually
- begats
- durn
- kratz
- kranz
- hedges
- nathaniel
- hawthorne
- storybooks
- woe
- glossary
- krantz
- twilight
- bogused
- fuck
- dares
- hangover
- sarcastic
- fishbone
- spirited
- venezuela
- avalanche
- gobs
- inflated
- beneath
- captures
- resulting
- risky
- contain
- vague
- guaranty
- guarantees
- guaranties
- disasters
- vulnerability
- regul
- workup
- incline
- unjust
- revoke
- reverked
- revoked
- vengeance
- sayeth
- mao
- tse
- chung
- temples
- unified
- humbly
- sovereignly
- rebuke
- ager
- preface
- admonition
- agrarian
- commander
- conceal
- napalm
- gro
- clayton
- uproots
- residents
- deba
- servant
- repaid
- granddaddy
- dodger
- militia
- bologna
- alleviating
- afresh
- lifestyles
- cabbages
- broccolis
- insecticides
- dandelion
- roly
- poly
- slug
- dragons
- sockets
- alkaline
- stem
- peaches
- silt
- shrivels
- mes
- cottonwoods
- irr
- smartest
- gardenias
- revitalizing
- mayb
- chopping
- blasted
- hybrid
- editions
- spruce
- dips
- dipping
- arabic
- pita
- eggplant
- marinating
- hickory
- clones
- mach
- databases
- searches
- deleting
- pieced
- bypass
- monochrome
- enthusiasts
- nathan
- swollen
- manuscripts
- composts
- nurserymen
- goop
- doorknob
- compress
- mugs
- expressions
- ungodly
- expansionism
- nationalistic
- succ
- origins
- angolan
- sinai
- warsaw
- militory
- indu
- chan
- clobber
- conquered
- autonomists
- shortages
- bulgaria
- czechoslovakia
- placate
- alienate
- emancipated
- slaves
- emancipate
- supplied
- battleground
- val
- verde
- briefcase
- bookcase
- armageddon
- grove
- imposing
- yoakum
- trilogy
- terrifying
- '''brien'
- crappy
- jakes
- compendium
- lobbying
- emancimation
- afterthought
- luted
- honorary
- isaac
- asimov
- robot
- developmental
- blockbuster
- mist
- dune
- freeman
- debating
- suave
- charac
- egalitarian
- scripture
- disciples
- wafers
- contradict
- buyers
- elma
- sheds
- pasadena
- refinery
- phoenixville
- grumble
- northwestern
- piped
- almetco
- pantr
- deanne
- multipurpose
- vide
- launched
- groupings
- gentlem
- dyke
- griffith
- idn
- brave
- shallows
- gig
- naughty
- murky
- spectrums
- abso
- feldon
- madonna
- lamar
- gators
- sneaky
- buckner
- stadiums
- cornell
- redwings
- peewee
- crude
- tilled
- screeching
- acorn
- scents
- pollinate
- yield
- tiered
- shrub
- locus
- thorns
- pollination
- pollinated
- littleton
- trucked
- shovel
- pressurized
- chainsaw
- dusk
- unfeeling
- spreads
- datsun
- ku
- klux
- klan
- incumbents
- larou
- larouche
- chord
- mayport
- brim
- snagging
- owl
- baiting
- oyster
- cracker
- trophies
- rockport
- netted
- ugliest
- archaic
- dots
- croaking
- croaker
- friendships
- copayment
- seclor
- exemplary
- snatch
- impressions
- inspections
- yellowish
- misty
- emphysema
- isolating
- biker
- vowel
- lint
- phrase
- cub
- smash
- conv
- ding
- dongs
- guathier
- eliminates
- briberies
- sidedness
- lengthy
- judo
- hoc
- deltaing
- disagreement
- wapner
- judean
- vibrant
- undoable
- semitic
- predetermined
- wandered
- defeated
- astaire
- sto
- plank
- poultry
- empenadas
- eu
- scallions
- sesa
- slivers
- overcook
- dashes
- ketchup
- bishu
- meats
- empanadas
- bun
- niokes
- requi
- bah
- humbug
- fives
- phony
- interdisciplinary
- dispelled
- grating
- reputations
- impaired
- institutional
- quiche
- growls
- overrun
- hussy
- settlements
- poll
- tiddlywinks
- volumes
- ignorant
- ironsides
- affixing
- chart
- commingle
- confusion
- issuer
- conven
- shucks
- profitability
- shifted
- itemized
- alpha
- beta
- accusation
- linemen
- rotation
- thereafter
- proves
- encouragement
- chemists
- overinflate
- southward
- nonconventional
- warheads
- parallel
- resolves
- negotiations
- inhabiting
- lith
- neutral
- crazier
- libya
- treaties
- overthrow
- survives
- inhabitants
- dancers
- outweigh
- wayward
- attained
- sharpness
- acuity
- disorient
- decimeter
- superpowers
- toddler
- indoctrinate
- understa
- skipping
- lows
- chillier
- handicappers
- mosey
- twosome
- mellowed
- doubles
- rationalizing
- purged
- goofed
- nastier
- cashed
- burgeoning
- metropolis
- carey
- thes
- intern
- sanger
- harris
- lifelong
- thunderbird
- citation
- mazaratti
- conceive
- degray
- stutters
- antennas
- roadside
- cords
- heaters
- hookups
- sopping
- dialect
- hums
- nuns
- trin
- shun
- hospitalized
- pumps
- stimul
- flipper
- retraining
- stagnant
- sores
- golan
- kishkes
- matzi
- goyim
- pocketful
- heston
- commandments
- grips
- muslim
- religions
- sects
- protestants
- lennon
- zionist
- nosed
- tampa
- scariest
- coincidently
- lox
- generic
- predates
- jihads
- toge
- secretly
- unity
- revert
- baltics
- forcibly
- impossibility
- insightful
- prays
- dissimilar
- forefathers
- esc
- disseminated
- giv
- postpones
- juniors
- disgust
- centeredness
- inability
- multicultural
- multiracial
- psychologist
- refers
- preoccupied
- infor
- cults
- motorbike
- maureen
- solomon
- eastland
- farmed
- millennium
- hopeless
- ideology
- eden
- distributorship
- supplier
- dirkson
- extansion
- dirk
- pearson
- embarked
- isometric
- chlorination
- firsthand
- detectives
- hunky
- dory
- gi
- barbados
- colleagues
- covert
- suburbia
- roasted
- goat
- hating
- stunts
- bending
- alleviates
- indicative
- handcuffed
- elem
- escalated
- bett
- reemphasis
- rote
- spitted
- memorizer
- wiping
- mennonites
- electronically
- determines
- sherwin
- molding
- bled
- spackle
- lighting
- nerdy
- garfunkel
- fascination
- innate
- supp
- manilow
- badness
- behinds
- pajamas
- yardage
- enclose
- fanatically
- subcontract
- ducts
- materialistic
- dwelling
- necess
- branched
- dishwasher
- inventions
- trashing
- diskette
- ordeal
- configured
- prestigious
- innova
- innovation
- audits
- pry
- peripherals
- lance
- restraints
- thermal
- razzle
- dazzle
- flats
- clairon
- rath
- educa
- feast
- waking
- tentatively
- receptacle
- raisers
- distribute
- disposables
- incremental
- fiery
- luther
- galvanized
- bashing
- environmentalist
- respons
- glow
- wartime
- overlook
- affirmative
- junkyards
- testimonies
- defendants
- legalistic
- achieving
- likelihood
- tilted
- sleaze
- protects
- choreographed
- patents
- antic
- repeater
- vendetta
- observing
- proceedings
- weightless
- effortless
- sweatless
- surveys
- adjusters
- expressed
- meningitis
- fetal
- terminated
- termination
- codependents
- goddess
- observations
- firemen
- overtones
- astonished
- phys
- cokes
- sternness
- forbi
- expressways
- patricia
- handlebars
- rewarded
- dubbed
- booger
- diamonds
- numbered
- redeem
- attache
- suitcases
- lamps
- wheelbarrows
- mixer
- toaster
- waffle
- clocks
- candlesticks
- aloud
- fussy
- babbly
- druthers
- rockville
- ballady
- abortions
- pregnancies
- handing
- landscapers
- replant
- alleys
- cultivate
- replenished
- subside
- prune
- hosted
- correspondents
- translating
- masks
- typeface
- piddley
- braunsfel
- unread
- skimming
- imperialism
- reasserting
- hangings
- needlepointed
- outlined
- intricate
- geometric
- upholster
- stiffened
- streamers
- stiffener
- quilted
- stamp
- foresaw
- refrain
- expedite
- franc
- francs
- diem
- consternation
- godfrey
- goodies
- prin
- perforated
- metrics
- typos
- retyping
- retypes
- encyclopedia
- prints
- limi
- clone
- bleep
- lionheart
- singular
- superstar
- norris
- deserts
- bates
- floats
- animation
- retitled
- reshot
- rout
- cosmic
- enlightenment
- dichotomy
- educatable
- prodigies
- precocious
- harks
- schoolwork
- construct
- convey
- verbally
- stressing
- penalizing
- eternity
- bradley
- activists
- demonstrating
- agreeable
- gerrymandered
- lipscomb
- disservice
- pauken
- politicking
- upmanship
- fooled
- nationally
- applicants
- dissolved
- shutdown
- mathematics
- outgo
- kidney
- positives
- spe
- sadder
- anxieties
- detected
- dismissal
- pard
- certainty
- handcraft
- wreaths
- eucalyptus
- dowels
- goofs
- bulch
- straying
- koala
- shapes
- wintered
- transplanting
- leafed
- pasture
- jungles
- rubs
- validity
- disagrees
- guessed
- lux
- accom
- transcontinental
- throats
- coalition
- armaments
- congressional
- fuss
- shiites
- fiddling
- shaped
- topsoil
- herb
- rollback
- spurts
- loppers
- rotor
- dethatch
- heave
- ingredient
- shrip
- fettucini
- straightens
- disconnect
- sucking
- depended
- peeled
- chestnuts
- burgundy
- browned
- bruises
- retires
- swivels
- collisions
- automation
- iaccoca
- airbags
- sc
- spine
- harness
- nifty
- chryslers
- aerodynamic
- conveyor
- magnet
- pennsylvanians
- brownie
- pamphlet
- slicks
- slot
- poundage
- instant
- wisely
- shboom
- befriended
- ironically
- resumed
- gymnasium
- flooring
- chrome
- height
- pounding
- engineered
- curbs
- gravity
- singles
- assorted
- immobilized
- screamed
- climbers
- limp
- matches
- ammn
- amm
- initi
- initiation
- mishandle
- guiding
- deregister
- tumbling
- themself
- banding
- pis
- julie
- tense
- bundles
- childish
- kazoo
- numb
- suffices
- rela
- weakness
- weaknesses
- experi
- temporaries
- retest
- retested
- rx7
- whatso
- seater
- narrowed
- assessment
- thirsty
- stint
- wanderlust
- poker
- admiration
- miners
- roadsides
- harvey
- uneducated
- flaunting
- relinquished
- strikers
- speeded
- aerobically
- calmed
- postnatal
- cise
- birthing
- axle
- windstorm
- overlooking
- embankment
- arkan
- sweeping
- tows
- beavers
- flee
- attitu
- flaunt
- americanism
- slums
- coops
- inoculation
- hungary
- requesting
- rotely
- panamanian
- quieted
- anticommunist
- excesses
- playtex
- flowery
- jaded
- comforts
- thorn
- bureaucratics
- dyed
- pollen
- gah
- blowy
- rebellions
- massacred
- protested
- diminishing
- renegade
- launching
- strifes
- defect
- obtaining
- globally
- demise
- glasnost
- escalate
- reins
- intentioned
- conveniences
- nonfeeling
- uphold
- unpopularity
- geez
- honorable
- massad
- madman
- straddle
- personalties
- rethinking
- gesture
- miscalculated
- liberate
- underestimated
- miscalculation
- huss
- assassinate
- staking
- precedent
- bullies
- powdered
- bombing
- khomeini
- normalized
- sanc
- juggle
- friction
- bookkeeping
- earner
- kite
- idling
- spooky
- lat
- tracing
- hitter
- shorten
- saberhagen
- crain
- craning
- reds
- stri
- fouls
- steinbrenner
- bogus
- workable
- peripheral
- notebook
- modems
- revise
- furnishes
- deadline
- courier
- magee
- peretti
- piercing
- fic
- soun
- illu
- illusions
- quintupled
- flied
- nailed
- gibbons
- exempts
- planters
- shedding
- proj
- beau
- insi
- sunlight
- sulked
- overmilitarization
- disparity
- civilization
- bigge
- trickle
- hemisphere
- kingsport
- masala
- sweeter
- amaretta
- dijon
- basil
- turgeon
- laroute
- gastro
- lamink
- restructured
- hardships
- subcultures
- debates
- patronizing
- demeaning
- midwife
- pater
- paternity
- troit
- misunderstood
- ranks
- aines
- peak
- olajuwon
- dunk
- businessman
- murchison
- bottomless
- leanings
- assholes
- reaganomics
- nonexempt
- visitations
- shuts
- hunts
- wan
- degreed
- jenny
- outdoorsie
- twix
- braniff
- gossip
- hound
- host
- pause
- mic
- '''clo'
- participators
- primal
- kicks
- tabloids
- journalistic
- fondly
- steeped
- repu
- unnecessarily
- glancing
- nod
- tonic
- unhooking
- uncoupling
- rotating
- rotated
- dieting
- ourself
- wrapping
- kip
- centrally
- sickness
- folder
- emphasize
- miniskirt
- evoke
- overdo
- laces
- flounces
- adornment
- unprofessional
- sexist
- tailored
- vulgar
- redford
- lewisburg
- emblems
- grotesque
- imag
- shoo
- padlock
- pawn
- someway
- neatness
- psychiatric
- hinkleys
- accidently
- distinguishable
- barbed
- curi
- prayed
- reestablish
- lengthways
- mounds
- clumps
- southw
- slapping
- formidable
- adcose
- exaggeration
- harmful
- structural
- hankering
- tick
- excalibur
- newmarket
- edmunds
- barnyard
- treacherous
- journey
- climbs
- creation
- touristing
- asbestos
- repaint
- roughed
- energized
- bids
- bleed
- caulk
- masonite
- bid
- varnished
- intervene
- toppling
- descend
- latinos
- mee
- meek
- europeans
- vocalism
- comparably
- bitch
- moan
- compromise
- dependence
- cartels
- mistreating
- slovak
- catacombs
- persecution
- idi
- amin
- oopsy
- pood
- greets
- recouped
- evi
- burial
- countenance
- uncanny
- litterbox
- anointed
- buzzer
- cheerleaders
- courage
- cheerleader
- precincts
- precinct
- harmfulness
- heroin
- forefront
- estimation
- demolish
- cur
- tract
- scaredy
- straits
- quieter
- comfy
- husb
- prance
- paw
- lovable
- lapdogs
- cockatoos
- squawking
- som
- cower
- akita
- aq
- padding
- chewed
- wiper
- blades
- tinkering
- rightly
- punctured
- patched
- restores
- feminist
- amer
- undoing
- stains
- altar
- spooked
- butterflies
- dee
- nicaraguan
- housed
- spiders
- repent
- evangelical
- surpassing
- override
- rejoice
- borrower
- bondage
- squatters
- witchcraft
- mayans
- incas
- worshipped
- pyramids
- sacrifices
- gods
- oppressed
- warehouses
- cumulative
- itemizing
- scrimp
- walkabout
- boonies
- attribute
- eric
- dickerson
- smi
- linebacker
- bickering
- wen
- appropriately
- arcade
- drafts
- archie
- manning
- nobodies
- showi
- furious
- veg
- padded
- opposing
- satin
- bridesmaids
- maids
- accessibility
- harsher
- aerostar
- stealth
- slipping
- celicas
- perfor
- racing
- surreal
- fulfilled
- blair
- reformed
- gambler
- microbiologist
- competitions
- minnea
- dowling
- ren
- entrances
- periphery
- paired
- deacons
- blesses
- fugate
- proverb
- macy
- lowe
- purebreds
- studs
- sweetest
- sweetheart
- breeders
- bree
- inbreeding
- inquisitive
- hindquarters
- predominate
- rex
- rexes
- rodents
- groundhogs
- mesh
- remains
- teetering
- refusal
- presc
- pharmacy
- mens
- absoluteness
- foiled
- mere
- outlawing
- conspicuous
- inconspicuous
- inappropriately
- hunted
- squirted
- novelty
- outdo
- raciness
- calculators
- euphonium
- mellow
- deejays
- grafting
- cough
- graphs
- sponsoring
- enhanced
- bytes
- '128'
- callously
- deterr
- blooded
- midsized
- porting
- attendant
- vessels
- overbuilding
- phe
- phenomenally
- galant
- serviced
- 49ers
- harbor
- niners
- kim
- redskin
- cartoonist
- ellicott
- basicall
- importantly
- devaluated
- goats
- schoolyard
- motherhood
- overcompensate
- destabilize
- vying
- regroup
- standpoints
- easterners
- couched
- proclaim
- weaving
- dike
- plug
- unveiling
- takers
- roomie
- slaughtered
- sudan
- occurrence
- shredding
- bedding
- wrappers
- reviving
- yosemite
- objectors
- assigning
- examined
- idealistic
- pakistan
- algeria
- blinking
- manipulations
- insofar
- clowns
- partition
- dividers
- baloney
- daylilies
- orchid
- closes
- velvety
- multiplied
- weeded
- lilies
- azalea
- glories
- ned
- skeldon
- ojeda
- hubie
- offerman
- prediction
- cecil
- orel
- hershiser
- darrell
- interleague
- introduce
- anoth
- homey
- randi
- dawdle
- steamy
- lawrence
- mae
- rambo
- hogan
- associates
- realist
- garments
- vogues
- knits
- garment
- loopers
- piping
- cording
- twe
- sewn
- exceptional
- bev
- reap
- sow
- establishes
- pardons
- lust
- incest
- swiftly
- integral
- reeks
- expediting
- compunction
- appropr
- sins
- stoning
- clog
- streamlining
- extremism
- bubble
- habitat
- humanity
- inefficient
- preconceived
- notions
- delivering
- spiraling
- conservatism
- hampers
- patchwork
- unflattering
- autobiographies
- randolph
- descriptive
- affluents
- tale
- binge
- bookl
- francis
- momentarily
- connecting
- sigh
- chowperd
- snowbirds
- spawned
- contend
- melts
- kitty
- apso
- panic
- preserve
- campsites
- twang
- pfeiffer
- rim
- glenrose
- latrines
- gemini
- genocide
- hmong
- unsure
- slash
- intercultural
- dissimilated
- conceptualize
- slavery
- linguist
- withholding
- worthless
- cambodians
- graft
- falk
- drugstore
- coils
- mosquito
- crickets
- foamy
- pristine
- froth
- bobber
- reeling
- saturated
- soggy
- damp
- claustrophobia
- terrify
- spanking
- revamping
- lev
- plaques
- stenciling
- cushions
- impeme
- interface
- janitor
- reams
- dalmarva
- deinking
- contaminate
- wastebaskets
- publicly
- yucky
- interven
- occupying
- schwartz
- iranians
- egyptians
- kane
- matinees
- burton
- batman
- glover
- kline
- dennehe
- goldblum
- clease
- arquett
- untouchables
- graffiti
- broderick
- marlon
- parody
- tinman
- humphrey
- bogart
- maltese
- falcon
- quinn
- rainman
- okie
- homeboys
- optimism
- reconstruction
- redefining
- trait
- longhorns
- randal
- streaky
- touted
- sentimental
- instability
- indoctrination
- marines
- ak
- 47s
- cubans
- capturing
- nicaraguans
- crate
- patrice
- lamumba
- teachings
- extremist
- gen
- irregardless
- albania
- revolts
- psychos
- chiefs
- staffs
- uprisings
- squadrons
- afghanistan
- boils
- cen
- berlin
- wat
- steppers
- soles
- reword
- indi
- environmentalism
- ruther
- environmentally
- blasphemy
- acutely
- bureaucracies
- relegated
- heartache
- grudge
- succeeding
- parish
- policed
- comforting
- reminders
- pyrex
- teaspoon
- blackened
- skewers
- basin
- chefs
- clams
- instinctual
- demographically
- democratically
- proposition
- proposals
- revolted
- obligatory
- considers
- australians
- looses
- leas
- denies
- hamilt
- passionate
- democ
- candi
- antigovernment
- misspending
- bastards
- inte
- hundredths
- sixteenths
- mismatch
- clamps
- meters
- drams
- perfume
- machinist
- indic
- indicators
- micrometer
- finders
- nondecimal
- halves
- listing
- beverages
- whiskey
- ploy
- conversant
- milling
- measu
- calipers
- pliers
- milliliter
- drilling
- hundre
- lawy
- strangle
- neiman
- marcus
- outgrowing
- necked
- embellished
- dre
- presentable
- outrageously
- busters
- campinas
- oursel
- asses
- orient
- optimist
- jungle
- resonates
- profound
- bullying
- dreamed
- wildest
- semantics
- transcribes
- onl
- guzzlers
- fours
- threes
- transverse
- mounted
- shoved
- serpentine
- stickers
- reinstalled
- nozzle
- stroking
- groves
- surinam
- natio
- internationally
- amaco
- mobil
- rectified
- inward
- hateful
- kilom
- thumbnail
- kilogram
- britain
- adopting
- precisely
- grams
- sync
- orchestrate
- unfamiliar
- toting
- stroganoff
- allendale
- waldwick
- adirondacks
- pancakes
- outgrew
- beth
- knowl
- roanoke
- randall
- duplicated
- gamble
- ditka
- nate
- newton
- branded
- outlaws
- webster
- cocky
- lambert
- bloopers
- receivers
- tackled
- necks
- fav
- entities
- overburdened
- fairness
- pondsy
- invu
- invulnerable
- belongs
- electing
- politic
- floored
- maryl
- nurture
- credits
- ukrainian
- scallop
- buns
- batter
- bourguignonne
- grudgingly
- pinch
- reversal
- beck
- subsidize
- bennington
- liber
- refinement
- etiquette
- advises
- renaissance
- bowdoin
- bucknell
- lectures
- confirm
- guitarist
- yale
- minoring
- irrevocable
- irrespective
- clinical
- pathologist
- kayla
- bachelors
- profess
- traced
- rung
- maladjusted
- compelling
- distaste
- resp
- beret
- uzis
- disorderly
- unc
- unconcealed
- matched
- vibes
- clearest
- confi
- junkins
- mandated
- prompted
- tobacco
- bandwagon
- cour
- tricked
- syst
- maintenances
- scoop
- fetch
- pooper
- scooper
- colombia
- reek
- kindhearted
- nixed
- asthma
- outgrown
- misclass
- stately
- sunk
- furnished
- swoop
- situational
- punches
- momentum
- lockheed
- arose
- courageous
- accredita
- accreditation
- keying
- adjacent
- refine
- classified
- chemicalwise
- refining
- strean
- stillwater
- stephenville
- toxins
- bacterial
- bleaching
- sinked
- australian
- dominique
- neek
- wimp
- feline
- unconditionally
- feisty
- snuggle
- investigate
- beaner
- wadded
- fixture
- decor
- panty
- garb
- polyesters
- wools
- neatly
- layerings
- eyesore
- mended
- ironed
- compose
- upgrading
- plummeted
- acro
- daltons
- wholly
- understands
- disadvantaged
- winnowed
- structures
- casing
- connectors
- workmanship
- hal
- fluke
- highlands
- patronage
- cranberry
- pou
- lobsters
- billboard
- steams
- culinary
- adventurer
- franchised
- shacks
- shoney
- reliably
- communercation
- compe
- renditions
- organizer
- defeat
- registration
- dragginess
- headache
- draggy
- locker
- sauna
- motiv
- agony
- dictatorship
- uganda
- mils
- distances
- centigrade
- celsius
- metropolitans
- heeley
- wentworth
- differential
- microns
- whatev
- responded
- favorably
- bagged
- ecological
- prod
- additives
- pickups
- hangers
- cupboards
- fountain
- faucet
- exceeding
- decomposed
- shocker
- bizmart
- upseted
- taxwise
- toilets
- smashing
- soaker
- sheltered
- disapp
- rankled
- cheerfully
- outermost
- inland
- curving
- ventura
- buildi
- overflows
- anaheim
- simi
- meanings
- rhymed
- balti
- strayed
- kabob
- breakfasts
- galunkies
- marsh
- pierogies
- grandparent
- newarth
- cholest
- margarine
- margarines
- kebabs
- utensils
- goulashes
- juices
- sealed
- galore
- finer
- drains
- shakers
- journalist
- crux
- remo
- appease
- pob
- patr
- paro
- paroles
- partake
- traumatizing
- viaducts
- ceremonies
- dozens
- pageants
- riveted
- confuses
- thrilling
- producers
- tony
- dorsett
- hershel
- rationalized
- cinemax
- correspondence
- '30'
- cod
- reso
- repossessed
- 635's
- looper
- ramblers
- brook
- dealie
- diversion
- chevys
- nex
- v8
- carburetors
- gingerly
- yanked
- tinkerer
- evaporator
- rubbing
- testers
- diagnostic
- tester
- diagnostics
- carriage
- chilton
- multiplying
- lincolns
- tremend
- leaking
- condenser
- busted
- haas
- ovolacto
- lard
- nutrient
- lactose
- synthesize
- slough
- utilizing
- rids
- utili
- paperback
- novelization
- lucas
- freder
- brink
- feinstein
- fairfax
- deaf
- insulate
- scrubby
- pecan
- paralegals
- clears
- interference
- surplus
- tariffs
- mon
- apprentices
- advisable
- journeyman
- exporting
- imminent
- oodles
- salutatorian
- prided
- welcom
- welcoming
- tol
- resentful
- zales
- spiegel
- hurried
- circulating
- walrus
- porpoises
- mainland
- sanctuary
- whooping
- cranes
- pelicans
- antone
- alamo
- brewery
- caverns
- uncourteous
- actua
- irritant
- hullabaloo
- stockholders
- inebriated
- unsafe
- surgeries
- subsidizing
- quack
- waiveable
- refresh
- somewh
- willy
- horton
- consolation
- microscopic
- kneecap
- curtailed
- forming
- bison
- weakening
- strengthening
- '401'
- continuation
- telephones
- handbook
- badger
- showering
- physiological
- advan
- fledgling
- bikers
- bicyclist
- knocks
- coronary
- artery
- decreases
- embark
- motivating
- disevered
- knobby
- vaulted
- woodhollow
- villa
- secluded
- joking
- sellers
- coworker
- doorstep
- housebroken
- playful
- gastrointestinal
- beagle
- romping
- waters
- retrieve
- paddled
- unrequir
- degenerating
- rosebud
- sociable
- smu
- synopsis
- furrier
- judgement
- distribution
- wrongfully
- penitentiary
- sitt
- caravans
- lending
- simulation
- resemble
- adroit
- oddity
- moonlighting
- strengthwise
- divulging
- tarnished
- faye
- socialist
- undone
- inefficiency
- platform
- lieu
- mamma
- disruptive
- brow
- browbeat
- wist
- mugging
- faceless
- persuadable
- thunderbirds
- topaz
- camaro
- reim
- dominated
- wrenches
- eas
- champ
- premeditate
- premeditatively
- stiffening
- lessening
- retarded
- pleaded
- phrased
- dayers
- correctness
- promoting
- niceness
- vouch
- waterfall
- busch
- blacksburg
- portsmith
- williamsburg
- epcot
- temp
- buccaneers
- assessing
- opp
- benef
- wadley
- milestone
- tainted
- snickered
- examine
- aircraft
- astound
- pusher
- circularly
- chairman
- judy
- perturbed
- promotions
- programmed
- brightens
- hallmark
- servi
- seizures
- brighten
- tonya
- sneaks
- rainstorm
- breezes
- temperate
- promises
- westernize
- intact
- extensly
- vely
- woodward
- projected
- commanders
- colin
- powell
- embargo
- misread
- earliest
- disarray
- hopeful
- prosecute
- stature
- statesman
- foreseeable
- selves
- volatile
- retile
- bathtubs
- scouter
- drippy
- panes
- putty
- gazoo
- pes
- pesticides
- bulging
- chlorinating
- coronarys
- diets
- quadrupled
- ingestion
- clogging
- primates
- regimen
- kenneth
- innovator
- inactivity
- neurosurgeon
- strictest
- idiots
- stan
- destruction
- symbolism
- evokes
- lynched
- modified
- possess
- condone
- adamantly
- symbolizes
- circum
- satisfactory
- budg
- spartan
- frugally
- jordache
- nonessential
- victory
- cliche
- enactment
- adjourned
- mot
- expending
- reasoning
- allege
- myriad
- departure
- restocked
- guided
- unconstitutional
- reforms
- gard
- arranging
- orig
- florist
- slowdown
- runners
- geraniums
- coleus
- vinca
- thuringiansis
- caterpillars
- expands
- unlicensed
- brittle
- excelled
- wei
- denotes
- tension
- bicep
- tricep
- instructing
- grindstone
- hovering
- configuration
- blended
- muscular
- dystrophy
- documentaries
- paroe
- planner
- uruguay
- concepts
- yuppies
- legislated
- dynamics
- auditing
- rev
- revenues
- millspec
- operates
- elevens
- hammers
- federalized
- ci
- emphas
- identi
- americard
- adios
- commu
- demeanor
- announcement
- calcutta
- foreigner
- worldliness
- attributed
- chuckle
- pogo
- mourn
- tolerated
- drumming
- scrunch
- glamor
- sprigs
- ricksun
- tender
- lamp
- ashes
- overcame
- nondescript
- damned
- hierarchy
- restructuring
- feminism
- boomer
- creep
- rapidity
- electroni
- luncheon
- existent
- consulted
- alters
- stamina
- goi
- denying
- revolve
- entrusting
- omniscious
- omniscipotent
- alec
- precedes
- daders
- shrinking
- worthy
- whate
- responses
- spoils
- flashbacks
- flashback
- fidgety
- discriminate
- pertaining
- distraction
- males
- ital
- entree
- sagar
- presby
- kimonos
- grishman
- bavarian
- constricted
- putrid
- folley
- tableclo
- crayons
- disintegration
- flickers
- prevalence
- excusing
- signals
- mechanized
- requiring
- antipasta
- stuffing
- poached
- kernel
- spinach
- wilson
- beeping
- bakes
- frosting
- frostings
- chatting
- mentor
- adversaries
- manuscript
- harried
- interruptions
- feedback
- videotaping
- adopts
- twelfth
- tangible
- overseen
- alternately
- ilk
- phonic
- pistons
- snooty
- telev
- leno
- carvey
- deduce
- cros
- wheeled
- porked
- termites
- chess
- rearrange
- hisself
- bathtub
- prettier
- rewired
- shorting
- surges
- famili
- rearranging
- shuffle
- pane
- breakers
- valve
- drips
- walkway
- splash
- vein
- downfall
- yuppiedom
- restructure
- biologically
- physiologically
- wonderment
- swooshed
- viva
- talents
- mongst
- jealousy
- computerizing
- pecking
- punched
- slightest
- epidemiological
- guesswork
- transmitted
- semen
- illegitimate
- exploded
- stepchildren
- socio
- radios
- faxes
- sensors
- stalk
- jurisdiction
- outnumber
- solicitation
- prostitution
- unlocked
- fallout
- probability
- indentured
- servitude
- vigilantes
- victimless
- ridicul
- auctioning
- bidding
- patios
- insecticide
- diazinon
- carefu
- deb
- wallpa
- stagger
- renovator
- sheeting
- resilient
- stairway
- sworn
- rud
- veto
- bout
- yea
- dams
- droughts
- reservoirs
- poole
- reflected
- counteract
- learners
- genius
- perspiration
- diagnose
- predisposition
- flashing
- drowsy
- facilitators
- manipulated
- burdening
- toot
- weekdays
- racket
- drawer
- dennison
- derby
- siphon
- cu
- uba
- tailgate
- deterrents
- publishers
- poisons
- ergotisms
- fungus
- gender
- confidential
- tide
- vatted
- archeology
- shoelace
- promising
- upcoming
- reprinting
- thurber
- hundredth
- riveting
- viorst
- sci
- revol
- revolves
- shoelaces
- binds
- melody
- workbooks
- workbook
- geometry
- cypress
- greece
- irrelevant
- tortola
- gorda
- infusion
- ethnicity
- familial
- acclimate
- retaining
- latino
- continentals
- roberto
- unprepared
- vociferous
- attain
- imported
- territorialism
- horns
- encompass
- handcrafts
- wreath
- phillips
- ranching
- contemplating
- stabilize
- occupies
- baseline
- flextime
- grading
- scribble
- sensitivities
- akin
- minimized
- prematurely
- dumper
- geria
- empathize
- tandem
- providers
- prohibitive
- fantastically
- moslem
- surro
- surrogate
- regretful
- arou
- swims
- nationals
- quarries
- tumbled
- avail
- denmark
- appliqued
- eraser
- maturing
- rite
- unmarried
- aquariums
- zoos
- paternal
- traditions
- disintegrated
- trinket
- sociologist
- multigeneration
- eightch
- scorer
- rebounders
- assists
- thown
- laker
- marriott
- spittering
- sputtering
- swimsuit
- mavs
- favored
- endorsements
- prospects
- stanley
- underclassmen
- myrna
- curfew
- fiscally
- jockey
- catton
- dives
- cayman
- itinerary
- viet
- doves
- abnormal
- puppet
- heartbeats
- reviewing
- bocket
- hannibal
- lector
- fascin
- luster
- attractiveness
- originality
- pinpoint
- lavon
- upstream
- sever
- benders
- grea
- musky
- perches
- salami
- sonar
- maneuver
- charter
- suntan
- hobbyist
- styled
- convertibles
- sevi
- welded
- welding
- sunroof
- soured
- contention
- jags
- contractors
- bends
- enthused
- enthusi
- ap
- vending
- cartilage
- glanced
- fenced
- econ
- repeatable
- bundy
- exe
- strauss
- punish
- electrocute
- problematic
- candid
- fraud
- intangible
- reinstate
- mario
- cuomo
- legislatures
- molested
- incarcerate
- sylvan
- reenacted
- paltry
- polishing
- lotions
- meniar
- cringes
- thrifty
- flier
- psycholinguistics
- ivory
- godsend
- pathe
- willow
- cana
- bacally
- obese
- reimburses
- collared
- widget
- bramalea
- 401k
- weeny
- nonex
- censored
- bombarding
- dramatize
- statues
- weld
- epoxy
- resin
- shattered
- statue
- cricket
- thatches
- thatched
- vapors
- stained
- lacquered
- tung
- fanatical
- pills
- hem
- sweating
- bulge
- wrinkles
- vices
- sha
- germ
- ecru
- undercoat
- peachy
- steamers
- mottled
- grey
- maroon
- vivid
- turquoise
- coral
- renovating
- hallucinations
- cloths
- slop
- soluble
- tricks
- skimp
- tediously
- rewallpaper
- racks
- metlife
- worki
- workm
- inconsistencies
- amateurs
- footballs
- fencing
- earl
- princeton
- pacers
- subminimum
- administered
- reluctant
- poured
- chiropractor
- cautious
- janitorial
- rafael
- septien
- applicant
- eduardo
- mana
- sai
- mafia
- newcomers
- ellis
- redoing
- comm
- elitist
- concise
- rathers
- yous
- segregate
- wretched
- horrid
- shortchanged
- brokaw
- demi
- ringwald
- sixteenth
- doogie
- howser
- freckly
- ferris
- moustache
- reeve
- dreaming
- ooze
- bride
- pretended
- occupational
- exemption
- judiciously
- incidental
- figuratively
- westport
- bradford
- indirectly
- clair
- dayt
- baldwin
- bebble
- foreclosed
- rider
- homestead
- creeping
- livable
- retrial
- retry
- wond
- seeded
- raping
- choking
- shotcross
- televised
- vendettas
- trialed
- revoted
- annihilated
- enterprises
- misgivings
- quiz
- sprint
- capture
- extending
- endowment
- joes
- alumni
- splits
- governme
- faired
- undertaken
- deficiency
- dilly
- sangre
- cristos
- wichitas
- lakefront
- pinon
- naturalist
- stools
- binding
- component
- carol
- playroom
- realtors
- dominantly
- alleyways
- shifting
- popping
- bangla
- hugo
- bedroo
- barometric
- borger
- funnel
- pillowy
- radar
- veer
- swirl
- junes
- budding
- crimp
- scorch
- distracting
- heats
- therapeutic
- northe
- mayer
- denison
- purify
- purifying
- philodendron
- acc
- divert
- blurred
- fluoro
- fluorocarbons
- provoking
- brandeis
- fift
- readings
- iliad
- mythology
- choo
- scientifically
- grumbled
- unpleasant
- imparting
- cluster
- vicarious
- compromised
- profiles
- telemarketeers
- outcry
- cited
- crashes
- eroded
- erosion
- lockers
- latitudes
- motorists
- liens
- representing
- landlo
- dakotas
- alarmed
- exclusion
- parameters
- interpreted
- adoptive
- carting
- arresting
- interval
- orwell
- tay
- unusually
- leathery
- venture
- wea
- pebbles
- drainage
- deceptive
- fiend
- wrinkled
- oils
- fishermen
- tricycles
- kiddie
- wilds
- calves
- heifer
- jea
- flared
- hep
- themsel
- continuum
- astute
- propagate
- raccoon
- filleted
- livestock
- whiskers
- growling
- widen
- weaker
- ticker
- pentagon
- whomever
- nutrisweet
- bitterness
- ancient
- vets
- complicate
- preregister
- registrations
- eligibility
- preceded
- theodore
- upward
- rascals
- stinks
- precluded
- gullibility
- democracies
- redistricting
- subsidizes
- lineman
- spilled
- camouflage
- booby
- traps
- apocalypse
- influx
- surge
- buckle
- overcome
- castaways
- depicting
- dudley
- bloody
- olden
- realism
- pioneer
- worship
- chri
- videotapes
- shrunk
- eastwood
- showy
- westerns
- cursed
- pointy
- melissa
- gilbert
- idol
- verse
- shep
- immemorial
- misdemeanor
- waving
- prevail
- appoint
- bailiffs
- clerk
- verbalize
- tripled
- cameras
- reporters
- prosecutors
- outweighs
- prosecuted
- sump
- sewage
- towed
- aut
- trad
- marina
- hears
- acclaim
- sequels
- earle
- recluse
- essays
- qu
- conclusions
- photographers
- arro
- gorillas
- sloth
- fascinates
- bottoming
- landers
- tycoon
- bloomed
- fade
- spiky
- bl
- hya
- colossians
- thistles
- landscaper
- junipers
- puny
- foliage
- iris
- fuzzies
- wildflower
- insists
- camcorder
- pastime
- muggings
- grates
- claustrophobic
- tendencies
- deviant
- anguished
- cleaners
- meridian
- inlaws
- sneakers
- jordans
- brains
- caps
- videoed
- repeated
- repetition
- termed
- allowable
- purs
- discretion
- freely
- altering
- preparations
- namely
- minuses
- factored
- competitor
- trevino
- influencing
- wholesome
- exclamations
- sportsman
- phooey
- applicator
- nurseryman
- elm
- circumference
- stubs
- propelled
- pest
- sawed
- rot
- rotter
- autobiography
- liquidating
- emulating
- compu
- ause
- accomplishing
- spacings
- formattings
- insert
- reset
- rewrite
- typesetting
- typeset
- spaces
- compatibles
- adhere
- brochco
- hillstreet
- finale
- nudity
- delight
- shudder
- flabby
- telemarketing
- classification
- lotteries
- kalamazoo
- sinus
- carton
- stakes
- mounts
- hub
- airports
- altitudes
- intermediate
- simp
- fluorides
- guerrilla
- marched
- lied
- expire
- xerox
- modify
- soo
- terminals
- insur
- breakable
- hangouts
- haunts
- southerners
- rudest
- bartenders
- wee
- ferrings
- taiwanese
- jambalaya
- wowed
- univerisity
- arias
- casks
- hospitalization
- hos
- crowns
- fluctuate
- celebr
- inordinate
- axe
- newscast
- js
- recap
- sensationalize
- sensationalized
- asinine
- puzzle
- precede
- preclu
- preclude
- stretches
- wakes
- depreciate
- tru
- unibody
- granddaughters
- gol
- wagging
- trainers
- airheaded
- yappy
- dignified
- culling
- tamper
- innately
- tractable
- selectively
- culled
- belgian
- distinct
- breeds
- kennel
- translates
- shit
- unreliable
- handlers
- indiscriminate
- breeder
- handler
- bab
- doorbell
- stipulation
- laundromat
- grasslands
- surrounds
- betty
- parades
- palestine
- id
- peg
- catalyst
- palestinian
- kindest
- abounding
- kindness
- godly
- compassion
- humanness
- mandarin
- oranges
- grape
- fridge
- gelatin
- carrot
- eggo
- waffles
- adolph
- breakfa
- craftsmanship
- opt
- stanza
- glitters
- oasis
- warp
- clearinghouse
- consolidating
- salespers
- tel
- compan
- announcing
- telepho
- discard
- episodes
- cramp
- vela
- someb
- thirtysomething
- mclaughlin
- yogi
- loner
- comedian
- cantankerous
- echoed
- withdrawal
- grumpy
- stooges
- mouthiest
- kiddos
- mouthy
- touristy
- besieged
- defini
- badgering
- galapagos
- sidney
- adelaide
- chengdu
- quingdao
- retreat
- flights
- rita
- oah
- destitute
- ree
- snorkeling
- prawns
- milli
- arsenal
- traffi
- bennett
- gangsters
- corp
- arr
- pris
- crowding
- statutory
- verbalizing
- stints
- citing
- intensity
- limbaugh
- lamenting
- microwaved
- healthiest
- teases
- accuses
- deprivation
- nourishing
- evaporated
- broil
- marinara
- grapefruit
- starch
- pleasurable
- kalli
- cater
- rodolfo
- royal
- maitre
- pilgrim
- unnatural
- lookout
- arby
- wastes
- reduces
- speedup
- healthily
- sup
- quoting
- disputes
- commas
- reevaluated
- inma
- blinded
- restitution
- willfully
- contradictory
- caveman
- coleslaw
- tablecloths
- bakeries
- regretted
- purch
- pastrami
- '''oeuvre'
- complicat
- sustain
- addressing
- fellowship
- prefers
- troublesome
- camels
- beatle
- orchestration
- okeydoke
- statler
- stated
- debut
- investigating
- bootstraps
- baptisms
- clergy
- imprisoned
- confiscated
- bourgeoisie
- commonality
- recanting
- courtyard
- motions
- commandant
- escaped
- perseverance
- bureauc
- persecuted
- dab
- chorus
- mothering
- rerate
- precluding
- analogy
- spade
- marketeer
- warring
- peacefully
- trampling
- fantas
- crabby
- coated
- willis
- sarandon
- gena
- vatican
- paradeso
- befriends
- friendship
- califor
- drying
- nippy
- mucky
- thunderstormed
- shoveling
- michelle
- lan
- footnoting
- retype
- appetizer
- criterion
- alumnae
- heavyset
- poignant
- subtleties
- gore
- warlock
- omelet
- characterizing
- conceited
- portay
- goer
- prosecu
- cutor
- struggles
- flowing
- ir
- slicing
- locust
- omar
- swallowed
- redwood
- brownstone
- caulking
- myneer
- spacious
- inhaled
- revived
- airway
- revive
- sol
- dignity
- luxurious
- blossoming
- brazos
- sleeps
- purdis
- sandlin
- quake
- mak
- caramelized
- customary
- orchard
- accor
- ply
- crier
- waistline
- jewels
- earhart
- thurow
- perceptive
- pinpointing
- flimflam
- hughes
- assis
- plod
- rereading
- ditched
- findings
- bonfire
- vanities
- temporally
- burdened
- cafeterias
- linen
- napkins
- duplexes
- hodgkin
- undergoing
- interim
- constancy
- sufficiently
- farfetched
- wheeler
- cock
- slowing
- pals
- unjudgmental
- homy
- reprimand
- secrets
- brooksville
- campuses
- eyesight
- enrichment
- schooled
- rejection
- proceed
- herman
- foreigners
- polluter
- rigs
- busses
- incinerate
- pollutant
- untold
- cockroach
- accelerated
- nutrients
- sponges
- tending
- newark
- vividly
- entrance
- biggies
- consumable
- calculation
- physiology
- snowball
- dieters
- robbers
- trendsetters
- correspond
- circulates
- centralize
- descendancy
- closeness
- caliber
- differentiate
- stevens
- shippensburg
- specializes
- novelist
- intricately
- johann
- sebastian
- copyright
- compile
- poems
- baudelaire
- jennie
- abridged
- reunited
- rituals
- equated
- communion
- repetitively
- vernon
- salmonella
- silverware
- caterer
- biographer
- obituaries
- succeeded
- vigor
- bulletins
- chorals
- beginner
- violinist
- percussion
- accompany
- choruses
- audition
- verdi
- hermit
- vacationed
- anonymous
- whirlwinded
- effortlessly
- elicited
- unwound
- guadalupe
- penetrates
- alda
- burt
- reynolds
- vignettes
- dinosaurs
- robots
- satur
- sniping
- howling
- gleason
- snippets
- idle
- workshop
- gra
- dividing
- moses
- hab
- scavenge
- conserve
- indulgent
- exceptions
- contemplate
- permitting
- calming
- aboard
- docks
- cozumel
- ocho
- rios
- jurisdictions
- tapping
- lynda
- slandered
- landslide
- thornburg
- landslided
- characteristically
- savory
- petition
- resisted
- dirtier
- muddier
- sensibilities
- transpired
- nixon
- edible
- accumulating
- elbow
- cho
- grandes
- refried
- katy
- avocados
- avocado
- coolwhip
- horseshoes
- auctions
- sidelines
- loosely
- socioeconomic
- tracked
- pressured
- vandalism
- outward
- custodial
- skyline
- irritable
- unattended
- environments
- dunked
- compaq
- honk
- prodigy
- mush
- shareware
- paradox
- shooter
- crawford
- andrew
- webber
- paranoid
- unlucky
- anonymously
- competency
- wholesale
- lon
- exa
- beginnings
- kuenzer
- rebelled
- debtor
- angela
- eyeglasses
- indiv
- staffing
- examines
- optometrist
- ophthalmologist
- extractions
- publication
- unfeasible
- bettle
- orthodontal
- outsor
- roo
- suite
- scattering
- leniency
- underhanded
- perpetrator
- injustices
- wherein
- dist
- unsavory
- elimi
- rarity
- chairmen
- ministers
- congregations
- catholicism
- forthright
- disorders
- soothe
- exertion
- characteristic
- cram
- guarded
- sacrificing
- mediators
- interpersonal
- mediator
- doable
- devised
- stimulations
- goof
- whipping
- nickie
- snail
- hards
- futuristically
- subjective
- harmony
- impregnated
- challenges
- motherly
- competent
- militaristic
- colonel
- infantry
- embrey
- reynold
- riddle
- aeronautical
- pratt
- whitney
- daphne
- dictated
- qualifying
- rhodes
- scholars
- homogeneous
- realities
- socialization
- insular
- sheriffs
- evict
- continuances
- abundantly
- appealing
- retried
- lowers
- percep
- gypped
- slicker
- bruno
- kirby
- chauvinistic
- punching
- correlations
- opium
- dens
- weakened
- duress
- drunken
- induced
- legalized
- quantify
- deg
- safeguards
- fraction
- oath
- sensings
- sentencings
- pertains
- introduction
- accordance
- clark
- parachute
- presiding
- reorganizing
- sweeper
- univerty
- versity
- lakeway
- expose
- jun
- bethany
- unfocused
- midst
- instigated
- marrie
- remained
- tomorr
- whitmore
- arbor
- slushy
- sled
- icy
- lingering
- exodus
- eternally
- snowfall
- grassy
- sachse
- goddard
- stickler
- mulcher
- seni
- antisocial
- adapting
- deteriorates
- glimpse
- unwilling
- appalachia
- stopgap
- rougher
- strategic
- fails
- worded
- peoria
- dropouts
- insecure
- scaring
- stylish
- interpretive
- fathom
- expanding
- wean
- referrals
- advisory
- myrtle
- barricaded
- blackberry
- defeats
- enchila
- boiled
- toasted
- calorie
- hereditary
- headstart
- preschooler
- tacos
- tamales
- romanian
- backfires
- waiters
- batty
- momo
- colter
- pas
- campari
- adventured
- souper
- prey
- backlogged
- patrolled
- frus
- imme
- dialogue
- aisles
- cornball
- overacted
- applauding
- waterskiing
- ashley
- jamie
- warner
- deanna
- cheeks
- backdraft
- berry
- raspberries
- shaved
- entrees
- accompaniments
- gershwin
- puree
- antipollution
- gases
- accumulates
- groundwater
- fusion
- optimistic
- pessimistic
- reconvicted
- sicko
- merciful
- cannibalism
- hunch
- coordinate
- communicable
- memos
- orchestral
- fiddler
- oboe
- classy
- corresponds
- christening
- elijah
- marches
- poinsettias
- bouncy
- haunting
- conventional
- disposal
- odors
- throwaway
- ditches
- drinkers
- churn
- shipwrecked
- explodes
- maims
- sylvester
- mermaid
- outfitted
- crushing
- hobnail
- phobia
- bifocers
- trifocals
- mccalls
- byte
- afflicted
- exceeded
- antibody
- realm
- telethons
- doling
- receives
- ociety
- aesthetic
- enhancing
- frightens
- dahmer
- burglary
- enquirer
- cranks
- fuzz
- repala
- sil
- shiny
- heartbeat
- spins
- rainbow
- packaged
- trespass
- tidbit
- refrozen
- cheesecakes
- refreeze
- liabilities
- wrecks
- tattoos
- speedboats
- chambers
- afloat
- maneuvers
- stormy
- nibble
- rope
- entice
- sneaking
- paged
- favo
- flyer
- shaky
- iffy
- sentra
- subdued
- urinalysis
- bums
- overdress
- overkill
- businesslike
- nylons
- nutrisystem
- dreaded
- toppers
- ceramics
- seamstress
- cramped
- negligent
- initiates
- squeegees
- newscasters
- postponed
- a1
- alfredo
- clowning
- circuits
- sfuzzi
- copeland
- transported
- thirteenth
- wobbly
- bookends
- jug
- viscosity
- saver
- brushed
- tooken
- turpentine
- towels
- shi
- jul
- shindig
- boulevard
- maizeland
- skier
- minnie
- canaveral
- reschedule
- hilton
- eighteenth
- raton
- '287'
- '70'
- broadmoor
- breckenridge
- trinidad
- '25'
- hexpired
- disheartening
- elders
- albertson
- limbs
- sodas
- arranged
- brookshires
- pickle
- piles
- emporium
- cinch
- consolidate
- alluring
- cupcake
- henpecked
- instilled
- gatherings
- subtracts
- debits
- incidentals
- scotch
- igloos
- strateg
- strategically
- incurred
- cashes
- reunio
- entryway
- roaming
- ris
- risen
- appraisal
- disoriented
- blissful
- unexpectedly
- cockroaches
- complacent
- bitterly
- polling
- campaigning
- napping
- structuring
- digested
- perfumes
- geese
- peaked
- balloon
- canyons
- weatherwise
- sleet
- maps
- sy
- pearls
- loafers
- distinguishes
- '1200'
- whereby
- extract
- generates
- bursts
- navc
- blazey
- obscure
- promotes
- goe
- refrigerate
- tartness
- raspberry
- connoisseur
- tastings
- mesina
- exorbitant
- kaiser
- mccullum
- catastrophic
- implants
- transplants
- howe
- dislikes
- chopin
- expresses
- discussions
- chords
- panicking
- kielbasa
- bak
- ravioli
- reggae
- twangy
- agr
- cackle
- atteck
- scholar
- adolf
- imaginative
- sty
- antiques
- winnie
- pooh
- grimm
- fairy
- tales
- gentlest
- jewel
- restroom
- spitz
- extravagant
- overpass
- littering
- timers
- tans
- mauve
- distantly
- swap
- bichons
- barks
- hind
- origina
- bernards
- lega
- belittling
- liberals
- suppos
- tcat
- examination
- clicker
- screens
- carpooled
- bolivia
- sundresses
- polyester
- overheat
- sweltering
- newborn
- pleats
- absent
- strep
- bookkeeper
- partitions
- duality
- extenuating
- newsworthy
- leafing
- mccall
- subscribing
- gott
- newsy
- putterer
- caladiums
- hardened
- semitropical
- carrollton
- architecture
- hairless
- coon
- manx
- tame
- ships
- folklore
- faint
- chincoteague
- burgers
- teriyaki
- shakes
- grandy
- fend
- snowballed
- inconveniences
- woozy
- sys
- squirt
- flicking
- whales
- showtime
- adder
- dragon
- rosa
- sorrento
- dine
- mah
- jongg
- yearbook
- imprinted
- depreciated
- cribs
- bestes
- giver
- enables
- ly
- confining
- bronco
- moder
- cowb
- cheer
- schnauzers
- dachshund
- starved
- curled
- skittish
- spaying
- belon
- severing
- sr
- suicidal
- craziness
- mistrust
- lacks
- poland
- weeding
- mankind
- uninsurable
- medcenter
- hearings
- overstaffed
- mortgages
- outlaid
- intergovernmental
- plugging
- indepth
- capsize
- sensationalism
- blase
- sel
- sadist
- oleo
- oregano
- ight
- semolina
- absorbs
- vulnerable
- align
- bombings
- aligned
- tensions
- forceful
- cr
- expedited
- deserving
- mandate
- grassroots
- introspective
- schoo
- visitation
- advantaged
- energies
- tiananmen
- custodians
- immigrated
- brightest
- burst
- lanes
- winterized
- yourselfer
- representatives
- homemaking
- accessed
- uzi
- flyswatter
- utilized
- acquiring
- illicit
- gatlinburg
- cosa
- hiked
- ardmore
- cloud
- ledges
- hyatt
- gully
- trench
- tenkiller
- enlisting
- seductive
- pinion
- totality
- revealed
- legislat
- abrupt
- ruder
- arrives
- '1'
- microcomputers
- gateway
- apollo
- faulkner
- emblem
- candice
- bergen
- ghosts
- haunted
- dianetics
- gibberish
- broudigan
- journeys
- mailman
- karl
- malone
- hacking
- fillmont
- generically
- cyclist
- techy
- hackers
- davy
- crockett
- sailor
- sailed
- mck
- equalize
- semiretired
- dementia
- insisted
- rejuvenating
- coldest
- cus
- celltrex
- jeri
- maceo
- rampages
- cocoons
- occa
- uniqueness
- winfrey
- prebuilt
- workbench
- subcontracted
- subbed
- scramble
- championships
- peacefulness
- birdie
- quadruple
- whizzing
- spectators
- scrambles
- kerr
- mcgee
- infrared
- suffice
- notifies
- supplying
- angles
- anticrime
- outings
- sec
- arlene
- lister
- poked
- togethers
- dearly
- swoosh
- skate
- begonias
- destruct
- concessions
- drizzly
- huddled
- cages
- fanatics
- straightforward
- piston
- oiling
- altog
- reelection
- provisional
- locate
- incomewise
- ifs
- ands
- buts
- '4'
- hel
- discontinue
- narrowing
- nitty
- gritty
- faithful
- shoppers
- yourselves
- straighten
- stems
- relating
- supporters
- antisupporters
- contras
- dictator
- fascist
- siesta
- mouths
- reflecting
- dabble
- chalk
- chesapeake
- suspended
- ath
- tutored
- goofing
- piney
- diameter
- calmness
- outwitting
- shiners
- infla
- inflatable
- raft
- cottonmouth
- coves
- walkie
- talkies
- handcrafted
- semifixed
- automated
- crafted
- stateside
- adage
- advising
- embarrassment
- jessie
- helms
- intelligently
- mistreated
- papa
- doc
- tyrant
- puberty
- tibby
- perfumed
- legendary
- brookies
- rainbows
- accommodated
- specialists
- replanted
- rods
- norfolk
- portsmouth
- hikes
- pests
- chaperon
- calloway
- variegated
- beetles
- borderline
- zaps
- ligustrum
- apron
- gourds
- bolton
- symphonies
- caller
- sax
- houseful
- crabs
- sensation
- tingling
- oddball
- waitressing
- crunches
- relevance
- federally
- hogs
- barns
- revealing
- horticultural
- groundskeepers
- dormant
- centipede
- crops
- behold
- cuttings
- mit
- diamante
- boozier
- passengers
- shining
- becca
- nina
- palmer
- remarrying
- griffins
- crackers
- burritos
- debone
- notoriety
- jurisprudence
- thoroughfare
- sleeper
- herd
- cima
- savages
- plywood
- beams
- migrate
- undercover
- barbiturates
- codeine
- drixoral
- unsolved
- mcgillis
- weeknights
- physicist
- facet
- hurst
- greensboro
- celebrities
- repeaters
- zealand
- statistically
- outbound
- astronomy
- gallagher
- pictured
- betters
- hubble
- telescope
- planets
- habitable
- backers
- zippers
- snaps
- dull
- pretechnology
- shelled
- duplicates
- regulat
- regulators
- regulator
- lever
- pulley
- chev
- oi
- resur
- ourse
- hesitating
- russ
- noons
- flaw
- gasket
- fury
- exceptionally
- surfaced
- repeatedly
- escapes
- pragmatic
- consti
- opponents
- laural
- squeaked
- andrews
- clou
- crept
- firewood
- maples
- dogwoods
- lowell
- unu
- periodicals
- historic
- interes
- lawful
- scanners
- attempted
- thoroughness
- mag
- announcers
- tele
- ivan
- rodriguez
- ballplayers
- routing
- enthusiast
- ducted
- gettin
- brussels
- sprouts
- kale
- pony
- grazing
- pears
- extinguishers
- depleter
- extinguisher
- timed
- contaminants
- probe
- ionization
- miller
- temptation
- squareness
- buckles
- fea
- lettering
- vin
- vinyl
- balloons
- recy
- commented
- nudge
- decomposable
- flips
- emptying
- regressive
- defen
- kate
- curves
- raphael
- atchafalaya
- sausa
- alvarez
- applebee
- nonstructured
- torture
- nur
- fai
- glorious
- esoteric
- producer
- hairspray
- batch
- partic
- preteen
- unlikely
- dynamic
- raunchy
- horrifyingly
- poppins
- differed
- eclipses
- belie
- lebaron
- peeling
- gears
- oklahoman
- beatings
- proy
- condoms
- stupidity
- truthful
- faded
- marker
- reflective
- adheres
- sealing
- dings
- variance
- prop
- pressuring
- primed
- bragging
- sickening
- shitty
- drags
- burners
- putts
- teeing
- lodging
- dialers
- provision
- specify
- dialing
- prised
- weir
- overloads
- hoosiers
- crossing
- delancey
- thrillers
- backless
- ani
- nick
- nite
- dragnet
- bald
- marlo
- collier
- brigham
- estonia
- agriculture
- foodwise
- rioting
- secede
- proportionately
- hinders
- tubs
- brougham
- trunks
- shy
- gadgetry
- '6'
- interiors
- veered
- revolving
- reverting
- envy
- exhausts
- hairy
- gettingest
- daught
- bertinelli
- dysfunctional
- childfaring
- miracles
- bette
- midler
- redbook
- previewing
- postage
- unauthorized
- mayors
- discredit
- ps
- productions
- chariots
- gladiator
- fluent
- batches
- subtitle
- subtitled
- gems
- supernatural
- accusing
- migh
- mondays
- thrust
- lifters
- drills
- rocking
- referee
- abrasive
- maintaining
- posed
- refusing
- coins
- conversions
- dormitory
- unused
- ramp
- hydraulic
- disposer
- escapement
- incorporating
- leonard
- nimoy
- trekkie
- luke
- spock
- mccoy
- admiral
- hobbled
- vulcans
- doohan
- scotty
- addams
- averaging
- decrease
- munich
- snows
- chattanooga
- lori
- coldness
- membered
- unemp
- fetus
- complications
- slobs
- equation
- nameless
- malformed
- sincere
- deliberations
- dismissed
- indicted
- revenge
- subsequent
- provoked
- provocation
- qualifies
- mitigating
- contender
- linguini
- hawaiian
- luau
- angie
- shellfish
- clam
- cheeses
- nachos
- resurrection
- lutheran
- scanned
- cooperating
- toss
- inmate
- interpretation
- blanks
- executioner
- bamorghini
- skyhawk
- dominican
- nantes
- castles
- vineyard
- consignment
- goodwill
- crushes
- sewer
- res
- unoccupied
- assassinated
- menace
- perspec
- relativity
- vantage
- weighted
- reflect
- subservient
- integration
- ith
- frien
- drudgery
- montpe
- mont
- monteplier
- montpelier
- everett
- yack
- tromping
- unlimited
- wedge
- fairway
- flus
- startling
- '286'
- turret
- scien
- simulators
- plugged
- upgrades
- custer
- '386'
- trenches
- trencher
- stunt
- cul
- sac
- rearranged
- clancy
- novell
- netware
- ark
- ladonna
- peck
- bourne
- ultimatum
- enveloped
- amsterdam
- holland
- harpsichordist
- forte
- warrington
- cheating
- harry
- heroic
- mayfield
- corrupts
- lig
- hatteras
- imaging
- legalese
- himsnelf
- koop
- scarcity
- highland
- jogs
- gyms
- inequities
- stimulate
- deductor
- bentsen
- drunks
- lafferty
- infringe
- snuffed
- snuff
- compares
- gilmore
- accomplishes
- william
- thrice
- mating
- sows
- suckling
- hernia
- carcass
- cloves
- pineapples
- cranberries
- hominy
- barb
- automatics
- avis
- crashed
- lens
- porsche
- turbo
- carrera
- mys
- mushrooming
- percentagewise
- folderol
- lifeguard
- jarring
- flui
- watchers
- pokes
- blamed
- ceases
- intravenous
- cell
- quests
- subsidies
- slashed
- entitlement
- trades
- beauticians
- unending
- spiral
- consumers
- unf
- ailments
- magerick
- celtic
- transplanted
- rolando
- harper
- plaint
- straighter
- dayer
- plumbed
- bolted
- logan
- accredited
- professorship
- distressing
- fiel
- treasury
- refunds
- halt
- spying
- scaled
- loading
- challenger
- stat
- mirv
- roomy
- cargo
- recommends
- volvos
- wagons
- conscientiously
- emiss
- hypothesize
- muncie
- terre
- haute
- triggering
- verify
- drivable
- emerges
- overgrazed
- reclaimed
- prettiest
- palm
- paintbrush
- septic
- hummingbirds
- hummingbird
- pooped
- annuals
- countrified
- supermarket
- coaster
- afterburners
- gliding
- oomph
- subs
- gambled
- insulating
- spec
- verandas
- genes
- drapes
- guppies
- platies
- fishies
- glacier
- playgrounds
- wilderness
- scaries
- rayburn
- curling
- nominal
- fulfill
- synagogue
- geriatrics
- app
- degenerative
- communiky
- enhance
- assist
- text
- biogra
- daniels
- prince
- phillip
- criticizing
- miniseries
- scarlett
- spectacular
- torrents
- ligh
- horizontally
- arid
- crisp
- sleigh
- brighton
- springtime
- skie
- hammered
- subtly
- brianna
- lib
- submerged
- loosening
- leaks
- tar
- gravel
- plastered
- drywalled
- plastering
- terri
- exasperating
- swelling
- squirming
- swells
- shrinks
- retains
- highlight
- captive
- legos
- technic
- lego
- stare
- engagements
- sousa
- refreshments
- rehearsal
- donations
- municipal
- conduct
- nitny
- altoona
- lockhaven
- nighttimes
- ama
- emerson
- maceboast
- circuitry
- vacationer
- wausau
- unduly
- sunglasses
- grip
- durable
- faulty
- recliner
- pinto
- sequoias
- redwoods
- bryce
- tetons
- sequoia
- driveways
- snowmen
- snowballs
- marketed
- acceleration
- suspension
- lumbar
- sma
- bur
- skyrocketing
- govern
- exclude
- ballgame
- warrant
- rounds
- brats
- eff
- nativity
- facings
- casings
- relieve
- strase
- reliever
- relieving
- sander
- cabinet
- equipments
- dado
- rotary
- sicknesses
- bryan
- mamas
- packards
- solburns
- frown
- niggardly
- chintzy
- megs
- mirroring
- epidemic
- immunizations
- rays
- mumps
- rubella
- inaccuracy
- defined
- issued
- hypocritical
- stings
- laundering
- contr
- governed
- discomfort
- stea
- holster
- spontaneous
- headquarters
- bitterest
- fluctuations
- texts
- doen
- rosie
- '''neil'
- thomases
- trimmer
- clump
- tithing
- homeowner
- computerization
- stale
- subroutine
- libra
- clara
- beastie
- triggered
- pledged
- fren
- ally
- organi
- trombone
- weathers
- facetious
- directors
- spells
- compulsive
- childr
- fluffs
- toppings
- brea
- torque
- underdrive
- sportier
- beetle
- coolers
- bonneville
- secondaries
- quadrajet
- compulsion
- elevation
- variations
- hilltops
- mines
- hamster
- cruelty
- parakeet
- parakreet
- burmese
- deactivated
- infatuated
- jobbies
- visualize
- boggling
- slid
- clamped
- kisses
- everywh
- brag
- gramm
- overturning
- renegotiate
- kickbacks
- valdez
- defi
- batted
- hangs
- threats
- emit
- che
- churning
- remembrance
- networking
- conformance
- wyatt
- extremey
- bennigan
- vincent
- chefalia
- whataburger
- zillion
- mercado
- juarez
- tallest
- ewaldes
- cont
- stoneleigh
- chews
- yapping
- collies
- roughest
- hollered
- battling
- obedience
- squats
- vaca
- pilgrims
- medieval
- relics
- bemerton
- newness
- turin
- muffins
- requests
- helman
- tart
- zing
- cele
- layering
- fluffier
- joins
- jennifer
- unselfish
- tutoring
- affiliated
- aimlessly
- perky
- shins
- hyper
- burdensome
- earphones
- timbuktu
- onna
- lieutenant
- biologist
- sliding
- tremors
- variedly
- bakers
- aprons
- sweatshirt
- wigs
- lamb
- bunnies
- symbols
- milky
- polytechnochloride
- mought
- trashmore
- lifts
- riverview
- tranged
- strongest
- recessionary
- stagnate
- unteachable
- prominent
- chide
- remaining
- backbone
- newborns
- fullest
- firewh
- daffodil
- jung
- aquinas
- libretto
- rossini
- mahler
- dutchen
- trumpets
- elixir
- floated
- swapped
- tyme
- tempco
- trooper
- gisland
- carribean
- unpacking
- lotto
- alcatraz
- hairdresser
- crui
- janice
- furry
- eaves
- rafter
- cactuses
- furrows
- wrung
- plink
- construe
- thinkings
- bue
- buechele
- grieves
- gullible
- manufactures
- borden
- bib
- overalls
- oshman
- evaluated
- unfor
- linguistic
- austria
- niagara
- coasts
- carolinas
- leisurely
- modesto
- cheeseburgers
- incapable
- hygienic
- inoperable
- oxygen
- banish
- relocated
- realtor
- listings
- precautions
- integrate
- cooperatives
- reallocate
- reorganize
- accelerate
- transient
- commish
- tenderhearted
- galaxies
- crud
- mutations
- feazure
- ballooned
- reclamation
- merits
- axiom
- fiends
- sensitivity
- aboveboard
- evaluating
- veggies
- unarmed
- resembling
- tallow
- scalloped
- weighing
- strap
- squeaker
- closing
- mullin
- squeakers
- marquee
- bluish
- hydrogen
- sulfide
- h2s
- ramps
- vaccine
- preventable
- syringes
- needles
- feared
- ruf
- riffraff
- haves
- nots
- earhout
- bulletproof
- vest
- hedge
- tollbooth
- hatcher
- taverns
- sailboats
- ancle
- lounge
- cocktail
- sailer
- cruiser
- hull
- spars
- rigging
- gusts
- wearisome
- flaky
- markups
- arming
- stra
- quail
- swedish
- munch
- intermission
- doughy
- frosts
- iceberg
- schoolteacher
- altrusa
- upholstery
- garl
- jupiter
- musically
- auditions
- repertory
- outlet
- auditory
- lear
- educationally
- verified
- chording
- pianist
- min
- ec
- subbranch
- emigrated
- beware
- entrepreneurial
- ventures
- banked
- stored
- footsteps
- postcards
- notify
- notifying
- steals
- hides
- subsequently
- corrective
- leers
- downright
- outright
- shu
- newest
- apathetic
- absol
- prolong
- roofing
- retool
- zigzag
- kan
- untalented
- washed
- salvageable
- gluing
- feds
- interrupting
- faults
- caucasian
- educ
- thei
- officed
- deputy
- pruned
- gladiolas
- amaryllis
- conf
- plantings
- sprout
- narcissus
- psychic
- rerun
- activate
- rusted
- rusts
- fenders
- repainted
- acco
- dreary
- expen
- salting
- weinstocks
- wad
- hilt
- dolphene
- feelt
- throwed
- wheelchairs
- emjoy
- anheimer
- tela
- kindly
- innovated
- endeavors
- adam
- particulars
- abusive
- evolutionary
- duplication
- imagers
- allocate
- optimally
- squawk
- evolution
- insurers
- entity
- burnable
- ticketed
- charities
- braved
- suede
- cardigan
- appointments
- unlined
- toasty
- lightweight
- fireplaces
- dense
- ethanol
- smokestacks
- mowers
- wedded
- organism
- nutritionally
- bamba
- szechuan
- pancho
- binders
- assignments
- developments
- cashew
- avoiding
- suey
- disburse
- squeeze
- sq
- faculties
- pauper
- brokerage
- anticipation
- cherished
- commodity
- famuel
- slopes
- biness
- furlough
- promoted
- nec
- shasta
- salmon
- sk
- walleye
- fighters
- fillet
- foil
- seekers
- scrutiny
- tarrant
- bobsy
- accu
- smiled
- growled
- mistrials
- railroaded
- convalescent
- unsettling
- senile
- graying
- exercisings
- unaffordable
- restricts
- casse
- gabrielli
- bankrupted
- cello
- viola
- composers
- boutiques
- darling
- chanting
- canseco
- ramming
- vinny
- utility
- outweighing
- sundance
- smithsonian
- crosswords
- planners
- artists
- bazo
- faron
- spiro
- gyro
- dulcimer
- jarreau
- contorted
- bonnie
- rait
- grammy
- unedu
- sprayer
- routers
- cookie
- varnish
- smoother
- hayloft
- franklin
- gradual
- increasement
- torpedoed
- downside
- blythe
- tonkin
- macintoshes
- graphical
- multitasking
- gestures
- vocabulary
- compilers
- consultation
- interactive
- discriminating
- correlate
- funnest
- gentler
- panicked
- sassy
- westmin
- westminster
- infra
- mondale
- situa
- circuses
- disrepair
- dashboard
- ce
- beefing
- patrols
- visibility
- lifted
- cumberland
- cobb
- thefts
- superficial
- cracked
- electrically
- manufactured
- bordering
- elects
- aerodyne
- aerob
- brace
- publicize
- killings
- duri
- commentators
- blurbs
- bog
- dur
- countdown
- newscasts
- unreasonable
- moderator
- unorganized
- moderated
- assumingly
- importers
- dahlmer
- ohi
- nightmarish
- withheld
- sovereign
- martial
- puritanical
- permissible
- acquitting
- acquit
- impaneling
- dismissing
- foreman
- deliberating
- una
- restate
- unannounced
- sweep
- definitive
- bodily
- behaviors
- enters
- privacies
- melanie
- spry
- announcements
- anson
- fayetteville
- waynesboro
- delinquency
- fre
- gainfully
- tremen
- thriving
- towar
- grit
- pail
- latent
- compression
- ovens
- armor
- fierce
- finagle
- nationalizing
- cutoff
- operat
- unionized
- distinction
- institutionally
- expedient
- innovativeness
- expedi
- unequal
- plaintiff
- novices
- bets
- leaky
- luby
- taping
- promo
- blurb
- mutt
- hooper
- veterin
- spay
- neuter
- frie
- shorties
- decreased
- unrestricted
- glut
- magnum
- rushes
- oper
- preset
- styro
- frank
- shocks
- allot
- frowned
- chronicle
- analytical
- abnormality
- overwhelmingly
- academia
- descriptions
- addictive
- reevaluate
- divvy
- allocated
- psy
- psychedelic
- crosby
- stills
- performers
- secular
- druggie
- shipping
- maximize
- actuall
- revelation
- polymers
- roadways
- hoop
- funn
- heavenly
- retailers
- induce
- inducement
- recycler
- saskatoon
- welfor
- employing
- deposits
- arithmetic
- sums
- colleague
- internet
- infusions
- incurring
- surveying
- assesses
- footloose
- smattering
- greetings
- snobby
- paled
- refrained
- acute
- indivigal
- thrives
- categorized
- receptionist
- lar
- curve
- critter
- incumbent
- entrenched
- standardizing
- <sos/eos>
init: null
input_size: null
ctc_conf:
dropout_rate: 0.0
ctc_type: builtin
reduce: true
ignore_nan_grad: true
joint_net_conf: null
model_conf:
ctc_weight: 0.5
ignore_id: -1
lsm_weight: 0.0
length_normalized_loss: false
report_cer: true
report_wer: true
sym_space: <space>
sym_blank: <blank>
extract_feats_in_collect_stats: true
use_preprocessor: true
token_type: word
bpemodel: null
non_linguistic_symbols: null
cleaner: null
g2p: null
speech_volume_normalize: null
rir_scp: null
rir_apply_prob: 1.0
noise_scp: null
noise_apply_prob: 1.0
noise_db_range: '13_15'
frontend: default
frontend_conf:
fs: 16k
specaug: specaug
specaug_conf:
apply_time_warp: true
time_warp_window: 5
time_warp_mode: bicubic
apply_freq_mask: true
freq_mask_width_range:
- 0
- 30
num_freq_mask: 2
apply_time_mask: true
time_mask_width_range:
- 0
- 40
num_time_mask: 2
normalize: utterance_mvn
normalize_conf: {}
preencoder: null
preencoder_conf: {}
encoder: conformer
encoder_conf:
output_size: 512
attention_heads: 4
linear_units: 2048
num_blocks: 12
dropout_rate: 0.1
positional_dropout_rate: 0.1
attention_dropout_rate: 0.1
input_layer: conv2d
normalize_before: true
macaron_style: true
pos_enc_layer_type: rel_pos
selfattention_layer_type: rel_selfattn
activation_type: swish
use_cnn_module: true
cnn_module_kernel: 31
postencoder: null
postencoder_conf: {}
decoder: transformer
decoder_conf:
attention_heads: 4
linear_units: 2048
num_blocks: 6
dropout_rate: 0.1
positional_dropout_rate: 0.1
self_attention_dropout_rate: 0.1
src_attention_dropout_rate: 0.1
required:
- output_dir
- token_list
version: 0.10.7a1
distributed: true
```
</details>
### Citing ESPnet
```BibTex
@inproceedings{watanabe2018espnet,
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
title={{ESPnet}: End-to-End Speech Processing Toolkit},
year={2018},
booktitle={Proceedings of Interspeech},
pages={2207--2211},
doi={10.21437/Interspeech.2018-1456},
url={http://dx.doi.org/10.21437/Interspeech.2018-1456}
}
```
or arXiv:
```bibtex
@misc{watanabe2018espnet,
title={ESPnet: End-to-End Speech Processing Toolkit},
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
year={2018},
eprint={1804.00015},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
Kuray107/wsj0-5percent-supervised
|
Kuray107
| 2022-03-04T20:16:51Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-03T14:31:38Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: wsj0-5percent-supervised
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wsj0-5percent-supervised
This model is a fine-tuned version of [facebook/wav2vec2-large-lv60](https://huggingface.co/facebook/wav2vec2-large-lv60) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3883
- Wer: 0.1555
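For inference, the checkpoint should load through the standard Wav2Vec2 CTC classes. A minimal sketch, assuming the fine-tuned processor files were saved alongside the model and 16 kHz mono input (the audio path is a placeholder):
```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

processor = Wav2Vec2Processor.from_pretrained("Kuray107/wsj0-5percent-supervised")
model = Wav2Vec2ForCTC.from_pretrained("Kuray107/wsj0-5percent-supervised")

# Load a 16 kHz mono waveform (placeholder path)
speech, _ = librosa.load("example.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```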
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 12
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 300
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 6.0248 | 16.67 | 500 | 2.9406 | 1.0 |
| 2.0466 | 33.33 | 1000 | 0.3935 | 0.3300 |
| 0.1486 | 50.0 | 1500 | 0.3091 | 0.1931 |
| 0.052 | 66.67 | 2000 | 0.3562 | 0.2052 |
| 0.0309 | 83.33 | 2500 | 0.3252 | 0.1773 |
| 0.0228 | 100.0 | 3000 | 0.3360 | 0.1652 |
| 0.0177 | 116.67 | 3500 | 0.3423 | 0.1603 |
| 0.0142 | 133.33 | 4000 | 0.3416 | 0.1611 |
| 0.0119 | 150.0 | 4500 | 0.3663 | 0.1583 |
| 0.0094 | 166.67 | 5000 | 0.3617 | 0.1567 |
| 0.0093 | 183.33 | 5500 | 0.3738 | 0.1668 |
| 0.0079 | 200.0 | 6000 | 0.3881 | 0.1652 |
| 0.0065 | 216.67 | 6500 | 0.3752 | 0.1611 |
| 0.0056 | 233.33 | 7000 | 0.3798 | 0.1603 |
| 0.0057 | 250.0 | 7500 | 0.3944 | 0.1624 |
| 0.0047 | 266.67 | 8000 | 0.4038 | 0.1583 |
| 0.0041 | 283.33 | 8500 | 0.3928 | 0.1547 |
| 0.0036 | 300.0 | 9000 | 0.3883 | 0.1555 |
### Framework versions
- Transformers 4.14.1
- Pytorch 1.10.2
- Datasets 1.18.2
- Tokenizers 0.10.3
|
azaninello/distilgpt2-finetuned-shroomstoy
|
azaninello
| 2022-03-04T19:13:30Z | 7 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"gpt2",
"text-generation",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2022-03-04T19:07:36Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: distilgpt2-finetuned-shroomstoy
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilgpt2-finetuned-shroomstoy
This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 4.0958
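Since the reported loss is the mean cross-entropy per token, it maps directly to perplexity by exponentiation; a quick check:
```python
import math

# eval loss of 4.0958 corresponds to a perplexity of roughly 60 on the evaluation set
print(math.exp(4.0958))  # ~60.1
```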
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 10 | 4.1207 |
| No log | 2.0 | 20 | 4.1009 |
| No log | 3.0 | 30 | 4.0958 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.6
|
nimrah/wav2vec2-large-xls-r-300m-turkish-colab-9
|
nimrah
| 2022-03-04T18:24:21Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-04T17:28:10Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- common_voice
model-index:
- name: wav2vec2-large-xls-r-300m-turkish-colab-9
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-turkish-colab-9
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.03
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.10.3
|
nimrah/wav2vec2-large-xls-r-300m-hindi_home-colab-11
|
nimrah
| 2022-03-04T16:41:25Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-04T13:45:53Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- common_voice
model-index:
- name: wav2vec2-large-xls-r-300m-hindi_home-colab-11
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-hindi_home-colab-11
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 3.7649
- Wer: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.03
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 5.5971 | 44.43 | 400 | 3.7649 | 1.0 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.13.3
- Tokenizers 0.10.3
|
daisyxie21/bert-base-uncased-8-10-0.01
|
daisyxie21
| 2022-03-04T16:27:40Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"dataset:glue",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2022-03-04T14:27:09Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: bert-base-uncased-8-10-0.01
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.0
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-uncased-8-10-0.01
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8324
- Matthews Correlation: 0.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.01
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| No log | 1.0 | 400 | 0.8324 | 0.0 |
| 1.0904 | 2.0 | 800 | 1.3157 | 0.0 |
| 0.9461 | 3.0 | 1200 | 0.4407 | 0.0 |
| 0.9565 | 4.0 | 1600 | 2.1082 | 0.0 |
| 1.024 | 5.0 | 2000 | 0.7220 | 0.0 |
| 1.024 | 6.0 | 2400 | 0.7414 | 0.0 |
| 0.8362 | 7.0 | 2800 | 0.4442 | 0.0 |
| 0.6765 | 8.0 | 3200 | 0.5481 | 0.0 |
| 0.5902 | 9.0 | 3600 | 0.5642 | 0.0 |
| 0.5476 | 10.0 | 4000 | 0.4449 | 0.0 |
### Framework versions
- Transformers 4.16.2
- Pytorch 1.9.0
- Datasets 1.18.3
- Tokenizers 0.11.0
|
jiobiala24/wav2vec2-base-2
|
jiobiala24
| 2022-03-04T15:56:54Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"dataset:common_voice",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-04T04:00:58Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- common_voice
model-index:
- name: wav2vec2-base-2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-2
This model is a fine-tuned version of [jiobiala24/wav2vec2-base-1](https://huggingface.co/jiobiala24/wav2vec2-base-1) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9415
- Wer: 0.3076
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.4206 | 1.96 | 1000 | 0.6022 | 0.3435 |
| 0.3278 | 3.93 | 2000 | 0.6191 | 0.3344 |
| 0.2604 | 5.89 | 3000 | 0.6170 | 0.3288 |
| 0.2135 | 7.86 | 4000 | 0.6590 | 0.3239 |
| 0.1805 | 9.82 | 5000 | 0.7359 | 0.3289 |
| 0.1582 | 11.79 | 6000 | 0.7450 | 0.3276 |
| 0.1399 | 13.75 | 7000 | 0.7914 | 0.3218 |
| 0.1252 | 15.72 | 8000 | 0.8254 | 0.3185 |
| 0.1095 | 17.68 | 9000 | 0.8524 | 0.3184 |
| 0.1 | 19.65 | 10000 | 0.8340 | 0.3165 |
| 0.0905 | 21.61 | 11000 | 0.8846 | 0.3161 |
| 0.0819 | 23.58 | 12000 | 0.8994 | 0.3142 |
| 0.0763 | 25.54 | 13000 | 0.9018 | 0.3134 |
| 0.0726 | 27.5 | 14000 | 0.9552 | 0.3081 |
| 0.0668 | 29.47 | 15000 | 0.9415 | 0.3076 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.13.3
- Tokenizers 0.10.3
|
jish/distilgpt2-finetuned-wikitext2
|
jish
| 2022-03-04T15:14:19Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"gpt2",
"text-generation",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2022-03-04T14:44:11Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: distilgpt2-finetuned-wikitext2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilgpt2-finetuned-wikitext2
This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 3.6423
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.7602 | 1.0 | 2334 | 3.6669 |
| 3.633 | 2.0 | 4668 | 3.6455 |
| 3.6078 | 3.0 | 7002 | 3.6423 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.6
|
augustoortiz/bert-finetuned-squad2
|
augustoortiz
| 2022-03-04T12:53:53Z | 4 | 0 |
transformers
|
[
"transformers",
"tf",
"bert",
"question-answering",
"generated_from_keras_callback",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2022-03-02T23:29:05Z |
---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: augustoortiz/bert-finetuned-squad2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# augustoortiz/bert-finetuned-squad2
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 1.2223
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 11091, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16
### Training results
| Train Loss | Epoch |
|:----------:|:-----:|
| 1.2223 | 0 |
### Framework versions
- Transformers 4.17.0.dev0
- TensorFlow 2.8.0
- Datasets 1.18.3
- Tokenizers 0.11.0
|
Ayham/bert_ernie_summarization_cnn_dailymail
|
Ayham
| 2022-03-04T12:51:38Z | 6 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"encoder-decoder",
"text2text-generation",
"generated_from_trainer",
"dataset:cnn_dailymail",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-04T05:25:42Z |
---
tags:
- generated_from_trainer
datasets:
- cnn_dailymail
model-index:
- name: bert_ernie_summarization_cnn_dailymail
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert_ernie_summarization_cnn_dailymail
This model is a fine-tuned version of [](https://huggingface.co/) on the cnn_dailymail dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 3.0
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.12.0.dev0
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.10.3
|
NbAiLab/roberta_jan_512_ncc
|
NbAiLab
| 2022-03-04T11:44:03Z | 60 | 0 |
transformers
|
[
"transformers",
"jax",
"tensorboard",
"roberta",
"fill-mask",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2022-03-02T23:29:04Z |
---
license: cc-by-sa-4.0
---
|
gustavecortal/T0_3B-8bit
|
gustavecortal
| 2022-03-04T10:32:31Z | 6 | 10 |
transformers
|
[
"transformers",
"pytorch",
"t5",
"text2text-generation",
"en",
"fr",
"dataset:bigscience/P3",
"arxiv:2110.08207",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-02T23:29:05Z |
---
language: fr
license: mit
tags:
- en
datasets:
- bigscience/P3
---
### Quantized BigScience's T0 3B with 8-bit weights
This is a version of [BigScience's T0](https://huggingface.co/bigscience/T0_3B) with 3 billion parameters, modified so you can generate **and fine-tune the model in Colab or on an equivalent desktop GPU (e.g. a single 1080 Ti)**. Inspired by [GPT-J 8bit](https://huggingface.co/hivemind/gpt-j-6B-8bit).
Here's how to run it: [Colab notebook](https://colab.research.google.com/drive/1lMja-CPc0vm5_-gXNXAWU-9c0nom7vZ9)
This model can be easily loaded using the `T5ForConditionalGeneration` functionality:
```python
from transformers import T5ForConditionalGeneration
model = T5ForConditionalGeneration.from_pretrained("gustavecortal/T0_3B-8bit")
```
Before loading, you have to monkey-patch T5 so the 8-bit conversion is applied at initialization:
```python
import transformers

# convert_to_int8 is defined in the linked Colab notebook; it swaps the model's
# linear layers for 8-bit quantized equivalents at initialization time.
class T5ForConditionalGeneration(transformers.models.t5.modeling_t5.T5ForConditionalGeneration):
    def __init__(self, config):
        super().__init__(config)
        convert_to_int8(self)

transformers.models.t5.modeling_t5.T5ForConditionalGeneration = T5ForConditionalGeneration
```
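Once patched and loaded, generation goes through the usual seq2seq API; a minimal sketch (the tokenizer id and prompt are illustrative assumptions, not from the original card):
```python
from transformers import AutoTokenizer

# The quantized weights share the original T0_3B tokenizer (assumption)
tokenizer = AutoTokenizer.from_pretrained("bigscience/T0_3B")

inputs = tokenizer(
    "Is this review positive or negative? Review: this is the best cast iron skillet you will ever buy",
    return_tensors="pt",
)
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```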
## Model Description
T0* shows zero-shot task generalization on English natural language prompts, outperforming GPT-3 on many tasks, while being 16x smaller. It is a series of encoder-decoder models trained on a large set of different tasks specified in natural language prompts. We convert numerous English supervised datasets into prompts, each with multiple templates using varying formulations. These prompted datasets allow for benchmarking the ability of a model to perform completely unseen tasks specified in natural language. To obtain T0*, we fine-tune a pretrained language model on this multitask mixture covering many different NLP tasks.
## Links
* [BigScience](https://bigscience.huggingface.co/)
* [Hivemind](https://training-transformers-together.github.io/)
* [Gustave Cortal](https://twitter.com/gustavecortal)
```bibtex
@misc{sanh2021multitask,
title={Multitask Prompted Training Enables Zero-Shot Task Generalization},
author={Victor Sanh and Albert Webson and Colin Raffel and Stephen H. Bach and Lintang Sutawika and Zaid Alyafeai and Antoine Chaffin and Arnaud Stiegler and Teven Le Scao and Arun Raja and Manan Dey and M Saiful Bari and Canwen Xu and Urmish Thakker and Shanya Sharma Sharma and Eliza Szczechla and Taewoon Kim and Gunjan Chhablani and Nihal Nayak and Debajyoti Datta and Jonathan Chang and Mike Tian-Jian Jiang and Han Wang and Matteo Manica and Sheng Shen and Zheng Xin Yong and Harshit Pandey and Rachel Bawden and Thomas Wang and Trishala Neeraj and Jos Rozen and Abheesht Sharma and Andrea Santilli and Thibault Fevry and Jason Alan Fries and Ryan Teehan and Stella Biderman and Leo Gao and Tali Bers and Thomas Wolf and Alexander M. Rush},
year={2021},
eprint={2110.08207},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
```
|
kabelomalapane/Helsinki-NLP-opus-finetuned-en-to-zu
|
kabelomalapane
| 2022-03-04T08:53:37Z | 3 | 0 |
transformers
|
[
"transformers",
"tf",
"marian",
"text2text-generation",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-03T17:46:12Z |
---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: kabelomalapane/Helsinki-NLP-opus-finetuned-en-to-zu
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# kabelomalapane/Helsinki-NLP-opus-finetuned-en-to-zu
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-mul](https://huggingface.co/Helsinki-NLP/opus-mt-en-mul) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 1.5907
- Validation Loss: 1.6321
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
This model is intended for translating English into Zulu. There are still some problems running it, so it is likely to be modified further; a usage sketch follows below.
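Given that caveat, inference may need adjustments, but the intended usage would follow the standard translation pipeline. A minimal sketch — `framework="tf"` because only TensorFlow weights are published, and note that the base multilingual model expects a target-language token such as `>>zul<<` prefixed to the input, which the fine-tuned model may or may not still require:
```python
from transformers import pipeline

translator = pipeline(
    "translation",
    model="kabelomalapane/Helsinki-NLP-opus-finetuned-en-to-zu",
    framework="tf",  # only TF weights are available for this checkpoint
)
# The >>zul<< language token is carried over from the multilingual base model (assumption)
print(translator(">>zul<< Good morning, how are you?")[0]["translation_text"])
```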
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': 783, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 2.1622 | 1.7379 | 0 |
| 1.7292 | 1.6529 | 1 |
| 1.5907 | 1.6321 | 2 |
### Framework versions
- Transformers 4.16.2
- TensorFlow 2.8.0
- Datasets 1.18.3
- Tokenizers 0.11.0
|
Yulinfeng/wsj0_2mix_enh_train_enh_mdc_raw_valid.si_snr.ave
|
Yulinfeng
| 2022-03-04T07:19:47Z | 0 | 0 |
espnet
|
[
"espnet",
"audio",
"audio-to-audio",
"en",
"dataset:wsj0_2mix",
"arxiv:1804.00015",
"license:cc-by-4.0",
"region:us"
] |
audio-to-audio
| 2022-03-04T07:19:31Z |
---
tags:
- espnet
- audio
- audio-to-audio
language: en
datasets:
- wsj0_2mix
license: cc-by-4.0
---
## ESPnet2 ENH model
### `Yulinfeng/wsj0_2mix_enh_train_enh_mdc_raw_valid.si_snr.ave`
This model was trained by earthmanylf using wsj0_2mix recipe in [espnet](https://github.com/espnet/espnet/).
### Demo: How to use in ESPnet2
```bash
cd espnet
git checkout ec1acec03d109f06d829b80862e0388f7234d0d1
pip install -e .
cd egs2/wsj0_2mix/enh1
./run.sh --skip_data_prep false --skip_train true --download_model Yulinfeng/wsj0_2mix_enh_train_enh_mdc_raw_valid.si_snr.ave
```
<!-- Generated by ./scripts/utils/show_enh_score.sh -->
# RESULTS
## Environments
- date: `Thu Mar 3 17:10:03 CST 2022`
- python version: `3.8.10 (default, May 19 2021, 18:05:58) [GCC 7.3.0]`
- espnet version: `espnet 0.10.7a1`
- pytorch version: `pytorch 1.5.1+cu101`
- Git hash: `ec1acec03d109f06d829b80862e0388f7234d0d1`
- Commit date: `Fri Feb 25 14:12:45 2022 +0800`
## enh_train_enh_mdc_raw
config: conf/tuning/train_enh_mdc.yaml
|dataset|PESQ|STOI|SAR|SDR|SIR|SI_SNR|
|---|---|---|---|---|---|---|
|enhanced_cv_min_8k|2.20|0.84|9.62|8.57|17.27|8.03|
|enhanced_tt_min_8k|2.18|0.85|9.56|8.50|17.28|7.97|
## ENH config
<details><summary>expand</summary>
```
config: conf/tuning/train_enh_mdc.yaml
print_config: false
log_level: INFO
dry_run: false
iterator_type: sequence
output_dir: exp/enh_train_enh_mdc_raw
ngpu: 1
seed: 0
num_workers: 4
num_att_plot: 3
dist_backend: nccl
dist_init_method: env://
dist_world_size: null
dist_rank: null
local_rank: 0
dist_master_addr: null
dist_master_port: null
dist_launcher: null
multiprocessing_distributed: false
unused_parameters: false
sharded_ddp: false
cudnn_enabled: true
cudnn_benchmark: false
cudnn_deterministic: true
collect_stats: false
write_collected_feats: false
max_epoch: 100
patience: 10
val_scheduler_criterion:
- valid
- loss
early_stopping_criterion:
- valid
- loss
- min
best_model_criterion:
- - valid
- si_snr
- max
- - valid
- loss
- min
keep_nbest_models: 1
nbest_averaging_interval: 0
grad_clip: 5.0
grad_clip_type: 2.0
grad_noise: false
accum_grad: 1
no_forward_run: false
resume: true
train_dtype: float32
use_amp: false
log_interval: null
use_matplotlib: true
use_tensorboard: true
use_wandb: false
wandb_project: null
wandb_id: null
wandb_entity: null
wandb_name: null
wandb_model_log_interval: -1
detect_anomaly: false
pretrain_path: null
init_param: []
ignore_init_mismatch: false
freeze_param: []
num_iters_per_epoch: null
batch_size: 8
valid_batch_size: null
batch_bins: 1000000
valid_batch_bins: null
train_shape_file:
- exp/enh_stats_8k/train/speech_mix_shape
- exp/enh_stats_8k/train/speech_ref1_shape
- exp/enh_stats_8k/train/speech_ref2_shape
valid_shape_file:
- exp/enh_stats_8k/valid/speech_mix_shape
- exp/enh_stats_8k/valid/speech_ref1_shape
- exp/enh_stats_8k/valid/speech_ref2_shape
batch_type: folded
valid_batch_type: null
fold_length:
- 80000
- 80000
- 80000
sort_in_batch: descending
sort_batch: descending
multiple_iterator: false
chunk_length: 500
chunk_shift_ratio: 0.5
num_cache_chunks: 1024
train_data_path_and_name_and_type:
- - dump/raw/tr_min_8k/wav.scp
- speech_mix
- sound
- - dump/raw/tr_min_8k/spk1.scp
- speech_ref1
- sound
- - dump/raw/tr_min_8k/spk2.scp
- speech_ref2
- sound
valid_data_path_and_name_and_type:
- - dump/raw/cv_min_8k/wav.scp
- speech_mix
- sound
- - dump/raw/cv_min_8k/spk1.scp
- speech_ref1
- sound
- - dump/raw/cv_min_8k/spk2.scp
- speech_ref2
- sound
allow_variable_data_keys: false
max_cache_size: 0.0
max_cache_fd: 32
valid_max_cache_size: null
optim: adam
optim_conf:
lr: 0.001
eps: 1.0e-08
weight_decay: 1.0e-07
scheduler: reducelronplateau
scheduler_conf:
mode: min
factor: 0.7
patience: 1
init: xavier_uniform
model_conf:
stft_consistency: false
loss_type: mask_mse
mask_type: PSM
ref_channel: 0
criterions:
- name: dpcl
conf:
loss_type: mdc
wrapper: dpcl
wrapper_conf:
weight: 1.0
use_preprocessor: false
encoder: stft
encoder_conf:
n_fft: 256
hop_length: 128
separator: dpcl
separator_conf:
rnn_type: blstm
num_spk: 2
nonlinear: relu
layer: 2
unit: 500
dropout: 0.1
emb_D: 40
decoder: stft
decoder_conf:
n_fft: 256
hop_length: 128
required:
- output_dir
version: 0.10.7a1
distributed: false
```
</details>
### Citing ESPnet
```BibTex
@inproceedings{watanabe2018espnet,
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
title={{ESPnet}: End-to-End Speech Processing Toolkit},
year={2018},
booktitle={Proceedings of Interspeech},
pages={2207--2211},
doi={10.21437/Interspeech.2018-1456},
url={http://dx.doi.org/10.21437/Interspeech.2018-1456}
}
@inproceedings{ESPnet-SE,
author = {Chenda Li and Jing Shi and Wangyou Zhang and Aswin Shanmugam Subramanian and Xuankai Chang and
Naoyuki Kamo and Moto Hira and Tomoki Hayashi and Christoph B{\"{o}}ddeker and Zhuo Chen and Shinji Watanabe},
title = {ESPnet-SE: End-To-End Speech Enhancement and Separation Toolkit Designed for {ASR} Integration},
booktitle = {{IEEE} Spoken Language Technology Workshop, {SLT} 2021, Shenzhen, China, January 19-22, 2021},
pages = {785--792},
publisher = {{IEEE}},
year = {2021},
url = {https://doi.org/10.1109/SLT48900.2021.9383615},
doi = {10.1109/SLT48900.2021.9383615},
timestamp = {Mon, 12 Apr 2021 17:08:59 +0200},
biburl = {https://dblp.org/rec/conf/slt/Li0ZSCKHHBC021.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
or arXiv:
```bibtex
@misc{watanabe2018espnet,
title={ESPnet: End-to-End Speech Processing Toolkit},
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
year={2018},
eprint={1804.00015},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
batterydata/batteryonlybert-uncased-squad-v1
|
batterydata
| 2022-03-03T20:25:01Z | 16 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"question answering",
"en",
"dataset:squad",
"dataset:batterydata/battery-device-data-qa",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2022-03-02T23:29:05Z |
---
language: en
tags: question answering
license: apache-2.0
datasets:
- squad
- batterydata/battery-device-data-qa
metrics: squad
---
# BatteryOnlyBERT-uncased for QA
**Language model:** batteryonlybert-uncased
**Language:** English
**Downstream-task:** Extractive QA
**Training data:** SQuAD v1
**Eval data:** SQuAD v1
**Code:** See [example](https://github.com/ShuHuang/batterybert)
**Infrastructure**: 8x DGX A100
## Hyperparameters
```
batch_size = 16
n_epochs = 2
base_LM_model = "batteryonlybert-uncased"
max_seq_len = 386
learning_rate = 2e-5
doc_stride=128
max_query_length=64
```
## Performance
Evaluated on the SQuAD v1.0 dev set.
```
"exact": 79.53,
"f1": 87.22,
```
Evaluated on the battery device dataset.
```
"precision": 67.20,
"recall": 83.82,
```
## Usage
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "batterydata/batteryonlybert-uncased-squad-v1"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
'question': 'What is the electrolyte?',
'context': 'The typical non-aqueous electrolyte for commercial Li-ion cells is a solution of LiPF6 in linear and cyclic carbonates.'
}
res = nlp(QA_input)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
## Authors
Shu Huang: `sh2009 [at] cam.ac.uk`
Jacqueline Cole: `jmc61 [at] cam.ac.uk`
## Citation
BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement
|
mcdzwil/bert-base-NER-finetuned-ner-ISU
|
mcdzwil
| 2022-03-03T20:21:38Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"token-classification",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2022-03-03T20:12:34Z |
---
license: mit
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: bert-base-NER-finetuned-ner-ISU
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-NER-finetuned-ner-ISU
This model is a fine-tuned version of [dslim/bert-base-NER](https://huggingface.co/dslim/bert-base-NER) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1090
- Precision: 0.9408
- Recall: 0.8223
- F1: 0.8776
- Accuracy: 0.9644
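A minimal inference sketch with the token-classification pipeline (the example sentence is a placeholder, and the exact label set of this fine-tuned head is not documented in the card):
```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="mcdzwil/bert-base-NER-finetuned-ner-ISU",
    aggregation_strategy="simple",  # merge word pieces into entity spans
)
print(ner("Iowa State University is located in Ames, Iowa."))
```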
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 48 | 0.1411 | 0.8970 | 0.7840 | 0.8367 | 0.9473 |
| No log | 2.0 | 96 | 0.1231 | 0.9453 | 0.7964 | 0.8645 | 0.9589 |
| No log | 3.0 | 144 | 0.1090 | 0.9408 | 0.8223 | 0.8776 | 0.9644 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.6
|
batterydata/bert-base-cased-squad-v1
|
batterydata
| 2022-03-03T19:54:26Z | 71 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bert",
"question-answering",
"question answering",
"en",
"dataset:squad",
"dataset:batterydata/battery-device-data-qa",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2022-03-02T23:29:05Z |
---
language: en
tags: question answering
license: apache-2.0
datasets:
- squad
- batterydata/battery-device-data-qa
metrics: squad
---
# BERT-base-cased for QA
**Language model:** bert-base-cased
**Language:** English
**Downstream-task:** Extractive QA
**Training data:** SQuAD v1
**Eval data:** SQuAD v1
**Code:** See [example](https://github.com/ShuHuang/batterybert)
**Infrastructure**: 8x DGX A100
## Hyperparameters
```
batch_size = 32
n_epochs = 2
base_LM_model = "bert-base-cased"
max_seq_len = 386
learning_rate = 5e-5
doc_stride=128
max_query_length=64
```
## Performance
Evaluated on the SQuAD v1.0 dev set.
```
"exact": 81.30,
"f1": 88.58,
```
Evaluated on the battery device dataset.
```
"precision": 67.02,
"recall": 80.15,
```
## Usage
### In Transformers
```python
from transformers import AutoModelForQuestionAnswering, AutoTokenizer, pipeline
model_name = "batterydata/bert-base-cased-squad-v1"
# a) Get predictions
nlp = pipeline('question-answering', model=model_name, tokenizer=model_name)
QA_input = {
'question': 'What is the electrolyte?',
'context': 'The typical non-aqueous electrolyte for commercial Li-ion cells is a solution of LiPF6 in linear and cyclic carbonates.'
}
res = nlp(QA_input)
# b) Load model & tokenizer
model = AutoModelForQuestionAnswering.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
## Authors
Shu Huang: `sh2009 [at] cam.ac.uk`
Jacqueline Cole: `jmc61 [at] cam.ac.uk`
## Citation
BatteryBERT: A Pre-trained Language Model for Battery Database Enhancement
|
Kevincp560/t5-base-finetuned-pubmed
|
Kevincp560
| 2022-03-03T16:06:16Z | 10 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"dataset:pub_med_summarization_dataset",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2022-03-03T13:28:37Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- pub_med_summarization_dataset
metrics:
- rouge
model-index:
- name: t5-base-finetuned-pubmed
results:
- task:
name: Sequence-to-sequence Language Modeling
type: text2text-generation
dataset:
name: pub_med_summarization_dataset
type: pub_med_summarization_dataset
args: document
metrics:
- name: Rouge1
type: rouge
value: 9.3771
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# t5-base-finetuned-pubmed
This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on the pub_med_summarization_dataset dataset.
It achieves the following results on the evaluation set:
- Loss: 2.6311
- Rouge1: 9.3771
- Rouge2: 3.7042
- Rougel: 8.4912
- Rougelsum: 9.0013
- Gen Len: 19.0
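A minimal inference sketch using the summarization pipeline (the input is a placeholder, and given the low ROUGE scores above, outputs should be inspected carefully):
```python
from transformers import pipeline

summarizer = pipeline("summarization", model="Kevincp560/t5-base-finetuned-pubmed")

article = "..."  # a PubMed-style article body (placeholder)
print(summarizer(article, max_length=128, min_length=32)[0]["summary_text"])
```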
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:-----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 2.0957 | 1.0 | 4000 | 1.9006 | 8.6968 | 3.2473 | 7.9565 | 8.3224 | 19.0 |
| 2.0489 | 2.0 | 8000 | 1.8571 | 8.6877 | 3.2461 | 7.9311 | 8.2991 | 19.0 |
| 2.7345 | 3.0 | 12000 | 2.6112 | 9.585 | 3.0129 | 8.4729 | 9.1109 | 19.0 |
| 3.0585 | 4.0 | 16000 | 2.7222 | 9.7011 | 3.3549 | 8.6588 | 9.2646 | 19.0 |
| 2.9437 | 5.0 | 20000 | 2.6311 | 9.3771 | 3.7042 | 8.4912 | 9.0013 | 19.0 |
### Framework versions
- Transformers 4.16.2
- Pytorch 1.10.0+cu111
- Datasets 1.18.3
- Tokenizers 0.11.6
|
nateraw/keras-dummy-model-mixin-demo-w-card
|
nateraw
| 2022-03-03T15:55:09Z | 0 | 0 |
keras
|
[
"keras",
"tf-keras",
"region:us"
] | null | 2022-03-02T23:29:05Z |
---
library_name: keras
---
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training Metrics
Model history needed
## Model Plot
<details>
<summary>View Model Plot</summary>

</details>
|