| repo_id | author | model_type | files_per_repo | downloads_30d | library | likes | pipeline | pytorch | tensorflow | jax | license | languages | datasets | co2 | prs_count | prs_open | prs_merged | prs_closed | discussions_count | discussions_open | discussions_closed | tags | has_model_index | has_metadata | has_text | text_length | is_nc | readme | hash |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| reichenbach/wav2vec2-large-xls-r-300m-hi | reichenbach | wav2vec2 | 30 | 9 | transformers | 1 | automatic-speech-recognition | true | false | false | apache-2.0 | ['hi'] | ['common_voice'] | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['generated_from_trainer', 'hf-asr-leaderboard', 'robust-speech-event'] | true | true | true | 1,906 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-hi
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4749
- Wer: 0.9420
## Model description
More information needed
## Intended uses & limitations
More information needed
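Pending a fuller description, here is a minimal inference sketch (our illustration, assuming standard 🤗 Transformers pipeline usage; the audio file name is a placeholder):
```python
from transformers import pipeline

# hypothetical usage sketch; "sample_hi.wav" is a placeholder 16 kHz audio file
asr = pipeline("automatic-speech-recognition", model="reichenbach/wav2vec2-large-xls-r-300m-hi")
print(asr("sample_hi.wav")["text"])
```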
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7.5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 50
- mixed_precision_training: Native AMP
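For reference, these settings map onto 🤗 `TrainingArguments` roughly as follows (a sketch, not the author's exact training script):
```python
from transformers import TrainingArguments

# approximate mapping of the hyperparameters listed above (sketch only)
args = TrainingArguments(
    output_dir="wav2vec2-large-xls-r-300m-hi",
    learning_rate=7.5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 16 * 2 = total train batch size of 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=50,
    fp16=True,  # "Native AMP"
)
```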
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 9.8626 | 4.76 | 400 | 3.6151 | 1.0 |
| 3.5463 | 9.52 | 800 | 3.5778 | 1.0 |
| 3.4415 | 14.28 | 1200 | 3.4525 | 1.0 |
| 3.0927 | 19.05 | 1600 | 2.6220 | 0.9860 |
| 2.0573 | 23.8 | 2000 | 2.3974 | 0.9610 |
| 1.5905 | 28.57 | 2400 | 2.4427 | 0.9558 |
| 1.426 | 33.33 | 2800 | 2.4736 | 0.9475 |
| 1.3147 | 38.09 | 3200 | 2.4494 | 0.9417 |
| 1.2642 | 42.85 | 3600 | 2.4665 | 0.9450 |
| 1.2289 | 47.62 | 4000 | 2.4749 | 0.9420 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.10.3
| a983f532738e6c9ec04454fd0d2b3bb7 |
| microsoft/git-large | microsoft | git | 10 | 208 | transformers | 1 | image-to-text | true | false | false | mit | ['en'] | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['vision', 'image-captioning'] | false | true | true | 2,941 | false |
# GIT (GenerativeImage2Text), large-sized
GIT (short for GenerativeImage2Text) model, large-sized version. It was introduced in the paper [GIT: A Generative Image-to-text Transformer for Vision and Language](https://arxiv.org/abs/2205.14100) by Wang et al. and first released in [this repository](https://github.com/microsoft/GenerativeImage2Text).
Disclaimer: The team releasing GIT did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
GIT is a Transformer decoder conditioned on both CLIP image tokens and text tokens. The model is trained using "teacher forcing" on a lot of (image, text) pairs.
The goal for the model is simply to predict the next text token, given the image tokens and the previous text tokens.
The model has full access to (i.e. a bidirectional attention mask is used for) the image patch tokens, but only has access to the previous text tokens (i.e. a causal attention mask is used for the text tokens) when predicting the next text token.

This allows the model to be used for tasks like:
- image and video captioning
- visual question answering (VQA) on images and videos
- even image classification (by simply conditioning the model on the image and asking it to generate a class for it in text).
## Intended uses & limitations
You can use the raw model for image captioning. See the [model hub](https://huggingface.co/models?search=microsoft/git) to look for
fine-tuned versions on a task that interests you.
### How to use
For code examples, we refer to the [documentation](https://huggingface.co/transformers/main/model_doc/git.html).
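As a quick orientation, a minimal captioning sketch (our illustration using the generic Auto classes; see the documentation for the canonical example):
```python
import requests
from PIL import Image
from transformers import AutoProcessor, AutoModelForCausalLM

processor = AutoProcessor.from_pretrained("microsoft/git-large")
model = AutoModelForCausalLM.from_pretrained("microsoft/git-large")

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

# encode the image to CLIP patch tokens, then decode a caption autoregressively
pixel_values = processor(images=image, return_tensors="pt").pixel_values
generated_ids = model.generate(pixel_values=pixel_values, max_length=50)
print(processor.batch_decode(generated_ids, skip_special_tokens=True)[0])
```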
## Training data
From the paper:
> We collect 0.8B image-text pairs for pre-training, which include COCO (Lin et al., 2014), Conceptual Captions
(CC3M) (Sharma et al., 2018), SBU (Ordonez et al., 2011), Visual Genome (VG) (Krishna et al., 2016),
Conceptual Captions (CC12M) (Changpinyo et al., 2021), ALT200M (Hu et al., 2021a), and an extra 0.6B
data following a similar collection procedure in Hu et al. (2021a).
Note, however, that this describes the model referred to as "GIT" in the paper, which is not open-sourced.
This checkpoint is "GIT-large", which is a smaller variant of GIT trained on 20 million image-text pairs.
See table 11 in the [paper](https://arxiv.org/abs/2205.14100) for more details.
### Preprocessing
We refer to the original repo regarding details for preprocessing during training.
During validation, the shorter edge of each image is resized, after which a center crop is taken at a fixed-size resolution. Next, frames are normalized across the RGB channels with the ImageNet mean and standard deviation.
## Evaluation results
For evaluation results, we refer readers to the [paper](https://arxiv.org/abs/2205.14100).
| 3b96d9f9b3696d8c0794d6344e701575 |
| redevaaa/fin1 | redevaaa | bert | 14 | 3 | transformers | 0 | token-classification | true | false | false | apache-2.0 | null | ['fin'] | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['generated_from_trainer'] | true | true | true | 1,682 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fin1
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the fin dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0778
- Precision: 0.8315
- Recall: 0.9243
- F1: 0.8755
- Accuracy: 0.9852
## Model description
More information needed
## Intended uses & limitations
More information needed
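Pending a fuller description, a minimal token-classification sketch (our illustration, assuming standard 🤗 Transformers pipeline usage; the example sentence is a placeholder):
```python
from transformers import pipeline

# hypothetical usage sketch for this financial NER model
ner = pipeline("token-classification", model="redevaaa/fin1", aggregation_strategy="simple")
print(ner("Penn Virginia Corp. is headquartered in Radnor, Pennsylvania."))
```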
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 129 | 0.0860 | 0.8535 | 0.9283 | 0.8893 | 0.9904 |
| No log | 2.0 | 258 | 0.1513 | 0.7993 | 0.9203 | 0.8556 | 0.9799 |
| No log | 3.0 | 387 | 0.0977 | 0.8221 | 0.9203 | 0.8684 | 0.9831 |
| 0.0017 | 4.0 | 516 | 0.0783 | 0.8286 | 0.9243 | 0.8738 | 0.9848 |
| 0.0017 | 5.0 | 645 | 0.0778 | 0.8315 | 0.9243 | 0.8755 | 0.9852 |
### Framework versions
- Transformers 4.25.1
- Pytorch 1.13.0+cu116
- Datasets 2.7.1
- Tokenizers 0.13.2
| 1ab446c17d7b1d1ab5f5f25472d74cb9 |
| bertin-project/bertin-base-gaussian | bertin-project | roberta | 71 | 14 | transformers | 0 | fill-mask | true | false | true | cc-by-4.0 | ['es'] | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['spanish', 'roberta'] | false | true | true | 1,206 | false |
This is a **RoBERTa-base** model trained from scratch in Spanish.
The training dataset is [mc4](https://huggingface.co/datasets/bertin-project/mc4-es-sampled), subsampled to a total of about 50 million documents. Sampling is biased towards average perplexity values (using a Gaussian function), more often discarding documents with very large perplexity values (poor quality) or very small ones (short, repetitive texts).
This model has been trained for 250,000 steps.
Please see our main [card](https://huggingface.co/bertin-project/bertin-roberta-base-spanish) for more information.
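As a quick check of the model, a minimal fill-mask sketch (our illustration; the Spanish example sentence is a placeholder):
```python
from transformers import pipeline

# RoBERTa-style models use "<mask>" as the mask token
fill = pipeline("fill-mask", model="bertin-project/bertin-base-gaussian")
for pred in fill("Madrid es la <mask> de España."):
    print(pred["token_str"], round(pred["score"], 3))
```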
This is part of the
[Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/) and TPU usage sponsored by Google.
## Team members
- Eduardo González ([edugp](https://huggingface.co/edugp))
- Javier de la Rosa ([versae](https://huggingface.co/versae))
- Manu Romero ([mrm8488](https://huggingface.co/))
- María Grandury ([mariagrandury](https://huggingface.co/))
- Pablo González de Prado ([Pablogps](https://huggingface.co/Pablogps))
- Paulo Villegas ([paulo](https://huggingface.co/paulo))
| 84910a0becd8acf97fb05a138a805418 |
| sameearif88/wav2vec2-base-timit-demo-colab1 | sameearif88 | wav2vec2 | 14 | 5 | transformers | 0 | automatic-speech-recognition | true | false | false | apache-2.0 | null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['generated_from_trainer'] | true | true | true | 1,342 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-timit-demo-colab1
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7411
- Wer: 0.5600
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 5.0773 | 13.89 | 500 | 3.1073 | 1.0 |
| 1.2444 | 27.78 | 1000 | 0.7411 | 0.5600 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.11.0+cu113
- Datasets 1.18.3
- Tokenizers 0.10.3
| 7fd7bde366715d6ec8b1b93a2d568db5 |
| Kuaaangwen/bert-base-cased-finetuned-revision-booklet-chemistry | Kuaaangwen | bert | 9 | 3 | transformers | 0 | fill-mask | true | false | false | apache-2.0 | null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['generated_from_trainer'] | true | true | true | 1,377 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-base-cased-finetuned-revision-booklet-chemistry
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4864
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 105 | 1.8484 |
| No log | 2.0 | 210 | 1.6418 |
| No log | 3.0 | 315 | 1.5820 |
| No log | 4.0 | 420 | 1.4826 |
| 1.8696 | 5.0 | 525 | 1.4521 |
### Framework versions
- Transformers 4.25.1
- Pytorch 1.13.0+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2
| 845229438346dc8989b243fd6a17da75 |
| bnsh/ddpm-butterflies-128 | bnsh | null | 13 | 0 | diffusers | 0 | null | false | false | false | apache-2.0 | ['en'] | ['huggan/smithsonian_butterflies_subset'] | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | [] | false | true | true | 1,226 | false |
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# ddpm-butterflies-128
## Model description
This diffusion model is trained with the [🤗 Diffusers](https://github.com/huggingface/diffusers) library
on the `huggan/smithsonian_butterflies_subset` dataset.
## Intended uses & limitations
#### How to use
```python
# a minimal sketch (assumed usage) with the 🤗 Diffusers DDPMPipeline
from diffusers import DDPMPipeline
pipeline = DDPMPipeline.from_pretrained("bnsh/ddpm-butterflies-128")
image = pipeline().images[0]  # one generated butterfly image
```
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training data
[TODO: describe the data used to train the model]
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- gradient_accumulation_steps: 1
- optimizer: AdamW with betas=(None, None), weight_decay=None and epsilon=None
- lr_scheduler: None
- lr_warmup_steps: 500
- ema_inv_gamma: None
- mixed_precision: fp16
### Training results
📈 [TensorBoard logs](https://huggingface.co/bnsh/ddpm-butterflies-128/tensorboard?#scalars)
| 0c16b6282239aed707c26869c67b229f |
| fathyshalab/massive_alarm-roberta-large-v1-5-50 | fathyshalab | roberta | 14 | 2 | sentence-transformers | 0 | text-classification | true | false | false | apache-2.0 | null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['setfit', 'sentence-transformers', 'text-classification'] | false | true | true | 1,460 | false |
# fathyshalab/massive_alarm-roberta-large-v1-5-50
This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Usage
To use this model for inference, first install the SetFit library:
```bash
python -m pip install setfit
```
You can then run inference as follows:
```python
from setfit import SetFitModel
# Download from Hub and run inference
model = SetFitModel.from_pretrained("fathyshalab/massive_alarm-roberta-large-v1-5-50")
# Run inference
preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"])
```
## BibTeX entry and citation info
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
| 30144d9fa0f5322d92ecf8afcd2e714d |
| Jellevdl/checkpoint-24453 | Jellevdl | bert | 10 | 19 | transformers | 0 | question-answering | true | false | false | apache-2.0 | null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['generated_from_trainer'] | true | true | true | 946 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# checkpoint-24453
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
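Pending a fuller description, a minimal extractive-QA sketch (our illustration, assuming standard 🤗 Transformers pipeline usage; question and context are placeholders):
```python
from transformers import pipeline

# hypothetical usage sketch for this SQuAD-style checkpoint
qa = pipeline("question-answering", model="Jellevdl/checkpoint-24453")
print(qa(question="Where is the Eiffel Tower?", context="The Eiffel Tower is located in Paris."))
```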
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 12
- eval_batch_size: 12
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 48
- optimizer: Adafactor
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 6000
- num_epochs: 30
### Framework versions
- Transformers 4.18.0
- Pytorch 1.5.0
- Datasets 2.4.0
- Tokenizers 0.12.1
| 2c0902c01dc6d1a635c1c9735f8fb092 |
| EMBO/sd-geneprod-roles | EMBO | roberta | 10 | 9 | transformers | 0 | token-classification | true | false | false | agpl-3.0 | ['english'] | ['EMBO/sd-nlp'] | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | ['token classification'] | false | true | true | 3,139 | false |
# sd-geneprod-roles
## Model description
This model is a [RoBERTa base model](https://huggingface.co/roberta-base) that was further trained using a masked language modeling task on a compendium of English scientific textual examples from the life sciences using the [BioLang dataset](https://huggingface.co/datasets/EMBO/biolang). It was then fine-tuned for token classification on the SourceData [sd-nlp](https://huggingface.co/datasets/EMBO/sd-nlp) dataset with the `GENEPROD_ROLES` configuration to perform pure context-dependent semantic role classification of bioentities.
## Intended uses & limitations
#### How to use
The intended use of this model is to infer the semantic role of gene products (genes and proteins) with regard to the causal hypotheses tested in experiments reported in scientific papers.
To have a quick check of the model:
```python
from transformers import pipeline, RobertaTokenizerFast, RobertaForTokenClassification

example = """<s>The <mask> overexpression in cells caused an increase in <mask> expression.</s>"""
tokenizer = RobertaTokenizerFast.from_pretrained('roberta-base', max_len=512)
model = RobertaForTokenClassification.from_pretrained('EMBO/sd-geneprod-roles')
ner = pipeline('ner', model, tokenizer=tokenizer)
res = ner(example)
for r in res:
    print(r['word'], r['entity'])  # each token with its predicted semantic role
```
#### Limitations and bias
The model must be used with the `roberta-base` tokenizer.
## Training data
The model was trained for token classification using the [EMBO/sd-nlp dataset](https://huggingface.co/datasets/EMBO/sd-nlp) which includes manually annotated examples.
## Training procedure
The training was run on an NVIDIA DGX Station with 4× Tesla V100 GPUs.
Training code is available at https://github.com/source-data/soda-roberta
- Model fine-tuned: EMBL/bio-lm
- Tokenizer vocab size: 50265
- Training data: EMBO/sd-nlp
- Dataset configuration: GENEPROD_ROLES
- Training with 48771 examples.
- Evaluating on 13801 examples.
- Training on 15 features: O, I-CONTROLLED_VAR, B-CONTROLLED_VAR, I-MEASURED_VAR, B-MEASURED_VAR
- Epochs: 0.9
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `learning_rate`: 0.0001
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
## Eval results
On 7,178 examples of the test set, with `sklearn.metrics`:
```
                precision    recall  f1-score   support
CONTROLLED_VAR       0.81      0.86      0.83      7835
  MEASURED_VAR       0.82      0.85      0.84      9330
     micro avg       0.82      0.85      0.83     17165
     macro avg       0.82      0.85      0.83     17165
  weighted avg       0.82      0.85      0.83     17165
{'test_loss': 0.03846803680062294, 'test_accuracy_score': 0.9854472664459946, 'test_precision': 0.8156312625250501, 'test_recall': 0.8535974366443344, 'test_f1': 0.8341825841897008, 'test_runtime': 58.7369, 'test_samples_per_second': 122.206, 'test_steps_per_second': 1.924}
```
| 52fa5a191b21623256806eddcbc2ca21 |
| Minxuan/distilbert-base-uncased-finetuned-emotion | Minxuan | distilbert | 14 | 6 | transformers | 0 | text-classification | true | false | false | apache-2.0 | null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['generated_from_trainer'] | true | true | true | 1,907 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-emotion
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2055
- Accuracy: 0.9355
- F1: 0.9354
## Model description
More information needed
## Intended uses & limitations
More information needed
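Pending a fuller description, a minimal text-classification sketch (our illustration; the input sentence is a placeholder):
```python
from transformers import pipeline

# hypothetical usage sketch for this emotion classifier
classifier = pipeline("text-classification", model="Minxuan/distilbert-base-uncased-finetuned-emotion")
print(classifier("I can't wait to see you again!"))
```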
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.1775 | 1.0 | 250 | 0.1765 | 0.929 | 0.9287 |
| 0.1205 | 2.0 | 500 | 0.1516 | 0.9395 | 0.9393 |
| 0.0981 | 3.0 | 750 | 0.1530 | 0.9345 | 0.9351 |
| 0.0799 | 4.0 | 1000 | 0.1654 | 0.935 | 0.9348 |
| 0.0641 | 5.0 | 1250 | 0.1638 | 0.937 | 0.9364 |
| 0.0495 | 6.0 | 1500 | 0.1695 | 0.937 | 0.9369 |
| 0.0417 | 7.0 | 1750 | 0.1873 | 0.935 | 0.9350 |
| 0.0332 | 8.0 | 2000 | 0.1941 | 0.935 | 0.9351 |
| 0.0275 | 9.0 | 2250 | 0.1977 | 0.9385 | 0.9385 |
| 0.0224 | 10.0 | 2500 | 0.2055 | 0.9355 | 0.9354 |
### Framework versions
- Transformers 4.13.0
- Pytorch 1.11.0
- Datasets 1.16.1
- Tokenizers 0.10.3
| bb9c88944c2fa2fec3e5ec2fe1e5c05e |
| Shobhank-iiitdwd/Distiled-bert-medium-squad2-QA | Shobhank-iiitdwd | bert | 9 | 11 | transformers | 0 | question-answering | true | false | false | mit | ['en'] | ['squad_v2'] | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['exbert'] | true | true | true | 1,757 | false |
## Overview
**Language model:** deepset/roberta-base-squad2-distilled
**Language:** English
**Training data:** SQuAD 2.0 training set
**Eval data:** SQuAD 2.0 dev set
**Infrastructure**: 1x V100 GPU
**Published**: Apr 21st, 2021
## Details
- Haystack's distillation feature was used for training. deepset/bert-large-uncased-whole-word-masking-squad2 was used as the teacher model.
## Hyperparameters
```
batch_size = 6
n_epochs = 2
max_seq_len = 384
learning_rate = 3e-5
lr_schedule = LinearWarmup
embeds_dropout_prob = 0.1
temperature = 5
distillation_loss_weight = 1
```
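For intuition, the distillation objective combines the usual task loss with a KL term between temperature-softened teacher and student logits; a schematic sketch (our illustration, not Haystack's exact implementation):
```python
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=5.0, weight=1.0):
    # KL divergence between temperature-softened distributions,
    # scaled by T^2 as in Hinton-style knowledge distillation
    s = F.log_softmax(student_logits / temperature, dim=-1)
    t = F.softmax(teacher_logits / temperature, dim=-1)
    return weight * F.kl_div(s, t, reduction="batchmean") * temperature ** 2
```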
## Performance
```
"exact": 68.6431398972458
"f1": 72.7637083790805
```
## Authors
- Timo Möller: `timo.moeller [at] deepset.ai`
- Julian Risch: `julian.risch [at] deepset.ai`
- Malte Pietsch: `malte.pietsch [at] deepset.ai`
- Michel Bartels: `michel.bartels [at] deepset.ai`
## About us

We bring NLP to the industry via open source!
Our focus: Industry specific language models & large scale QA systems.
Some of our work:
- [German BERT (aka "bert-base-german-cased")](https://deepset.ai/german-bert)
- [GermanQuAD and GermanDPR datasets and models (aka "gelectra-base-germanquad", "gbert-base-germandpr")](https://deepset.ai/germanquad)
- [FARM](https://github.com/deepset-ai/FARM)
- [Haystack](https://github.com/deepset-ai/haystack/)
Get in touch:
[Twitter](https://twitter.com/deepset_ai) | [LinkedIn](https://www.linkedin.com/company/deepset-ai/) | [Discord](https://haystack.deepset.ai/community/join) | [GitHub Discussions](https://github.com/deepset-ai/haystack/discussions) | [Website](https://deepset.ai)
By the way: [we're hiring!](http://www.deepset.ai/jobs)
| 14f5618bc875d3281bfbb1ce1bae4fd2 |
| bthomas/article2KW_test1.3_barthez-orangesum-title_finetuned_for_summerization | bthomas | mbart | 10 | 3 | transformers | 0 | summarization | true | false | false | apache-2.0 | null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['summarization', 'generated_from_trainer'] | true | true | true | 1,591 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# article2KW_test1.3_barthez-orangesum-title_finetuned_for_summerization
This model is a fine-tuned version of [moussaKam/barthez-orangesum-title](https://huggingface.co/moussaKam/barthez-orangesum-title) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1217
- Rouge1: 0.2933
- Rouge2: 0.0810
- Rougel: 0.2937
- Rougelsum: 0.2933
## Model description
More information needed
## Intended uses & limitations
More information needed
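Pending a fuller description, a minimal usage sketch (our illustration; the French input text is a placeholder):
```python
from transformers import pipeline

# hypothetical usage sketch; the checkpoint was fine-tuned for title/keyword-style summaries
summarizer = pipeline("summarization", model="bthomas/article2KW_test1.3_barthez-orangesum-title_finetuned_for_summerization")
print(summarizer("Le gouvernement a annoncé mardi de nouvelles mesures pour soutenir l'industrie.")[0]["summary_text"])
```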
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5.6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|
| 1.6383 | 1.0 | 1497 | 1.3027 | 0.2623 | 0.0709 | 0.2627 | 0.2625 |
| 1.201 | 2.0 | 2994 | 1.1714 | 0.2879 | 0.0748 | 0.2888 | 0.2884 |
| 1.036 | 3.0 | 4491 | 1.1217 | 0.2933 | 0.0810 | 0.2937 | 0.2933 |
### Framework versions
- Transformers 4.18.0
- Pytorch 1.11.0
- Datasets 2.3.2
- Tokenizers 0.11.0
| f55377e4ae9240de6505ea259f600b4a |
| dxiao/bert-finetuned-ner-60percent | dxiao | bert | 12 | 5 | transformers | 0 | token-classification | true | false | false | apache-2.0 | null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['generated_from_trainer'] | true | true | true | 1,525 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-ner-60percent
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5110
- Precision: 0.7917
- Recall: 0.8333
- F1: 0.8120
- Accuracy: 0.9154
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 2022
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 45 | 0.5094 | 0.7638 | 0.8303 | 0.7957 | 0.9078 |
| No log | 2.0 | 90 | 0.4844 | 0.8137 | 0.8393 | 0.8263 | 0.9207 |
| No log | 3.0 | 135 | 0.5110 | 0.7917 | 0.8333 | 0.8120 | 0.9154 |
### Framework versions
- Transformers 4.24.0
- Pytorch 1.12.1+cu113
- Datasets 2.6.1
- Tokenizers 0.13.2
| 3a00abbe557b3c50f0bf480aa5f88ecf |
| lewtun/ccorgi-dog | lewtun | null | 17 | 17 | diffusers | 2 | text-to-image | true | false | false | creativeml-openrail-m | null | null | null | 1 | 1 | 0 | 0 | 0 | 0 | 0 | ['pytorch', 'diffusers', 'stable-diffusion', 'text-to-image', 'diffusion-models-class', 'dreambooth-hackathon', 'animal'] | false | true | true | 719 | false |
# DreamBooth model for ccorgi trained by lewtun on the lewtun/corgi dataset.
This is a Stable Diffusion model fine-tuned on the ccorgi concept taught to Stable Diffusion with DreamBooth.
It can be used by modifying the `instance_prompt`: **a photo of ccorgi dog**
This model was created as part of the DreamBooth Hackathon 🔥. Visit the [organisation page](https://huggingface.co/dreambooth-hackathon) for instructions on how to take part!
## Description
This is a Stable Diffusion model fine-tuned on `dog` images for the animal theme.
## Usage
```python
from diffusers import StableDiffusionPipeline
pipeline = StableDiffusionPipeline.from_pretrained('lewtun/ccorgi-dog')
image = pipeline("a photo of ccorgi dog").images[0]  # the pipeline requires a prompt; this uses the instance_prompt
image
```
| 85053384314a2e5c433a30fd3000c74c |
| facebook/mask2former-swin-tiny-cityscapes-panoptic | facebook | mask2former | 5 | 10 | transformers | 0 | image-segmentation | true | false | false | other | null | ['coco'] | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 | ['vision', 'image-segmentation'] | false | true | true | 2,960 | false |
# Mask2Former
Mask2Former model trained on Cityscapes panoptic segmentation (tiny-sized version, Swin backbone). It was introduced in the paper [Masked-attention Mask Transformer for Universal Image Segmentation
](https://arxiv.org/abs/2112.01527) and first released in [this repository](https://github.com/facebookresearch/Mask2Former/).
Disclaimer: The team releasing Mask2Former did not write a model card for this model so this model card has been written by the Hugging Face team.
## Model description
Mask2Former addresses instance, semantic and panoptic segmentation with the same paradigm: by predicting a set of masks and corresponding labels. Hence, all 3 tasks are treated as if they were instance segmentation. Mask2Former outperforms the previous SOTA,
[MaskFormer](https://arxiv.org/abs/2107.06278), both in terms of performance and efficiency, by (i) replacing the pixel decoder with a more advanced multi-scale deformable attention Transformer, (ii) adopting a Transformer decoder with masked attention to boost performance without
introducing additional computation, and (iii) improving training efficiency by calculating the loss on subsampled points instead of whole masks.

## Intended uses & limitations
You can use this particular checkpoint for panoptic segmentation. See the [model hub](https://huggingface.co/models?search=mask2former) to look for other
fine-tuned versions on a task that interests you.
### How to use
Here is how to use this model:
```python
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, Mask2FormerForUniversalSegmentation
# load Mask2Former fine-tuned on Cityscapes panoptic segmentation
processor = AutoImageProcessor.from_pretrained("facebook/mask2former-swin-tiny-cityscapes-panoptic")
model = Mask2FormerForUniversalSegmentation.from_pretrained("facebook/mask2former-swin-tiny-cityscapes-panoptic")
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
outputs = model(**inputs)
# model predicts class_queries_logits of shape `(batch_size, num_queries)`
# and masks_queries_logits of shape `(batch_size, num_queries, height, width)`
class_queries_logits = outputs.class_queries_logits
masks_queries_logits = outputs.masks_queries_logits
# you can pass them to processor for postprocessing
result = processor.post_process_panoptic_segmentation(outputs, target_sizes=[image.size[::-1]])[0]
# we refer to the demo notebooks for visualization (see "Resources" section in the Mask2Former docs)
predicted_panoptic_map = result["segmentation"]
```
For more code examples, we refer to the [documentation](https://huggingface.co/docs/transformers/master/en/model_doc/mask2former).
| 09454ae0f3ff3ee79c094fa4a1782efe |
| climatebert/distilroberta-base-climate-f | climatebert | roberta | 10 | 697 | transformers | 9 | fill-mask | true | false | false | apache-2.0 | ['en'] | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | [] | false | true | true | 2,946 | false |
# Model Card for distilroberta-base-climate-f
## Model Description
This is the ClimateBERT language model based on the FULL-SELECT sample selection strategy.
*Note: We generally recommend choosing this language model over those based on the other sample selection strategies (unless you have good reasons not to). This is also the only language model we will update from time to time.*
Using the [DistilRoBERTa](https://huggingface.co/distilroberta-base) model as starting point, the ClimateBERT Language Model is additionally pre-trained on a text corpus comprising climate-related research paper abstracts, corporate and general news and reports from companies. The underlying methodology can be found in our [language model research paper](https://arxiv.org/abs/2110.12010).
*Update September 2, 2022: Now additionally pre-trained on an even larger text corpus, comprising >2M paragraphs. If you are looking for the language model before the update (i.e. for reproducibility), just use an older commit like [6be4fbd](https://huggingface.co/climatebert/distilroberta-base-climate-f/tree/6be4fbd3fedfd78ccb3c730c1f166947fbc940ba).*
## Climate performance model card
| distilroberta-base-climate-f | |
|--------------------------------------------------------------------------|----------------|
| 1. Is the resulting model publicly available? | Yes |
| 2. How much time does the training of the final model take? | 48 hours |
| 3. How much time did all experiments take (incl. hyperparameter search)? | 350 hours |
| 4. What was the power of GPU and CPU? | 0.7 kW |
| 5. At which geo location were the computations performed? | Germany |
| 6. What was the energy mix at the geo location? | 470 gCO2eq/kWh |
| 7. How much CO2eq was emitted to train the final model? | 15.79 kg |
| 8. How much CO2eq was emitted for all experiments? | 115.15 kg |
| 9. What is the average CO2eq emission for the inference of one sample? | 0.62 mg |
| 10. Which positive environmental impact can be expected from this work? | This work can be categorized as a building-block tool following Jin et al. (2021). It supports the training of NLP models in the field of climate change and can thereby have a positive environmental impact in the future. |
| 11. Comments | Block pruning could decrease CO2eq emissions |
## Citation Information
```bibtex
@article{wkbl2021,
title={ClimateBERT: A Pretrained Language Model for Climate-Related Text},
author={Webersinke, Nicolas and Kraus, Mathias and Bingler, Julia and Leippold, Markus},
journal={arXiv preprint arXiv:2110.12010},
year={2021}
}
```
| 847cf012ade8d026a1aba2b0d7fc4ec1 |
| Maltehb/aelaectra-danish-electra-small-uncased-ner-dane | Maltehb | electra | 8 | 12 | transformers | 0 | token-classification | true | true | false | mit | ['da'] | ['DAGW'] | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['ælæctra', 'pytorch', 'danish', 'ELECTRA-Small', 'replaced token detection'] | false | true | true | 6,448 | false |
# Ælæctra - Finetuned for Named Entity Recognition on the [DaNE dataset](https://danlp.alexandra.dk/304bd159d5de/datasets/ddt.zip) (Hvingelby et al., 2020) by Malte Højmark-Bertelsen.
**Ælæctra** is a Danish Transformer-based language model created to enhance the variety of Danish NLP resources with a more efficient model compared to previous state-of-the-art (SOTA) models.
Ælæctra was pretrained with the ELECTRA-Small (Clark et al., 2020) pretraining approach using the Danish Gigaword Corpus (Strømberg-Derczynski et al., 2020) and evaluated on Named Entity Recognition (NER) tasks. Since NER only presents a limited picture of Ælæctra's capabilities, I am very interested in further evaluations. Therefore, if you employ it for any task, feel free to hit me up with your findings!
Ælæctra was, as mentioned, created to enhance Danish NLP capabilities, and please do note how this GitHub still does not support the Danish characters "*Æ, Ø and Å*", as the title of this repository becomes "*-l-ctra*". How ironic. 🙂
Here is an example on how to load the finetuned Ælæctra-uncased model for Named Entity Recognition in [PyTorch](https://pytorch.org/) using the [🤗Transformers](https://github.com/huggingface/transformers) library:
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
tokenizer = AutoTokenizer.from_pretrained("Maltehb/-l-ctra-danish-electra-small-uncased-ner-dane")
model = AutoModelForTokenClassification.from_pretrained("Maltehb/-l-ctra-danish-electra-small-uncased-ner-dane")
```
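To actually run inference with the loaded model, one can wrap it in a pipeline (a sketch, assuming standard 🤗 Transformers usage; the Danish example sentence is ours):
```python
from transformers import pipeline

ner = pipeline("token-classification", model=model, tokenizer=tokenizer, aggregation_strategy="simple")
print(ner("Malte bor i Aarhus og arbejder hos KMD."))
```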
### Evaluation of current Danish Language Models
Ælæctra, Danish BERT (DaBERT) and multilingual BERT (mBERT) were evaluated:
| Model | Layers | Hidden Size | Params | AVG NER micro-f1 (DaNE-testset) | Average Inference Time (Sec/Epoch) | Download |
| --- | --- | --- | --- | --- | --- | --- |
| Ælæctra Uncased | 12 | 256 | 13.7M | 78.03 (SD = 1.28) | 10.91 | [Link for model](https://www.dropbox.com/s/cag7prs1nvdchqs/%C3%86l%C3%A6ctra.zip?dl=0) |
| Ælæctra Cased | 12 | 256 | 14.7M | 80.08 (SD = 0.26) | 10.92 | [Link for model](https://www.dropbox.com/s/cag7prs1nvdchqs/%C3%86l%C3%A6ctra.zip?dl=0) |
| DaBERT | 12 | 768 | 110M | 84.89 (SD = 0.64) | 43.03 | [Link for model](https://www.dropbox.com/s/19cjaoqvv2jicq9/danish_bert_uncased_v2.zip?dl=1) |
| mBERT Uncased | 12 | 768 | 167M | 80.44 (SD = 0.82) | 72.10 | [Link for model](https://storage.googleapis.com/bert_models/2018_11_03/multilingual_L-12_H-768_A-12.zip) |
| mBERT Cased | 12 | 768 | 177M | 83.79 (SD = 0.91) | 70.56 | [Link for model](https://storage.googleapis.com/bert_models/2018_11_23/multi_cased_L-12_H-768_A-12.zip) |
On [DaNE](https://danlp.alexandra.dk/304bd159d5de/datasets/ddt.zip) (Hvingelby et al., 2020) without the *MISC-tag*, Ælæctra scores slightly worse than both cased and uncased Multilingual BERT (Devlin et al., 2019) and Danish BERT (Danish BERT, 2019/2020), however, Ælæctra is less than one third the size, and uses significantly fewer computational resources to pretrain and instantiate.
### Pretraining
To pretrain Ælæctra it is recommended to build a Docker Container from the [Dockerfile](https://github.com/MalteHB/Ælæctra/tree/master/notebooks/fine-tuning/). Next, simply follow the [pretraining notebooks](https://github.com/MalteHB/Ælæctra/tree/master/infrastructure/Dockerfile/)
The pretraining was done by utilizing a single NVIDIA Tesla V100 GPU with 16 GiB of memory, provided by the Danish data company [KMD](https://www.kmd.dk/). The pretraining took approximately 4 days and 9.5 hours for both the cased and uncased models.
### Fine-tuning
To fine-tune any Ælæctra model follow the [fine-tuning notebooks](https://github.com/MalteHB/Ælæctra/tree/master/notebooks/fine-tuning/)
### References
Clark, K., Luong, M.-T., Le, Q. V., & Manning, C. D. (2020). ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators. ArXiv:2003.10555 [Cs]. http://arxiv.org/abs/2003.10555
Danish BERT. (2020). BotXO. https://github.com/botxo/nordic_bert (Original work published 2019)
Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2019). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. ArXiv:1810.04805 [Cs]. http://arxiv.org/abs/1810.04805
Hvingelby, R., Pauli, A. B., Barrett, M., Rosted, C., Lidegaard, L. M., & Søgaard, A. (2020). DaNE: A Named Entity Resource for Danish. Proceedings of the 12th Language Resources and Evaluation Conference, 4597–4604. https://www.aclweb.org/anthology/2020.lrec-1.565
Strømberg-Derczynski, L., Baglini, R., Christiansen, M. H., Ciosici, M. R., Dalsgaard, J. A., Fusaroli, R., Henrichsen, P. J., Hvingelby, R., Kirkedal, A., Kjeldsen, A. S., Ladefoged, C., Nielsen, F. Å., Petersen, M. L., Rystrøm, J. H., & Varab, D. (2020). The Danish Gigaword Project. ArXiv:2005.03521 [Cs]. http://arxiv.org/abs/2005.03521
#### Acknowledgements
As the majority of this repository is built upon [the works](https://github.com/google-research/electra) by the team at Google who created ELECTRA, a HUGE thanks to them is in order.
A Giga thanks also goes out to the incredible people who collected The Danish Gigaword Corpus (Strømberg-Derczynski et al., 2020).
Furthermore, I would like to thank my supervisor [Riccardo Fusaroli](https://github.com/fusaroli) for the support with the thesis, and a special thanks goes out to [Kenneth Enevoldsen](https://github.com/KennethEnevoldsen) for his continuous feedback.
Lastly, I would like to thank KMD, my colleagues from KMD, and my peers and co-students from Cognitive Science for encouraging me to keep on working hard and holding my head up high!
#### Contact
For help or further information feel free to connect with the author Malte Højmark-Bertelsen on [hjb@kmd.dk](mailto:hjb@kmd.dk?subject=[GitHub]%20ÆlæctraUncasedNER) or any of the following platforms:
[<img align="left" alt="MalteHB | Twitter" width="22px" src="https://cdn.jsdelivr.net/npm/simple-icons@v3/icons/twitter.svg" />][twitter]
[<img align="left" alt="MalteHB | LinkedIn" width="22px" src="https://cdn.jsdelivr.net/npm/simple-icons@v3/icons/linkedin.svg" />][linkedin]
[<img align="left" alt="MalteHB | Instagram" width="22px" src="https://cdn.jsdelivr.net/npm/simple-icons@v3/icons/instagram.svg" />][instagram]
<br />
[twitter]: https://twitter.com/malteH_B
[instagram]: https://www.instagram.com/maltemusen/
[linkedin]: https://www.linkedin.com/in/malte-h%C3%B8jmark-bertelsen-9a618017b/
| c153de9760238320183f4201dd441e2f |
| eduardopds/bert-finetuned-ner | eduardopds | bert | 8 | 9 | transformers | 0 | token-classification | false | true | false | apache-2.0 | null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['generated_from_keras_callback'] | true | true | true | 1,428 | false |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# eduardopds/bert-finetuned-ner
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0269
- Validation Loss: 0.0545
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 2631, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16
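For reference, a sketch of how such an optimizer is typically created (our illustration, using the 🤗 Transformers `create_optimizer` helper):
```python
from transformers import create_optimizer

# AdamWeightDecay with a linear (polynomial, power=1.0) learning-rate decay,
# matching the configuration above (sketch only)
optimizer, lr_schedule = create_optimizer(
    init_lr=2e-05,
    num_train_steps=2631,
    num_warmup_steps=0,
    weight_decay_rate=0.01,
)
```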
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 0.1719 | 0.0627 | 0 |
| 0.0457 | 0.0576 | 1 |
| 0.0269 | 0.0545 | 2 |
### Framework versions
- Transformers 4.18.0
- TensorFlow 2.8.0
- Datasets 2.2.1
- Tokenizers 0.12.1
| 814553436379cb7f246fa1ddd0b64e2f |
| Salvatore/bert-finetuned-ner | Salvatore | bert | 30 | 3 | transformers | 0 | token-classification | true | false | false | apache-2.0 | null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['generated_from_trainer'] | true | true | true | 2,789 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-ner
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0997
- Proteinmutation F1: 0.1309
- Snp F1: 0.1953
- Dnamutation F1: 0.3778
- Precision: 0.2380
- Recall: 0.2416
- F1: 0.2398
- Accuracy: 0.9703
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Proteinmutation F1 | Snp F1 | Dnamutation F1 | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------------------:|:------:|:--------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 324 | 0.0533 | 0.0396 | 0.2830 | 0.4667 | 0.2334 | 0.3221 | 0.2707 | 0.9788 |
| 0.1072 | 2.0 | 648 | 0.0437 | 0.6065 | 0.4906 | 0.5009 | 0.4802 | 0.6348 | 0.5468 | 0.9868 |
| 0.1072 | 3.0 | 972 | 0.0592 | 0.1379 | 0.2485 | 0.2005 | 0.1639 | 0.2228 | 0.1889 | 0.9731 |
| 0.0573 | 4.0 | 1296 | 0.0722 | 0.0749 | 0.2530 | 0.4692 | 0.2705 | 0.2959 | 0.2826 | 0.9749 |
| 0.0431 | 5.0 | 1620 | 0.0766 | 0.1574 | 0.1847 | 0.2540 | 0.1766 | 0.2285 | 0.1992 | 0.9723 |
| 0.0431 | 6.0 | 1944 | 0.0805 | 0.1099 | 0.2202 | 0.2383 | 0.1657 | 0.2097 | 0.1851 | 0.9715 |
| 0.0396 | 7.0 | 2268 | 0.0886 | 0.1337 | 0.2138 | 0.4318 | 0.2683 | 0.2678 | 0.2680 | 0.9724 |
| 0.0354 | 8.0 | 2592 | 0.0927 | 0.1535 | 0.2113 | 0.3769 | 0.2505 | 0.2528 | 0.2516 | 0.9714 |
| 0.0354 | 9.0 | 2916 | 0.0978 | 0.1011 | 0.2540 | 0.3812 | 0.2495 | 0.2528 | 0.2512 | 0.9705 |
| 0.0312 | 10.0 | 3240 | 0.0997 | 0.1309 | 0.1953 | 0.3778 | 0.2380 | 0.2416 | 0.2398 | 0.9703 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.2
- Datasets 2.0.0
- Tokenizers 0.12.1
| 0764f9d4d4b7c34a8dc4ac5b437869ec |
| espnet/jiyangtang_magicdata_asr_conformer_lm_transformer | espnet | null | 32 | 3 | espnet | 0 | automatic-speech-recognition | false | false | false | cc-by-4.0 | ['zh'] | ['magicdata'] | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ['espnet', 'audio', 'automatic-speech-recognition'] | false | true | true | 24,651 | false |
## ESPnet2 ASR model
### `espnet/jiyangtang_magicdata_asr_conformer_lm_transformer`
This model was trained by Jiyang Tang using the magicdata recipe in [espnet](https://github.com/espnet/espnet/).
### Demo: How to use in ESPnet2
Follow the [ESPnet installation instructions](https://espnet.github.io/espnet/installation.html)
if you haven't done that already.
```bash
cd espnet
git checkout 9d0f3b3e1be6650d38cc5008518f445308fe06d9
pip install -e .
cd egs2/magicdata/asr1
./run.sh --skip_data_prep false --skip_train true --download_model espnet/jiyangtang_magicdata_asr_conformer_lm_transformer
```
<!-- Generated by scripts/utils/show_asr_result.sh -->
# RESULTS
## Environments
- date: `Wed Sep 21 01:11:58 EDT 2022`
- python version: `3.9.12 (main, Apr 5 2022, 06:56:58) [GCC 7.5.0]`
- espnet version: `espnet 202207`
- pytorch version: `pytorch 1.8.1+cu102`
- Git hash: `9d0f3b3e1be6650d38cc5008518f445308fe06d9`
- Commit date: `Mon Sep 19 20:27:41 2022 -0400`
## asr_train_asr_raw_zh_char_sp
### WER
|dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err|
|---|---|---|---|---|---|---|---|---|
|decode_asr_rnn_lm_lm_train_lm_transformer_zh_char_valid.loss.ave_asr_model_valid.acc.ave/test|24279|24286|84.4|15.6|0.0|0.0|15.6|15.6|
### CER
|dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err|
|---|---|---|---|---|---|---|---|---|
|decode_asr_rnn_lm_lm_train_lm_transformer_zh_char_valid.loss.ave_asr_model_valid.acc.ave/test|24279|243325|96.4|1.7|2.0|0.1|3.7|15.6|
### TER
|dataset|Snt|Wrd|Corr|Sub|Del|Ins|Err|S.Err|
|---|---|---|---|---|---|---|---|---|
## ASR config
<details><summary>expand</summary>
```
config: conf/train_asr.yaml
print_config: false
log_level: INFO
dry_run: false
iterator_type: sequence
output_dir: exp/asr_train_asr_raw_zh_char_sp
ngpu: 0
seed: 0
num_workers: 4
num_att_plot: 3
dist_backend: nccl
dist_init_method: env://
dist_world_size: null
dist_rank: null
local_rank: null
dist_master_addr: null
dist_master_port: null
dist_launcher: null
multiprocessing_distributed: false
unused_parameters: false
sharded_ddp: false
cudnn_enabled: true
cudnn_benchmark: false
cudnn_deterministic: true
collect_stats: false
write_collected_feats: false
max_epoch: 20
patience: null
val_scheduler_criterion:
- valid
- acc
early_stopping_criterion:
- valid
- loss
- min
best_model_criterion:
- - valid
- acc
- max
keep_nbest_models: 10
nbest_averaging_interval: 0
grad_clip: 5
grad_clip_type: 2.0
grad_noise: false
accum_grad: 4
no_forward_run: false
resume: true
train_dtype: float32
use_amp: false
log_interval: null
use_matplotlib: true
use_tensorboard: true
create_graph_in_tensorboard: false
use_wandb: false
wandb_project: null
wandb_id: null
wandb_entity: null
wandb_name: null
wandb_model_log_interval: -1
detect_anomaly: false
pretrain_path: null
init_param: []
ignore_init_mismatch: false
freeze_param: []
num_iters_per_epoch: null
batch_size: 20
valid_batch_size: null
batch_bins: 20000000
valid_batch_bins: null
train_shape_file:
- exp/asr_stats_raw_zh_char_sp/train/speech_shape
- exp/asr_stats_raw_zh_char_sp/train/text_shape.char
valid_shape_file:
- exp/asr_stats_raw_zh_char_sp/valid/speech_shape
- exp/asr_stats_raw_zh_char_sp/valid/text_shape.char
batch_type: numel
valid_batch_type: null
fold_length:
- 80000
- 150
sort_in_batch: descending
sort_batch: descending
multiple_iterator: false
chunk_length: 500
chunk_shift_ratio: 0.5
num_cache_chunks: 1024
train_data_path_and_name_and_type:
- - dump/raw/train_noeng_sp/wav.scp
- speech
- sound
- - dump/raw/train_noeng_sp/text
- text
- text
valid_data_path_and_name_and_type:
- - dump/raw/dev/wav.scp
- speech
- sound
- - dump/raw/dev/text
- text
- text
allow_variable_data_keys: false
max_cache_size: 0.0
max_cache_fd: 32
valid_max_cache_size: null
optim: adam
optim_conf:
lr: 0.0005
scheduler: warmuplr
scheduler_conf:
warmup_steps: 30000
token_list:
- <blank>
- <unk>
- 的
- 我
- 一
- 歌
- 你
- 天
- 不
- 了
- 放
- 来
- 播
- 下
- 个
- 是
- 有
- 给
- 首
- 好
- 请
- 在
- 听
- 么
- 气
- 要
- 想
- 曲
- 上
- 吗
- 去
- 到
- 这
- 啊
- 点
- 那
- 没
- 就
- 说
- 大
- 唱
- 人
- 最
- 第
- 看
- 会
- 明
- 集
- 吧
- 音
- 还
- 乐
- 今
- 电
- 开
- 能
- 度
- 哪
- 里
- 多
- 打
- 十
- 可
- 怎
- 道
- 什
- 新
- 雨
- 以
- 家
- 回
- 话
- 儿
- 他
- 时
- 小
- 温
- 样
- 爱
- 都
- 吃
- 呢
- 知
- 谁
- 为
- 子
- 们
- 也
- 过
- 老
- 很
- 出
- 中
- 现
- 冷
- 和
- 情
- 行
- 心
- 发
- 专
- 几
- 视
- 张
- 事
- 二
- 辑
- 五
- 三
- 后
- 找
- 些
- 早
- 学
- 晚
- 车
- 别
- 演
- 手
- 呀
- 调
- 感
- 问
- 九
- 饭
- 快
- 风
- 得
- 如
- 自
- 生
- 少
- 地
- 用
- 叫
- 帮
- 机
- 台
- 班
- 欢
- 候
- 起
- 等
- 把
- 年
- 干
- 高
- 太
- 啦
- 方
- 提
- 面
- 八
- 四
- 信
- 意
- 王
- 真
- 求
- 热
- 喜
- 觉
- 周
- 近
- 名
- 做
- 公
- 告
- 关
- 六
- 字
- 安
- 再
- 变
- 间
- 国
- 分
- 着
- 哈
- 水
- 节
- 只
- 动
- 北
- 刚
- 空
- 月
- 玩
- 让
- 伤
- 东
- 谢
- 网
- 七
- 见
- 之
- 比
- 杰
- 又
- 买
- 对
- 始
- 无
- 查
- 声
- 文
- 经
- 醒
- 美
- 西
- 哦
- 走
- 两
- 海
- 妈
- 李
- 报
- 诉
- 接
- 定
- 午
- 外
- 才
- 流
- 长
- 宝
- 门
- 收
- 己
- 室
- 林
- 种
- 南
- 日
- 目
- 陈
- 许
- 词
- 服
- 设
- 记
- 频
- 琴
- 主
- 完
- 友
- 花
- 跟
- 钱
- 睡
- 像
- 嗯
- 何
- 京
- 所
- 预
- 边
- 带
- 作
- 零
- 头
- 号
- 果
- 嘛
- 路
- 办
- 吉
- 语
- 本
- 合
- 卫
- 影
- 市
- 摄
- 通
- 加
- 女
- 成
- 因
- 前
- 衣
- 然
- 档
- 位
- 聊
- 哥
- 载
- 原
- <space>
- 思
- 氏
- 同
- 题
- 但
- 红
- 火
- 她
- 亲
- 传
- 江
- 清
- 息
- 注
- 死
- 啥
- 州
- 片
- 朋
- 相
- 星
- 华
- 已
- 负
- 白
- 色
- 姐
- 春
- 转
- 半
- 换
- 黄
- 游
- 工
- 法
- 理
- 山
- 该
- 英
- 较
- 先
- 穿
- 推
- 直
- 力
- 当
- 冻
- 费
- 刘
- 男
- 写
- 场
- 呵
- 克
- 正
- 单
- 身
- 系
- 苏
- 婆
- 难
- 阳
- 光
- 重
- 荐
- 越
- 马
- 城
- 错
- 次
- 期
- 口
- 金
- 线
- 准
- 爸
- 忙
- 体
- 于
- 句
- 广
- 福
- 活
- 应
- 亮
- 黑
- 特
- 司
- 喝
- 式
- 飞
- 介
- 者
- 慢
- 静
- 百
- 平
- 绍
- 差
- 照
- 团
- 烦
- 便
- 师
- 站
- 德
- 短
- 远
- 需
- 谱
- 郑
- 化
- 或
- 器
- 急
- 钢
- 您
- 忘
- 店
- 妹
- 梦
- 青
- 适
- 总
- 每
- 业
- 夜
- 神
- 版
- 健
- 区
- 实
- 从
- 孩
- 奏
- 韩
- 伦
- 志
- 算
- 雪
- 世
- 认
- 眼
- 模
- 全
- 与
- 书
- 拿
- 送
- 结
- 其
- 解
- 格
- 洗
- 幸
- 舞
- 望
- 速
- 试
- 钟
- 内
- 联
- 停
- 丽
- 课
- 河
- 沙
- 笑
- 久
- 永
- 贝
- 民
- 址
- 超
- 教
- 代
- 件
- 降
- 脑
- 恋
- 常
- 交
- 低
- 伙
- 而
- 毛
- 阿
- 齐
- 习
- 量
- 段
- 选
- 欣
- 昨
- 进
- 闻
- 住
- 受
- 类
- 酒
- 背
- 藏
- 暴
- 摇
- 云
- 怕
- 考
- 咋
- 武
- 赶
- 孙
- 识
- 嵩
- 景
- 某
- 省
- 界
- 罗
- 任
- 坐
- 级
- 遇
- 麻
- 县
- 被
- 龙
- 品
- 蛋
- 湖
- 离
- 希
- 卖
- 轻
- 岁
- 香
- 赏
- 忆
- 答
- 滚
- 保
- 运
- 深
- 央
- 更
- 况
- 部
- ,
- 猪
- 休
- 校
- 留
- 嘿
- 弹
- 挺
- 院
- 泪
- 拉
- 懂
- 暖
- 讲
- 顺
- 底
- 卡
- 使
- 表
- 剧
- 包
- 故
- 导
- 凉
- 连
- 咱
- 制
- 蔡
- 容
- 向
- 物
- 微
- 步
- 切
- 搜
- 婚
- 童
- 约
- 芳
- 凯
- 复
- 未
- 陪
- 防
- 典
- 夏
- 万
- 备
- 指
- 冰
- 管
- 基
- 琪
- 宇
- 晓
- 房
- 良
- 戏
- 悲
- 牛
- 千
- 达
- 汉
- 拜
- 奇
- 梅
- 菜
- 满
- 徐
- 楼
- 询
- 图
- 改
- 练
- 敬
- 票
- 吴
- 络
- 码
- 整
- 简
- 队
- 购
- 普
- 附
- 响
- 胡
- 装
- 暑
- 非
- 喂
- 消
- 浪
- 凤
- 愿
- 累
- 球
- 聚
- 启
- 假
- 潮
- 弟
- 玉
- 绿
- 康
- 拍
- 失
- 哭
- 易
- 木
- 斯
- 跳
- 军
- 处
- 搞
- 升
- 除
- 傻
- 骗
- 证
- 杨
- 园
- 茹
- 赵
- 标
- 窗
- 庆
- 惠
- 够
- 烟
- 俊
- 掉
- 建
- 呗
- 插
- 座
- 害
- 智
- 贵
- 左
- 落
- 计
- 客
- 宁
- 梁
- 舒
- 取
- 往
- 漫
- 兰
- 战
- 随
- 晴
- 条
- 入
- 叶
- 强
- 伟
- 雅
- 尔
- 树
- 余
- 弄
- 季
- 排
- 伍
- 吹
- 宏
- 商
- 柔
- 郊
- 铁
- 遍
- 确
- 闭
- 雄
- 似
- 冒
- 待
- 尘
- 群
- 病
- 退
- 务
- 育
- 坏
- 娘
- 莫
- 资
- 楚
- 辛
- 索
- 利
- 数
- 秦
- 燕
- 且
- 录
- 姑
- 念
- 痛
- 冬
- 尾
- 共
- 初
- 粤
- 哎
- 印
- 示
- 抱
- 终
- 泉
- 货
- 肯
- 它
- 伞
- 性
- 古
- 跑
- 腾
- 鱼
- 曾
- 源
- 银
- 读
- 油
- 川
- 言
- 倩
- 峰
- 激
- 置
- 灯
- 独
- 命
- 谈
- 苦
- 限
- 乡
- 菲
- 伴
- 将
- 震
- 炎
- 散
- 依
- 米
- 及
- 贞
- 兴
- 湿
- 寒
- 敏
- 否
- 俩
- 祝
- 慧
- 精
- 律
- 功
- 托
- 洋
- 敢
- 街
- 铃
- 必
- 弦
- 寻
- 涵
- 突
- 皮
- 反
- 烧
- 秋
- 刮
- 末
- 双
- 细
- 范
- 由
- 君
- 款
- 邮
- 醉
- 紧
- 哲
- 缘
- 岛
- 疼
- 阴
- 旋
- 怪
- 草
- 持
- 狼
- 具
- 至
- 汪
- 鸡
- 医
- 邓
- 份
- 右
- 密
- 士
- 修
- 亚
- 画
- 灵
- 妇
- 甜
- 靠
- 荣
- 程
- 莲
- 魂
- 此
- 户
- 属
- 贤
- 充
- 萧
- 血
- 逼
- 闹
- 吸
- 娜
- 肉
- 抒
- 价
- 桥
- 剑
- 巴
- 暗
- 豆
- 迪
- 戴
- 迅
- 朝
- 艺
- 谭
- 治
- 祥
- 尽
- 闷
- 宫
- 艳
- 父
- 存
- 媳
- 跪
- 雾
- 杜
- 味
- 奕
- 兵
- 脸
- 炫
- 兄
- 妮
- 优
- 熊
- 床
- 般
- 净
- 航
- 帝
- 刻
- 孤
- 轩
- 村
- 支
- 玮
- 狗
- 纯
- 楠
- 呐
- 冠
- 元
- 盛
- 决
- 诗
- 爷
- 堵
- 陶
- 乖
- 迷
- 羽
- 忧
- 倒
- 蜜
- 晒
- 仔
- 却
- 姜
- 哟
- 餐
- 雷
- 鸟
- 馆
- 韶
- 箱
- 操
- 乌
- 借
- 恒
- 舍
- 药
- 块
- 澡
- 石
- 软
- 奶
- 笨
- 夫
- 朴
- 义
- 派
- 晨
- 佳
- 科
- 姿
- 显
- 咏
- 饿
- 付
- 宗
- 键
- 止
- 员
- 磊
- 勤
- 崔
- 偏
- 额
- 免
- 乱
- 怀
- 侠
- 岳
- 斌
- 助
- 征
- 概
- 吕
- 彩
- 板
- 松
- 各
- 组
- 历
- 济
- 象
- 茶
- 领
- 按
- 创
- 镇
- 翻
- 配
- 宿
- 咯
- 帅
- 型
- 估
- 佩
- 惜
- 详
- 续
- 蓝
- 麟
- 珠
- 颜
- 彦
- 农
- 盘
- 母
- 鞋
- 账
- 博
- 礼
- 环
- 套
- 效
- 郭
- 居
- 佑
- 根
- 惊
- 圳
- 叔
- 若
- 逆
- 鸿
- 锁
- 食
- 芸
- 裤
- 娱
- 漂
- 野
- 麦
- 豫
- 顾
- 爽
- 族
- 仙
- 围
- 观
- 链
- 嗨
- 厅
- 巍
- 劲
- 极
- 呼
- 咖
- 淑
- 丝
- 昌
- 嘉
- 绝
- 史
- 击
- 承
- 蔚
- 堂
- 沉
- 笔
- 朵
- 凰
- 琥
- 匆
- 炜
- 输
- 须
- 娴
- 嘻
- 牌
- 田
- 杀
- 滴
- 鬼
- 桦
- 赛
- 玟
- 抽
- 案
- 轮
- 立
- 摆
- 屋
- 诺
- 丁
- 佰
- 蒙
- 澄
- 羊
- 添
- 质
- 波
- 萨
- 狂
- 丹
- 屁
- 角
- 章
- 产
- 宜
- 笛
- 严
- 维
- 测
- 娃
- 料
- 宋
- 洲
- 卦
- 猜
- 港
- 挂
- 淘
- 郁
- 统
- 断
- 锅
- 稍
- 绮
- 汗
- 辉
- 乎
- 破
- 钧
- 芹
- 择
- 胖
- 即
- 呜
- 旅
- 拨
- 紫
- 哇
- 默
- 论
- 朱
- 登
- 脚
- 订
- 秀
- ?
- 社
- 飘
- 尚
- 另
- 骂
- 并
- 恶
- 扫
- 裸
- 姨
- 苹
- 压
- 厌
- 汇
- 爆
- 局
- 睛
- 庄
- 唐
- 嘞
- 偶
- 乔
- 染
- 熟
- 喆
- 愉
- 虎
- 技
- 威
- 布
- 嘴
- 湾
- 术
- 讨
- 尼
- 诶
- 坊
- 删
- 桑
- 庾
- 斗
- 呃
- 仁
- 训
- 汤
- 脱
- 凡
- 例
- 唉
- 畅
- 参
- 晕
- 肥
- 营
- 鲁
- 减
- 琳
- 瑞
- 透
- 素
- 厉
- 追
- 扰
- 控
- 谣
- 足
- 检
- 扬
- 娇
- 耳
- 津
- 倾
- 淡
- 露
- 妞
- 熙
- 值
- 罪
- 浩
- 探
- 盐
- 列
- 券
- 潘
- 官
- 篇
- 纪
- 签
- 棒
- 丑
- 陆
- 养
- 佛
- 唯
- 芮
- 哒
- 榜
- 培
- 疯
- 财
- 卷
- 痴
- 凌
- 瓜
- 猫
- 泡
- 据
- 厦
- 辣
- 恩
- 土
- 补
- 递
- 伏
- 灰
- 糖
- 玛
- 黎
- 湘
- 遥
- 谅
- 桃
- 曼
- 招
- 勇
- 泰
- 杭
- 缓
- 朗
- 替
- 刷
- 封
- 骨
- 盖
- 眠
- 担
- 忽
- 蛮
- 蜗
- 肚
- 喽
- 懒
- 继
- 辈
- 魔
- 哼
- 顶
- 冲
- 番
- 释
- 形
- 页
- 渡
- 触
- 裂
- 逛
- 圆
- 迎
- 态
- 弃
- 洛
- 丰
- 困
- 展
- 束
- 巧
- 临
- 际
- 涛
- 酷
- 洁
- 毕
- 呆
- 励
- 臭
- 暂
- 评
- 沧
- 磨
- 洞
- 厂
- 吵
- 煮
- 旧
- 幽
- 寄
- 政
- 丫
- 闯
- 举
- 误
- 护
- 状
- 寂
- 牙
- 杯
- 议
- 眉
- 享
- 剩
- 秘
- 噢
- 耿
- 致
- 偷
- 丢
- 刀
- 销
- 盒
- 编
- 珍
- 葛
- 译
- 颗
- 括
- 奥
- 鲜
- 沈
- 婷
- 摩
- 炒
- 惯
- 啡
- 混
- 燥
- 扣
- 晶
- 柏
- 拥
- 旭
- 拾
- 验
- 嫁
- 铺
- 棉
- 划
- 虾
- 浙
- 寓
- 剪
- 贴
- 圣
- 颖
- 申
- 枝
- 艾
- 旁
- 溪
- '?'
- 厚
- 驶
- 燃
- 虽
- 途
- 祖
- 职
- 泽
- 腿
- 薇
- 阵
- 移
- 淋
- 灭
- 寞
- 森
- 延
- 孝
- 沥
- 迟
- 伪
- 催
- 投
- 伯
- 谓
- 诚
- 架
- 耶
- 项
- 撒
- 邦
- 善
- 鼻
- 芬
- 闲
- 增
- 卓
- 层
- 鹏
- 敲
- 镖
- 粉
- 欧
- 纸
- 甘
- 昆
- 哩
- 坚
- 苍
- 积
- 筝
- 擦
- 董
- 吻
- 折
- 欺
- 疆
- 勒
- 售
- 船
- 胜
- 甄
- 杂
- 骑
- 贱
- 饼
- 称
- 隆
- 竟
- 逃
- 啷
- 引
- 宾
- 莉
- 境
- 奖
- 救
- 讯
- 恰
- 垃
- 圾
- 宅
- 潜
- 皇
- 符
- 徽
- 造
- 翔
- 粥
- 桌
- 租
- 险
- 驾
- 祭
- 昂
- 牧
- 宣
- 综
- 谷
- 私
- 瓷
- 避
- 肖
- 闪
- 圈
- 喱
- 耀
- 悟
- 秒
- 篮
- 逗
- 蝶
- 趣
- 恨
- 恐
- 饺
- 碎
- 奔
- 幼
- 股
- 锦
- 锡
- 椅
- 玲
- 刑
- 嗓
- 喊
- 虑
- 俺
- 镜
- 耐
- 鹿
- 狄
- 兮
- 返
- 恭
- 含
- 傅
- 沟
- 莹
- 妃
- 忠
- 赤
- 喔
- 抓
- 迈
- 众
- 豪
- 祈
- 馨
- 嬛
- 庭
- 异
- 辰
- 琅
- 荷
- 匪
- 吐
- 警
- 虹
- 吓
- 聪
- 悔
- 归
- 富
- 陕
- 魏
- 欲
- 菊
- 雹
- 隐
- 涯
- 忍
- 芦
- 琊
- 酸
- 逊
- 亦
- 咪
- 瞎
- 滨
- 胸
- 采
- 穹
- 究
- 炊
- 痒
- 莎
- 柳
- 井
- 洪
- 胎
- 鼓
- 润
- 迁
- 玫
- 滩
- 傲
- 袁
- 赚
- 研
- 躺
- 烤
- 莱
- 搬
- 蒋
- 曹
- 孟
- 嫂
- 甲
- 瑰
- 窝
- 令
- 堆
- 废
- 掌
- 巡
- 妙
- 袋
- 争
- 萌
- 挑
- 册
- 饮
- 勋
- 珊
- 戒
- 绵
- 亡
- 劳
- 搭
- 甩
- 匙
- 彭
- 锋
- 钥
- 率
- 吟
- 鼠
- 纱
- 坡
- 潇
- 挣
- 逝
- 针
- 弱
- 妍
- 稳
- 怒
- 塘
- 卢
- 宵
- 悠
- 饱
- 披
- 瘦
- 浮
- 烂
- 壶
- 截
- 勿
- 序
- 委
- 兔
- 塔
- 执
- 墨
- 府
- 宙
- 欠
- 巨
- 帽
- 占
- 顿
- 权
- 坠
- 碰
- 著
- 硬
- 炮
- 骚
- 肃
- 规
- 厕
- 贾
- 葫
- 徒
- 瓶
- 辽
- 耍
- 赢
- 桂
- 浦
- 趟
- 柯
- 悉
- 恼
- 禁
- 殊
- 卧
- 赞
- 益
- 责
- 虚
- 姓
- 愁
- 舅
- 残
- 既
- 拖
- 棍
- 幻
- 库
- 骄
- 烈
- 尊
- 伊
- 缺
- 迹
- 疑
- 汽
- 郎
- 鸭
- 仪
- 盗
- 幺
- 萱
- 胃
- 脏
- 努
- 勉
- 池
- 咳
- 奋
- 批
- 蝴
- 监
- 犯
- 滑
- 牵
- 冯
- 败
- 毒
- 怖
- 绪
- 帐
- 协
- 韵
- 怜
- 薛
- 姚
- 副
- 塞
- 蕉
- 夹
- 萝
- 爹
- 貌
- 奈
- 乞
- 隔
- 澳
- 姥
- 妖
- 腰
- 纳
- 龄
- 材
- 旗
- 萤
- 俗
- 昼
- 坛
- 霍
- 怡
- 丐
- 咒
- 础
- 嘎
- 虫
- 枪
- 遗
- 献
- 陌
- 侣
- 。
- 昧
- 筒
- 袭
- 厨
- 爬
- 茂
- 媛
- 慰
- 填
- 霞
- 娟
- 摸
- 逍
- 赫
- 霾
- 泥
- 暧
- 翅
- 谦
- 夕
- 瑶
- 鑫
- 刺
- 袖
- 拒
- 玄
- 涂
- 溜
- 旬
- 鸣
- 泷
- 距
- 阻
- 绩
- 狠
- 宽
- 狐
- 赖
- 握
- 循
- 靓
- 述
- 糕
- 踏
- 侯
- 劵
- 壮
- 抄
- 苟
- 岗
- 供
- 湛
- 炼
- 烫
- 棋
- 糊
- 饶
- 悄
- 霸
- 竹
- 哀
- 拔
- 蓉
- 旦
- 晰
- 振
- 漠
- 苗
- 帘
- 糟
- 崇
- 踩
- 汕
- 寝
- 刹
- 蔬
- 旺
- 躁
- 守
- 液
- 疗
- 晋
- 坤
- 洒
- 串
- 屏
- 翠
- 鹅
- 腻
- 毅
- 蹈
- 党
- 咩
- 灿
- 哄
- 核
- 横
- 谎
- 忏
- 映
- 倔
- 则
- 肤
- 贺
- 潍
- 焦
- 渐
- 坑
- 瞄
- 融
- 琼
- 尤
- 逸
- 碧
- 葡
- 卜
- 察
- 邢
- 薄
- 亏
- 绒
- 萄
- 婉
- 闺
- 势
- 描
- 均
- 梨
- 椒
- 慕
- 污
- 弯
- 繁
- 炸
- 肿
- 阅
- 肺
- 席
- 呦
- 碟
- 耻
- 端
- 叹
- 庸
- 危
- 痘
- 峡
- 腐
- 霜
- 拳
- 昴
- 荡
- 屎
- 纠
- 夸
- 尿
- 钰
- 撼
- 嗽
- 雯
- 症
- 衡
- 互
- 孔
- 钻
- 萍
- 娄
- 斤
- 悦
- 谊
- 扯
- 驴
- 歉
- 扎
- 庐
- 蒲
- 吼
- 熬
- 鸳
- 蒸
- 驹
- 允
- 射
- 酱
- 鸯
- 企
- 馒
- 乘
- 葱
- 泳
- 莞
- 脆
- 寨
- 损
- 陀
- 膀
- 淮
- 侃
- 霉
- 施
- 橙
- 煲
- 妆
- 审
- 宠
- 穷
- 敌
- 堡
- 樱
- 诞
- 胆
- 彤
- 祷
- 渭
- 霆
- 亭
- 璐
- 邵
- 壁
- 禺
- 墙
- 葬
- 垫
- 吾
- 粒
- 爵
- 弘
- 妻
- 蕾
- 咨
- 固
- 幕
- 粗
- 抢
- 访
- 贸
- 挥
- 饰
- 硕
- 域
- 岸
- 咬
- 晗
- 姆
- 骤
- 抖
- 判
- 鄂
- 获
- 锻
- 郝
- 柜
- 醋
- 桐
- 泣
- 粘
- 革
- 脾
- 尸
- 侧
- 辆
- 埋
- 稻
- 肠
- 嫌
- 彬
- 庚
- 彼
- 龟
- 弥
- 籍
- 纽
- 喷
- 氛
- 币
- 蠢
- 磁
- 袜
- 柴
- 寸
- 韦
- 忐
- 忑
- 恢
- 缩
- 捷
- 绕
- 翼
- 琦
- 玻
- 驻
- 屈
- 岩
- 颂
- 仓
- 茜
- 璃
- 裙
- 僵
- 柿
- 稿
- 巾
- 撑
- 尹
- 嘟
- 牡
- 昏
- 歇
- 诵
- 丸
- 梯
- 挡
- 袄
- 逢
- 徙
- 渴
- 仰
- 跨
- 碗
- 阔
- 税
- 拼
- 宥
- 丞
- 凶
- 析
- 炖
- 舌
- 抗
- 脖
- 甚
- 豚
- 敷
- 瓦
- 织
- 邀
- 浏
- 猛
- 歪
- 阶
- 兽
- 俄
- 鹤
- 禹
- 纹
- 闽
- 惹
- 煤
- 患
- 岭
- 瑜
- 稀
- 拆
- 凄
- 崎
- 芝
- 摊
- 尺
- 彻
- 览
- 贷
- 珂
- 憋
- 径
- 抚
- 魅
- 悬
- 胶
- 倍
- 贯
- 籁
- 乃
- 哑
- 惑
- 撞
- 箫
- 绣
- 扁
- 苑
- 靖
- 漏
- 挤
- 轶
- 叮
- 烨
- 菇
- 砸
- 趁
- 媚
- 仅
- 藤
- 邱
- 陵
- 躲
- 滋
- 叛
- 捉
- 孕
- 铜
- 衫
- 寿
- 寺
- 枫
- 豹
- 伽
- 翡
- 蜂
- 丙
- 姗
- 羡
- 凑
- 鄙
- 庙
- 铭
- 宰
- 廖
- 肩
- 臣
- 抑
- 辅
- 誓
- 扇
- 啪
- 羞
- 诊
- 敦
- 跃
- 俞
- 肝
- 坦
- 贡
- 踢
- 齿
- 尧
- 淀
- 叉
- 浴
- 狮
- 昊
- 蟹
- 捏
- 略
- 禾
- 纲
- 赔
- 憾
- 赋
- 丘
- 尝
- 钓
- 涕
- 猴
- 鸽
- 纵
- 奉
- 涨
- 揍
- 怨
- 挨
- 兜
- 冈
- 凭
- 策
- 裴
- 摔
- 喵
- 佐
- 喉
- 膏
- 瑟
- 抬
- 纷
- 廊
- 贼
- 煎
- 熄
- 渝
- 缠
- 纶
- 岚
- 衬
- 遮
- 翰
- 誉
- 摘
- 勾
- 赣
- 姬
- 娅
- 撤
- 霖
- 泊
- 膝
- 耽
- 犹
- 仍
- 辞
- 溃
- 骏
- 弓
- 膜
- 诱
- 慌
- 惨
- 噪
- 涩
- 潭
- 幂
- 梓
- 植
- 罚
- 扮
- 涮
- 雁
- 兆
- 舟
- 咸
- 犀
- 炉
- 筋
- 陇
- 狸
- 帕
- 噶
- 茄
- 嗒
- 纬
- 障
- 聘
- 盼
- 盟
- 咧
- 灏
- 菠
- 巷
- 帖
- 慈
- 枕
- 唤
- 慨
- 呛
- 叽
- 砖
- 窍
- 瞒
- 龚
- 促
- 尖
- 螺
- 捞
- 盆
- 茫
- 屌
- 械
- 乳
- 啤
- 玺
- 廷
- 谐
- 吖
- 帆
- 蛇
- 琵
- 琶
- 扑
- 跌
- 崩
- 扭
- 扔
- 咿
- 菩
- 茉
- 攻
- 虐
- 甸
- 璇
- 驰
- 瞬
- 鸦
- 厢
- 囊
- 闫
- 届
- 墓
- 芒
- 栗
- 沫
- 违
- 缝
- 棵
- 杏
- 赌
- 灾
- 颤
- 沂
- 肇
- 桶
- 霄
- !
- 咙
- 绥
- 仲
- 愈
- 竖
- 菌
- 捕
- 烘
- 阮
- 皆
- 咚
- 劫
- 揭
- 郸
- 庞
- 喇
- 拐
- 奴
- 咔
- 幅
- 偿
- 咦
- 召
- 薪
- 盯
- 黛
- 杉
- 辨
- 邯
- 枯
- 沃
- 吊
- 筷
- 陷
- 鹰
- 嗦
- 噻
- 屯
- 殇
- 抵
- 雕
- 辩
- 枣
- 捂
- 瘾
- 粮
- 巢
- 耗
- 储
- 殷
- 糯
- 轨
- 沾
- 淇
- 毁
- 沐
- 蚊
- 鉴
- 灌
- 玖
- 唔
- 芙
- 淳
- 昕
- 裹
- 茧
- 浑
- 睿
- 踪
- 邪
- 瘩
- 恺
- 斜
- 汰
- 逐
- 铮
- 毫
- 胞
- 昭
- 妥
- 筑
- 贪
- 蘑
- 皓
- 颐
- 疙
- 捡
- 泛
- 债
- 栎
- 棚
- 腹
- 构
- 蓬
- 宪
- 叭
- 愚
- 押
- 蜀
- 夷
- 娶
- 盾
- 倪
- 牟
- 抛
- 壳
- 衍
- 杆
- 撕
- 亿
- 纤
- 淹
- 翘
- 蔷
- 芊
- 罩
- 拯
- 嗷
- 浇
- 宴
- 遵
- 冥
- 祸
- 塑
- 沛
- 猎
- 携
- 噜
- 喘
- 缴
- 砍
- 唢
- 曦
- 遛
- 罢
- 峨
- 戚
- 稚
- 揉
- 堰
- 螃
- 薯
- 乙
- 矿
- 挽
- 弛
- 埃
- 淅
- 疲
- 窦
- 烛
- 媒
- 尬
- 汀
- 谨
- 罐
- 劣
- 伶
- 煜
- 栏
- 榆
- 矛
- 琐
- 槽
- 驼
- 渤
- 沒
- 泄
- 粑
- 匀
- 囧
- 茵
- 霹
- 澈
- 岑
- 乏
- 栋
- 拌
- 框
- 祁
- 叨
- 斋
- 玥
- 僧
- 疏
- 绳
- 晃
- 抹
- 授
- 蓄
- 檬
- 仇
- 毯
- 啵
- 泼
- 阁
- ','
- 邹
- 阎
- 渠
- 函
- 腊
- 割
- 绑
- 扶
- 肌
- 卑
- 匠
- 雳
- 绯
- 婧
- 煌
- 蒂
- 腔
- 仿
- 遭
- 阜
- 峻
- 劝
- 绎
- 黔
- 贫
- 剁
- 荆
- 樊
- 卸
- 锄
- 阕
- 狱
- 冉
- 鲍
- 荒
- 侄
- 唇
- 忌
- 掖
- 竞
- 匹
- 仗
- 锤
- 穆
- 践
- 冶
- 柱
- 聂
- 捧
- 唠
- 翁
- 掏
- 塌
- 沁
- 巩
- 沸
- 蜡
- 痕
- 削
- 晟
- 眯
- 灶
- 婴
- 啸
- 釜
- 兼
- 剂
- 氧
- 赐
- 铠
- 攀
- 扩
- 朦
- 胧
- 孽
- 挖
- 钞
- 碍
- 凝
- 鼎
- 屉
- 斑
- 抠
- 哗
- 哨
- 婶
- 劈
- 冕
- 霏
- 汾
- 雀
- 浚
- 屠
- 唰
- 疚
- 芽
- 惦
- 裕
- 仑
- 厘
- 烁
- 瞧
- 蚂
- 涿
- 尴
- 埔
- 橘
- 磕
- 苇
- 脂
- 臂
- 蛙
- 镁
- 绽
- 卿
- 荃
- 莺
- 迫
- 敖
- 呈
- 勃
- 碌
- 讶
- 赠
- 巫
- 篱
- 浓
- 攒
- 裁
- 嫣
- 彪
- 娣
- 坟
- 廉
- 聆
- 铉
- 瞌
- 葵
- 鞍
- 坎
- 畜
- 爪
- 锯
- 潼
- 矣
- 闸
- 俱
- 蹭
- 戈
- 扒
- 滤
- 撇
- 浅
- 唧
- 觅
- 婕
- 牢
- 堕
- 丈
- 滕
- 御
- 溢
- 阑
- 楞
- 伺
- 馋
- 禄
- 胳
- 措
- 伐
- 滔
- 沦
- 澎
- 谙
- 桢
- 肾
- 熏
- 炅
- 邻
- 吞
- 噔
- 哔
- 沿
- 竺
- 闵
- 妨
- 啰
- 儒
- 锈
- 虞
- 颠
- 脊
- 膊
- 搓
- 岐
- 浸
- 兹
- 吨
- 垂
- 晏
- 痹
- 哆
- 漆
- 叠
- 莓
- 嘀
- 挫
- 馈
- 愧
- 佟
- 疾
- 蒜
- 盈
- 侬
- 烊
- 炙
- 蜢
- 诡
- 莆
- 蛾
- 轴
- 妒
- 洱
- 擎
- 脉
- 飓
- 泫
- 浆
- 岔
- 蹦
- 愤
- 琛
- 趴
- 绘
- 忻
- 拽
- 牲
- 馅
- 鲨
- 靴
- 鳅
- 俐
- 罕
- 呕
- 凋
- 绫
- 蕊
- 圃
- 猥
- 氓
- 歧
- 秧
- 栈
- 梧
- 衷
- 巅
- 彝
- 嚎
- 菁
- 渔
- 茬
- 汐
- 拓
- 昔
- 囚
- 舜
- 搁
- 泸
- 涟
- 蚁
- 裳
- 鞭
- 辟
- 蝎
- 簧
- 予
- 倦
- 傍
- 荔
- 瞳
- 碑
- 桨
- 疫
- 骁
- 驿
- 柠
- 妾
- 隶
- 菏
- 煽
- 麒
- 奎
- 驯
- 飙
- 姻
- 沅
- 扉
- 斩
- 奢
- 蚌
- 掩
- 蹲
- 丧
- 辱
- 焉
- 佘
- 襄
- 芯
- 枉
- 谋
- 渊
- 哮
- 喀
- 朔
- 侏
- 姝
- 戎
- 磅
- 督
- 诛
- 奸
- 苞
- 庵
- 馄
- 聋
- 滁
- 垚
- 柬
- 猩
- 夺
- 啼
- 坝
- 竭
- 黏
- 衰
- 遂
- 潞
- 谜
- 蜻
- 蜓
- 瓣
- 秉
- 檐
- 楂
- 嗑
- 搅
- 嘚
- 倚
- 乒
- 宛
- 崽
- 恕
- 轰
- 淄
- 晞
- 酬
- 砂
- 筠
- 薰
- 蒿
- 瞅
- 勺
- 阙
- 伸
- 嚏
- 湄
- 咆
- 坂
- 役
- 掰
- 渣
- 魁
- 诅
- 浒
- 妓
- 珑
- 捎
- 焊
- 饲
- 脍
- 荫
- 堤
- 轿
- 乓
- 筹
- 撸
- 饨
- 渺
- 桓
- 旷
- 笙
- 晖
- 慎
- 埠
- 挪
- 汝
- 浊
- 仨
- 鳄
- 濮
- 汶
- 邰
- 钉
- 蔽
- 亨
- 屑
- 铅
- 喃
- 葩
- 哉
- 睁
- 骆
- 涉
- 汁
- 拦
- 痞
- 芜
- 俪
- 兑
- 梵
- 刊
- 缅
- 彰
- 俑
- 桔
- 堪
- 鸥
- 契
- 覆
- 拷
- 珞
- 诸
- 棱
- 忒
- 嫩
- 梶
- 贻
- 藕
- 愣
- 湃
- 趋
- 甭
- 嗖
- 怯
- 憧
- 珀
- 缸
- 蔓
- 稣
- 筱
- 杠
- 崖
- 凳
- 裆
- 隧
- 锣
- 嘣
- 瀑
- 漪
- 柄
- 凸
- 颁
- 迦
- 烙
- 岱
- 瑄
- 吭
- 肆
- 鳞
- 晾
- 憬
- 邑
- 甥
- 掀
- 褂
- 淫
- 瓢
- 暮
- 喧
- 祛
- 恙
- 禅
- 柚
- 樟
- 疮
- 嗡
- 懈
- 茨
- 矮
- 诠
- 侮
- 眨
- 羲
- 掐
- 琉
- 雍
- 晔
- 凹
- 怂
- 禧
- 蹬
- 绅
- 榄
- 箍
- 詹
- 溶
- 黯
- 啃
- 驸
- 朕
- 婺
- 援
- 铲
- 呻
- 犬
- 捣
- 眷
- 剃
- 惧
- 芷
- 叱
- 娥
- 钦
- 矫
- 憨
- 骊
- 坪
- 俏
- 炳
- 妲
- 冀
- 刁
- 馍
- 琢
- 扛
- 瞿
- 辙
- 茅
- 寡
- 絮
- 呷
- 哺
- 咕
- 驱
- 搂
- 圭
- 嫉
- 涓
- 茱
- '"'
- 笼
- 讽
- 涡
- 泓
- 弊
- 诀
- 璧
- 舔
- 嬅
- 亢
- 沪
- 绢
- 钙
- 喏
- 馥
- 怅
- 簿
- 薜
- 捶
- 冤
- 脐
- 岂
- 溺
- 蕙
- 铿
- 锵
- 锐
- 呸
- 砰
- 亩
- 漳
- 阪
- 栀
- 坞
- 跤
- 蓓
- 舰
- 缕
- 羁
- 芋
- 畔
- 衔
- 铝
- 盲
- 株
- 搏
- 曙
- 惩
- 逻
- 蹄
- 涤
- 宕
- 咤
- 尉
- 嘘
- 瀚
- 仃
- 稽
- 霑
- 飕
- 垮
- 酿
- 畏
- 鲸
- 梗
- 署
- 砒
- 雏
- 茗
- 恬
- 螂
- 拂
- 憔
- 悴
- 钗
- 棕
- 劭
- 歹
- 笠
- 厄
- 焖
- 拣
- 逮
- 蕴
- 淌
- 枸
- 杞
- 雇
- 漯
- 邂
- 逅
- ·
- 荟
- 塾
- 涌
- 挚
- 舱
- 惬
- 剖
- 榴
- 侦
- 摁
- 烹
- 烽
- 俘
- 麓
- 犊
- 酌
- 匿
- 梭
- 覃
- 隽
- 惆
- 掠
- 舵
- 艰
- 蟑
- 瘤
- 仆
- 穴
- 涅
- 衿
- 嚷
- 峪
- 榕
- 吒
- 酪
- 曝
- 帧
- 靶
- 嚣
- 踝
- 翊
- 陂
- 髓
- 瑚
- 裘
- 芍
- 炬
- 鲅
- 蚕
- 肢
- 颊
- 陛
- 籽
- 粟
- 滞
- 煞
- 乾
- 媞
- 刨
- 碾
- 瘫
- 盔
- 侈
- 徘
- 徊
- 熔
- 吆
- 褪
- 拟
- 廓
- 翟
- 俾
- 沽
- 垒
- 萎
- 僻
- 豌
- 卵
- 狡
- 篓
- 栽
- 崴
- 拧
- 颈
- 咐
- 胭
- 阱
- 鄱
- 漓
- 厥
- 烬
- 糙
- 褥
- 炕
- 恍
- 襟
- 韧
- 眸
- 毙
- 垢
- 叙
- 辜
- 酝
- 璋
- 荧
- 魇
- 皈
- 觞
- 喻
- 孺
- 匈
- 铛
- 诈
- 盏
- 淼
- 佣
- 苓
- 缚
- 洼
- 疡
- 猬
- 腑
- 阡
- 鲫
- 鹭
- 鹂
- 笆
- 埙
- 癌
- 璀
- 璨
- 疹
- 蓑
- 芭
- 嘶
- 桀
- 吩
- 泾
- 铂
- 倘
- 囗
- 璜
- 窃
- 癫
- 璞
- 墟
- 钩
- 粹
- 镐
- 韬
- 牺
- 寮
- 喳
- 鄞
- 笋
- 臧
- 疤
- 捐
- 腥
- 嬷
- 燮
- 濠
- 棠
- 夙
- 弑
- 乍
- 剔
- 嘈
- 钇
- 衅
- 挝
- 橡
- 矜
- 圩
- 恳
- 瑛
- 蔺
- 兖
- 焕
- 懿
- 钏
- 栾
- 筐
- 苒
- 碳
- 韭
- 箭
- 婵
- 迭
- 枷
- 孜
- 咽
- 悯
- 漉
- 噬
- 侍
- 蝉
- 涧
- 鹦
- 鹉
- 冼
- 竿
- …
- 袈
- 诏
- 锢
- 泠
- 匡
- 枚
- 坷
- 邝
- 癖
- 绷
- 皖
- 滦
- 滥
- 荨
- 虏
- 拈
- 浜
- 颓
- “
- ”
- 戳
- 钮
- 梳
- 溅
- 徨
- 旨
- 罂
- 蹉
- 腌
- 隙
- 侨
- 槟
- 泌
- 珈
- 芵
- 腮
- 晤
- 墩
- 鲤
- 扳
- 栓
- 窑
- 荏
- 饪
- 泵
- 猿
- 眀
- 嗝
- 禽
- 朽
- 偕
- 胀
- 谍
- 捅
- 蜉
- 蝣
- 蹋
- 拱
- 氯
- 噼
- 蚩
- 芥
- 蛟
- 貂
- 荚
- 痰
- 殿
- 遣
- 丛
- 碱
- 殖
- 炽
- 嚓
- 彗
- 窟
- 鳌
- 矶
- 镯
- 乜
- 髙
- 蛤
- 荤
- 坨
- 漱
- 惰
- 跎
- 萸
- 曰
- 亘
- 窘
- 厮
- 绐
- 黝
- 鞠
- 漩
- 蚱
- 垣
- 翩
- 嬴
- 彷
- 椰
- 砚
- 褐
- 黍
- 噗
- 耕
- 挠
- 妩
- 掂
- 峯
- 灸
- 晌
- 溧
- 鹃
- 屿
- 昙
- 廾
- 冢
- 龌
- 龊
- 瞪
- 刽
- 脓
- 壹
- 羱
- 奠
- 贰
- 佬
- 拙
- 颢
- 嘱
- 糗
- 昀
- 巳
- 辕
- 惫
- 黒
- 辐
- 窈
- 窕
- 拢
- 缪
- 逞
- 吝
- 裟
- 钝
- 寇
- 耙
- 隋
- 蝇
- 仟
- 铨
- 赊
- 皑
- 衢
- 胚
- 腺
- 啧
- 淤
- 妄
- 氢
- 寅
- 叻
- 嘲
- 叼
- 沮
- 磐
- 芈
- 饥
- 槿
- 卤
- 懵
- 惴
- 毋
- 箩
- 苔
- 峥
- 斥
- 矬
- 佚
- 肮
- 皎
- 憎
- 樨
- 讴
- 鳖
- 煦
- 焚
- 泗
- 皂
- 礁
- 睬
- 梢
- 妤
- 佗
- 蝌
- 蚪
- 渗
- 暇
- 卟
- 悼
- 瑨
- 伎
- 纺
- 耆
- 舶
- 礴
- 豺
- 涪
- 谬
- 赴
- 婪
- 吱
- 麽
- 犁
- 潸
- 鸪
- 鸢
- 鄯
- 讷
- 弶
- 橄
- 撬
- 赦
- 岷
- 垓
- 绞
- 虔
- 剥
- 澜
- 酗
- 谛
- 骥
- 撅
- 鱿
- 犷
- 讪
- 秃
- 卞
- 缆
- 蓦
- 庶
- 勐
- 笫
- 敛
- 弗
- 痱
- 啬
- 硚
- 昱
- 忿
- 撩
- 椿
- 侵
- 窄
- 邛
- 崃
- 涸
- 赈
- 狭
- 嵌
- 淖
- 瑙
- 踹
- 傈
- 僳
- 缭
- 睦
- 窜
- 嘅
- 樵
- 爰
- 侗
- 逑
- 弧
- 侑
- :
- 娉
- 蝙
- 蝠
- 骅
- 饴
- 揣
- /
- 鲈
- 綦
- 拴
- 硝
- 梆
- 馗
- 夭
- 扼
- 鳃
- 惚
- 扈
- 矢
- 藁
- 飚
- 妊
- 踮
- 惟
- 痊
- 艇
- 偎
- 魄
- 篝
- 簸
- 擞
- 粽
- 缥
- 缈
- 跷
- 咁
- 悍
- 菀
- 陡
- 橱
- 遐
- 榨
- 渎
- 蹂
- 躏
- 舂
- 轼
- 枰
- 焰
- 幌
- 邸
- 捜
- 灼
- 茯
- 芎
- 穗
- 棘
- 碜
- 颉
- 鹧
- 啄
- 趾
- 茎
- 揽
- 靳
- 黜
- 惋
- 亥
- 铡
- 栅
- 挞
- 眈
- 膘
- 犍
- 珉
- 镪
- 昵
- 霓
- 圪
- 汲
- 惺
- 瑕
- 桩
- 洽
- 唏
- 耒
- 唻
- 豁
- 郓
- 纣
- 亊
- 鳝
- 蟆
- 癣
- 碚
- 踌
- 殁
- 缉
- 痔
- 頔
- 蔫
- ;
- 掺
- 愫
- 祟
- 拘
- 蜘
- 蛛
- 涎
- 耸
- 揪
- 芪
- 腕
- 袍
- 慵
- 绻
- 绛
- 螨
- 捌
- 墅
- 篷
- 啾
- 孪
- 唬
- 褛
- 跶
- 壤
- 慷
- 痧
- 懦
- 郯
- 莴
- 茴
- 嘬
- 铎
- 辫
- 绚
- 簇
- 墘
- 婿
- 咻
- 斡
- 沱
- 譬
- 羔
- 藓
- 肋
- 棂
- 赎
- 炭
- 徵
- 簌
- 艘
- 苪
- 眶
- 嘭
- 霎
- 馊
- 秽
- 仕
- 镶
- 纨
- 摧
- 蒨
- 闰
- 迩
- 篙
- 嚯
- 郫
- 陋
- 殒
- 邃
- 浔
- 瑾
- 鳟
- 祯
- 泻
- 氟
- 猾
- 酥
- 萦
- 郴
- 祀
- 涼
- 屡
- 摹
- 毡
- 妪
- 郡
- 柘
- 裱
- 囔
- 楷
- 鄄
- 蕲
- 偲
- 菘
- 姣
- 瞥
- 肪
- 饽
- 惭
- 胁
- 垄
- 榻
- 讼
- 旱
- 鬓
- 凇
- 钊
- 掣
- 浣
- 凃
- 蓥
- 臊
- 夔
- 脯
- 苛
- 阀
- 睫
- 腋
- 姊
- 躬
- 瘁
- 奄
- 靡
- 盂
- 柑
- 渑
- 恻
- 缱
- 拎
- 恤
- 缶
- 嵬
- 簋
- 囤
- 褴
- 蔼
- 沌
- 薏
- 鸵
- 跋
- 篪
- 罡
- 颇
- 嗄
- 胺
- 烯
- 酚
- 祠
- 迢
- 硖
- 眺
- 珏
- 怆
- 斧
- 痪
- 祺
- 嘤
- 谑
- 婊
- 滂
- 骇
- 帔
- 荼
- 硅
- 猖
- 皱
- 顽
- 榔
- 锌
- 蔻
- 滢
- 茸
- 捋
- 壥
- 孰
- 娩
- 锥
- 逾
- 诬
- 娠
- 厝
- 噎
- 秤
- 祢
- 嗳
- 嗜
- 滘
- 尅
- 悚
- 履
- 馕
- 簪
- 俭
- 摞
- 妗
- 蛎
- 暹
- 钾
- 膨
- 孚
- 驷
- 卯
- 猇
- 褚
- 町
- 骞
- -
- 芩
- 赁
- 粱
- 隼
- 掘
- 莽
- 郾
- 擒
- 叁
- 敕
- 镊
- 惘
- 蚤
- 邳
- 嗫
- 扪
- 瀛
- 凿
- 雎
- 啲
- 鲲
- 帼
- 枭
- 羹
- 驳
- 铆
- 肴
- 嫦
- 媲
- 鹳
- 秩
- 銮
- 饯
- 毽
- 珩
- 眩
- 仄
- 葳
- 撮
- 睇
- 塄
- 肘
- 钠
- 诓
- 呱
- 垅
- 菱
- 亍
- 戍
- 酯
- 袱
- 隘
- 蓟
- 暨
- 痣
- 辗
- 埵
- 殉
- 郏
- 孢
- 悳
- 讫
- 诲
- 髋
- 孑
- 睹
- 擅
- 嗮
- 慒
- 琰
- 濛
- 雌
- 恁
- 擀
- 娼
- 谕
- 撵
- 苯
- 聴
- 唛
- 撂
- 栖
- 拗
- 孬
- 怏
- 掇
- 肽
- 胰
- 沣
- 卅
- 箅
- 氨
- 浠
- 蠡
- 募
- 肛
- 岀
- 瞑
- 蛆
- 舀
- 蚝
- 歙
- 涔
- 诘
- 、
- 垡
- 涠
- 嘢
- 糸
- 胤
- 绊
- 柒
- 沓
- 粼
- 菖
- 犒
- 呒
- 唑
- 莘
- 莪
- 宸
- 睨
- \
- 鲶
- 蛐
- 溏
- 菈
- 蹩
- 焙
- 釆
- 瑗
- 睾
- 槐
- 榉
- 杷
- 鄢
- 僕
- 诽
- 嗲
- 蜃
- 戆
- 蘼
- 糜
- 霁
- 坻
- 硼
- 槛
- 枞
- 麸
- 谒
- 荀
- 邋
- 遢
- 锴
- 啶
- 粪
- 驭
- 筵
- 砌
- 莩
- 蹼
- 吔
- 缳
- 埭
- 隗
- 厶
- 丶
- "\x14"
- "\x17"
- 稼
- 铖
- 涣
- 亳
- 幢
- 沭
- 驮
- 奚
- 藐
- 颅
- 埤
- 愘
- 镲
- 窒
- 暄
- 诃
- 噘
- 歼
- 隅
- 爻
- 蘅
- 锹
- 锇
- 椎
- 琨
- 烩
- 枢
- 觧
- 萁
- 镂
- 龈
- 怠
- 阐
- 藉
- 凛
- 冽
- 珣
- 泘
- 抉
- 锭
- 蕃
- 蠃
- 毓
- 啐
- 栩
- 骷
- 髅
- 耷
- 寥
- 杵
- 蚬
- 窖
- 孛
- 舆
- 皿
- 柸
- 粳
- 钣
- 趸
- 叄
- 腚
- 杖
- 鸸
- 犲
- 浗
- 缮
- 哓
- 箧
- 攘
- 冇
- 钛
- 郗
- 囡
- 酆
- 姌
- 雉
- 胯
- 椭
- 埏
- 钵
- 绌
- 蝾
- 坼
- 濂
- w
- o
- r
- d
- 袒
- 峦
- 鹫
- 炯
- 悱
- 漕
- 莦
- 蔑
- 樽
- 牒
- 濡
- 嫯
- 陖
- 疸
- 桅
- 辖
- 僢
- 《
- 》
- 酣
- 遨
- 邬
- ':'
- 嫲
- 哌
- 锚
- 淙
- Q
- 濑
- 熨
- 谴
- 筛
- 薹
- 磬
- 熠
- 腓
- 阉
- 钴
- 恂
- 溉
- 陨
- 螳
- 孵
- 瘠
- 嫡
- 哝
- 狙
- 怼
- 斟
- 甫
- 渌
- 卒
- 翕
- 沏
- 旮
- 旯
- 菡
- 變
- 狈
- 鳜
- 嵋
- 仞
- 鳕
- 噩
- 踟
- 躇
- 蛀
- 瘸
- 篡
- 锊
- 団
- 斐
- 蹍
- 冗
- "\uFEFF"
- 歆
- 圴
- 泯
- 伥
- 愎
- 坌
- 碘
- 赉
- 骧
- 矩
- 綽
- 秭
- 怵
- 麝
- 贩
- 溥
- 捆
- 腩
- 溴
- 卉
- 痦
- 荻
- 缇
- 秸
- 秆
- 捍
- 炀
- 阆
- 泞
- 懊
- 啕
- 蚶
- 衩
- 桜
- 旖
- 贬
- 酵
- 滟
- 纥
- 倭
- 赝
- 呶
- 哧
- 煸
- 劢
- 炝
- 僚
- 豇
- 阂
- 涝
- 骡
- 霭
- 窨
- 殴
- 竣
- 醇
- 擂
- 怦
- 怩
- 臾
- 搔
- 伱
- 啉
- 嫖
- 囝
- 糠
- 胥
- 酰
- 镫
- 蟒
- 荞
- 醪
- 颦
- 吏
- 颛
- 赳
- 贿
- 赂
- 痩
- 仂
- 颍
- 罔
- 猕
- 嚒
- 蘸
- 熹
- 捺
- 坜
- 郜
- 鉄
- 蒌
- 荑
- 藻
- 谌
- 钳
- 屮
- 疵
- 哞
- 琮
- 潴
- 讹
- 镭
- '3'
- 尕
- 倬
- 庇
- 侩
- 瘆
- 傀
- 儡
- 诧
- 葆
- 唾
- 皋
- 逄
- 诌
- 氦
- 彳
- 盅
- 曳
- 槲
- 挟
- 怿
- 顷
- 臃
- 衙
- 踵
- 霈
- 嗪
- 闩
- 锟
- 恿
- 抻
- 茁
- 惢
- 菅
- 迂
- 瞟
- 痉
- 挛
- 绦
- 晁
- 挢
- 蠕
- 洙
- <sos/eos>
init: null
input_size: null
ctc_conf:
dropout_rate: 0.0
ctc_type: builtin
reduce: true
ignore_nan_grad: null
zero_infinity: true
joint_net_conf: null
use_preprocessor: true
token_type: char
bpemodel: null
non_linguistic_symbols: null
cleaner: null
g2p: null
speech_volume_normalize: null
rir_scp: null
rir_apply_prob: 1.0
noise_scp: null
noise_apply_prob: 1.0
noise_db_range: '13_15'
short_noise_thres: 0.5
frontend: default
frontend_conf:
fs: 16k
specaug: specaug
specaug_conf:
apply_time_warp: true
time_warp_window: 5
time_warp_mode: bicubic
apply_freq_mask: true
freq_mask_width_range:
- 0
- 30
num_freq_mask: 2
apply_time_mask: true
time_mask_width_range:
- 0
- 40
num_time_mask: 2
normalize: global_mvn
normalize_conf:
stats_file: exp/asr_stats_raw_zh_char_sp/train/feats_stats.npz
model: espnet
model_conf:
ctc_weight: 0.3
lsm_weight: 0.1
length_normalized_loss: false
preencoder: null
preencoder_conf: {}
encoder: conformer
encoder_conf:
output_size: 512
attention_heads: 8
linear_units: 2048
num_blocks: 12
dropout_rate: 0.1
positional_dropout_rate: 0.1
attention_dropout_rate: 0.0
input_layer: conv2d
normalize_before: true
pos_enc_layer_type: rel_pos
selfattention_layer_type: rel_selfattn
activation_type: swish
macaron_style: true
use_cnn_module: true
cnn_module_kernel: 31
postencoder: null
postencoder_conf: {}
decoder: transformer
decoder_conf:
attention_heads: 4
linear_units: 2048
num_blocks: 6
dropout_rate: 0.1
positional_dropout_rate: 0.1
self_attention_dropout_rate: 0.0
src_attention_dropout_rate: 0.0
required:
- output_dir
- token_list
version: '202207'
distributed: false
```
</details>
### Citing ESPnet
```BibTex
@inproceedings{watanabe2018espnet,
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
title={{ESPnet}: End-to-End Speech Processing Toolkit},
year={2018},
booktitle={Proceedings of Interspeech},
pages={2207--2211},
doi={10.21437/Interspeech.2018-1456},
url={http://dx.doi.org/10.21437/Interspeech.2018-1456}
}
```
or arXiv:
```bibtex
@misc{watanabe2018espnet,
title={ESPnet: End-to-End Speech Processing Toolkit},
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
year={2018},
eprint={1804.00015},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
9e7fb4dae0150348d8bc64f2c66ba138
|
Gxg/OpNum_Bert_Merge
|
Gxg
|
bert
| 22 | 2 |
transformers
| 0 |
feature-extraction
| true | false | true |
apache-2.0
|
['en']
|
['bookcorpus', 'wikipedia']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['exbert']
| false | true | true | 10,425 | false |
# BERT base model (uncased)
Pretrained model on English language using a masked language modeling (MLM) objective. It was introduced in
[this paper](https://arxiv.org/abs/1810.04805) and first released in
[this repository](https://github.com/google-research/bert). This model is uncased: it does not make a difference
between english and English.
Disclaimer: The team releasing BERT did not write a model card for this model so this model card has been written by
the Hugging Face team.
## Model description
BERT is a transformers model pretrained on a large corpus of English data in a self-supervised fashion. This means it
was pretrained on the raw texts only, with no humans labelling them in any way (which is why it can use lots of
publicly available data) with an automatic process to generate inputs and labels from those texts. More precisely, it
was pretrained with two objectives:
- Masked language modeling (MLM): taking a sentence, the model randomly masks 15% of the words in the input, then runs
the entire masked sentence through the model and has to predict the masked words. This is different from traditional
recurrent neural networks (RNNs) that usually see the words one after the other, or from autoregressive models like
GPT which internally mask the future tokens. It allows the model to learn a bidirectional representation of the
sentence.
- Next sentence prediction (NSP): the model concatenates two masked sentences as inputs during pretraining. Sometimes
they correspond to sentences that were next to each other in the original text, sometimes not. The model then has to
predict if the two sentences were following each other or not.
This way, the model learns an inner representation of the English language that can then be used to extract features
useful for downstream tasks: if you have a dataset of labeled sentences for instance, you can train a standard
classifier using the features produced by the BERT model as inputs.
## Model variations
BERT was originally released in base and large variations, for cased and uncased input text. The uncased models also strip out accent markers.
Chinese and multilingual uncased and cased versions followed shortly after.
Modified preprocessing with whole word masking has replaced subpiece masking in a following work, with the release of two models.
Twenty-four other smaller models were released afterwards.
The detailed release history can be found on the [google-research/bert readme](https://github.com/google-research/bert/blob/master/README.md) on github.
| Model | #params | Language |
|------------------------|--------------------------------|-------|
| [`bert-base-uncased`](https://huggingface.co/bert-base-uncased) | 110M | English |
| [`bert-large-uncased`](https://huggingface.co/bert-large-uncased) | 340M | English |
| [`bert-base-cased`](https://huggingface.co/bert-base-cased) | 110M | English |
| [`bert-large-cased`](https://huggingface.co/bert-large-cased) | 340M | English |
| [`bert-base-chinese`](https://huggingface.co/bert-base-chinese) | 110M | Chinese |
| [`bert-base-multilingual-cased`](https://huggingface.co/bert-base-multilingual-cased) | 110M | Multiple |
| [`bert-large-uncased-whole-word-masking`](https://huggingface.co/bert-large-uncased-whole-word-masking) | 340M | English |
| [`bert-large-cased-whole-word-masking`](https://huggingface.co/bert-large-cased-whole-word-masking) | 340M | English |
## Intended uses & limitations
You can use the raw model for either masked language modeling or next sentence prediction, but it's mostly intended to
be fine-tuned on a downstream task. See the [model hub](https://huggingface.co/models?filter=bert) to look for
fine-tuned versions on a task that interests you.
Note that this model is primarily aimed at being fine-tuned on tasks that use the whole sentence (potentially masked)
to make decisions, such as sequence classification, token classification or question answering. For tasks such as text
generation you should look at models like GPT2.
### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-base-uncased')
>>> unmasker("Hello I'm a [MASK] model.")
[{'sequence': "[CLS] hello i'm a fashion model. [SEP]",
'score': 0.1073106899857521,
'token': 4827,
'token_str': 'fashion'},
{'sequence': "[CLS] hello i'm a role model. [SEP]",
'score': 0.08774490654468536,
'token': 2535,
'token_str': 'role'},
{'sequence': "[CLS] hello i'm a new model. [SEP]",
'score': 0.05338378623127937,
'token': 2047,
'token_str': 'new'},
{'sequence': "[CLS] hello i'm a super model. [SEP]",
'score': 0.04667217284440994,
'token': 3565,
'token_str': 'super'},
{'sequence': "[CLS] hello i'm a fine model. [SEP]",
'score': 0.027095865458250046,
'token': 2986,
'token_str': 'fine'}]
```
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained("bert-base-uncased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import BertTokenizer, TFBertModel
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = TFBertModel.from_pretrained("bert-base-uncased")
text = "Replace me by any text you'd like."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
### Limitations and bias
Even if the training data used for this model could be characterized as fairly neutral, this model can have biased
predictions:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='bert-base-uncased')
>>> unmasker("The man worked as a [MASK].")
[{'sequence': '[CLS] the man worked as a carpenter. [SEP]',
'score': 0.09747550636529922,
'token': 10533,
'token_str': 'carpenter'},
{'sequence': '[CLS] the man worked as a waiter. [SEP]',
'score': 0.0523831807076931,
'token': 15610,
'token_str': 'waiter'},
{'sequence': '[CLS] the man worked as a barber. [SEP]',
'score': 0.04962705448269844,
'token': 13362,
'token_str': 'barber'},
{'sequence': '[CLS] the man worked as a mechanic. [SEP]',
'score': 0.03788609802722931,
'token': 15893,
'token_str': 'mechanic'},
{'sequence': '[CLS] the man worked as a salesman. [SEP]',
'score': 0.037680890411138535,
'token': 18968,
'token_str': 'salesman'}]
>>> unmasker("The woman worked as a [MASK].")
[{'sequence': '[CLS] the woman worked as a nurse. [SEP]',
'score': 0.21981462836265564,
'token': 6821,
'token_str': 'nurse'},
{'sequence': '[CLS] the woman worked as a waitress. [SEP]',
'score': 0.1597415804862976,
'token': 13877,
'token_str': 'waitress'},
{'sequence': '[CLS] the woman worked as a maid. [SEP]',
'score': 0.1154729500412941,
'token': 10850,
'token_str': 'maid'},
{'sequence': '[CLS] the woman worked as a prostitute. [SEP]',
'score': 0.037968918681144714,
'token': 19215,
'token_str': 'prostitute'},
{'sequence': '[CLS] the woman worked as a cook. [SEP]',
'score': 0.03042375110089779,
'token': 5660,
'token_str': 'cook'}]
```
This bias will also affect all fine-tuned versions of this model.
## Training data
The BERT model was pretrained on [BookCorpus](https://yknzhu.wixsite.com/mbweb), a dataset consisting of 11,038
unpublished books and [English Wikipedia](https://en.wikipedia.org/wiki/English_Wikipedia) (excluding lists, tables and
headers).
## Training procedure
### Preprocessing
The texts are lowercased and tokenized using WordPiece and a vocabulary size of 30,000. The inputs of the model are
then of the form:
```
[CLS] Sentence A [SEP] Sentence B [SEP]
```
With probability 0.5, sentence A and sentence B correspond to two consecutive sentences in the original corpus and in
the other cases, it's another random sentence in the corpus. Note that what is considered a sentence here is a
consecutive span of text usually longer than a single sentence. The only constraint is that the result with the two
"sentences" has a combined length of less than 512 tokens.
The details of the masking procedure for each sentence are the following (a toy code sketch follows the list):
- 15% of the tokens are masked.
- In 80% of the cases, the masked tokens are replaced by `[MASK]`.
- In 10% of the cases, the masked tokens are replaced by a random token (different from the one they replace).
- In the 10% remaining cases, the masked tokens are left as is.
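As a toy PyTorch sketch of this 80/10/10 rule (illustrative only, not the original preprocessing code; the `[MASK]` id 103 matches the `bert-base-uncased` tokenizer, the 30,000 vocabulary size follows the description above, and real preprocessing also protects special tokens such as `[CLS]` and `[SEP]`):
```python
import torch

def mask_tokens(input_ids, mask_token_id=103, vocab_size=30000, mlm_prob=0.15):
    """Toy version of the BERT masking rule described above."""
    labels = input_ids.clone()
    # choose 15% of the tokens as prediction targets
    targets = torch.bernoulli(torch.full(labels.shape, mlm_prob)).bool()
    labels[~targets] = -100  # non-target positions are ignored by the MLM loss
    # 80% of the targets are replaced by [MASK]
    masked = torch.bernoulli(torch.full(labels.shape, 0.8)).bool() & targets
    input_ids[masked] = mask_token_id
    # 10% are replaced by a random token (half of the remaining 20%)
    rand = torch.bernoulli(torch.full(labels.shape, 0.5)).bool() & targets & ~masked
    input_ids[rand] = torch.randint(vocab_size, labels.shape)[rand]
    # the last 10% of the targets are left unchanged
    return input_ids, labels
```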
### Pretraining
The model was trained on 4 cloud TPUs in Pod configuration (16 TPU chips total) for one million steps with a batch size
of 256. The sequence length was limited to 128 tokens for 90% of the steps and 512 for the remaining 10%. The optimizer
used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01,
learning rate warmup for 10,000 steps and linear decay of the learning rate after.
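In PyTorch/`transformers` terms, that optimization setup corresponds roughly to the sketch below (the original pretraining ran on TPUs with TensorFlow, so this is only an approximation):
```python
import torch
from transformers import BertForPreTraining, get_linear_schedule_with_warmup

model = BertForPreTraining.from_pretrained("bert-base-uncased")
optimizer = torch.optim.AdamW(
    model.parameters(), lr=1e-4, betas=(0.9, 0.999), weight_decay=0.01
)
# 10,000 warmup steps, then linear decay over the one-million-step schedule
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=10_000, num_training_steps=1_000_000
)
```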
## Evaluation results
When fine-tuned on downstream tasks, this model achieves the following results:
Glue test results:
| Task | MNLI-(m/mm) | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE | Average |
|:----:|:-----------:|:----:|:----:|:-----:|:----:|:-----:|:----:|:----:|:-------:|
| | 84.6/83.4 | 71.2 | 90.5 | 93.5 | 52.1 | 85.8 | 88.9 | 66.4 | 79.6 |
### BibTeX entry and citation info
```bibtex
@article{DBLP:journals/corr/abs-1810-04805,
author = {Jacob Devlin and
Ming{-}Wei Chang and
Kenton Lee and
Kristina Toutanova},
title = {{BERT:} Pre-training of Deep Bidirectional Transformers for Language
Understanding},
journal = {CoRR},
volume = {abs/1810.04805},
year = {2018},
url = {http://arxiv.org/abs/1810.04805},
archivePrefix = {arXiv},
eprint = {1810.04805},
timestamp = {Tue, 30 Oct 2018 20:39:56 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-1810-04805.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
```
<a href="https://huggingface.co/exbert/?model=bert-base-uncased">
<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>
|
80dc2e7436919367c55300f942c4c32f
|
AshishBalhara/distilbert-base-uncased-finetuned-clinc
|
AshishBalhara
|
distilbert
| 12 | 2 |
transformers
| 0 |
text-classification
| true | false | false |
apache-2.0
| null |
['clinc_oos']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,482 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-clinc
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the clinc_oos dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7721
- Accuracy: 0.9184
## Model description
More information needed
## Intended uses & limitations
More information needed
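A minimal inference sketch (assuming the checkpoint ships its tokenizer and intent `id2label` mapping; the utterance is a made-up example):
```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="AshishBalhara/distilbert-base-uncased-finetuned-clinc",
)
print(classifier("Please book me a flight from Dallas to Boston."))
```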
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (mapped onto `TrainingArguments` in the sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
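These settings map roughly onto `transformers.TrainingArguments` as sketched below (an approximation, not the original training script; the Adam betas and epsilon above are the `TrainingArguments` defaults):
```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-clinc",
    learning_rate=2e-05,
    per_device_train_batch_size=48,
    per_device_eval_batch_size=48,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```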
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 4.2896 | 1.0 | 318 | 3.2890 | 0.7432 |
| 2.6284 | 2.0 | 636 | 1.8756 | 0.8377 |
| 1.5483 | 3.0 | 954 | 1.1572 | 0.8961 |
| 1.015 | 4.0 | 1272 | 0.8573 | 0.9132 |
| 0.7953 | 5.0 | 1590 | 0.7721 | 0.9184 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.13.0+cu116
- Datasets 1.16.1
- Tokenizers 0.10.3
|
5d16edcba87f749b6ca34b3ce7d86bbe
|
trojblue/yys
|
trojblue
| null | 7 | 0 | null | 3 | null | false | false | false |
openrail
| null | null | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 3,990 | false |
## yy model
768ARB, fp32, EMA, finetuned on animefull
**epoch48**: probably overfitted, good results overall
**epoch28**: not overfitted: less stylized but mostly good anatomy
tested on:
- 768x1024 DPM++ SDE Karras, no hires fix
- 512x768 Latent hires → 1024x1536 DPM++ SDE Karras
token word:
- `yingyi, best quality, explicit` → nsfw
- `yingyi, concept art` → concepts
- `yingyi, best quality` → less nsfw
tagging:
- wd1.4, booru, blip
lr & augments:
- 2e-6 sqrt bs2, weight decay 5e-2
- rotate, brightness, contrast, flip
nsfw sample 1 (e48):
```
yingyi, best quality, explicit, multiple girls, 2girls, breasts, long hair, twintails, rabbit ears, pink hair, sex toy, thighhighs, animal ears, vibrator, very long hair, blue eyes, holding, high heels, full body, blush, outdoors, vaginal object insertion, pink eyes, blue hair, standing, gloves, looking at viewer, bow, medium breasts, bangs, fake animal ears, heart, bodysuit, vibrator cord, object insertion, small breasts, public indecency, navel, elbow gloves, heart-shaped pupils, pussy juice, revealing clothes, hair ornament, piercing, remote control vibrator, body writing, red footwear, flower, dildo, red eyes, vaginal, symbol-shaped pupils, black footwear, aqua hair
Negative prompt: error, signature, watermark, username, multiple people, animals, lowres, cropped, worth quality ,low quality, normal quality, jpeg artifacts, blurry, bad anatomy, bad hands, bad arms, bad feet, bad anatomy, missing fingers, extra digits, fewer digits, long neck, missing legs, huge person, optical_illusion
Steps: 25, Sampler: DPM++ SDE Karras, CFG scale: 6, Seed: 1438393047, Size: 512x768, Model hash: f32626be, Model: yye48, Denoising strength: 0.7, Clip skip: 2, ENSD: 31338, Hires upscale: 2, Hires upscaler: Latent (bicubic)
```
nsfw sample 2 (e48):
```
yingyi, best quality, explicit, restrained, 1girl, thighhighs, sex toy, bound, bdsm, blindfold, object insertion, gloves, dildo, gag, solo, breasts, anal, body writing, gagged, vaginal, bondage, white gloves, elbow gloves, short hair, vibrator, vaginal object insertion, small breasts, ball gag, anal object insertion, white thighhighs, white hair, stationary restraints, nipples, sex machine, soles, pussy, black blindfold, motion blur, navel, feet, tally, spread legs
Negative prompt: error, signature, watermark, username, multiple people, animals, lowres, cropped, worth quality ,low quality, normal quality, jpeg artifacts, blurry, bad anatomy, bad hands, bad arms, bad feet, bad anatomy, missing fingers, extra digits, fewer digits, long neck, missing legs, huge person, optical_illusion
Steps: 25, Sampler: DPM++ SDE Karras, CFG scale: 6, Seed: 1438393014, Size: 512x768, Model hash: f32626be, Model: yye48, Denoising strength: 0.7, Clip skip: 2, ENSD: 31338, Hires upscale: 2, Hires upscaler: Latent (bicubic)
```
sfw sample (e48): [file here](https://huggingface.co/trojblue/yys/blob/main/%5Blatnet%20nearest%20exact%5D31338-DPM%2B%2B%202M%20Karras-step27-cfg6.5-8a648075-20230108_133253_026580.png)
```
yingyi, 1girl, waves, water, long hair, splashing, liquid hair, very long hair, a painting of a blue wave with a white background and a person standing on the wave of water in the middle, ying yi, hatsune miku, vocaloid, absurdres, highres, aqua eyes, aqua hair, bird, boots, bridal gauntlets, cape, fish, nail polish, thigh boots, tropical fish, twintails, wading, white bird
Negative prompt: error, signature, watermark, username, realistic,3d, multiple people, extra legs, animals, lowres, cropped, worth quality, low quality, normal quality, jpeg artifacts, bad anatomy, bad hands, bad arms, bad feet, bad anatomy, missing fingers, extra digits,explicit, fewer digits, long neck, missing legs, huge person, optical_illusion
Steps: 27, Sampler: DPM++ 2M Karras, CFG scale: 6.5, Seed: 31338, Size: 384x576, Model hash: 8a648075, Model: yy_e28, Denoising strength: 0.7, Clip skip: 2, ENSD: 31338, Hires upscale: 2, Hires upscaler: Latent (nearest-exact)
```
|
0483e4731f87a800a6a555fa1aa7ff1b
|
mrm8488/bert-small2bert-small-finetuned-cnn_daily_mail-summarization
|
mrm8488
|
encoder-decoder
| 8 | 2,568 |
transformers
| 4 |
summarization
| true | false | false |
apache-2.0
|
['en']
|
['cnn_dailymail']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['summarization']
| false | true | true | 1,702 | false |
# Bert-small2Bert-small Summarization with 🤗EncoderDecoder Framework
This model is a warm-started *BERT2BERT* ([small](https://huggingface.co/google/bert_uncased_L-4_H-512_A-8)) model fine-tuned on the *CNN/Dailymail* summarization dataset.
The model achieves a **17.37** ROUGE-2 score on *CNN/Dailymail*'s test dataset.
For more details on how the model was fine-tuned, please refer to
[this](https://colab.research.google.com/drive/1Ekd5pUeCX7VOrMx94_czTkwNtLN32Uyu?usp=sharing) notebook.
## Results on test set 📝
| Metric | # Value |
| ------ | --------- |
| **ROUGE-2** | **17.37** |
## Model in Action 🚀
```python
from transformers import BertTokenizerFast, EncoderDecoderModel
import torch
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
tokenizer = BertTokenizerFast.from_pretrained('mrm8488/bert-small2bert-small-finetuned-cnn_daily_mail-summarization')
model = EncoderDecoderModel.from_pretrained('mrm8488/bert-small2bert-small-finetuned-cnn_daily_mail-summarization').to(device)
def generate_summary(text):
# cut off at BERT max length 512
inputs = tokenizer([text], padding="max_length", truncation=True, max_length=512, return_tensors="pt")
input_ids = inputs.input_ids.to(device)
attention_mask = inputs.attention_mask.to(device)
output = model.generate(input_ids, attention_mask=attention_mask)
return tokenizer.decode(output[0], skip_special_tokens=True)
text = "your text to be summarized here..."
generate_summary(text)
```
> Created by [Manuel Romero/@mrm8488](https://twitter.com/mrm8488) | [LinkedIn](https://www.linkedin.com/in/manuel-romero-cs/)
> Made with <span style="color: #e25555;">♥</span> in Spain
|
47d3a87c8e9e19c7df3b03069fe18863
|
TencentGameMate/chinese-wav2vec2-large
|
TencentGameMate
|
wav2vec2
| 6 | 658 |
transformers
| 4 | null | true | false | false |
mit
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 1,902 | false |
Pretrained on the 10k-hour WenetSpeech L subset. More details in [TencentGameMate/chinese_speech_pretrain](https://github.com/TencentGameMate/chinese_speech_pretrain)
This model does not have a tokenizer as it was pretrained on audio alone.
In order to use this model for speech recognition, a tokenizer should be created and the model should be fine-tuned on labeled text data.
python package:
transformers==4.16.2
```python
import torch
import torch.nn.functional as F
import soundfile as sf
from transformers import (
    Wav2Vec2FeatureExtractor,
    Wav2Vec2ForPreTraining,
    Wav2Vec2Model,
)
from transformers.models.wav2vec2.modeling_wav2vec2 import _compute_mask_indices

device = "cuda" if torch.cuda.is_available() else "cpu"  # device used by the model and inputs below

model_path = ""  # e.g. "TencentGameMate/chinese-wav2vec2-large" or a local path
wav_path = ""    # path to a 16 kHz mono wav file
mask_prob = 0.0
mask_length = 10
feature_extractor = Wav2Vec2FeatureExtractor.from_pretrained(model_path)
model = Wav2Vec2Model.from_pretrained(model_path)
# for pretrain: Wav2Vec2ForPreTraining
# model = Wav2Vec2ForPreTraining.from_pretrained(model_path)
model = model.to(device)
model = model.half()
model.eval()
wav, sr = sf.read(wav_path)
input_values = feature_extractor(wav, return_tensors="pt").input_values
input_values = input_values.half()
input_values = input_values.to(device)
# for Wav2Vec2ForPreTraining
# batch_size, raw_sequence_length = input_values.shape
# sequence_length = model._get_feat_extract_output_lengths(raw_sequence_length)
# mask_time_indices = _compute_mask_indices((batch_size, sequence_length), mask_prob=0.0, mask_length=2)
# mask_time_indices = torch.tensor(mask_time_indices, device=input_values.device, dtype=torch.long)
with torch.no_grad():
outputs = model(input_values)
last_hidden_state = outputs.last_hidden_state
# for Wav2Vec2ForPreTraining
# outputs = model(input_values, mask_time_indices=mask_time_indices, output_hidden_states=True)
# last_hidden_state = outputs.hidden_states[-1]
```
|
d5b463713cb4cd17d9c5281d02c0bc10
|
hesw23168/SD_Elysium_Kuro_Model
|
hesw23168
| null | 4 | 0 | null | 12 | null | false | false | false |
openrail
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 856 | false |
Also on https://civitai.com/models/5301/elysium-kuro-anime
The anime model is a custom mix + finetune on a dataset of high-quality images (mix including Anything 4.0, WD 1.4 Booru, Seek Art Mega V1) and contains the kl-f8-anime2 VAE from Waifu Diffusion.
Example settings:
Negative prompt: (lowres:1.1), (worst quality:1.2), (low quality:1.1), bad anatomy, bad hands, text, error, missing fingers, extra digit, fewer digits, cropped, normal quality, jpeg artifacts, signature, watermark, username, blurry
(General model): Clip skip 1, VAE: 'vae-ft-mse-840000' from StabilityAI (included)
(Anime model): Clip skip 2, VAE: 'kl-f8-anime2.ckpt' from Waifu Diffusion (included)
Example images from anime model:

General model coming soon.
|
4e5acad4b15c14b71707cc70be1af34a
|
Geotrend/distilbert-base-lt-cased
|
Geotrend
|
distilbert
| 6 | 2 |
transformers
| 0 |
fill-mask
| true | false | false |
apache-2.0
|
['lt']
|
['wikipedia']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 1,215 | false |
# distilbert-base-lt-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-lt-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-lt-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact amine@geotrend.fr for any question, feedback or request.
|
b14bf8c7dfcf5d6d7cb31c8319b5bac2
|
inovex/multi2convai-corona-de-bert
|
inovex
|
bert
| 8 | 3 |
transformers
| 1 |
text-classification
| true | false | false |
mit
|
['de']
| null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['text-classification', 'pytorch', 'transformers']
| false | true | true | 860 | false |
# Multi2ConvAI-Corona: finetuned Bert for German
This model was developed in the [Multi2ConvAI](https://multi2conv.ai) project:
- domain: Corona (more details about our use cases: ([en](https://multi2convai/en/blog/use-cases), [de](https://multi2convai/en/blog/use-cases)))
- language: German (de)
- model type: finetuned Bert
## How to run
Requires:
- Huggingface transformers
### Run with Huggingface Transformers
````python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
tokenizer = AutoTokenizer.from_pretrained("inovex/multi2convai-corona-de-bert")
model = AutoModelForSequenceClassification.from_pretrained("inovex/multi2convai-corona-de-bert")
````
## Further information on Multi2ConvAI:
- https://multi2conv.ai
- https://github.com/inovex/multi2convai
- mailto: info@multi2conv.ai
|
a6bdac6aa5e94a38b6dbe261b8da5bda
|
TeamFnord/manga-ocr
|
TeamFnord
|
vision-encoder-decoder
| 8 | 1 |
transformers
| 2 |
image-to-text
| true | false | false |
apache-2.0
|
['ja']
|
['manga109s']
| null | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
['image-to-text']
| false | true | true | 618 | false |
# Manga OCR
Optical character recognition for Japanese text, with the main focus being Japanese manga.
It uses [Vision Encoder Decoder](https://huggingface.co/docs/transformers/model_doc/visionencoderdecoder) framework.
Manga OCR can be used as a general purpose printed Japanese OCR, but its main goal was to provide a high quality
text recognition, robust against various scenarios specific to manga:
- both vertical and horizontal text
- text with furigana
- text overlaid on images
- wide variety of fonts and font styles
- low quality images
Code is available [here](https://github.com/kha-white/manga_ocr).
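A minimal usage sketch with the `manga-ocr` package from the repository linked above (assuming `pip install manga-ocr`; the image path is a placeholder):
```python
from manga_ocr import MangaOcr

mocr = MangaOcr()  # downloads the pretrained weights on first use
text = mocr("example_panel.jpg")  # placeholder path to a manga panel image
print(text)
```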
|
e29c225b38a608b65973a67fb7401c23
|
CalamitousVisibility/enron-spam-checker-10000
|
CalamitousVisibility
|
distilbert
| 13 | 1 |
transformers
| 0 |
text-classification
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,057 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# enron-spam-checker-10000
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0512
- Accuracy: 0.9915
- F1: [0.99143577 0.99156328]
## Model description
More information needed
## Intended uses & limitations
More information needed
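A minimal inference sketch (assuming the checkpoint ships its tokenizer; note that the card does not document which label corresponds to spam):
```python
from transformers import pipeline

checker = pipeline(
    "text-classification",
    model="CalamitousVisibility/enron-spam-checker-10000",
)
print(checker("Congratulations! You have been selected to receive a free prize."))
```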
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
### Framework versions
- Transformers 4.22.1
- Pytorch 1.12.1+cpu
- Datasets 2.5.1
- Tokenizers 0.12.1
|
9852e110fbae9ca77d76e6b0f3495d0a
|
annahaz/xlm-roberta-base-finetuned-misogyny-en-it-hi-beng
|
annahaz
|
xlm-roberta
| 10 | 1 |
transformers
| 0 |
text-classification
| true | false | false |
mit
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 2,323 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-misogyny-en-it-hi-beng
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0140
- Accuracy: 0.9970
- F1: 0.9969
- Precision: 0.9937
- Recall: 1.0
- Mae: 0.0030
## Model description
More information needed
## Intended uses & limitations
More information needed
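A minimal scoring sketch (assuming the checkpoint ships its tokenizer; the card does not document which class index means misogynous):
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

name = "annahaz/xlm-roberta-base-finetuned-misogyny-en-it-hi-beng"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

inputs = tokenizer("An example sentence to score.", return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)  # per-class probabilities; see note above about class meaning
```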
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall | Mae |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:------:|:---------:|:------:|:------:|
| 0.3131 | 1.0 | 1759 | 0.4655 | 0.7820 | 0.7682 | 0.7855 | 0.7516 | 0.2180 |
| 0.2644 | 2.0 | 3518 | 0.3231 | 0.8619 | 0.8665 | 0.8091 | 0.9326 | 0.1381 |
| 0.2408 | 3.0 | 5277 | 0.3515 | 0.8801 | 0.8877 | 0.8071 | 0.9863 | 0.1199 |
| 0.1927 | 4.0 | 7036 | 0.1428 | 0.9514 | 0.9512 | 0.9194 | 0.9853 | 0.0486 |
| 0.1333 | 5.0 | 8795 | 0.1186 | 0.9712 | 0.9707 | 0.9478 | 0.9947 | 0.0288 |
| 0.1163 | 6.0 | 10554 | 0.0546 | 0.9879 | 0.9875 | 0.9803 | 0.9947 | 0.0121 |
| 0.0854 | 7.0 | 12313 | 0.0412 | 0.9899 | 0.9896 | 0.9804 | 0.9989 | 0.0101 |
| 0.086 | 8.0 | 14072 | 0.0252 | 0.9949 | 0.9948 | 0.9896 | 1.0 | 0.0051 |
| 0.0395 | 9.0 | 15831 | 0.0179 | 0.9965 | 0.9963 | 0.9927 | 1.0 | 0.0035 |
| 0.0343 | 10.0 | 17590 | 0.0140 | 0.9970 | 0.9969 | 0.9937 | 1.0 | 0.0030 |
### Framework versions
- Transformers 4.20.1
- Pytorch 1.9.0+cu111
- Datasets 2.3.2
- Tokenizers 0.12.1
|
ecf6c4bc2e80e8526f8309fa3f98c8bb
|
kadirnar/yolov6s-v3.0
|
kadirnar
| null | 3 | 0 | null | 0 |
object-detection
| false | false | false |
gpl-3.0
| null |
['detection-datasets/coco']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['object-detection', 'computer-vision', 'yolov6', 'pypi']
| false | true | true | 1,168 | false |
### Model Description
[YOLOv6:](https://arxiv.org/abs/2209.02976) A single-stage object detection framework dedicated to industrial applications.
[YOLOv6 v3.0](https://arxiv.org/abs/2301.05586): A Full-Scale Reloading
[YOLOv6-Pip: Packaged version of the Yolov6 repository](https://github.com/kadirnar/yolov6-pip/)
[Paper Repo: Implementation of paper - YOLOv6](https://github.com/meituan/YOLOv6/)
### Installation
```
pip install yolov6detect
```
### Yolov6 Inference
```python
from yolov6 import YOLOV6
model = YOLOV6(weights='kadirnar/yolov6s-v3.0', device='cuda:0', hf_model=True)
model.classes = None
model.conf = 0.25
model.iou = 0.45
model.show = False
model.save = True
pred = model.predict(source='data/images',yaml='data/coco.yaml', img_size=640)
```
### BibTeX Entry and Citation Info
```
@article{li2022yolov6,
title={YOLOv6: A single-stage object detection framework for industrial applications},
author={Li, Chuyi and Li, Lulu and Jiang, Hongliang and Weng, Kaiheng and Geng, Yifei and Li, Liang and Ke, Zaidan and Li, Qingyuan and Cheng, Meng and Nie, Weiqiang and others},
journal={arXiv preprint arXiv:2209.02976},
year={2022}
}
```
|
75098da5951924099efd3c31176f4b33
|
Helsinki-NLP/opus-mt-hr-fr
|
Helsinki-NLP
|
marian
| 10 | 75 |
transformers
| 0 |
translation
| true | true | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['translation']
| false | true | true | 768 | false |
### opus-mt-hr-fr
* source languages: hr
* target languages: fr
* OPUS readme: [hr-fr](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/hr-fr/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-01-09.zip](https://object.pouta.csc.fi/OPUS-MT-models/hr-fr/opus-2020-01-09.zip)
* test set translations: [opus-2020-01-09.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/hr-fr/opus-2020-01-09.test.txt)
* test set scores: [opus-2020-01-09.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/hr-fr/opus-2020-01-09.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| JW300.hr.fr | 26.1 | 0.482 |
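A minimal translation sketch with the usual Marian classes (the Croatian example sentence is illustrative):
```python
from transformers import MarianMTModel, MarianTokenizer

name = "Helsinki-NLP/opus-mt-hr-fr"
tokenizer = MarianTokenizer.from_pretrained(name)
model = MarianMTModel.from_pretrained(name)

batch = tokenizer(["Dobro jutro!"], return_tensors="pt", padding=True)
translated = model.generate(**batch)
print(tokenizer.batch_decode(translated, skip_special_tokens=True))
```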
|
2ea55f6463cdc0b251bd0456bfd431c7
|
rikkar/dsd_futurism
|
rikkar
| null | 3 | 0 | null | 0 | null | false | false | false |
cc0-1.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 1,196 | false |
Stable Diffusion model trained for 5k steps on the art style "futurism".
Invoke the style with "in the style of futtt". Play with the weights; it's a strong style, so prompt accordingly.
Sample images:



Sample training images:





|
74a2d1fc3b1b15d66fca985e281ce1af
|
lkm2835/distilbert-imdb
|
lkm2835
|
distilbert
| 10 | 5 |
transformers
| 0 |
text-classification
| true | false | false |
apache-2.0
| null |
['imdb']
| null | 1 | 0 | 1 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,113 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-imdb
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the imdb dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
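A minimal inference sketch (label names come from the checkpoint's config):
```python
from transformers import pipeline

sentiment = pipeline("text-classification", model="lkm2835/distilbert-imdb")
print(sentiment("This movie was a complete waste of time."))
```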
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 391 | 0.1849 | 0.9281 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.11.0+cu113
- Datasets 2.0.0
- Tokenizers 0.11.6
|
48f608b576343b1207926fa44674957f
|
infinitejoy/wav2vec2-large-xls-r-300m-indonesian
|
infinitejoy
|
wav2vec2
| 17 | 8 |
transformers
| 0 |
automatic-speech-recognition
| true | false | false |
apache-2.0
|
['id']
|
['common_voice']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['automatic-speech-recognition', 'mozilla-foundation/common_voice_7_0', 'generated_from_trainer']
| true | true | true | 2,571 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-indonesian
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the MOZILLA-FOUNDATION/COMMON_VOICE_7_0 - ID dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2759
- Wer: 0.3256
## Model description
More information needed
## Intended uses & limitations
More information needed
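A minimal transcription sketch (assuming the repository includes a processor/tokenizer config and that the input is a 16 kHz mono recording; the file name is a placeholder):
```python
import torch
import soundfile as sf
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

name = "infinitejoy/wav2vec2-large-xls-r-300m-indonesian"
processor = Wav2Vec2Processor.from_pretrained(name)
model = Wav2Vec2ForCTC.from_pretrained(name)

speech, sr = sf.read("sample_16k.wav")  # placeholder: 16 kHz mono wav file
inputs = processor(speech, sampling_rate=sr, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```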
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 7e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 4000
- num_epochs: 100.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 3.0387 | 4.72 | 1000 | 3.0892 | 1.0 |
| 1.7911 | 9.43 | 2000 | 0.8451 | 0.6702 |
| 1.2826 | 14.15 | 3000 | 0.4211 | 0.4166 |
| 1.1802 | 18.87 | 4000 | 0.3508 | 0.4690 |
| 1.1065 | 23.58 | 5000 | 0.3319 | 0.4662 |
| 1.0921 | 28.3 | 6000 | 0.3056 | 0.3880 |
| 1.0366 | 33.02 | 7000 | 0.2997 | 0.3665 |
| 0.9988 | 37.74 | 8000 | 0.2972 | 0.3653 |
| 0.9864 | 42.45 | 9000 | 0.2697 | 0.3371 |
| 0.9558 | 47.17 | 10000 | 0.2739 | 0.3141 |
| 0.9094 | 51.89 | 11000 | 0.2657 | 0.3533 |
| 0.9034 | 56.6 | 12000 | 0.2699 | 0.3397 |
| 0.8907 | 61.32 | 13000 | 0.2765 | 0.3470 |
| 0.8631 | 66.04 | 14000 | 0.2774 | 0.3346 |
| 0.8389 | 70.75 | 15000 | 0.2743 | 0.3365 |
| 0.8214 | 75.47 | 16000 | 0.2778 | 0.3201 |
| 0.8195 | 80.19 | 17000 | 0.2725 | 0.3286 |
| 0.7994 | 84.91 | 18000 | 0.2782 | 0.3315 |
| 0.7816 | 89.62 | 19000 | 0.2775 | 0.3363 |
| 0.7816 | 94.34 | 20000 | 0.2731 | 0.3278 |
| 0.7635 | 99.06 | 21000 | 0.2767 | 0.3259 |
### Framework versions
- Transformers 4.16.0.dev0
- Pytorch 1.10.1+cu102
- Datasets 1.17.1.dev0
- Tokenizers 0.11.0
|
ecb0cbb8b084f6dd89700182f3194d73
|
BestJammer/HASDX
|
BestJammer
| null | 3 | 0 | null | 26 | null | false | false | false |
creativeml-openrail-m
| null | null | null | 3 | 0 | 0 | 3 | 2 | 0 | 2 |
[]
| false | true | true | 2,712 | false |
# About this bad ass beast of a checkpoint:
I merged a few checkpoints and got something buttery and amazing. Does great with things other than people too. It can do anything really. It doesn't need crazy prompts either. Keep it simple. No need for all the artist names and trending on whatever.
# PRUNED AND SMALLER ckpt FILES! AS WELL AS DIFFUSERS!
[Link here for diffusers and pruned](https://huggingface.co/johnslegers/hasdx)
The download links are also listed below.

Example Prompts:
* female, pale purple hair, frills, detailed skin, perfect face, fashion photography, photo realistic, 20 megapixel, canon eos r3, detailed skin, detailed, detailed face, (full body intricate, vibrant, photo realistic, realistic, dramatic, sharp focus, 8k)
* (extremely detailed photo 8k), full body shot photo of the most beautiful artwork, beautiful woman soldier, green hair, cleavage, wearing intricate advanced futuristic blue power armor, propped up on one elbow, cinematic lighting, very detailed face and eyes, park in background, high quality photo
Negative prompt examples:
* Asian, cartoon, 3d, (disfigured), (bad art), (deformed), (poorly drawn), (extra limbs), (close up), strange colors, blurry, boring, sketch, lackluster, big breast, large breast, huge breasts, face portrait, self-portrait, signature, letters, watermark
* Asian, large boobs, muscular, out of frame , worst quality , text , blurred , monstrous , hideous , ugly , duplicate , cropped , mutilated , horrifying
### CKPT Here and diffusers
[Download ckptSXDHAS.ckpt (7.7GB)](https://huggingface.co/BestJammer/HASDX/resolve/main/ckptSXDHAS.ckpt)
[Download hasdx_emaonly.ckpt (4.27GB)](https://huggingface.co/johnslegers/hasdx/resolve/main/hasdx_emaonly.ckpt)
[Download hasdx.ckpt (2.13GB)](https://huggingface.co/johnslegers/hasdx/resolve/main/hasdx.ckpt)
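A minimal `diffusers` sketch, assuming the diffusers-format weights mirrored at [johnslegers/hasdx](https://huggingface.co/johnslegers/hasdx) linked above (prompt shortened from the examples):
```python
import torch
from diffusers import StableDiffusionPipeline

# assumes the diffusers-format weights live at johnslegers/hasdx (see link above)
pipe = StableDiffusionPipeline.from_pretrained(
    "johnslegers/hasdx", torch_dtype=torch.float16
).to("cuda")

prompt = "female, pale purple hair, detailed skin, perfect face, fashion photography, photo realistic"
image = pipe(prompt).images[0]
image.save("hasdx_sample.png")
```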
# What I merged:
https://civitai.com/models/1349/sxd-berrymix-merge
https://civitai.com/models/2504/handas-3dkx10b
https://civitai.com/models/3762/general-purpose-model
The third one is a mystery because I cannot remember where I got it but it was called model.ckpt and I uploaded it myself because it was lost somewhere I cannot find. I did some .4 but most merges were .5 weight. I have tried so many merges and this one just clicked great.
I merged SXD with the model so I had modelsdx, then merged that with Handas.
Enjoy
### Not necessary at all
but if you're feeling generous and want to help support my unhealthy amount of AI generating and future art endeavors:
https://www.buymeacoffee.com/OnlyJams
https://www.Only-Jams.redbubble.com
|
1a51b82337fec752d16a509a4ccf5cac
|
espnet/YushiUeda_swbd_sentiment_asr_train_asr_conformer_wav2vec2_2
|
espnet
| null | 20 | 3 |
espnet
| 0 |
automatic-speech-recognition
| false | false | false |
cc-by-4.0
|
['en']
|
['swbd_sentiment']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['espnet', 'audio', 'automatic-speech-recognition']
| false | true | true | 179,437 | false |
## ESPnet2 ASR model
### `espnet/YushiUeda_swbd_sentiment_asr_train_asr_conformer_wav2vec2_2`
This model was trained by YushiUeda using the swbd_sentiment recipe in [espnet](https://github.com/espnet/espnet/).
### Demo: How to use in ESPnet2
```bash
cd espnet
git checkout 17089cb2cf5f1275132163f6327defbcc1b1bc1b
pip install -e .
cd egs2/swbd_sentiment/asr1
./run.sh --skip_data_prep false --skip_train true --download_model espnet/YushiUeda_swbd_sentiment_asr_train_asr_conformer_wav2vec2_2
```
## ASR config
<details><summary>expand</summary>
```
config: conf/tuning/train_asr_conformer_wav2vec2_2.yaml
print_config: false
log_level: INFO
dry_run: false
iterator_type: sequence
output_dir: exp/asr_train_asr_conformer_wav2vec2_2_raw_en_word
ngpu: 1
seed: 2022
num_workers: 2
num_att_plot: 3
dist_backend: nccl
dist_init_method: env://
dist_world_size: 2
dist_rank: 0
local_rank: 0
dist_master_addr: localhost
dist_master_port: 43183
dist_launcher: null
multiprocessing_distributed: true
unused_parameters: false
sharded_ddp: false
cudnn_enabled: true
cudnn_benchmark: false
cudnn_deterministic: true
collect_stats: false
write_collected_feats: false
max_epoch: 70
patience: null
val_scheduler_criterion:
- valid
- loss
early_stopping_criterion:
- valid
- loss
- min
best_model_criterion:
- - valid
- acc
- max
keep_nbest_models: 10
nbest_averaging_interval: 10
grad_clip: 5.0
grad_clip_type: 2.0
grad_noise: false
accum_grad: 2
no_forward_run: false
resume: true
train_dtype: float32
use_amp: false
log_interval: 100
use_matplotlib: true
use_tensorboard: true
use_wandb: false
wandb_project: null
wandb_id: null
wandb_entity: null
wandb_name: null
wandb_model_log_interval: -1
detect_anomaly: false
pretrain_path: null
init_param: []
ignore_init_mismatch: false
freeze_param:
- frontend.upstream
num_iters_per_epoch: null
batch_size: 20
valid_batch_size: null
batch_bins: 6000000
valid_batch_bins: null
train_shape_file:
- exp/asr_stats_raw_en_word/train/speech_shape
- exp/asr_stats_raw_en_word/train/text_shape.word
valid_shape_file:
- exp/asr_stats_raw_en_word/valid/speech_shape
- exp/asr_stats_raw_en_word/valid/text_shape.word
batch_type: numel
valid_batch_type: null
fold_length:
- 80000
- 150
sort_in_batch: descending
sort_batch: descending
multiple_iterator: false
chunk_length: 500
chunk_shift_ratio: 0.5
num_cache_chunks: 1024
train_data_path_and_name_and_type:
- - dump/raw/train/wav.scp
- speech
- sound
- - dump/raw/train/text
- text
- text
valid_data_path_and_name_and_type:
- - dump/raw/dev/wav.scp
- speech
- sound
- - dump/raw/dev/text
- text
- text
allow_variable_data_keys: false
max_cache_size: 0.0
max_cache_fd: 32
valid_max_cache_size: null
optim: adam
optim_conf:
lr: 0.002
weight_decay: 1.0e-05
scheduler: warmuplr
scheduler_conf:
warmup_steps: 5000
token_list:
- <blank>
- <unk>
- i
- and
- the
- you
- that
- it
- a
- Neutral
- to
- uh
- '''s'
- of
- know
- Positive
- they
- in
- we
- '''t'
- have
- but
- so
- was
- like
- Negative
- yeah
- is
- just
- um
- well
- do
- for
- think
- don
- there
- or
- 'on'
- '''re'
- my
- what
- really
- be
- with
- not
- if
- are
- one
- he
- '''ve'
- because
- '''m'
- about
- all
- get
- can
- had
- out
- at
- them
- when
- this
- as
- oh
- lot
- up
- people
- some
- then
- would
- go
- right
- mean
- now
- time
- kind
- got
- going
- good
- she
- things
- more
- were
- from
- something
- been
- 'no'
- see
- me
- too
- an
- your
- much
- little
- guess
- how
- where
- our
- very
- here
- their
- thing
- two
- '''ll'
- other
- did
- years
- work
- even
- has
- any
- way
- probably
- those
- could
- say
- real
- back
- '''d'
- year
- down
- home
- than
- want
- didn
- into
- pretty
- okay
- who
- take
- huh
- school
- said
- make
- over
- kids
- never
- always
- put
- by
- her
- stuff
- went
- doing
- three
- these
- 'yes'
- which
- around
- only
- big
- maybe
- 'off'
- anything
- day
- t
- sure
- actually
- come
- money
- him
- different
- everything
- still
- used
- many
- five
- will
- sort
- nice
- us
- last
- his
- thought
- every
- most
- getting
- first
- feel
- bit
- need
- children
- same
- course
- also
- new
- care
- family
- hum
- long
- through
- before
- use
- done
- should
- house
- old
- let
- does
- car
- being
- problem
- doesn
- four
- seems
- though
- pay
- look
- whole
- great
- husband
- haven
- try
- live
- trying
- ever
- why
- read
- better
- find
- far
- keep
- ago
- sometimes
- watch
- interesting
- quite
- area
- hard
- talking
- else
- another
- part
- bad
- having
- twenty
- whatever
- place
- couple
- usually
- 'true'
- high
- texas
- seen
- fact
- s
- enough
- after
- own
- college
- while
- country
- hundred
- somebody
- few
- either
- times
- week
- away
- gonna
- type
- job
- six
- dollars
- tell
- might
- remember
- again
- came
- give
- started
- start
- ten
- made
- play
- able
- dallas
- enjoy
- working
- once
- c
- someone
- life
- least
- v
- everybody
- since
- fun
- both
- talk
- wouldn
- ones
- news
- anyway
- wasn
- person
- heard
- believe
- am
- th
- buy
- may
- point
- call
- night
- y
- almost
- bye
- isn
- system
- wanted
- called
- took
- state
- wife
- child
- half
- women
- goes
- next
- yet
- especially
- love
- looking
- parents
- gone
- such
- gets
- understand
- together
- movie
- until
- w
- days
- end
- saying
- idea
- saw
- music
- mother
- thirty
- couldn
- makes
- stay
- change
- m
- basically
- wonderful
- problems
- guy
- worked
- spend
- help
- lived
- credit
- whether
- seem
- eight
- n
- best
- world
- run
- hear
- bought
- young
- each
- months
- seven
- places
- supposed
- city
- matter
- coming
- exactly
- d
- small
- summer
- comes
- certain
- company
- less
- thinking
- won
- during
- b
- thousand
- agree
- show
- daughter
- sounds
- myself
- funny
- water
- o
- month
- dog
- fifty
- paper
- gotten
- found
- taking
- today
- certainly
- boy
- friends
- number
- mine
- program
- food
- son
- p
- older
- name
- air
- movies
- government
- moved
- schools
- outside
- deal
- close
- tried
- paying
- eat
- drive
- hours
- nine
- rather
- cars
- crime
- important
- war
- living
- between
- business
- anymore
- reason
- weeks
- public
- vote
- situation
- recently
- nothing
- easy
- sit
- pick
- taxes
- turn
- full
- percent
- making
- friend
- book
- happen
- minutes
- middle
- town
- watching
- paid
- eighty
- tax
- several
- listen
- set
- talked
- north
- takes
- reading
- definitely
- law
- jury
- kinds
- married
- u
- enjoyed
- says
- without
- works
- learn
- everyone
- drug
- major
- side
- cost
- room
- education
- morning
- computer
- involved
- mostly
- aren
- health
- l
- anybody
- along
- amount
- man
- against
- weather
- often
- under
- age
- forty
- insurance
- favorite
- hope
- card
- must
- happened
- lives
- left
- drugs
- expensive
- american
- miles
- yourself
- hour
- already
- plano
- cards
- decided
- large
- difference
- ahead
- fifteen
- camping
- told
- although
- second
- r
- woman
- twelve
- knew
- guys
- cut
- neat
- fish
- mind
- wrong
- unless
- sense
- instead
- leave
- wear
- class
- hand
- top
- walk
- bring
- past
- f
- running
- e
- absolutely
- weekend
- line
- books
- question
- team
- wish
- exercise
- interested
- areas
- baby
- states
- liked
- somewhere
- father
- experience
- phone
- case
- men
- lots
- cat
- society
- taken
- changed
- game
- worth
- seventy
- gun
- h
- wonder
- hit
- group
- service
- kept
- shows
- gosh
- early
- interest
- trouble
- control
- themselves
- ha
- finally
- using
- god
- dad
- cook
- hot
- difficult
- nursing
- front
- terms
- growing
- late
- kid
- looked
- felt
- rain
- teach
- tend
- realize
- weren
- sixty
- except
- needs
- social
- budget
- figure
- recycling
- lake
- wanna
- looks
- wh
- forth
- mom
- concerned
- south
- grew
- topic
- ways
- death
- christmas
- regular
- wait
- imagine
- television
- east
- trees
- check
- fairly
- hate
- general
- catch
- dinner
- built
- ready
- fine
- sister
- story
- playing
- starting
- homes
- office
- awful
- radio
- needed
- companies
- changes
- programs
- fishing
- nineteen
- ask
- tough
- cans
- easier
- yard
- cold
- ought
- street
- later
- door
- wants
- students
- national
- space
- across
- brother
- free
- local
- tha
- level
- happens
- sitting
- newspaper
- move
- countries
- store
- subject
- girl
- beautiful
- turned
- soon
- income
- putting
- church
- university
- dress
- information
- lately
- degree
- york
- vacation
- pollution
- totally
- winter
- america
- ah
- ours
- cats
- spent
- happy
- played
- consider
- cases
- spring
- california
- longer
- teacher
- oil
- send
- lost
- sports
- garden
- teachers
- families
- particular
- buying
- amazing
- likes
- football
- united
- teaching
- hey
- benefits
- brought
- gave
- party
- worry
- throw
- testing
- given
- bunch
- near
- nobody
- community
- driving
- open
- personal
- sell
- force
- chance
- wow
- test
- baseball
- within
- biggest
- quality
- building
- example
- seeing
- power
- afford
- support
- caught
- inside
- plan
- seemed
- ninety
- younger
- learned
- generation
- charge
- punishment
- rest
- dogs
- become
- clean
- short
- privacy
- g
- calls
- plus
- particularly
- decide
- terrible
- twice
- fall
- extra
- period
- choice
- hold
- ended
- hadn
- main
- guilty
- depends
- save
- excellent
- price
- strange
- feeling
- size
- trial
- military
- boys
- per
- bet
- judge
- parts
- noticed
- anywhere
- fan
- head
- center
- glad
- clothes
- rate
- stop
- eleven
- white
- stand
- suppose
- guns
- grade
- watched
- bigger
- scary
- issue
- special
- dollar
- green
- its
- jobs
- means
- black
- worse
- knows
- plastic
- low
- spending
- picked
- golf
- gas
- single
- neighborhood
- necessarily
- alone
- cooking
- newspapers
- pull
- fast
- completely
- road
- student
- crimes
- houses
- paint
- medical
- learning
- fair
- restaurant
- miss
- lawn
- giving
- washington
- doctor
- word
- killed
- recycle
- light
- cash
- visit
- familiar
- grass
- itself
- season
- chicken
- rid
- president
- stayed
- normally
- whenever
- machine
- graduate
- eighteen
- capital
- shouldn
- virginia
- private
- field
- magazines
- kill
- market
- apartment
- anyone
- waiting
- asked
- classes
- break
- crazy
- helps
- aware
- sunday
- hm
- speak
- term
- sound
- property
- sad
- comfortable
- waste
- channel
- evening
- cover
- heavy
- carry
- everyday
- systems
- gives
- wa
- answer
- higher
- unfortunately
- minute
- future
- serious
- snow
- available
- smaller
- handle
- ground
- behind
- huge
- west
- plant
- allowed
- wind
- peace
- costs
- cause
- serve
- rent
- lucky
- gee
- build
- english
- telling
- lose
- individual
- gardening
- busy
- order
- raised
- basic
- basis
- rock
- training
- happening
- opinion
- heart
- follow
- mainly
- history
- walking
- ye
- average
- towards
- houston
- games
- travel
- decision
- environment
- respect
- list
- hopefully
- grow
- others
- sorry
- san
- taught
- weight
- bags
- hurt
- finding
- attention
- hasn
- computers
- raise
- aerobics
- quick
- shot
- personally
- bedroom
- similar
- loved
- sixties
- park
- helping
- feet
- industry
- write
- generally
- weird
- record
- benefit
- pool
- mail
- pennsylvania
- glass
- notice
- calling
- process
- land
- originally
- richardson
- cities
- afraid
- utah
- entire
- colorado
- ball
- boat
- grandmother
- possible
- folks
- helped
- strong
- keeping
- bill
- keeps
- thank
- camp
- third
- types
- eventually
- obviously
- yesterday
- apparently
- instance
- pet
- central
- club
- flowers
- trash
- trip
- classical
- europe
- changing
- perhaps
- self
- color
- foot
- video
- based
- station
- saturday
- french
- normal
- fire
- '''clock'
- issues
- starts
- piece
- hobby
- quit
- prison
- parent
- oldest
- bush
- coverage
- police
- forget
- girls
- occasionally
- bank
- shape
- beginning
- moving
- sent
- vietnam
- nights
- current
- salary
- himself
- stories
- mountains
- aluminum
- luck
- invasion
- tape
- florida
- bed
- laws
- research
- mess
- hoping
- players
- tired
- thirteen
- magazine
- expect
- sleep
- words
- language
- push
- position
- hobbies
- background
- plants
- inches
- easily
- stopped
- murder
- shoot
- maryland
- hardly
- bills
- attitude
- pro
- civil
- sometime
- human
- wanting
- goodness
- security
- doctors
- kitchen
- somehow
- penalty
- county
- eating
- simply
- die
- bike
- reunion
- project
- typical
- j
- however
- total
- mexico
- base
- economy
- restaurants
- responsibility
- jail
- lower
- died
- tested
- safe
- voting
- elderly
- sh
- listening
- sudden
- numbers
- career
- stick
- born
- wondering
- poor
- painting
- active
- professional
- supposedly
- li
- lady
- reasons
- cool
- sixteen
- yep
- excuse
- horrible
- political
- red
- science
- federal
- besides
- shop
- opportunity
- ride
- planning
- degrees
- writing
- mexican
- engineering
- surprised
- bother
- share
- graduated
- account
- financial
- hands
- activities
- seventies
- step
- thanks
- bag
- role
- england
- limit
- willing
- hospital
- view
- band
- teams
- tonight
- groups
- advantage
- heat
- department
- turns
- tree
- telephone
- became
- brand
- criminal
- blue
- dry
- warm
- weekends
- grown
- stores
- rights
- garbage
- junior
- everywhere
- prices
- metric
- ran
- equipment
- till
- cross
- considered
- track
- moment
- figured
- americans
- met
- worst
- ridiculous
- grocery
- yours
- neighbor
- piano
- sold
- cowboys
- selling
- savings
- grandchildren
- nowadays
- add
- plays
- conversation
- lunch
- straight
- sentence
- floor
- dead
- fourteen
- meet
- ideas
- foods
- israel
- fix
- ourselves
- swimming
- upset
- sign
- sewing
- wood
- recipe
- van
- upon
- standard
- box
- win
- wall
- offer
- products
- otherwise
- pounds
- stations
- ex
- staying
- drop
- body
- carolina
- sales
- meal
- ice
- basketball
- mixed
- careful
- possibly
- sick
- farm
- retired
- compared
- western
- hearing
- finished
- separate
- mentioned
- soviet
- truck
- river
- defense
- oklahoma
- harder
- k
- re
- stuck
- cable
- trade
- favor
- positive
- related
- smoke
- effect
- various
- bottom
- awhile
- kindergarten
- beat
- court
- beach
- baltimore
- choose
- allow
- brown
- hang
- known
- sorts
- bathroom
- scared
- popular
- extremely
- politics
- hair
- policy
- wha
- saint
- covered
- ca
- sisters
- boston
- lakes
- forever
- fight
- downtown
- visa
- sauce
- garage
- lines
- suit
- whereas
- speech
- direction
- animals
- corps
- fit
- majority
- chinese
- dark
- painted
- milk
- concern
- dump
- nature
- safety
- shoes
- star
- questions
- switch
- clear
- trips
- management
- beyond
- depending
- sing
- iraq
- pressure
- cute
- runs
- windows
- salad
- board
- chicago
- population
- legal
- super
- '''all'
- puts
- slow
- pets
- forward
- thousands
- style
- debt
- becoming
- mo
- pop
- violent
- italian
- earlier
- cheap
- weapons
- coast
- austin
- traveling
- passed
- x
- speaking
- points
- prefer
- threat
- further
- master
- table
- broken
- random
- row
- northern
- simple
- appreciate
- district
- train
- continue
- rangers
- pittsburgh
- truth
- value
- quickly
- raising
- pass
- tennis
- flower
- bass
- engine
- becomes
- variety
- jeans
- exciting
- organization
- spread
- sat
- incredible
- somewhat
- loan
- engineer
- doubt
- southern
- monday
- backyard
- forced
- papers
- express
- saving
- owned
- recent
- toward
- fortunate
- liberal
- shopping
- rough
- brothers
- worried
- meals
- scouts
- vacations
- hunting
- lawyers
- wisconsin
- bucks
- act
- voice
- helpful
- wide
- retirement
- cannot
- picture
- picking
- suspect
- spare
- held
- election
- study
- report
- begin
- antonio
- drove
- opposed
- league
- ju
- se
- solution
- closer
- character
- finish
- knowing
- million
- common
- services
- thinks
- player
- violence
- wrote
- highway
- reasonable
- afternoon
- series
- developed
- effort
- christian
- fantastic
- saved
- seventeen
- barbecue
- sun
- conditioning
- ohio
- babies
- arlington
- hole
- visited
- rural
- herself
- knowledge
- kn
- plans
- instruments
- above
- border
- bible
- losing
- china
- events
- leaving
- written
- taste
- friday
- schedule
- anytime
- showed
- aspect
- range
- earth
- rice
- broke
- tent
- excited
- roles
- situations
- rooms
- spot
- laid
- duty
- bottles
- russia
- fighting
- pound
- letter
- convenient
- thi
- storm
- original
- wild
- showing
- percentage
- required
- grandparents
- extent
- economic
- voted
- canada
- trust
- healthy
- dealing
- face
- hired
- discuss
- larger
- pleased
- eye
- constantly
- perfect
- stupid
- square
- mix
- meat
- semester
- necessary
- mandatory
- burning
- fly
- mothers
- aids
- checked
- bedrooms
- fresh
- advice
- tomatoes
- treat
- sale
- ford
- japanese
- burn
- correct
- limited
- sleeping
- actual
- ends
- female
- hundreds
- feelings
- impact
- leaves
- section
- lay
- provide
- planted
- factor
- fill
- rich
- deep
- someplace
- drives
- circumstances
- honda
- jersey
- smoking
- feels
- fifties
- access
- doors
- pattern
- names
- payment
- facilities
- automatic
- boxes
- hi
- pictures
- versus
- ability
- edge
- politicians
- amazed
- boss
- union
- neighbors
- distance
- prime
- article
- mistake
- grades
- bread
- bothers
- jeez
- rented
- fourth
- alcohol
- gulf
- catfish
- license
- shooting
- touch
- asking
- realized
- require
- natural
- expenses
- purchase
- energy
- talks
- colors
- smart
- considering
- lessons
- tremendous
- participate
- ages
- missed
- quiet
- cheaper
- cents
- payments
- iron
- frightening
- forgot
- cheese
- daughters
- lawyer
- creek
- dental
- seat
- humid
- belt
- michigan
- extended
- flat
- driver
- foreign
- stays
- adults
- songs
- due
- wet
- double
- stress
- desert
- drink
- material
- equal
- deterrent
- machines
- eastern
- boring
- apart
- vegetables
- recipes
- unusual
- responsible
- hire
- garland
- ho
- dangerous
- loans
- colleges
- served
- prisons
- recycled
- cousins
- gorgeous
- member
- values
- fell
- fund
- metal
- wolves
- technology
- form
- enjoyable
- entertainment
- successful
- juries
- brings
- likely
- convicted
- appeal
- minimum
- opposite
- sport
- complete
- smell
- gallon
- lord
- employees
- centers
- alive
- blow
- meant
- cutting
- relatives
- bus
- commit
- none
- jus
- holding
- sand
- swing
- courses
- ski
- breed
- heck
- casual
- blood
- admit
- join
- fi
- draw
- upper
- bell
- youngest
- traffic
- protect
- tends
- medicine
- strongly
- committed
- opinions
- brick
- sides
- congress
- gasoline
- regularly
- plenty
- collect
- williams
- tickets
- perspective
- damage
- present
- bowl
- kidding
- employee
- tests
- loves
- round
- nations
- german
- roof
- august
- october
- disney
- pieces
- solid
- knock
- facts
- concept
- specific
- option
- jump
- stage
- block
- items
- murders
- breaks
- dirty
- shirts
- package
- pair
- pants
- data
- opera
- standing
- roll
- count
- action
- physical
- differently
- teenagers
- checks
- replace
- independent
- neither
- tuition
- eyes
- theater
- educational
- bins
- animal
- reports
- senior
- window
- curious
- de
- argument
- june
- date
- extreme
- innocent
- december
- germany
- salt
- et
- cetera
- tomorrow
- educated
- clubs
- bird
- sons
- journal
- visiting
- pulled
- letting
- tech
- fixed
- el
- shorts
- assume
- message
- primarily
- signs
- cuts
- john
- jazz
- balance
- un
- walked
- shirt
- dropped
- latin
- feed
- influence
- wondered
- adult
- aid
- inner
- elementary
- negative
- swim
- projects
- raleigh
- practically
- grand
- nearly
- turning
- cleaning
- fort
- recommend
- ate
- skiing
- rules
- yellow
- cruise
- impressed
- address
- labor
- dish
- highly
- repair
- prior
- fee
- terribly
- experiences
- lead
- accept
- mart
- immediately
- portion
- nicer
- seafood
- fault
- disease
- truly
- wearing
- male
- dances
- closed
- product
- expected
- caused
- tapes
- relaxing
- culture
- technical
- criminals
- sentencing
- summertime
- indiana
- killing
- encourage
- housing
- practice
- ups
- stitch
- compare
- sentenced
- freedom
- belong
- purpose
- throwing
- crafts
- pushing
- sweet
- decent
- sew
- campus
- carpet
- channels
- repairs
- preschool
- please
- minnesota
- activity
- naturally
- cooked
- quarterback
- wise
- satisfied
- cadillac
- streets
- businesses
- honest
- automatically
- routine
- coach
- arm
- driven
- dishes
- mornings
- contact
- mall
- deficit
- humidity
- location
- fortunately
- atmosphere
- corporate
- meeting
- improvement
- engineers
- network
- dressed
- mcdonald
- spanish
- catholic
- organizations
- hill
- model
- fifth
- elected
- articles
- expecting
- seriously
- volunteer
- handy
- riding
- threw
- ooh
- trend
- ba
- arts
- thursday
- uncle
- relationship
- members
- throughout
- buffalo
- solve
- pain
- auto
- cholesterol
- planned
- prepared
- presented
- staff
- choices
- march
- filled
- overall
- discipline
- justice
- weights
- mile
- unit
- bringing
- beef
- camped
- wal
- mow
- microwave
- weapon
- inch
- rule
- traveled
- subscribe
- proper
- di
- classic
- software
- pays
- complex
- missing
- shepherd
- pleasure
- st
- cream
- expense
- automobile
- hers
- orleans
- king
- philosophy
- singing
- eighties
- enjoys
- democratic
- significant
- chore
- ev
- combination
- patterns
- disappointed
- republican
- media
- pre
- sesame
- fixing
- seconds
- passing
- daily
- trek
- signed
- raining
- accident
- scale
- interests
- route
- ma
- whoever
- reach
- judges
- evidence
- european
- seasons
- supporting
- dirt
- loose
- france
- cancer
- planting
- iowa
- increase
- hospitals
- maintain
- odd
- pregnant
- math
- press
- agency
- shrimp
- beer
- key
- puppy
- sending
- hardest
- tr
- wi
- return
- corner
- suits
- dakota
- al
- immediate
- possibility
- hooked
- song
- stadium
- frame
- dig
- navy
- comedy
- annual
- fear
- island
- exercising
- fancy
- fat
- enjoying
- motivated
- design
- affect
- investment
- recall
- co
- luxury
- trim
- flexible
- international
- furniture
- potatoes
- wou
- fellow
- breakfast
- bath
- trucks
- uses
- onto
- beans
- apple
- alabama
- records
- musical
- tie
- setting
- offs
- michael
- bugs
- freeze
- anyhow
- properly
- underneath
- dining
- aside
- quarter
- kentucky
- skills
- parole
- parks
- nation
- complain
- wine
- summers
- fans
- golden
- unanimous
- shift
- warranty
- plastics
- rates
- rains
- charged
- lincoln
- decisions
- checking
- gray
- laugh
- hills
- commercial
- recognize
- quote
- receive
- recording
- illegal
- generations
- advance
- motor
- outdoor
- lab
- honestly
- rap
- oriented
- match
- art
- fiction
- manage
- flip
- appropriate
- strict
- mad
- mental
- hung
- adds
- mileage
- bicycle
- thoroughly
- elections
- deserve
- indian
- according
- latest
- bu
- ta
- vehicle
- holidays
- july
- junk
- emergency
- convinced
- graduating
- kick
- including
- teenage
- ceiling
- valley
- victim
- ocean
- hell
- steel
- rainy
- noise
- marvelous
- drunk
- studying
- mountain
- hood
- greatest
- facility
- generate
- desk
- improve
- tells
- sex
- results
- si
- manager
- goal
- teenager
- concert
- copy
- africa
- paycheck
- woods
- lubbock
- sentences
- prevent
- impossible
- split
- faster
- speed
- thin
- chose
- monthly
- stands
- turkey
- repeat
- japan
- financially
- lights
- page
- pulling
- explain
- potential
- rape
- wash
- minor
- thrown
- professor
- pan
- vegetable
- fried
- onions
- roommate
- effects
- wire
- shame
- individuals
- sweat
- scene
- yards
- whose
- thoughts
- draft
- useful
- welfare
- organized
- communities
- realistic
- directly
- print
- printer
- purchased
- aunt
- prepare
- millions
- challenge
- twins
- badly
- thick
- pure
- bar
- roads
- missouri
- tall
- library
- added
- sam
- marriage
- gardens
- lesser
- views
- understanding
- prove
- deer
- delicious
- containers
- depend
- denver
- favorites
- tear
- site
- code
- winds
- parties
- relatively
- opened
- falling
- fascinating
- forties
- options
- sharing
- attached
- owner
- version
- modern
- standpoint
- eaten
- fully
- neck
- trials
- knee
- uncomfortable
- temperature
- chemical
- processing
- fruit
- lovely
- bothered
- pot
- causes
- rea
- diet
- theory
- conflict
- earn
- disagree
- exposed
- administration
- breaking
- buildings
- fence
- shocked
- retire
- wedding
- ch
- dust
- acid
- pushed
- blame
- contract
- carried
- nurse
- overseas
- texan
- fuel
- whe
- vehicles
- increased
- necessity
- plate
- hitting
- reduce
- blocks
- hide
- silly
- length
- writer
- film
- development
- refrigerator
- engines
- louis
- relate
- citizens
- dorm
- began
- hawaii
- january
- wheel
- gourmet
- shots
- bushes
- theirs
- outrageous
- sea
- hook
- conscious
- videos
- mastercard
- suburb
- chevy
- tiny
- mowing
- bulbs
- flag
- detroit
- brakes
- charges
- retriever
- towns
- contribute
- arms
- slacks
- definite
- difficulty
- produce
- cultures
- cou
- discovered
- whatnot
- philadelphia
- ou
- electronic
- strictly
- tendency
- mister
- regard
- con
- approach
- friendly
- handled
- governor
- louisiana
- urban
- develop
- pardon
- construction
- classroom
- personality
- currently
- tour
- apply
- memory
- francisco
- affected
- complicated
- risk
- shock
- roses
- movement
- tied
- teaches
- nuts
- halfway
- softball
- masters
- causing
- cake
- unbelievable
- cast
- characters
- actor
- association
- wallpaper
- habit
- blowing
- expert
- screen
- bake
- dessert
- tents
- minneapolis
- tin
- wars
- steps
- structure
- motivation
- buddy
- minds
- wound
- coat
- holes
- covers
- shell
- tries
- undergraduate
- springs
- banks
- kuwait
- kansas
- established
- dozen
- steak
- following
- massachusetts
- jewish
- affects
- hotel
- sight
- tight
- birthday
- statement
- weeds
- consumer
- understood
- tastes
- cartoons
- apartments
- cares
- settled
- september
- letters
- atlanta
- newer
- guarantee
- citizen
- occasion
- attorneys
- tom
- levels
- sweaters
- tires
- direct
- wagon
- remarkable
- result
- shower
- hello
- commercials
- cassette
- forms
- standards
- james
- native
- falls
- comment
- peers
- wore
- pleasant
- mid
- region
- essentially
- differences
- fitness
- symphony
- finger
- ad
- sounded
- joined
- trained
- toyota
- motors
- aspects
- candidate
- votes
- hunt
- electronics
- charging
- registered
- ed
- electric
- bite
- gifts
- manufacturing
- farmers
- participating
- legislation
- los
- angeles
- ticket
- survive
- catching
- eliminate
- ryan
- luckily
- teeth
- ill
- hated
- offices
- file
- hassle
- universal
- entertain
- roast
- traditional
- entertaining
- crisis
- officer
- saudi
- participated
- profession
- gue
- soap
- johnson
- task
- dumb
- gain
- broad
- surgery
- dressing
- condition
- tex
- grill
- camper
- note
- managed
- increasing
- rained
- parking
- wake
- mistakes
- pitch
- cucumbers
- prescription
- shut
- forgotten
- conditions
- rehabilitation
- gold
- waited
- substitute
- lift
- crowd
- gym
- tools
- divorced
- practical
- avoid
- spray
- seats
- severe
- litter
- trunk
- programming
- soft
- discover
- cs
- zero
- firm
- army
- post
- rarely
- virtually
- suddenly
- relative
- technically
- frustrating
- nursery
- checkbook
- rolls
- colored
- division
- jack
- districts
- guitar
- leaders
- permanent
- puerto
- su
- ultimately
- race
- biking
- statistics
- accepted
- hussein
- steal
- shown
- menu
- pension
- youth
- pride
- create
- knit
- walks
- guide
- fry
- til
- requirements
- reporting
- networks
- chain
- soil
- jumped
- hysterical
- target
- wasting
- horse
- buses
- dear
- butter
- thanksgiving
- instrument
- cared
- unemployment
- switchboard
- vice
- morals
- focus
- beds
- wednesday
- george
- principal
- non
- scores
- grandfather
- qualified
- burned
- courts
- cousin
- proud
- ham
- hits
- literally
- transferred
- institution
- debts
- collection
- weed
- cigarettes
- homework
- corruption
- clarion
- purposes
- improved
- applied
- closet
- corn
- tomato
- lasagna
- pickup
- collecting
- immigration
- sooner
- resources
- largest
- hurting
- soccer
- treated
- shore
- bored
- abuse
- mayor
- continental
- professionals
- verdict
- carrying
- button
- drinking
- dying
- reliable
- transportation
- subjects
- fees
- unfortunate
- evenings
- craft
- scout
- languages
- scratch
- sears
- thirties
- solutions
- sherman
- stack
- funds
- skirt
- fed
- correctly
- listened
- clothing
- serving
- supervisor
- mark
- materials
- lewisville
- below
- chemicals
- era
- incentive
- coffee
- offered
- interior
- determine
- sets
- alternative
- instructor
- dance
- saddam
- discussion
- joke
- boating
- fabulous
- ship
- funding
- groceries
- entirely
- sitter
- communications
- democrat
- cafeteria
- corporation
- squash
- peppers
- nor
- pour
- flour
- waco
- controls
- argentina
- flying
- coal
- nuclear
- february
- saturdays
- phoenix
- electrical
- wage
- laying
- effective
- robin
- wealthy
- hampshire
- concerns
- hall
- figures
- rochester
- agreement
- pages
- bitty
- cowboy
- dealers
- features
- argue
- commitment
- hanging
- policeman
- critical
- user
- dried
- strip
- pie
- balls
- eggs
- among
- lifting
- phase
- desire
- final
- jogging
- bless
- attack
- taxed
- acres
- april
- oven
- pack
- claim
- gorbachev
- wherever
- troops
- illinois
- industries
- trailer
- grab
- pitching
- nineties
- ranch
- ti
- mortgage
- mill
- sue
- register
- attorney
- alike
- adopted
- tournament
- involvement
- silver
- perfectly
- slightly
- meetings
- primary
- sixth
- employer
- survey
- indoor
- partly
- addition
- nervous
- georgia
- recreation
- internal
- rise
- schooling
- previous
- mood
- stolen
- birds
- director
- named
- mustang
- mystery
- upstairs
- goods
- reunions
- perform
- reality
- hurry
- scattered
- environmental
- limits
- cleaned
- tons
- concrete
- belts
- cabin
- rolling
- review
- invaded
- invade
- obvious
- requires
- typically
- religious
- religion
- opportunities
- intelligent
- peter
- album
- drawing
- trumpet
- stock
- household
- customer
- kay
- cotton
- tennessee
- specifically
- lowest
- moon
- reputation
- honor
- secretary
- rico
- assumed
- realizing
- attitudes
- rat
- vegetarian
- occurred
- practicing
- promote
- adding
- designed
- delivered
- nah
- category
- disk
- exact
- pilot
- costing
- brake
- mercedes
- pr
- abortion
- texans
- moral
- capable
- applications
- beneficial
- flavor
- drain
- reporter
- clock
- aggravating
- politically
- governments
- clearly
- designing
- burden
- laughed
- topics
- chunk
- spots
- streams
- efficient
- slowly
- arkansas
- discussed
- conservative
- flute
- choir
- sugar
- answering
- lists
- babysitter
- impression
- lets
- david
- forces
- thumb
- cop
- creative
- dip
- switched
- pine
- content
- aerobic
- conversations
- touched
- candidates
- legitimate
- assistant
- annoying
- finance
- vietnamese
- husbands
- storms
- pump
- lawns
- patio
- roots
- russian
- plot
- mouth
- amounts
- suffering
- headlines
- hunter
- acre
- ties
- measure
- la
- trout
- guidelines
- bonus
- emotional
- cow
- unique
- providing
- encouraged
- positions
- barely
- criteria
- olds
- tradition
- scares
- workers
- iran
- toys
- tornado
- moves
- ton
- recyclable
- crowded
- ladies
- melt
- crack
- finances
- score
- crawfish
- transmission
- purple
- mavericks
- eve
- babysitting
- committing
- maintenance
- exposure
- cassettes
- socially
- reagan
- soup
- hiking
- athlete
- cheesecake
- grandson
- skunk
- addison
- skied
- realistically
- profit
- emissions
- skirts
- heels
- awards
- silence
- lambs
- whatsoever
- lotus
- offering
- unquote
- forest
- phones
- miniature
- medium
- grandma
- goo
- finishing
- judicial
- penalties
- ki
- hose
- hungry
- success
- monitor
- application
- pink
- depressing
- supper
- bureaucracy
- status
- territory
- mississippi
- exercises
- preference
- peo
- packages
- broadcast
- doctorate
- scholarship
- grows
- lean
- anxious
- core
- voluntary
- minority
- couples
- ears
- crochet
- selected
- voters
- democrats
- authority
- airport
- horror
- fox
- sub
- professors
- legs
- stir
- celery
- eats
- chocolate
- cup
- asleep
- studies
- afterwards
- slip
- lap
- connection
- individually
- dependent
- foundation
- worthwhile
- fields
- freedoms
- giants
- stars
- kittens
- vet
- balanced
- homeless
- birth
- mu
- campaign
- empty
- scenes
- heads
- kicked
- messed
- arabia
- greatly
- bob
- talent
- nurses
- strike
- reached
- dedicated
- suggested
- guard
- basement
- laughing
- communication
- ghost
- abused
- token
- plane
- beating
- former
- films
- fought
- failed
- lesson
- lo
- walls
- sink
- girlfriend
- accused
- hurts
- loud
- gang
- consistent
- stereo
- fa
- struggling
- interview
- employment
- borrowed
- spoiled
- tub
- tea
- mex
- lemon
- bin
- evidently
- grant
- tremendously
- cartons
- opening
- mi
- skin
- seed
- acceptable
- filter
- golly
- sits
- coke
- followed
- basics
- psychology
- operate
- owns
- freezing
- nissan
- te
- accidents
- settle
- leader
- poverty
- dr
- masking
- fiancee
- jugs
- landfill
- heavily
- lie
- trends
- interstate
- competitive
- arguments
- weigh
- competition
- surprising
- temporary
- inclined
- overnight
- priority
- darn
- honey
- roy
- accurate
- rocks
- babysit
- priced
- twin
- le
- ban
- athletes
- lack
- pond
- muscles
- connecticut
- anyways
- pacific
- owners
- freon
- responsibilities
- toxic
- permit
- closely
- pitched
- dresses
- scenery
- kevin
- costner
- greater
- enemy
- granted
- welcome
- define
- advertising
- salesman
- reverse
- ideal
- locked
- directions
- object
- figuring
- frequently
- boot
- therefore
- jails
- murdered
- purdue
- received
- led
- picks
- include
- democracy
- studied
- fond
- climate
- alaska
- sake
- avid
- healthier
- fired
- connected
- stealing
- chances
- humane
- supported
- enjoyment
- penny
- turtles
- encouraging
- ea
- marketing
- garlic
- broccoli
- potato
- suburbs
- formal
- rush
- concentrate
- woodworking
- leaf
- cent
- automobiles
- ozone
- devices
- source
- comedies
- landing
- semi
- agent
- string
- precious
- ugly
- phenomenal
- hilarious
- winning
- doe
- mobile
- farther
- chili
- landscape
- path
- someday
- complaining
- sky
- load
- baked
- stove
- bend
- en
- command
- decides
- attacks
- wished
- ac
- yearly
- weekly
- indeed
- brief
- mike
- dealer
- emergencies
- event
- charlotte
- slapstick
- purely
- included
- unfair
- meaning
- injuries
- vermont
- cornstarch
- egg
- worrying
- wrap
- buff
- advertisements
- plain
- chores
- mention
- allows
- novels
- bases
- billion
- protected
- workout
- cancel
- daddy
- outdoors
- novel
- bruce
- awfully
- constant
- spends
- accent
- deductions
- dealt
- informed
- tournaments
- snake
- penn
- sox
- tho
- root
- rip
- combat
- polls
- sundays
- blank
- frozen
- assistance
- ads
- hiring
- drivers
- recession
- convert
- alternate
- dryer
- lightning
- gr
- chair
- emotionally
- angry
- mature
- treatment
- lousy
- seventh
- ninth
- deck
- printed
- answers
- jumping
- mentality
- popcorn
- shade
- oaks
- reasonably
- budgeting
- controlled
- british
- unreal
- mini
- performance
- tip
- ge
- handgun
- toy
- skip
- armed
- fleas
- redo
- deposit
- goldfish
- childhood
- removed
- surprises
- dodge
- consulting
- sacrifice
- placed
- sailing
- classics
- bottle
- secretaries
- diesel
- liter
- chosen
- boats
- returned
- item
- november
- adoption
- fewer
- pizza
- feature
- nebraska
- cafe
- alzheimer
- agreed
- choosing
- council
- bermuda
- suspense
- satisfaction
- winters
- headed
- murphy
- customers
- habits
- norm
- loss
- bec
- crawl
- exist
- attractive
- wor
- leg
- selection
- prob
- sources
- audience
- styles
- davis
- borrow
- goals
- determined
- accounts
- pat
- vs
- whi
- advantages
- diapers
- pin
- models
- queen
- sticks
- mesquite
- canal
- incredibly
- feeding
- importance
- salvador
- fathers
- regardless
- translation
- frustrated
- bond
- structured
- counting
- factors
- economical
- involves
- radical
- depressed
- universities
- shall
- tank
- jesus
- counselor
- proposal
- allowing
- pocket
- airplane
- gangs
- saints
- consideration
- dolls
- horses
- spouse
- midwest
- fashioned
- screw
- curriculum
- oakland
- candy
- blanket
- backpack
- industrial
- smog
- canyon
- elect
- backed
- bear
- comfort
- economically
- warmer
- sunny
- exhausted
- afternoons
- ranger
- worries
- orange
- physically
- experiment
- famous
- copies
- cardboard
- pa
- demand
- polluted
- tail
- compatible
- wordperfect
- drag
- float
- carter
- presidential
- dug
- israelis
- relations
- arab
- rings
- estate
- salaries
- recognition
- headline
- nowhere
- ratings
- asia
- ei
- lifestyle
- tenth
- preparing
- cookies
- fifteenth
- bait
- experienced
- defendant
- surprise
- cocaine
- reminds
- liquid
- destroy
- century
- admire
- rare
- tuned
- schwartzkopf
- reduced
- cruel
- cheers
- picnic
- accounting
- pace
- jane
- tune
- knees
- holy
- owe
- pepper
- worms
- bricks
- mound
- additional
- flow
- tended
- refuse
- landfills
- stance
- cry
- dumping
- memories
- anyplace
- geared
- arrangements
- depth
- tuesday
- raw
- neighborhoods
- policemen
- net
- located
- trail
- edition
- purchases
- injury
- beliefs
- statements
- sin
- cultural
- shorter
- guilt
- 'false'
- economics
- enormous
- lifetime
- advanced
- adopt
- mechanical
- liters
- dream
- bachelor
- nasty
- scare
- laundry
- strikes
- quilt
- chlorine
- shed
- whom
- ds
- convince
- courtroom
- volleyball
- domestic
- stomach
- concerts
- stepfather
- typewriter
- clouds
- rating
- gifted
- generals
- clip
- screwed
- australia
- maine
- quarters
- chrysler
- oldsmobile
- pistol
- membership
- seldom
- supply
- tornadoes
- hu
- oth
- porch
- persian
- lakers
- tarpley
- seattle
- thrilled
- boards
- brian
- roughly
- paints
- attic
- ceilings
- baths
- pig
- killer
- pros
- paris
- brooks
- dealership
- developing
- islands
- kennedy
- ending
- ratio
- created
- separated
- lasts
- wives
- jean
- spaghetti
- village
- biased
- operating
- enid
- crappie
- employers
- conference
- tuna
- tole
- pollutants
- jones
- handling
- emission
- vary
- initially
- finds
- obligation
- select
- carefully
- barrier
- strangest
- spaniel
- blues
- comparison
- attend
- focused
- ver
- blacks
- jurors
- floors
- spell
- wears
- heel
- wooden
- assistants
- accustomed
- mild
- bands
- bang
- alrighty
- campbell
- tours
- panama
- believes
- corrupt
- cocoa
- interestingly
- makeup
- communism
- etcetera
- historical
- heating
- hispanic
- bilingual
- ultimate
- bicycling
- elsewhere
- scientific
- combine
- ar
- consequences
- gal
- cure
- grader
- corporations
- stitching
- grief
- leading
- graphics
- regards
- rank
- personalities
- mission
- whiz
- voter
- controlling
- believed
- minded
- kyle
- author
- certified
- shelter
- historically
- protecting
- fits
- carrots
- knitting
- professionally
- specialty
- jars
- needlework
- robert
- regarding
- billions
- rental
- nolan
- ruined
- searching
- taco
- mama
- relationships
- exchange
- highways
- handicapped
- scouting
- discouraging
- dropping
- electricity
- stacks
- catalytic
- muffler
- pipe
- error
- compete
- cajun
- haul
- discussing
- kurds
- anti
- orchestra
- needle
- ireland
- investments
- dramatically
- drawback
- raises
- growth
- definition
- guatemala
- receiving
- reported
- aikman
- shoulder
- banking
- highest
- jimmy
- jim
- cardinals
- jamaica
- magic
- convictions
- usage
- hamburgers
- sporting
- muscle
- sophisticated
- element
- occur
- designated
- depression
- covering
- tooth
- filling
- sharp
- strawberry
- relax
- advise
- enter
- throat
- instances
- allowance
- stronger
- debate
- literature
- shelves
- remove
- advertised
- progress
- smith
- richard
- raped
- offense
- detail
- christians
- tore
- accomplish
- released
- loaning
- bright
- intense
- dies
- peas
- steaks
- spicy
- conditioner
- convenience
- drought
- cups
- nee
- russians
- yeltsin
- thirds
- acting
- northwest
- freeway
- curbside
- corpus
- publicized
- mets
- memorial
- onion
- garages
- employed
- lazy
- wrestling
- crab
- loaded
- stationary
- coupons
- ripped
- balances
- convict
- loving
- represent
- judgment
- pork
- wasted
- selecting
- recover
- divide
- civic
- builds
- quicker
- translate
- churches
- slice
- discount
- swear
- nap
- centered
- vitamins
- planes
- contractor
- drastically
- elaborate
- continued
- decline
- uncles
- utilities
- camera
- musicians
- musician
- condominium
- augustine
- tolerant
- southwest
- counselors
- mirrors
- communicate
- worker
- medication
- powerful
- manure
- replaced
- redone
- shotgun
- memphis
- turtle
- supreme
- owning
- cycle
- jay
- airline
- sir
- method
- mayonnaise
- execution
- plea
- mower
- buttons
- campaigns
- log
- quarterbacks
- hamburger
- arizona
- ignore
- bred
- indianapolis
- envelope
- conversion
- hail
- flooding
- spanked
- fluid
- bay
- leather
- italy
- locations
- blew
- extensive
- traded
- transition
- kilometers
- robbing
- kills
- cadillacs
- randomly
- institute
- triangle
- mercury
- volvo
- dan
- leads
- pe
- rome
- attraction
- aunts
- latex
- texoma
- rabbit
- audi
- methodist
- basements
- tee
- clarinet
- walker
- massive
- stroke
- leak
- sites
- deals
- lined
- embarrassed
- slab
- officially
- behavior
- examples
- witness
- wishes
- unlisted
- terminal
- modem
- poodle
- weighs
- paul
- subscription
- chapter
- likewise
- documents
- shoe
- miserable
- jacket
- lax
- varies
- peach
- blows
- disco
- suicide
- bo
- downhill
- profitable
- twenties
- official
- pressures
- image
- monies
- absentee
- senate
- ethnic
- involve
- proven
- offenders
- afghans
- borders
- peaceful
- ab
- blown
- lock
- adequate
- scholarships
- offers
- bat
- injection
- useless
- revolution
- mormon
- enforce
- cosby
- preapproved
- fortune
- messing
- promised
- sum
- frankly
- damn
- gravy
- boil
- remembered
- consuming
- metropolitan
- gift
- seeds
- factories
- layer
- costly
- usual
- cooler
- daytime
- appearance
- sufficient
- balcony
- chasing
- chest
- las
- plumbing
- farming
- becau
- cleaner
- packed
- cried
- lover
- indians
- racial
- occasional
- rivers
- pollute
- locally
- contribution
- presentations
- laser
- represented
- guests
- apples
- hank
- closest
- oak
- missionaries
- rob
- mailing
- ring
- bias
- newsweek
- nicely
- tables
- zone
- faith
- cheapest
- excuses
- fail
- administrator
- baylor
- sued
- emotions
- appeared
- notes
- tying
- nail
- shake
- comp
- entry
- peer
- sore
- sticky
- pudding
- knowledgeable
- haze
- mass
- stressed
- academy
- considerably
- rowlett
- shortly
- nose
- ordered
- crying
- handed
- wages
- input
- praying
- warfare
- accomplished
- woke
- regulation
- equivalent
- bankrupt
- jog
- ell
- ri
- appeals
- extraordinary
- metroplex
- absolute
- conclusion
- accountable
- glory
- pray
- prisoners
- bomb
- destroyed
- testament
- pu
- suggest
- polish
- principle
- gardener
- beets
- behave
- periods
- shrubs
- sprinkler
- fajitas
- describe
- release
- motorcycle
- bound
- styrofoam
- valuable
- tolerate
- attempt
- jordan
- exists
- screaming
- stump
- breathing
- selfish
- dick
- blonde
- maximum
- max
- secret
- holds
- landscaping
- reads
- prevalent
- galveston
- weirdest
- joy
- nationwide
- soda
- coin
- dukakis
- steam
- embarrassing
- plates
- incorporate
- deductible
- machinery
- categories
- funded
- chairs
- recommended
- handicap
- bowling
- meantime
- accord
- tyler
- mosquitoes
- booklet
- coaches
- syria
- dinners
- holiday
- baltic
- priorities
- recognized
- wipe
- longest
- suburban
- delayed
- backgrounds
- varied
- eighth
- den
- coats
- theme
- nicest
- penney
- adjust
- hou
- toilet
- bullet
- rapidly
- capabilities
- hilly
- container
- layoff
- watches
- jewelry
- maker
- infant
- resent
- blade
- watering
- wildlife
- decorating
- fabric
- leadership
- privilege
- exotic
- loop
- seasoning
- chopped
- retiring
- backseat
- par
- leukemia
- ammunition
- barrel
- pontiac
- mazda
- expressway
- administer
- unions
- function
- stopping
- organize
- parenting
- schedules
- slept
- wheels
- resource
- competing
- sees
- careers
- pits
- carpeting
- legislature
- functional
- divorce
- bridge
- transfer
- needlepoint
- cookbook
- breast
- published
- portland
- throws
- counts
- larry
- louisville
- com
- glued
- tube
- slide
- protective
- felony
- dursban
- renting
- rebuild
- london
- shingles
- lea
- stink
- puppies
- schnauzer
- steering
- plugs
- mechanic
- worn
- inflation
- diving
- stretch
- purse
- introduced
- stripped
- occupied
- siamese
- controversy
- buick
- religiously
- allergic
- edges
- sail
- nancy
- biographies
- nonfiction
- thunderstorms
- intend
- educate
- nerve
- recordings
- concentration
- steve
- academic
- freshman
- sophomore
- neutered
- ponds
- disgusting
- narrow
- comparing
- associate
- adjusted
- cottage
- foster
- rake
- outstanding
- appreciated
- malpractice
- thankful
- personnel
- selective
- administrative
- comparable
- pier
- contributing
- cart
- explore
- commits
- affair
- cleveland
- glasses
- downstairs
- details
- backpacking
- blackberries
- alternator
- antilock
- peeves
- chris
- billy
- henry
- smooth
- polluting
- sweats
- fever
- sweater
- wyoming
- filmed
- guts
- respond
- theories
- database
- culturally
- threatened
- tears
- messages
- ear
- bark
- grandpa
- versions
- lee
- wave
- analysis
- gear
- comments
- colorful
- photography
- victims
- resolution
- stiff
- brazil
- minister
- interpret
- hero
- lebanon
- declare
- heritage
- escape
- columbia
- prescriptions
- assumption
- berkeley
- combined
- traditionally
- relaxation
- entering
- regulate
- consciousness
- react
- sexual
- proved
- booze
- cloth
- herald
- instructors
- vested
- consultant
- taxpayer
- lethal
- restricted
- pub
- directed
- frequent
- tempted
- hat
- treadmill
- abilene
- hates
- skinny
- turnout
- bouncing
- wayne
- beforehand
- deserves
- ninja
- expand
- probation
- eliminated
- yogurt
- powder
- boyfriend
- blankets
- alarm
- vacuum
- chop
- strips
- ruin
- knots
- bits
- rogers
- guessing
- addicted
- pitcher
- fingers
- rascal
- whip
- ag
- vegas
- response
- advocate
- donate
- proposed
- emphasis
- transit
- carpool
- map
- sheets
- punch
- calories
- strenuous
- laboratory
- resolve
- serves
- drum
- compact
- tigon
- initial
- moms
- identify
- respected
- vision
- visits
- eagle
- summary
- illustrated
- dial
- extraordinarily
- intelligence
- stages
- troy
- injured
- increases
- joints
- dayton
- mary
- deduct
- administrators
- pressing
- contest
- arguing
- marked
- seek
- gross
- roberts
- mentally
- session
- failing
- occasions
- videotape
- clever
- jerry
- mutant
- warning
- intellectual
- approve
- declared
- hallway
- edging
- pressed
- strawberries
- nieces
- sour
- homemade
- trick
- mixture
- solar
- inspection
- global
- winner
- drawn
- trace
- sympathetic
- managing
- anchors
- sulphur
- chuck
- overcrowded
- stole
- dean
- steven
- bi
- thursdays
- appear
- collapse
- dome
- flex
- stressful
- ok
- paroled
- apt
- patient
- injustice
- farmer
- socialized
- snap
- clay
- wintertime
- beaches
- touching
- curb
- clippings
- flowerbeds
- toes
- buffer
- hardware
- republic
- battle
- heading
- units
- shadow
- yankees
- rounded
- immigrant
- diseases
- caesar
- saves
- nephews
- slowed
- grounds
- snakes
- abilities
- missiles
- nova
- pen
- digging
- drew
- pools
- strung
- port
- sticking
- orioles
- hopes
- ov
- fertilizer
- railroad
- rub
- robberies
- theft
- tourist
- sta
- stood
- eligible
- freshwater
- saltwater
- shark
- fool
- commute
- deciding
- fam
- terrific
- catalogs
- froze
- ethic
- controversial
- crossed
- georgetown
- soy
- hoi
- pasta
- dreams
- painful
- filthy
- innocence
- leaning
- cleared
- feasible
- perception
- lottery
- parochial
- announced
- ll
- gallons
- kindercare
- behavioral
- classrooms
- merchandise
- washer
- refrigerators
- tinker
- supplies
- stimulation
- alert
- furthest
- cease
- reward
- biology
- starter
- prairie
- drill
- johnny
- experiments
- exercised
- paneling
- tougher
- strain
- noisy
- instill
- housework
- gap
- auditor
- dot
- maternity
- butler
- amarillo
- mulch
- actions
- lawsuits
- senators
- anniversary
- bonding
- leisure
- fertilize
- dragging
- decorated
- statewide
- format
- skeptical
- pad
- mode
- justify
- budgets
- seniors
- chief
- efforts
- hispanics
- drastic
- frost
- layoffs
- temperatures
- airlines
- hoses
- safer
- nails
- salads
- clients
- vans
- surely
- pulls
- operation
- sells
- bikes
- unable
- permanently
- slight
- rifle
- impulse
- manual
- handguns
- gauge
- someth
- youngsters
- karate
- hotels
- demanding
- wool
- warnings
- sanctions
- attract
- mysteries
- tenths
- pots
- neglected
- sliced
- leagues
- bulls
- celtics
- struggle
- qualify
- bars
- lucked
- cliff
- cabins
- relaxed
- gates
- oregon
- loads
- crystal
- fumes
- previews
- floating
- reviews
- peaks
- poorer
- matters
- continues
- costa
- geographic
- earthquake
- intrigued
- ain
- albums
- singapore
- proof
- bulb
- spayed
- fr
- skating
- robbery
- sector
- horn
- drafting
- premeditated
- frustration
- radiator
- boundaries
- bureau
- belonged
- nephew
- officers
- serger
- seam
- choral
- dating
- genuine
- requirement
- gradually
- asians
- establish
- effectively
- reel
- ra
- steady
- produces
- switzerland
- calm
- anthony
- suzuki
- plymouth
- sized
- thread
- centimeters
- recorder
- signal
- brands
- resolved
- converted
- dumped
- spur
- trap
- yell
- smarter
- humanities
- amherst
- sheriff
- safely
- completed
- equally
- labs
- foam
- sociology
- entertained
- lobster
- title
- recommendation
- residential
- vicious
- lease
- outer
- honesty
- switching
- freezer
- tollway
- heavier
- bahamas
- sperry
- rollers
- mowed
- cougar
- chi
- crooks
- lips
- remodeled
- cocker
- eigh
- syndrome
- overweight
- titles
- lettuce
- gather
- span
- greenville
- drip
- senator
- dam
- zip
- lexus
- peninsula
- counseling
- grapevine
- parental
- branch
- travels
- atlantic
- screening
- thr
- veterans
- substance
- golfers
- golfer
- manually
- carbon
- disposition
- harrison
- putt
- disability
- marry
- infants
- engaged
- braves
- mums
- provo
- boots
- commercialized
- replacing
- moisture
- assign
- router
- saws
- translators
- alleviate
- acquainted
- caring
- incinerator
- receipt
- scrub
- setup
- hazardous
- wardrobe
- jackets
- blouses
- suspenseful
- graphic
- gary
- monitoring
- hacker
- india
- desirable
- invite
- reaction
- fantasy
- shocking
- recorded
- addresses
- rig
- instructions
- faced
- advances
- paperwork
- tongue
- cha
- accommodate
- motion
- performed
- composer
- horrendous
- beatles
- crop
- applying
- budgeted
- coda
- seminars
- challenging
- righty
- cave
- dragged
- conscientious
- lenient
- warehouse
- managers
- windy
- allergies
- flu
- inordinately
- cinderella
- shoulders
- progressive
- cam
- colonial
- nicaragua
- exception
- translations
- scream
- independence
- cope
- economies
- tropical
- consequently
- difficulties
- plead
- disturbed
- correlation
- movements
- athletic
- stoned
- invested
- coincidence
- analyze
- chip
- miracle
- fif
- kee
- inmates
- external
- civilian
- trapped
- ghetto
- amenities
- clutch
- disposable
- makers
- pursue
- organ
- blast
- pluses
- racquetball
- lobbyists
- republicans
- outskirts
- carpenter
- buck
- predict
- backwards
- wok
- sweets
- ugh
- tablespoon
- singer
- shops
- singers
- stockings
- mirror
- crocheting
- zucchini
- voices
- pockets
- exhaust
- oxides
- victimized
- cynical
- colder
- castle
- listed
- deliberately
- spoken
- adventure
- repeats
- imagination
- viewing
- bench
- catcher
- bull
- corners
- dustin
- hoffman
- kmart
- concerning
- bulk
- accepting
- eerie
- na
- properties
- lying
- sturdy
- logic
- dated
- slick
- separating
- talented
- raiders
- device
- macintosh
- statistical
- sausage
- italians
- canoe
- thrill
- honeymoon
- arabs
- defending
- stability
- pops
- musicals
- sends
- asks
- ringing
- versa
- opens
- offhand
- dana
- envision
- philosophical
- charity
- volunteering
- commentaries
- informal
- commentary
- viewpoint
- independently
- sections
- nope
- firmly
- forcing
- flags
- gathered
- gett
- neil
- jagged
- awakening
- julia
- beside
- initiated
- pole
- kidnapping
- witnesses
- handles
- panel
- refined
- portions
- moments
- accessible
- hollywood
- norman
- assets
- tire
- pursued
- factory
- au
- romance
- fuels
- presentation
- closets
- hips
- rated
- publish
- protestant
- females
- crowds
- poorly
- identified
- buys
- stuffed
- chamber
- brass
- arrest
- productive
- ticks
- earned
- prisoner
- reimbursement
- spiritual
- z
- pronounce
- riskier
- protection
- consistently
- endless
- charles
- rebellion
- pacifist
- curse
- unto
- spirit
- barbara
- bombs
- tearing
- struck
- heaven
- theaters
- northeast
- licensed
- reducing
- peoples
- lithuania
- damaged
- bacon
- worm
- bug
- sprays
- bloom
- rye
- leasing
- nightmare
- beautifully
- washing
- nurseries
- neglect
- mixes
- frying
- guacamole
- disc
- populated
- cooperation
- bundle
- nickel
- rely
- insulation
- powers
- soldiers
- leery
- iraqi
- germans
- safest
- appears
- whoa
- republics
- participation
- reference
- disgusted
- hauling
- permitted
- orientals
- excluded
- stone
- sack
- crush
- fills
- crap
- fisher
- leap
- interact
- publicity
- brooklyn
- idiot
- easter
- vines
- extensively
- fou
- extras
- shootings
- knife
- outcome
- pensacola
- fished
- interviews
- disappointing
- overworked
- speedy
- apathy
- juror
- ann
- appointed
- spite
- ballot
- counter
- appetite
- technician
- complaints
- begins
- reaching
- referred
- influences
- swayed
- award
- slips
- stranded
- bankruptcy
- users
- socialize
- boom
- secondary
- captured
- backward
- intellectually
- bean
- measured
- remind
- bolt
- swung
- dryers
- extension
- hooks
- trinity
- lasting
- hatred
- snack
- altogether
- heal
- restore
- restored
- deeper
- strength
- link
- graders
- noticeable
- lowering
- preferred
- remarkably
- baroque
- barry
- townhouse
- fertilizing
- decade
- slower
- pl
- hop
- creates
- alternatives
- gains
- operated
- forgetting
- detector
- deliberate
- cycling
- legally
- bridges
- prize
- adolescents
- gamut
- slant
- fascinated
- baskets
- glue
- collector
- accountant
- rides
- def
- remote
- professions
- suggesting
- crafty
- remembers
- bears
- identical
- burns
- basket
- believer
- document
- korea
- lasted
- meatballs
- waist
- rear
- stretching
- fold
- kroger
- linoleum
- angle
- wo
- diverse
- buyer
- bullets
- banning
- bargain
- breeding
- humor
- evil
- q
- illness
- peop
- oldsmobiles
- fiance
- bodied
- educating
- showers
- mud
- connect
- bothering
- rebuilding
- kuwaiti
- possibilities
- overcast
- cloudy
- hurricanes
- forecast
- ru
- therapist
- scott
- rugs
- angel
- wheat
- editor
- caretaker
- liking
- kiss
- inevitably
- chat
- unhappy
- comfortably
- litt
- variation
- protest
- fences
- samples
- messy
- affectionate
- disabled
- barking
- production
- kelly
- corvette
- fanatic
- towel
- firing
- coaching
- presents
- burglar
- overcrowding
- lane
- imprisonment
- arrested
- asian
- wrecked
- beauty
- olympics
- conviction
- playground
- garth
- rs
- jam
- literary
- cre
- execute
- cartoon
- nearby
- fundamental
- ribbon
- bobby
- montessori
- sofa
- fetched
- rolled
- sewed
- starters
- crocheted
- liberties
- nintendo
- majoring
- associated
- threatening
- freezes
- traction
- perspectives
- southeast
- carp
- advertise
- pint
- merit
- durham
- meryl
- snowed
- advisors
- terrorism
- sectors
- joint
- terrain
- citizenship
- melted
- ounces
- ounce
- keys
- races
- smokers
- sensible
- bradshaw
- hip
- af
- richmond
- sen
- readily
- consistency
- canned
- enforcement
- contracts
- cons
- differ
- suffer
- tool
- specialist
- flies
- confidence
- esteem
- ironing
- inexpensive
- slots
- buffet
- cuisine
- congressman
- persuaded
- minorities
- stranger
- brush
- coastline
- blind
- cape
- dow
- partially
- calcium
- vast
- abroad
- museum
- physician
- physicians
- redid
- erie
- cooperative
- survival
- har
- exac
- intentionally
- affecting
- urine
- grandkids
- agricultural
- beam
- display
- constitution
- capitol
- ordinary
- babysat
- aggressive
- journalism
- grad
- tia
- olive
- collin
- casserole
- cakes
- operas
- accents
- almo
- oprah
- tiles
- tile
- trillions
- struggled
- tips
- tulsa
- museums
- sailboat
- perch
- styling
- seville
- rotten
- ken
- dentist
- maverick
- medicare
- douglas
- leased
- insane
- madison
- dock
- subdivision
- pouring
- wooded
- departments
- airplanes
- pilots
- premium
- ol
- liberty
- malls
- fossil
- produced
- bumper
- purchasing
- gentleman
- tribe
- wordstar
- rinse
- santa
- broth
- thomas
- addressed
- unconsciously
- enchiladas
- slickers
- rib
- lawry
- housekeeping
- opener
- doll
- sierra
- nuskin
- legend
- ruben
- batteries
- drywall
- disturbing
- relief
- devastating
- confined
- strides
- incineration
- drums
- cement
- leaked
- presently
- semiconductor
- firms
- foremost
- hoods
- sample
- client
- update
- predominantly
- gory
- dancing
- inherent
- harmed
- sneak
- invisible
- obligated
- invariably
- supervisors
- dentists
- chew
- randy
- understandable
- springer
- artist
- stardom
- taylor
- synthesis
- adapt
- pla
- labeled
- label
- attended
- manuals
- stephen
- stimulating
- improvements
- veterinarian
- serial
- wrongly
- preschoolers
- conditioned
- detailed
- unload
- highs
- collar
- identification
- stones
- zoo
- owens
- sandinistas
- greedy
- kings
- roosevelt
- bananas
- tempting
- lessened
- performances
- greek
- plots
- sean
- statehood
- quo
- assuming
- significantly
- woul
- ve
- occurring
- stringent
- troubled
- resistance
- regional
- disastrous
- practices
- alternates
- approved
- believing
- joe
- iraqis
- habitual
- bone
- dope
- threaten
- inventory
- bibs
- tasted
- afghan
- quilts
- riot
- earning
- backup
- christ
- begun
- guaranteed
- beats
- monetary
- ne
- involving
- punishable
- instantly
- hog
- logistics
- joining
- tutor
- doggone
- hats
- remodeling
- allen
- cabinets
- motivate
- inspired
- computerized
- pers
- extremes
- willingness
- excitement
- jacobs
- architect
- lump
- shared
- evaluate
- exclusive
- expanded
- tablespoons
- ginger
- peanuts
- sang
- choirs
- finals
- aggravated
- okra
- ruled
- landmark
- restrictions
- smack
- investing
- drier
- hotter
- orlando
- adventures
- scrap
- battery
- timing
- boeing
- alcoholic
- sullivan
- continuing
- ukraine
- adjustments
- astros
- claws
- declawed
- rushed
- stray
- void
- chase
- messes
- procedures
- underwear
- skill
- politician
- mitch
- caddo
- prizes
- lids
- files
- tra
- questioned
- wolf
- thunder
- howl
- buffaloes
- honduras
- wealth
- contributes
- wider
- soak
- installed
- converter
- authorities
- visible
- ash
- suspected
- agencies
- mouse
- printout
- producing
- unix
- blueberry
- hike
- overly
- baker
- assault
- restraint
- enj
- danny
- couch
- arnold
- ridge
- gene
- clo
- unemployed
- ahold
- dislike
- equality
- mistaken
- aged
- quoted
- harsh
- realizes
- upstate
- expend
- brinkley
- complaint
- slanted
- restricting
- halls
- wheelchair
- supervised
- terry
- monstrous
- drawbacks
- fights
- learns
- fallen
- challenged
- rewarding
- mailed
- snowing
- ni
- wreck
- amongst
- misery
- schwarzenegger
- goofy
- entered
- rationale
- prosecutor
- excused
- bare
- lawsuit
- audio
- teti
- eh
- lacking
- memorable
- wisdom
- succeed
- jokes
- frenchman
- liability
- workmen
- executives
- marijuana
- surface
- lengths
- fondue
- cheddar
- watermelon
- saucepan
- lukewarm
- cookbooks
- collected
- saran
- hollow
- warming
- spa
- bathing
- incur
- institutions
- freshmen
- sinking
- description
- graduates
- nelson
- commerce
- recruiting
- homemaker
- cri
- ankle
- install
- sympathy
- burnt
- episode
- awesome
- scandal
- grasp
- multiple
- fonda
- tolerance
- enforced
- lighter
- enemies
- gentle
- avoided
- approaches
- sheep
- grace
- reserve
- claimed
- abusing
- borrowing
- servants
- stops
- moist
- ass
- kin
- trimmed
- varieties
- experimenting
- mashed
- foo
- barbecued
- barbecues
- marinate
- manages
- sacks
- giant
- pact
- confused
- stepping
- seams
- michener
- blooming
- stewart
- tim
- rebel
- grammar
- yankee
- restriction
- biblical
- paychecks
- request
- stable
- diego
- lush
- ga
- limb
- flooded
- strokes
- animated
- muddy
- sharks
- quantum
- partners
- deedee
- formula
- subtle
- solved
- tow
- bounds
- rooting
- championship
- toronto
- ontario
- cabbage
- cantaloupe
- siding
- twist
- sirens
- reminded
- affluent
- bee
- captain
- tackle
- advancement
- isolated
- destroying
- foggy
- regulating
- cigarette
- linguistics
- canadian
- payless
- cashways
- bucket
- cereal
- maxed
- rally
- richards
- convention
- everytime
- mar
- dairy
- doubts
- pursuing
- flight
- crew
- oops
- misses
- amazingly
- punished
- suited
- flexibility
- rehabilitate
- deduction
- debit
- executive
- requested
- implemented
- disadvantage
- shoddy
- naive
- moscow
- marcos
- shoots
- blessed
- cad
- noon
- formed
- bargains
- circuit
- dissertation
- serviceable
- roughing
- cots
- condo
- poles
- locks
- ob
- hearts
- passover
- seder
- catholics
- attacking
- syrian
- bagels
- affairs
- iranian
- ideals
- dividend
- voluntarily
- devote
- performing
- pipes
- arteriosclerosis
- nonexistent
- torn
- outfits
- prejudice
- invited
- remembering
- remedial
- certification
- textured
- insides
- tone
- tornados
- exxon
- brain
- photographer
- audit
- mainframe
- jet
- upgraded
- baghdad
- scheduled
- receptacles
- continual
- potentially
- prestige
- perceived
- trivial
- broader
- sided
- claims
- adjustment
- tread
- richland
- discouraged
- stepdaughter
- sacrificed
- possession
- castroville
- timer
- shady
- lehrer
- editorial
- embroidery
- envelopes
- continuous
- typing
- claude
- aging
- attending
- trainable
- watered
- composition
- dis
- disabilities
- intentions
- inter
- gay
- facing
- interviewed
- seasonal
- patch
- peculiar
- rec
- brilliant
- invest
- payday
- buddies
- wiped
- indoors
- fiddle
- inspect
- peel
- hors
- impress
- ridden
- objects
- surprisingly
- servicemen
- teeny
- equitable
- tier
- stair
- targets
- knocked
- accuracy
- impressive
- cycles
- writers
- rehabilitated
- fleet
- drops
- quarts
- peeve
- sa
- pregnancy
- meets
- campsite
- specialized
- indicated
- beings
- obnoxious
- stereotype
- communist
- sway
- soviets
- monetarily
- circle
- blah
- carnival
- outs
- indication
- gigantic
- ownership
- feeds
- latch
- pansies
- cau
- screened
- references
- tabs
- steamed
- blueberries
- desserts
- sandwich
- slices
- mba
- describing
- duke
- mechanics
- secorski
- financing
- punishments
- whack
- addiction
- '7'
- specials
- climbing
- shells
- spectrum
- ins
- ants
- painter
- painters
- noises
- rats
- sequel
- rocky
- stallone
- pai
- exterior
- afterward
- greasy
- builders
- intervention
- solving
- appliances
- fu
- hesitant
- incorrectly
- lizards
- bats
- evils
- refugees
- permission
- dive
- instituted
- parked
- landry
- scope
- eagles
- cows
- orders
- tokyo
- subway
- remorse
- heinous
- manufacturer
- occupation
- neal
- brushes
- manhattan
- stud
- leftover
- coll
- rifles
- shelf
- robbed
- temporarily
- inconvenient
- limitations
- spelling
- precise
- commodore
- specifications
- belief
- aggravates
- nev
- bites
- knox
- overheard
- rows
- frederick
- pointed
- stu
- rusty
- reelected
- loses
- pretend
- symptoms
- biography
- destroys
- delicate
- speakers
- happier
- grub
- raiser
- petroleum
- menial
- jeff
- blink
- recommending
- diner
- streep
- copper
- explosives
- disappear
- cosmopolitan
- swimmer
- vogue
- felon
- converting
- bolts
- ross
- ro
- reject
- outfit
- automotive
- mexicans
- envious
- risking
- shifts
- cylinder
- gaining
- tragic
- expressing
- expression
- chilly
- yorker
- dall
- deny
- bonuses
- lucrative
- congressmen
- portray
- needing
- scallops
- susan
- protein
- gained
- baking
- academically
- kenyon
- admissions
- sciences
- provides
- preparation
- logical
- cage
- owed
- devastated
- despite
- pillsbury
- surrounding
- prosecution
- liable
- limitation
- writes
- follows
- nash
- paso
- juice
- reusable
- procedure
- vegetation
- bach
- delivery
- rapes
- thou
- contemporary
- brookhaven
- heater
- curiosity
- fuse
- assembly
- limestone
- danger
- ferry
- ducks
- pilgrimage
- annoyance
- seniority
- ben
- partner
- executed
- healing
- darker
- diff
- routes
- touring
- footage
- abandoned
- retain
- warped
- leslie
- mockingbird
- tricky
- steep
- overwhelming
- killers
- calendar
- faculty
- bingo
- fog
- rationing
- visas
- awareness
- howard
- repairing
- bathrooms
- upside
- symbol
- conception
- veteran
- daylight
- babysitters
- valentine
- ideally
- driveway
- digest
- danielle
- severely
- confident
- idaho
- searched
- appointment
- givers
- pappasito
- dillard
- expertise
- tasty
- publisher
- reruns
- soaps
- repaired
- theatre
- cedar
- mainstream
- refer
- tina
- secure
- rockets
- loo
- contacts
- carpooling
- appalachian
- adventurous
- hostages
- fatal
- patients
- '2'
- sunfish
- donated
- shepherds
- joey
- treats
- researcher
- unnecessary
- stucco
- payroll
- scan
- conductors
- versed
- midway
- beard
- princess
- naked
- custom
- mount
- marshmallows
- mommy
- committee
- allegedly
- tap
- woodstock
- routinely
- rod
- tuesdays
- patterned
- czar
- donald
- booked
- intent
- granddaughter
- chips
- sedan
- discounts
- inn
- dent
- crib
- deliver
- schutzhund
- alsatian
- refused
- nola
- grapes
- marinated
- maxima
- oahu
- conferences
- newly
- kauai
- maui
- hunters
- concentrated
- bakery
- hay
- sleeve
- niro
- builder
- curtain
- spain
- crust
- intriguing
- reimbursed
- licenses
- physics
- reaches
- donahue
- cruises
- nassau
- olives
- lodge
- grandsons
- acoustics
- waves
- uniforms
- fancier
- mesa
- dalmatians
- soapdish
- mushroom
- milwaukee
- violin
- harpsichord
- rumor
- disneyworld
- thinner
- carolyn
- risque
- saxophone
- jodie
- hopkins
- credibility
- barbies
- motel
- wendy
- broncos
- chico
- troop
- warranties
- picky
- aberdeen
- solicitors
- autumn
- nevada
- marlin
- operations
- exhibit
- shuttle
- wycliffe
- sheltie
- particulates
- colombo
- duties
- burner
- hometown
- permits
- contributions
- astronomical
- attire
- blazer
- critics
- omaha
- disturbs
- politeness
- polite
- presumably
- conscience
- canceled
- respects
- norms
- rang
- solicitations
- gossipy
- obtained
- frequency
- turf
- soliciting
- medications
- chow
- smiling
- leash
- acts
- gin
- dispute
- reactions
- intimidated
- alm
- inundated
- switches
- influenced
- rhythm
- sim
- mus
- jimi
- hendrix
- pitiful
- promise
- simon
- qualities
- achieve
- unexpected
- alw
- loaned
- quota
- holler
- leeway
- pains
- wing
- coordinated
- spelled
- skid
- counsel
- violation
- actu
- modeling
- lyrics
- oldies
- phil
- collins
- criticize
- suggestions
- petting
- farms
- exit
- determination
- preservation
- ted
- teddy
- underclass
- considerable
- watcher
- gathering
- sexually
- justified
- territories
- capita
- carefree
- taxing
- weak
- territorial
- resist
- attempts
- craze
- uni
- subscribed
- tractors
- regulated
- cal
- organic
- weaponry
- tanks
- offender
- cured
- slave
- foul
- flipping
- shades
- acclimated
- squares
- tapped
- jerusalem
- fearful
- interrupt
- interrupted
- erase
- monterey
- jose
- ram
- supplement
- standardized
- overtime
- amazes
- circumstance
- summons
- conservation
- indestructible
- littlest
- missionary
- wrapped
- ellen
- toyotas
- preferences
- rag
- straw
- wallpapering
- hoe
- vo
- tubes
- dulles
- incoming
- eldorado
- coun
- tenure
- evaluation
- assigned
- flatter
- chickens
- curry
- overextended
- compl
- housewife
- simmer
- yarn
- demo
- ensemble
- bas
- transmissions
- frivolous
- sessions
- grind
- ranges
- quits
- disconnected
- substances
- etched
- notion
- redeeming
- grabbing
- scrape
- por
- funniest
- rotted
- harvest
- adaptations
- mining
- incaviglia
- excess
- exhibition
- da
- nightmares
- biscuits
- echoes
- actress
- believable
- drafted
- truman
- snider
- extend
- planet
- packing
- dumpsters
- awakenings
- deniro
- actors
- ser
- garp
- attacked
- ralph
- rapid
- agreements
- forests
- polluters
- penalize
- undergrad
- output
- sensational
- failure
- fattening
- catered
- brownies
- crock
- downy
- delta
- cooled
- duplicate
- clearing
- pheasant
- genuinely
- capability
- shield
- agenda
- coup
- briefly
- context
- governors
- irish
- reserved
- collectors
- ole
- antique
- eights
- irate
- noticing
- solo
- shipped
- dramatic
- grateful
- segments
- updates
- trite
- platter
- inc
- incidences
- estimate
- walter
- cronkite
- mold
- efficiency
- spouses
- widely
- redskins
- lynn
- deaths
- observe
- educators
- nother
- visual
- graded
- objectives
- principals
- passes
- poli
- interaction
- prescribed
- breakthrough
- fake
- fears
- web
- housewives
- awake
- reservations
- suggestion
- genre
- innovative
- umbrella
- annoyed
- myth
- proportion
- generational
- exams
- gung
- essential
- pushers
- cathy
- sassafras
- dye
- barn
- outlets
- hollering
- dents
- scratches
- layers
- swiss
- cauliflower
- trays
- pans
- boiling
- vanilla
- custard
- unsweetened
- spoon
- freons
- officials
- disaster
- contributor
- analyzing
- respiratory
- powered
- desired
- trainer
- butt
- psychological
- majors
- staggering
- hamilton
- tracy
- protesting
- prejudices
- dale
- willie
- summoned
- questionnaire
- skipped
- bail
- hebert
- mangione
- breeze
- fairer
- regulations
- seriousness
- darkness
- remem
- judith
- dedicate
- owes
- domino
- insured
- backing
- risks
- devalued
- magnitude
- taped
- breakdown
- beep
- murderers
- murderer
- insanity
- slap
- wrist
- merry
- reinstated
- atrocities
- prayer
- premature
- pushes
- offend
- ridiculously
- bind
- identity
- bombed
- keepers
- deducted
- offset
- owing
- giveaway
- immigrants
- seeking
- insects
- daffodils
- bud
- dandelions
- plagued
- tiller
- trie
- plum
- fescue
- dries
- greenbelt
- cracks
- smokey
- megahertz
- samna
- proficient
- poison
- reused
- mash
- heights
- lone
- vicksburg
- handful
- futuristic
- patrick
- foggiest
- soldier
- buckets
- tot
- immigrate
- render
- fab
- principles
- payoff
- incinerators
- smelled
- ozarks
- disappeared
- tad
- tiers
- glance
- enlightening
- nashville
- fellows
- communicated
- catalog
- insight
- spoke
- flounder
- padre
- aransas
- dingy
- marriages
- becky
- squeezed
- triple
- caribbean
- bees
- lilac
- overhead
- static
- lumber
- juan
- irresponsible
- bold
- carmel
- smarts
- surf
- snappers
- snapper
- described
- aetna
- medi
- irving
- provided
- wells
- romania
- resort
- affords
- printing
- seminar
- thaw
- payoffs
- persuade
- judeo
- litigious
- opponent
- underdog
- equate
- fred
- divided
- separately
- turnover
- descent
- filet
- sole
- jerk
- therapy
- companions
- dresser
- explained
- hush
- agrees
- aff
- drama
- at&t
- modest
- bef
- prep
- vocational
- col
- inevitable
- atomic
- disadvantages
- distracted
- measurement
- arrogant
- clientele
- jelly
- biting
- acceptance
- fir
- overdue
- optima
- suckers
- honored
- chevrolet
- taurus
- recreational
- campers
- shines
- holly
- mattresses
- elastic
- hectic
- volunteered
- heartbreaking
- bargaining
- forgive
- adamant
- moderates
- egypt
- muslims
- palestinians
- poem
- naps
- demonstrations
- restless
- underlying
- dissatisfied
- proposing
- upbringing
- outlook
- quilting
- amish
- acreage
- eyed
- motivates
- vitamin
- drilled
- extensions
- quantities
- carson
- doses
- experimented
- chlorinated
- rode
- nationalities
- exam
- memorize
- readers
- scales
- grain
- matching
- explains
- semigloss
- marks
- experiencing
- upbeat
- connections
- dah
- seated
- alley
- uncertainty
- hoot
- itemize
- processors
- portable
- hewlett
- rival
- rugged
- decks
- printers
- obsolete
- quitting
- approximately
- martin
- achieved
- tact
- disappointment
- trusting
- corrected
- opted
- perjured
- barred
- script
- ironic
- witnessed
- answered
- dependents
- mobility
- preventative
- lung
- carrier
- filed
- pissed
- offensive
- opinionated
- textbooks
- forbid
- advertisement
- cordless
- porcelain
- sandy
- tracks
- amateur
- sings
- contraceptives
- luxuries
- continually
- perennials
- arriving
- bows
- ribbons
- designs
- bunny
- ink
- canvas
- crewel
- decorations
- victorian
- stiffen
- uncommon
- compensate
- typed
- correcting
- frustrations
- acted
- rumors
- lebanese
- newsmen
- chemistry
- tw
- literacy
- jackson
- macho
- hint
- cer
- cutbacks
- slogan
- preserving
- trigger
- greenhouse
- plattsburgh
- digital
- sane
- boost
- vacationing
- stationed
- slope
- attach
- starving
- distant
- mideast
- bureaucratic
- bearing
- nightline
- eng
- centuries
- decking
- crawling
- buds
- vine
- chops
- guest
- sucks
- tails
- '''oeuvres'
- cooks
- elegant
- crumbs
- crunchy
- bouillon
- 20/20
- cord
- irritated
- luggage
- climates
- richer
- civilized
- israeli
- jazzercise
- ego
- exer
- leaned
- firearm
- firearms
- twirling
- edited
- dribble
- accidental
- resale
- trading
- strangely
- cutlass
- semesters
- recipients
- recipient
- pathetic
- import
- partnership
- ambition
- disciplined
- prenatal
- peru
- thir
- filters
- tourists
- canadians
- panamanians
- initiate
- concentrating
- cellular
- awkward
- aw
- sanitation
- kuwaitis
- accomplishment
- defend
- amy
- sunshine
- hurricane
- flood
- muggy
- royals
- pitchers
- nat
- indicator
- lineup
- knives
- publishing
- laptop
- search
- significance
- chains
- jonathan
- petunias
- blooms
- stitches
- fruits
- righ
- opportune
- tang
- inspiring
- incomes
- ferraro
- isaiah
- alma
- mater
- dominant
- greed
- hud
- pit
- bounced
- installation
- stinking
- forgets
- morally
- millionaire
- observer
- restrict
- ancestors
- kitchenette
- neatest
- miniskirts
- grandmothers
- feminine
- marching
- bizarre
- overboard
- gu
- neon
- tints
- condominiums
- walt
- crummy
- flake
- woodwork
- widespread
- worldwide
- bow
- contrast
- vocal
- removing
- passive
- colonies
- bury
- presence
- quietly
- whichever
- vacant
- equity
- litters
- fin
- aquarium
- commands
- anticipate
- resulted
- ranches
- repentance
- mas
- olympic
- wicked
- climbed
- stretched
- explaining
- wayside
- combinations
- carpets
- str
- tickled
- tinted
- carmakers
- sporty
- miata
- authentic
- demands
- parkway
- gabriel
- shannon
- patriot
- mansion
- alan
- blessing
- catnip
- bombay
- himmy
- champion
- gloves
- devon
- curly
- mice
- associations
- haired
- qualifications
- attracted
- irritating
- cops
- irks
- ron
- relation
- germantown
- hondas
- skins
- errands
- pigs
- substituting
- spoil
- butts
- experts
- markets
- hong
- kong
- tens
- conflicts
- bangladesh
- prevention
- barrels
- lily
- humongous
- azaleas
- fielder
- cubs
- pri
- aft
- kinder
- callers
- capone
- arsenio
- flatliners
- scheduling
- threads
- bedspread
- lobby
- mckinney
- spaced
- ethical
- expenditures
- recovery
- sitters
- reader
- authors
- scraping
- backlash
- estes
- sensitive
- taxpayers
- fisherman
- soul
- lures
- hea
- propose
- reinforcement
- exempt
- pendulum
- applies
- flea
- skilled
- petty
- brochures
- bussed
- african
- glen
- godfather
- sooners
- hump
- summit
- strengthen
- meaningful
- steamer
- sprinkle
- skillet
- teflon
- passion
- increasingly
- privileges
- constitutional
- thousandths
- motorcycles
- eighths
- annoys
- horizon
- tooling
- essence
- decimal
- inherited
- fifths
- sweatshirts
- blouse
- programmer
- fashions
- taiwan
- keyboard
- unpopular
- plumber
- sucker
- transporting
- indifferent
- shallow
- undo
- seeming
- kilograms
- dates
- propaganda
- confidently
- badge
- clipper
- steelers
- temperament
- scoring
- warren
- proving
- arthritis
- revenue
- scheme
- os
- wholeheartedly
- unknown
- capacity
- noodles
- instincts
- lecture
- stanford
- unlike
- academics
- cannon
- instinct
- stereotypical
- mac
- firepower
- mug
- antenna
- denton
- psych
- hamsters
- smelling
- expenditure
- dec
- diploma
- radioactive
- packaging
- detect
- stream
- particles
- cattle
- creeks
- alaskan
- roam
- booster
- contagious
- scientist
- wednesdays
- shopper
- species
- tribes
- underpaid
- ambience
- texture
- enthralled
- mel
- presidents
- consultants
- persons
- sweaty
- speaker
- subsidy
- lies
- ano
- offenses
- housekeeper
- hottest
- firewheel
- salisbury
- hams
- locking
- prosecuting
- gettysburg
- arena
- openness
- duplex
- fords
- carburetor
- cap
- notch
- overlap
- dash
- vegetarians
- cleanliness
- vegan
- bodies
- utilize
- coo
- hens
- ballpark
- kicking
- getaway
- des
- vitelle
- a&m
- oriental
- yellowstone
- lion
- rio
- grande
- marble
- jealous
- ruins
- objecting
- fireman
- malicious
- compensation
- executing
- falsely
- statistic
- meanwhile
- storing
- internship
- cooper
- clinic
- cardiovascular
- rotate
- picturesque
- biggie
- killeen
- purebred
- virus
- affection
- caravan
- storage
- libber
- heated
- shrubbery
- supportive
- unacceptable
- appalled
- reimburse
- explorer
- middlekauff
- stiffer
- disneyland
- amusement
- solely
- lafayette
- allies
- liars
- masses
- majored
- discriminated
- valid
- lonely
- smile
- consists
- lisa
- floods
- historian
- societies
- eater
- rewiring
- praised
- openly
- logically
- nest
- pap
- supporter
- runner
- moth
- devastate
- mediocre
- excel
- insist
- halloween
- toning
- dramas
- shakespeare
- multimillionaire
- supervise
- imports
- inferior
- wallet
- dwell
- po
- iguana
- br
- twentieth
- assertive
- chewing
- freelance
- reputable
- avenues
- smoothly
- avenue
- classify
- spices
- tort
- riots
- methods
- textbook
- sprayed
- wiring
- busting
- minimal
- youngster
- manner
- fringe
- beeper
- pill
- spraying
- heavens
- splitting
- maturity
- cues
- nineteenth
- velcro
- cole
- codependency
- losses
- worlds
- representation
- roller
- maternal
- franchise
- bones
- quickie
- resorts
- inept
- tossed
- superior
- enthusiastic
- stripper
- eth
- shotguns
- vital
- mutual
- laura
- lotion
- accumulate
- dime
- unfinished
- toned
- treatments
- rust
- instruction
- productivity
- wherewithal
- indigent
- employ
- medicaid
- desperately
- equipped
- alto
- jerker
- christopher
- reeves
- climb
- mastercards
- beaver
- champions
- pines
- berries
- dutch
- shou
- cathedral
- constructed
- rainfall
- chased
- tossing
- peonies
- hardy
- divorces
- drank
- tan
- sunburn
- interfere
- fo
- custody
- bottoms
- guidance
- flew
- jar
- eisenhower
- bitter
- motivational
- presidency
- leaps
- noriega
- tunnel
- anger
- roger
- mis
- universe
- bargained
- interviewing
- potluck
- trump
- hyacinths
- purply
- mugged
- paroling
- int
- avon
- spectator
- deeply
- amou
- crepe
- pile
- toll
- dependable
- cavalier
- squish
- drinks
- census
- pell
- vienna
- waitresses
- ultra
- regency
- progressing
- retrievers
- prompt
- brisket
- reliability
- graveyard
- submit
- reception
- watercolor
- jan
- shanghai
- effected
- micro
- satisfying
- preston
- broiled
- violated
- appealed
- martha
- melodies
- speaks
- squad
- cutback
- texasville
- breathe
- homemakers
- dreyfuss
- spit
- presumed
- cra
- coordination
- irons
- perry
- stepmother
- ambulance
- deteriorated
- bunk
- flan
- vinegar
- pies
- happiest
- wheeling
- geriatric
- cockapoo
- rabbits
- ignored
- earnings
- pencil
- taller
- glorified
- sch
- eyre
- sung
- madam
- butterfly
- puccini
- canoeing
- receptive
- jackie
- gymnastics
- im
- steadily
- ronald
- brownwood
- temple
- substantial
- les
- broadway
- orthodontic
- verge
- orthopedic
- silverton
- drafter
- drawings
- unbiased
- equals
- secretarial
- overturned
- thelma
- louise
- tacky
- chipped
- sledding
- ambulatory
- reluctantly
- adequately
- cheryl
- hearty
- skim
- thai
- lunches
- molestation
- releasing
- sketch
- subscriptions
- upright
- paddle
- appliance
- tops
- pant
- gail
- centralized
- claus
- earns
- coit
- orchestras
- breasts
- chill
- punk
- '101'
- rebate
- perkins
- fluffy
- parker
- coppell
- bleeding
- pittosporum
- thumper
- carney
- trailers
- eager
- signature
- whoops
- discovery
- macaroni
- golfing
- superbowl
- tease
- includes
- desperate
- entitled
- dill
- suing
- semiautomatic
- cuddle
- legislate
- hubbard
- screams
- competitiveness
- mechanically
- jesuit
- duh
- haiti
- constituents
- ordering
- striped
- bonham
- donna
- du
- nist
- sheet
- sergeant
- rebuilt
- spy
- thorough
- fame
- hydrocarbons
- nitrogen
- ville
- manufacturers
- mats
- algebra
- glossy
- pathology
- towncar
- missions
- mat
- gut
- precaution
- kenosha
- pianos
- commissioners
- exemptions
- daytona
- holder
- gloss
- exploring
- hatchback
- abuses
- royalty
- rehearsals
- meg
- boise
- barbie
- radial
- lathe
- distributor
- parakeets
- chimney
- telecom
- bran
- piedmont
- howse
- duncanville
- admitted
- warriors
- marketplace
- dunn
- bradstreet
- vivaldi
- boutique
- decorative
- volume
- honeywell
- quicken
- strengthened
- quantity
- hinge
- cumbersome
- qua
- transport
- makings
- seal
- entitle
- opacity
- abouts
- forum
- ductwork
- shave
- interchange
- ber
- scruffy
- critic
- trivia
- sharon
- invitation
- astounded
- effectiveness
- insulted
- conspiracy
- paranoia
- surmise
- latches
- invading
- knocking
- ritual
- introducing
- click
- occurrences
- summed
- absenteeism
- errand
- discrimination
- improving
- uncertain
- suspicious
- detectors
- hammer
- royalties
- hideous
- militant
- objections
- absurd
- frampton
- performer
- eclectic
- listener
- ravi
- shankar
- spreadsheet
- dedication
- mardi
- gras
- straps
- convincing
- carl
- casually
- horrifying
- litigation
- retention
- dusty
- regulars
- texteller
- stripe
- tipped
- pastel
- pallet
- patent
- spin
- coul
- southbend
- variable
- intended
- workplace
- inputs
- toured
- reich
- genesis
- bottomed
- shoul
- devoted
- detriment
- manipulating
- softly
- alleged
- accuse
- exploiting
- cuba
- starve
- hun
- ashamed
- connery
- dwarf
- favors
- freer
- imposed
- demanded
- natives
- representative
- undoubtedly
- abou
- melting
- clinging
- quebec
- mountaineering
- implies
- fads
- institutes
- newsletter
- orientation
- meditation
- desks
- laborers
- keyed
- enc
- incorporated
- predominant
- intending
- trafficking
- aghast
- frito
- artistic
- kits
- pinks
- kit
- lilly
- greens
- stocking
- selections
- chapel
- percentile
- stabilized
- illegally
- errors
- nasa
- quaint
- mem
- supplemental
- applaud
- competitors
- generous
- repayment
- celebrated
- negatives
- ind
- privately
- brutal
- hoped
- slim
- administrating
- latter
- nickname
- customs
- defeating
- gadgets
- bluegrass
- pizzas
- anderson
- predominately
- standings
- moore
- pennant
- pirates
- appraised
- overpriced
- longevity
- satisfy
- resell
- editing
- availability
- prohibit
- janitors
- endurance
- mutually
- supervisory
- quotas
- swampers
- laborer
- happ
- mushrooms
- consisted
- terr
- siren
- alarms
- jamaican
- knitted
- granny
- moderate
- carpentry
- candle
- contributors
- ai
- comply
- helicopter
- sting
- nitrous
- chemist
- unseasonable
- ust
- nostalgic
- calligraphy
- tidbits
- mcgyver
- inventing
- baling
- washers
- junkyard
- portraying
- invented
- attempting
- innings
- ke
- weaned
- meows
- docile
- traumatic
- secretive
- daisy
- hype
- mimic
- predicting
- fictional
- swamp
- margin
- teasing
- crosses
- dang
- dumpster
- openings
- recycles
- imaginable
- folded
- straightened
- reminding
- settlement
- beaten
- ramifications
- margaret
- thatcher
- gandhi
- volcanos
- rhode
- residue
- pitted
- comeback
- nader
- volcano
- indicates
- previously
- regulatory
- arrows
- zoom
- calculate
- yugo
- pricing
- dos
- pastor
- sauces
- coleman
- sacramento
- backpacked
- undeveloped
- opposition
- negotiate
- factions
- refreshing
- reveal
- occupy
- responding
- tunes
- jigs
- instrumental
- mickey
- wills
- nickelodeon
- fl
- shenandoah
- flimsy
- programmers
- mentioning
- irritates
- aspen
- contel
- demonstrated
- surrogacy
- crass
- nurturing
- donation
- auction
- shelters
- bedridden
- gals
- '''am'
- factual
- nightly
- chancellor
- gaps
- newscaster
- excerpts
- rises
- choi
- assisted
- deteriorate
- sponsor
- caretakers
- supplemented
- possessions
- signing
- sectioned
- zones
- vikings
- hart
- educator
- beg
- initiative
- administrations
- maj
- sabbatical
- minuscule
- referring
- hourly
- gardened
- remotely
- shack
- broaden
- ivy
- couches
- careless
- anybo
- oreo
- twisted
- actresses
- kenny
- columbus
- disrupted
- mistrial
- chooses
- confession
- placing
- inception
- insure
- burglars
- jacques
- lewis
- chagrin
- ame
- preferably
- loudly
- epileptic
- aftermath
- snob
- broadened
- expectations
- swore
- amphetamines
- endangering
- hassles
- splotches
- scratching
- dread
- hardwood
- toothbrush
- proclaimed
- nicks
- breads
- chunks
- quart
- slender
- blender
- thickens
- thickened
- thicken
- cooling
- leaded
- endorse
- caprice
- converters
- arguable
- lit
- meteorological
- circulation
- lungs
- focal
- volkswagen
- pinned
- fulfilling
- obligations
- belonging
- wealthier
- adulthood
- functioning
- monster
- wandering
- ropes
- appreciation
- confess
- tolerances
- pete
- arnett
- sporadically
- impartial
- diversity
- affiliate
- cutesy
- beeped
- moody
- wonderfully
- vowed
- booklets
- recruit
- courthouse
- strangled
- testify
- neurotic
- crooked
- bracelet
- instructed
- whereabouts
- bracket
- koontz
- bachman
- letterman
- hologram
- pitches
- speculative
- deregulation
- teapot
- vaguely
- hoover
- pennies
- nickels
- investors
- holders
- asphalt
- charts
- kathy
- walkman
- simmons
- rapists
- manson
- repealed
- thousandth
- pac
- kingdoms
- ruler
- scriptural
- elses
- discernment
- walters
- wiley
- communists
- assaulted
- compensated
- medicines
- rude
- returns
- indebted
- deli
- strings
- crabgrass
- slimy
- tempered
- standby
- surgeon
- pruning
- undertaking
- irrigation
- leafy
- remain
- flowering
- chick
- lem
- humus
- barbe
- stoves
- flame
- grease
- tortillas
- turkeys
- smoked
- hickories
- spreadsheets
- specs
- montana
- hazards
- crash
- burlap
- coupon
- subtract
- compost
- branches
- heed
- staunch
- withstand
- buffers
- scuds
- provinces
- merely
- demilitarize
- confusing
- sucked
- incomprehensible
- disarm
- socialism
- boris
- nationality
- nut
- sabine
- consequence
- wade
- camps
- kingsley
- centennial
- canton
- dinky
- proclamation
- mason
- dixon
- seller
- avalon
- chilling
- wits
- characteristics
- tuberculosis
- wafer
- linear
- mismanaged
- outraged
- breyiana
- demos
- boggles
- contaminated
- refineries
- desires
- delaware
- caves
- fading
- anythi
- pantry
- crushers
- hallways
- casualties
- magnified
- tones
- questionable
- andy
- creatures
- extends
- fork
- spills
- degrading
- spark
- probab
- hints
- stereotypes
- romanticize
- thugs
- beaumont
- predictions
- barring
- substantially
- separates
- zealous
- farmhouse
- pumpkins
- planter
- creosote
- landlord
- brushing
- rose
- cantaloupes
- cubic
- wary
- youths
- hostilities
- judging
- burlington
- confronted
- slit
- divisions
- rash
- monterrey
- objective
- hamper
- grouper
- oysters
- tiring
- canals
- grabs
- grabbed
- dogfish
- antibiotics
- commuting
- deprived
- clinics
- infections
- enrolled
- rigid
- fined
- mills
- deceiving
- surroundings
- paths
- motive
- motivations
- upwards
- bundled
- doubling
- financed
- integrity
- benefitted
- perceive
- unfairness
- wiser
- segment
- vengeful
- pitifully
- massively
- respon
- represents
- speeches
- slapped
- inflammatory
- atrocious
- blitz
- zoning
- wholesaler
- turnovers
- argentine
- microwaves
- waxed
- flakes
- purplish
- cubes
- sherry
- argentinean
- sausages
- breaded
- publications
- thesis
- disgruntled
- cries
- replaces
- belongings
- roaches
- overhaul
- uniform
- discretionary
- emotion
- hence
- fines
- documentary
- dealings
- declaring
- dire
- squirrelly
- miscellaneous
- nd
- deposited
- scurried
- skaggs
- endangerment
- assumes
- endanger
- endangered
- accidentally
- suspicion
- continents
- ingrained
- confuse
- trans
- centimeter
- measurements
- peanut
- kindercares
- alphabet
- scold
- inappropriate
- trauma
- weath
- predictable
- inversions
- threesome
- novice
- rut
- yo
- delightful
- ferrari
- resembled
- satellite
- bathed
- jacuzzi
- wings
- fastest
- ant
- kitchens
- dented
- refresher
- kosher
- knishes
- mea
- unstable
- relevant
- americanized
- hugged
- scam
- apologize
- hug
- shiite
- poss
- wheth
- countrymen
- wom
- implementing
- decreasing
- finland
- selfishness
- benefited
- mil
- flunk
- canning
- zinc
- processed
- bogged
- distributed
- moderately
- companion
- organs
- sally
- petite
- isometrics
- ingestation
- plight
- surrounded
- directing
- coed
- subbing
- calculator
- behaved
- versatile
- applicable
- depot
- spackling
- creamy
- similarly
- formative
- contacting
- aptitude
- sounding
- upkeep
- cellar
- rents
- complexes
- nanny
- prefabs
- enou
- scoot
- emulate
- guru
- auditors
- packard
- matrix
- transparencies
- outdated
- advisor
- panhandle
- piling
- shredded
- pessimism
- racism
- destined
- fronts
- hippie
- texaco
- pennzoil
- miscarriage
- rational
- testimony
- testifying
- paralegal
- priors
- aggravate
- enlightened
- niceties
- flop
- horrified
- absence
- taxation
- flabbergasted
- gracious
- flops
- certificate
- explanation
- univer
- dustbuster
- plated
- bowls
- patty
- womb
- soothing
- repetitious
- wilder
- eleventh
- painless
- necessities
- harm
- magnolias
- raking
- underground
- grasses
- blend
- macneil
- jennings
- informative
- bureaus
- comics
- mourning
- lace
- weave
- lacy
- draping
- batting
- anticipating
- splurge
- deci
- typist
- damme
- bland
- widow
- dummies
- caan
- rescuers
- submarine
- studio
- survived
- einstein
- stepson
- literate
- honors
- lifesaver
- framing
- hindsight
- incidents
- outsiders
- jesse
- complains
- threatens
- entrepreneur
- achievement
- clue
- sights
- transplant
- glamorous
- uncontrollable
- constitute
- denial
- champlain
- resume
- technicians
- fad
- timid
- macon
- hous
- espec
- contacted
- liquor
- repairman
- popped
- radishes
- turnips
- loam
- intensive
- attachment
- pickles
- unfairly
- seasonings
- paralyzed
- spinal
- discrete
- seatbelt
- arrow
- reuse
- collects
- dorms
- perimeter
- orthopedist
- freak
- diane
- diver
- limping
- tights
- casts
- nautilus
- cushion
- singled
- tighter
- lonesome
- naw
- everyb
- imitate
- oscars
- booth
- demographic
- judgments
- texins
- crest
- demonstrator
- reps
- partying
- tracking
- perpetuate
- manpower
- coincide
- cl
- soreness
- nighttime
- evacuated
- winnebago
- benefiting
- incidence
- abundance
- creature
- aim
- shah
- felons
- unseasonably
- comparisons
- waning
- surviving
- diplomacy
- eliminating
- processes
- righteous
- filtered
- launch
- unmet
- strife
- ray
- blatant
- fax
- proactive
- buil
- treaty
- bully
- repay
- swallow
- evolve
- tug
- skewed
- intersection
- trampoline
- downs
- cy
- swept
- streak
- averages
- catches
- tigers
- strategy
- bayless
- advised
- brunt
- rooted
- dseg
- documentation
- floppy
- disks
- hus
- touchy
- linda
- rossa
- teen
- boo
- livingston
- seagull
- wro
- midland
- odessa
- practiced
- fur
- contra
- haunt
- resentment
- laughable
- arises
- browns
- topping
- toast
- mustard
- cucumber
- bonanza
- meta
- rearing
- robinson
- cylinders
- akeem
- dominate
- reselling
- jap
- wichita
- galen
- amrein
- snacks
- elephant
- transferring
- fare
- veterinarians
- wonders
- developer
- breathed
- limiting
- cookouts
- individuality
- frills
- fluctuates
- tastefully
- smashed
- organizing
- dare
- reform
- bri
- gate
- felonies
- ima
- racist
- gripe
- gar
- width
- spreader
- lightly
- freshly
- arthur
- waterfront
- movers
- frames
- enamel
- spun
- descendants
- favorable
- intervening
- advancing
- frightened
- revolting
- upsetting
- acquired
- creeps
- kitten
- teacup
- frustrates
- cheaply
- brunch
- crook
- mock
- primaries
- workday
- chows
- guinea
- harming
- bellies
- rubbed
- terrified
- louder
- lid
- collie
- mechanism
- inspected
- cheated
- fingernails
- uninformed
- disinterested
- honduran
- rica
- tourism
- enabled
- policies
- engrossed
- virgo
- elder
- ricans
- rican
- loaner
- revival
- christianity
- revered
- pyramid
- birthdays
- disciplinarian
- nutri
- stairs
- elevator
- powerhouse
- alway
- rehearse
- patriots
- photo
- guards
- congested
- incarcerating
- foreground
- snatched
- astro
- minivan
- subaru
- ticking
- rack
- upgrade
- retail
- campgrounds
- bearable
- dipper
- addict
- sportsmanship
- describes
- strasbourg
- missile
- bounce
- goll
- humiliating
- chauffeur
- valet
- condemning
- airs
- tithe
- blessings
- foley
- croak
- critters
- turkish
- himalayan
- patches
- paws
- lanky
- hillside
- communicating
- swam
- supervision
- stephanie
- keel
- tuba
- nerves
- turntable
- dual
- processor
- edit
- layout
- preventing
- overloaded
- mentions
- sevren
- montgomery
- piddly
- compressor
- prelude
- impractical
- wharf
- colts
- seahawks
- winners
- champs
- expansion
- attendance
- kites
- strangers
- tasting
- arrangement
- rewards
- interfering
- inhumane
- overtaken
- underwater
- intention
- philippines
- tag
- quarterly
- incentives
- justification
- sorting
- insurmountable
- forestry
- trails
- emphasized
- obtain
- cubicles
- advent
- op
- accurately
- orchids
- dodgers
- brat
- petrified
- circular
- terrifies
- niece
- laughs
- exc
- negate
- rejected
- lawlessness
- founded
- crippled
- perpetrators
- breath
- intake
- valleys
- pencils
- abreast
- ethics
- scandalous
- churchill
- dickens
- withstood
- mindless
- pi
- sincerely
- whew
- spreading
- petersburg
- finest
- southwestern
- cincinnati
- roaring
- perpetual
- lhasa
- scuba
- pampered
- dinosaur
- fires
- ventured
- dooming
- plunked
- cooperated
- adjusting
- decades
- valued
- downstream
- lure
- bumble
- wasp
- squirrels
- popularity
- isolation
- disciplining
- spank
- isolate
- handicraft
- dough
- ornaments
- empties
- posted
- ruining
- kurdish
- roseanne
- matthew
- brando
- levinson
- follower
- marino
- keystone
- cunningham
- tactics
- granada
- cuban
- salinas
- terrorist
- buried
- hyundee
- helicopters
- stepper
- pillow
- staring
- aqua
- blisters
- rubber
- trashed
- dwindling
- cooker
- cherry
- blackening
- gumbo
- portuguese
- ribs
- ya
- jumbo
- initiatives
- revolt
- obliged
- argues
- constrained
- fools
- indoctrinated
- millimeters
- fractions
- fittings
- wrench
- header
- screws
- progressively
- pullover
- smokes
- sw
- othe
- designer
- foolish
- puzzled
- warned
- cab
- tractor
- sixes
- diesels
- injector
- asylum
- governmental
- antiwar
- translated
- soapbox
- usable
- antimetric
- sweden
- midnight
- plains
- collapsible
- helper
- motivator
- huff
- phenomena
- temper
- miami
- cyclical
- oilers
- stallworth
- swan
- oppose
- decisive
- wrath
- constituency
- nuggets
- meatless
- ingredients
- hostess
- soybeans
- proteins
- belton
- pennsyl
- lsats
- als
- sev
- abcs
- especiall
- affordable
- carpools
- symbolic
- scenario
- gunfire
- outlaw
- abiding
- restrictive
- concealed
- sp
- deterrence
- weighed
- objection
- misusing
- impose
- crackdown
- dawn
- liners
- gerbils
- mutts
- counted
- eel
- tiniest
- debated
- symptom
- furnish
- nonsense
- handicrafts
- awarding
- topsy
- turvy
- worldly
- sparked
- reg
- flours
- dublin
- bulldozers
- overflow
- posters
- chained
- tabby
- rampant
- girlfriends
- inadequate
- '8088'
- monitors
- respectable
- secondly
- binary
- calibrated
- qualification
- brackets
- rescue
- passport
- mou
- alcoholics
- returning
- laurie
- clout
- grilled
- buffets
- brunches
- woodland
- colo
- prix
- seagal
- starred
- premise
- preoccupation
- belly
- millimeter
- darndest
- assembled
- hauled
- fertilizers
- prohibited
- facets
- denied
- loaf
- dawned
- boulders
- marbles
- duck
- shish
- odor
- boneless
- scrambled
- armenian
- consume
- punishing
- devil
- suffered
- agreeing
- enforcing
- burglaries
- rationalize
- busiest
- airy
- wires
- compartment
- soldered
- restrain
- overeat
- pastas
- minerals
- accepts
- supplements
- toledo
- oriole
- steeper
- moines
- bleachers
- collapsed
- herbs
- sill
- appleseed
- pecans
- wes
- enterprise
- bulletin
- electrician
- terminology
- gaithersburg
- valedictorian
- pushy
- seemingly
- rockies
- carries
- yells
- breezed
- solicit
- coworkers
- alright
- humans
- bust
- holdup
- underst
- convicting
- restoring
- ankles
- landscaped
- sal
- continuance
- pensions
- allergy
- baxter
- ceo
- homa
- rallies
- anaerobic
- improves
- ls
- adverse
- hunk
- pulse
- resting
- mirrored
- fireplace
- tucked
- condos
- abandon
- dennis
- distributing
- refuses
- glove
- pricey
- passenger
- lowered
- questioning
- dummy
- mans
- occupations
- norma
- techniques
- karen
- spotted
- incompetent
- exper
- priest
- kindergartners
- conform
- creativity
- manners
- mannerisms
- establishment
- norfork
- farthest
- charleston
- hairs
- follicles
- rehab
- fro
- weddings
- graduation
- med
- saudis
- thieves
- chaos
- promotion
- unconditional
- offspring
- quotes
- dumps
- bluebonnets
- absorb
- es
- flash
- medina
- salty
- beirut
- penalized
- lining
- faucets
- repainting
- arrange
- tripping
- ingest
- ingesting
- arteries
- reacts
- framers
- framed
- viable
- supports
- viewpoints
- delay
- nevertheless
- allocation
- infrastructure
- expended
- restock
- twen
- spider
- marigolds
- impatiens
- replacement
- teased
- bacillus
- gypsy
- toddlers
- recommendations
- skits
- attachments
- slacked
- contributed
- bombarded
- mrs
- cleaver
- senses
- romantic
- illiterate
- paced
- ridged
- totaled
- hesitate
- technologies
- stacked
- renters
- counties
- citibank
- scams
- swayze
- clyde
- drummer
- scratched
- demographics
- companionship
- dependency
- everyth
- prospective
- pairs
- unsupervised
- morton
- lu
- offended
- drinker
- measures
- lions
- arapaho
- drool
- yuppie
- cheat
- reinforced
- fashion
- defrosting
- pilaf
- mixing
- mushy
- korean
- auxiliary
- curriculums
- kathleen
- accordingly
- residency
- sportswise
- blitzer
- fanny
- treadmills
- cinema
- dripping
- shorted
- enlarge
- valves
- shingle
- fixtures
- detached
- stigma
- pioneers
- households
- beepers
- bulky
- vibrates
- hepatitis
- freed
- expectation
- boyfriends
- homeowners
- existence
- anguish
- charming
- weathered
- leveled
- wallpapered
- conserving
- diagnosed
- inspiration
- alerted
- swimmers
- extracurricular
- loser
- sats
- barber
- verses
- robber
- dachshunds
- spaniels
- anthropology
- presses
- clerical
- forthcoming
- homecoming
- famil
- familiarized
- virgin
- qui
- divine
- skates
- cot
- shove
- nannies
- objectivity
- digressing
- ordinarily
- weirder
- revolved
- hatchery
- intimate
- calendars
- decoration
- passage
- continuity
- percentages
- cavaliers
- ewing
- highlights
- patience
- bethesda
- beijing
- pooling
- restful
- pends
- dells
- starring
- rage
- terminator
- twists
- treble
- mackerel
- pike
- stung
- fleetwood
- displayed
- freaks
- backs
- buicks
- convertible
- vintage
- setter
- feathers
- conducted
- ethically
- patrol
- kidnapped
- pun
- exceedingly
- albany
- syracuse
- rapist
- investigation
- pamper
- waits
- assistantship
- newlyweds
- hopping
- annually
- journals
- figurines
- sanded
- 4h
- refinish
- hormones
- lip
- fender
- sparingly
- lime
- sands
- upscale
- gum
- rips
- shreds
- sponge
- mate
- averaged
- harvard
- successfully
- approaching
- nutrition
- conductor
- cringe
- mcneil
- criticism
- palo
- columns
- candles
- psycho
- deadly
- uneasy
- robocop
- molly
- savage
- resented
- retrospect
- juggling
- density
- crucial
- oft
- lame
- assaulting
- pleading
- psychiatrist
- psychiatrists
- psychotics
- assaults
- sponsors
- rainier
- snowy
- immune
- tawakoni
- cones
- fearless
- enclosed
- roofs
- sizes
- cei
- furnace
- ambitious
- poking
- fountains
- latitude
- underpass
- hiding
- petals
- slows
- oscar
- durant
- alo
- notorious
- settles
- smoker
- sponsored
- educations
- ele
- approached
- proponent
- thus
- endeavor
- wri
- fingerprints
- slipped
- fingerprinted
- astounding
- intervals
- contracted
- dea
- imm
- soaking
- visitors
- rug
- daddies
- conformist
- revolutionary
- kramer
- celebration
- feeder
- nets
- minnow
- burping
- purina
- parade
- compound
- pursuit
- refuted
- refute
- turnouts
- vi
- relates
- regain
- moats
- staubach
- encountered
- unrealistic
- landon
- portrayed
- josey
- clint
- jot
- baptist
- reflection
- damages
- shortage
- clerks
- doubled
- smallest
- pavilion
- fuses
- alter
- sensing
- bandit
- theatres
- ellison
- activist
- photographs
- hyacinth
- hollies
- spike
- perennial
- gomphrena
- repeating
- minimize
- ornamental
- happiness
- acquire
- congratulations
- simpler
- circles
- wham
- forgiving
- detrimental
- immature
- maple
- myrtles
- screwing
- disguise
- formatting
- paragraph
- voyager
- crank
- pepsi
- mcmahon
- racking
- recharged
- seabrook
- nucleus
- billed
- mints
- adaptation
- crown
- lunchtime
- celebrate
- incident
- shreveport
- limbo
- diaper
- chassis
- bent
- soapies
- bichon
- frise
- personable
- rin
- tervurien
- latchkey
- considerations
- sunroom
- rambler
- sandstone
- beltway
- adored
- surrendering
- cooperate
- allah
- sakes
- stirring
- pineapple
- oatmeal
- casseroles
- bronze
- catherine
- nissans
- escort
- trusted
- insurances
- provider
- postal
- recourse
- invades
- complained
- susceptible
- newhart
- comedians
- contrary
- bart
- simpson
- morocco
- continent
- ripping
- photos
- reef
- melbourne
- squirrel
- agents
- hockey
- christi
- diverted
- pea
- fiasco
- liver
- caution
- expediency
- misplaced
- technicalities
- technicality
- ruffle
- conducive
- sandwiches
- vendors
- pins
- ligaments
- beethoven
- mozart
- softer
- banned
- regime
- liberalization
- civics
- dart
- wasteful
- wounded
- mcmurtry
- trashy
- grou
- grouchy
- projectionist
- subtitles
- intuitive
- footnotes
- footnote
- operator
- lands
- appetizers
- premed
- specialize
- matinee
- cocoon
- alien
- maintained
- sharif
- oddly
- exceed
- incapacitated
- images
- dangerfield
- stacking
- leftovers
- catering
- scooped
- amelia
- anyth
- wolfe
- myths
- haggard
- phonetics
- relearning
- wheelers
- transaction
- checkup
- reserves
- cranky
- measuring
- coating
- cognitive
- jour
- austen
- reviewed
- attracts
- grandchild
- congealed
- soprano
- canoed
- cancun
- bummer
- teenaged
- manhood
- ostracized
- liken
- pear
- daytimes
- ransom
- sightseeing
- gubernatorial
- robb
- receipts
- gambling
- sedentary
- tortilla
- picante
- grated
- jell
- timely
- subjected
- athletics
- bathe
- commercially
- accordion
- miserables
- milkman
- travis
- phantom
- lloyd
- listens
- illnesses
- diligent
- invaluable
- scotland
- jaw
- periodically
- durango
- jeep
- destin
- jetty
- draftsman
- roman
- recognizes
- regarded
- mediation
- crises
- bystander
- awe
- prac
- gannan
- valerie
- addicts
- sayings
- possi
- restrooms
- festival
- alpine
- uneven
- sleds
- knob
- mows
- mulched
- presbyterian
- willingly
- littler
- strategies
- rapport
- walnut
- impersonal
- hack
- cheerful
- emily
- dell
- preschools
- pediatrician
- dane
- tangent
- backfire
- ethiopian
- venison
- fries
- waitress
- waiter
- attentive
- adventuresome
- heyday
- bernie
- dra
- assortment
- piled
- veal
- evident
- unleaded
- ambivalent
- clothe
- rehabilitating
- confessed
- amendment
- xeros
- quartet
- technique
- carols
- mechanisms
- decompose
- murray
- sorted
- dimes
- crusher
- renewed
- prostate
- antigen
- fourths
- smells
- spinner
- baits
- fisherwoman
- imitation
- sticker
- sn
- pantsuit
- pantsuits
- enthusiasm
- begging
- fitting
- harold
- taft
- milder
- gimmicks
- hemorrhaging
- mennonite
- sealer
- premier
- landed
- suites
- invalid
- invalids
- labels
- frugal
- substituted
- legacy
- reside
- partial
- yuck
- balloting
- sibling
- colds
- discontinued
- primitive
- tulips
- hazard
- codes
- zenith
- ques
- slides
- purity
- richie
- bushel
- wines
- napa
- ronnie
- whittle
- satire
- monotonous
- menus
- frankenstein
- blazing
- saddles
- grants
- hitler
- paintings
- specimen
- fussing
- presume
- pollu
- decorate
- kindergartner
- arguably
- cradle
- grave
- fluff
- swings
- queens
- beltline
- thrus
- aerosol
- corny
- fridays
- camry
- elway
- moneys
- exponentially
- crawls
- grieve
- greg
- foresee
- uninsured
- noses
- rudman
- accountability
- proportionally
- gruesome
- couscous
- repercussions
- wimpy
- shortened
- befitting
- nece
- asset
- flushed
- dressy
- slack
- sl
- tro
- bidness
- apiece
- smokeys
- sur
- outlawed
- legislating
- creating
- activated
- steinbeck
- grizzly
- encounters
- doubting
- doug
- ranked
- sierras
- rai
- tempe
- yelling
- explored
- bogey
- burgled
- plop
- pee
- ay
- handyman
- tighten
- loopholes
- withhold
- advantageous
- bueno
- librarian
- coma
- seasick
- minnows
- seas
- fore
- calico
- yaupon
- labrador
- wax
- scalp
- salsa
- hidden
- continuously
- hibiscus
- wetter
- mitsubishi
- '90210'
- nicole
- matlock
- charlene
- beverly
- shred
- pierre
- recognizing
- cinematography
- invasions
- premises
- '911'
- sitcoms
- misbehaving
- faces
- censor
- morality
- jumps
- finite
- infinite
- whining
- panels
- resurfaced
- cimarron
- jeopardizing
- retirees
- ladder
- investigative
- catastrophes
- existed
- halogen
- sulfur
- combustion
- hitch
- moynihan
- skillman
- lynch
- chil
- amnesty
- abstinence
- crayon
- detest
- ph
- allante
- peppy
- saddle
- inca
- dub
- regiment
- twisters
- toe
- prone
- adjustable
- conspired
- premiums
- reasonableness
- parkland
- losers
- witt
- greave
- wins
- dilemma
- reallowed
- implement
- unsmashed
- crazies
- fabricating
- sampling
- steele
- youn
- upsets
- magnetic
- resonance
- sober
- molesting
- boar
- constraints
- betcha
- severity
- entitlements
- reductions
- defaults
- blackman
- manned
- dealerships
- purrs
- feeders
- frontier
- jetsons
- nearest
- trough
- sli
- howatch
- birmingham
- disregard
- darned
- greenery
- tahoe
- skidding
- surveyors
- tracer
- '486'
- measles
- crunch
- burger
- cameroon
- scoutmaster
- sitcom
- seato
- colony
- nato
- disbanded
- arrive
- uncooked
- overdone
- yummy
- bendix
- pontiacs
- hattiesburg
- bir
- boa
- constrictor
- parrot
- overspending
- coughing
- julio
- misuse
- sniff
- milan
- anchoring
- tedious
- stragglers
- tobogganing
- baggy
- reduction
- hewett
- scaffolds
- excessive
- rep
- disappoints
- nairobi
- safari
- wesley
- hospice
- theoretically
- mishap
- electoral
- stew
- hardaway
- dioxide
- vapor
- aye
- pickings
- legitimately
- sails
- bisquick
- lopsided
- boarding
- freezers
- genealogy
- stash
- proliferates
- brokers
- patterson
- subsidized
- amway
- nonpolluting
- bicycles
- bullheads
- nikki
- jig
- stroll
- ogden
- puzzles
- combo
- airless
- scroll
- dolphin
- torpedo
- malamute
- trillion
- ludicrous
- payers
- column
- dumbbells
- controllers
- harrisville
- specialties
- virtue
- accrued
- transfusion
- refund
- pup
- patron
- parenthesis
- earmarked
- greatful
- striper
- senegalese
- perks
- parkinson
- industrialized
- truer
- dispose
- mega
- tonnage
- scrubber
- ammonia
- compounds
- acids
- thickness
- pronto
- finalization
- utmost
- cognizitive
- scarves
- uns
- unseasonal
- sleeves
- sweatpants
- corduroy
- compliments
- skorts
- nominated
- dud
- recurring
- fami
- overreact
- terror
- cohill
- cohi
- drivel
- eldon
- housepainter
- extracts
- overtly
- uncontrolled
- pirated
- ominous
- thief
- westerner
- lunatic
- violate
- socia
- jehovah
- mormons
- intrusive
- solicited
- invasive
- soli
- intruded
- defining
- surmised
- incorrect
- unsolicited
- nonsol
- unconscious
- cli
- sequence
- peddling
- harassment
- generated
- lois
- intimidating
- rver
- greeting
- stake
- mitzi
- yip
- ranging
- soaked
- rhyme
- ruckus
- parallels
- cov
- hooker
- absolu
- phenomenon
- brazilian
- listenable
- elec
- acoustic
- interchangeably
- folk
- arranger
- sitar
- muted
- existing
- tally
- slush
- stocks
- expired
- pleasures
- albridge
- slogans
- outlooks
- haggerty
- spookier
- pecially
- airways
- focusing
- taj
- mahals
- prolongs
- whim
- deserved
- prevents
- mopping
- odds
- unair
- facial
- beards
- skids
- repack
- buttoned
- starched
- suspenders
- reorganization
- cruddy
- reall
- notre
- dame
- explosion
- untypically
- accumulation
- flatlands
- zeppelin
- floyd
- brash
- bump
- bohemian
- rhapsody
- pumped
- siskel
- ebert
- thumbs
- travolta
- quee
- tokens
- divi
- showbiz
- admission
- scyene
- inexpensively
- sao
- paulo
- usefulness
- spheres
- spaniards
- rulers
- conquistadors
- socialistic
- horribly
- dishonor
- defenses
- sabotaged
- peasant
- exploitation
- exerts
- export
- broadcasting
- ruddy
- minist
- wr
- ler
- interpretations
- histories
- copes
- indicate
- resident
- fledged
- barefoot
- pejorative
- unrest
- citizenry
- ignorance
- ult
- constitutionally
- creole
- prohibitions
- strengths
- cuisines
- throes
- reassess
- functionally
- fractiousness
- faddish
- wellness
- biweekly
- dispensed
- distinctions
- dev
- fizzled
- acupuncture
- gestalt
- irony
- cert
- vigorous
- carbohydrates
- kinesiology
- calc
- calculated
- calisthenics
- myerson
- frantic
- astonishing
- mortars
- formulated
- sociopathic
- pronounced
- unfit
- mouthed
- transcribing
- customized
- anne
- glenn
- improvise
- concentrates
- password
- verbal
- rowing
- lution
- rower
- transforms
- markov
- naval
- postgraduate
- civilians
- mainline
- respondent
- unders
- allergist
- smorgasbord
- compensatory
- profile
- bonds
- deducting
- disproportionate
- brutally
- commuted
- delays
- electrocution
- determent
- deter
- dubious
- internally
- organiz
- coordinating
- scandals
- kisha
- knight
- pullman
- exacerbate
- clutches
- pads
- benz
- absorbed
- keyboards
- spaghettis
- lasagnas
- hor
- horseback
- dabbled
- banjo
- druther
- stre
- farts
- polly
- followers
- inspir
- booths
- commutiv
- billboards
- bartman
- simpsons
- debbie
- nigh
- appraisers
- onward
- ease
- folds
- performs
- tenured
- microcomputer
- comprehensive
- rigamarole
- teachable
- specially
- spicier
- tofu
- pistachios
- pistachio
- bumped
- curried
- saute
- gigs
- perse
- ow
- conventions
- slippers
- teller
- alterations
- utilitarian
- knickknacks
- sconces
- jalapeno
- almanac
- concluding
- warms
- shutting
- piloting
- spectacle
- lobbyist
- legislators
- individ
- unbelieving
- justifiable
- nucle
- kilowatt
- washes
- stinging
- swelter
- lively
- eureka
- rentals
- inspires
- glider
- welder
- treks
- '747'
- mindlessly
- pacifier
- reme
- destructed
- milton
- berle
- stepchild
- tumultuous
- regions
- siberia
- oppression
- attentions
- hopely
- catchers
- gladly
- unheard
- babe
- ruth
- thru
- lovingest
- cosmo
- pellet
- tod
- lovey
- dovey
- kneading
- trimming
- bonzo
- poindexter
- felix
- tortoise
- possessive
- bedtime
- rendering
- jessica
- tandy
- warmth
- manhunt
- manhunter
- dysfunction
- slay
- toothpicks
- outwardly
- awfulness
- wonderfulness
- lapses
- telecommunications
- profits
- waivers
- earners
- physicals
- subsist
- lodges
- moss
- footing
- alumi
- defrays
- defray
- unfold
- walmart
- discourages
- catatonic
- discovers
- buzzards
- pal
- imagined
- slaughter
- earthquakes
- robby
- graze
- indira
- observed
- attleboro
- freeways
- jets
- swinging
- kerosene
- eah
- boilerhouse
- powerhouses
- belch
- kodak
- smokestack
- phosphorous
- grenades
- photograph
- overstated
- environmentalists
- claiming
- automakers
- soot
- particulate
- meter
- tailpipe
- devise
- mufflers
- resumes
- graph
- erased
- simplified
- anduille
- doughnuts
- cobbler
- fudge
- fiber
- sloughs
- rafting
- potty
- packs
- noth
- outfitter
- headwaters
- damper
- hostage
- rhetoric
- rolm
- engi
- sheer
- estimated
- doctrine
- turks
- cheering
- reconcile
- divisive
- unprecedented
- authorize
- frontal
- sununu
- commend
- scud
- lefty
- frizzell
- galway
- harpist
- bagpipes
- whistle
- violins
- instrumentals
- rooney
- dancer
- entertainer
- eddy
- smiley
- burnette
- raspy
- playboys
- ernest
- tubbs
- rector
- scratchy
- opry
- stadler
- autry
- anymo
- vegetate
- fri
- relly
- complication
- eith
- demolishing
- stereos
- annoy
- troubleshooting
- initials
- conversed
- sexes
- consist
- childbearing
- storly
- var
- biological
- urges
- encumbered
- heirs
- characterized
- acquaintances
- terming
- emerging
- marathon
- idear
- discrepancies
- overview
- encapsulated
- introductory
- glamour
- updated
- airspace
- huntley
- analyst
- paragraphs
- noontime
- dose
- spee
- fastened
- wander
- aides
- debilitated
- arboretum
- maid
- tackles
- spinning
- irvin
- overwork
- reinjuring
- scab
- revamped
- metcalf
- smuggled
- investigated
- rehi
- renamed
- psychologists
- ration
- modalities
- learner
- kinesthetic
- gladewater
- baccalaureate
- unle
- commentator
- golsome
- superintendent
- adminis
- scarce
- overachievers
- overachiever
- beeps
- expre
- phoe
- easiest
- horizons
- hurtling
- brothers'
- clips
- madly
- fetish
- luring
- costuming
- remarked
- thriller
- distinguished
- terrorized
- branching
- vito
- flicks
- bawled
- toughest
- venue
- disrup
- sequestered
- entrapment
- displeasure
- waive
- bungling
- caricature
- bloodless
- comic
- functions
- thrash
- fixes
- climactic
- joseph
- reborn
- targeted
- hypercritical
- fart
- gags
- slapsti
- funniness
- gag
- retreading
- tec
- preemployment
- brazen
- wisened
- ventilated
- motorola
- tack
- orangish
- feat
- brighter
- coloring
- haphazard
- baseboards
- edger
- granary
- stocked
- formulas
- perfectionist
- tasks
- freehand
- gratin
- banana
- dissipate
- thickening
- globs
- rubbery
- blenders
- cools
- favoring
- nestle
- quik
- groedy
- whisk
- beater
- melon
- baler
- cond
- octane
- generating
- volt
- v8s
- repellent
- erupted
- meteorologists
- chernobyl
- tracers
- smoky
- array
- fiero
- undisciplined
- jacuzzis
- abdominals
- thighs
- mattered
- alienated
- suffocating
- choke
- differing
- grads
- quirks
- academies
- cadets
- espouse
- anglo
- saxon
- inveterate
- switcher
- dave
- wylie
- pumping
- weatherman
- hansen
- gordon
- lightfoot
- winston
- headphones
- toweling
- investigator
- tailing
- socialite
- extradited
- levy
- uplifting
- interpreting
- jur
- gui
- overcrowd
- connects
- businessmen
- sente
- penned
- duff
- penal
- beca
- litigating
- respo
- spiritually
- begats
- durn
- kratz
- kranz
- hedges
- nathaniel
- hawthorne
- storybooks
- woe
- glossary
- krantz
- twilight
- bogused
- fuck
- dares
- hangover
- sarcastic
- fishbone
- spirited
- venezuela
- avalanche
- gobs
- inflated
- beneath
- captures
- resulting
- risky
- contain
- vague
- guaranty
- guarantees
- guaranties
- disasters
- vulnerability
- regul
- workup
- incline
- unjust
- revoke
- reverked
- revoked
- vengeance
- sayeth
- mao
- tse
- chung
- temples
- unified
- humbly
- sovereignly
- rebuke
- ager
- preface
- admonition
- agrarian
- commander
- conceal
- napalm
- gro
- clayton
- uproots
- residents
- deba
- servant
- repaid
- granddaddy
- dodger
- militia
- bologna
- alleviating
- afresh
- lifestyles
- cabbages
- broccolis
- insecticides
- dandelion
- roly
- poly
- slug
- dragons
- sockets
- alkaline
- stem
- peaches
- silt
- shrivels
- mes
- cottonwoods
- irr
- smartest
- gardenias
- revitalizing
- mayb
- chopping
- blasted
- hybrid
- editions
- spruce
- dips
- dipping
- arabic
- pita
- eggplant
- marinating
- hickory
- clones
- mach
- databases
- searches
- deleting
- pieced
- bypass
- monochrome
- enthusiasts
- nathan
- swollen
- manuscripts
- composts
- nurserymen
- goop
- doorknob
- compress
- mugs
- expressions
- ungodly
- expansionism
- nationalistic
- succ
- origins
- angolan
- sinai
- warsaw
- militory
- indu
- chan
- clobber
- conquered
- autonomists
- shortages
- bulgaria
- czechoslovakia
- placate
- alienate
- emancipated
- slaves
- emancipate
- supplied
- battleground
- val
- verde
- briefcase
- bookcase
- armageddon
- grove
- imposing
- yoakum
- trilogy
- terrifying
- '''brien'
- crappy
- jakes
- compendium
- lobbying
- emancimation
- afterthought
- luted
- honorary
- isaac
- asimov
- robot
- developmental
- blockbuster
- mist
- dune
- freeman
- debating
- suave
- charac
- egalitarian
- scripture
- disciples
- wafers
- contradict
- buyers
- elma
- sheds
- pasadena
- refinery
- phoenixville
- grumble
- northwestern
- piped
- almetco
- pantr
- deanne
- multipurpose
- vide
- launched
- groupings
- gentlem
- dyke
- griffith
- idn
- brave
- shallows
- gig
- naughty
- murky
- spectrums
- abso
- feldon
- madonna
- lamar
- gators
- sneaky
- buckner
- stadiums
- cornell
- redwings
- peewee
- crude
- tilled
- screeching
- acorn
- scents
- pollinate
- yield
- tiered
- shrub
- locus
- thorns
- pollination
- pollinated
- littleton
- trucked
- shovel
- pressurized
- chainsaw
- dusk
- unfeeling
- spreads
- datsun
- ku
- klux
- klan
- incumbents
- larou
- larouche
- chord
- mayport
- brim
- snagging
- owl
- baiting
- oyster
- cracker
- trophies
- rockport
- netted
- ugliest
- archaic
- dots
- croaking
- croaker
- friendships
- copayment
- seclor
- exemplary
- snatch
- impressions
- inspections
- yellowish
- misty
- emphysema
- isolating
- biker
- vowel
- lint
- phrase
- cub
- smash
- conv
- ding
- dongs
- guathier
- eliminates
- briberies
- sidedness
- lengthy
- judo
- hoc
- deltaing
- disagreement
- wapner
- judean
- vibrant
- undoable
- semitic
- predetermined
- wandered
- defeated
- astaire
- sto
- plank
- poultry
- empenadas
- eu
- scallions
- sesa
- slivers
- overcook
- dashes
- ketchup
- bishu
- meats
- empanadas
- bun
- niokes
- requi
- bah
- humbug
- fives
- phony
- interdisciplinary
- dispelled
- grating
- reputations
- impaired
- institutional
- quiche
- growls
- overrun
- hussy
- settlements
- poll
- tiddlywinks
- volumes
- ignorant
- ironsides
- affixing
- chart
- commingle
- confusion
- issuer
- conven
- shucks
- profitability
- shifted
- itemized
- alpha
- beta
- accusation
- linemen
- rotation
- thereafter
- proves
- encouragement
- chemists
- overinflate
- southward
- nonconventional
- warheads
- parallel
- resolves
- negotiations
- inhabiting
- lith
- neutral
- crazier
- libya
- treaties
- overthrow
- survives
- inhabitants
- dancers
- outweigh
- wayward
- attained
- sharpness
- acuity
- disorient
- decimeter
- superpowers
- toddler
- indoctrinate
- understa
- skipping
- lows
- chillier
- handicappers
- mosey
- twosome
- mellowed
- doubles
- rationalizing
- purged
- goofed
- nastier
- cashed
- burgeoning
- metropolis
- carey
- thes
- intern
- sanger
- harris
- lifelong
- thunderbird
- citation
- mazaratti
- conceive
- degray
- stutters
- antennas
- roadside
- cords
- heaters
- hookups
- sopping
- dialect
- hums
- nuns
- trin
- shun
- hospitalized
- pumps
- stimul
- flipper
- retraining
- stagnant
- sores
- golan
- kishkes
- matzi
- goyim
- pocketful
- heston
- commandments
- grips
- muslim
- religions
- sects
- protestants
- lennon
- zionist
- nosed
- tampa
- scariest
- coincidently
- lox
- generic
- predates
- jihads
- toge
- secretly
- unity
- revert
- baltics
- forcibly
- impossibility
- insightful
- prays
- dissimilar
- forefathers
- esc
- disseminated
- giv
- postpones
- juniors
- disgust
- centeredness
- inability
- multicultural
- multiracial
- psychologist
- refers
- preoccupied
- infor
- cults
- motorbike
- maureen
- solomon
- eastland
- farmed
- millennium
- hopeless
- ideology
- eden
- distributorship
- supplier
- dirkson
- extansion
- dirk
- pearson
- embarked
- isometric
- chlorination
- firsthand
- detectives
- hunky
- dory
- gi
- barbados
- colleagues
- covert
- suburbia
- roasted
- goat
- hating
- stunts
- bending
- alleviates
- indicative
- handcuffed
- elem
- escalated
- bett
- reemphasis
- rote
- spitted
- memorizer
- wiping
- mennonites
- electronically
- determines
- sherwin
- molding
- bled
- spackle
- lighting
- nerdy
- garfunkel
- fascination
- innate
- supp
- manilow
- badness
- behinds
- pajamas
- yardage
- enclose
- fanatically
- subcontract
- ducts
- materialistic
- dwelling
- necess
- branched
- dishwasher
- inventions
- trashing
- diskette
- ordeal
- configured
- prestigious
- innova
- innovation
- audits
- pry
- peripherals
- lance
- restraints
- thermal
- razzle
- dazzle
- flats
- clairon
- rath
- educa
- feast
- waking
- tentatively
- receptacle
- raisers
- distribute
- disposables
- incremental
- fiery
- luther
- galvanized
- bashing
- environmentalist
- respons
- glow
- wartime
- overlook
- affirmative
- junkyards
- testimonies
- defendants
- legalistic
- achieving
- likelihood
- tilted
- sleaze
- protects
- choreographed
- patents
- antic
- repeater
- vendetta
- observing
- proceedings
- weightless
- effortless
- sweatless
- surveys
- adjusters
- expressed
- meningitis
- fetal
- terminated
- termination
- codependents
- goddess
- observations
- firemen
- overtones
- astonished
- phys
- cokes
- sternness
- forbi
- expressways
- patricia
- handlebars
- rewarded
- dubbed
- booger
- diamonds
- numbered
- redeem
- attache
- suitcases
- lamps
- wheelbarrows
- mixer
- toaster
- waffle
- clocks
- candlesticks
- aloud
- fussy
- babbly
- druthers
- rockville
- ballady
- abortions
- pregnancies
- handing
- landscapers
- replant
- alleys
- cultivate
- replenished
- subside
- prune
- hosted
- correspondents
- translating
- masks
- typeface
- piddley
- braunsfel
- unread
- skimming
- imperialism
- reasserting
- hangings
- needlepointed
- outlined
- intricate
- geometric
- upholster
- stiffened
- streamers
- stiffener
- quilted
- stamp
- foresaw
- refrain
- expedite
- franc
- francs
- diem
- consternation
- godfrey
- goodies
- prin
- perforated
- metrics
- typos
- retyping
- retypes
- encyclopedia
- prints
- limi
- clone
- bleep
- lionheart
- singular
- superstar
- norris
- deserts
- bates
- floats
- animation
- retitled
- reshot
- rout
- cosmic
- enlightenment
- dichotomy
- educatable
- prodigies
- precocious
- harks
- schoolwork
- construct
- convey
- verbally
- stressing
- penalizing
- eternity
- bradley
- activists
- demonstrating
- agreeable
- gerrymandered
- lipscomb
- disservice
- pauken
- politicking
- upmanship
- fooled
- nationally
- applicants
- dissolved
- shutdown
- mathematics
- outgo
- kidney
- positives
- spe
- sadder
- anxieties
- detected
- dismissal
- pard
- certainty
- handcraft
- wreaths
- eucalyptus
- dowels
- goofs
- bulch
- straying
- koala
- shapes
- wintered
- transplanting
- leafed
- pasture
- jungles
- rubs
- validity
- disagrees
- guessed
- lux
- accom
- transcontinental
- throats
- coalition
- armaments
- congressional
- fuss
- shiites
- fiddling
- shaped
- topsoil
- herb
- rollback
- spurts
- loppers
- rotor
- dethatch
- heave
- ingredient
- shrip
- fettucini
- straightens
- disconnect
- sucking
- depended
- peeled
- chestnuts
- burgundy
- browned
- bruises
- retires
- swivels
- collisions
- automation
- iaccoca
- airbags
- sc
- spine
- harness
- nifty
- chryslers
- aerodynamic
- conveyor
- magnet
- pennsylvanians
- brownie
- pamphlet
- slicks
- slot
- poundage
- instant
- wisely
- shboom
- befriended
- ironically
- resumed
- gymnasium
- flooring
- chrome
- height
- pounding
- engineered
- curbs
- gravity
- singles
- assorted
- immobilized
- screamed
- climbers
- limp
- matches
- ammn
- amm
- initi
- initiation
- mishandle
- guiding
- deregister
- tumbling
- themself
- banding
- pis
- julie
- tense
- bundles
- childish
- kazoo
- numb
- suffices
- rela
- weakness
- weaknesses
- experi
- temporaries
- retest
- retested
- rx7
- whatso
- seater
- narrowed
- assessment
- thirsty
- stint
- wanderlust
- poker
- admiration
- miners
- roadsides
- harvey
- uneducated
- flaunting
- relinquished
- strikers
- speeded
- aerobically
- calmed
- postnatal
- cise
- birthing
- axle
- windstorm
- overlooking
- embankment
- arkan
- sweeping
- tows
- beavers
- flee
- attitu
- flaunt
- americanism
- slums
- coops
- inoculation
- hungary
- requesting
- rotely
- panamanian
- quieted
- anticommunist
- excesses
- playtex
- flowery
- jaded
- comforts
- thorn
- bureaucratics
- dyed
- pollen
- gah
- blowy
- rebellions
- massacred
- protested
- diminishing
- renegade
- launching
- strifes
- defect
- obtaining
- globally
- demise
- glasnost
- escalate
- reins
- intentioned
- conveniences
- nonfeeling
- uphold
- unpopularity
- geez
- honorable
- massad
- madman
- straddle
- personalties
- rethinking
- gesture
- miscalculated
- liberate
- underestimated
- miscalculation
- huss
- assassinate
- staking
- precedent
- bullies
- powdered
- bombing
- khomeini
- normalized
- sanc
- juggle
- friction
- bookkeeping
- earner
- kite
- idling
- spooky
- lat
- tracing
- hitter
- shorten
- saberhagen
- crain
- craning
- reds
- stri
- fouls
- steinbrenner
- bogus
- workable
- peripheral
- notebook
- modems
- revise
- furnishes
- deadline
- courier
- magee
- peretti
- piercing
- fic
- soun
- illu
- illusions
- quintupled
- flied
- nailed
- gibbons
- exempts
- planters
- shedding
- proj
- beau
- insi
- sunlight
- sulked
- overmilitarization
- disparity
- civilization
- bigge
- trickle
- hemisphere
- kingsport
- masala
- sweeter
- amaretta
- dijon
- basil
- turgeon
- laroute
- gastro
- lamink
- restructured
- hardships
- subcultures
- debates
- patronizing
- demeaning
- midwife
- pater
- paternity
- troit
- misunderstood
- ranks
- aines
- peak
- olajuwon
- dunk
- businessman
- murchison
- bottomless
- leanings
- assholes
- reaganomics
- nonexempt
- visitations
- shuts
- hunts
- wan
- degreed
- jenny
- outdoorsie
- twix
- braniff
- gossip
- hound
- host
- pause
- mic
- '''clo'
- participators
- primal
- kicks
- tabloids
- journalistic
- fondly
- steeped
- repu
- unnecessarily
- glancing
- nod
- tonic
- unhooking
- uncoupling
- rotating
- rotated
- dieting
- ourself
- wrapping
- kip
- centrally
- sickness
- folder
- emphasize
- miniskirt
- evoke
- overdo
- laces
- flounces
- adornment
- unprofessional
- sexist
- tailored
- vulgar
- redford
- lewisburg
- emblems
- grotesque
- imag
- shoo
- padlock
- pawn
- someway
- neatness
- psychiatric
- hinkleys
- accidently
- distinguishable
- barbed
- curi
- prayed
- reestablish
- lengthways
- mounds
- clumps
- southw
- slapping
- formidable
- adcose
- exaggeration
- harmful
- structural
- hankering
- tick
- excalibur
- newmarket
- edmunds
- barnyard
- treacherous
- journey
- climbs
- creation
- touristing
- asbestos
- repaint
- roughed
- energized
- bids
- bleed
- caulk
- masonite
- bid
- varnished
- intervene
- toppling
- descend
- latinos
- mee
- meek
- europeans
- vocalism
- comparably
- bitch
- moan
- compromise
- dependence
- cartels
- mistreating
- slovak
- catacombs
- persecution
- idi
- amin
- oopsy
- pood
- greets
- recouped
- evi
- burial
- countenance
- uncanny
- litterbox
- anointed
- buzzer
- cheerleaders
- courage
- cheerleader
- precincts
- precinct
- harmfulness
- heroin
- forefront
- estimation
- demolish
- cur
- tract
- scaredy
- straits
- quieter
- comfy
- husb
- prance
- paw
- lovable
- lapdogs
- cockatoos
- squawking
- som
- cower
- akita
- aq
- padding
- chewed
- wiper
- blades
- tinkering
- rightly
- punctured
- patched
- restores
- feminist
- amer
- undoing
- stains
- altar
- spooked
- butterflies
- dee
- nicaraguan
- housed
- spiders
- repent
- evangelical
- surpassing
- override
- rejoice
- borrower
- bondage
- squatters
- witchcraft
- mayans
- incas
- worshipped
- pyramids
- sacrifices
- gods
- oppressed
- warehouses
- cumulative
- itemizing
- scrimp
- walkabout
- boonies
- attribute
- eric
- dickerson
- smi
- linebacker
- bickering
- wen
- appropriately
- arcade
- drafts
- archie
- manning
- nobodies
- showi
- furious
- veg
- padded
- opposing
- satin
- bridesmaids
- maids
- accessibility
- harsher
- aerostar
- stealth
- slipping
- celicas
- perfor
- racing
- surreal
- fulfilled
- blair
- reformed
- gambler
- microbiologist
- competitions
- minnea
- dowling
- ren
- entrances
- periphery
- paired
- deacons
- blesses
- fugate
- proverb
- macy
- lowe
- purebreds
- studs
- sweetest
- sweetheart
- breeders
- bree
- inbreeding
- inquisitive
- hindquarters
- predominate
- rex
- rexes
- rodents
- groundhogs
- mesh
- remains
- teetering
- refusal
- presc
- pharmacy
- mens
- absoluteness
- foiled
- mere
- outlawing
- conspicuous
- inconspicuous
- inappropriately
- hunted
- squirted
- novelty
- outdo
- raciness
- calculators
- euphonium
- mellow
- deejays
- grafting
- cough
- graphs
- sponsoring
- enhanced
- bytes
- '128'
- callously
- deterr
- blooded
- midsized
- porting
- attendant
- vessels
- overbuilding
- phe
- phenomenally
- galant
- serviced
- 49ers
- harbor
- niners
- kim
- redskin
- cartoonist
- ellicott
- basicall
- importantly
- devaluated
- goats
- schoolyard
- motherhood
- overcompensate
- destabilize
- vying
- regroup
- standpoints
- easterners
- couched
- proclaim
- weaving
- dike
- plug
- unveiling
- takers
- roomie
- slaughtered
- sudan
- occurrence
- shredding
- bedding
- wrappers
- reviving
- yosemite
- objectors
- assigning
- examined
- idealistic
- pakistan
- algeria
- blinking
- manipulations
- insofar
- clowns
- partition
- dividers
- baloney
- daylilies
- orchid
- closes
- velvety
- multiplied
- weeded
- lilies
- azalea
- glories
- ned
- skeldon
- ojeda
- hubie
- offerman
- prediction
- cecil
- orel
- hershiser
- darrell
- interleague
- introduce
- anoth
- homey
- randi
- dawdle
- steamy
- lawrence
- mae
- rambo
- hogan
- associates
- realist
- garments
- vogues
- knits
- garment
- loopers
- piping
- cording
- twe
- sewn
- exceptional
- bev
- reap
- sow
- establishes
- pardons
- lust
- incest
- swiftly
- integral
- reeks
- expediting
- compunction
- appropr
- sins
- stoning
- clog
- streamlining
- extremism
- bubble
- habitat
- humanity
- inefficient
- preconceived
- notions
- delivering
- spiraling
- conservatism
- hampers
- patchwork
- unflattering
- autobiographies
- randolph
- descriptive
- affluents
- tale
- binge
- bookl
- francis
- momentarily
- connecting
- sigh
- chowperd
- snowbirds
- spawned
- contend
- melts
- kitty
- apso
- panic
- preserve
- campsites
- twang
- pfeiffer
- rim
- glenrose
- latrines
- gemini
- genocide
- hmong
- unsure
- slash
- intercultural
- dissimilated
- conceptualize
- slavery
- linguist
- withholding
- worthless
- cambodians
- graft
- falk
- drugstore
- coils
- mosquito
- crickets
- foamy
- pristine
- froth
- bobber
- reeling
- saturated
- soggy
- damp
- claustrophobia
- terrify
- spanking
- revamping
- lev
- plaques
- stenciling
- cushions
- impeme
- interface
- janitor
- reams
- dalmarva
- deinking
- contaminate
- wastebaskets
- publicly
- yucky
- interven
- occupying
- schwartz
- iranians
- egyptians
- kane
- matinees
- burton
- batman
- glover
- kline
- dennehe
- goldblum
- clease
- arquett
- untouchables
- graffiti
- broderick
- marlon
- parody
- tinman
- humphrey
- bogart
- maltese
- falcon
- quinn
- rainman
- okie
- homeboys
- optimism
- reconstruction
- redefining
- trait
- longhorns
- randal
- streaky
- touted
- sentimental
- instability
- indoctrination
- marines
- ak
- 47s
- cubans
- capturing
- nicaraguans
- crate
- patrice
- lamumba
- teachings
- extremist
- gen
- irregardless
- albania
- revolts
- psychos
- chiefs
- staffs
- uprisings
- squadrons
- afghanistan
- boils
- cen
- berlin
- wat
- steppers
- soles
- reword
- indi
- environmentalism
- ruther
- environmentally
- blasphemy
- acutely
- bureaucracies
- relegated
- heartache
- grudge
- succeeding
- parish
- policed
- comforting
- reminders
- pyrex
- teaspoon
- blackened
- skewers
- basin
- chefs
- clams
- instinctual
- demographically
- democratically
- proposition
- proposals
- revolted
- obligatory
- considers
- australians
- looses
- leas
- denies
- hamilt
- passionate
- democ
- candi
- antigovernment
- misspending
- bastards
- inte
- hundredths
- sixteenths
- mismatch
- clamps
- meters
- drams
- perfume
- machinist
- indic
- indicators
- micrometer
- finders
- nondecimal
- halves
- listing
- beverages
- whiskey
- ploy
- conversant
- milling
- measu
- calipers
- pliers
- milliliter
- drilling
- hundre
- lawy
- strangle
- neiman
- marcus
- outgrowing
- necked
- embellished
- dre
- presentable
- outrageously
- busters
- campinas
- oursel
- asses
- orient
- optimist
- jungle
- resonates
- profound
- bullying
- dreamed
- wildest
- semantics
- transcribes
- onl
- guzzlers
- fours
- threes
- transverse
- mounted
- shoved
- serpentine
- stickers
- reinstalled
- nozzle
- stroking
- groves
- surinam
- natio
- internationally
- amaco
- mobil
- rectified
- inward
- hateful
- kilom
- thumbnail
- kilogram
- britain
- adopting
- precisely
- grams
- sync
- orchestrate
- unfamiliar
- toting
- stroganoff
- allendale
- waldwick
- adirondacks
- pancakes
- outgrew
- beth
- knowl
- roanoke
- randall
- duplicated
- gamble
- ditka
- nate
- newton
- branded
- outlaws
- webster
- cocky
- lambert
- bloopers
- receivers
- tackled
- necks
- fav
- entities
- overburdened
- fairness
- pondsy
- invu
- invulnerable
- belongs
- electing
- politic
- floored
- maryl
- nurture
- credits
- ukrainian
- scallop
- buns
- batter
- bourguignonne
- grudgingly
- pinch
- reversal
- beck
- subsidize
- bennington
- liber
- refinement
- etiquette
- advises
- renaissance
- bowdoin
- bucknell
- lectures
- confirm
- guitarist
- yale
- minoring
- irrevocable
- irrespective
- clinical
- pathologist
- kayla
- bachelors
- profess
- traced
- rung
- maladjusted
- compelling
- distaste
- resp
- beret
- uzis
- disorderly
- unc
- unconcealed
- matched
- vibes
- clearest
- confi
- junkins
- mandated
- prompted
- tobacco
- bandwagon
- cour
- tricked
- syst
- maintenances
- scoop
- fetch
- pooper
- scooper
- colombia
- reek
- kindhearted
- nixed
- asthma
- outgrown
- misclass
- stately
- sunk
- furnished
- swoop
- situational
- punches
- momentum
- lockheed
- arose
- courageous
- accredita
- accreditation
- keying
- adjacent
- refine
- classified
- chemicalwise
- refining
- strean
- stillwater
- stephenville
- toxins
- bacterial
- bleaching
- sinked
- australian
- dominique
- neek
- wimp
- feline
- unconditionally
- feisty
- snuggle
- investigate
- beaner
- wadded
- fixture
- decor
- panty
- garb
- polyesters
- wools
- neatly
- layerings
- eyesore
- mended
- ironed
- compose
- upgrading
- plummeted
- acro
- daltons
- wholly
- understands
- disadvantaged
- winnowed
- structures
- casing
- connectors
- workmanship
- hal
- fluke
- highlands
- patronage
- cranberry
- pou
- lobsters
- billboard
- steams
- culinary
- adventurer
- franchised
- shacks
- shoney
- reliably
- communercation
- compe
- renditions
- organizer
- defeat
- registration
- dragginess
- headache
- draggy
- locker
- sauna
- motiv
- agony
- dictatorship
- uganda
- mils
- distances
- centigrade
- celsius
- metropolitans
- heeley
- wentworth
- differential
- microns
- whatev
- responded
- favorably
- bagged
- ecological
- prod
- additives
- pickups
- hangers
- cupboards
- fountain
- faucet
- exceeding
- decomposed
- shocker
- bizmart
- upseted
- taxwise
- toilets
- smashing
- soaker
- sheltered
- disapp
- rankled
- cheerfully
- outermost
- inland
- curving
- ventura
- buildi
- overflows
- anaheim
- simi
- meanings
- rhymed
- balti
- strayed
- kabob
- breakfasts
- galunkies
- marsh
- pierogies
- grandparent
- newarth
- cholest
- margarine
- margarines
- kebabs
- utensils
- goulashes
- juices
- sealed
- galore
- finer
- drains
- shakers
- journalist
- crux
- remo
- appease
- pob
- patr
- paro
- paroles
- partake
- traumatizing
- viaducts
- ceremonies
- dozens
- pageants
- riveted
- confuses
- thrilling
- producers
- tony
- dorsett
- hershel
- rationalized
- cinemax
- correspondence
- '30'
- cod
- reso
- repossessed
- 635's
- looper
- ramblers
- brook
- dealie
- diversion
- chevys
- nex
- v8
- carburetors
- gingerly
- yanked
- tinkerer
- evaporator
- rubbing
- testers
- diagnostic
- tester
- diagnostics
- carriage
- chilton
- multiplying
- lincolns
- tremend
- leaking
- condenser
- busted
- haas
- ovolacto
- lard
- nutrient
- lactose
- synthesize
- slough
- utilizing
- rids
- utili
- paperback
- novelization
- lucas
- freder
- brink
- feinstein
- fairfax
- deaf
- insulate
- scrubby
- pecan
- paralegals
- clears
- interference
- surplus
- tariffs
- mon
- apprentices
- advisable
- journeyman
- exporting
- imminent
- oodles
- salutatorian
- prided
- welcom
- welcoming
- tol
- resentful
- zales
- spiegel
- hurried
- circulating
- walrus
- porpoises
- mainland
- sanctuary
- whooping
- cranes
- pelicans
- antone
- alamo
- brewery
- caverns
- uncourteous
- actua
- irritant
- hullabaloo
- stockholders
- inebriated
- unsafe
- surgeries
- subsidizing
- quack
- waiveable
- refresh
- somewh
- willy
- horton
- consolation
- microscopic
- kneecap
- curtailed
- forming
- bison
- weakening
- strengthening
- '401'
- continuation
- telephones
- handbook
- badger
- showering
- physiological
- advan
- fledgling
- bikers
- bicyclist
- knocks
- coronary
- artery
- decreases
- embark
- motivating
- disevered
- knobby
- vaulted
- woodhollow
- villa
- secluded
- joking
- sellers
- coworker
- doorstep
- housebroken
- playful
- gastrointestinal
- beagle
- romping
- waters
- retrieve
- paddled
- unrequir
- degenerating
- rosebud
- sociable
- smu
- synopsis
- furrier
- judgement
- distribution
- wrongfully
- penitentiary
- sitt
- caravans
- lending
- simulation
- resemble
- adroit
- oddity
- moonlighting
- strengthwise
- divulging
- tarnished
- faye
- socialist
- undone
- inefficiency
- platform
- lieu
- mamma
- disruptive
- brow
- browbeat
- wist
- mugging
- faceless
- persuadable
- thunderbirds
- topaz
- camaro
- reim
- dominated
- wrenches
- eas
- champ
- premeditate
- premeditatively
- stiffening
- lessening
- retarded
- pleaded
- phrased
- dayers
- correctness
- promoting
- niceness
- vouch
- waterfall
- busch
- blacksburg
- portsmith
- williamsburg
- epcot
- temp
- buccaneers
- assessing
- opp
- benef
- wadley
- milestone
- tainted
- snickered
- examine
- aircraft
- astound
- pusher
- circularly
- chairman
- judy
- perturbed
- promotions
- programmed
- brightens
- hallmark
- servi
- seizures
- brighten
- tonya
- sneaks
- rainstorm
- breezes
- temperate
- promises
- westernize
- intact
- extensly
- vely
- woodward
- projected
- commanders
- colin
- powell
- embargo
- misread
- earliest
- disarray
- hopeful
- prosecute
- stature
- statesman
- foreseeable
- selves
- volatile
- retile
- bathtubs
- scouter
- drippy
- panes
- putty
- gazoo
- pes
- pesticides
- bulging
- chlorinating
- coronarys
- diets
- quadrupled
- ingestion
- clogging
- primates
- regimen
- kenneth
- innovator
- inactivity
- neurosurgeon
- strictest
- idiots
- stan
- destruction
- symbolism
- evokes
- lynched
- modified
- possess
- condone
- adamantly
- symbolizes
- circum
- satisfactory
- budg
- spartan
- frugally
- jordache
- nonessential
- victory
- cliche
- enactment
- adjourned
- mot
- expending
- reasoning
- allege
- myriad
- departure
- restocked
- guided
- unconstitutional
- reforms
- gard
- arranging
- orig
- florist
- slowdown
- runners
- geraniums
- coleus
- vinca
- thuringiansis
- caterpillars
- expands
- unlicensed
- brittle
- excelled
- wei
- denotes
- tension
- bicep
- tricep
- instructing
- grindstone
- hovering
- configuration
- blended
- muscular
- dystrophy
- documentaries
- paroe
- planner
- uruguay
- concepts
- yuppies
- legislated
- dynamics
- auditing
- rev
- revenues
- millspec
- operates
- elevens
- hammers
- federalized
- ci
- emphas
- identi
- americard
- adios
- commu
- demeanor
- announcement
- calcutta
- foreigner
- worldliness
- attributed
- chuckle
- pogo
- mourn
- tolerated
- drumming
- scrunch
- glamor
- sprigs
- ricksun
- tender
- lamp
- ashes
- overcame
- nondescript
- damned
- hierarchy
- restructuring
- feminism
- boomer
- creep
- rapidity
- electroni
- luncheon
- existent
- consulted
- alters
- stamina
- goi
- denying
- revolve
- entrusting
- omniscious
- omniscipotent
- alec
- precedes
- daders
- shrinking
- worthy
- whate
- responses
- spoils
- flashbacks
- flashback
- fidgety
- discriminate
- pertaining
- distraction
- males
- ital
- entree
- sagar
- presby
- kimonos
- grishman
- bavarian
- constricted
- putrid
- folley
- tableclo
- crayons
- disintegration
- flickers
- prevalence
- excusing
- signals
- mechanized
- requiring
- antipasta
- stuffing
- poached
- kernel
- spinach
- wilson
- beeping
- bakes
- frosting
- frostings
- chatting
- mentor
- adversaries
- manuscript
- harried
- interruptions
- feedback
- videotaping
- adopts
- twelfth
- tangible
- overseen
- alternately
- ilk
- phonic
- pistons
- snooty
- telev
- leno
- carvey
- deduce
- cros
- wheeled
- porked
- termites
- chess
- rearrange
- hisself
- bathtub
- prettier
- rewired
- shorting
- surges
- famili
- rearranging
- shuffle
- pane
- breakers
- valve
- drips
- walkway
- splash
- vein
- downfall
- yuppiedom
- restructure
- biologically
- physiologically
- wonderment
- swooshed
- viva
- talents
- mongst
- jealousy
- computerizing
- pecking
- punched
- slightest
- epidemiological
- guesswork
- transmitted
- semen
- illegitimate
- exploded
- stepchildren
- socio
- radios
- faxes
- sensors
- stalk
- jurisdiction
- outnumber
- solicitation
- prostitution
- unlocked
- fallout
- probability
- indentured
- servitude
- vigilantes
- victimless
- ridicul
- auctioning
- bidding
- patios
- insecticide
- diazinon
- carefu
- deb
- wallpa
- stagger
- renovator
- sheeting
- resilient
- stairway
- sworn
- rud
- veto
- bout
- yea
- dams
- droughts
- reservoirs
- poole
- reflected
- counteract
- learners
- genius
- perspiration
- diagnose
- predisposition
- flashing
- drowsy
- facilitators
- manipulated
- burdening
- toot
- weekdays
- racket
- drawer
- dennison
- derby
- siphon
- cu
- uba
- tailgate
- deterrents
- publishers
- poisons
- ergotisms
- fungus
- gender
- confidential
- tide
- vatted
- archeology
- shoelace
- promising
- upcoming
- reprinting
- thurber
- hundredth
- riveting
- viorst
- sci
- revol
- revolves
- shoelaces
- binds
- melody
- workbooks
- workbook
- geometry
- cypress
- greece
- irrelevant
- tortola
- gorda
- infusion
- ethnicity
- familial
- acclimate
- retaining
- latino
- continentals
- roberto
- unprepared
- vociferous
- attain
- imported
- territorialism
- horns
- encompass
- handcrafts
- wreath
- phillips
- ranching
- contemplating
- stabilize
- occupies
- baseline
- flextime
- grading
- scribble
- sensitivities
- akin
- minimized
- prematurely
- dumper
- geria
- empathize
- tandem
- providers
- prohibitive
- fantastically
- moslem
- surro
- surrogate
- regretful
- arou
- swims
- nationals
- quarries
- tumbled
- avail
- denmark
- appliqued
- eraser
- maturing
- rite
- unmarried
- aquariums
- zoos
- paternal
- traditions
- disintegrated
- trinket
- sociologist
- multigeneration
- eightch
- scorer
- rebounders
- assists
- thown
- laker
- marriott
- spittering
- sputtering
- swimsuit
- mavs
- favored
- endorsements
- prospects
- stanley
- underclassmen
- myrna
- curfew
- fiscally
- jockey
- catton
- dives
- cayman
- itinerary
- viet
- doves
- abnormal
- puppet
- heartbeats
- reviewing
- bocket
- hannibal
- lector
- fascin
- luster
- attractiveness
- originality
- pinpoint
- lavon
- upstream
- sever
- benders
- grea
- musky
- perches
- salami
- sonar
- maneuver
- charter
- suntan
- hobbyist
- styled
- convertibles
- sevi
- welded
- welding
- sunroof
- soured
- contention
- jags
- contractors
- bends
- enthused
- enthusi
- ap
- vending
- cartilage
- glanced
- fenced
- econ
- repeatable
- bundy
- exe
- strauss
- punish
- electrocute
- problematic
- candid
- fraud
- intangible
- reinstate
- mario
- cuomo
- legislatures
- molested
- incarcerate
- sylvan
- reenacted
- paltry
- polishing
- lotions
- meniar
- cringes
- thrifty
- flier
- psycholinguistics
- ivory
- godsend
- pathe
- willow
- cana
- bacally
- obese
- reimburses
- collared
- widget
- bramalea
- 401k
- weeny
- nonex
- censored
- bombarding
- dramatize
- statues
- weld
- epoxy
- resin
- shattered
- statue
- cricket
- thatches
- thatched
- vapors
- stained
- lacquered
- tung
- fanatical
- pills
- hem
- sweating
- bulge
- wrinkles
- vices
- sha
- germ
- ecru
- undercoat
- peachy
- steamers
- mottled
- grey
- maroon
- vivid
- turquoise
- coral
- renovating
- hallucinations
- cloths
- slop
- soluble
- tricks
- skimp
- tediously
- rewallpaper
- racks
- metlife
- worki
- workm
- inconsistencies
- amateurs
- footballs
- fencing
- earl
- princeton
- pacers
- subminimum
- administered
- reluctant
- poured
- chiropractor
- cautious
- janitorial
- rafael
- septien
- applicant
- eduardo
- mana
- sai
- mafia
- newcomers
- ellis
- redoing
- comm
- elitist
- concise
- rathers
- yous
- segregate
- wretched
- horrid
- shortchanged
- brokaw
- demi
- ringwald
- sixteenth
- doogie
- howser
- freckly
- ferris
- moustache
- reeve
- dreaming
- ooze
- bride
- pretended
- occupational
- exemption
- judiciously
- incidental
- figuratively
- westport
- bradford
- indirectly
- clair
- dayt
- baldwin
- bebble
- foreclosed
- rider
- homestead
- creeping
- livable
- retrial
- retry
- wond
- seeded
- raping
- choking
- shotcross
- televised
- vendettas
- trialed
- revoted
- annihilated
- enterprises
- misgivings
- quiz
- sprint
- capture
- extending
- endowment
- joes
- alumni
- splits
- governme
- faired
- undertaken
- deficiency
- dilly
- sangre
- cristos
- wichitas
- lakefront
- pinon
- naturalist
- stools
- binding
- component
- carol
- playroom
- realtors
- dominantly
- alleyways
- shifting
- popping
- bangla
- hugo
- bedroo
- barometric
- borger
- funnel
- pillowy
- radar
- veer
- swirl
- junes
- budding
- crimp
- scorch
- distracting
- heats
- therapeutic
- northe
- mayer
- denison
- purify
- purifying
- philodendron
- acc
- divert
- blurred
- fluoro
- fluorocarbons
- provoking
- brandeis
- fift
- readings
- iliad
- mythology
- choo
- scientifically
- grumbled
- unpleasant
- imparting
- cluster
- vicarious
- compromised
- profiles
- telemarketeers
- outcry
- cited
- crashes
- eroded
- erosion
- lockers
- latitudes
- motorists
- liens
- representing
- landlo
- dakotas
- alarmed
- exclusion
- parameters
- interpreted
- adoptive
- carting
- arresting
- interval
- orwell
- tay
- unusually
- leathery
- venture
- wea
- pebbles
- drainage
- deceptive
- fiend
- wrinkled
- oils
- fishermen
- tricycles
- kiddie
- wilds
- calves
- heifer
- jea
- flared
- hep
- themsel
- continuum
- astute
- propagate
- raccoon
- filleted
- livestock
- whiskers
- growling
- widen
- weaker
- ticker
- pentagon
- whomever
- nutrisweet
- bitterness
- ancient
- vets
- complicate
- preregister
- registrations
- eligibility
- preceded
- theodore
- upward
- rascals
- stinks
- precluded
- gullibility
- democracies
- redistricting
- subsidizes
- lineman
- spilled
- camouflage
- booby
- traps
- apocalypse
- influx
- surge
- buckle
- overcome
- castaways
- depicting
- dudley
- bloody
- olden
- realism
- pioneer
- worship
- chri
- videotapes
- shrunk
- eastwood
- showy
- westerns
- cursed
- pointy
- melissa
- gilbert
- idol
- verse
- shep
- immemorial
- misdemeanor
- waving
- prevail
- appoint
- bailiffs
- clerk
- verbalize
- tripled
- cameras
- reporters
- prosecutors
- outweighs
- prosecuted
- sump
- sewage
- towed
- aut
- trad
- marina
- hears
- acclaim
- sequels
- earle
- recluse
- essays
- qu
- conclusions
- photographers
- arro
- gorillas
- sloth
- fascinates
- bottoming
- landers
- tycoon
- bloomed
- fade
- spiky
- bl
- hya
- colossians
- thistles
- landscaper
- junipers
- puny
- foliage
- iris
- fuzzies
- wildflower
- insists
- camcorder
- pastime
- muggings
- grates
- claustrophobic
- tendencies
- deviant
- anguished
- cleaners
- meridian
- inlaws
- sneakers
- jordans
- brains
- caps
- videoed
- repeated
- repetition
- termed
- allowable
- purs
- discretion
- freely
- altering
- preparations
- namely
- minuses
- factored
- competitor
- trevino
- influencing
- wholesome
- exclamations
- sportsman
- phooey
- applicator
- nurseryman
- elm
- circumference
- stubs
- propelled
- pest
- sawed
- rot
- rotter
- autobiography
- liquidating
- emulating
- compu
- ause
- accomplishing
- spacings
- formattings
- insert
- reset
- rewrite
- typesetting
- typeset
- spaces
- compatibles
- adhere
- brochco
- hillstreet
- finale
- nudity
- delight
- shudder
- flabby
- telemarketing
- classification
- lotteries
- kalamazoo
- sinus
- carton
- stakes
- mounts
- hub
- airports
- altitudes
- intermediate
- simp
- fluorides
- guerrilla
- marched
- lied
- expire
- xerox
- modify
- soo
- terminals
- insur
- breakable
- hangouts
- haunts
- southerners
- rudest
- bartenders
- wee
- ferrings
- taiwanese
- jambalaya
- wowed
- univerisity
- arias
- casks
- hospitalization
- hos
- crowns
- fluctuate
- celebr
- inordinate
- axe
- newscast
- js
- recap
- sensationalize
- sensationalized
- asinine
- puzzle
- precede
- preclu
- preclude
- stretches
- wakes
- depreciate
- tru
- unibody
- granddaughters
- gol
- wagging
- trainers
- airheaded
- yappy
- dignified
- culling
- tamper
- innately
- tractable
- selectively
- culled
- belgian
- distinct
- breeds
- kennel
- translates
- shit
- unreliable
- handlers
- indiscriminate
- breeder
- handler
- bab
- doorbell
- stipulation
- laundromat
- grasslands
- surrounds
- betty
- parades
- palestine
- id
- peg
- catalyst
- palestinian
- kindest
- abounding
- kindness
- godly
- compassion
- humanness
- mandarin
- oranges
- grape
- fridge
- gelatin
- carrot
- eggo
- waffles
- adolph
- breakfa
- craftsmanship
- opt
- stanza
- glitters
- oasis
- warp
- clearinghouse
- consolidating
- salespers
- tel
- compan
- announcing
- telepho
- discard
- episodes
- cramp
- vela
- someb
- thirtysomething
- mclaughlin
- yogi
- loner
- comedian
- cantankerous
- echoed
- withdrawal
- grumpy
- stooges
- mouthiest
- kiddos
- mouthy
- touristy
- besieged
- defini
- badgering
- galapagos
- sidney
- adelaide
- chengdu
- quingdao
- retreat
- flights
- rita
- oah
- destitute
- ree
- snorkeling
- prawns
- milli
- arsenal
- traffi
- bennett
- gangsters
- corp
- arr
- pris
- crowding
- statutory
- verbalizing
- stints
- citing
- intensity
- limbaugh
- lamenting
- microwaved
- healthiest
- teases
- accuses
- deprivation
- nourishing
- evaporated
- broil
- marinara
- grapefruit
- starch
- pleasurable
- kalli
- cater
- rodolfo
- royal
- maitre
- pilgrim
- unnatural
- lookout
- arby
- wastes
- reduces
- speedup
- healthily
- sup
- quoting
- disputes
- commas
- reevaluated
- inma
- blinded
- restitution
- willfully
- contradictory
- caveman
- coleslaw
- tablecloths
- bakeries
- regretted
- purch
- pastrami
- '''oeuvre'
- complicat
- sustain
- addressing
- fellowship
- prefers
- troublesome
- camels
- beatle
- orchestration
- okeydoke
- statler
- stated
- debut
- investigating
- bootstraps
- baptisms
- clergy
- imprisoned
- confiscated
- bourgeoisie
- commonality
- recanting
- courtyard
- motions
- commandant
- escaped
- perseverance
- bureauc
- persecuted
- dab
- chorus
- mothering
- rerate
- precluding
- analogy
- spade
- marketeer
- warring
- peacefully
- trampling
- fantas
- crabby
- coated
- willis
- sarandon
- gena
- vatican
- paradeso
- befriends
- friendship
- califor
- drying
- nippy
- mucky
- thunderstormed
- shoveling
- michelle
- lan
- footnoting
- retype
- appetizer
- criterion
- alumnae
- heavyset
- poignant
- subtleties
- gore
- warlock
- omelet
- characterizing
- conceited
- portay
- goer
- prosecu
- cutor
- struggles
- flowing
- ir
- slicing
- locust
- omar
- swallowed
- redwood
- brownstone
- caulking
- myneer
- spacious
- inhaled
- revived
- airway
- revive
- sol
- dignity
- luxurious
- blossoming
- brazos
- sleeps
- purdis
- sandlin
- quake
- mak
- caramelized
- customary
- orchard
- accor
- ply
- crier
- waistline
- jewels
- earhart
- thurow
- perceptive
- pinpointing
- flimflam
- hughes
- assis
- plod
- rereading
- ditched
- findings
- bonfire
- vanities
- temporally
- burdened
- cafeterias
- linen
- napkins
- duplexes
- hodgkin
- undergoing
- interim
- constancy
- sufficiently
- farfetched
- wheeler
- cock
- slowing
- pals
- unjudgmental
- homy
- reprimand
- secrets
- brooksville
- campuses
- eyesight
- enrichment
- schooled
- rejection
- proceed
- herman
- foreigners
- polluter
- rigs
- busses
- incinerate
- pollutant
- untold
- cockroach
- accelerated
- nutrients
- sponges
- tending
- newark
- vividly
- entrance
- biggies
- consumable
- calculation
- physiology
- snowball
- dieters
- robbers
- trendsetters
- correspond
- circulates
- centralize
- descendancy
- closeness
- caliber
- differentiate
- stevens
- shippensburg
- specializes
- novelist
- intricately
- johann
- sebastian
- copyright
- compile
- poems
- baudelaire
- jennie
- abridged
- reunited
- rituals
- equated
- communion
- repetitively
- vernon
- salmonella
- silverware
- caterer
- biographer
- obituaries
- succeeded
- vigor
- bulletins
- chorals
- beginner
- violinist
- percussion
- accompany
- choruses
- audition
- verdi
- hermit
- vacationed
- anonymous
- whirlwinded
- effortlessly
- elicited
- unwound
- guadalupe
- penetrates
- alda
- burt
- reynolds
- vignettes
- dinosaurs
- robots
- satur
- sniping
- howling
- gleason
- snippets
- idle
- workshop
- gra
- dividing
- moses
- hab
- scavenge
- conserve
- indulgent
- exceptions
- contemplate
- permitting
- calming
- aboard
- docks
- cozumel
- ocho
- rios
- jurisdictions
- tapping
- lynda
- slandered
- landslide
- thornburg
- landslided
- characteristically
- savory
- petition
- resisted
- dirtier
- muddier
- sensibilities
- transpired
- nixon
- edible
- accumulating
- elbow
- cho
- grandes
- refried
- katy
- avocados
- avocado
- coolwhip
- horseshoes
- auctions
- sidelines
- loosely
- socioeconomic
- tracked
- pressured
- vandalism
- outward
- custodial
- skyline
- irritable
- unattended
- environments
- dunked
- compaq
- honk
- prodigy
- mush
- shareware
- paradox
- shooter
- crawford
- andrew
- webber
- paranoid
- unlucky
- anonymously
- competency
- wholesale
- lon
- exa
- beginnings
- kuenzer
- rebelled
- debtor
- angela
- eyeglasses
- indiv
- staffing
- examines
- optometrist
- ophthalmologist
- extractions
- publication
- unfeasible
- bettle
- orthodontal
- outsor
- roo
- suite
- scattering
- leniency
- underhanded
- perpetrator
- injustices
- wherein
- dist
- unsavory
- elimi
- rarity
- chairmen
- ministers
- congregations
- catholicism
- forthright
- disorders
- soothe
- exertion
- characteristic
- cram
- guarded
- sacrificing
- mediators
- interpersonal
- mediator
- doable
- devised
- stimulations
- goof
- whipping
- nickie
- snail
- hards
- futuristically
- subjective
- harmony
- impregnated
- challenges
- motherly
- competent
- militaristic
- colonel
- infantry
- embrey
- reynold
- riddle
- aeronautical
- pratt
- whitney
- daphne
- dictated
- qualifying
- rhodes
- scholars
- homogeneous
- realities
- socialization
- insular
- sheriffs
- evict
- continuances
- abundantly
- appealing
- retried
- lowers
- percep
- gypped
- slicker
- bruno
- kirby
- chauvinistic
- punching
- correlations
- opium
- dens
- weakened
- duress
- drunken
- induced
- legalized
- quantify
- deg
- safeguards
- fraction
- oath
- sensings
- sentencings
- pertains
- introduction
- accordance
- clark
- parachute
- presiding
- reorganizing
- sweeper
- univerty
- versity
- lakeway
- expose
- jun
- bethany
- unfocused
- midst
- instigated
- marrie
- remained
- tomorr
- whitmore
- arbor
- slushy
- sled
- icy
- lingering
- exodus
- eternally
- snowfall
- grassy
- sachse
- goddard
- stickler
- mulcher
- seni
- antisocial
- adapting
- deteriorates
- glimpse
- unwilling
- appalachia
- stopgap
- rougher
- strategic
- fails
- worded
- peoria
- dropouts
- insecure
- scaring
- stylish
- interpretive
- fathom
- expanding
- wean
- referrals
- advisory
- myrtle
- barricaded
- blackberry
- defeats
- enchila
- boiled
- toasted
- calorie
- hereditary
- headstart
- preschooler
- tacos
- tamales
- romanian
- backfires
- waiters
- batty
- momo
- colter
- pas
- campari
- adventured
- souper
- prey
- backlogged
- patrolled
- frus
- imme
- dialogue
- aisles
- cornball
- overacted
- applauding
- waterskiing
- ashley
- jamie
- warner
- deanna
- cheeks
- backdraft
- berry
- raspberries
- shaved
- entrees
- accompaniments
- gershwin
- puree
- antipollution
- gases
- accumulates
- groundwater
- fusion
- optimistic
- pessimistic
- reconvicted
- sicko
- merciful
- cannibalism
- hunch
- coordinate
- communicable
- memos
- orchestral
- fiddler
- oboe
- classy
- corresponds
- christening
- elijah
- marches
- poinsettias
- bouncy
- haunting
- conventional
- disposal
- odors
- throwaway
- ditches
- drinkers
- churn
- shipwrecked
- explodes
- maims
- sylvester
- mermaid
- outfitted
- crushing
- hobnail
- phobia
- bifocers
- trifocals
- mccalls
- byte
- afflicted
- exceeded
- antibody
- realm
- telethons
- doling
- receives
- ociety
- aesthetic
- enhancing
- frightens
- dahmer
- burglary
- enquirer
- cranks
- fuzz
- repala
- sil
- shiny
- heartbeat
- spins
- rainbow
- packaged
- trespass
- tidbit
- refrozen
- cheesecakes
- refreeze
- liabilities
- wrecks
- tattoos
- speedboats
- chambers
- afloat
- maneuvers
- stormy
- nibble
- rope
- entice
- sneaking
- paged
- favo
- flyer
- shaky
- iffy
- sentra
- subdued
- urinalysis
- bums
- overdress
- overkill
- businesslike
- nylons
- nutrisystem
- dreaded
- toppers
- ceramics
- seamstress
- cramped
- negligent
- initiates
- squeegees
- newscasters
- postponed
- a1
- alfredo
- clowning
- circuits
- sfuzzi
- copeland
- transported
- thirteenth
- wobbly
- bookends
- jug
- viscosity
- saver
- brushed
- tooken
- turpentine
- towels
- shi
- jul
- shindig
- boulevard
- maizeland
- skier
- minnie
- canaveral
- reschedule
- hilton
- eighteenth
- raton
- '287'
- '70'
- broadmoor
- breckenridge
- trinidad
- '25'
- hexpired
- disheartening
- elders
- albertson
- limbs
- sodas
- arranged
- brookshires
- pickle
- piles
- emporium
- cinch
- consolidate
- alluring
- cupcake
- henpecked
- instilled
- gatherings
- subtracts
- debits
- incidentals
- scotch
- igloos
- strateg
- strategically
- incurred
- cashes
- reunio
- entryway
- roaming
- ris
- risen
- appraisal
- disoriented
- blissful
- unexpectedly
- cockroaches
- complacent
- bitterly
- polling
- campaigning
- napping
- structuring
- digested
- perfumes
- geese
- peaked
- balloon
- canyons
- weatherwise
- sleet
- maps
- sy
- pearls
- loafers
- distinguishes
- '1200'
- whereby
- extract
- generates
- bursts
- navc
- blazey
- obscure
- promotes
- goe
- refrigerate
- tartness
- raspberry
- connoisseur
- tastings
- mesina
- exorbitant
- kaiser
- mccullum
- catastrophic
- implants
- transplants
- howe
- dislikes
- chopin
- expresses
- discussions
- chords
- panicking
- kielbasa
- bak
- ravioli
- reggae
- twangy
- agr
- cackle
- atteck
- scholar
- adolf
- imaginative
- sty
- antiques
- winnie
- pooh
- grimm
- fairy
- tales
- gentlest
- jewel
- restroom
- spitz
- extravagant
- overpass
- littering
- timers
- tans
- mauve
- distantly
- swap
- bichons
- barks
- hind
- origina
- bernards
- lega
- belittling
- liberals
- suppos
- tcat
- examination
- clicker
- screens
- carpooled
- bolivia
- sundresses
- polyester
- overheat
- sweltering
- newborn
- pleats
- absent
- strep
- bookkeeper
- partitions
- duality
- extenuating
- newsworthy
- leafing
- mccall
- subscribing
- gott
- newsy
- putterer
- caladiums
- hardened
- semitropical
- carrollton
- architecture
- hairless
- coon
- manx
- tame
- ships
- folklore
- faint
- chincoteague
- burgers
- teriyaki
- shakes
- grandy
- fend
- snowballed
- inconveniences
- woozy
- sys
- squirt
- flicking
- whales
- showtime
- adder
- dragon
- rosa
- sorrento
- dine
- mah
- jongg
- yearbook
- imprinted
- depreciated
- cribs
- bestes
- giver
- enables
- ly
- confining
- bronco
- moder
- cowb
- cheer
- schnauzers
- dachshund
- starved
- curled
- skittish
- spaying
- belon
- severing
- sr
- suicidal
- craziness
- mistrust
- lacks
- poland
- weeding
- mankind
- uninsurable
- medcenter
- hearings
- overstaffed
- mortgages
- outlaid
- intergovernmental
- plugging
- indepth
- capsize
- sensationalism
- blase
- sel
- sadist
- oleo
- oregano
- ight
- semolina
- absorbs
- vulnerable
- align
- bombings
- aligned
- tensions
- forceful
- cr
- expedited
- deserving
- mandate
- grassroots
- introspective
- schoo
- visitation
- advantaged
- energies
- tiananmen
- custodians
- immigrated
- brightest
- burst
- lanes
- winterized
- yourselfer
- representatives
- homemaking
- accessed
- uzi
- flyswatter
- utilized
- acquiring
- illicit
- gatlinburg
- cosa
- hiked
- ardmore
- cloud
- ledges
- hyatt
- gully
- trench
- tenkiller
- enlisting
- seductive
- pinion
- totality
- revealed
- legislat
- abrupt
- ruder
- arrives
- '1'
- microcomputers
- gateway
- apollo
- faulkner
- emblem
- candice
- bergen
- ghosts
- haunted
- dianetics
- gibberish
- broudigan
- journeys
- mailman
- karl
- malone
- hacking
- fillmont
- generically
- cyclist
- techy
- hackers
- davy
- crockett
- sailor
- sailed
- mck
- equalize
- semiretired
- dementia
- insisted
- rejuvenating
- coldest
- cus
- celltrex
- jeri
- maceo
- rampages
- cocoons
- occa
- uniqueness
- winfrey
- prebuilt
- workbench
- subcontracted
- subbed
- scramble
- championships
- peacefulness
- birdie
- quadruple
- whizzing
- spectators
- scrambles
- kerr
- mcgee
- infrared
- suffice
- notifies
- supplying
- angles
- anticrime
- outings
- sec
- arlene
- lister
- poked
- togethers
- dearly
- swoosh
- skate
- begonias
- destruct
- concessions
- drizzly
- huddled
- cages
- fanatics
- straightforward
- piston
- oiling
- altog
- reelection
- provisional
- locate
- incomewise
- ifs
- ands
- buts
- '4'
- hel
- discontinue
- narrowing
- nitty
- gritty
- faithful
- shoppers
- yourselves
- straighten
- stems
- relating
- supporters
- antisupporters
- contras
- dictator
- fascist
- siesta
- mouths
- reflecting
- dabble
- chalk
- chesapeake
- suspended
- ath
- tutored
- goofing
- piney
- diameter
- calmness
- outwitting
- shiners
- infla
- inflatable
- raft
- cottonmouth
- coves
- walkie
- talkies
- handcrafted
- semifixed
- automated
- crafted
- stateside
- adage
- advising
- embarrassment
- jessie
- helms
- intelligently
- mistreated
- papa
- doc
- tyrant
- puberty
- tibby
- perfumed
- legendary
- brookies
- rainbows
- accommodated
- specialists
- replanted
- rods
- norfolk
- portsmouth
- hikes
- pests
- chaperon
- calloway
- variegated
- beetles
- borderline
- zaps
- ligustrum
- apron
- gourds
- bolton
- symphonies
- caller
- sax
- houseful
- crabs
- sensation
- tingling
- oddball
- waitressing
- crunches
- relevance
- federally
- hogs
- barns
- revealing
- horticultural
- groundskeepers
- dormant
- centipede
- crops
- behold
- cuttings
- mit
- diamante
- boozier
- passengers
- shining
- becca
- nina
- palmer
- remarrying
- griffins
- crackers
- burritos
- debone
- notoriety
- jurisprudence
- thoroughfare
- sleeper
- herd
- cima
- savages
- plywood
- beams
- migrate
- undercover
- barbiturates
- codeine
- drixoral
- unsolved
- mcgillis
- weeknights
- physicist
- facet
- hurst
- greensboro
- celebrities
- repeaters
- zealand
- statistically
- outbound
- astronomy
- gallagher
- pictured
- betters
- hubble
- telescope
- planets
- habitable
- backers
- zippers
- snaps
- dull
- pretechnology
- shelled
- duplicates
- regulat
- regulators
- regulator
- lever
- pulley
- chev
- oi
- resur
- ourse
- hesitating
- russ
- noons
- flaw
- gasket
- fury
- exceptionally
- surfaced
- repeatedly
- escapes
- pragmatic
- consti
- opponents
- laural
- squeaked
- andrews
- clou
- crept
- firewood
- maples
- dogwoods
- lowell
- unu
- periodicals
- historic
- interes
- lawful
- scanners
- attempted
- thoroughness
- mag
- announcers
- tele
- ivan
- rodriguez
- ballplayers
- routing
- enthusiast
- ducted
- gettin
- brussels
- sprouts
- kale
- pony
- grazing
- pears
- extinguishers
- depleter
- extinguisher
- timed
- contaminants
- probe
- ionization
- miller
- temptation
- squareness
- buckles
- fea
- lettering
- vin
- vinyl
- balloons
- recy
- commented
- nudge
- decomposable
- flips
- emptying
- regressive
- defen
- kate
- curves
- raphael
- atchafalaya
- sausa
- alvarez
- applebee
- nonstructured
- torture
- nur
- fai
- glorious
- esoteric
- producer
- hairspray
- batch
- partic
- preteen
- unlikely
- dynamic
- raunchy
- horrifyingly
- poppins
- differed
- eclipses
- belie
- lebaron
- peeling
- gears
- oklahoman
- beatings
- proy
- condoms
- stupidity
- truthful
- faded
- marker
- reflective
- adheres
- sealing
- dings
- variance
- prop
- pressuring
- primed
- bragging
- sickening
- shitty
- drags
- burners
- putts
- teeing
- lodging
- dialers
- provision
- specify
- dialing
- prised
- weir
- overloads
- hoosiers
- crossing
- delancey
- thrillers
- backless
- ani
- nick
- nite
- dragnet
- bald
- marlo
- collier
- brigham
- estonia
- agriculture
- foodwise
- rioting
- secede
- proportionately
- hinders
- tubs
- brougham
- trunks
- shy
- gadgetry
- '6'
- interiors
- veered
- revolving
- reverting
- envy
- exhausts
- hairy
- gettingest
- daught
- bertinelli
- dysfunctional
- childfaring
- miracles
- bette
- midler
- redbook
- previewing
- postage
- unauthorized
- mayors
- discredit
- ps
- productions
- chariots
- gladiator
- fluent
- batches
- subtitle
- subtitled
- gems
- supernatural
- accusing
- migh
- mondays
- thrust
- lifters
- drills
- rocking
- referee
- abrasive
- maintaining
- posed
- refusing
- coins
- conversions
- dormitory
- unused
- ramp
- hydraulic
- disposer
- escapement
- incorporating
- leonard
- nimoy
- trekkie
- luke
- spock
- mccoy
- admiral
- hobbled
- vulcans
- doohan
- scotty
- addams
- averaging
- decrease
- munich
- snows
- chattanooga
- lori
- coldness
- membered
- unemp
- fetus
- complications
- slobs
- equation
- nameless
- malformed
- sincere
- deliberations
- dismissed
- indicted
- revenge
- subsequent
- provoked
- provocation
- qualifies
- mitigating
- contender
- linguini
- hawaiian
- luau
- angie
- shellfish
- clam
- cheeses
- nachos
- resurrection
- lutheran
- scanned
- cooperating
- toss
- inmate
- interpretation
- blanks
- executioner
- bamorghini
- skyhawk
- dominican
- nantes
- castles
- vineyard
- consignment
- goodwill
- crushes
- sewer
- res
- unoccupied
- assassinated
- menace
- perspec
- relativity
- vantage
- weighted
- reflect
- subservient
- integration
- ith
- frien
- drudgery
- montpe
- mont
- monteplier
- montpelier
- everett
- yack
- tromping
- unlimited
- wedge
- fairway
- flus
- startling
- '286'
- turret
- scien
- simulators
- plugged
- upgrades
- custer
- '386'
- trenches
- trencher
- stunt
- cul
- sac
- rearranged
- clancy
- novell
- netware
- ark
- ladonna
- peck
- bourne
- ultimatum
- enveloped
- amsterdam
- holland
- harpsichordist
- forte
- warrington
- cheating
- harry
- heroic
- mayfield
- corrupts
- lig
- hatteras
- imaging
- legalese
- himsnelf
- koop
- scarcity
- highland
- jogs
- gyms
- inequities
- stimulate
- deductor
- bentsen
- drunks
- lafferty
- infringe
- snuffed
- snuff
- compares
- gilmore
- accomplishes
- william
- thrice
- mating
- sows
- suckling
- hernia
- carcass
- cloves
- pineapples
- cranberries
- hominy
- barb
- automatics
- avis
- crashed
- lens
- porsche
- turbo
- carrera
- mys
- mushrooming
- percentagewise
- folderol
- lifeguard
- jarring
- flui
- watchers
- pokes
- blamed
- ceases
- intravenous
- cell
- quests
- subsidies
- slashed
- entitlement
- trades
- beauticians
- unending
- spiral
- consumers
- unf
- ailments
- magerick
- celtic
- transplanted
- rolando
- harper
- plaint
- straighter
- dayer
- plumbed
- bolted
- logan
- accredited
- professorship
- distressing
- fiel
- treasury
- refunds
- halt
- spying
- scaled
- loading
- challenger
- stat
- mirv
- roomy
- cargo
- recommends
- volvos
- wagons
- conscientiously
- emiss
- hypothesize
- muncie
- terre
- haute
- triggering
- verify
- drivable
- emerges
- overgrazed
- reclaimed
- prettiest
- palm
- paintbrush
- septic
- hummingbirds
- hummingbird
- pooped
- annuals
- countrified
- supermarket
- coaster
- afterburners
- gliding
- oomph
- subs
- gambled
- insulating
- spec
- verandas
- genes
- drapes
- guppies
- platies
- fishies
- glacier
- playgrounds
- wilderness
- scaries
- rayburn
- curling
- nominal
- fulfill
- synagogue
- geriatrics
- app
- degenerative
- communiky
- enhance
- assist
- text
- biogra
- daniels
- prince
- phillip
- criticizing
- miniseries
- scarlett
- spectacular
- torrents
- ligh
- horizontally
- arid
- crisp
- sleigh
- brighton
- springtime
- skie
- hammered
- subtly
- brianna
- lib
- submerged
- loosening
- leaks
- tar
- gravel
- plastered
- drywalled
- plastering
- terri
- exasperating
- swelling
- squirming
- swells
- shrinks
- retains
- highlight
- captive
- legos
- technic
- lego
- stare
- engagements
- sousa
- refreshments
- rehearsal
- donations
- municipal
- conduct
- nitny
- altoona
- lockhaven
- nighttimes
- ama
- emerson
- maceboast
- circuitry
- vacationer
- wausau
- unduly
- sunglasses
- grip
- durable
- faulty
- recliner
- pinto
- sequoias
- redwoods
- bryce
- tetons
- sequoia
- driveways
- snowmen
- snowballs
- marketed
- acceleration
- suspension
- lumbar
- sma
- bur
- skyrocketing
- govern
- exclude
- ballgame
- warrant
- rounds
- brats
- eff
- nativity
- facings
- casings
- relieve
- strase
- reliever
- relieving
- sander
- cabinet
- equipments
- dado
- rotary
- sicknesses
- bryan
- mamas
- packards
- solburns
- frown
- niggardly
- chintzy
- megs
- mirroring
- epidemic
- immunizations
- rays
- mumps
- rubella
- inaccuracy
- defined
- issued
- hypocritical
- stings
- laundering
- contr
- governed
- discomfort
- stea
- holster
- spontaneous
- headquarters
- bitterest
- fluctuations
- texts
- doen
- rosie
- '''neil'
- thomases
- trimmer
- clump
- tithing
- homeowner
- computerization
- stale
- subroutine
- libra
- clara
- beastie
- triggered
- pledged
- fren
- ally
- organi
- trombone
- weathers
- facetious
- directors
- spells
- compulsive
- childr
- fluffs
- toppings
- brea
- torque
- underdrive
- sportier
- beetle
- coolers
- bonneville
- secondaries
- quadrajet
- compulsion
- elevation
- variations
- hilltops
- mines
- hamster
- cruelty
- parakeet
- parakreet
- burmese
- deactivated
- infatuated
- jobbies
- visualize
- boggling
- slid
- clamped
- kisses
- everywh
- brag
- gramm
- overturning
- renegotiate
- kickbacks
- valdez
- defi
- batted
- hangs
- threats
- emit
- che
- churning
- remembrance
- networking
- conformance
- wyatt
- extremey
- bennigan
- vincent
- chefalia
- whataburger
- zillion
- mercado
- juarez
- tallest
- ewaldes
- cont
- stoneleigh
- chews
- yapping
- collies
- roughest
- hollered
- battling
- obedience
- squats
- vaca
- pilgrims
- medieval
- relics
- bemerton
- newness
- turin
- muffins
- requests
- helman
- tart
- zing
- cele
- layering
- fluffier
- joins
- jennifer
- unselfish
- tutoring
- affiliated
- aimlessly
- perky
- shins
- hyper
- burdensome
- earphones
- timbuktu
- onna
- lieutenant
- biologist
- sliding
- tremors
- variedly
- bakers
- aprons
- sweatshirt
- wigs
- lamb
- bunnies
- symbols
- milky
- polytechnochloride
- mought
- trashmore
- lifts
- riverview
- tranged
- strongest
- recessionary
- stagnate
- unteachable
- prominent
- chide
- remaining
- backbone
- newborns
- fullest
- firewh
- daffodil
- jung
- aquinas
- libretto
- rossini
- mahler
- dutchen
- trumpets
- elixir
- floated
- swapped
- tyme
- tempco
- trooper
- gisland
- carribean
- unpacking
- lotto
- alcatraz
- hairdresser
- crui
- janice
- furry
- eaves
- rafter
- cactuses
- furrows
- wrung
- plink
- construe
- thinkings
- bue
- buechele
- grieves
- gullible
- manufactures
- borden
- bib
- overalls
- oshman
- evaluated
- unfor
- linguistic
- austria
- niagara
- coasts
- carolinas
- leisurely
- modesto
- cheeseburgers
- incapable
- hygienic
- inoperable
- oxygen
- banish
- relocated
- realtor
- listings
- precautions
- integrate
- cooperatives
- reallocate
- reorganize
- accelerate
- transient
- commish
- tenderhearted
- galaxies
- crud
- mutations
- feazure
- ballooned
- reclamation
- merits
- axiom
- fiends
- sensitivity
- aboveboard
- evaluating
- veggies
- unarmed
- resembling
- tallow
- scalloped
- weighing
- strap
- squeaker
- closing
- mullin
- squeakers
- marquee
- bluish
- hydrogen
- sulfide
- h2s
- ramps
- vaccine
- preventable
- syringes
- needles
- feared
- ruf
- riffraff
- haves
- nots
- earhout
- bulletproof
- vest
- hedge
- tollbooth
- hatcher
- taverns
- sailboats
- ancle
- lounge
- cocktail
- sailer
- cruiser
- hull
- spars
- rigging
- gusts
- wearisome
- flaky
- markups
- arming
- stra
- quail
- swedish
- munch
- intermission
- doughy
- frosts
- iceberg
- schoolteacher
- altrusa
- upholstery
- garl
- jupiter
- musically
- auditions
- repertory
- outlet
- auditory
- lear
- educationally
- verified
- chording
- pianist
- min
- ec
- subbranch
- emigrated
- beware
- entrepreneurial
- ventures
- banked
- stored
- footsteps
- postcards
- notify
- notifying
- steals
- hides
- subsequently
- corrective
- leers
- downright
- outright
- shu
- newest
- apathetic
- absol
- prolong
- roofing
- retool
- zigzag
- kan
- untalented
- washed
- salvageable
- gluing
- feds
- interrupting
- faults
- caucasian
- educ
- thei
- officed
- deputy
- pruned
- gladiolas
- amaryllis
- conf
- plantings
- sprout
- narcissus
- psychic
- rerun
- activate
- rusted
- rusts
- fenders
- repainted
- acco
- dreary
- expen
- salting
- weinstocks
- wad
- hilt
- dolphene
- feelt
- throwed
- wheelchairs
- emjoy
- anheimer
- tela
- kindly
- innovated
- endeavors
- adam
- particulars
- abusive
- evolutionary
- duplication
- imagers
- allocate
- optimally
- squawk
- evolution
- insurers
- entity
- burnable
- ticketed
- charities
- braved
- suede
- cardigan
- appointments
- unlined
- toasty
- lightweight
- fireplaces
- dense
- ethanol
- smokestacks
- mowers
- wedded
- organism
- nutritionally
- bamba
- szechuan
- pancho
- binders
- assignments
- developments
- cashew
- avoiding
- suey
- disburse
- squeeze
- sq
- faculties
- pauper
- brokerage
- anticipation
- cherished
- commodity
- famuel
- slopes
- biness
- furlough
- promoted
- nec
- shasta
- salmon
- sk
- walleye
- fighters
- fillet
- foil
- seekers
- scrutiny
- tarrant
- bobsy
- accu
- smiled
- growled
- mistrials
- railroaded
- convalescent
- unsettling
- senile
- graying
- exercisings
- unaffordable
- restricts
- casse
- gabrielli
- bankrupted
- cello
- viola
- composers
- boutiques
- darling
- chanting
- canseco
- ramming
- vinny
- utility
- outweighing
- sundance
- smithsonian
- crosswords
- planners
- artists
- bazo
- faron
- spiro
- gyro
- dulcimer
- jarreau
- contorted
- bonnie
- rait
- grammy
- unedu
- sprayer
- routers
- cookie
- varnish
- smoother
- hayloft
- franklin
- gradual
- increasement
- torpedoed
- downside
- blythe
- tonkin
- macintoshes
- graphical
- multitasking
- gestures
- vocabulary
- compilers
- consultation
- interactive
- discriminating
- correlate
- funnest
- gentler
- panicked
- sassy
- westmin
- westminster
- infra
- mondale
- situa
- circuses
- disrepair
- dashboard
- ce
- beefing
- patrols
- visibility
- lifted
- cumberland
- cobb
- thefts
- superficial
- cracked
- electrically
- manufactured
- bordering
- elects
- aerodyne
- aerob
- brace
- publicize
- killings
- duri
- commentators
- blurbs
- bog
- dur
- countdown
- newscasts
- unreasonable
- moderator
- unorganized
- moderated
- assumingly
- importers
- dahlmer
- ohi
- nightmarish
- withheld
- sovereign
- martial
- puritanical
- permissible
- acquitting
- acquit
- impaneling
- dismissing
- foreman
- deliberating
- una
- restate
- unannounced
- sweep
- definitive
- bodily
- behaviors
- enters
- privacies
- melanie
- spry
- announcements
- anson
- fayetteville
- waynesboro
- delinquency
- fre
- gainfully
- tremen
- thriving
- towar
- grit
- pail
- latent
- compression
- ovens
- armor
- fierce
- finagle
- nationalizing
- cutoff
- operat
- unionized
- distinction
- institutionally
- expedient
- innovativeness
- expedi
- unequal
- plaintiff
- novices
- bets
- leaky
- luby
- taping
- promo
- blurb
- mutt
- hooper
- veterin
- spay
- neuter
- frie
- shorties
- decreased
- unrestricted
- glut
- magnum
- rushes
- oper
- preset
- styro
- frank
- shocks
- allot
- frowned
- chronicle
- analytical
- abnormality
- overwhelmingly
- academia
- descriptions
- addictive
- reevaluate
- divvy
- allocated
- psy
- psychedelic
- crosby
- stills
- performers
- secular
- druggie
- shipping
- maximize
- actuall
- revelation
- polymers
- roadways
- hoop
- funn
- heavenly
- retailers
- induce
- inducement
- recycler
- saskatoon
- welfor
- employing
- deposits
- arithmetic
- sums
- colleague
- internet
- infusions
- incurring
- surveying
- assesses
- footloose
- smattering
- greetings
- snobby
- paled
- refrained
- acute
- indivigal
- thrives
- categorized
- receptionist
- lar
- curve
- critter
- incumbent
- entrenched
- standardizing
- <sos/eos>
init: null
input_size: null
ctc_conf:
dropout_rate: 0.0
ctc_type: builtin
reduce: true
ignore_nan_grad: true
joint_net_conf: null
model_conf:
ctc_weight: 0.3
lsm_weight: 0.1
length_normalized_loss: false
extract_feats_in_collect_stats: false
use_preprocessor: true
token_type: word
bpemodel: null
non_linguistic_symbols: null
cleaner: null
g2p: null
speech_volume_normalize: null
rir_scp: null
rir_apply_prob: 1.0
noise_scp: null
noise_apply_prob: 1.0
noise_db_range: '13_15'
frontend: s3prl
frontend_conf:
frontend_conf:
upstream: wav2vec2_large_ll60k
download_dir: ./hub
multilayer_feature: true
fs: 16k
specaug: specaug
specaug_conf:
apply_time_warp: true
time_warp_window: 5
time_warp_mode: bicubic
apply_freq_mask: true
freq_mask_width_range:
- 0
- 27
num_freq_mask: 2
apply_time_mask: true
time_mask_width_ratio_range:
- 0.0
- 0.05
num_time_mask: 2
normalize: utterance_mvn
normalize_conf: {}
preencoder: linear
preencoder_conf:
input_size: 1024
output_size: 80
encoder: conformer
encoder_conf:
output_size: 256
attention_heads: 4
linear_units: 1024
num_blocks: 12
dropout_rate: 0.1
positional_dropout_rate: 0.1
attention_dropout_rate: 0.1
input_layer: conv2d2
normalize_before: true
macaron_style: true
rel_pos_type: latest
pos_enc_layer_type: rel_pos
selfattention_layer_type: rel_selfattn
activation_type: swish
use_cnn_module: true
cnn_module_kernel: 31
postencoder: null
postencoder_conf: {}
decoder: transformer
decoder_conf:
attention_heads: 4
linear_units: 2048
num_blocks: 6
dropout_rate: 0.1
positional_dropout_rate: 0.1
self_attention_dropout_rate: 0.1
src_attention_dropout_rate: 0.1
required:
- output_dir
- token_list
version: 0.10.7a1
distributed: true
```
</details>
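A minimal decoding sketch with the `espnet2` inference API, assuming `espnet` and `espnet_model_zoo` are installed; the model tag below is a placeholder for this repository's id, and `ctc_weight` mirrors `model_conf.ctc_weight` in the config above:

```python
import soundfile as sf
from espnet2.bin.asr_inference import Speech2Text

# Placeholder tag: substitute this repository's actual model id.
speech2text = Speech2Text.from_pretrained(
    "<this-repo-id>",
    ctc_weight=0.3,
)

# 16 kHz mono audio, matching `fs: 16k` in the frontend config.
speech, rate = sf.read("utterance.wav")
text, tokens, token_ids, hyp = speech2text(speech)[0]
print(text)
```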
### Citing ESPnet
```BibTex
@inproceedings{watanabe2018espnet,
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
title={{ESPnet}: End-to-End Speech Processing Toolkit},
year={2018},
booktitle={Proceedings of Interspeech},
pages={2207--2211},
doi={10.21437/Interspeech.2018-1456},
url={http://dx.doi.org/10.21437/Interspeech.2018-1456}
}
```
or arXiv:
```bibtex
@misc{watanabe2018espnet,
title={ESPnet: End-to-End Speech Processing Toolkit},
author={Shinji Watanabe and Takaaki Hori and Shigeki Karita and Tomoki Hayashi and Jiro Nishitoba and Yuya Unno and Nelson Yalta and Jahn Heymann and Matthew Wiesner and Nanxin Chen and Adithya Renduchintala and Tsubasa Ochiai},
year={2018},
eprint={1804.00015},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
8b1dd5e7feb1362a44c13a6b7027e781
|
BSC-LT/roberta-base-bne-capitel-pos
|
BSC-LT
|
roberta
| 9 | 1 |
transformers
| 3 |
token-classification
| true | false | false |
apache-2.0
|
['es']
|
['bne', 'capitel']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['national library of spain', 'spanish', 'bne', 'capitel', 'pos']
| false | true | true | 1,662 | false |
**⚠️NOTICE⚠️: THIS MODEL HAS BEEN MOVED TO THE FOLLOWING URL AND WILL SOON BE REMOVED:** https://huggingface.co/PlanTL-GOB-ES/roberta-base-bne-capitel-pos
# Spanish RoBERTa-base trained on BNE finetuned for CAPITEL Part of Speech (POS) dataset
RoBERTa-base-bne is a transformer-based masked language model for the Spanish language. It is based on the [RoBERTa](https://arxiv.org/abs/1907.11692) base model and has been pre-trained using the largest Spanish corpus known to date, with a total of 570GB of clean and deduplicated text processed for this work, compiled from the web crawls performed by the [National Library of Spain (Biblioteca Nacional de España)](http://www.bne.es/en/Inicio/index.html) from 2009 to 2019.
Original pre-trained model can be found here: https://huggingface.co/BSC-TeMU/roberta-base-bne
## Dataset
The dataset used is the one from the [CAPITEL competition at IberLEF 2020](https://sites.google.com/view/capitel2020) (sub-task 2).
## Evaluation and results
F1 Score: 0.9846 (average of 5 runs).
For evaluation details visit our [GitHub repository](https://github.com/PlanTL-SANIDAD/lm-spanish).
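## Example usage
A minimal tagging sketch with the `token-classification` pipeline; the Spanish sentence is illustrative:
```python
from transformers import pipeline

# Load the fine-tuned POS model and tag an example sentence.
pos_tagger = pipeline("token-classification", model="BSC-LT/roberta-base-bne-capitel-pos")
print(pos_tagger("El gato duerme en el sofá."))
```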
## Citing
Check out our paper for all the details: https://arxiv.org/abs/2107.07253
```
@misc{gutierrezfandino2021spanish,
title={Spanish Language Models},
author={Asier Gutiérrez-Fandiño and Jordi Armengol-Estapé and Marc Pàmies and Joan Llop-Palao and Joaquín Silveira-Ocampo and Casimiro Pio Carrino and Aitor Gonzalez-Agirre and Carme Armentano-Oller and Carlos Rodriguez-Penagos and Marta Villegas},
year={2021},
eprint={2107.07253},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
899034236ad2d1c4f87c3a183d512cb0
|
aytugkaya/python-gpt2-large-issues-128
|
aytugkaya
|
bert
| 11 | 0 |
transformers
| 0 |
fill-mask
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,936 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# python-gpt2-large-issues-128
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2286
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 16
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.9843 | 1.0 | 1163 | 1.6715 |
| 1.5713 | 2.0 | 2326 | 1.4301 |
| 1.4226 | 3.0 | 3489 | 1.3808 |
| 1.332 | 4.0 | 4652 | 1.3806 |
| 1.2708 | 5.0 | 5815 | 1.2737 |
| 1.2089 | 6.0 | 6978 | 1.2354 |
| 1.167 | 7.0 | 8141 | 1.2250 |
| 1.126 | 8.0 | 9304 | 1.2262 |
| 1.0846 | 9.0 | 10467 | 1.1891 |
| 1.0647 | 10.0 | 11630 | 1.2263 |
| 1.0301 | 11.0 | 12793 | 1.1383 |
| 1.0054 | 12.0 | 13956 | 1.0922 |
| 0.9714 | 13.0 | 15119 | 1.1141 |
| 0.9713 | 14.0 | 16282 | 1.1614 |
| 0.9362 | 15.0 | 17445 | 1.0753 |
| 0.9382 | 16.0 | 18608 | 1.2286 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.10.2+cu102
- Datasets 1.18.3
- Tokenizers 0.11.6
|
44890dfc321fade478a7291843912478
|
l3cube-pune/hing-gpt-devanagari
|
l3cube-pune
|
gpt2
| 7 | 3 |
transformers
| 0 |
text-generation
| true | false | false |
cc-by-4.0
|
['hi', 'en', 'multilingual']
|
['L3Cube-HingCorpus']
| null | 1 | 0 | 1 | 0 | 0 | 0 | 0 |
['hi', 'en', 'codemix']
| false | true | true | 898 | false |
## HingGPT-Devanagari
HingGPT-Devanagari is a Hindi-English code-mixed GPT model trained on Devanagari text. It is a GPT2 model trained on L3Cube-HingCorpus.
<br>
[Dataset link](https://github.com/l3cube-pune/code-mixed-nlp)
More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2204.08398).
```
@inproceedings{nayak-joshi-2022-l3cube,
title = "{L}3{C}ube-{H}ing{C}orpus and {H}ing{BERT}: A Code Mixed {H}indi-{E}nglish Dataset and {BERT} Language Models",
author = "Nayak, Ravindra and Joshi, Raviraj",
booktitle = "Proceedings of the WILDRE-6 Workshop within the 13th Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.wildre-1.2",
pages = "7--12",
}
```
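A minimal generation sketch; the code-mixed Devanagari prompt is made up for illustration:
```python
from transformers import pipeline

# Load HingGPT-Devanagari for free-form generation.
generator = pipeline("text-generation", model="l3cube-pune/hing-gpt-devanagari")
print(generator("मैं आज बहुत खुश हूँ क्योंकि", max_new_tokens=30))
```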
|
801b47798dfeba184df0a1fed5540754
|
padmajabfrl/distilbert-base-uncased-finetuned_gender_classification
|
padmajabfrl
|
distilbert
| 13 | 1 |
transformers
| 0 |
text-classification
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,498 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned_gender_classification
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000
- Accuracy: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.003 | 1.0 | 4390 | 0.0000 | 1.0 |
| 0.0069 | 2.0 | 8780 | 0.0000 | 1.0 |
| 0.0014 | 3.0 | 13170 | 0.0000 | 1.0 |
| 0.0014 | 4.0 | 17560 | 0.0000 | 1.0 |
| 0.0008 | 5.0 | 21950 | 0.0000 | 1.0 |
### Framework versions
- Transformers 4.25.1
- Pytorch 1.13.0+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2
|
c00e7cb78565f780e3f8a3c0936de1d8
|
DOOGLAK/Article_100v1_NER_Model_3Epochs_AUGMENTED
|
DOOGLAK
|
bert
| 13 | 5 |
transformers
| 0 |
token-classification
| true | false | false |
apache-2.0
| null |
['article100v1_wikigold_split']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,559 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Article_100v1_NER_Model_3Epochs_AUGMENTED
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on the article100v1_wikigold_split dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3140
- Precision: 0.4708
- Recall: 0.4550
- F1: 0.4628
- Accuracy: 0.8932
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 40 | 0.4092 | 0.1933 | 0.1445 | 0.1654 | 0.8398 |
| No log | 2.0 | 80 | 0.3279 | 0.4254 | 0.3924 | 0.4082 | 0.8851 |
| No log | 3.0 | 120 | 0.3140 | 0.4708 | 0.4550 | 0.4628 | 0.8932 |
### Framework versions
- Transformers 4.17.0
- Pytorch 1.11.0+cu113
- Datasets 2.4.0
- Tokenizers 0.11.6
|
9ba66f560745ddaaa5afb231aee6e306
|
krishnateja/wav2vec2-telugu_150
|
krishnateja
|
wav2vec2
| 17 | 3 |
transformers
| 0 |
automatic-speech-recognition
| true | false | false |
apache-2.0
| null |
['openslr']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 3,665 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-telugu_150
This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the [openslr](https://openslr.org/66) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3312
- Wer: 0.2213
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 150
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 6.096 | 3.84 | 400 | 0.5762 | 0.7029 |
| 0.427 | 7.69 | 800 | 0.3124 | 0.5148 |
| 0.208 | 11.54 | 1200 | 0.2994 | 0.4201 |
| 0.1506 | 15.38 | 1600 | 0.3106 | 0.3844 |
| 0.1223 | 19.23 | 2000 | 0.3080 | 0.3608 |
| 0.1094 | 23.08 | 2400 | 0.3206 | 0.3332 |
| 0.0949 | 26.92 | 2800 | 0.3085 | 0.3253 |
| 0.0802 | 30.77 | 3200 | 0.3076 | 0.3425 |
| 0.0713 | 34.61 | 3600 | 0.3280 | 0.3398 |
| 0.0687 | 38.46 | 4000 | 0.3042 | 0.3081 |
| 0.0613 | 42.31 | 4400 | 0.3227 | 0.3073 |
| 0.0548 | 46.15 | 4800 | 0.3152 | 0.3213 |
| 0.0508 | 50.0 | 5200 | 0.3259 | 0.3107 |
| 0.0455 | 53.84 | 5600 | 0.3046 | 0.2881 |
| 0.0427 | 57.69 | 6000 | 0.2779 | 0.3007 |
| 0.0391 | 61.54 | 6400 | 0.2996 | 0.2693 |
| 0.0388 | 65.38 | 6800 | 0.3016 | 0.2695 |
| 0.0339 | 69.23 | 7200 | 0.3225 | 0.2935 |
| 0.0312 | 73.08 | 7600 | 0.2907 | 0.2942 |
| 0.029 | 76.92 | 8000 | 0.3148 | 0.3029 |
| 0.0254 | 80.77 | 8400 | 0.3118 | 0.2996 |
| 0.0229 | 84.61 | 8800 | 0.3022 | 0.2993 |
| 0.0231 | 88.46 | 9200 | 0.3203 | 0.2465 |
| 0.019 | 92.31 | 9600 | 0.3223 | 0.2460 |
| 0.0173 | 96.15 | 10000 | 0.3178 | 0.2501 |
| 0.0168 | 100.0 | 10400 | 0.2937 | 0.2415 |
| 0.015 | 103.84 | 10800 | 0.3062 | 0.2415 |
| 0.014 | 107.69 | 11200 | 0.3104 | 0.2383 |
| 0.012 | 111.54 | 11600 | 0.3308 | 0.2408 |
| 0.0111 | 115.38 | 12000 | 0.3228 | 0.2335 |
| 0.01 | 119.23 | 12400 | 0.3228 | 0.2374 |
| 0.0096 | 123.08 | 12800 | 0.3241 | 0.2304 |
| 0.009 | 126.92 | 13200 | 0.3237 | 0.2295 |
| 0.0075 | 130.77 | 13600 | 0.3221 | 0.2261 |
| 0.0065 | 134.61 | 14000 | 0.3310 | 0.2277 |
| 0.0064 | 138.46 | 14400 | 0.3348 | 0.2266 |
| 0.0064 | 142.31 | 14800 | 0.3330 | 0.2229 |
| 0.0056 | 146.15 | 15200 | 0.3310 | 0.2229 |
| 0.0053 | 150.0 | 15600 | 0.3312 | 0.2213 |
### Test results
WER (without LM): 42.8%
WER (with LM): 42%
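### Example usage
A minimal transcription sketch; `sample_telugu.wav` is a placeholder for a 16 kHz mono Telugu recording (decoding a file this way requires ffmpeg):
```python
from transformers import pipeline

# Load the fine-tuned model and transcribe a local recording.
asr = pipeline("automatic-speech-recognition", model="krishnateja/wav2vec2-telugu_150")
print(asr("sample_telugu.wav"))
```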
### Framework versions
- Transformers 4.24.0
- Pytorch 1.13.0+cu117
- Datasets 2.6.1
- Tokenizers 0.13.2
P.S.: The "150" in the repository name denotes the number of training epochs.
|
aff10b3f38210c09ef54d6dccbc3d3d1
|
KenP/codeparrot-ds
|
KenP
|
gpt2
| 9 | 2 |
transformers
| 0 |
text-generation
| false | true | false |
mit
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_keras_callback']
| true | true | true | 1,490 | false |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# KenP/codeparrot-ds
This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 10.3900
- Validation Loss: 9.6171
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'WarmUp', 'config': {'initial_learning_rate': 5e-05, 'decay_schedule_fn': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': -922, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, '__passive_serialization__': True}, 'warmup_steps': 1000, 'power': 1.0, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16
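For reference, the serialized optimizer above (an `AdamWeightDecay` instance wrapped in a linear `WarmUp`/`PolynomialDecay` schedule) is the shape of object that `transformers.create_optimizer` builds; a minimal sketch, where `num_train_steps` is a hypothetical value to be derived from the dataset size:
```python
from transformers import create_optimizer

# Rebuild an equivalent Keras optimizer + learning-rate schedule.
optimizer, lr_schedule = create_optimizer(
    init_lr=5e-5,            # initial_learning_rate above
    num_warmup_steps=1000,   # warmup_steps above
    num_train_steps=2000,    # hypothetical: steps_per_epoch * num_epochs
    weight_decay_rate=0.01,  # weight_decay_rate above
)
```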
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 10.3900 | 9.6171 | 0 |
### Framework versions
- Transformers 4.18.0
- TensorFlow 2.8.0
- Datasets 2.2.0
- Tokenizers 0.12.1
|
f66082a3cca7864f98b61e52911c59a8
|
gary109/ai-light-dance_singing2_ft_wav2vec2-large-xlsr-53-v1
|
gary109
|
wav2vec2
| 14 | 5 |
transformers
| 0 |
automatic-speech-recognition
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['automatic-speech-recognition', 'gary109/AI_Light_Dance', 'generated_from_trainer']
| true | true | true | 3,821 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# ai-light-dance_singing2_ft_wav2vec2-large-xlsr-53-v1
This model is a fine-tuned version of [gary109/ai-light-dance_singing2_ft_wav2vec2-large-xlsr-53](https://huggingface.co/gary109/ai-light-dance_singing2_ft_wav2vec2-large-xlsr-53) on the GARY109/AI_LIGHT_DANCE - ONSET-SINGING2 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5760
- Wer: 0.2905
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 160
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 40.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.656 | 1.0 | 112 | 1.7625 | 0.9265 |
| 1.3693 | 2.0 | 224 | 1.5135 | 0.9243 |
| 1.2172 | 3.0 | 336 | 1.2657 | 0.8533 |
| 1.0456 | 4.0 | 448 | 1.0893 | 0.7691 |
| 0.9385 | 5.0 | 560 | 1.0110 | 0.7097 |
| 0.8165 | 6.0 | 672 | 0.9243 | 0.6682 |
| 0.7491 | 7.0 | 784 | 0.8948 | 0.6583 |
| 0.6772 | 8.0 | 896 | 0.7894 | 0.6007 |
| 0.6096 | 9.0 | 1008 | 0.7684 | 0.5663 |
| 0.5714 | 10.0 | 1120 | 0.6978 | 0.4826 |
| 0.5213 | 11.0 | 1232 | 0.8433 | 0.4927 |
| 0.4624 | 12.0 | 1344 | 0.6695 | 0.4469 |
| 0.4298 | 13.0 | 1456 | 0.6569 | 0.3868 |
| 0.3939 | 14.0 | 1568 | 0.6633 | 0.3694 |
| 0.3803 | 15.0 | 1680 | 0.6376 | 0.3920 |
| 0.3415 | 16.0 | 1792 | 0.6463 | 0.3414 |
| 0.3239 | 17.0 | 1904 | 0.5841 | 0.3197 |
| 0.2946 | 18.0 | 2016 | 0.5948 | 0.3112 |
| 0.2751 | 19.0 | 2128 | 0.5760 | 0.2905 |
| 0.2834 | 20.0 | 2240 | 0.5884 | 0.2975 |
| 0.2383 | 21.0 | 2352 | 0.5989 | 0.2775 |
| 0.2265 | 22.0 | 2464 | 0.6151 | 0.2853 |
| 0.2158 | 23.0 | 2576 | 0.5843 | 0.2670 |
| 0.2015 | 24.0 | 2688 | 0.6621 | 0.2738 |
| 0.215 | 25.0 | 2800 | 0.6068 | 0.2652 |
| 0.1859 | 26.0 | 2912 | 0.6136 | 0.2570 |
| 0.1745 | 27.0 | 3024 | 0.6191 | 0.2624 |
| 0.1611 | 28.0 | 3136 | 0.6364 | 0.2578 |
| 0.1513 | 29.0 | 3248 | 0.6402 | 0.2535 |
| 0.172 | 30.0 | 3360 | 0.6330 | 0.2500 |
| 0.1488 | 31.0 | 3472 | 0.6275 | 0.2521 |
| 0.1371 | 32.0 | 3584 | 0.6539 | 0.2540 |
| 0.1356 | 33.0 | 3696 | 0.6544 | 0.2491 |
| 0.1319 | 34.0 | 3808 | 0.6545 | 0.2491 |
| 0.1465 | 35.0 | 3920 | 0.6573 | 0.2495 |
| 0.13 | 36.0 | 4032 | 0.6594 | 0.2494 |
| 0.1244 | 37.0 | 4144 | 0.6651 | 0.2476 |
| 0.1228 | 38.0 | 4256 | 0.6754 | 0.2497 |
| 0.1181 | 39.0 | 4368 | 0.6684 | 0.2468 |
| 0.1338 | 40.0 | 4480 | 0.6713 | 0.2471 |
### Framework versions
- Transformers 4.21.0.dev0
- Pytorch 1.9.1+cu102
- Datasets 2.3.3.dev0
- Tokenizers 0.12.1
|
e1eedf024e089c6d045975ab62d0328c
|
CAMeL-Lab/bert-base-arabic-camelbert-msa-sixteenth
|
CAMeL-Lab
|
bert
| 9 | 56 |
transformers
| 2 |
fill-mask
| true | true | true |
apache-2.0
|
['ar']
| null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 10,765 | false |
# CAMeLBERT: A collection of pre-trained models for Arabic NLP tasks
## Model description
**CAMeLBERT** is a collection of BERT models pre-trained on Arabic texts with different sizes and variants.
We release pre-trained language models for Modern Standard Arabic (MSA), dialectal Arabic (DA), and classical Arabic (CA), in addition to a model pre-trained on a mix of the three.
We also provide additional models that are pre-trained on a scaled-down set of the MSA variant (half, quarter, eighth, and sixteenth).
The details are described in the paper *"[The Interplay of Variant, Size, and Task Type in Arabic Pre-trained Language Models](https://arxiv.org/abs/2103.06678)."*
This model card describes **CAMeLBERT-MSA-sixteenth** (`bert-base-arabic-camelbert-msa-sixteenth`), a model pre-trained on a sixteenth of the full MSA dataset.
||Model|Variant|Size|#Word|
|-|-|:-:|-:|-:|
||`bert-base-arabic-camelbert-mix`|CA,DA,MSA|167GB|17.3B|
||`bert-base-arabic-camelbert-ca`|CA|6GB|847M|
||`bert-base-arabic-camelbert-da`|DA|54GB|5.8B|
||`bert-base-arabic-camelbert-msa`|MSA|107GB|12.6B|
||`bert-base-arabic-camelbert-msa-half`|MSA|53GB|6.3B|
||`bert-base-arabic-camelbert-msa-quarter`|MSA|27GB|3.1B|
||`bert-base-arabic-camelbert-msa-eighth`|MSA|14GB|1.6B|
|✔|`bert-base-arabic-camelbert-msa-sixteenth`|MSA|6GB|746M|
## Intended uses
You can use the released model for either masked language modeling or next sentence prediction.
However, it is mostly intended to be fine-tuned on an NLP task, such as NER, POS tagging, sentiment analysis, dialect identification, and poetry classification.
We release our fine-tuning code [here](https://github.com/CAMeL-Lab/CAMeLBERT).
#### How to use
You can use this model directly with a pipeline for masked language modeling:
```python
>>> from transformers import pipeline
>>> unmasker = pipeline('fill-mask', model='CAMeL-Lab/bert-base-arabic-camelbert-msa-sixteenth')
>>> unmasker("الهدف من الحياة هو [MASK] .")
[{'sequence': '[CLS] الهدف من الحياة هو التغيير. [SEP]',
'score': 0.08320745080709457,
'token': 7946,
'token_str': 'التغيير'},
{'sequence': '[CLS] الهدف من الحياة هو التعلم. [SEP]',
'score': 0.04305094853043556,
'token': 12554,
'token_str': 'التعلم'},
{'sequence': '[CLS] الهدف من الحياة هو العمل. [SEP]',
'score': 0.0417640283703804,
'token': 2854,
'token_str': 'العمل'},
{'sequence': '[CLS] الهدف من الحياة هو الحياة. [SEP]',
'score': 0.041371218860149384,
'token': 3696,
'token_str': 'الحياة'},
{'sequence': '[CLS] الهدف من الحياة هو المعرفة. [SEP]',
'score': 0.039794355630874634,
'token': 7344,
'token_str': 'المعرفة'}]
```
*Note*: to download our models, you would need `transformers>=3.5.0`. Otherwise, you could download the models manually.
Here is how to use this model to get the features of a given text in PyTorch:
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained('CAMeL-Lab/bert-base-arabic-camelbert-msa-sixteenth')
model = AutoModel.from_pretrained('CAMeL-Lab/bert-base-arabic-camelbert-msa-sixteenth')
text = "مرحبا يا عالم."
encoded_input = tokenizer(text, return_tensors='pt')
output = model(**encoded_input)
```
and in TensorFlow:
```python
from transformers import AutoTokenizer, TFAutoModel
tokenizer = AutoTokenizer.from_pretrained('CAMeL-Lab/bert-base-arabic-camelbert-msa-sixteenth')
model = TFAutoModel.from_pretrained('CAMeL-Lab/bert-base-arabic-camelbert-msa-sixteenth')
text = "مرحبا يا عالم."
encoded_input = tokenizer(text, return_tensors='tf')
output = model(encoded_input)
```
## Training data
- MSA (Modern Standard Arabic)
- [The Arabic Gigaword Fifth Edition](https://catalog.ldc.upenn.edu/LDC2011T11)
- [Abu El-Khair Corpus](http://www.abuelkhair.net/index.php/en/arabic/abu-el-khair-corpus)
- [OSIAN corpus](https://vlo.clarin.eu/search;jsessionid=31066390B2C9E8C6304845BA79869AC1?1&q=osian)
- [Arabic Wikipedia](https://archive.org/details/arwiki-20190201)
- The unshuffled version of the Arabic [OSCAR corpus](https://oscar-corpus.com/)
## Training procedure
We use [the original implementation](https://github.com/google-research/bert) released by Google for pre-training.
We follow the original English BERT model's hyperparameters for pre-training, unless otherwise specified.
### Preprocessing
- After extracting the raw text from each corpus, we apply the following pre-processing.
- We first remove invalid characters and normalize white spaces using the utilities provided by [the original BERT implementation](https://github.com/google-research/bert/blob/eedf5716ce1268e56f0a50264a88cafad334ac61/tokenization.py#L286-L297).
- We also remove lines without any Arabic characters.
- We then remove diacritics and kashida using [CAMeL Tools](https://github.com/CAMeL-Lab/camel_tools) (see the sketch after this list).
- Finally, we split each line into sentences with a heuristics-based sentence segmenter.
- We train a WordPiece tokenizer on the entire dataset (167 GB text) with a vocabulary size of 30,000 using [HuggingFace's tokenizers](https://github.com/huggingface/tokenizers).
- We do not lowercase letters nor strip accents.
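As an illustration of the diacritic-removal step, a minimal sketch using [CAMeL Tools](https://github.com/CAMeL-Lab/camel_tools); the diacritized string is an arbitrary example:
```python
from camel_tools.utils.dediac import dediac_ar

# Strip Arabic diacritic marks from a sample string.
print(dediac_ar("الهدَفُ مِنَ الحياةِ"))  # -> "الهدف من الحياة"
```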
### Pre-training
- The model was trained on a single cloud TPU (`v3-8`) for one million steps in total.
- The first 90,000 steps were trained with a batch size of 1,024 and the rest was trained with a batch size of 256.
- The sequence length was limited to 128 tokens for 90% of the steps and 512 for the remaining 10%.
- We use whole word masking and a duplicate factor of 10.
- We set max predictions per sequence to 20 for the dataset with max sequence length of 128 tokens and 80 for the dataset with max sequence length of 512 tokens.
- We use a random seed of 12345, masked language model probability of 0.15, and short sequence probability of 0.1.
- The optimizer used is Adam with a learning rate of 1e-4, \\(\beta_{1} = 0.9\\) and \\(\beta_{2} = 0.999\\), a weight decay of 0.01, learning rate warmup for 10,000 steps and linear decay of the learning rate after.
## Evaluation results
- We evaluate our pre-trained language models on five NLP tasks: NER, POS tagging, sentiment analysis, dialect identification, and poetry classification.
- We fine-tune and evaluate the models using 12 datasets.
- We used Hugging Face's transformers to fine-tune our CAMeLBERT models.
- We used transformers `v3.1.0` along with PyTorch `v1.5.1`.
- The fine-tuning was done by adding a fully connected linear layer to the last hidden state.
- We use \\(F_{1}\\) score as a metric for all tasks.
- Code used for fine-tuning is available [here](https://github.com/CAMeL-Lab/CAMeLBERT).
### Results
| Task | Dataset | Variant | Mix | CA | DA | MSA | MSA-1/2 | MSA-1/4 | MSA-1/8 | MSA-1/16 |
| -------------------- | --------------- | ------- | ----- | ----- | ----- | ----- | ------- | ------- | ------- | -------- |
| NER | ANERcorp | MSA | 80.8% | 67.9% | 74.1% | 82.4% | 82.0% | 82.1% | 82.6% | 80.8% |
| POS | PATB (MSA) | MSA | 98.1% | 97.8% | 97.7% | 98.3% | 98.2% | 98.3% | 98.2% | 98.2% |
| | ARZTB (EGY) | DA | 93.6% | 92.3% | 92.7% | 93.6% | 93.6% | 93.7% | 93.6% | 93.6% |
| | Gumar (GLF) | DA | 97.3% | 97.7% | 97.9% | 97.9% | 97.9% | 97.9% | 97.9% | 97.9% |
| SA | ASTD | MSA | 76.3% | 69.4% | 74.6% | 76.9% | 76.0% | 76.8% | 76.7% | 75.3% |
| | ArSAS | MSA | 92.7% | 89.4% | 91.8% | 93.0% | 92.6% | 92.5% | 92.5% | 92.3% |
| | SemEval | MSA | 69.0% | 58.5% | 68.4% | 72.1% | 70.7% | 72.8% | 71.6% | 71.2% |
| DID | MADAR-26 | DA | 62.9% | 61.9% | 61.8% | 62.6% | 62.0% | 62.8% | 62.0% | 62.2% |
| | MADAR-6 | DA | 92.5% | 91.5% | 92.2% | 91.9% | 91.8% | 92.2% | 92.1% | 92.0% |
| | MADAR-Twitter-5 | MSA | 75.7% | 71.4% | 74.2% | 77.6% | 78.5% | 77.3% | 77.7% | 76.2% |
| | NADI | DA | 24.7% | 17.3% | 20.1% | 24.9% | 24.6% | 24.6% | 24.9% | 23.8% |
| Poetry | APCD | CA | 79.8% | 80.9% | 79.6% | 79.7% | 79.9% | 80.0% | 79.7% | 79.8% |
### Results (Average)
| | Variant | Mix | CA | DA | MSA | MSA-1/2 | MSA-1/4 | MSA-1/8 | MSA-1/16 |
| -------------------- | ------- | ----- | ----- | ----- | ----- | ------- | ------- | ------- | -------- |
| Variant-wise-average<sup>[[1]](#footnote-1)</sup> | MSA | 82.1% | 75.7% | 80.1% | 83.4% | 83.0% | 83.3% | 83.2% | 82.3% |
| | DA | 74.4% | 72.1% | 72.9% | 74.2% | 74.0% | 74.3% | 74.1% | 73.9% |
| | CA | 79.8% | 80.9% | 79.6% | 79.7% | 79.9% | 80.0% | 79.7% | 79.8% |
| Macro-Average | ALL | 78.7% | 74.7% | 77.1% | 79.2% | 79.0% | 79.2% | 79.1% | 78.6% |
<a name="footnote-1">[1]</a>: Variant-wise-average refers to average over a group of tasks in the same language variant.
## Acknowledgements
This research was supported with Cloud TPUs from Google’s TensorFlow Research Cloud (TFRC).
## Citation
```bibtex
@inproceedings{inoue-etal-2021-interplay,
title = "The Interplay of Variant, Size, and Task Type in {A}rabic Pre-trained Language Models",
author = "Inoue, Go and
Alhafni, Bashar and
Baimukan, Nurpeiis and
Bouamor, Houda and
Habash, Nizar",
booktitle = "Proceedings of the Sixth Arabic Natural Language Processing Workshop",
month = apr,
year = "2021",
address = "Kyiv, Ukraine (Online)",
publisher = "Association for Computational Linguistics",
abstract = "In this paper, we explore the effects of language variants, data sizes, and fine-tuning task types in Arabic pre-trained language models. To do so, we build three pre-trained language models across three variants of Arabic: Modern Standard Arabic (MSA), dialectal Arabic, and classical Arabic, in addition to a fourth language model which is pre-trained on a mix of the three. We also examine the importance of pre-training data size by building additional models that are pre-trained on a scaled-down set of the MSA variant. We compare our different models to each other, as well as to eight publicly available models by fine-tuning them on five NLP tasks spanning 12 datasets. Our results suggest that the variant proximity of pre-training data to fine-tuning data is more important than the pre-training data size. We exploit this insight in defining an optimized system selection model for the studied tasks.",
}
```
|
40b30436cfce977b174cee14f623f530
|
sd-concepts-library/Aflac-duck
|
sd-concepts-library
| null | 10 | 0 | null | 1 | null | false | false | false |
mit
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 1,153 | false |
### Aflac duck on Stable Diffusion
This is the `<aflac duck>` concept taught to Stable Diffusion via Textual Inversion. You can load this concept into the [Stable Conceptualizer](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/stable_conceptualizer_inference.ipynb) notebook. You can also train your own concepts and load them into the concept libraries using [this notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_textual_inversion_training.ipynb).
Here is the new concept you will be able to use as an `object`:





|
7f55c57c6adfa952420972b54fb780ab
|
SpectaclesLLC/distilbert-legal-chunk
|
SpectaclesLLC
|
distilbert
| 10 | 218 |
transformers
| 0 |
token-classification
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 2,845 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-legal-chunk
This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0699
- Precision: 0.8994
- Recall: 0.8721
- Macro F1: 0.8855
- Micro F1: 0.8855
- Accuracy: 0.9789
- Marker F1: 0.9804
- Marker Precision: 0.9687
- Marker Recall: 0.9925
- Reference F1: 0.9791
- Reference Precision: 0.9804
- Reference Recall: 0.9778
- Term F1: 0.8670
- Term Precision: 0.8844
- Term Recall: 0.8502
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | Macro F1 | Micro F1 | Accuracy | Marker F1 | Marker Precision | Marker Recall | Reference F1 | Reference Precision | Reference Recall | Term F1 | Term Precision | Term Recall |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:--------:|:--------:|:--------:|:---------:|:----------------:|:-------------:|:------------:|:-------------------:|:----------------:|:-------:|:--------------:|:-----------:|
| 0.0857 | 1.0 | 3125 | 0.0966 | 0.8374 | 0.7889 | 0.8124 | 0.8124 | 0.9676 | 0.6143 | 0.5874 | 0.6437 | 0.9628 | 0.9423 | 0.9842 | 0.8291 | 0.8656 | 0.7955 |
| 0.058 | 2.0 | 6250 | 0.0606 | 0.8869 | 0.9146 | 0.9006 | 0.9006 | 0.9814 | 0.9405 | 0.9126 | 0.9702 | 0.9689 | 0.9511 | 0.9873 | 0.8923 | 0.8805 | 0.9045 |
| 0.0415 | 3.0 | 9375 | 0.0642 | 0.9077 | 0.9131 | 0.9104 | 0.9104 | 0.9823 | 0.9524 | 0.9262 | 0.9801 | 0.9742 | 0.9614 | 0.9873 | 0.9021 | 0.9026 | 0.9016 |
| 0.0283 | 4.0 | 12500 | 0.0646 | 0.9066 | 0.9089 | 0.9077 | 0.9077 | 0.9819 | 0.9564 | 0.9326 | 0.9815 | 0.9712 | 0.9555 | 0.9873 | 0.8986 | 0.9008 | 0.8965 |
### Framework versions
- Transformers 4.21.3
- Pytorch 1.12.1+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1
|
a73a0f96c9291af6d76b0c4005ef59b7
|
Yanjie24/bart-samsung-5
|
Yanjie24
|
bart
| 13 | 1 |
transformers
| 0 |
text2text-generation
| true | false | false |
apache-2.0
| null |
['samsum']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,901 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bart-samsung-5
This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on the samsum dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4959
- Rouge1: 48.4734
- Rouge2: 25.3475
- Rougel: 40.9144
- Rougelsum: 44.7797
- Gen Len: 18.22
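A minimal summarization sketch; the dialogue is an invented example in the SAMSum style:
```python
from transformers import pipeline

# Summarize a short chat dialogue with the fine-tuned model.
summarizer = pipeline("summarization", model="Yanjie24/bart-samsung-5")
dialogue = "Amanda: Are we still meeting tonight?\nJerry: Yes, 8 pm at the usual place."
print(summarizer(dialogue, max_length=40, min_length=5))
```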
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 1.6107 | 1.0 | 1841 | 1.5390 | 47.1407 | 24.384 | 40.4826 | 43.4437 | 17.5513 |
| 1.5528 | 2.0 | 3682 | 1.4971 | 48.5483 | 25.1562 | 41.1806 | 44.7254 | 18.3521 |
| 1.4225 | 3.0 | 5523 | 1.5013 | 48.2461 | 25.2181 | 40.9022 | 44.4942 | 18.0844 |
| 1.3266 | 4.0 | 7364 | 1.4976 | 48.8949 | 25.4367 | 41.2355 | 45.0961 | 18.2359 |
| 1.2635 | 5.0 | 9205 | 1.4959 | 48.4734 | 25.3475 | 40.9144 | 44.7797 | 18.22 |
### Framework versions
- Transformers 4.25.1
- Pytorch 1.13.0+cu116
- Datasets 2.7.1
- Tokenizers 0.13.2
|
66384a18546a62146c1fd84b215e2f68
|
Helsinki-NLP/opus-mt-sv-guw
|
Helsinki-NLP
|
marian
| 10 | 7 |
transformers
| 0 |
translation
| true | true | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['translation']
| false | true | true | 776 | false |
### opus-mt-sv-guw
* source languages: sv
* target languages: guw
* OPUS readme: [sv-guw](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/sv-guw/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-01-16.zip](https://object.pouta.csc.fi/OPUS-MT-models/sv-guw/opus-2020-01-16.zip)
* test set translations: [opus-2020-01-16.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/sv-guw/opus-2020-01-16.test.txt)
* test set scores: [opus-2020-01-16.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/sv-guw/opus-2020-01-16.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| JW300.sv.guw | 33.5 | 0.531 |
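## Example usage
A minimal translation sketch for the sv→guw model with the Marian classes in `transformers`; the Swedish sentence is illustrative:
```python
from transformers import MarianMTModel, MarianTokenizer

# Load the Swedish -> Gun translation model.
model_name = "Helsinki-NLP/opus-mt-sv-guw"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

batch = tokenizer(["Hur mår du?"], return_tensors="pt")
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```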
|
0d783ae86430d2af16d9a85569a82c12
|
twieland/MIX1_ja-en_helsinki
|
twieland
|
marian
| 11 | 2 |
transformers
| 0 |
text2text-generation
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 9,727 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# MIX1_ja-en_helsinki
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-ja-en](https://huggingface.co/Helsinki-NLP/opus-mt-ja-en) on a combination of Visual Novel, Light Novel, and Subtitle data. A total of ~10MM lines of training data were used.
It achieves the following results on the evaluation set:
- Loss: 1.7947
- Otaku Benchmark VN BLEU: 17.78
- Otaku Benchmark LN BLEU: 11.80
- Otaku Benchmark MANGA BLEU: 13.66
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:------:|:---------------:|
| 2.7495 | 0.01 | 2000 | 2.5989 |
| 2.5415 | 0.03 | 4000 | 2.4746 |
| 2.4409 | 0.04 | 6000 | 2.4731 |
| 2.3743 | 0.05 | 8000 | 2.4012 |
| 2.3254 | 0.06 | 10000 | 2.3904 |
| 2.2857 | 0.08 | 12000 | 2.3649 |
| 2.2448 | 0.09 | 14000 | 2.3188 |
| 2.2158 | 0.1 | 16000 | 2.2975 |
| 2.193 | 0.11 | 18000 | 2.2756 |
| 2.1669 | 0.13 | 20000 | 2.2852 |
| 2.144 | 0.14 | 22000 | 2.2689 |
| 2.1222 | 0.15 | 24000 | 2.2721 |
| 2.1045 | 0.16 | 26000 | 2.2489 |
| 2.0885 | 0.18 | 28000 | 2.2359 |
| 2.0732 | 0.19 | 30000 | 2.2771 |
| 2.0584 | 0.2 | 32000 | 2.2582 |
| 2.0471 | 0.21 | 34000 | 2.2093 |
| 2.0369 | 0.23 | 36000 | 2.1768 |
| 2.0241 | 0.24 | 38000 | 2.1884 |
| 2.0196 | 0.25 | 40000 | 2.2025 |
| 2.004 | 0.27 | 42000 | 2.1507 |
| 1.9936 | 0.28 | 44000 | 2.1668 |
| 1.9869 | 0.29 | 46000 | 2.1432 |
| 1.9735 | 0.3 | 48000 | 2.1662 |
| 1.9651 | 0.32 | 50000 | 2.1824 |
| 1.9551 | 0.33 | 52000 | 2.1608 |
| 1.9485 | 0.34 | 54000 | 2.1322 |
| 1.9421 | 0.35 | 56000 | 2.1476 |
| 1.9303 | 0.37 | 58000 | 2.0994 |
| 1.9236 | 0.38 | 60000 | 2.1182 |
| 1.9183 | 0.39 | 62000 | 2.1305 |
| 1.9108 | 0.4 | 64000 | 2.1469 |
| 1.9051 | 0.42 | 66000 | 2.1414 |
| 1.9018 | 0.43 | 68000 | 2.1089 |
| 1.8959 | 0.44 | 70000 | 2.0908 |
| 1.886 | 0.46 | 72000 | 2.0968 |
| 1.8802 | 0.47 | 74000 | 2.0503 |
| 1.8713 | 0.48 | 76000 | 2.0542 |
| 1.8648 | 0.49 | 78000 | 2.0990 |
| 1.8599 | 0.51 | 80000 | 2.1112 |
| 1.8563 | 0.52 | 82000 | 2.1007 |
| 1.8541 | 0.53 | 84000 | 2.0849 |
| 1.845 | 0.54 | 86000 | 2.0831 |
| 1.8448 | 0.56 | 88000 | 2.0560 |
| 1.8342 | 0.57 | 90000 | 2.0349 |
| 1.8344 | 0.58 | 92000 | 2.0301 |
| 1.8291 | 0.59 | 94000 | 2.0300 |
| 1.819 | 0.61 | 96000 | 2.0378 |
| 1.8154 | 0.62 | 98000 | 2.0197 |
| 1.82 | 0.63 | 100000 | 2.0463 |
| 1.8081 | 0.64 | 102000 | 2.0077 |
| 1.8046 | 0.66 | 104000 | 2.0101 |
| 1.7978 | 0.67 | 106000 | 2.0150 |
| 1.7934 | 0.68 | 108000 | 2.0215 |
| 1.7904 | 0.7 | 110000 | 2.0278 |
| 1.7871 | 0.71 | 112000 | 2.0588 |
| 1.779 | 0.72 | 114000 | 2.0062 |
| 1.7784 | 0.73 | 116000 | 2.0300 |
| 1.7749 | 0.75 | 118000 | 1.9664 |
| 1.7691 | 0.76 | 120000 | 2.0033 |
| 1.7622 | 0.77 | 122000 | 1.9983 |
| 1.7587 | 0.78 | 124000 | 2.0030 |
| 1.755 | 0.8 | 126000 | 1.9955 |
| 1.7531 | 0.81 | 128000 | 1.9764 |
| 1.7439 | 0.82 | 130000 | 1.9942 |
| 1.7406 | 0.83 | 132000 | 2.0221 |
| 1.7385 | 0.85 | 134000 | 1.9835 |
| 1.7332 | 0.86 | 136000 | 1.9967 |
| 1.7332 | 0.87 | 138000 | 2.0247 |
| 1.7309 | 0.88 | 140000 | 1.9817 |
| 1.7248 | 0.9 | 142000 | 2.0063 |
| 1.7209 | 0.91 | 144000 | 1.9583 |
| 1.7154 | 0.92 | 146000 | 1.9779 |
| 1.7153 | 0.94 | 148000 | 1.9478 |
| 1.7094 | 0.95 | 150000 | 1.9706 |
| 1.7061 | 0.96 | 152000 | 1.9605 |
| 1.7017 | 0.97 | 154000 | 1.9447 |
| 1.6965 | 0.99 | 156000 | 1.9419 |
| 1.6929 | 1.0 | 158000 | 1.9589 |
| 1.6628 | 1.01 | 160000 | 1.9383 |
| 1.6535 | 1.02 | 162000 | 1.9487 |
| 1.6495 | 1.04 | 164000 | 1.9400 |
| 1.6516 | 1.05 | 166000 | 1.9353 |
| 1.6513 | 1.06 | 168000 | 1.9253 |
| 1.6518 | 1.07 | 170000 | 1.9132 |
| 1.6491 | 1.09 | 172000 | 1.9076 |
| 1.6453 | 1.1 | 174000 | 1.9192 |
| 1.6426 | 1.11 | 176000 | 1.9191 |
| 1.6353 | 1.13 | 178000 | 1.9367 |
| 1.6352 | 1.14 | 180000 | 1.9218 |
| 1.6304 | 1.15 | 182000 | 1.9305 |
| 1.6299 | 1.16 | 184000 | 1.9072 |
| 1.6263 | 1.18 | 186000 | 1.9211 |
| 1.6284 | 1.19 | 188000 | 1.9037 |
| 1.6237 | 1.2 | 190000 | 1.8951 |
| 1.6231 | 1.21 | 192000 | 1.8998 |
| 1.6184 | 1.23 | 194000 | 1.8960 |
| 1.6153 | 1.24 | 196000 | 1.8776 |
| 1.6122 | 1.25 | 198000 | 1.8747 |
| 1.6109 | 1.26 | 200000 | 1.8951 |
| 1.6072 | 1.28 | 202000 | 1.8705 |
| 1.6094 | 1.29 | 204000 | 1.8903 |
| 1.6063 | 1.3 | 206000 | 1.8660 |
| 1.599 | 1.31 | 208000 | 1.8696 |
| 1.5931 | 1.33 | 210000 | 1.8598 |
| 1.5943 | 1.34 | 212000 | 1.8760 |
| 1.5906 | 1.35 | 214000 | 1.8833 |
| 1.5858 | 1.37 | 216000 | 1.8645 |
| 1.5873 | 1.38 | 218000 | 1.8620 |
| 1.5842 | 1.39 | 220000 | 1.8632 |
| 1.5808 | 1.4 | 222000 | 1.8782 |
| 1.5756 | 1.42 | 224000 | 1.8627 |
| 1.5728 | 1.43 | 226000 | 1.8649 |
| 1.5709 | 1.44 | 228000 | 1.8735 |
| 1.5704 | 1.45 | 230000 | 1.8630 |
| 1.5659 | 1.47 | 232000 | 1.8598 |
| 1.5637 | 1.48 | 234000 | 1.8519 |
| 1.5628 | 1.49 | 236000 | 1.8569 |
| 1.5559 | 1.5 | 238000 | 1.8401 |
| 1.5532 | 1.52 | 240000 | 1.8528 |
| 1.557 | 1.53 | 242000 | 1.8637 |
| 1.5499 | 1.54 | 244000 | 1.8701 |
| 1.5476 | 1.55 | 246000 | 1.8423 |
| 1.5502 | 1.57 | 248000 | 1.8320 |
| 1.5469 | 1.58 | 250000 | 1.8542 |
| 1.5382 | 1.59 | 252000 | 1.8526 |
| 1.5396 | 1.61 | 254000 | 1.8537 |
| 1.528 | 1.62 | 256000 | 1.8248 |
| 1.532 | 1.63 | 258000 | 1.8322 |
| 1.5269 | 1.64 | 260000 | 1.8381 |
| 1.5269 | 1.66 | 262000 | 1.8389 |
| 1.5269 | 1.67 | 264000 | 1.8445 |
| 1.525 | 1.68 | 266000 | 1.8232 |
| 1.5175 | 1.69 | 268000 | 1.8561 |
| 1.5172 | 1.71 | 270000 | 1.8342 |
| 1.5174 | 1.72 | 272000 | 1.8167 |
| 1.5114 | 1.73 | 274000 | 1.8281 |
| 1.5094 | 1.74 | 276000 | 1.8164 |
| 1.5083 | 1.76 | 278000 | 1.8317 |
| 1.5047 | 1.77 | 280000 | 1.8207 |
| 1.5045 | 1.78 | 282000 | 1.8155 |
| 1.497 | 1.8 | 284000 | 1.8275 |
| 1.4996 | 1.81 | 286000 | 1.8152 |
| 1.497 | 1.82 | 288000 | 1.8137 |
| 1.4967 | 1.83 | 290000 | 1.8109 |
| 1.4936 | 1.85 | 292000 | 1.8037 |
| 1.4867 | 1.86 | 294000 | 1.7955 |
| 1.4859 | 1.87 | 296000 | 1.8181 |
| 1.4869 | 1.88 | 298000 | 1.7999 |
| 1.4811 | 1.9 | 300000 | 1.8062 |
| 1.4831 | 1.91 | 302000 | 1.8042 |
| 1.4791 | 1.92 | 304000 | 1.8020 |
| 1.4797 | 1.93 | 306000 | 1.7972 |
| 1.483 | 1.95 | 308000 | 1.8044 |
| 1.4748 | 1.96 | 310000 | 1.8036 |
| 1.4772 | 1.97 | 312000 | 1.7958 |
| 1.4708 | 1.98 | 314000 | 1.7967 |
| 1.4743 | 2.0 | 316000 | 1.7947 |
### Framework versions
- Transformers 4.19.2
- Pytorch 1.11.0+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1
|
1b243268a2d32ed7a2fc60038e5910e1
|
Helsinki-NLP/opus-mt-de-el
|
Helsinki-NLP
|
marian
| 10 | 184 |
transformers
| 0 |
translation
| true | true | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['translation']
| false | true | true | 770 | false |
### opus-mt-de-el
* source languages: de
* target languages: el
* OPUS readme: [de-el](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/de-el/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-01-20.zip](https://object.pouta.csc.fi/OPUS-MT-models/de-el/opus-2020-01-20.zip)
* test set translations: [opus-2020-01-20.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/de-el/opus-2020-01-20.test.txt)
* test set scores: [opus-2020-01-20.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/de-el/opus-2020-01-20.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| Tatoeba.de.el | 45.7 | 0.649 |
|
6dfb46970acb3dd72e777eade115258e
|
sd-concepts-library/doener-red-line-art
|
sd-concepts-library
| null | 11 | 0 | null | 0 | null | false | false | false |
mit
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 1,258 | false |
### doener_red_line_art on Stable Diffusion
This is the `<dnr>` concept taught to Stable Diffusion via Textual Inversion. You can load this concept into the [Stable Conceptualizer](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/stable_conceptualizer_inference.ipynb) notebook. You can also train your own concepts and load them into the concept libraries using [this notebook](https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/sd_textual_inversion_training.ipynb).
Here is the new concept you will be able to use as a `style`:






|
5ab2f36327de736151e388b4ed0dc46c
|
netsvetaev/netsvetaev-free
|
netsvetaev
| null | 16 | 0 | null | 2 |
text-to-image
| false | false | false |
mit
|
['en']
| null | null | 1 | 0 | 0 | 1 | 0 | 0 | 0 |
['diffusion', 'netsvetaev', 'dreambooth', 'stable-diffusion', 'text-to-image']
| false | true | true | 1,961 | false |
Hello!
This is a model based on my paintings and SD 1.5. I made it as an experiment.
The token is «in style of netsvetaev abstract paintings».
Best suited for: abstract seamless patterns, simple prompts like «orange, fruit», and large objects like «cat face» or «girl face».
It works well with landscape orientation.
It has an MIT license, so you can use it for free.
Best used with Invoke AI: https://github.com/invoke-ai/InvokeAI (The examples below contain metadata for it)









________________________
Artur Netsvetaev, 2022
https://netsvetaev.com
|
e5eb5508ea422f025bcacadeb03e926d
|
wietsedv/xlm-roberta-base-ft-udpos28-is
|
wietsedv
|
xlm-roberta
| 8 | 7 |
transformers
| 0 |
token-classification
| true | false | false |
apache-2.0
|
['is']
|
['universal_dependencies']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['part-of-speech', 'token-classification']
| true | true | true | 569 | false |
# XLM-RoBERTa base Universal Dependencies v2.8 POS tagging: Icelandic
This model is part of our paper called:
- Make the Best of Cross-lingual Transfer: Evidence from POS Tagging with over 100 Languages
Check the [Space](https://huggingface.co/spaces/wietsedv/xpos) for more details.
## Usage
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
tokenizer = AutoTokenizer.from_pretrained("wietsedv/xlm-roberta-base-ft-udpos28-is")
model = AutoModelForTokenClassification.from_pretrained("wietsedv/xlm-roberta-base-ft-udpos28-is")
```
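A short inference sketch continuing from the objects above; the Icelandic sentence is arbitrary:
```python
from transformers import pipeline

# Wrap the loaded model and tokenizer in a tagging pipeline.
pos = pipeline("token-classification", model=model, tokenizer=tokenizer)
print(pos("Hún les bók."))
```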
|
7d127bd7484b6b3fce8548f572bad2ee
|
joebobby/finetuning-sentiment-model-5000-samples3
|
joebobby
|
bert
| 22 | 1 |
transformers
| 0 |
text-classification
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 918 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# finetuning-sentiment-model-5000-samples3
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Framework versions
- Transformers 4.19.2
- Pytorch 1.11.0+cu113
- Datasets 2.2.2
- Tokenizers 0.12.1
|
c7b98279ec4be9bb013f1b749e375dc3
|
ptro/model1_test
|
ptro
|
bert
| 13 | 3 |
transformers
| 1 |
text-classification
| true | false | false |
cc-by-sa-4.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,407 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# model1_test
This model is a fine-tuned version of [DaNLP/da-bert-hatespeech-detection](https://huggingface.co/DaNLP/da-bert-hatespeech-detection) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1816
- Accuracy: 0.9667
- F1: 0.3548
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log | 1.0 | 150 | 0.1128 | 0.9667 | 0.2 |
| No log | 2.0 | 300 | 0.1666 | 0.9684 | 0.2963 |
| No log | 3.0 | 450 | 0.1816 | 0.9667 | 0.3548 |
### Framework versions
- Transformers 4.12.5
- Pytorch 1.10.0+cu111
- Datasets 1.16.1
- Tokenizers 0.10.3
|
82ea95ad99670aef6b67451f9579184e
|
Helsinki-NLP/opus-mt-sv-yap
|
Helsinki-NLP
|
marian
| 10 | 25 |
transformers
| 0 |
translation
| true | true | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['translation']
| false | true | true | 776 | false |
### opus-mt-sv-yap
* source languages: sv
* target languages: yap
* OPUS readme: [sv-yap](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/sv-yap/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-01-16.zip](https://object.pouta.csc.fi/OPUS-MT-models/sv-yap/opus-2020-01-16.zip)
* test set translations: [opus-2020-01-16.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/sv-yap/opus-2020-01-16.test.txt)
* test set scores: [opus-2020-01-16.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/sv-yap/opus-2020-01-16.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| JW300.sv.yap | 27.3 | 0.461 |
|
871874f04f86fb5a9116b122711458ce
|
joe5campbell/BERT_Tweet_Sentiment_50k_2eps
|
joe5campbell
|
bert
| 4 | 4 |
transformers
| 0 |
text-classification
| false | true | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_keras_callback']
| true | true | true | 1,395 | false |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# BERT_Tweet_Sentiment_50k_2eps
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1131
- Train Accuracy: 0.9596
- Validation Loss: 0.6972
- Validation Accuracy: 0.8229
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'clipnorm': 1.0, 'learning_rate': 3e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Validation Loss | Validation Accuracy | Epoch |
|:----------:|:--------------:|:---------------:|:-------------------:|:-----:|
| 0.3420 | 0.8511 | 0.4293 | 0.8299 | 0 |
| 0.1131 | 0.9596 | 0.6972 | 0.8229 | 1 |
### Framework versions
- Transformers 4.16.2
- TensorFlow 2.8.0
- Tokenizers 0.11.0
|
8bee5ab0c40511ac2aafda8596abd915
|
vesteinn/IceBERT
|
vesteinn
|
roberta
| 8 | 166 |
transformers
| 3 |
fill-mask
| true | false | false |
agpl-3.0
|
['is']
| null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['roberta', 'icelandic', 'masked-lm', 'pytorch']
| false | true | true | 895 | false |
# IceBERT
IceBERT was trained with fairseq using the RoBERTa-base architecture. The training data used is shown in the table below.
| Dataset | Size | Tokens |
|------------------------------------------------------|---------|--------|
| Icelandic Gigaword Corpus v20.05 (IGC) | 8.2 GB | 1,388M |
| Icelandic Common Crawl Corpus (IC3) | 4.9 GB | 824M |
| Greynir News articles | 456 MB | 76M |
| Icelandic Sagas | 9 MB | 1.7M |
| Open Icelandic e-books (Rafbókavefurinn) | 14 MB | 2.6M |
| Data from the medical library of Landspitali | 33 MB | 5.2M |
| Student theses from Icelandic universities (Skemman) | 2.2 GB | 367M |
| Total | 15.8 GB | 2,664M |
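A minimal fill-mask sketch; the Icelandic prompt is illustrative:
```python
from transformers import pipeline

# RoBERTa-style models use "<mask>" as the mask token.
unmasker = pipeline("fill-mask", model="vesteinn/IceBERT")
print(unmasker("Ísland er fallegt <mask>."))
```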
|
d3bb385bc3abcf87748cd721d808e090
|
gokuls/bert-tiny-sst2-KD-BERT_and_distilBERT
|
gokuls
|
bert
| 13 | 3 |
transformers
| 0 |
text-classification
| true | false | false |
apache-2.0
| null |
['glue']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,541 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-tiny-sst2-KD-BERT_and_distilBERT
This model is a fine-tuned version of [google/bert_uncased_L-2_H-128_A-2](https://huggingface.co/google/bert_uncased_L-2_H-128_A-2) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5530
- Accuracy: 0.8326
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 33
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.7317 | 1.0 | 4210 | 1.5887 | 0.8222 |
| 1.0068 | 2.0 | 8420 | 1.5530 | 0.8326 |
| 0.7961 | 3.0 | 12630 | 1.7072 | 0.8245 |
| 0.6852 | 4.0 | 16840 | 1.8794 | 0.8177 |
| 0.6039 | 5.0 | 21050 | 1.8691 | 0.8142 |
### Framework versions
- Transformers 4.22.1
- Pytorch 1.12.1+cu113
- Datasets 2.5.1
- Tokenizers 0.12.1
|
d2abdc091dea2c0fa5dc82b0b2a31c6a
|
strickvl/nlp-redaction-classifier
|
strickvl
|
deberta-v2
| 11 | 1 |
transformers
| 2 |
text-classification
| true | false | false |
mit
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,688 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Redaction Classifier: NLP Edition
This model is a fine-tuned version of [microsoft/deberta-v3-small](https://huggingface.co/microsoft/deberta-v3-small) on a custom dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0893
- Pearson: 0.8273
## Model description
Read more about the process and the code used to train this model on my blog [here](https://mlops.systems).
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 6
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Pearson |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.2054 | 1.0 | 729 | 0.1382 | 0.6771 |
| 0.1386 | 2.0 | 1458 | 0.1099 | 0.7721 |
| 0.0782 | 3.0 | 2187 | 0.0950 | 0.8083 |
| 0.054 | 4.0 | 2916 | 0.0945 | 0.8185 |
| 0.0319 | 5.0 | 3645 | 0.0880 | 0.8251 |
| 0.0254 | 6.0 | 4374 | 0.0893 | 0.8273 |
### Framework versions
- Transformers 4.19.2
- Pytorch 1.11.0a0+17540c5
- Datasets 2.2.2
- Tokenizers 0.12.1
|
242efc088f677a1b7d785707eec3166a
|
cansen88/PromptGenerator_5_topic_finetuned
|
cansen88
|
gpt2
| 9 | 2 |
transformers
| 0 |
text-generation
| false | true | false |
mit
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_keras_callback']
| true | true | true | 1,965 | false |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# PromptGenerator_5_topic_finetuned
This model is a fine-tuned version of [kmkarakaya/turkishReviews-ds](https://huggingface.co/kmkarakaya/turkishReviews-ds) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 1.6861
- Train Sparse Categorical Accuracy: 0.8150
- Validation Loss: 1.9777
- Validation Sparse Categorical Accuracy: 0.7250
- Epoch: 4
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': 5e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train Sparse Categorical Accuracy | Validation Loss | Validation Sparse Categorical Accuracy | Epoch |
|:----------:|:---------------------------------:|:---------------:|:--------------------------------------:|:-----:|
| 3.0394 | 0.5171 | 2.7152 | 0.5841 | 0 |
| 2.5336 | 0.6247 | 2.4440 | 0.6318 | 1 |
| 2.2002 | 0.6958 | 2.2557 | 0.6659 | 2 |
| 1.9241 | 0.7608 | 2.1059 | 0.6932 | 3 |
| 1.6861 | 0.8150 | 1.9777 | 0.7250 | 4 |
### Framework versions
- Transformers 4.21.1
- TensorFlow 2.8.2
- Datasets 2.4.0
- Tokenizers 0.12.1
|
dcf764223f19e7f1f0b56bae288668d3
|
jmassot/xlm-roberta-base-jm-finetuned-panx-en_hub
|
jmassot
|
xlm-roberta
| 10 | 11 |
transformers
| 0 |
token-classification
| true | false | false |
mit
| null |
['xtreme']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,327 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-jm-finetuned-panx-en_hub
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the xtreme dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4209
- F1: 0.6542
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 1.2098 | 1.0 | 50 | 0.6167 | 0.4802 |
| 0.5605 | 2.0 | 100 | 0.4436 | 0.6184 |
| 0.4035 | 3.0 | 150 | 0.4209 | 0.6542 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.11.0+cu113
- Datasets 1.16.1
- Tokenizers 0.10.1
|
8c16b10210af560f4ce7e26cae2fef0c
|
autoevaluate/natural-language-inference
|
autoevaluate
|
distilbert
| 13 | 2 |
transformers
| 1 |
text-classification
| true | false | false |
apache-2.0
| null |
['glue']
| null | 8 | 7 | 1 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,326 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# natural-language-inference
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4120
- Accuracy: 0.8284
- F1: 0.8822
## Model description
More information needed
## Intended uses & limitations
More information needed
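The model classifies sentence pairs, so inference needs both sentences. A minimal sketch (the example pair is illustrative, not from the original card):

```python
from transformers import pipeline

clf = pipeline("text-classification", model="autoevaluate/natural-language-inference")

# text-classification pipelines accept a {"text", "text_pair"} dict for paired inputs
print(clf({"text": "A man is playing a guitar.",
           "text_pair": "A person plays an instrument."}))
```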
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log | 1.0 | 230 | 0.4288 | 0.8039 | 0.8644 |
| No log | 2.0 | 460 | 0.4120 | 0.8284 | 0.8822 |
### Framework versions
- Transformers 4.21.1
- Pytorch 1.12.1+cu113
- Datasets 2.4.0
- Tokenizers 0.12.1
|
c22c5592b371ab685562db290d86f50e
|
kz/mt5base-finetuned-ECC-japanese-small
|
kz
|
mt5
| 7 | 28 |
transformers
| 1 |
text2text-generation
| true | false | false |
mit
|
['ja']
| null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 2,087 | false |
Google's mt5-base fine-tuned on Japanese to solve the error detection and correction task.
# Japanese Error Correction (日本語誤り訂正)
- "吾輩をは猫である。名前えはまだない。"→"吾輩は猫である。名前はまだない。"
- "-small" has been trained on 20,000 text pairs only.
- dataset: [link](http://nlp.ist.i.kyoto-u.ac.jp/?%E6%97%A5%E6%9C%AC%E8%AA%9EWikipedia%E5%85%A5%E5%8A%9B%E8%AA%A4%E3%82%8A%E3%83%87%E3%83%BC%E3%82%BF%E3%82%BB%E3%83%83%E3%83%88) (only the first 20,000 text pairs were used)
- prefix: "correction: " (note: the model was trained on this single task only)
- Please treat this as a casual text-to-text demo rather than a production-ready system (see the usage sketch below).
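A minimal usage sketch (not from the original card; the generation settings are illustrative assumptions):

```python
from transformers import AutoTokenizer, MT5ForConditionalGeneration

model_id = "kz/mt5base-finetuned-ECC-japanese-small"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = MT5ForConditionalGeneration.from_pretrained(model_id)

# The single-task prefix "correction: " is required.
text = "correction: 吾輩をは猫である。名前えはまだない。"
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
# expected (per the examples above): 吾輩は猫である。名前はまだない。
```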
## Reference examples
- "東北大学でMASKが研究をしています。"→"東北大学でMASKの研究をしています。" The lone ga-particle marking "MASK" (Jim Carrey?) as the subject was removed, so MASK became the object of research. Is the subject-next-to-verb word order used for readability being treated as an error?
- "東北大学でマスクが研究をしています。"→"東北大学でマスクの研究をしています。"
- "東北大学でイーロン・マスクが研究をしています。"→"東北大学でイーロン・マスクが研究をしています。"
- "東北大学で「イーロン・マスク」が研究をしています。"→"東北大学で「イーロン・マスク」の研究をしています。" 単語の意味も考慮されている?
- "東北大学でイマスクが研究をしています。"→"東北大学でイマスクの研究をしています。"
- "東北大学でクが研究をしています。"→"東北大学でコンピューターが研究をしています。" それはちょっと待って。
## 参考 extra_idを用い探索 <>は半角に変更してください
- "東北大学で <extra_id_0> の研究をしています。"→"東北大学で化学の研究をしています。"
- "東北大学で <extra_id_0> が研究をしています。"→"東北大学で工学が研究をしています。" 工学さん。
- "吾輩は <extra_id_0> である。"→"吾輩は吾輩である。"
- "答えは猫です。吾輩は <extra_id_0> である。"→"答えは猫です。吾輩は猫である。"
- "答えは猫です。吾輩の <extra_id_0> である。"→"答えは猫です。吾輩の心は猫である。"
- "私は猫です。私は <extra_id_0>"→"私は猫です。私は猫です。"
- "私は猫です。N/A <extra_id_0>"→"猫です。"
- "あなたは女性で猫です。彼は犬です。彼女は <extra_id_0>"→"あなたは女性で猫です。彼は犬です。彼女は猫です。"
- "あなたは女性で猫です。彼は犬です。彼は <extra_id_0>"→"あなたは女性で猫です。彼は犬です。"
- "あなたは女性で猫です。彼は犬です。彼は男性で <extra_id_0>"→"あなたは女性で猫です。彼は犬です。彼は男性で猫です。"
- "あなたは女性で猫です。彼は犬です。ライオンは <extra_id_0>"→"あなたは女性で猫です。彼は犬です。ライオンは猫です。"
- "あなたがは女性で猫です。彼はが犬です。ライオンが <extra_id_0>"→"あなたが女性で猫です。彼は犬です。ライオンが犬です。"
- "Aは11、Bは9。Aは <extra_id_0> 。Bは <extra_id_1> 。"→"Aは11、Bは9。Aは11。Bは9。"
- "彼の名前はallenです。彼のnameは <extra_id_0>"→"彼の名前はallenです。彼の名前は英語です。"
- "translate japanease to english: 赤い花. => red flower. 青い花. => <extra_id_0>"→"赤い花. => red flower. 青い花. => blue flower" タスク比依存翻訳可能性の片鱗.japaneseをjapaneaseと間違えたことは秘密だ・・・と言うか間違えても動くのか
## Prompting参考
Chain of Thought Prompting Elicits Reasoning in Large Language Models
https://arxiv.org/abs/2201.11903
**check in progress**
## License
- The MIT license
|
3476c03a67efa22bf8e871b4b898e946
|
Evelyn18/distilbert-base-uncased-becasv2-6
|
Evelyn18
|
distilbert
| 13 | 7 |
transformers
| 0 |
question-answering
| true | false | false |
apache-2.0
| null |
['becasv2']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,530 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-becasv2-6
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the becasv2 dataset.
It achieves the following results on the evaluation set:
- Loss: 3.8936
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 9 | 4.0542 |
| No log | 2.0 | 18 | 3.0865 |
| No log | 3.0 | 27 | 2.8069 |
| No log | 4.0 | 36 | 3.3330 |
| No log | 5.0 | 45 | 3.4108 |
| No log | 6.0 | 54 | 3.5562 |
| No log | 7.0 | 63 | 3.8846 |
| No log | 8.0 | 72 | 3.8936 |
### Framework versions
- Transformers 4.20.1
- Pytorch 1.11.0+cu113
- Datasets 2.3.2
- Tokenizers 0.12.1
|
ff5b1978c99e7e6f6910e3ef21b43922
|
morenolq/distilbert-base-cased-hate-speech
|
morenolq
|
distilbert
| 13 | 2 |
transformers
| 0 |
text-classification
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 2,067 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-cased-hate-speech
**Training:** The model has been trained using the script provided in the following repository: https://github.com/MorenoLaQuatra/transformers-tasks-templates
This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the [hate speech](https://huggingface.co/datasets/ucberkeley-dlab/measuring-hate-speech) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6837
- Mae: 1.9686
## Model description
More information needed
## Intended uses & limitations
More information needed
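Since the model was trained with a regression-style objective (MAE is the reported metric), the default classification post-processing may not be what you want. A minimal inference sketch (not author-provided; the input text is illustrative):

```python
from transformers import pipeline

scorer = pipeline(
    "text-classification",
    model="morenolq/distilbert-base-cased-hate-speech",
    function_to_apply="none",  # return the raw regression score instead of softmax/sigmoid
)
print(scorer("I strongly disagree with this comment."))
```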
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mae |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 0.6857 | 1.0 | 3389 | 0.6471 | 1.9725 |
| 0.3645 | 2.0 | 6778 | 0.4359 | 1.9725 |
| 0.2266 | 3.0 | 10167 | 0.3664 | 1.9725 |
| 0.1476 | 4.0 | 13556 | 0.3253 | 1.9725 |
| 0.0992 | 5.0 | 16945 | 0.3047 | 1.9725 |
| 0.0737 | 6.0 | 20334 | 0.2869 | 1.9725 |
| 0.0537 | 7.0 | 23723 | 0.2709 | 1.9725 |
| 0.0458 | 8.0 | 27112 | 0.2667 | 1.9725 |
| 0.0313 | 9.0 | 30501 | 0.2589 | 1.9725 |
| 0.027 | 10.0 | 33890 | 0.2540 | 1.9725 |
### Framework versions
- Transformers 4.22.1
- Pytorch 1.11.0+cu113
- Datasets 2.0.0
- Tokenizers 0.11.6
|
810bd429241b3685c11e15026f8e415b
|
shabohin/ddpm-butterflies-128
|
shabohin
| null | 11 | 0 |
diffusers
| 0 | null | false | false | false |
apache-2.0
|
['en']
|
['huggan/smithsonian_butterflies_subset']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 1,230 | false |
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# ddpm-butterflies-128
## Model description
This diffusion model is trained with the [🤗 Diffusers](https://github.com/huggingface/diffusers) library
on the `huggan/smithsonian_butterflies_subset` dataset.
## Intended uses & limitations
#### How to use
```python
# Not author-provided: a minimal sampling sketch for this DDPM checkpoint.
from diffusers import DDPMPipeline

pipeline = DDPMPipeline.from_pretrained("shabohin/ddpm-butterflies-128")
image = pipeline().images[0]  # generate one image
image.save("butterfly.png")
```
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training data
[TODO: describe the data used to train the model]
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- gradient_accumulation_steps: 1
- optimizer: AdamW with betas=(None, None), weight_decay=None and epsilon=None
- lr_scheduler: None
- lr_warmup_steps: 500
- ema_inv_gamma: None
- mixed_precision: fp16
### Training results
📈 [TensorBoard logs](https://huggingface.co/shabohin/ddpm-butterflies-128/tensorboard?#scalars)
|
bd760e0f3e848aebc41d00081c9eda57
|
jonatasgrosman/exp_w2v2t_fr_unispeech-ml_s51
|
jonatasgrosman
|
unispeech
| 10 | 5 |
transformers
| 0 |
automatic-speech-recognition
| true | false | false |
apache-2.0
|
['fr']
|
['mozilla-foundation/common_voice_7_0']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['automatic-speech-recognition', 'fr']
| false | true | true | 499 | false |
# exp_w2v2t_fr_unispeech-ml_s51
Fine-tuned [microsoft/unispeech-large-multi-lingual-1500h-cv](https://huggingface.co/microsoft/unispeech-large-multi-lingual-1500h-cv) for speech recognition using the train split of [Common Voice 7.0 (fr)](https://huggingface.co/datasets/mozilla-foundation/common_voice_7_0).
When using this model, make sure that your speech input is sampled at 16kHz.
This model has been fine-tuned by the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) tool.
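A minimal transcription sketch with HuggingSound (the audio path is a placeholder):

```python
from huggingsound import SpeechRecognitionModel

model = SpeechRecognitionModel("jonatasgrosman/exp_w2v2t_fr_unispeech-ml_s51")
audio_paths = ["/path/to/file.mp3"]  # placeholder; HuggingSound handles 16 kHz resampling
transcriptions = model.transcribe(audio_paths)
print(transcriptions[0]["transcription"])
```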
|
f3a7e4fa7c2bd511039659f11857b392
|
firqaaa/indo-dpr-ctx_encoder-single-squad-base
|
firqaaa
|
dpr
| 8 | 6 |
transformers
| 0 |
feature-extraction
| true | false | false |
apache-2.0
|
['id']
|
['squad_v2']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['feature-extraction', 'transformers']
| false | true | true | 1,821 | false |
### indo-dpr-ctx_encoder-single-squad-base
<p style="font-size:16px">Indonesian Dense Passage Retrieval (context encoder) trained on a translated SQuADv2.0 dataset in DPR format.</p>
### Evaluation
| Class | Precision | Recall | F1-Score | Support |
|-------|-----------|--------|----------|---------|
| hard_negative | 0.9963 | 0.9963 | 0.9963 | 183090 |
| positive | 0.8849 | 0.8849 | 0.8849 | 5910 |
| Metric | Value |
|--------|-------|
| Accuracy | 0.9928 |
| Macro Average | 0.9406 |
| Weighted Average | 0.9928 |
<p style="font-size:16px">Note: This report is for evaluation on the dev set, after 12000 batches.</p>
### Usage
```python
from transformers import DPRContextEncoder, DPRContextEncoderTokenizer
tokenizer = DPRContextEncoderTokenizer.from_pretrained('firqaaa/indo-dpr-ctx_encoder-single-squad-base')
model = DPRContextEncoder.from_pretrained('firqaaa/indo-dpr-ctx_encoder-single-squad-base')
input_ids = tokenizer("Ibukota Indonesia terletak dimana?", return_tensors='pt')["input_ids"]
embeddings = model(input_ids).pooler_output
```
You can also use it with `haystack` as follows:
```python
from haystack.nodes import DensePassageRetriever
from haystack.document_stores import InMemoryDocumentStore
retriever = DensePassageRetriever(
    document_store=InMemoryDocumentStore(),
    query_embedding_model="firqaaa/indo-dpr-ctx_encoder-single-squad-base",
    passage_embedding_model="firqaaa/indo-dpr-ctx_encoder-single-squad-base",
    max_seq_len_query=64,
    max_seq_len_passage=256,
    batch_size=16,
    use_gpu=True,
    embed_title=True,
    use_fast_tokenizers=True,
)
```
|
4c6a0c0f03424e8425ade1787371760e
|
Rakib/roberta-base-on-cuad
|
Rakib
|
roberta
| 11 | 31,600 |
transformers
| 1 |
question-answering
| true | false | false |
mit
|
['en']
|
['cuad']
| null | 3 | 0 | 2 | 1 | 0 | 0 | 0 |
['legal-contract-review', 'roberta', 'cuad']
| false | true | true | 4,841 | false |
# Model Card for roberta-base-on-cuad
# Model Details
## Model Description
- **Developed by:** Mohammed Rakib
- **Shared by [Optional]:** More information needed
- **Model type:** Question Answering
- **Language(s) (NLP):** en
- **License:** MIT
- **Related Models:**
- **Parent Model:** RoBERTa
- **Resources for more information:**
- GitHub Repo: [defactolaw](https://github.com/afra-tech/defactolaw)
- Associated Paper: [An Open Source Contractual Language Understanding Application Using Machine Learning](https://aclanthology.org/2022.lateraisse-1.6/)
# Uses
## Direct Use
This model can be used for the task of Question Answering on Legal Documents.
# Training Details
Read: [An Open Source Contractual Language Understanding Application Using Machine Learning](https://aclanthology.org/2022.lateraisse-1.6/)
for detailed information on training procedure, dataset preprocessing and evaluation.
## Training Data
See [CUAD dataset card](https://huggingface.co/datasets/cuad) for more information.
## Training Procedure
### Preprocessing
More information needed
### Speeds, Sizes, Times
More information needed
# Evaluation
## Testing Data, Factors & Metrics
### Testing Data
See [CUAD dataset card](https://huggingface.co/datasets/cuad) for more information.
### Factors
### Metrics
More information needed
## Results
More information needed
# Model Examination
More information needed
# Environmental Impact

- **Hardware Type:** More information needed
- **Hours used:** More information needed
- **Cloud Provider:** More information needed
- **Compute Region:** More information needed
- **Carbon Emitted:** More information needed
# Technical Specifications [optional]
## Model Architecture and Objective
More information needed
## Compute Infrastructure
More information needed
### Hardware
Used V100/P100 from Google Colab Pro
### Software
Python, Transformers
# Citation
**BibTeX:**
```
@inproceedings{nawar-etal-2022-open,
title = "An Open Source Contractual Language Understanding Application Using Machine Learning",
author = "Nawar, Afra and
Rakib, Mohammed and
Hai, Salma Abdul and
Haq, Sanaulla",
booktitle = "Proceedings of the First Workshop on Language Technology and Resources for a Fair, Inclusive, and Safe Society within the 13th Language Resources and Evaluation Conference",
month = jun,
year = "2022",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2022.lateraisse-1.6",
pages = "42--50",
abstract = "Legal field is characterized by its exclusivity and non-transparency. Despite the frequency and relevance of legal dealings, legal documents like contracts remains elusive to non-legal professionals for the copious usage of legal jargon. There has been little advancement in making legal contracts more comprehensible. This paper presents how Machine Learning and NLP can be applied to solve this problem, further considering the challenges of applying ML to the high length of contract documents and training in a low resource environment. The largest open-source contract dataset so far, the Contract Understanding Atticus Dataset (CUAD) is utilized. Various pre-processing experiments and hyperparameter tuning have been carried out and we successfully managed to eclipse SOTA results presented for models in the CUAD dataset trained on RoBERTa-base. Our model, A-type-RoBERTa-base achieved an AUPR score of 46.6{\%} compared to 42.6{\%} on the original RoBERT-base. This model is utilized in our end to end contract understanding application which is able to take a contract and highlight the clauses a user is looking to find along with it{'}s descriptions to aid due diligence before signing. Alongside digital, i.e. searchable, contracts the system is capable of processing scanned, i.e. non-searchable, contracts using tesseract OCR. This application is aimed to not only make contract review a comprehensible process to non-legal professionals, but also to help lawyers and attorneys more efficiently review contracts.",
}
```
# Glossary [optional]
More information needed
# More Information [optional]
More information needed
# Model Card Authors [optional]
Mohammed Rakib in collaboration with Ezi Ozoani and the Hugging Face team
# Model Card Contact
More information needed
# How to Get Started with the Model
Use the code below to get started with the model.
<details>
<summary> Click to expand </summary>
```python
from transformers import AutoTokenizer, AutoModelForQuestionAnswering
tokenizer = AutoTokenizer.from_pretrained("Rakib/roberta-base-on-cuad")
model = AutoModelForQuestionAnswering.from_pretrained("Rakib/roberta-base-on-cuad")
```
</details>
|
dcf90e27ad5247d9a5c3e9c2981b8771
|
krirk/wav2vec2-large-xls-r-300m-turkish-colab
|
krirk
|
wav2vec2
| 13 | 7 |
transformers
| 0 |
automatic-speech-recognition
| true | false | false |
apache-2.0
| null |
['common_voice']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,791 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-large-xls-r-300m-turkish-colab
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3942
- Wer: 0.3149
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.9921 | 3.67 | 400 | 0.7820 | 0.7857 |
| 0.4496 | 7.34 | 800 | 0.4630 | 0.4977 |
| 0.2057 | 11.01 | 1200 | 0.4293 | 0.4627 |
| 0.1328 | 14.68 | 1600 | 0.4464 | 0.4068 |
| 0.1009 | 18.35 | 2000 | 0.4461 | 0.3742 |
| 0.0794 | 22.02 | 2400 | 0.4328 | 0.3467 |
| 0.0628 | 25.69 | 2800 | 0.4036 | 0.3263 |
| 0.0497 | 29.36 | 3200 | 0.3942 | 0.3149 |
### Framework versions
- Transformers 4.11.3
- Pytorch 1.10.0+cu111
- Datasets 1.13.3
- Tokenizers 0.10.3
|
9bacefd09c02f854acf7a897768332a4
|
Tristan/distilbert_summarization_reward_model
|
Tristan
|
distilbert
| 10 | 5 |
transformers
| 0 |
text-classification
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,292 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert_summarization_reward_model
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6972
- Accuracy: 0.5271
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.6922 | 1.0 | 11608 | 0.6918 | 0.5237 |
| 0.6762 | 2.0 | 23216 | 0.6972 | 0.5271 |
### Framework versions
- Transformers 4.25.1
- Pytorch 1.13.1+cu117
- Datasets 2.8.0
- Tokenizers 0.13.2
|
3238a94ec7d857acd5da386d056f7f48
|
jonatasgrosman/exp_w2v2t_fa_wavlm_s545
|
jonatasgrosman
|
wavlm
| 10 | 5 |
transformers
| 0 |
automatic-speech-recognition
| true | false | false |
apache-2.0
|
['fa']
|
['mozilla-foundation/common_voice_7_0']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['automatic-speech-recognition', 'fa']
| false | true | true | 439 | false |
# exp_w2v2t_fa_wavlm_s545
Fine-tuned [microsoft/wavlm-large](https://huggingface.co/microsoft/wavlm-large) for speech recognition using the train split of [Common Voice 7.0 (fa)](https://huggingface.co/datasets/mozilla-foundation/common_voice_7_0).
When using this model, make sure that your speech input is sampled at 16kHz.
This model has been fine-tuned by the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) tool.
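Alternatively, a minimal sketch with the `transformers` pipeline (the audio path is a placeholder):

```python
from transformers import pipeline

# The pipeline decodes the file with ffmpeg and resamples it to 16 kHz for you.
asr = pipeline("automatic-speech-recognition", model="jonatasgrosman/exp_w2v2t_fa_wavlm_s545")
print(asr("/path/to/audio.wav")["text"])
```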
|
ec889aa61b03343879096791ea2f87c5
|
EricJai/ddpm-butterflies-128
|
EricJai
| null | 11 | 2 |
diffusers
| 0 | null | false | false | false |
apache-2.0
|
['en']
|
['huggan/smithsonian_butterflies_subset']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 1,229 | false |
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# ddpm-butterflies-128
## Model description
This diffusion model is trained with the [🤗 Diffusers](https://github.com/huggingface/diffusers) library
on the `huggan/smithsonian_butterflies_subset` dataset.
## Intended uses & limitations
#### How to use
```python
# Not author-provided: a minimal sampling sketch for this DDPM checkpoint.
from diffusers import DDPMPipeline

pipeline = DDPMPipeline.from_pretrained("EricJai/ddpm-butterflies-128")
image = pipeline().images[0]  # generate one image
image.save("butterfly.png")
```
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training data
[TODO: describe the data used to train the model]
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- gradient_accumulation_steps: 1
- optimizer: AdamW with betas=(None, None), weight_decay=None and epsilon=None
- lr_scheduler: None
- lr_warmup_steps: 500
- ema_inv_gamma: None
- mixed_precision: fp16
### Training results
📈 [TensorBoard logs](https://huggingface.co/EricJai/ddpm-butterflies-128/tensorboard?#scalars)
|
578a8a2e535fe506dc16b964f4d2248b
|
Helsinki-NLP/opus-mt-swc-fr
|
Helsinki-NLP
|
marian
| 10 | 7 |
transformers
| 0 |
translation
| true | true | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['translation']
| false | true | true | 776 | false |
### opus-mt-swc-fr
* source languages: swc
* target languages: fr
* OPUS readme: [swc-fr](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/swc-fr/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-01-16.zip](https://object.pouta.csc.fi/OPUS-MT-models/swc-fr/opus-2020-01-16.zip)
* test set translations: [opus-2020-01-16.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/swc-fr/opus-2020-01-16.test.txt)
* test set scores: [opus-2020-01-16.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/swc-fr/opus-2020-01-16.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| JW300.swc.fr | 28.6 | 0.470 |
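A minimal usage sketch with the `transformers` translation pipeline (not part of the original card; the Congo Swahili input is illustrative):

```python
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-swc-fr")
print(translator("Habari ya asubuhi.")[0]["translation_text"])
```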
|
18e44ca6fbd6eecea5916cddbb4b5ccd
|
doc2query/stackexchange-t5-base-v1
|
doc2query
|
t5
| 10 | 1 |
transformers
| 0 |
text2text-generation
| true | false | false |
apache-2.0
|
['en']
|
['flax-sentence-embeddings/stackexchange_title_best_voted_answer_jsonl']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 2,824 | false |
# doc2query/stackexchange-t5-base-v1
This is a [doc2query](https://arxiv.org/abs/1904.08375) model based on T5 (also known as [docT5query](https://cs.uwaterloo.ca/~jimmylin/publications/Nogueira_Lin_2019_docTTTTTquery-v2.pdf)).
It can be used for:
- **Document expansion**: You generate 20-40 queries for each of your paragraphs and index the paragraphs together with the generated queries in a standard BM25 index like Elasticsearch, OpenSearch, or Lucene. The generated queries help to close the lexical gap of lexical search, as they contain synonyms. Further, this re-weights words, giving important words a higher weight even if they appear seldom in a paragraph. In our [BEIR](https://arxiv.org/abs/2104.08663) paper we showed that BM25+docT5query is a powerful search engine. In the [BEIR repository](https://github.com/UKPLab/beir) we have an example of how to use docT5query with Pyserini.
- **Domain Specific Training Data Generation**: It can be used to generate training data for learning an embedding model. On [SBERT.net](https://www.sbert.net/examples/unsupervised_learning/query_generation/README.html) we have an example of how to use the model to generate (query, text) pairs for a given collection of unlabeled texts. These pairs can then be used to train powerful dense embedding models.
## Usage
```python
from transformers import T5Tokenizer, T5ForConditionalGeneration
model_name = 'doc2query/stackexchange-t5-base-v1'
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)
text = "Python is an interpreted, high-level and general-purpose programming language. Python's design philosophy emphasizes code readability with its notable use of significant whitespace. Its language constructs and object-oriented approach aim to help programmers write clear, logical code for small and large-scale projects."
input_ids = tokenizer.encode(text, max_length=320, truncation=True, return_tensors='pt')
outputs = model.generate(
    input_ids=input_ids,
    max_length=64,
    do_sample=True,
    top_p=0.95,
    num_return_sequences=5)
print("Text:")
print(text)
print("\nGenerated Queries:")
for i in range(len(outputs)):
    query = tokenizer.decode(outputs[i], skip_special_tokens=True)
    print(f'{i + 1}: {query}')
```
**Note:** `model.generate()` is non-deterministic. It produces different queries each time you run it.
## Training
This model was obtained by fine-tuning [google/t5-v1_1-base](https://huggingface.co/google/t5-v1_1-base) for 449k training steps. For the training script, see the `train_script.py` in this repository.
The input text was truncated to 320 word pieces. Output text was generated up to 64 word pieces.
This model was trained on (title, best_answer) pairs from StackExchange.
|
4bcf824e2f8c42f39ae179c029bb4a18
|
Lvxue/distilled-mt5-small-0.05-0.25
|
Lvxue
|
mt5
| 14 | 1 |
transformers
| 0 |
text2text-generation
| true | false | false |
apache-2.0
|
['en', 'ro']
|
['wmt16']
| null | 1 | 1 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,040 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilled-mt5-small-0.05-0.25
This model is a fine-tuned version of [google/mt5-small](https://huggingface.co/google/mt5-small) on the wmt16 ro-en dataset.
It achieves the following results on the evaluation set:
- Loss: 2.8318
- Bleu: 7.1808
- Gen Len: 44.1986
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5.0
### Training results
### Framework versions
- Transformers 4.20.1
- Pytorch 1.12.0+cu102
- Datasets 2.3.2
- Tokenizers 0.12.1
|
d1d2a9fb135b3740eac77ee77fc4d79a
|
Helsinki-NLP/opus-mt-ts-fi
|
Helsinki-NLP
|
marian
| 10 | 7 |
transformers
| 0 |
translation
| true | true | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['translation']
| false | true | true | 768 | false |
### opus-mt-ts-fi
* source languages: ts
* target languages: fi
* OPUS readme: [ts-fi](https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/ts-fi/README.md)
* dataset: opus
* model: transformer-align
* pre-processing: normalization + SentencePiece
* download original weights: [opus-2020-01-16.zip](https://object.pouta.csc.fi/OPUS-MT-models/ts-fi/opus-2020-01-16.zip)
* test set translations: [opus-2020-01-16.test.txt](https://object.pouta.csc.fi/OPUS-MT-models/ts-fi/opus-2020-01-16.test.txt)
* test set scores: [opus-2020-01-16.eval.txt](https://object.pouta.csc.fi/OPUS-MT-models/ts-fi/opus-2020-01-16.eval.txt)
## Benchmarks
| testset | BLEU | chr-F |
|-----------------------|-------|-------|
| JW300.ts.fi | 27.7 | 0.509 |
|
0c57edf4b48e7e2ab2f5466f230df92c
|
fathyshalab/massive_datetime-roberta-large-v1-2-0.82
|
fathyshalab
|
roberta
| 14 | 2 |
sentence-transformers
| 0 |
text-classification
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['setfit', 'sentence-transformers', 'text-classification']
| false | true | true | 1,470 | false |
# fathyshalab/massive_datetime-roberta-large-v1-2-0.82
This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Usage
To use this model for inference, first install the SetFit library:
```bash
python -m pip install setfit
```
You can then run inference as follows:
```python
from setfit import SetFitModel
# Download from Hub and run inference
model = SetFitModel.from_pretrained("fathyshalab/massive_datetime-roberta-large-v1-2-0.82")
# Run inference
preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"])
```
## BibTeX entry and citation info
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
|
81540dc9e0e49a883f3ca7f0fa9a17f5
|
Helsinki-NLP/opus-mt-tc-big-en-lv
|
Helsinki-NLP
|
marian
| 12 | 44 |
transformers
| 0 |
translation
| true | false | false |
cc-by-4.0
|
['en', 'lv']
| null | null | 1 | 0 | 1 | 0 | 0 | 0 | 0 |
['translation', 'opus-mt-tc']
| true | true | true | 5,394 | false |
# opus-mt-tc-big-en-lv
Neural machine translation model for translating from English (en) to Latvian (lv).
This model is part of the [OPUS-MT project](https://github.com/Helsinki-NLP/Opus-MT), an effort to make neural machine translation models widely available and accessible for many languages in the world. All models are originally trained using the amazing framework of [Marian NMT](https://marian-nmt.github.io/), an efficient NMT implementation written in pure C++. The models have been converted to PyTorch using the transformers library by Hugging Face. Training data is taken from [OPUS](https://opus.nlpl.eu/) and training pipelines use the procedures of [OPUS-MT-train](https://github.com/Helsinki-NLP/Opus-MT-train).
* Publications: [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (please cite if you use this model)
```
@inproceedings{tiedemann-thottingal-2020-opus,
title = "{OPUS}-{MT} {--} Building open translation services for the World",
author = {Tiedemann, J{\"o}rg and Thottingal, Santhosh},
booktitle = "Proceedings of the 22nd Annual Conference of the European Association for Machine Translation",
month = nov,
year = "2020",
address = "Lisboa, Portugal",
publisher = "European Association for Machine Translation",
url = "https://aclanthology.org/2020.eamt-1.61",
pages = "479--480",
}
@inproceedings{tiedemann-2020-tatoeba,
title = "The Tatoeba Translation Challenge {--} Realistic Data Sets for Low Resource and Multilingual {MT}",
author = {Tiedemann, J{\"o}rg},
booktitle = "Proceedings of the Fifth Conference on Machine Translation",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.wmt-1.139",
pages = "1174--1182",
}
```
## Model info
* Release: 2022-03-13
* source language(s): eng
* target language(s): lav
* model: transformer-big
* data: opusTCv20210807+bt ([source](https://github.com/Helsinki-NLP/Tatoeba-Challenge))
* tokenization: SentencePiece (spm32k,spm32k)
* original model: [opusTCv20210807+bt_transformer-big_2022-03-13.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-lav/opusTCv20210807+bt_transformer-big_2022-03-13.zip)
* more information on released models: [OPUS-MT eng-lav README](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/eng-lav/README.md)
## Usage
A short example code:
```python
from transformers import MarianMTModel, MarianTokenizer
src_text = [
    ">>lav<< A day has twenty-four hours.",
    ">>ltg<< He's a good lawyer."
]
model_name = "pytorch-models/opus-mt-tc-big-en-lv"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)
translated = model.generate(**tokenizer(src_text, return_tensors="pt", padding=True))
for t in translated:
    print(tokenizer.decode(t, skip_special_tokens=True))
# expected output:
# Dienā ir divdesmit četras stundas.
# Vyss ir labs advokats.
```
You can also use OPUS-MT models with the transformers pipelines, for example:
```python
from transformers import pipeline
pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-big-en-lv")
print(pipe(">>lav<< A day has twenty-four hours."))
# expected output: Dienā ir divdesmit četras stundas.
```
## Benchmarks
* test set translations: [opusTCv20210807+bt_transformer-big_2022-03-13.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-lav/opusTCv20210807+bt_transformer-big_2022-03-13.test.txt)
* test set scores: [opusTCv20210807+bt_transformer-big_2022-03-13.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/eng-lav/opusTCv20210807+bt_transformer-big_2022-03-13.eval.txt)
* benchmark results: [benchmark_results.txt](benchmark_results.txt)
* benchmark output: [benchmark_translations.zip](benchmark_translations.zip)
| langpair | testset | chr-F | BLEU | #sent | #words |
|----------|---------|-------|-------|-------|--------|
| eng-lav | tatoeba-test-v2021-08-07 | 0.66411 | 44.0 | 1631 | 9932 |
| eng-lav | flores101-devtest | 0.59397 | 30.1 | 1012 | 22092 |
| eng-lav | newsdev2017 | 0.58082 | 28.9 | 2003 | 41503 |
| eng-lav | newstest2017 | 0.53202 | 22.1 | 2001 | 39392 |
## Acknowledgements
The work is supported by the [European Language Grid](https://www.european-language-grid.eu/) as [pilot project 2866](https://live.european-language-grid.eu/catalogue/#/resource/projects/2866), by the [FoTran project](https://www.helsinki.fi/en/researchgroups/natural-language-understanding-with-cross-lingual-grounding), funded by the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation programme (grant agreement No 771113), and the [MeMAD project](https://memad.eu/), funded by the European Union’s Horizon 2020 Research and Innovation Programme under grant agreement No 780069. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland.
## Model conversion info
* transformers version: 4.16.2
* OPUS-MT git hash: 3405783
* port time: Wed Apr 13 17:36:04 EEST 2022
* port machine: LM0-400-22516.local
|
0f27dc51a8b2aaf94a2d9819f4018ad3
|
nateraw/modelcard-creator-demo
|
nateraw
| null | 2 | 0 |
pytorch
| 0 | null | true | false | false |
mit
|
['en']
|
['beans']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['modelcards', 'autogenerated-modelcard']
| false | true | true | 3,511 | false |
# modelcard-creator-demo
## Table of Contents
- [Model Details](#model-details)
- [How To Get Started With the Model](#how-to-get-started-with-the-model)
- [Uses](#uses)
- [Direct Use](#direct-use)
- [Downstream Use](#downstream-use)
- [Misuse and Out of Scope Use](#misuse-and-out-of-scope-use)
- [Limitations and Biases](#limitations-and-biases)
- [Training](#training)
- [Training Data](#training-data)
- [Training Procedure](#training-procedure)
- [Evaluation Results](#evaluation-results)
- [Environmental Impact](#environmental-impact)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Model Details
<!-- Give an overview of your model, the relevant research paper, who trained it, etc. -->
This isn't really a model, it's just a test repo to see if the [model card creator](https://huggingface.co/spaces/nateraw/modelcard-creator) works!
- Developed by: Nathan Raw
- Language(s):
- License: modelcard-creator-demo is licensed under the mit license
- Resources for more information:
- [Research Paper](https://arxiv.org/pdf/1810.03993.pdf)
- [GitHub Repo](https://github.com/nateraw/modelcards)
## How to Get Started with the Model
Use the code below to get started with the model.
```python
# A nice code snippet here that describes how to use the model...
```
## Uses
#### Direct Use
<!-- Describe what kind of tasks this model can be used for directly or problems it can solve. -->
[More Information Needed]
#### Downstream Use
<!-- Describe how this model could be leveraged by a downstream model (if applicable) -->
[More Information Needed]
#### Misuse and Out-of-scope Use
<!-- Describe ways in which this model ***should not*** be used. -->
[More Information Needed]
## Limitations and Biases
<!-- Describe limitations and biases of this model or models of its type. -->
**CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.**
[More Information Needed]
## Training
#### Training Data
<!-- Describe the dataset used to train this model. -->
<!-- Refer to data card if dataset is provided and exists on the hub -->
See the data card for additional information.
#### Training Procedure
<!-- Describe the preprocessing, hardware used, training hyperparameters, etc. -->
[More Information Needed]
## Evaluation Results
<!-- Describe evaluation results of this model across any datasets it was evaluated on. -->
[More Information Needed]
## Environmental Impact
<!-- Provide information to document the environmental impact of this model -->
You can estimate carbon emissions using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700)
- **Hardware Type:**
- **Hours used:**
- **Cloud Provider:**
- **Compute Region:**
- **Carbon Emitted:**
## Citation Information
```bibtex
@inproceedings{Mitchell_2019,
doi = {10.1145/3287560.3287596},
url = {https://doi.org/10.1145%2F3287560.3287596},
year = 2019,
month = {jan},
publisher = {{ACM}},
author = {Margaret Mitchell and Simone Wu and Andrew Zaldivar and Parker Barnes and Lucy Vasserman and Ben Hutchinson and Elena Spitzer and Inioluwa Deborah Raji and Timnit Gebru},
title = {Model Cards for Model Reporting},
booktitle = {Proceedings of the Conference on Fairness, Accountability, and Transparency}
}
```
|
424d6e6df924560e4a45d7fccaa1091b
|
Geotrend/distilbert-base-en-el-ru-cased
|
Geotrend
|
distilbert
| 6 | 5 |
transformers
| 0 |
fill-mask
| true | false | false |
apache-2.0
|
['multilingual']
|
['wikipedia']
| null | 1 | 1 | 0 | 0 | 0 | 0 | 0 |
[]
| false | true | true | 1,233 | false |
# distilbert-base-en-el-ru-cased
We are sharing smaller versions of [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased) that handle a custom number of languages.
Our versions give exactly the same representations as those produced by the original model, which preserves the original accuracy.
For more information please visit our paper: [Load What You Need: Smaller Versions of Multilingual BERT](https://www.aclweb.org/anthology/2020.sustainlp-1.16.pdf).
## How to use
```python
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("Geotrend/distilbert-base-en-el-ru-cased")
model = AutoModel.from_pretrained("Geotrend/distilbert-base-en-el-ru-cased")
```
To generate other smaller versions of multilingual transformers please visit [our Github repo](https://github.com/Geotrend-research/smaller-transformers).
### How to cite
```bibtex
@inproceedings{smallermdistilbert,
title={Load What You Need: Smaller Versions of Multilingual BERT},
author={Abdaoui, Amine and Pradel, Camille and Sigel, Grégoire},
booktitle={SustaiNLP / EMNLP},
year={2020}
}
```
## Contact
Please contact amine@geotrend.fr for any question, feedback or request.
|
27ad9106bb8d2121fc85075d406f35e7
|
arbml/whisper-medium-ar
|
arbml
|
whisper
| 42 | 90 |
transformers
| 0 |
automatic-speech-recognition
| true | false | false |
apache-2.0
| null |
['arbml/mgb2']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['whisper-event', 'generated_from_trainer', 'hf-asr-leaderboard']
| true | true | true | 1,851 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-medium-ar
This model is a fine-tuned version of [openai/whisper-medium](https://huggingface.co/openai/whisper-medium) on the [arbml/mgb2](https://huggingface.co/datasets/arbml/mgb2) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8488
- Wer: 16.5882
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 10000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:-------:|
| 0.2963 | 0.1 | 1000 | 0.9115 | 27.3641 |
| 0.2676 | 0.2 | 2000 | 0.8796 | 24.1024 |
| 0.3166 | 0.3 | 3000 | 0.8467 | 20.1700 |
| 0.2797 | 0.4 | 4000 | 0.8756 | 29.4889 |
| 0.2302 | 0.5 | 5000 | 0.8523 | 19.6414 |
| 0.2803 | 0.6 | 6000 | 0.8715 | 19.7413 |
| 0.2794 | 0.7 | 7000 | 0.8548 | 18.6840 |
| 0.2173 | 0.8 | 8000 | 0.8543 | 17.9019 |
| 0.217 | 0.9 | 9000 | 0.8518 | 16.3840 |
| 0.1718 | 1.0 | 10000 | 0.8488 | 16.5882 |
### Framework versions
- Transformers 4.26.0.dev0
- Pytorch 1.13.0+cu117
- Datasets 2.7.1.dev0
- Tokenizers 0.13.2
|
975acd3c5847fe5f9d9e4d58238f3d47
|
jonatasgrosman/exp_w2v2t_it_vp-nl_s335
|
jonatasgrosman
|
wav2vec2
| 10 | 7 |
transformers
| 0 |
automatic-speech-recognition
| true | false | false |
apache-2.0
|
['it']
|
['mozilla-foundation/common_voice_7_0']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['automatic-speech-recognition', 'it']
| false | true | true | 469 | false |
# exp_w2v2t_it_vp-nl_s335
Fine-tuned [facebook/wav2vec2-large-nl-voxpopuli](https://huggingface.co/facebook/wav2vec2-large-nl-voxpopuli) for speech recognition using the train split of [Common Voice 7.0 (it)](https://huggingface.co/datasets/mozilla-foundation/common_voice_7_0).
When using this model, make sure that your speech input is sampled at 16kHz.
This model has been fine-tuned by the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) tool.
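A minimal sketch that makes the 16 kHz requirement explicit (not author-provided; the audio path is a placeholder):

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "jonatasgrosman/exp_w2v2t_it_vp-nl_s335"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

waveform, sr = torchaudio.load("/path/to/audio.wav")
if sr != 16_000:  # the model expects 16 kHz input
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(processor.batch_decode(torch.argmax(logits, dim=-1))[0])
```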
|
40509d93d7a7f74df9fc87be85c4cdb7
|
keithanpai/swin-tiny-patch4-window7-224-finetuned-eurosat
|
keithanpai
|
swin
| 20 | 3 |
transformers
| 0 |
image-classification
| true | false | false |
apache-2.0
| null |
['imagefolder']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,492 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5603
- Accuracy: 0.7914
## Model description
More information needed
## Intended uses & limitations
More information needed
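A minimal inference sketch (not author-provided; the image path is a placeholder):

```python
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="keithanpai/swin-tiny-patch4-window7-224-finetuned-eurosat",
)
print(classifier("path/to/image.jpg"))  # returns the top class labels with scores
```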
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.67 | 0.99 | 70 | 0.7920 | 0.7265 |
| 0.5856 | 1.99 | 140 | 0.6192 | 0.7804 |
| 0.5612 | 2.99 | 210 | 0.5603 | 0.7914 |
### Framework versions
- Transformers 4.25.1
- Pytorch 1.13.0+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2
|
5f1d343aafd771705ef4bb520784ab0e
|
marioarteaga/distilbert-base-uncased-finetuned-squad
|
marioarteaga
|
distilbert
| 12 | 5 |
transformers
| 0 |
question-answering
| true | false | false |
apache-2.0
| null |
['squad']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,178 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-squad
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2052
## Model description
More information needed
## Intended uses & limitations
More information needed
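A minimal extractive QA sketch (not author-provided; the question and context are illustrative):

```python
from transformers import pipeline

qa = pipeline("question-answering", model="marioarteaga/distilbert-base-uncased-finetuned-squad")
result = qa(
    question="What does the model extract?",
    context="This model extracts answer spans from a given context passage.",
)
print(result["answer"], result["score"])
```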
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.2493 | 1.0 | 5533 | 1.2052 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.0+cu111
- Datasets 1.17.0
- Tokenizers 0.10.3
|
6b9029f45401732d19d6339885dbfc3b
|
Deep98/Human_Development_Index-clustered
|
Deep98
|
distilbert
| 8 | 0 |
transformers
| 0 |
question-answering
| false | true | false |
mit
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_keras_callback']
| true | true | true | 1,870 | false |
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# Deep98/Human_Development_Index-clustered
This model is a fine-tuned version of [nandysoham16/4-clustered_aug](https://huggingface.co/nandysoham16/4-clustered_aug) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.1876
- Train End Logits Accuracy: 0.9757
- Train Start Logits Accuracy: 0.9271
- Validation Loss: 0.6587
- Validation End Logits Accuracy: 0.6667
- Validation Start Logits Accuracy: 1.0
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 18, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Train End Logits Accuracy | Train Start Logits Accuracy | Validation Loss | Validation End Logits Accuracy | Validation Start Logits Accuracy | Epoch |
|:----------:|:-------------------------:|:---------------------------:|:---------------:|:------------------------------:|:--------------------------------:|:-----:|
| 0.1876 | 0.9757 | 0.9271 | 0.6587 | 0.6667 | 1.0 | 0 |
### Framework versions
- Transformers 4.26.0
- TensorFlow 2.9.2
- Datasets 2.9.0
- Tokenizers 0.13.2
|
5657e8bc42029f6ab6e9edab10ad7ad9
|
cleandata/distilbert-base-uncased-finetuned-imdb
|
cleandata
|
distilbert
| 13 | 18 |
transformers
| 0 |
fill-mask
| true | false | false |
apache-2.0
| null |
['imdb']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,318 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-imdb
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the imdb dataset.
It achieves the following results on the evaluation set:
- Loss: 2.4721
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.7086 | 1.0 | 157 | 2.4898 |
| 2.5796 | 2.0 | 314 | 2.4230 |
| 2.5269 | 3.0 | 471 | 2.4354 |
### Framework versions
- Transformers 4.25.1
- Pytorch 1.13.0+cu116
- Datasets 2.8.0
- Tokenizers 0.13.2
|
82dae3038c814377c396a537ad49579e
|
qmeeus/whisper-small-ner-combined
|
qmeeus
|
whisper_for_slu
| 22 | 1 |
transformers
| 0 |
token-classification
| true | false | false |
apache-2.0
| null |
['qmeeus/slue-voxpopuli']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['whisper-event', 'generated_from_trainer']
| true | true | true | 2,284 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# WhisperForNamedEntityRecognition
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the qmeeus/slue-voxpopuli dataset.
It achieves the following results on the evaluation set:
- Loss: 8.1514
- Wer: 10.4828
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- training_steps: 1600
- mixed_precision_training: Native AMP
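
The hyperparameters above map onto a standard `transformers` `TrainingArguments` object. The sketch below is an assumed reconstruction; the `output_dir` and the use of `TrainingArguments` itself are not confirmed by the original card:

```python
# Assumed reconstruction of the training configuration listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="whisper-small-ner-combined",  # hypothetical
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=8,  # effective train batch size: 4 * 8 = 32
    lr_scheduler_type="linear",
    warmup_steps=200,
    max_steps=1600,
    fp16=True,  # "Native AMP" mixed precision
    seed=42,
)
```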
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 31.3741 | 0.06 | 100 | 25.8582 | 10.4828 |
| 13.0078 | 1.03 | 200 | 13.4173 | 10.4828 |
| 10.3619 | 1.09 | 300 | 10.8540 | 10.4828 |
| 8.7869 | 2.06 | 400 | 9.6249 | 10.4828 |
| 7.3964 | 3.02 | 500 | 9.1812 | 10.4828 |
| 6.6321 | 3.08 | 600 | 8.6536 | 10.4828 |
| 6.4612 | 4.05 | 700 | 8.6046 | 10.4828 |
| 4.8358 | 5.02 | 800 | 8.0890 | 10.4828 |
| 4.4918 | 5.08 | 900 | 8.3141 | 10.4828 |
| 4.7548 | 6.04 | 1000 | 8.1660 | 10.4828 |
| 3.7881 | 7.01 | 1100 | 8.2471 | 10.4828 |
| 3.1916 | 7.07 | 1200 | 8.0779 | 10.4828 |
| 3.2039 | 8.04 | 1300 | 8.1106 | 10.4828 |
| 3.038 | 9.0 | 1400 | 8.0875 | 10.4828 |
| 2.3249 | 9.07 | 1500 | 8.1025 | 10.4828 |
| 2.6124 | 10.03 | 1600 | 8.1514 | 10.4828 |
### Framework versions
- Transformers 4.26.0.dev0
- Pytorch 1.10.0
- Datasets 2.7.1.dev0
- Tokenizers 0.11.0
|
86fefed50ff26c3fa783479bdda05b6b
|
Devarshi/Brain_Tumor_Classification
|
Devarshi
|
swin
| 11 | 7 |
transformers
| 0 |
image-classification
| true | false | false |
apache-2.0
| null |
['imagefolder']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,850 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Brain_Tumor_Classification
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1012
- Accuracy: 0.9647
- F1: 0.9647
- Recall: 0.9647
- Precision: 0.9647
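
A hedged inference sketch follows; the file path is a placeholder, and the pipeline task is inferred from the card's `image-classification` tag:

```python
# Hedged usage sketch: classifying an MRI image with this Swin checkpoint.
from transformers import pipeline

classifier = pipeline("image-classification", model="Devarshi/Brain_Tumor_Classification")
print(classifier("path/to/mri_scan.jpg"))  # placeholder path; returns label/score pairs
```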
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Recall | Precision |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:---------:|
| 0.4856 | 0.99 | 83 | 0.3771 | 0.8444 | 0.8444 | 0.8444 | 0.8444 |
| 0.3495 | 1.99 | 166 | 0.2608 | 0.8949 | 0.8949 | 0.8949 | 0.8949 |
| 0.252 | 2.99 | 249 | 0.1445 | 0.9487 | 0.9487 | 0.9487 | 0.9487 |
| 0.2364 | 3.99 | 332 | 0.1029 | 0.9588 | 0.9588 | 0.9588 | 0.9588 |
| 0.2178 | 4.99 | 415 | 0.1012 | 0.9647 | 0.9647 | 0.9647 | 0.9647 |
### Framework versions
- Transformers 4.23.1
- Pytorch 1.12.1
- Datasets 2.6.1
- Tokenizers 0.13.1
|
a7adf5f556216f91fd32e6f7be66d83c
|
jonatasgrosman/exp_w2v2t_fr_no-pretraining_s929
|
jonatasgrosman
|
wav2vec2
| 10 | 5 |
transformers
| 0 |
automatic-speech-recognition
| true | false | false |
apache-2.0
|
['fr']
|
['mozilla-foundation/common_voice_7_0']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['automatic-speech-recognition', 'fr']
| false | true | true | 414 | false |
# exp_w2v2t_fr_no-pretraining_s929
Randomly initialized wav2vec2 model (no pretraining) fine-tuned for speech recognition using the train split of [Common Voice 7.0 (fr)](https://huggingface.co/datasets/mozilla-foundation/common_voice_7_0).
When using this model, make sure that your speech input is sampled at 16kHz.
This model has been fine-tuned by the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) tool.
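
A minimal transcription sketch via HuggingSound, assuming its documented `SpeechRecognitionModel` interface; the audio paths are placeholders:

```python
# Minimal HuggingSound transcription sketch (audio paths are placeholders).
from huggingsound import SpeechRecognitionModel

model = SpeechRecognitionModel("jonatasgrosman/exp_w2v2t_fr_no-pretraining_s929")
audio_paths = ["/path/to/file.mp3", "/path/to/another_file.wav"]  # 16 kHz input expected

transcriptions = model.transcribe(audio_paths)
```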
|
b55c07f9ca7c2b00c6124fa7b7a411b9
|
jonatasgrosman/exp_w2v2t_ar_wavlm_s3
|
jonatasgrosman
|
wavlm
| 10 | 5 |
transformers
| 0 |
automatic-speech-recognition
| true | false | false |
apache-2.0
|
['ar']
|
['mozilla-foundation/common_voice_7_0']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['automatic-speech-recognition', 'ar']
| false | true | true | 437 | false |
# exp_w2v2t_ar_wavlm_s3
Fine-tuned [microsoft/wavlm-large](https://huggingface.co/microsoft/wavlm-large) for speech recognition using the train split of [Common Voice 7.0 (ar)](https://huggingface.co/datasets/mozilla-foundation/common_voice_7_0).
When using this model, make sure that your speech input is sampled at 16kHz.
This model has been fine-tuned by the [HuggingSound](https://github.com/jonatasgrosman/huggingsound) tool.
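
Since the card stresses 16 kHz input, here is a hedged resampling sketch using librosa; the filename is a placeholder, and any resampler works:

```python
# Hedged sketch: loading audio resampled to the 16 kHz rate this model expects.
import librosa

# librosa resamples on load when sr is given explicitly.
speech, sample_rate = librosa.load("recording.wav", sr=16_000)
assert sample_rate == 16_000
```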
|
fa62ac6b0a62d92d0d48598db4e9db06
|
krinal214/bert-all-squad_ben_tel_context
|
krinal214
|
bert
| 12 | 5 |
transformers
| 0 |
question-answering
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,174 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-all-squad_ben_tel_context
This model is a fine-tuned version of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5393
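
A hedged extractive-QA usage sketch; the question and context are illustrative:

```python
# Hedged usage sketch: extractive question answering with this multilingual checkpoint.
from transformers import pipeline

qa = pipeline("question-answering", model="krinal214/bert-all-squad_ben_tel_context")
result = qa(
    question="Where is the Eiffel Tower?",    # illustrative
    context="The Eiffel Tower is in Paris.",  # illustrative
)
print(result["answer"], result["score"])
```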
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 0.996 | 1.0 | 12676 | 0.5393 |
### Framework versions
- Transformers 4.16.2
- Pytorch 1.9.1
- Datasets 1.18.4
- Tokenizers 0.11.6
|
1f68b3cf28915dbd8f53daed1116cb1a
|
ju-bezdek/slovakbert-conll2003-sk-ner
|
ju-bezdek
| null | 14 | 2 | null | 1 | null | true | false | false |
mit
| null |
['ju-bezdek/conll2003-SK-NER']
| null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 6,326 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# outputs
This model is a fine-tuned version of [gerulata/slovakbert](https://huggingface.co/gerulata/slovakbert) on the [ju-bezdek/conll2003-SK-NER](https://huggingface.co/datasets/ju-bezdek/conll2003-SK-NER) dataset.
It achieves the following results on the evaluation (validation) set:
- Loss: 0.1752
- Precision: 0.8190
- Recall: 0.8390
- F1: 0.8288
- Accuracy: 0.9526
## Model description
More information needed
## Code example
```python
from transformers import pipeline
from spacy import displacy

model_path = "ju-bezdek/slovakbert-conll2003-sk-ner"
aggregation_strategy = "max"
ner_pipeline = pipeline(task="ner", model=model_path, aggregation_strategy=aggregation_strategy)

input_sentence = "Ruský premiér Viktor Černomyrdin v piatok povedal, že prezident Boris Jeľcin , ktorý je na dovolenke mimo Moskvy , podporil mierový plán šéfa bezpečnosti Alexandra Lebedu pre Čečensko, uviedla tlačová agentúra Interfax"

ner_ents = ner_pipeline(input_sentence)
print(ner_ents)

# Strip the "B-"/"I-" prefixes to get the entity group labels for displacy (label 0 is "O").
ent_group_labels = [ner_pipeline.model.config.id2label[i][2:] for i in ner_pipeline.model.config.id2label if i > 0]

options = {"ents": ent_group_labels}
displacy_ents = [{"start": ent["start"], "end": ent["end"], "label": ent["entity_group"]} for ent in ner_ents]
displacy.render({"text": input_sentence, "ents": displacy_ents}, style="ent", options=options, jupyter=True, manual=True)
```
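
Note on the design choice: `aggregation_strategy="max"` merges word-piece tokens into word-level entities by taking the highest-scoring label across sub-tokens; `"simple"` or `"average"` are alternative strategies offered by the `transformers` NER pipeline.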
### Result:
<div>
<span class="tex2jax_ignore"><div class="entities" style="line-height: 2.5; direction: ltr">
<mark class="entity" style="background: #ddd; padding: 0.45em 0.6em; margin: 0 0.25em; line-height: 1; border-radius: 0.35em;">
Ruský
<span style="font-size: 0.8em; font-weight: bold; line-height: 1; border-radius: 0.35em; vertical-align: middle; margin-left: 0.5rem">MISC</span>
</mark>
premiér
<mark class="entity" style="background: #ddd; padding: 0.45em 0.6em; margin: 0 0.25em; line-height: 1; border-radius: 0.35em;">
Viktor Černomyrdin
<span style="font-size: 0.8em; font-weight: bold; line-height: 1; border-radius: 0.35em; vertical-align: middle; margin-left: 0.5rem">PER</span>
</mark>
v piatok povedal, že prezident
<mark class="entity" style="background: #ddd; padding: 0.45em 0.6em; margin: 0 0.25em; line-height: 1; border-radius: 0.35em;">
Boris Jeľcin,
<span style="font-size: 0.8em; font-weight: bold; line-height: 1; border-radius: 0.35em; vertical-align: middle; margin-left: 0.5rem">PER</span>
</mark>
, ktorý je na dovolenke mimo
<mark class="entity" style="background: #ff9561; padding: 0.45em 0.6em; margin: 0 0.25em; line-height: 1; border-radius: 0.35em;">
Moskvy
<span style="font-size: 0.8em; font-weight: bold; line-height: 1; border-radius: 0.35em; vertical-align: middle; margin-left: 0.5rem">LOC</span>
</mark>
, podporil mierový plán šéfa bezpečnosti
<mark class="entity" style="background: #ddd; padding: 0.45em 0.6em; margin: 0 0.25em; line-height: 1; border-radius: 0.35em;">
Alexandra Lebedu
<span style="font-size: 0.8em; font-weight: bold; line-height: 1; border-radius: 0.35em; vertical-align: middle; margin-left: 0.5rem">PER</span>
</mark>
pre
<mark class="entity" style="background: #ff9561; padding: 0.45em 0.6em; margin: 0 0.25em; line-height: 1; border-radius: 0.35em;">
Čečensko,
<span style="font-size: 0.8em; font-weight: bold; line-height: 1; border-radius: 0.35em; vertical-align: middle; margin-left: 0.5rem">LOC</span>
</mark>
uviedla tlačová agentúra
<mark class="entity" style="background: #7aecec; padding: 0.45em 0.6em; margin: 0 0.25em; line-height: 1; border-radius: 0.35em;">
Interfax
<span style="font-size: 0.8em; font-weight: bold; line-height: 1; border-radius: 0.35em; vertical-align: middle; margin-left: 0.5rem">ORG</span>
</mark>
</div></span>
</div>
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.3237 | 1.0 | 878 | 0.2541 | 0.7125 | 0.8059 | 0.7563 | 0.9283 |
| 0.1663 | 2.0 | 1756 | 0.2370 | 0.7775 | 0.8090 | 0.7929 | 0.9394 |
| 0.1251 | 3.0 | 2634 | 0.2289 | 0.7732 | 0.8029 | 0.7878 | 0.9385 |
| 0.0984 | 4.0 | 3512 | 0.2818 | 0.7294 | 0.8189 | 0.7715 | 0.9294 |
| 0.0808 | 5.0 | 4390 | 0.3138 | 0.7615 | 0.7900 | 0.7755 | 0.9326 |
| 0.0578 | 6.0 | 5268 | 0.3072 | 0.7548 | 0.8222 | 0.7871 | 0.9370 |
| 0.0481 | 7.0 | 6146 | 0.2778 | 0.7897 | 0.8156 | 0.8025 | 0.9408 |
| 0.0414 | 8.0 | 7024 | 0.3336 | 0.7695 | 0.8201 | 0.7940 | 0.9389 |
| 0.0268 | 9.0 | 7902 | 0.3294 | 0.7868 | 0.8140 | 0.8002 | 0.9409 |
| 0.0204 | 10.0 | 8780 | 0.3693 | 0.7657 | 0.8239 | 0.7938 | 0.9376 |
| 0.016 | 11.0 | 9658 | 0.3816 | 0.7932 | 0.8242 | 0.8084 | 0.9425 |
| 0.0108 | 12.0 | 10536 | 0.3607 | 0.7929 | 0.8256 | 0.8089 | 0.9431 |
| 0.0078 | 13.0 | 11414 | 0.3980 | 0.7915 | 0.8240 | 0.8074 | 0.9423 |
| 0.0062 | 14.0 | 12292 | 0.4096 | 0.7995 | 0.8247 | 0.8119 | 0.9436 |
| 0.0035 | 15.0 | 13170 | 0.4177 | 0.8006 | 0.8251 | 0.8127 | 0.9438 |
### Framework versions
- Transformers 4.15.0
- Pytorch 1.10.1+cu102
- Datasets 1.17.0
- Tokenizers 0.10.3
|
0c018c4d6391ea846568b2b682fc19af
|
ramrajput/bert-finetuned-squad
|
ramrajput
|
bert
| 16 | 3 |
transformers
| 0 |
question-answering
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 875 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bert-finetuned-squad
This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset.
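
A hedged loading sketch using the standard `transformers` auto classes; nothing here is confirmed by the original card beyond the checkpoint name:

```python
# Hedged sketch: loading this checkpoint with the standard auto classes.
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("ramrajput/bert-finetuned-squad")
model = AutoModelForQuestionAnswering.from_pretrained("ramrajput/bert-finetuned-squad")
```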
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Framework versions
- Transformers 4.21.1
- Pytorch 1.12.1+cu113
- Tokenizers 0.12.1
|
bf8ce5c7a28f71f120950be82ff173a8
|
fathyshalab/all-roberta-large-v1-kitchen_and_dining-16-16-5-oos
|
fathyshalab
|
roberta
| 11 | 3 |
transformers
| 0 |
text-classification
| true | false | false |
apache-2.0
| null | null | null | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
['generated_from_trainer']
| true | true | true | 1,528 | false |
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# all-roberta-large-v1-kitchen_and_dining-16-16-5-oos
This model is a fine-tuned version of [sentence-transformers/all-roberta-large-v1](https://huggingface.co/sentence-transformers/all-roberta-large-v1) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 2.3560
- Accuracy: 0.2692
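
A hedged inference sketch; the input utterance is illustrative, and the task follows from the card's `text-classification` pipeline tag:

```python
# Hedged usage sketch: intent classification with this RoBERTa checkpoint.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="fathyshalab/all-roberta-large-v1-kitchen_and_dining-16-16-5-oos",
)
print(classifier("Can you book me a table for two tonight?"))  # illustrative input
```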
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.7421 | 1.0 | 1 | 2.5878 | 0.2012 |
| 2.1065 | 2.0 | 2 | 2.4975 | 0.2012 |
| 1.5994 | 3.0 | 3 | 2.4274 | 0.2249 |
| 1.1739 | 4.0 | 4 | 2.3808 | 0.2456 |
| 1.083 | 5.0 | 5 | 2.3560 | 0.2692 |
### Framework versions
- Transformers 4.20.0
- Pytorch 1.11.0+cu102
- Datasets 2.3.2
- Tokenizers 0.12.1
|
4a223fb321a9324e85f13883ad9664fe
|