| modelId (string, 5–139 chars) | author (string, 2–42 chars) | last_modified (timestamp[us, tz=UTC], 2020-02-15 11:33:14 to 2025-09-07 18:30:29) | downloads (int64, 0 to 223M) | likes (int64, 0 to 11.7k) | library_name (string, 544 classes) | tags (list, 1 to 4.05k items) | pipeline_tag (string, 55 classes) | createdAt (timestamp[us, tz=UTC], 2022-03-02 23:29:04 to 2025-09-07 18:30:28) | card (string, 11 chars to 1.01M chars) |
|---|---|---|---|---|---|---|---|---|---|
| bigmorning/whisper_charsplit_new_round2__0040 | bigmorning | 2023-08-13T19:34:23Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T19:34:15Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0040
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0040
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0002
- Train Accuracy: 0.0795
- Train Wermet: 7.9500
- Validation Loss: 0.5660
- Validation Accuracy: 0.0769
- Validation Wermet: 7.0891
- Epoch: 39
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
| 0.0005 | 0.0795 | 8.2728 | 0.5669 | 0.0768 | 7.1451 | 33 |
| 0.0010 | 0.0795 | 8.1006 | 0.5918 | 0.0766 | 7.4447 | 34 |
| 0.0036 | 0.0795 | 8.9171 | 0.5687 | 0.0767 | 7.6962 | 35 |
| 0.0018 | 0.0795 | 8.4062 | 0.5713 | 0.0768 | 7.2127 | 36 |
| 0.0012 | 0.0795 | 8.3370 | 0.5683 | 0.0768 | 7.1040 | 37 |
| 0.0005 | 0.0795 | 7.9931 | 0.5658 | 0.0769 | 6.8043 | 38 |
| 0.0002 | 0.0795 | 7.9500 | 0.5660 | 0.0769 | 7.0891 | 39 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
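The card itself includes no usage snippet; the following is a minimal TensorFlow inference sketch, not part of the original card, and it assumes the repository ships the standard Whisper processor files alongside the weights:
```python
# Minimal inference sketch (assumption: the repo contains Whisper processor files;
# the silent placeholder audio stands in for a real 16 kHz mono recording).
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

model_id = "bigmorning/whisper_charsplit_new_round2__0040"
processor = WhisperProcessor.from_pretrained(model_id)
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

audio = np.zeros(16000, dtype=np.float32)  # placeholder: 1 s of silence at 16 kHz
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")
predicted_ids = model.generate(input_features=inputs.input_features)
print(processor.batch_decode(predicted_ids, skip_special_tokens=True)[0])
```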
|
| KingKazma/xsum_gpt2_prefix_tuning_500_10_3000_8_e-1_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:31:02Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T18:33:45Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
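The card gives only the training-procedure header; below is a hedged sketch of how a PEFT prefix-tuning adapter like this one is typically loaded, assuming the adapter config records its GPT-2 base model id (the card does not state it):
```python
# Hedged sketch: load the adapter on top of the base model named in its config.
from peft import PeftConfig, PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

adapter_id = "KingKazma/xsum_gpt2_prefix_tuning_500_10_3000_8_e-1_s108_v4_l4_v100"
config = PeftConfig.from_pretrained(adapter_id)
base = AutoModelForCausalLM.from_pretrained(config.base_model_name_or_path)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)
model = PeftModel.from_pretrained(base, adapter_id)

# illustrative prompt only; the card does not document a prompt format
inputs = tokenizer("Summarize: ...", return_tensors="pt")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=30)[0]))
```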
|
| KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_8_e4_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:29:03Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:29:02Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
| bigmorning/whisper_charsplit_new_round2__0037 | bigmorning | 2023-08-13T19:21:21Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T19:21:12Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0037
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0037
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0018
- Train Accuracy: 0.0795
- Train Wermet: 8.4062
- Validation Loss: 0.5713
- Validation Accuracy: 0.0768
- Validation Wermet: 7.2127
- Epoch: 36
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
| 0.0005 | 0.0795 | 8.2728 | 0.5669 | 0.0768 | 7.1451 | 33 |
| 0.0010 | 0.0795 | 8.1006 | 0.5918 | 0.0766 | 7.4447 | 34 |
| 0.0036 | 0.0795 | 8.9171 | 0.5687 | 0.0767 | 7.6962 | 35 |
| 0.0018 | 0.0795 | 8.4062 | 0.5713 | 0.0768 | 7.2127 | 36 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
| KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_8_e3_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:20:14Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:20:13Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
| KingKazma/cnn_dailymail_gpt2_prefix_tuning_500_10_3000_8_e5_s108_v4_l4_v100 | KingKazma | 2023-08-13T19:20:07Z | 1 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T19:20:06Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
| bigmorning/whisper_charsplit_new_round2__0036 | bigmorning | 2023-08-13T19:16:56Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T19:16:50Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0036
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0036
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0036
- Train Accuracy: 0.0795
- Train Wermet: 8.9171
- Validation Loss: 0.5687
- Validation Accuracy: 0.0767
- Validation Wermet: 7.6962
- Epoch: 35
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
| 0.0005 | 0.0795 | 8.2728 | 0.5669 | 0.0768 | 7.1451 | 33 |
| 0.0010 | 0.0795 | 8.1006 | 0.5918 | 0.0766 | 7.4447 | 34 |
| 0.0036 | 0.0795 | 8.9171 | 0.5687 | 0.0767 | 7.6962 | 35 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
| bigmorning/whisper_charsplit_new_round2__0035 | bigmorning | 2023-08-13T19:12:33Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T19:12:25Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0035
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0035
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0010
- Train Accuracy: 0.0795
- Train Wermet: 8.1006
- Validation Loss: 0.5918
- Validation Accuracy: 0.0766
- Validation Wermet: 7.4447
- Epoch: 34
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
| 0.0009 | 0.0795 | 8.4768 | 0.5611 | 0.0769 | 7.6392 | 32 |
| 0.0005 | 0.0795 | 8.2728 | 0.5669 | 0.0768 | 7.1451 | 33 |
| 0.0010 | 0.0795 | 8.1006 | 0.5918 | 0.0766 | 7.4447 | 34 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
| TheRains/yt-special-batch12-lr4-augment-small | TheRains | 2023-08-13T19:08:42Z | 114 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "whisper", "automatic-speech-recognition", "whisper-event", "generated_from_trainer", "dataset:yt", "base_model:openai/whisper-small", "base_model:finetune:openai/whisper-small", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T16:31:02Z |
---
license: apache-2.0
base_model: openai/whisper-small
tags:
- whisper-event
- generated_from_trainer
datasets:
- yt
metrics:
- wer
model-index:
- name: Whisper Small Indonesian
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: yt id
type: yt
metrics:
- name: Wer
type: wer
value: 38.89501329356072
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Whisper Small Indonesian
This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the yt id dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7352
- Wer: 38.8950
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 12
- eval_batch_size: 6
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 5000
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.296 | 0.13 | 1000 | 1.3286 | 110.0123 |
| 1.1522 | 0.26 | 2000 | 1.0917 | 81.6354 |
| 1.0175 | 0.39 | 3000 | 0.9399 | 60.1258 |
| 0.7844 | 0.52 | 4000 | 0.8174 | 43.4862 |
| 0.5919 | 0.64 | 5000 | 0.7352 | 38.8950 |
### Framework versions
- Transformers 4.31.0.dev0
- Pytorch 2.0.1+cu117
- Datasets 2.13.1
- Tokenizers 0.13.3
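As a usage note not present in the original card, a PyTorch fine-tune like this can be driven through the high-level `pipeline` API. A minimal sketch (`sample.wav` is a placeholder path):
```python
# Minimal sketch (assumes the repo contains the processor/tokenizer files).
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="TheRains/yt-special-batch12-lr4-augment-small",
)
print(asr("sample.wav")["text"])  # placeholder audio file path
```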
|
| Lyviaa/Tokyoo | Lyviaa | 2023-08-13T19:03:38Z | 0 | 0 | null | ["license:creativeml-openrail-m", "region:us"] | null | 2023-08-13T19:02:57Z |
---
license: creativeml-openrail-m
---
|
| bigmorning/whisper_charsplit_new_round2__0032 | bigmorning | 2023-08-13T18:59:27Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T18:59:19Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0032
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0032
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0019
- Train Accuracy: 0.0795
- Train Wermet: 8.6037
- Validation Loss: 0.5715
- Validation Accuracy: 0.0767
- Validation Wermet: 7.6157
- Epoch: 31
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
| 0.0019 | 0.0795 | 8.6037 | 0.5715 | 0.0767 | 7.6157 | 31 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
| bigmorning/whisper_charsplit_new_round2__0031 | bigmorning | 2023-08-13T18:55:01Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T18:54:54Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0031
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0031
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0022
- Train Accuracy: 0.0795
- Train Wermet: 8.2353
- Validation Loss: 0.5789
- Validation Accuracy: 0.0767
- Validation Wermet: 7.4442
- Epoch: 30
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
| 0.0007 | 0.0795 | 8.5183 | 0.5688 | 0.0768 | 7.4310 | 28 |
| 0.0014 | 0.0795 | 8.6604 | 0.5750 | 0.0767 | 8.0751 | 29 |
| 0.0022 | 0.0795 | 8.2353 | 0.5789 | 0.0767 | 7.4442 | 30 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
| KingKazma/cnn_dailymail_gpt2_prefix_tuning_500_10_3000_8_e1_s108_v4_l4_v100 | KingKazma | 2023-08-13T18:52:26Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T18:13:23Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
| mani05/dqn-SpaceInvadersNoFrameskip-v4 | mani05 | 2023-08-13T18:48:49Z | 7 | 0 | stable-baselines3 | ["stable-baselines3", "SpaceInvadersNoFrameskip-v4", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us"] | reinforcement-learning | 2023-08-13T18:48:14Z |
---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 649.00 +/- 125.61
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```bash
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga mani05 -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed RL Zoo3 via pip (`pip install rl_zoo3`), you can run these commands from anywhere:
```bash
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga mani05 -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```bash
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga mani05
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1000000.0),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
# Environment Arguments
```python
{'render_mode': 'rgb_array'}
```
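As an alternative to the RL Zoo CLI above, the checkpoint can also be loaded directly with `huggingface_sb3`. This is a hedged sketch: the `.zip` filename follows the usual RL Zoo naming convention rather than anything stated in the card.
```python
# Hedged sketch: load the checkpoint directly instead of via the RL Zoo CLI.
from huggingface_sb3 import load_from_hub
from stable_baselines3 import DQN

checkpoint = load_from_hub(
    repo_id="mani05/dqn-SpaceInvadersNoFrameskip-v4",
    filename="dqn-SpaceInvadersNoFrameskip-v4.zip",  # assumed RL Zoo naming
)
# custom_objects lets the pickle load across SB3 version mismatches
model = DQN.load(checkpoint, custom_objects={
    "learning_rate": 0.0,
    "lr_schedule": lambda _: 0.0,
    "exploration_schedule": lambda _: 0.0,
})
```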
|
| KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_8_e-1_s108_v4_l4_v100 | KingKazma | 2023-08-13T18:44:54Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T15:37:23Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
| fathyshalab/mdcsi-wasser-strom-gas-setfit | fathyshalab | 2023-08-13T18:43:28Z | 5 | 0 | sentence-transformers | ["sentence-transformers", "pytorch", "roberta", "setfit", "text-classification", "arxiv:2209.11055", "license:apache-2.0", "region:us"] | text-classification | 2023-08-13T16:58:18Z |
---
license: apache-2.0
tags:
- setfit
- sentence-transformers
- text-classification
pipeline_tag: text-classification
---
# fathyshalab/mdcsi-wasser-strom-gas-setfit
This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Usage
To use this model for inference, first install the SetFit library:
```bash
python -m pip install setfit
```
You can then run inference as follows:
```python
from setfit import SetFitModel
# Download from Hub and run inference
model = SetFitModel.from_pretrained("fathyshalab/mdcsi-wasser-strom-gas-setfit")
# Run inference
preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"])
```
## BibTeX entry and citation info
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
|
| KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e0_s108_v4_l5_v50 | KingKazma | 2023-08-13T18:41:53Z | 1 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T17:43:19Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
| bigmorning/whisper_charsplit_new_round2__0028 | bigmorning | 2023-08-13T18:41:43Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T18:41:37Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0028
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0028
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0009
- Train Accuracy: 0.0795
- Train Wermet: 8.3074
- Validation Loss: 0.5641
- Validation Accuracy: 0.0768
- Validation Wermet: 7.1747
- Epoch: 27
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
| 0.0009 | 0.0795 | 8.3074 | 0.5641 | 0.0768 | 7.1747 | 27 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
| jstoone/whisper-tiny-en | jstoone | 2023-08-13T18:39:23Z | 85 | 0 | transformers | ["transformers", "pytorch", "whisper", "automatic-speech-recognition", "generated_from_trainer", "dataset:PolyAI/minds14", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T13:56:01Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_trainer
datasets:
- PolyAI/minds14
metrics:
- wer
model-index:
- name: whisper-tiny-en
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: PolyAI/minds14
type: PolyAI/minds14
config: en-US
split: train
args: en-US
metrics:
- name: Wer
type: wer
value: 0.35723514211886304
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# whisper-tiny-en
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the PolyAI/minds14 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9035
- Wer Ortho: 0.354643
- Wer: 0.357235
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- training_steps: 4000
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer Ortho | Wer |
|:-------------:|:------:|:----:|:---------------:|:---------:|:-------:|
| 0.0005 | 35.71 | 500 | 0.7515 | 36.1373 | 36.4341 |
| 0.0002 | 71.43 | 1000 | 0.8095 | 36.4065 | 36.5633 |
| 0.0001 | 107.14 | 1500 | 0.8421 | 36.4738 | 36.6925 |
| 0.0001 | 142.86 | 2000 | 0.8636 | 35.4643 | 35.5943 |
| 0.0001 | 178.57 | 2500 | 0.8822 | 35.6662 | 35.7235 |
| 0.0 | 214.29 | 3000 | 0.8931 | 35.4643 | 35.7235 |
| 0.0 | 250.0 | 3500 | 0.9013 | 35.4643 | 35.7235 |
| 0.0 | 285.71 | 4000 | 0.9035 | 35.4643 | 35.7235 |
### Framework versions
- Transformers 4.32.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
|
| KenmaKIN101/LenKagamineV4_250epochs | KenmaKIN101 | 2023-08-13T18:38:53Z | 0 | 0 | null | ["license:openrail", "region:us"] | null | 2023-08-13T18:32:40Z |
---
license: openrail
---
This model is intended for fair use only. It may sound a bit rough, as it was my first model (I discarded the earlier Miku model).
|
| bigmorning/whisper_charsplit_new_round2__0027 | bigmorning | 2023-08-13T18:37:27Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T18:37:20Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0027
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0027
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0011
- Train Accuracy: 0.0795
- Train Wermet: 8.4237
- Validation Loss: 0.5710
- Validation Accuracy: 0.0768
- Validation Wermet: 7.4035
- Epoch: 26
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
| 0.0012 | 0.0795 | 7.9451 | 0.5615 | 0.0768 | 7.0907 | 24 |
| 0.0025 | 0.0795 | 8.1065 | 0.5619 | 0.0768 | 7.7020 | 25 |
| 0.0011 | 0.0795 | 8.4237 | 0.5710 | 0.0768 | 7.4035 | 26 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
| KingKazma/xsum_gpt2_prompt_tuning_500_10_3000_8_e-1_s108_v4_l5_v50 | KingKazma | 2023-08-13T18:34:32Z | 1 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T17:35:50Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
| KemalHal/whisper-base-bosnian-google | KemalHal | 2023-08-13T18:29:56Z | 78 | 0 | transformers | ["transformers", "pytorch", "whisper", "automatic-speech-recognition", "bs", "dataset:google/fleurs", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-11T15:15:09Z |
---
datasets:
- google/fleurs
language:
- bs
metrics:
- wer
pipeline_tag: automatic-speech-recognition
---
This model is a fine-tuned version of the Whisper base model for the Bosnian language. The dataset used is google/fleurs (bs_ba).
|
| dn118/epicnegative | dn118 | 2023-08-13T18:29:16Z | 0 | 0 | null | ["region:us"] | null | 2023-08-13T18:25:53Z |
All credits go to epinikion: https://civitai.com/user/epinikion
Reuploaded from Civitai; source: https://civitai.com/models/89484?modelVersionId=95263
|
| SaudxInu/Reinforce-CartPole-v1 | SaudxInu | 2023-08-13T18:28:55Z | 0 | 0 | null | ["CartPole-v1", "reinforce", "reinforcement-learning", "custom-implementation", "deep-rl-class", "model-index", "region:us"] | reinforcement-learning | 2023-08-13T18:05:19Z |
---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-CartPole-v1
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 500.00 +/- 0.00
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1** .
To learn how to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
| josephamess/llama-2-7b-ExtraData-v2 | josephamess | 2023-08-13T18:24:51Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T17:36:15Z |
---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: True
- load_in_4bit: False
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: bfloat16
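For reference, here is a sketch of expressing the same quantization setup through `transformers`' `BitsAndBytesConfig`. The base checkpoint id is an assumption inferred from the repo name, and the 4-bit fields are inert here because 8-bit loading is selected:
```python
# Hedged sketch of the quantization config listed above.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_8bit=True,
    llm_int8_threshold=6.0,
    bnb_4bit_quant_type="nf4",           # inert while load_in_8bit=True
    bnb_4bit_compute_dtype=torch.bfloat16,
)
base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",  # assumption: the base implied by the repo name
    quantization_config=bnb_config,
    device_map="auto",
)
```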
### Framework versions
- PEFT 0.5.0.dev0
|
| bigmorning/whisper_charsplit_new_round2__0024 | bigmorning | 2023-08-13T18:24:25Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T18:24:17Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0024
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0024
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0008
- Train Accuracy: 0.0795
- Train Wermet: 8.3246
- Validation Loss: 0.5620
- Validation Accuracy: 0.0768
- Validation Wermet: 7.4475
- Epoch: 23
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
| 0.0008 | 0.0795 | 8.3246 | 0.5620 | 0.0768 | 7.4475 | 23 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
| charliezjw/t2 | charliezjw | 2023-08-13T18:20:49Z | 0 | 1 | diffusers | ["diffusers", "tensorboard", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "lora", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0", "license:openrail++", "region:us"] | text-to-image | 2023-08-13T17:48:17Z |
---
license: openrail++
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: a photo of sks dog
tags:
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
- text-to-image
- diffusers
- lora
inference: true
---
# LoRA DreamBooth - charliezjw/t2
These are LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0. The weights were trained on a photo of sks dog using [DreamBooth](https://dreambooth.github.io/). You can find some example images in the following.




LoRA for the text encoder was enabled: False.
Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.
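A minimal sketch for trying these weights with `diffusers`, using the base model, instance prompt, and fp16-fix VAE named above (the snippet itself is not part of the original card):
```python
# Minimal sketch: SDXL base + the fp16-safe VAE named in the card + this LoRA.
import torch
from diffusers import AutoencoderKL, DiffusionPipeline

vae = AutoencoderKL.from_pretrained(
    "madebyollin/sdxl-vae-fp16-fix", torch_dtype=torch.float16
)
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    vae=vae,
    torch_dtype=torch.float16,
).to("cuda")
pipe.load_lora_weights("charliezjw/t2")        # the LoRA weights from this repo
image = pipe("a photo of sks dog").images[0]   # instance prompt from the card
```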
|
| bigmorning/whisper_charsplit_new_round2__0023 | bigmorning | 2023-08-13T18:20:01Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T18:19:52Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0023
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0023
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0007
- Train Accuracy: 0.0795
- Train Wermet: 8.6456
- Validation Loss: 0.5543
- Validation Accuracy: 0.0768
- Validation Wermet: 7.4625
- Epoch: 22
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
| 0.0020 | 0.0795 | 8.6599 | 0.5612 | 0.0767 | 7.7369 | 21 |
| 0.0007 | 0.0795 | 8.6456 | 0.5543 | 0.0768 | 7.4625 | 22 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
| KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e8_s55555_v4_l4_v100 | KingKazma | 2023-08-13T18:19:07Z | 0 | 0 | peft | ["peft", "region:us"] | null | 2023-08-13T18:19:06Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
| RAVIKUMAR/ddpm-butterflies-128 | RAVIKUMAR | 2023-08-13T18:16:08Z | 5 | 0 | diffusers | ["diffusers", "tensorboard", "en", "dataset:huggan/smithsonian_butterflies_subset", "license:apache-2.0", "diffusers:DDPMPipeline", "region:us"] | null | 2023-08-13T18:09:17Z |
---
language: en
license: apache-2.0
library_name: diffusers
tags: []
datasets: huggan/smithsonian_butterflies_subset
metrics: []
---
<!-- This model card has been generated automatically according to the information the training script had access to. You
should probably proofread and complete it, then remove this comment. -->
# ddpm-butterflies-128
## Model description
This diffusion model is trained with the [🤗 Diffusers](https://github.com/huggingface/diffusers) library
on the `huggan/smithsonian_butterflies_subset` dataset.
## Intended uses & limitations
#### How to use
```python
# Minimal sketch filling the card's TODO; the repo id is assumed from this entry
from diffusers import DDPMPipeline
pipeline = DDPMPipeline.from_pretrained("RAVIKUMAR/ddpm-butterflies-128")
image = pipeline().images[0]
```
#### Limitations and bias
[TODO: provide examples of latent issues and potential remediations]
## Training data
[TODO: describe the data used to train the model]
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- gradient_accumulation_steps: 1
- optimizer: AdamW with betas=(None, None), weight_decay=None and epsilon=None
- lr_scheduler: None
- lr_warmup_steps: 500
- ema_inv_gamma: None
- mixed_precision: fp16
### Training results
📈 [TensorBoard logs](https://huggingface.co/HuggingFace7/ddpm-butterflies-128/tensorboard?#scalars)
|
| bigmorning/whisper_charsplit_new_round2__0021 | bigmorning | 2023-08-13T18:11:16Z | 59 | 0 | transformers | ["transformers", "tf", "whisper", "automatic-speech-recognition", "generated_from_keras_callback", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "endpoints_compatible", "region:us"] | automatic-speech-recognition | 2023-08-13T18:11:08Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0021
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0021
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0058
- Train Accuracy: 0.0794
- Train Wermet: 8.8460
- Validation Loss: 0.5706
- Validation Accuracy: 0.0766
- Validation Wermet: 7.4342
- Epoch: 20
## Model description
More information needed
## Intended uses & limitations
More information needed
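Pending more details, a hedged sketch of transcription with this checkpoint: the repo ships TensorFlow Whisper weights, and the processor is assumed to be the stock whisper-tiny one.

```python
# Hedged sketch: load the TF Whisper checkpoint and transcribe one audio clip.
import numpy as np
from transformers import WhisperProcessor, TFWhisperForConditionalGeneration

processor = WhisperProcessor.from_pretrained("openai/whisper-tiny")  # assumption: base processor
model = TFWhisperForConditionalGeneration.from_pretrained(
    "bigmorning/whisper_charsplit_new_round2__0021"  # repo id from this card
)

audio = np.zeros(16000, dtype=np.float32)  # placeholder: one second of silence at 16 kHz
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")
ids = model.generate(inputs.input_features)
print(processor.batch_decode(ids, skip_special_tokens=True)[0])
```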
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
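The optimizer dict above maps directly onto transformers' TF `AdamWeightDecay` class; a minimal sketch reconstructing it (assuming the stock constructor and a TensorFlow install):

```python
# Sketch: rebuild the optimizer described by the dict above.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,  # 'decay' in the dict is the legacy Keras LR decay, left at 0.0
)
```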
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
| 0.0058 | 0.0794 | 8.8460 | 0.5706 | 0.0766 | 7.4342 | 20 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e7_s55555_v4_l4_v100
|
KingKazma
| 2023-08-13T18:10:32Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T18:10:31Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0020
|
bigmorning
| 2023-08-13T18:06:52Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T18:06:44Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0020
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0020
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0014
- Train Accuracy: 0.0795
- Train Wermet: 8.7153
- Validation Loss: 0.5804
- Validation Accuracy: 0.0765
- Validation Wermet: 7.8654
- Epoch: 19
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
| 0.0004 | 0.0795 | 9.1734 | 0.5607 | 0.0768 | 8.0252 | 17 |
| 0.0003 | 0.0795 | 9.0084 | 0.5571 | 0.0769 | 8.1563 | 18 |
| 0.0014 | 0.0795 | 8.7153 | 0.5804 | 0.0765 | 7.8654 | 19 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e6_s55555_v4_l4_v100
|
KingKazma
| 2023-08-13T18:01:56Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T18:01:55Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0017
|
bigmorning
| 2023-08-13T17:53:35Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:53:25Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0017
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0017
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0005
- Train Accuracy: 0.0795
- Train Wermet: 9.0553
- Validation Loss: 0.5620
- Validation Accuracy: 0.0768
- Validation Wermet: 8.5020
- Epoch: 16
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
| 0.0012 | 0.0795 | 8.8259 | 0.5563 | 0.0768 | 8.2647 | 15 |
| 0.0005 | 0.0795 | 9.0553 | 0.5620 | 0.0768 | 8.5020 | 16 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e5_s55555_v4_l4_v100
|
KingKazma
| 2023-08-13T17:53:21Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:53:20Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
YassineBenlaria/tamasheq-3e-5
|
YassineBenlaria
| 2023-08-13T17:52:52Z | 7 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:jonatasgrosman/wav2vec2-large-xlsr-53-arabic",
"base_model:finetune:jonatasgrosman/wav2vec2-large-xlsr-53-arabic",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-12T10:54:16Z |
---
license: apache-2.0
base_model: jonatasgrosman/wav2vec2-large-xlsr-53-arabic
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: tamasheq-3e-5
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# tamasheq-3e-5
This model is a fine-tuned version of [jonatasgrosman/wav2vec2-large-xlsr-53-arabic](https://huggingface.co/jonatasgrosman/wav2vec2-large-xlsr-53-arabic) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6463
- Wer: 0.7256
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 30
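A hedged sketch of how the list above maps onto `transformers.TrainingArguments` (the output directory name is hypothetical):

```python
# Sketch: the hyperparameters above expressed as TrainingArguments.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="tamasheq-3e-5",     # hypothetical
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 32
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=30,
)
```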
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 16.1726 | 5.8 | 200 | 3.5775 | 1.0 |
| 3.2056 | 11.59 | 400 | 3.0132 | 1.0 |
| 2.3808 | 17.39 | 600 | 1.0264 | 0.8069 |
| 0.6535 | 23.19 | 800 | 0.6919 | 0.7388 |
| 0.4426 | 28.99 | 1000 | 0.6463 | 0.7256 |
### Framework versions
- Transformers 4.31.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
|
Icebear-AI/Llama-2-13b-chat-arabic-lora
|
Icebear-AI
| 2023-08-13T17:52:46Z | 0 | 7 | null |
[
"en",
"ar",
"dataset:Yasbok/Alpaca_arabic_instruct",
"license:apache-2.0",
"region:us"
] | null | 2023-08-13T17:49:32Z |
---
license: apache-2.0
datasets:
- Yasbok/Alpaca_arabic_instruct
language:
- en
- ar
---
This model is a LoRA adapter fine-tuned from the Llama-2-13b-chat-hf model. It is experimental.
This model is presented by IceBear-AI.
To run it, you need to:
- Accept Meta's license agreement, then download the Llama-2-13b-chat-hf model from here: https://huggingface.co/meta-llama/Llama-2-13b-chat-hf
- Clone this repository
- Clone the Alpaca-LoRA repository from here: https://github.com/tloen/alpaca-lora
- Use this command to run it:

      python generate.py \
          --load_8bit \
          --base_model 'PATH_TO_YOUR_LOCAL_LLAMA_2_13B_CHAT_HF' \
          --lora_weights 'PATH_TO_YOUR_LOCAL_FILE_OF_THIS_MODEL'
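As an alternative to the Alpaca-LoRA script, the adapter can presumably also be applied with the peft library directly; a hedged sketch (assuming standard `PeftModel` loading, with the gated base model already accessible):

```python
# Hedged sketch: load the LoRA adapter with peft instead of generate.py.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-13b-chat-hf")  # gated: license must be accepted
model = PeftModel.from_pretrained(base, "Icebear-AI/Llama-2-13b-chat-arabic-lora")
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-13b-chat-hf")

inputs = tokenizer("مرحبا", return_tensors="pt")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=64)[0], skip_special_tokens=True))
```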
You must accept Meta's Llama 2 license agreement to use this model.
If you would like to contact us, please email icebearai@163.com.
|
bigmorning/whisper_charsplit_new_round2__0015
|
bigmorning
| 2023-08-13T17:44:49Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:44:29Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0015
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0015
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0038
- Train Accuracy: 0.0795
- Train Wermet: 8.7270
- Validation Loss: 0.5605
- Validation Accuracy: 0.0767
- Validation Wermet: 7.7098
- Epoch: 14
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
| 0.0038 | 0.0795 | 8.7270 | 0.5605 | 0.0767 | 7.7098 | 14 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e9_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T17:42:02Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:41:58Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0014
|
bigmorning
| 2023-08-13T17:40:14Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:40:08Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0014
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0014
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0037
- Train Accuracy: 0.0795
- Train Wermet: 9.2838
- Validation Loss: 0.5751
- Validation Accuracy: 0.0765
- Validation Wermet: 7.4189
- Epoch: 13
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
| 0.0037 | 0.0795 | 9.2838 | 0.5751 | 0.0765 | 7.4189 | 13 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e8_s55555_v4_l4_r2
|
KingKazma
| 2023-08-13T17:39:00Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:38:59Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0013
|
bigmorning
| 2023-08-13T17:35:54Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:35:47Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0013
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0013
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0005
- Train Accuracy: 0.0795
- Train Wermet: 9.2292
- Validation Loss: 0.5687
- Validation Accuracy: 0.0767
- Validation Wermet: 8.5576
- Epoch: 12
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
| 0.0003 | 0.0795 | 9.3340 | 0.5584 | 0.0768 | 8.1322 | 11 |
| 0.0005 | 0.0795 | 9.2292 | 0.5687 | 0.0767 | 8.5576 | 12 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e8_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T17:33:23Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:33:19Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e2_s55555_v4_l4_v100
|
KingKazma
| 2023-08-13T17:27:47Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:27:45Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0011
|
bigmorning
| 2023-08-13T17:27:17Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:27:08Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0011
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0011
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0005
- Train Accuracy: 0.0795
- Train Wermet: 9.3749
- Validation Loss: 0.5552
- Validation Accuracy: 0.0768
- Validation Wermet: 8.0800
- Epoch: 10
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
| 0.0005 | 0.0795 | 9.3749 | 0.5552 | 0.0768 | 8.0800 | 10 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
bigmorning/whisper_charsplit_new_round2__0010
|
bigmorning
| 2023-08-13T17:22:53Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:22:45Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0010
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0010
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0011
- Train Accuracy: 0.0795
- Train Wermet: 8.9730
- Validation Loss: 0.5605
- Validation Accuracy: 0.0767
- Validation Wermet: 8.3958
- Epoch: 9
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
| 0.0011 | 0.0795 | 8.9730 | 0.5605 | 0.0767 | 8.3958 | 9 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e8_s55555_v4_l4_r4
|
KingKazma
| 2023-08-13T17:22:22Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:22:20Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0009
|
bigmorning
| 2023-08-13T17:18:33Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:18:25Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0009
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0009
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0031
- Train Accuracy: 0.0795
- Train Wermet: 9.2135
- Validation Loss: 0.5636
- Validation Accuracy: 0.0766
- Validation Wermet: 8.2384
- Epoch: 8
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
| 0.0031 | 0.0795 | 9.2135 | 0.5636 | 0.0766 | 8.2384 | 8 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e5_s55555_v4_l4_r2
|
KingKazma
| 2023-08-13T17:16:58Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:16:56Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0008
|
bigmorning
| 2023-08-13T17:14:11Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:14:03Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0008
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0008
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0037
- Train Accuracy: 0.0795
- Train Wermet: 9.3428
- Validation Loss: 0.5717
- Validation Accuracy: 0.0764
- Validation Wermet: 8.2631
- Epoch: 7
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
| 0.0009 | 0.0795 | 8.7510 | 0.5642 | 0.0766 | 7.9083 | 6 |
| 0.0037 | 0.0795 | 9.3428 | 0.5717 | 0.0764 | 8.2631 | 7 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e0_s55555_v4_l4_v100
|
KingKazma
| 2023-08-13T17:10:36Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:10:35Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e4_s55555_v4_l4_r2
|
KingKazma
| 2023-08-13T17:09:37Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:09:36Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e6_s55555_v4_l4_r4
|
KingKazma
| 2023-08-13T17:08:38Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:08:36Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0006
|
bigmorning
| 2023-08-13T17:05:24Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:05:17Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0006
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0006
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0012
- Train Accuracy: 0.0795
- Train Wermet: 8.8862
- Validation Loss: 0.5667
- Validation Accuracy: 0.0767
- Validation Wermet: 8.2913
- Epoch: 5
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
| 0.0012 | 0.0795 | 8.8862 | 0.5667 | 0.0767 | 8.2913 | 5 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
jiaqixuac/controlnet_training
|
jiaqixuac
| 2023-08-13T17:05:05Z | 0 | 0 |
diffusers
|
[
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"controlnet",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:adapter:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"region:us"
] |
text-to-image
| 2023-08-13T14:21:44Z |
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- controlnet
inference: true
---
# controlnet-jiaqixuac/controlnet_training
These are ControlNet weights trained on runwayml/stable-diffusion-v1-5 with a new type of conditioning.
You can find some example images below.
prompt: red circle with blue background

prompt: cyan circle with brown floral background

|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e-1_s55555_v4_l4_v100
|
KingKazma
| 2023-08-13T17:02:02Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:58:53Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e5_s55555_v4_l4_r4
|
KingKazma
| 2023-08-13T17:01:46Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T17:01:44Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0005
|
bigmorning
| 2023-08-13T17:00:54Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T17:00:46Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0005
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0005
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0011
- Train Accuracy: 0.0795
- Train Wermet: 8.9053
- Validation Loss: 0.5609
- Validation Accuracy: 0.0767
- Validation Wermet: 7.5155
- Epoch: 4
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
| 0.0011 | 0.0795 | 8.9053 | 0.5609 | 0.0767 | 7.5155 | 4 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e4_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T16:58:48Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:58:44Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_round2__0004
|
bigmorning
| 2023-08-13T16:56:30Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T16:56:23Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_round2__0004
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_round2__0004
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0019
- Train Accuracy: 0.0795
- Train Wermet: 8.9450
- Validation Loss: 0.5623
- Validation Accuracy: 0.0766
- Validation Wermet: 7.7117
- Epoch: 3
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.0010 | 0.0795 | 8.7507 | 0.5575 | 0.0767 | 7.6778 | 0 |
| 0.0013 | 0.0795 | 8.9468 | 0.5652 | 0.0766 | 8.3360 | 1 |
| 0.0025 | 0.0795 | 8.7338 | 0.5673 | 0.0765 | 8.3770 | 2 |
| 0.0019 | 0.0795 | 8.9450 | 0.5623 | 0.0766 | 7.7117 | 3 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e2_s55555_v4_l4_r2
|
KingKazma
| 2023-08-13T16:54:56Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:54:53Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
jsnbuchanan/segformer-b0-scene-parse-150
|
jsnbuchanan
| 2023-08-13T16:50:58Z | 31 | 0 |
transformers
|
[
"transformers",
"pytorch",
"segformer",
"generated_from_trainer",
"dataset:scene_parse_150",
"base_model:nvidia/mit-b0",
"base_model:finetune:nvidia/mit-b0",
"license:other",
"endpoints_compatible",
"region:us"
] | null | 2023-08-01T20:37:16Z |
---
license: other
base_model: nvidia/mit-b0
tags:
- generated_from_trainer
datasets:
- scene_parse_150
model-index:
- name: segformer-b0-scene-parse-150
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# segformer-b0-scene-parse-150
This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the scene_parse_150 dataset.
It achieves the following results on the evaluation set:
- Loss: 4.5716
- Mean Iou: 0.0039
- Mean Accuracy: 0.0219
- Overall Accuracy: 0.1398
- Per Category Iou: [0.1424604255351693, 0.0028172808510882213, 0.009342676914231785, 0.0, 0.0, 0.02331811292704824, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan]
- Per Category Accuracy: [0.9514506078251098, 0.0028769356391743226, 0.00966095515858549, 0.0, 0.0, 0.045009037210949, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan]
## Model description
More information needed
## Intended uses & limitations
More information needed
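Pending more details, a hedged inference sketch following the usual Segformer pattern (the input image is a placeholder, and `SegformerImageProcessor` is assumed available in the installed transformers version):

```python
# Hedged sketch: semantic segmentation with the fine-tuned checkpoint.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

processor = SegformerImageProcessor.from_pretrained("jsnbuchanan/segformer-b0-scene-parse-150")
model = SegformerForSemanticSegmentation.from_pretrained("jsnbuchanan/segformer-b0-scene-parse-150")

image = Image.open("scene.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits   # shape: (1, num_labels, height/4, width/4)
pred = logits.argmax(dim=1)[0]        # per-pixel predicted class ids
```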
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|
| 4.8633 | 1.0 | 20 | 4.8626 | 0.0009 | 0.0023 | 0.0067 | [0.0, 0.0004889263991152761, 0.0, 0.0, 0.0, 0.0, 0.03284478144986514, 0.0, 0.0, 0.014472940861907617, 0.0, 0.0009606283639651349, 0.0, 0.001090056864633105, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0033905507210453163, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, 0.011123126834543489, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan] | [0.0, 0.0004916838121884905, 0.0, 0.0, 0.0, 0.0, 0.05156049842785606, 0.0, 0.0, 0.02758031245634076, 0.0, 0.002084802403654536, 0.0, 0.0011670427137633237, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.003448710560437977, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.02041973908111174, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 4.8081 | 2.0 | 40 | 4.6105 | 0.0014 | 0.0050 | 0.0207 | [0.0014870097866647892, 0.00010797969981643452, 0.0, 0.0, 0.0, 0.005608097195054107, 0.06877789289425044, 0.0, 0.0, 0.012758644335110486, 0.0, 0.0, 0.0, 0.0023358985966500678, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, 0.0030744981206588954, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan] | [0.0014982803827425341, 0.0001082875062557985, 0.0, 0.0, 0.0, 0.005822945199098142, 0.19299522534063118, 0.0, 0.0, 0.024465610900633625, 0.0, 0.0, 0.0, 0.002509141834591146, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.00319658550641118, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 4.6722 | 3.0 | 60 | 4.3976 | 0.0024 | 0.0122 | 0.0530 | [0.025172656541163803, 0.0010250959756997078, 0.0, 0.00034731034851257663, 0.0, 0.01513758223102497, 0.08697308653455893, 0.0, 0.0, 0.018849001504380403, 0.0, 0.0, 0.0, 0.002455752600466593, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0010695187165775401, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan] | [0.031049590810546986, 0.001033121343467483, 0.0, 0.0003484413948377067, 0.0, 0.01993776436171204, 0.465229832471011, 0.0, 0.0, 0.042011489057453506, 0.0, 0.0, 0.0, 0.0027230996654477556, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0010805359458291313, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 4.3876 | 4.0 | 80 | 4.1595 | 0.0032 | 0.0120 | 0.0594 | [0.07854437258265586, 0.002560575118273506, 0.0, 0.0, 0.0, 0.001192829675600377, 0.0810610478529872, 0.0, 0.0, 0.014937473673721733, 0.0, 0.0, 0.0, 0.005779474740910093, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan] | [0.16734486555203687, 0.0026310937330800773, 0.0, 0.0, 0.0, 0.001332289861553655, 0.3434053136801477, 0.0, 0.0, 0.027103656281588746, 0.0, 0.0, 0.0, 0.009686454524235588, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 4.5927 | 5.0 | 100 | 4.1201 | 0.0038 | 0.0139 | 0.0738 | [0.10272745389575265, 5.808145343029064e-06, 0.0011583054074428688, 0.0, 0.0, 0.004795549066148129, 0.07399210984610646, 0.0, 0.0, 0.021106484070283166, 0.0, 0.0, 0.0, 0.004504373601464648, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.31741694191893394, 5.853378716529649e-06, 0.0011806276115426681, 0.0, 0.0, 0.00645648163676002, 0.2597614333959973, 0.0, 0.0, 0.0499420616201379, 0.0, 0.0, 0.0, 0.006029720687777173, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6357 | 6.0 | 120 | 3.8936 | 0.0031 | 0.0183 | 0.1193 | [0.13782355615369993, 0.00012947106753210882, 0.0020724837921139334, 0.0, 0.0, 0.002555414284014255, 0.00625454345947577, 0.0, 0.0, 0.0021069049880189433, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.8250104993132882, 0.0001317010211219171, 0.002187386073641998, 0.0, 0.0, 0.0038012186259712673, 0.0075862183699612375, 0.0, 0.0, 0.0027531003196883653, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.9752 | 7.0 | 140 | 3.7886 | 0.0046 | 0.0208 | 0.1099 | [0.12970556062622363, 0.02142529574722971, 0.015682682019160826, 0.0, 0.0, 0.046496050677108665, 0.011263870769586333, 0.0, 0.0, 7.157683773530885e-05, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.6339173221643342, 0.026799694453630996, 0.017715022855380128, 0.0, 0.0, 0.2645107794361526, 0.01540534695303532, 0.0, 0.0, 8.218209909517509e-05, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 4.3363 | 8.0 | 160 | 3.7281 | 0.0044 | 0.0211 | 0.1287 | [0.13844309203854324, 0.006048050986190329, 0.03636386971467197, 0.0, 0.0, 0.029408017788998847, 0.00848082968593638, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.8096133982588166, 0.006476763549840056, 0.047673799040915336, 0.0, 0.0, 0.09462984701958373, 0.010468482257232695, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 4.0164 | 9.0 | 180 | 3.7593 | 0.0058 | 0.0207 | 0.1113 | [0.1257575586728653, 0.031513436815826704, 0.04777990089072147, 0.0, 1.3152702880441932e-05, 0.04167081950119985, 0.042454800300151495, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.5380103517553717, 0.04309550080044954, 0.06794918533890462, 0.0, 1.6288501946475982e-05, 0.20255464251774832, 0.09977291254221497, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.5911 | 10.0 | 200 | 3.4747 | 0.0045 | 0.0213 | 0.1331 | [0.1420633887795078, 0.015648167497586848, 0.018910536514932744, 0.0, 0.0, 0.024232767348835664, 0.007368267428066774, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.8642552297930784, 0.018408876063485746, 0.022330968338988753, 0.0, 0.0, 0.0680119999254663, 0.008189289457485567, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6199 | 11.0 | 220 | 3.5161 | 0.0043 | 0.0210 | 0.1347 | [0.14182326280462046, 0.005032394655020479, 0.028134080245829304, 0.0, 0.0, 0.014572871324844493, 0.013168143969916734, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.8793316761444252, 0.005314867874608921, 0.037280910849995796, 0.0, 0.0, 0.029226526543313397, 0.016312033139796036, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6997 | 12.0 | 240 | 3.4768 | 0.0032 | 0.0217 | 0.1424 | [0.14365582667494992, 0.0001899779333323591, 0.00036495363737125246, 0.0, 0.0, 0.0013590921597892156, 0.0009173402092527396, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9941061962974314, 0.0001902348082872136, 0.0003785860512072689, 0.0, 0.0, 0.0015186241079247955, 0.0009233226305544927, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.8775 | 13.0 | 260 | 3.4909 | 0.0034 | 0.0215 | 0.1412 | [0.14366726677393113, 0.0, 0.00935392023848197, 0.0, 0.0, 0.002252103613265138, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9757607745655554, 0.0, 0.01098179982613085, 0.0, 0.0, 0.0025434624629660685, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6896 | 14.0 | 280 | 3.4944 | 0.0032 | 0.0216 | 0.1419 | [0.14372170627530323, 0.0, 0.0026556544494494754, 0.0, 0.0, 0.0026339040125360493, 5.807995950997109e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9887260076503105, 0.0, 0.002880058330295297, 0.0, 0.0, 0.002944081092664021, 5.822755327821125e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.5723 | 15.0 | 300 | 3.4798 | 0.0033 | 0.0216 | 0.1420 | [0.14360235216056103, 0.0, 0.005107262858619798, 0.0, 0.0, 0.0031149607580008946, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.986302652637314, 0.0, 0.005541378053226394, 0.0, 0.0, 0.0035683008180073415, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7581 | 16.0 | 320 | 3.5572 | 0.0037 | 0.0217 | 0.1428 | [0.14338450814233764, 0.0, 0.023961774556760896, 0.0, 0.0, 0.0016283550899666187, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9689248703192926, 0.0, 0.029392299279284332, 0.0, 0.0, 0.001677008217340265, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.5605 | 17.0 | 340 | 3.4733 | 0.0035 | 0.0217 | 0.1426 | [0.14333996536268165, 5.851409311932779e-06, 0.015588103385475387, 0.0, 0.0, 0.0036605352700892673, 2.4902258634858183e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9784338429756756, 5.853378716529649e-06, 0.017642109986258727, 0.0, 0.0, 0.004099353420165092, 2.4954665690661966e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7163 | 18.0 | 360 | 3.5516 | 0.0036 | 0.0217 | 0.1420 | [0.1431641368804744, 1.7551198308064484e-05, 0.017346049550404807, 0.0, 0.0, 0.006382572138352761, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9707466430574, 1.7560136149588946e-05, 0.019983734821503688, 0.0, 0.0, 0.007779454785995118, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.2231 | 19.0 | 380 | 3.5751 | 0.0051 | 0.0222 | 0.1352 | [0.14144860113740007, 0.003382545372614381, 0.05970141064540731, 0.0, 0.0, 0.03208762207850171, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.8172324945233311, 0.003427153238528109, 0.0915421071819176, 0.0, 0.0, 0.1078409450872976, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7188 | 20.0 | 400 | 3.4891 | 0.0042 | 0.0218 | 0.1394 | [0.14279628860196056, 0.00783605754787311, 0.023641167287138224, 0.0, 0.0, 0.019599678095523283, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9274525828310689, 0.00824448392223201, 0.02760312964468998, 0.0, 0.0, 0.03902770790243539, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6156 | 21.0 | 420 | 3.5195 | 0.0039 | 0.0218 | 0.1430 | [0.1431243377907112, 0.003771829965220418, 0.028600253499599246, 0.0, 0.0, 0.003834416457157314, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9602076026378815, 0.003869083331626098, 0.03442328724866093, 0.0, 0.0, 0.004518605474500158, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7812 | 22.0 | 440 | 3.5044 | 0.0034 | 0.0218 | 0.1431 | [0.14375063548866288, 0.003941135854106936, 0.008966284779050737, 0.0, 0.0, 0.0008825520320117902, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9869070725644431, 0.004059318139913311, 0.00960206399506436, 0.0, 0.0, 0.0009596213688113738, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.9414 | 23.0 | 460 | 3.6536 | 0.0031 | 0.0217 | 0.1430 | [0.1433341827741543, 0.0009574223513012504, 1.945649666321082e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9991458666757471, 0.0009628807988691273, 1.9630387840376903e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.3775 | 24.0 | 480 | 3.5794 | 0.0031 | 0.0217 | 0.1429 | [0.14339061919565174, 0.0004489089181295073, 0.0005344750160480256, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9989443933667042, 0.00045071016117278296, 0.0005440421772904456, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7282 | 25.0 | 500 | 4.1077 | 0.0031 | 0.0217 | 0.1430 | [0.14301211156076965, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9999744611298396, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.2262 | 26.0 | 520 | 4.1736 | 0.0031 | 0.0217 | 0.1430 | [0.1429770208690924, 0.00013157471543312935, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9998581173879979, 0.0001317010211219171, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.3175 | 27.0 | 540 | 4.0696 | 0.0031 | 0.0217 | 0.1430 | [0.143028637658645, 0.0009629118921073225, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9990578994563059, 0.0009687341775856569, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 2.8936 | 28.0 | 560 | 4.5036 | 0.0031 | 0.0217 | 0.1430 | [0.14297076727730323, 0.0002453013117779673, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9997531242551163, 0.00024584190609424527, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7712 | 29.0 | 580 | 4.0164 | 0.0031 | 0.0217 | 0.1429 | [0.14297121162029977, 0.0007422241109319424, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9989103415398236, 0.0007463057863575303, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.0557 | 30.0 | 600 | 4.2748 | 0.0031 | 0.0217 | 0.1429 | [0.14303412843917407, 0.0003856052395572551, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9988961532786234, 0.00038632299529095683, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.3498 | 31.0 | 620 | 4.4146 | 0.0031 | 0.0216 | 0.1423 | [0.14290109320524125, 0.0001608060182383262, 0.0, 0.0, 0.0, 0.0005434010215939206, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9951334264083268, 0.00016096791470456535, 0.0, 0.0, 0.0, 0.0006055863007062068, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6119 | 32.0 | 640 | 4.1771 | 0.0031 | 0.0217 | 0.1425 | [0.14301796804098232, 0.00011991109031352363, 0.0, 0.0, 0.0, 2.5596614421132564e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9967650764463514, 0.00011999426368885781, 0.0, 0.0, 0.0, 2.7950136955671083e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.2928 | 33.0 | 660 | 4.2533 | 0.0031 | 0.0217 | 0.1426 | [0.1430651199052239, 0.0, 0.0, 0.0, 0.0, 0.0005383649088625119, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9971878866301177, 0.0, 0.0, 0.0, 0.0, 0.0005869528760690928, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 2.9388 | 34.0 | 680 | 4.4678 | 0.0031 | 0.0217 | 0.1424 | [0.14302521070021848, 8.777165326686094e-06, 0.0, 0.0, 0.0, 0.00013477429517255322, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9962911885222642, 8.780068074794473e-06, 0.0, 0.0, 0.0, 0.00014906739709691244, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.0637 | 35.0 | 700 | 4.3135 | 0.0032 | 0.0216 | 0.1418 | [0.1429503635871641, 0.00015791597700275475, 1.398789208061502e-05, 0.0, 0.0, 0.0028473543966501713, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9908826233527429, 0.00015804122534630053, 1.4021705600269217e-05, 0.0, 0.0, 0.0034844504071403284, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 2.8097 | 36.0 | 720 | 4.5076 | 0.0033 | 0.0215 | 0.1403 | [0.14292913158156023, 0.0009571919257359143, 6.428765090128491e-05, 0.0, 0.0, 0.008040658872532778, 0.00016170025747656383, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9762885778822034, 0.0009599541095108625, 6.44998457612384e-05, 0.0, 0.0, 0.012661412040919001, 0.00016220532698930278, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.5005 | 37.0 | 740 | 4.9189 | 0.0032 | 0.0216 | 0.1416 | [0.14321751517531003, 0.0011148109769586326, 3.077732326821598e-05, 0.0, 0.0, 0.002581131431496914, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9885330472979875, 0.0011209220242154277, 3.0847752320592275e-05, 0.0, 0.0, 0.003381966571636201, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.3499 | 38.0 | 760 | 4.8759 | 0.0037 | 0.0216 | 0.1383 | [0.14275769206892197, 0.00015486525262029086, 0.008065831982682663, 0.0, 0.0, 0.018616346885301328, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9461782499631105, 0.0001551145359880357, 0.0083176757620797, 0.0, 0.0, 0.041040117763243705, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.0865 | 39.0 | 780 | 3.8349 | 0.0041 | 0.0219 | 0.1405 | [0.14317728070866245, 0.0019387900438625815, 0.0272595934506007, 0.0, 0.0, 0.01595177221476263, 0.0019825804723817746, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9389535873599618, 0.001952101801962638, 0.030160688746179085, 0.0, 0.0, 0.03231967503307433, 0.0020629190304280558, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.4726 | 40.0 | 800 | 3.9250 | 0.0037 | 0.0216 | 0.1400 | [0.14244053050451713, 0.0022596513184196345, 0.012475440730295781, 0.0, 0.0, 0.013494958163484231, 0.000710175687780943, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9559227477554171, 0.002282817699446563, 0.013087860007291287, 0.0, 0.0, 0.02344084819348948, 0.0007320035269260843, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.0128 | 41.0 | 820 | 4.4055 | 0.0033 | 0.0215 | 0.1403 | [0.14267630712423943, 0.0006876557059397718, 0.002249988165285022, 0.0, 0.0, 0.008327223523661356, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9745434217545771, 0.0006906986885504985, 0.0022659076250035053, 0.0, 0.0, 0.013015447109024168, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.1359 | 42.0 | 840 | 4.3485 | 0.0032 | 0.0215 | 0.1409 | [0.1428689296917672, 0.00043785776628320057, 0.000735403087574484, 0.0, 0.0, 0.004521202653144983, 4.1517204729639965e-06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9827243731626202, 0.00043900340373972366, 0.0007375417145741608, 0.0, 0.0, 0.00636331451357445, 4.159110948443661e-06, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.3446 | 43.0 | 860 | 4.1294 | 0.0033 | 0.0216 | 0.1415 | [0.14297466923456498, 0.0010833321199237007, 0.0038991084994498782, 0.0, 0.0, 0.0033002377289889527, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9833543319599096, 0.0010887284412745147, 0.003965338343756134, 0.0, 0.0, 0.004397488214358917, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7443 | 44.0 | 880 | 4.2309 | 0.0034 | 0.0215 | 0.1398 | [0.14257015931769873, 0.0010936527419852125, 0.004274154927043165, 0.0, 0.0, 0.010032222702056273, 4.527177469472419e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9670633704498246, 0.001100435198707574, 0.004329902689363134, 0.0, 0.0, 0.016881882721225334, 4.575022043288027e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 2.8209 | 45.0 | 900 | 4.4797 | 0.0033 | 0.0215 | 0.1407 | [0.14293897407373585, 0.001303224682901344, 0.002799500704740247, 0.0, 0.0, 0.006579395289821434, 2.0683977777134275e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9771739253810967, 0.0013140835218609062, 0.0028239715078942204, 0.0, 0.0, 0.00990366519462612, 2.0795554742218304e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 2.9059 | 46.0 | 920 | 4.5053 | 0.0040 | 0.0220 | 0.1364 | [0.14228787838186308, 0.0013217808522283945, 0.007278362518756201, 0.0, 0.0, 0.030787987593700387, 6.171544243800683e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9191155605498235, 0.0013287169686522302, 0.007467960402703385, 0.0, 0.0, 0.0856392196321762, 6.238666422665492e-05, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.2262 | 47.0 | 940 | 5.3410 | 0.0039 | 0.0219 | 0.1365 | [0.1423258178089665, 0.003478698692055064, 0.0075020785926833535, 0.0, 0.0, 0.02704553235193427, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9217006617405024, 0.0035588542596500265, 0.007692307692307693, 0.0, 0.0, 0.07286600704343452, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 2.7618 | 48.0 | 960 | 5.0946 | 0.0038 | 0.0218 | 0.1373 | [0.1426949704126623, 0.0032438673758399044, 0.0051523797709134515, 0.0, 0.0, 0.023948251838033993, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.93417498098773, 0.0033217924216305756, 0.005241313553380633, 0.0, 0.0, 0.05908658952428867, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.7284 | 49.0 | 980 | 4.0830 | 0.0043 | 0.0221 | 0.1373 | [0.14283896315255537, 0.0034238556212341873, 0.024018439121225456, 0.0, 0.0, 0.029172219275206687, 0.0002549698559831555, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9045640798628847, 0.0035091005405595245, 0.026329958776185537, 0.0, 0.0, 0.08402742840106583, 0.00025786487880350697, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
| 3.6351 | 50.0 | 1000 | 4.5716 | 0.0039 | 0.0219 | 0.1398 | [0.1424604255351693, 0.0028172808510882213, 0.009342676914231785, 0.0, 0.0, 0.02331811292704824, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] | [0.9514506078251098, 0.0028769356391743226, 0.00966095515858549, 0.0, 0.0, 0.045009037210949, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, 0.0, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan] |
### Framework versions
- Transformers 4.31.0
- Pytorch 2.1.0.dev20230812
- Datasets 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e3_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T16:50:09Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:50:05Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
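The auto-generated card records only the PEFT version, so as a minimal, hedged illustration: an adapter like this one is normally attached to its base model via `PeftModel`. The repo id below is this record's modelId; the actual base model is whatever `adapter_config.json` in that repo names.

```python
# Sketch only: attaching a saved PEFT adapter to its base model.
# The adapter repo id comes from this record; the base model is read from
# the adapter_config.json stored in that repo.
from peft import PeftConfig, PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

adapter_id = "KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e3_s108_v4_l4_v100"
peft_config = PeftConfig.from_pretrained(adapter_id)
base_model = AutoModelForCausalLM.from_pretrained(peft_config.base_model_name_or_path)
tokenizer = AutoTokenizer.from_pretrained(peft_config.base_model_name_or_path)
model = PeftModel.from_pretrained(base_model, adapter_id)  # loads the prompt-tuning weights
model.eval()
```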
|
LBR47/speecht5_finetuned_voxpopuli_nl
|
LBR47
| 2023-08-13T16:49:06Z | 82 | 0 |
transformers
|
[
"transformers",
"pytorch",
"speecht5",
"text-to-audio",
"text-to-speech",
"dataset:voxpopuli",
"endpoints_compatible",
"region:us"
] |
text-to-speech
| 2023-08-13T16:41:15Z |
---
base_model: speechT5
tags:
- text-to-speech
datasets:
- voxpopuli
model-index:
- name: speecht5_finetuned_voxpopuli_nl
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# speecht5_finetuned_voxpopuli_nl
This model is a fine-tuned version of SpeechT5 on the voxpopuli dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 4000
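A hedged sketch of how this list maps onto `Seq2SeqTrainingArguments` (the output directory is an assumption; the numeric values mirror the card, and 4 × 8 gradient accumulation gives the stated total batch size of 32):

```python
# Sketch: reconstructing the training arguments from the hyperparameter list above.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="speecht5_finetuned_voxpopuli_nl",  # assumed directory name
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=8,  # effective train batch size: 4 * 8 = 32
    warmup_steps=500,
    max_steps=4000,
    lr_scheduler_type="linear",
    seed=42,  # Adam betas/epsilon are the defaults listed in the card
)
```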
### Framework versions
- Transformers 4.31.0
- Pytorch 1.12.1+cpu
- Datasets 2.13.1
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e8_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T16:47:03Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:47:02Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e1_s55555_v4_l4_r4
|
KingKazma
| 2023-08-13T16:34:17Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:34:15Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
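The card itself stores no configuration. Purely as a hypothetical sketch of a LoRA setup for a GPT-2 base, with every value an assumption (the rank of 4 is only guessed from the `r4` suffix in the model name):

```python
# Hypothetical sketch -- none of these values are recorded in the card.
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("gpt2")  # assumed base
lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=4,               # guessed from the "r4" suffix in the model name
    lora_alpha=32,     # assumed
    lora_dropout=0.1,  # assumed
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the low-rank matrices train
```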
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e1_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T16:32:55Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:32:49Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
digicazter/wav2vec2-base-timit-demo-google-colab
|
digicazter
| 2023-08-13T16:32:43Z | 105 | 0 |
transformers
|
[
"transformers",
"pytorch",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"base_model:facebook/wav2vec2-base",
"base_model:finetune:facebook/wav2vec2-base",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-07T03:20:25Z |
---
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: wav2vec2-base-timit-demo-google-colab
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-base-timit-demo-google-colab
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.0392
- Wer: 1.0
## Model description
More information needed
## Intended uses & limitations
More information needed
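As a minimal, hedged transcription sketch (the repo id is this record's modelId; 16 kHz input is the wav2vec2-base requirement; note that the card's own table reports a WER of 1.0 at every checkpoint, so outputs should be treated accordingly):

```python
# Illustration only: CTC transcription with the fine-tuned checkpoint.
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

model_id = "digicazter/wav2vec2-base-timit-demo-google-colab"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

def transcribe(waveform_16khz):
    """waveform_16khz: 1-D float array of audio samples at 16 kHz."""
    inputs = processor(waveform_16khz, sampling_rate=16_000, return_tensors="pt")
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    predicted_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(predicted_ids)[0]
```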
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 400
- num_epochs: 150
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:---:|
| 5.2993 | 8.0 | 200 | 3.0327 | 1.0 |
| 3.0806 | 16.0 | 400 | 3.0476 | 1.0 |
| 3.0219 | 24.0 | 600 | 3.0472 | 1.0 |
| 3.0179 | 32.0 | 800 | 3.0435 | 1.0 |
| 3.0157 | 40.0 | 1000 | 3.0546 | 1.0 |
| 3.0146 | 48.0 | 1200 | 3.0484 | 1.0 |
| 3.0139 | 56.0 | 1400 | 3.0344 | 1.0 |
| 3.0118 | 64.0 | 1600 | 3.0351 | 1.0 |
| 3.0114 | 72.0 | 1800 | 3.0559 | 1.0 |
| 3.0114 | 80.0 | 2000 | 3.0526 | 1.0 |
| 3.0108 | 88.0 | 2200 | 3.0417 | 1.0 |
| 3.0092 | 96.0 | 2400 | 3.0629 | 1.0 |
| 3.0089 | 104.0 | 2600 | 3.0352 | 1.0 |
| 3.0083 | 112.0 | 2800 | 3.0503 | 1.0 |
| 3.0078 | 120.0 | 3000 | 3.0529 | 1.0 |
| 3.0072 | 128.0 | 3200 | 3.0378 | 1.0 |
| 3.0068 | 136.0 | 3400 | 3.0481 | 1.0 |
| 3.0063 | 144.0 | 3600 | 3.0392 | 1.0 |
### Framework versions
- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 2.14.3
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e9_s108_v4_l4_r2
|
KingKazma
| 2023-08-13T16:22:59Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:22:58Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_0098
|
bigmorning
| 2023-08-13T16:22:52Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T16:22:44Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_0098
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_0098
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0044
- Train Accuracy: 0.0794
- Train Wermet: 8.8948
- Validation Loss: 0.5589
- Validation Accuracy: 0.0765
- Validation Wermet: 7.4085
- Epoch: 97
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
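A hedged sketch of building this serialized optimizer with the TF/Keras `AdamWeightDecay` class that ships with `transformers` (values copied from the dict above):

```python
# Sketch: the optimizer exactly as serialized in the hyperparameters above.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=1e-5,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
    amsgrad=False,
)
# model.compile(optimizer=optimizer)  # then train with Keras as usual
```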
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.8733 | 0.0602 | 13.0686 | 0.6470 | 0.0676 | 11.4066 | 0 |
| 0.5740 | 0.0666 | 12.7778 | 0.5113 | 0.0706 | 11.1022 | 1 |
| 0.4553 | 0.0692 | 12.2404 | 0.4371 | 0.0723 | 10.9105 | 2 |
| 0.3813 | 0.0708 | 11.9157 | 0.3935 | 0.0733 | 9.4615 | 3 |
| 0.3292 | 0.0720 | 11.5732 | 0.3630 | 0.0740 | 9.9885 | 4 |
| 0.2886 | 0.0729 | 11.5171 | 0.3403 | 0.0745 | 9.8042 | 5 |
| 0.2561 | 0.0736 | 11.3173 | 0.3256 | 0.0749 | 9.9431 | 6 |
| 0.2282 | 0.0743 | 11.7308 | 0.3159 | 0.0752 | 9.2086 | 7 |
| 0.2036 | 0.0748 | 11.4503 | 0.3071 | 0.0754 | 9.5236 | 8 |
| 0.1820 | 0.0754 | 11.7175 | 0.3005 | 0.0756 | 10.0755 | 9 |
| 0.1628 | 0.0758 | 11.7056 | 0.2993 | 0.0757 | 9.9497 | 10 |
| 0.1450 | 0.0762 | 11.7637 | 0.2971 | 0.0758 | 10.1481 | 11 |
| 0.1287 | 0.0766 | 11.8509 | 0.3029 | 0.0759 | 10.2042 | 12 |
| 0.1140 | 0.0770 | 12.1100 | 0.3004 | 0.0760 | 10.3873 | 13 |
| 0.0998 | 0.0773 | 11.9502 | 0.3025 | 0.0761 | 10.7066 | 14 |
| 0.0872 | 0.0777 | 12.3196 | 0.3129 | 0.0759 | 10.7707 | 15 |
| 0.0760 | 0.0779 | 12.2637 | 0.3142 | 0.0761 | 10.2638 | 16 |
| 0.0651 | 0.0782 | 12.1215 | 0.3192 | 0.0761 | 10.0750 | 17 |
| 0.0547 | 0.0785 | 12.0551 | 0.3294 | 0.0761 | 10.4732 | 18 |
| 0.0463 | 0.0787 | 11.9677 | 0.3402 | 0.0760 | 10.2814 | 19 |
| 0.0386 | 0.0789 | 11.6855 | 0.3517 | 0.0760 | 10.0599 | 20 |
| 0.0318 | 0.0790 | 11.6314 | 0.3628 | 0.0760 | 9.6652 | 21 |
| 0.0262 | 0.0792 | 11.4603 | 0.3728 | 0.0760 | 10.0035 | 22 |
| 0.0224 | 0.0792 | 11.4330 | 0.3824 | 0.0760 | 9.1995 | 23 |
| 0.0181 | 0.0793 | 11.3124 | 0.3982 | 0.0759 | 9.8710 | 24 |
| 0.0142 | 0.0794 | 11.3562 | 0.4057 | 0.0760 | 9.6831 | 25 |
| 0.0118 | 0.0794 | 11.0532 | 0.4207 | 0.0759 | 9.7227 | 26 |
| 0.0101 | 0.0794 | 11.2963 | 0.4282 | 0.0760 | 9.5792 | 27 |
| 0.0114 | 0.0794 | 11.3093 | 0.4431 | 0.0758 | 9.5545 | 28 |
| 0.0109 | 0.0794 | 11.4214 | 0.4419 | 0.0760 | 9.4377 | 29 |
| 0.0084 | 0.0794 | 10.9143 | 0.4474 | 0.0760 | 9.3668 | 30 |
| 0.0043 | 0.0795 | 10.9497 | 0.4525 | 0.0761 | 9.3202 | 31 |
| 0.0036 | 0.0795 | 10.7759 | 0.4667 | 0.0761 | 9.0385 | 32 |
| 0.0047 | 0.0795 | 10.7613 | 0.4788 | 0.0759 | 9.4065 | 33 |
| 0.0130 | 0.0793 | 11.1022 | 0.4748 | 0.0760 | 9.4521 | 34 |
| 0.0074 | 0.0794 | 10.9738 | 0.4730 | 0.0760 | 9.3348 | 35 |
| 0.0032 | 0.0795 | 10.6370 | 0.4750 | 0.0762 | 8.8298 | 36 |
| 0.0020 | 0.0795 | 10.7428 | 0.4835 | 0.0762 | 9.0566 | 37 |
| 0.0014 | 0.0795 | 10.6908 | 0.4937 | 0.0761 | 9.2445 | 38 |
| 0.0035 | 0.0795 | 10.6833 | 0.5276 | 0.0757 | 8.9798 | 39 |
| 0.0120 | 0.0793 | 10.4810 | 0.4963 | 0.0760 | 8.9194 | 40 |
| 0.0045 | 0.0795 | 10.2251 | 0.5014 | 0.0761 | 8.5737 | 41 |
| 0.0028 | 0.0795 | 10.3174 | 0.4968 | 0.0762 | 8.8525 | 42 |
| 0.0023 | 0.0795 | 10.4871 | 0.5027 | 0.0762 | 8.6712 | 43 |
| 0.0024 | 0.0795 | 10.3731 | 0.5055 | 0.0762 | 8.6347 | 44 |
| 0.0041 | 0.0795 | 10.2751 | 0.5242 | 0.0760 | 8.3671 | 45 |
| 0.0070 | 0.0794 | 10.2166 | 0.5169 | 0.0760 | 8.8409 | 46 |
| 0.0037 | 0.0795 | 10.0455 | 0.5174 | 0.0762 | 8.2514 | 47 |
| 0.0023 | 0.0795 | 9.9201 | 0.5167 | 0.0763 | 8.9537 | 48 |
| 0.0008 | 0.0795 | 10.0022 | 0.5166 | 0.0764 | 8.4855 | 49 |
| 0.0006 | 0.0795 | 9.9494 | 0.5233 | 0.0763 | 8.5719 | 50 |
| 0.0069 | 0.0794 | 10.2037 | 0.5434 | 0.0759 | 8.5399 | 51 |
| 0.0083 | 0.0794 | 9.9557 | 0.5173 | 0.0762 | 8.2406 | 52 |
| 0.0032 | 0.0795 | 10.0283 | 0.5240 | 0.0763 | 9.0101 | 53 |
| 0.0018 | 0.0795 | 10.0694 | 0.5247 | 0.0763 | 8.5717 | 54 |
| 0.0008 | 0.0795 | 10.1079 | 0.5217 | 0.0764 | 8.5608 | 55 |
| 0.0005 | 0.0795 | 10.0546 | 0.5286 | 0.0764 | 8.8830 | 56 |
| 0.0007 | 0.0795 | 10.2557 | 0.5328 | 0.0764 | 8.5665 | 57 |
| 0.0006 | 0.0795 | 10.2165 | 0.5412 | 0.0763 | 8.4623 | 58 |
| 0.0124 | 0.0792 | 10.2304 | 0.5284 | 0.0762 | 9.1194 | 59 |
| 0.0044 | 0.0795 | 10.3884 | 0.5223 | 0.0764 | 8.8152 | 60 |
| 0.0015 | 0.0795 | 9.8557 | 0.5227 | 0.0764 | 8.3774 | 61 |
| 0.0005 | 0.0795 | 9.8123 | 0.5233 | 0.0765 | 8.5043 | 62 |
| 0.0003 | 0.0795 | 9.7631 | 0.5282 | 0.0765 | 8.3860 | 63 |
| 0.0003 | 0.0795 | 9.7593 | 0.5320 | 0.0765 | 8.4815 | 64 |
| 0.0002 | 0.0795 | 9.7663 | 0.5357 | 0.0765 | 8.4281 | 65 |
| 0.0034 | 0.0795 | 9.8382 | 0.5771 | 0.0758 | 8.8051 | 66 |
| 0.0123 | 0.0792 | 10.2575 | 0.5261 | 0.0763 | 9.3701 | 67 |
| 0.0027 | 0.0795 | 10.3802 | 0.5272 | 0.0764 | 8.8216 | 68 |
| 0.0011 | 0.0795 | 10.1683 | 0.5291 | 0.0764 | 8.5736 | 69 |
| 0.0012 | 0.0795 | 10.1305 | 0.5336 | 0.0765 | 8.6648 | 70 |
| 0.0008 | 0.0795 | 10.2545 | 0.5315 | 0.0765 | 9.0617 | 71 |
| 0.0006 | 0.0795 | 10.4562 | 0.5369 | 0.0765 | 9.6485 | 72 |
| 0.0032 | 0.0795 | 10.2347 | 0.5569 | 0.0763 | 8.4947 | 73 |
| 0.0062 | 0.0794 | 10.1654 | 0.5471 | 0.0763 | 8.8666 | 74 |
| 0.0029 | 0.0795 | 10.1320 | 0.5376 | 0.0765 | 8.7713 | 75 |
| 0.0012 | 0.0795 | 10.2943 | 0.5406 | 0.0765 | 8.6959 | 76 |
| 0.0006 | 0.0795 | 10.1888 | 0.5371 | 0.0767 | 8.9689 | 77 |
| 0.0005 | 0.0795 | 10.2138 | 0.5398 | 0.0766 | 8.7470 | 78 |
| 0.0016 | 0.0795 | 10.2173 | 0.5497 | 0.0764 | 8.9675 | 79 |
| 0.0065 | 0.0794 | 10.2806 | 0.5559 | 0.0763 | 9.4487 | 80 |
| 0.0028 | 0.0795 | 10.7728 | 0.5394 | 0.0766 | 8.9716 | 81 |
| 0.0012 | 0.0795 | 10.3247 | 0.5453 | 0.0765 | 8.9986 | 82 |
| 0.0013 | 0.0795 | 10.3174 | 0.5535 | 0.0765 | 8.9229 | 83 |
| 0.0011 | 0.0795 | 10.2846 | 0.5452 | 0.0766 | 9.1239 | 84 |
| 0.0007 | 0.0795 | 10.1996 | 0.5491 | 0.0766 | 8.9308 | 85 |
| 0.0034 | 0.0795 | 10.5048 | 0.5578 | 0.0764 | 8.9920 | 86 |
| 0.0038 | 0.0795 | 10.1430 | 0.5538 | 0.0765 | 9.1635 | 87 |
| 0.0019 | 0.0795 | 10.3176 | 0.5492 | 0.0766 | 8.5812 | 88 |
| 0.0007 | 0.0795 | 10.2569 | 0.5488 | 0.0766 | 8.9133 | 89 |
| 0.0006 | 0.0795 | 10.2538 | 0.5541 | 0.0766 | 8.7676 | 90 |
| 0.0029 | 0.0795 | 10.1412 | 0.5666 | 0.0764 | 9.0822 | 91 |
| 0.0042 | 0.0795 | 9.5603 | 0.5582 | 0.0765 | 7.6837 | 92 |
| 0.0015 | 0.0795 | 9.4004 | 0.5495 | 0.0766 | 7.7859 | 93 |
| 0.0008 | 0.0795 | 9.5417 | 0.5503 | 0.0767 | 7.8876 | 94 |
| 0.0005 | 0.0795 | 9.3473 | 0.5590 | 0.0766 | 7.8967 | 95 |
| 0.0016 | 0.0795 | 9.1740 | 0.5746 | 0.0765 | 7.8469 | 96 |
| 0.0044 | 0.0794 | 8.8948 | 0.5589 | 0.0765 | 7.4085 | 97 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_prompt_tuning_500_10_3000_5_e-1_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T16:15:39Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:15:36Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_0096
|
bigmorning
| 2023-08-13T16:14:08Z | 60 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T16:14:00Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_0096
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_0096
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0005
- Train Accuracy: 0.0795
- Train Wermet: 9.3473
- Validation Loss: 0.5590
- Validation Accuracy: 0.0766
- Validation Wermet: 7.8967
- Epoch: 95
## Model description
More information needed
## Intended uses & limitations
More information needed
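As an illustration only (it assumes the repo ships the whisper-tiny processor files alongside the TF weights):

```python
# Hedged sketch: transcribing 16 kHz audio with the TF checkpoint.
from transformers import TFWhisperForConditionalGeneration, WhisperProcessor

model_id = "bigmorning/whisper_charsplit_new_0096"
processor = WhisperProcessor.from_pretrained(model_id)
model = TFWhisperForConditionalGeneration.from_pretrained(model_id)

def transcribe(waveform_16khz):
    features = processor(
        waveform_16khz, sampling_rate=16_000, return_tensors="tf"
    ).input_features
    generated_ids = model.generate(features)
    return processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
```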
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.8733 | 0.0602 | 13.0686 | 0.6470 | 0.0676 | 11.4066 | 0 |
| 0.5740 | 0.0666 | 12.7778 | 0.5113 | 0.0706 | 11.1022 | 1 |
| 0.4553 | 0.0692 | 12.2404 | 0.4371 | 0.0723 | 10.9105 | 2 |
| 0.3813 | 0.0708 | 11.9157 | 0.3935 | 0.0733 | 9.4615 | 3 |
| 0.3292 | 0.0720 | 11.5732 | 0.3630 | 0.0740 | 9.9885 | 4 |
| 0.2886 | 0.0729 | 11.5171 | 0.3403 | 0.0745 | 9.8042 | 5 |
| 0.2561 | 0.0736 | 11.3173 | 0.3256 | 0.0749 | 9.9431 | 6 |
| 0.2282 | 0.0743 | 11.7308 | 0.3159 | 0.0752 | 9.2086 | 7 |
| 0.2036 | 0.0748 | 11.4503 | 0.3071 | 0.0754 | 9.5236 | 8 |
| 0.1820 | 0.0754 | 11.7175 | 0.3005 | 0.0756 | 10.0755 | 9 |
| 0.1628 | 0.0758 | 11.7056 | 0.2993 | 0.0757 | 9.9497 | 10 |
| 0.1450 | 0.0762 | 11.7637 | 0.2971 | 0.0758 | 10.1481 | 11 |
| 0.1287 | 0.0766 | 11.8509 | 0.3029 | 0.0759 | 10.2042 | 12 |
| 0.1140 | 0.0770 | 12.1100 | 0.3004 | 0.0760 | 10.3873 | 13 |
| 0.0998 | 0.0773 | 11.9502 | 0.3025 | 0.0761 | 10.7066 | 14 |
| 0.0872 | 0.0777 | 12.3196 | 0.3129 | 0.0759 | 10.7707 | 15 |
| 0.0760 | 0.0779 | 12.2637 | 0.3142 | 0.0761 | 10.2638 | 16 |
| 0.0651 | 0.0782 | 12.1215 | 0.3192 | 0.0761 | 10.0750 | 17 |
| 0.0547 | 0.0785 | 12.0551 | 0.3294 | 0.0761 | 10.4732 | 18 |
| 0.0463 | 0.0787 | 11.9677 | 0.3402 | 0.0760 | 10.2814 | 19 |
| 0.0386 | 0.0789 | 11.6855 | 0.3517 | 0.0760 | 10.0599 | 20 |
| 0.0318 | 0.0790 | 11.6314 | 0.3628 | 0.0760 | 9.6652 | 21 |
| 0.0262 | 0.0792 | 11.4603 | 0.3728 | 0.0760 | 10.0035 | 22 |
| 0.0224 | 0.0792 | 11.4330 | 0.3824 | 0.0760 | 9.1995 | 23 |
| 0.0181 | 0.0793 | 11.3124 | 0.3982 | 0.0759 | 9.8710 | 24 |
| 0.0142 | 0.0794 | 11.3562 | 0.4057 | 0.0760 | 9.6831 | 25 |
| 0.0118 | 0.0794 | 11.0532 | 0.4207 | 0.0759 | 9.7227 | 26 |
| 0.0101 | 0.0794 | 11.2963 | 0.4282 | 0.0760 | 9.5792 | 27 |
| 0.0114 | 0.0794 | 11.3093 | 0.4431 | 0.0758 | 9.5545 | 28 |
| 0.0109 | 0.0794 | 11.4214 | 0.4419 | 0.0760 | 9.4377 | 29 |
| 0.0084 | 0.0794 | 10.9143 | 0.4474 | 0.0760 | 9.3668 | 30 |
| 0.0043 | 0.0795 | 10.9497 | 0.4525 | 0.0761 | 9.3202 | 31 |
| 0.0036 | 0.0795 | 10.7759 | 0.4667 | 0.0761 | 9.0385 | 32 |
| 0.0047 | 0.0795 | 10.7613 | 0.4788 | 0.0759 | 9.4065 | 33 |
| 0.0130 | 0.0793 | 11.1022 | 0.4748 | 0.0760 | 9.4521 | 34 |
| 0.0074 | 0.0794 | 10.9738 | 0.4730 | 0.0760 | 9.3348 | 35 |
| 0.0032 | 0.0795 | 10.6370 | 0.4750 | 0.0762 | 8.8298 | 36 |
| 0.0020 | 0.0795 | 10.7428 | 0.4835 | 0.0762 | 9.0566 | 37 |
| 0.0014 | 0.0795 | 10.6908 | 0.4937 | 0.0761 | 9.2445 | 38 |
| 0.0035 | 0.0795 | 10.6833 | 0.5276 | 0.0757 | 8.9798 | 39 |
| 0.0120 | 0.0793 | 10.4810 | 0.4963 | 0.0760 | 8.9194 | 40 |
| 0.0045 | 0.0795 | 10.2251 | 0.5014 | 0.0761 | 8.5737 | 41 |
| 0.0028 | 0.0795 | 10.3174 | 0.4968 | 0.0762 | 8.8525 | 42 |
| 0.0023 | 0.0795 | 10.4871 | 0.5027 | 0.0762 | 8.6712 | 43 |
| 0.0024 | 0.0795 | 10.3731 | 0.5055 | 0.0762 | 8.6347 | 44 |
| 0.0041 | 0.0795 | 10.2751 | 0.5242 | 0.0760 | 8.3671 | 45 |
| 0.0070 | 0.0794 | 10.2166 | 0.5169 | 0.0760 | 8.8409 | 46 |
| 0.0037 | 0.0795 | 10.0455 | 0.5174 | 0.0762 | 8.2514 | 47 |
| 0.0023 | 0.0795 | 9.9201 | 0.5167 | 0.0763 | 8.9537 | 48 |
| 0.0008 | 0.0795 | 10.0022 | 0.5166 | 0.0764 | 8.4855 | 49 |
| 0.0006 | 0.0795 | 9.9494 | 0.5233 | 0.0763 | 8.5719 | 50 |
| 0.0069 | 0.0794 | 10.2037 | 0.5434 | 0.0759 | 8.5399 | 51 |
| 0.0083 | 0.0794 | 9.9557 | 0.5173 | 0.0762 | 8.2406 | 52 |
| 0.0032 | 0.0795 | 10.0283 | 0.5240 | 0.0763 | 9.0101 | 53 |
| 0.0018 | 0.0795 | 10.0694 | 0.5247 | 0.0763 | 8.5717 | 54 |
| 0.0008 | 0.0795 | 10.1079 | 0.5217 | 0.0764 | 8.5608 | 55 |
| 0.0005 | 0.0795 | 10.0546 | 0.5286 | 0.0764 | 8.8830 | 56 |
| 0.0007 | 0.0795 | 10.2557 | 0.5328 | 0.0764 | 8.5665 | 57 |
| 0.0006 | 0.0795 | 10.2165 | 0.5412 | 0.0763 | 8.4623 | 58 |
| 0.0124 | 0.0792 | 10.2304 | 0.5284 | 0.0762 | 9.1194 | 59 |
| 0.0044 | 0.0795 | 10.3884 | 0.5223 | 0.0764 | 8.8152 | 60 |
| 0.0015 | 0.0795 | 9.8557 | 0.5227 | 0.0764 | 8.3774 | 61 |
| 0.0005 | 0.0795 | 9.8123 | 0.5233 | 0.0765 | 8.5043 | 62 |
| 0.0003 | 0.0795 | 9.7631 | 0.5282 | 0.0765 | 8.3860 | 63 |
| 0.0003 | 0.0795 | 9.7593 | 0.5320 | 0.0765 | 8.4815 | 64 |
| 0.0002 | 0.0795 | 9.7663 | 0.5357 | 0.0765 | 8.4281 | 65 |
| 0.0034 | 0.0795 | 9.8382 | 0.5771 | 0.0758 | 8.8051 | 66 |
| 0.0123 | 0.0792 | 10.2575 | 0.5261 | 0.0763 | 9.3701 | 67 |
| 0.0027 | 0.0795 | 10.3802 | 0.5272 | 0.0764 | 8.8216 | 68 |
| 0.0011 | 0.0795 | 10.1683 | 0.5291 | 0.0764 | 8.5736 | 69 |
| 0.0012 | 0.0795 | 10.1305 | 0.5336 | 0.0765 | 8.6648 | 70 |
| 0.0008 | 0.0795 | 10.2545 | 0.5315 | 0.0765 | 9.0617 | 71 |
| 0.0006 | 0.0795 | 10.4562 | 0.5369 | 0.0765 | 9.6485 | 72 |
| 0.0032 | 0.0795 | 10.2347 | 0.5569 | 0.0763 | 8.4947 | 73 |
| 0.0062 | 0.0794 | 10.1654 | 0.5471 | 0.0763 | 8.8666 | 74 |
| 0.0029 | 0.0795 | 10.1320 | 0.5376 | 0.0765 | 8.7713 | 75 |
| 0.0012 | 0.0795 | 10.2943 | 0.5406 | 0.0765 | 8.6959 | 76 |
| 0.0006 | 0.0795 | 10.1888 | 0.5371 | 0.0767 | 8.9689 | 77 |
| 0.0005 | 0.0795 | 10.2138 | 0.5398 | 0.0766 | 8.7470 | 78 |
| 0.0016 | 0.0795 | 10.2173 | 0.5497 | 0.0764 | 8.9675 | 79 |
| 0.0065 | 0.0794 | 10.2806 | 0.5559 | 0.0763 | 9.4487 | 80 |
| 0.0028 | 0.0795 | 10.7728 | 0.5394 | 0.0766 | 8.9716 | 81 |
| 0.0012 | 0.0795 | 10.3247 | 0.5453 | 0.0765 | 8.9986 | 82 |
| 0.0013 | 0.0795 | 10.3174 | 0.5535 | 0.0765 | 8.9229 | 83 |
| 0.0011 | 0.0795 | 10.2846 | 0.5452 | 0.0766 | 9.1239 | 84 |
| 0.0007 | 0.0795 | 10.1996 | 0.5491 | 0.0766 | 8.9308 | 85 |
| 0.0034 | 0.0795 | 10.5048 | 0.5578 | 0.0764 | 8.9920 | 86 |
| 0.0038 | 0.0795 | 10.1430 | 0.5538 | 0.0765 | 9.1635 | 87 |
| 0.0019 | 0.0795 | 10.3176 | 0.5492 | 0.0766 | 8.5812 | 88 |
| 0.0007 | 0.0795 | 10.2569 | 0.5488 | 0.0766 | 8.9133 | 89 |
| 0.0006 | 0.0795 | 10.2538 | 0.5541 | 0.0766 | 8.7676 | 90 |
| 0.0029 | 0.0795 | 10.1412 | 0.5666 | 0.0764 | 9.0822 | 91 |
| 0.0042 | 0.0795 | 9.5603 | 0.5582 | 0.0765 | 7.6837 | 92 |
| 0.0015 | 0.0795 | 9.4004 | 0.5495 | 0.0766 | 7.7859 | 93 |
| 0.0008 | 0.0795 | 9.5417 | 0.5503 | 0.0767 | 7.8876 | 94 |
| 0.0005 | 0.0795 | 9.3473 | 0.5590 | 0.0766 | 7.8967 | 95 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e9_s108_v4_l4_r4
|
KingKazma
| 2023-08-13T16:13:40Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:13:38Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e4_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T16:12:42Z | 1 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:12:40Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_0095
|
bigmorning
| 2023-08-13T16:09:46Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T16:09:40Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_0095
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_0095
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0008
- Train Accuracy: 0.0795
- Train Wermet: 9.5417
- Validation Loss: 0.5503
- Validation Accuracy: 0.0767
- Validation Wermet: 7.8876
- Epoch: 94
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.8733 | 0.0602 | 13.0686 | 0.6470 | 0.0676 | 11.4066 | 0 |
| 0.5740 | 0.0666 | 12.7778 | 0.5113 | 0.0706 | 11.1022 | 1 |
| 0.4553 | 0.0692 | 12.2404 | 0.4371 | 0.0723 | 10.9105 | 2 |
| 0.3813 | 0.0708 | 11.9157 | 0.3935 | 0.0733 | 9.4615 | 3 |
| 0.3292 | 0.0720 | 11.5732 | 0.3630 | 0.0740 | 9.9885 | 4 |
| 0.2886 | 0.0729 | 11.5171 | 0.3403 | 0.0745 | 9.8042 | 5 |
| 0.2561 | 0.0736 | 11.3173 | 0.3256 | 0.0749 | 9.9431 | 6 |
| 0.2282 | 0.0743 | 11.7308 | 0.3159 | 0.0752 | 9.2086 | 7 |
| 0.2036 | 0.0748 | 11.4503 | 0.3071 | 0.0754 | 9.5236 | 8 |
| 0.1820 | 0.0754 | 11.7175 | 0.3005 | 0.0756 | 10.0755 | 9 |
| 0.1628 | 0.0758 | 11.7056 | 0.2993 | 0.0757 | 9.9497 | 10 |
| 0.1450 | 0.0762 | 11.7637 | 0.2971 | 0.0758 | 10.1481 | 11 |
| 0.1287 | 0.0766 | 11.8509 | 0.3029 | 0.0759 | 10.2042 | 12 |
| 0.1140 | 0.0770 | 12.1100 | 0.3004 | 0.0760 | 10.3873 | 13 |
| 0.0998 | 0.0773 | 11.9502 | 0.3025 | 0.0761 | 10.7066 | 14 |
| 0.0872 | 0.0777 | 12.3196 | 0.3129 | 0.0759 | 10.7707 | 15 |
| 0.0760 | 0.0779 | 12.2637 | 0.3142 | 0.0761 | 10.2638 | 16 |
| 0.0651 | 0.0782 | 12.1215 | 0.3192 | 0.0761 | 10.0750 | 17 |
| 0.0547 | 0.0785 | 12.0551 | 0.3294 | 0.0761 | 10.4732 | 18 |
| 0.0463 | 0.0787 | 11.9677 | 0.3402 | 0.0760 | 10.2814 | 19 |
| 0.0386 | 0.0789 | 11.6855 | 0.3517 | 0.0760 | 10.0599 | 20 |
| 0.0318 | 0.0790 | 11.6314 | 0.3628 | 0.0760 | 9.6652 | 21 |
| 0.0262 | 0.0792 | 11.4603 | 0.3728 | 0.0760 | 10.0035 | 22 |
| 0.0224 | 0.0792 | 11.4330 | 0.3824 | 0.0760 | 9.1995 | 23 |
| 0.0181 | 0.0793 | 11.3124 | 0.3982 | 0.0759 | 9.8710 | 24 |
| 0.0142 | 0.0794 | 11.3562 | 0.4057 | 0.0760 | 9.6831 | 25 |
| 0.0118 | 0.0794 | 11.0532 | 0.4207 | 0.0759 | 9.7227 | 26 |
| 0.0101 | 0.0794 | 11.2963 | 0.4282 | 0.0760 | 9.5792 | 27 |
| 0.0114 | 0.0794 | 11.3093 | 0.4431 | 0.0758 | 9.5545 | 28 |
| 0.0109 | 0.0794 | 11.4214 | 0.4419 | 0.0760 | 9.4377 | 29 |
| 0.0084 | 0.0794 | 10.9143 | 0.4474 | 0.0760 | 9.3668 | 30 |
| 0.0043 | 0.0795 | 10.9497 | 0.4525 | 0.0761 | 9.3202 | 31 |
| 0.0036 | 0.0795 | 10.7759 | 0.4667 | 0.0761 | 9.0385 | 32 |
| 0.0047 | 0.0795 | 10.7613 | 0.4788 | 0.0759 | 9.4065 | 33 |
| 0.0130 | 0.0793 | 11.1022 | 0.4748 | 0.0760 | 9.4521 | 34 |
| 0.0074 | 0.0794 | 10.9738 | 0.4730 | 0.0760 | 9.3348 | 35 |
| 0.0032 | 0.0795 | 10.6370 | 0.4750 | 0.0762 | 8.8298 | 36 |
| 0.0020 | 0.0795 | 10.7428 | 0.4835 | 0.0762 | 9.0566 | 37 |
| 0.0014 | 0.0795 | 10.6908 | 0.4937 | 0.0761 | 9.2445 | 38 |
| 0.0035 | 0.0795 | 10.6833 | 0.5276 | 0.0757 | 8.9798 | 39 |
| 0.0120 | 0.0793 | 10.4810 | 0.4963 | 0.0760 | 8.9194 | 40 |
| 0.0045 | 0.0795 | 10.2251 | 0.5014 | 0.0761 | 8.5737 | 41 |
| 0.0028 | 0.0795 | 10.3174 | 0.4968 | 0.0762 | 8.8525 | 42 |
| 0.0023 | 0.0795 | 10.4871 | 0.5027 | 0.0762 | 8.6712 | 43 |
| 0.0024 | 0.0795 | 10.3731 | 0.5055 | 0.0762 | 8.6347 | 44 |
| 0.0041 | 0.0795 | 10.2751 | 0.5242 | 0.0760 | 8.3671 | 45 |
| 0.0070 | 0.0794 | 10.2166 | 0.5169 | 0.0760 | 8.8409 | 46 |
| 0.0037 | 0.0795 | 10.0455 | 0.5174 | 0.0762 | 8.2514 | 47 |
| 0.0023 | 0.0795 | 9.9201 | 0.5167 | 0.0763 | 8.9537 | 48 |
| 0.0008 | 0.0795 | 10.0022 | 0.5166 | 0.0764 | 8.4855 | 49 |
| 0.0006 | 0.0795 | 9.9494 | 0.5233 | 0.0763 | 8.5719 | 50 |
| 0.0069 | 0.0794 | 10.2037 | 0.5434 | 0.0759 | 8.5399 | 51 |
| 0.0083 | 0.0794 | 9.9557 | 0.5173 | 0.0762 | 8.2406 | 52 |
| 0.0032 | 0.0795 | 10.0283 | 0.5240 | 0.0763 | 9.0101 | 53 |
| 0.0018 | 0.0795 | 10.0694 | 0.5247 | 0.0763 | 8.5717 | 54 |
| 0.0008 | 0.0795 | 10.1079 | 0.5217 | 0.0764 | 8.5608 | 55 |
| 0.0005 | 0.0795 | 10.0546 | 0.5286 | 0.0764 | 8.8830 | 56 |
| 0.0007 | 0.0795 | 10.2557 | 0.5328 | 0.0764 | 8.5665 | 57 |
| 0.0006 | 0.0795 | 10.2165 | 0.5412 | 0.0763 | 8.4623 | 58 |
| 0.0124 | 0.0792 | 10.2304 | 0.5284 | 0.0762 | 9.1194 | 59 |
| 0.0044 | 0.0795 | 10.3884 | 0.5223 | 0.0764 | 8.8152 | 60 |
| 0.0015 | 0.0795 | 9.8557 | 0.5227 | 0.0764 | 8.3774 | 61 |
| 0.0005 | 0.0795 | 9.8123 | 0.5233 | 0.0765 | 8.5043 | 62 |
| 0.0003 | 0.0795 | 9.7631 | 0.5282 | 0.0765 | 8.3860 | 63 |
| 0.0003 | 0.0795 | 9.7593 | 0.5320 | 0.0765 | 8.4815 | 64 |
| 0.0002 | 0.0795 | 9.7663 | 0.5357 | 0.0765 | 8.4281 | 65 |
| 0.0034 | 0.0795 | 9.8382 | 0.5771 | 0.0758 | 8.8051 | 66 |
| 0.0123 | 0.0792 | 10.2575 | 0.5261 | 0.0763 | 9.3701 | 67 |
| 0.0027 | 0.0795 | 10.3802 | 0.5272 | 0.0764 | 8.8216 | 68 |
| 0.0011 | 0.0795 | 10.1683 | 0.5291 | 0.0764 | 8.5736 | 69 |
| 0.0012 | 0.0795 | 10.1305 | 0.5336 | 0.0765 | 8.6648 | 70 |
| 0.0008 | 0.0795 | 10.2545 | 0.5315 | 0.0765 | 9.0617 | 71 |
| 0.0006 | 0.0795 | 10.4562 | 0.5369 | 0.0765 | 9.6485 | 72 |
| 0.0032 | 0.0795 | 10.2347 | 0.5569 | 0.0763 | 8.4947 | 73 |
| 0.0062 | 0.0794 | 10.1654 | 0.5471 | 0.0763 | 8.8666 | 74 |
| 0.0029 | 0.0795 | 10.1320 | 0.5376 | 0.0765 | 8.7713 | 75 |
| 0.0012 | 0.0795 | 10.2943 | 0.5406 | 0.0765 | 8.6959 | 76 |
| 0.0006 | 0.0795 | 10.1888 | 0.5371 | 0.0767 | 8.9689 | 77 |
| 0.0005 | 0.0795 | 10.2138 | 0.5398 | 0.0766 | 8.7470 | 78 |
| 0.0016 | 0.0795 | 10.2173 | 0.5497 | 0.0764 | 8.9675 | 79 |
| 0.0065 | 0.0794 | 10.2806 | 0.5559 | 0.0763 | 9.4487 | 80 |
| 0.0028 | 0.0795 | 10.7728 | 0.5394 | 0.0766 | 8.9716 | 81 |
| 0.0012 | 0.0795 | 10.3247 | 0.5453 | 0.0765 | 8.9986 | 82 |
| 0.0013 | 0.0795 | 10.3174 | 0.5535 | 0.0765 | 8.9229 | 83 |
| 0.0011 | 0.0795 | 10.2846 | 0.5452 | 0.0766 | 9.1239 | 84 |
| 0.0007 | 0.0795 | 10.1996 | 0.5491 | 0.0766 | 8.9308 | 85 |
| 0.0034 | 0.0795 | 10.5048 | 0.5578 | 0.0764 | 8.9920 | 86 |
| 0.0038 | 0.0795 | 10.1430 | 0.5538 | 0.0765 | 9.1635 | 87 |
| 0.0019 | 0.0795 | 10.3176 | 0.5492 | 0.0766 | 8.5812 | 88 |
| 0.0007 | 0.0795 | 10.2569 | 0.5488 | 0.0766 | 8.9133 | 89 |
| 0.0006 | 0.0795 | 10.2538 | 0.5541 | 0.0766 | 8.7676 | 90 |
| 0.0029 | 0.0795 | 10.1412 | 0.5666 | 0.0764 | 9.0822 | 91 |
| 0.0042 | 0.0795 | 9.5603 | 0.5582 | 0.0765 | 7.6837 | 92 |
| 0.0015 | 0.0795 | 9.4004 | 0.5495 | 0.0766 | 7.7859 | 93 |
| 0.0008 | 0.0795 | 9.5417 | 0.5503 | 0.0767 | 7.8876 | 94 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e7_s108_v4_l4_r2
|
KingKazma
| 2023-08-13T16:08:19Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:08:17Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
DHEIVER/Brain_Tumor_Classification
|
DHEIVER
| 2023-08-13T16:06:23Z | 182 | 0 |
transformers
|
[
"transformers",
"pytorch",
"swin",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
image-classification
| 2023-08-13T16:04:30Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
- f1
- recall
- precision
model-index:
- name: Brain_Tumor_Classification
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: imagefolder
type: imagefolder
config: default
split: train
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.9646761984861227
- name: F1
type: f1
value: 0.9646761984861227
- name: Recall
type: recall
value: 0.9646761984861227
- name: Precision
type: precision
value: 0.9646761984861227
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# Brain_Tumor_Classification
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1012
- Accuracy: 0.9647
- F1: 0.9647
- Recall: 0.9647
- Precision: 0.9647
## Model description
More information needed
## Intended uses & limitations
More information needed
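A quick-start sketch, assuming the repo ships its image-processor config (the input path is hypothetical):

```python
# Illustration only: classifying a single image with the fine-tuned Swin model.
from transformers import pipeline

classifier = pipeline("image-classification", model="DHEIVER/Brain_Tumor_Classification")
predictions = classifier("path/to/mri_slice.png")  # hypothetical input file
print(predictions[0])  # top prediction: {"label": ..., "score": ...}
```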
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Recall | Precision |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:|:---------:|
| 0.4856 | 0.99 | 83 | 0.3771 | 0.8444 | 0.8444 | 0.8444 | 0.8444 |
| 0.3495 | 1.99 | 166 | 0.2608 | 0.8949 | 0.8949 | 0.8949 | 0.8949 |
| 0.252 | 2.99 | 249 | 0.1445 | 0.9487 | 0.9487 | 0.9487 | 0.9487 |
| 0.2364 | 3.99 | 332 | 0.1029 | 0.9588 | 0.9588 | 0.9588 | 0.9588 |
| 0.2178 | 4.99 | 415 | 0.1012 | 0.9647 | 0.9647 | 0.9647 | 0.9647 |
### Framework versions
- Transformers 4.23.1
- Pytorch 1.12.1
- Datasets 2.6.1
- Tokenizers 0.13.1
|
bigmorning/whisper_charsplit_new_0094
|
bigmorning
| 2023-08-13T16:05:30Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T16:05:24Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_0094
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_0094
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0015
- Train Accuracy: 0.0795
- Train Wermet: 9.4004
- Validation Loss: 0.5495
- Validation Accuracy: 0.0766
- Validation Wermet: 7.7859
- Epoch: 93
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.8733 | 0.0602 | 13.0686 | 0.6470 | 0.0676 | 11.4066 | 0 |
| 0.5740 | 0.0666 | 12.7778 | 0.5113 | 0.0706 | 11.1022 | 1 |
| 0.4553 | 0.0692 | 12.2404 | 0.4371 | 0.0723 | 10.9105 | 2 |
| 0.3813 | 0.0708 | 11.9157 | 0.3935 | 0.0733 | 9.4615 | 3 |
| 0.3292 | 0.0720 | 11.5732 | 0.3630 | 0.0740 | 9.9885 | 4 |
| 0.2886 | 0.0729 | 11.5171 | 0.3403 | 0.0745 | 9.8042 | 5 |
| 0.2561 | 0.0736 | 11.3173 | 0.3256 | 0.0749 | 9.9431 | 6 |
| 0.2282 | 0.0743 | 11.7308 | 0.3159 | 0.0752 | 9.2086 | 7 |
| 0.2036 | 0.0748 | 11.4503 | 0.3071 | 0.0754 | 9.5236 | 8 |
| 0.1820 | 0.0754 | 11.7175 | 0.3005 | 0.0756 | 10.0755 | 9 |
| 0.1628 | 0.0758 | 11.7056 | 0.2993 | 0.0757 | 9.9497 | 10 |
| 0.1450 | 0.0762 | 11.7637 | 0.2971 | 0.0758 | 10.1481 | 11 |
| 0.1287 | 0.0766 | 11.8509 | 0.3029 | 0.0759 | 10.2042 | 12 |
| 0.1140 | 0.0770 | 12.1100 | 0.3004 | 0.0760 | 10.3873 | 13 |
| 0.0998 | 0.0773 | 11.9502 | 0.3025 | 0.0761 | 10.7066 | 14 |
| 0.0872 | 0.0777 | 12.3196 | 0.3129 | 0.0759 | 10.7707 | 15 |
| 0.0760 | 0.0779 | 12.2637 | 0.3142 | 0.0761 | 10.2638 | 16 |
| 0.0651 | 0.0782 | 12.1215 | 0.3192 | 0.0761 | 10.0750 | 17 |
| 0.0547 | 0.0785 | 12.0551 | 0.3294 | 0.0761 | 10.4732 | 18 |
| 0.0463 | 0.0787 | 11.9677 | 0.3402 | 0.0760 | 10.2814 | 19 |
| 0.0386 | 0.0789 | 11.6855 | 0.3517 | 0.0760 | 10.0599 | 20 |
| 0.0318 | 0.0790 | 11.6314 | 0.3628 | 0.0760 | 9.6652 | 21 |
| 0.0262 | 0.0792 | 11.4603 | 0.3728 | 0.0760 | 10.0035 | 22 |
| 0.0224 | 0.0792 | 11.4330 | 0.3824 | 0.0760 | 9.1995 | 23 |
| 0.0181 | 0.0793 | 11.3124 | 0.3982 | 0.0759 | 9.8710 | 24 |
| 0.0142 | 0.0794 | 11.3562 | 0.4057 | 0.0760 | 9.6831 | 25 |
| 0.0118 | 0.0794 | 11.0532 | 0.4207 | 0.0759 | 9.7227 | 26 |
| 0.0101 | 0.0794 | 11.2963 | 0.4282 | 0.0760 | 9.5792 | 27 |
| 0.0114 | 0.0794 | 11.3093 | 0.4431 | 0.0758 | 9.5545 | 28 |
| 0.0109 | 0.0794 | 11.4214 | 0.4419 | 0.0760 | 9.4377 | 29 |
| 0.0084 | 0.0794 | 10.9143 | 0.4474 | 0.0760 | 9.3668 | 30 |
| 0.0043 | 0.0795 | 10.9497 | 0.4525 | 0.0761 | 9.3202 | 31 |
| 0.0036 | 0.0795 | 10.7759 | 0.4667 | 0.0761 | 9.0385 | 32 |
| 0.0047 | 0.0795 | 10.7613 | 0.4788 | 0.0759 | 9.4065 | 33 |
| 0.0130 | 0.0793 | 11.1022 | 0.4748 | 0.0760 | 9.4521 | 34 |
| 0.0074 | 0.0794 | 10.9738 | 0.4730 | 0.0760 | 9.3348 | 35 |
| 0.0032 | 0.0795 | 10.6370 | 0.4750 | 0.0762 | 8.8298 | 36 |
| 0.0020 | 0.0795 | 10.7428 | 0.4835 | 0.0762 | 9.0566 | 37 |
| 0.0014 | 0.0795 | 10.6908 | 0.4937 | 0.0761 | 9.2445 | 38 |
| 0.0035 | 0.0795 | 10.6833 | 0.5276 | 0.0757 | 8.9798 | 39 |
| 0.0120 | 0.0793 | 10.4810 | 0.4963 | 0.0760 | 8.9194 | 40 |
| 0.0045 | 0.0795 | 10.2251 | 0.5014 | 0.0761 | 8.5737 | 41 |
| 0.0028 | 0.0795 | 10.3174 | 0.4968 | 0.0762 | 8.8525 | 42 |
| 0.0023 | 0.0795 | 10.4871 | 0.5027 | 0.0762 | 8.6712 | 43 |
| 0.0024 | 0.0795 | 10.3731 | 0.5055 | 0.0762 | 8.6347 | 44 |
| 0.0041 | 0.0795 | 10.2751 | 0.5242 | 0.0760 | 8.3671 | 45 |
| 0.0070 | 0.0794 | 10.2166 | 0.5169 | 0.0760 | 8.8409 | 46 |
| 0.0037 | 0.0795 | 10.0455 | 0.5174 | 0.0762 | 8.2514 | 47 |
| 0.0023 | 0.0795 | 9.9201 | 0.5167 | 0.0763 | 8.9537 | 48 |
| 0.0008 | 0.0795 | 10.0022 | 0.5166 | 0.0764 | 8.4855 | 49 |
| 0.0006 | 0.0795 | 9.9494 | 0.5233 | 0.0763 | 8.5719 | 50 |
| 0.0069 | 0.0794 | 10.2037 | 0.5434 | 0.0759 | 8.5399 | 51 |
| 0.0083 | 0.0794 | 9.9557 | 0.5173 | 0.0762 | 8.2406 | 52 |
| 0.0032 | 0.0795 | 10.0283 | 0.5240 | 0.0763 | 9.0101 | 53 |
| 0.0018 | 0.0795 | 10.0694 | 0.5247 | 0.0763 | 8.5717 | 54 |
| 0.0008 | 0.0795 | 10.1079 | 0.5217 | 0.0764 | 8.5608 | 55 |
| 0.0005 | 0.0795 | 10.0546 | 0.5286 | 0.0764 | 8.8830 | 56 |
| 0.0007 | 0.0795 | 10.2557 | 0.5328 | 0.0764 | 8.5665 | 57 |
| 0.0006 | 0.0795 | 10.2165 | 0.5412 | 0.0763 | 8.4623 | 58 |
| 0.0124 | 0.0792 | 10.2304 | 0.5284 | 0.0762 | 9.1194 | 59 |
| 0.0044 | 0.0795 | 10.3884 | 0.5223 | 0.0764 | 8.8152 | 60 |
| 0.0015 | 0.0795 | 9.8557 | 0.5227 | 0.0764 | 8.3774 | 61 |
| 0.0005 | 0.0795 | 9.8123 | 0.5233 | 0.0765 | 8.5043 | 62 |
| 0.0003 | 0.0795 | 9.7631 | 0.5282 | 0.0765 | 8.3860 | 63 |
| 0.0003 | 0.0795 | 9.7593 | 0.5320 | 0.0765 | 8.4815 | 64 |
| 0.0002 | 0.0795 | 9.7663 | 0.5357 | 0.0765 | 8.4281 | 65 |
| 0.0034 | 0.0795 | 9.8382 | 0.5771 | 0.0758 | 8.8051 | 66 |
| 0.0123 | 0.0792 | 10.2575 | 0.5261 | 0.0763 | 9.3701 | 67 |
| 0.0027 | 0.0795 | 10.3802 | 0.5272 | 0.0764 | 8.8216 | 68 |
| 0.0011 | 0.0795 | 10.1683 | 0.5291 | 0.0764 | 8.5736 | 69 |
| 0.0012 | 0.0795 | 10.1305 | 0.5336 | 0.0765 | 8.6648 | 70 |
| 0.0008 | 0.0795 | 10.2545 | 0.5315 | 0.0765 | 9.0617 | 71 |
| 0.0006 | 0.0795 | 10.4562 | 0.5369 | 0.0765 | 9.6485 | 72 |
| 0.0032 | 0.0795 | 10.2347 | 0.5569 | 0.0763 | 8.4947 | 73 |
| 0.0062 | 0.0794 | 10.1654 | 0.5471 | 0.0763 | 8.8666 | 74 |
| 0.0029 | 0.0795 | 10.1320 | 0.5376 | 0.0765 | 8.7713 | 75 |
| 0.0012 | 0.0795 | 10.2943 | 0.5406 | 0.0765 | 8.6959 | 76 |
| 0.0006 | 0.0795 | 10.1888 | 0.5371 | 0.0767 | 8.9689 | 77 |
| 0.0005 | 0.0795 | 10.2138 | 0.5398 | 0.0766 | 8.7470 | 78 |
| 0.0016 | 0.0795 | 10.2173 | 0.5497 | 0.0764 | 8.9675 | 79 |
| 0.0065 | 0.0794 | 10.2806 | 0.5559 | 0.0763 | 9.4487 | 80 |
| 0.0028 | 0.0795 | 10.7728 | 0.5394 | 0.0766 | 8.9716 | 81 |
| 0.0012 | 0.0795 | 10.3247 | 0.5453 | 0.0765 | 8.9986 | 82 |
| 0.0013 | 0.0795 | 10.3174 | 0.5535 | 0.0765 | 8.9229 | 83 |
| 0.0011 | 0.0795 | 10.2846 | 0.5452 | 0.0766 | 9.1239 | 84 |
| 0.0007 | 0.0795 | 10.1996 | 0.5491 | 0.0766 | 8.9308 | 85 |
| 0.0034 | 0.0795 | 10.5048 | 0.5578 | 0.0764 | 8.9920 | 86 |
| 0.0038 | 0.0795 | 10.1430 | 0.5538 | 0.0765 | 9.1635 | 87 |
| 0.0019 | 0.0795 | 10.3176 | 0.5492 | 0.0766 | 8.5812 | 88 |
| 0.0007 | 0.0795 | 10.2569 | 0.5488 | 0.0766 | 8.9133 | 89 |
| 0.0006 | 0.0795 | 10.2538 | 0.5541 | 0.0766 | 8.7676 | 90 |
| 0.0029 | 0.0795 | 10.1412 | 0.5666 | 0.0764 | 9.0822 | 91 |
| 0.0042 | 0.0795 | 9.5603 | 0.5582 | 0.0765 | 7.6837 | 92 |
| 0.0015 | 0.0795 | 9.4004 | 0.5495 | 0.0766 | 7.7859 | 93 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
evanyin/openai-whisper-tiny-LORA-colab-test
|
evanyin
| 2023-08-13T16:05:21Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-08T15:32:52Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
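The stub above documents only the PEFT version; as a minimal, hedged usage sketch (assuming the adapter was trained on top of `openai/whisper-tiny`, which the repository name suggests but the card does not state), the adapter would be loaded like this:
```python
from transformers import WhisperForConditionalGeneration
from peft import PeftModel

# Assumption: the LoRA adapter targets openai/whisper-tiny (inferred from the repo name).
base = WhisperForConditionalGeneration.from_pretrained("openai/whisper-tiny")

# Attach the adapter weights from the Hub on top of the base model.
model = PeftModel.from_pretrained(base, "evanyin/openai-whisper-tiny-LORA-colab-test")
```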
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e3_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T16:04:06Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:04:05Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
udayGay/resume_model
|
udayGay
| 2023-08-13T16:03:19Z | 3 | 0 |
transformers
|
[
"transformers",
"tf",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"base_model:distilbert/distilbert-base-uncased",
"base_model:finetune:distilbert/distilbert-base-uncased",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-08-12T16:11:53Z |
---
license: apache-2.0
base_model: distilbert-base-uncased
tags:
- generated_from_keras_callback
model-index:
- name: udayGay/resume_model
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# udayGay/resume_model
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 1.1040
- Validation Loss: 1.4298
- Train Accuracy: 0.6640
- Train Precision: 0.5589
- Train Recall: 0.5938
- Train F1: 0.5692
- Epoch: 29
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 1470, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
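The optimizer dictionary above is a serialized Keras Adam with a `PolynomialDecay` schedule; a minimal sketch of reconstructing it in plain TensorFlow follows (the names below are the standard `tf.keras` APIs, not anything stated in the card). Note that the schedule decays to 0.0 after 1,470 steps, which is consistent with the validation metrics freezing at their epoch-11 values in the table below.
```python
import tensorflow as tf

# Linear decay from 2e-5 to 0.0 over 1,470 steps, as serialized above.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=1470,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
```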
### Training results
| Train Loss | Validation Loss | Train Accuracy | Train Precision | Train Recall | Train F1 | Epoch |
|:----------:|:---------------:|:--------------:|:---------------:|:------------:|:--------:|:-----:|
| 3.1654 | 3.1440 | 0.0402 | 0.0017 | 0.0417 | 0.0033 | 0 |
| 3.1553 | 3.1474 | 0.0282 | 0.0012 | 0.0417 | 0.0023 | 1 |
| 3.1208 | 3.0528 | 0.0805 | 0.0147 | 0.0812 | 0.0225 | 2 |
| 2.9896 | 2.8784 | 0.1469 | 0.0746 | 0.1384 | 0.0825 | 3 |
| 2.6886 | 2.5739 | 0.3300 | 0.2207 | 0.3033 | 0.2182 | 4 |
| 2.2855 | 2.1620 | 0.4547 | 0.3432 | 0.4138 | 0.3395 | 5 |
| 1.9018 | 1.9030 | 0.5151 | 0.4141 | 0.4679 | 0.4118 | 6 |
| 1.6218 | 1.7029 | 0.5795 | 0.4872 | 0.5205 | 0.4854 | 7 |
| 1.4058 | 1.5916 | 0.6217 | 0.5261 | 0.5595 | 0.5278 | 8 |
| 1.2705 | 1.4954 | 0.6479 | 0.5457 | 0.5815 | 0.5557 | 9 |
| 1.1692 | 1.4469 | 0.6600 | 0.5548 | 0.5896 | 0.5643 | 10 |
| 1.1179 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 11 |
| 1.1162 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 12 |
| 1.1109 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 13 |
| 1.1142 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 14 |
| 1.1095 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 15 |
| 1.1108 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 16 |
| 1.1133 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 17 |
| 1.1132 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 18 |
| 1.1064 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 19 |
| 1.1098 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 20 |
| 1.1029 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 21 |
| 1.1055 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 22 |
| 1.1125 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 23 |
| 1.1081 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 24 |
| 1.1125 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 25 |
| 1.1130 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 26 |
| 1.1101 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 27 |
| 1.1134 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 28 |
| 1.1040 | 1.4298 | 0.6640 | 0.5589 | 0.5938 | 0.5692 | 29 |
### Framework versions
- Transformers 4.31.0
- TensorFlow 2.12.0
- Datasets 2.14.4
- Tokenizers 0.13.3
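The card gives no usage example; since the repository ships TensorFlow weights, a minimal inference sketch with the Transformers pipeline might look like this (the label names live in the model config and are not documented in the card):
```python
from transformers import pipeline

# Hedged sketch: load the TF checkpoint for text classification.
classifier = pipeline(
    "text-classification",
    model="udayGay/resume_model",
    framework="tf",  # the repo hosts TensorFlow weights
)

print(classifier("Experienced data engineer, five years of Spark and Airflow."))
```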
|
bigmorning/whisper_charsplit_new_0093
|
bigmorning
| 2023-08-13T16:01:10Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T16:01:02Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_0093
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_0093
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0042
- Train Accuracy: 0.0795
- Train Wermet: 9.5603
- Validation Loss: 0.5582
- Validation Accuracy: 0.0765
- Validation Wermet: 7.6837
- Epoch: 92
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
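The serialized optimizer above corresponds to the `AdamWeightDecay` class that Transformers ships for TensorFlow training; a minimal sketch of rebuilding it from the listed hyperparameters:
```python
from transformers import AdamWeightDecay

# Recreate the optimizer from the hyperparameters serialized above.
optimizer = AdamWeightDecay(
    learning_rate=1e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)
```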
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.8733 | 0.0602 | 13.0686 | 0.6470 | 0.0676 | 11.4066 | 0 |
| 0.5740 | 0.0666 | 12.7778 | 0.5113 | 0.0706 | 11.1022 | 1 |
| 0.4553 | 0.0692 | 12.2404 | 0.4371 | 0.0723 | 10.9105 | 2 |
| 0.3813 | 0.0708 | 11.9157 | 0.3935 | 0.0733 | 9.4615 | 3 |
| 0.3292 | 0.0720 | 11.5732 | 0.3630 | 0.0740 | 9.9885 | 4 |
| 0.2886 | 0.0729 | 11.5171 | 0.3403 | 0.0745 | 9.8042 | 5 |
| 0.2561 | 0.0736 | 11.3173 | 0.3256 | 0.0749 | 9.9431 | 6 |
| 0.2282 | 0.0743 | 11.7308 | 0.3159 | 0.0752 | 9.2086 | 7 |
| 0.2036 | 0.0748 | 11.4503 | 0.3071 | 0.0754 | 9.5236 | 8 |
| 0.1820 | 0.0754 | 11.7175 | 0.3005 | 0.0756 | 10.0755 | 9 |
| 0.1628 | 0.0758 | 11.7056 | 0.2993 | 0.0757 | 9.9497 | 10 |
| 0.1450 | 0.0762 | 11.7637 | 0.2971 | 0.0758 | 10.1481 | 11 |
| 0.1287 | 0.0766 | 11.8509 | 0.3029 | 0.0759 | 10.2042 | 12 |
| 0.1140 | 0.0770 | 12.1100 | 0.3004 | 0.0760 | 10.3873 | 13 |
| 0.0998 | 0.0773 | 11.9502 | 0.3025 | 0.0761 | 10.7066 | 14 |
| 0.0872 | 0.0777 | 12.3196 | 0.3129 | 0.0759 | 10.7707 | 15 |
| 0.0760 | 0.0779 | 12.2637 | 0.3142 | 0.0761 | 10.2638 | 16 |
| 0.0651 | 0.0782 | 12.1215 | 0.3192 | 0.0761 | 10.0750 | 17 |
| 0.0547 | 0.0785 | 12.0551 | 0.3294 | 0.0761 | 10.4732 | 18 |
| 0.0463 | 0.0787 | 11.9677 | 0.3402 | 0.0760 | 10.2814 | 19 |
| 0.0386 | 0.0789 | 11.6855 | 0.3517 | 0.0760 | 10.0599 | 20 |
| 0.0318 | 0.0790 | 11.6314 | 0.3628 | 0.0760 | 9.6652 | 21 |
| 0.0262 | 0.0792 | 11.4603 | 0.3728 | 0.0760 | 10.0035 | 22 |
| 0.0224 | 0.0792 | 11.4330 | 0.3824 | 0.0760 | 9.1995 | 23 |
| 0.0181 | 0.0793 | 11.3124 | 0.3982 | 0.0759 | 9.8710 | 24 |
| 0.0142 | 0.0794 | 11.3562 | 0.4057 | 0.0760 | 9.6831 | 25 |
| 0.0118 | 0.0794 | 11.0532 | 0.4207 | 0.0759 | 9.7227 | 26 |
| 0.0101 | 0.0794 | 11.2963 | 0.4282 | 0.0760 | 9.5792 | 27 |
| 0.0114 | 0.0794 | 11.3093 | 0.4431 | 0.0758 | 9.5545 | 28 |
| 0.0109 | 0.0794 | 11.4214 | 0.4419 | 0.0760 | 9.4377 | 29 |
| 0.0084 | 0.0794 | 10.9143 | 0.4474 | 0.0760 | 9.3668 | 30 |
| 0.0043 | 0.0795 | 10.9497 | 0.4525 | 0.0761 | 9.3202 | 31 |
| 0.0036 | 0.0795 | 10.7759 | 0.4667 | 0.0761 | 9.0385 | 32 |
| 0.0047 | 0.0795 | 10.7613 | 0.4788 | 0.0759 | 9.4065 | 33 |
| 0.0130 | 0.0793 | 11.1022 | 0.4748 | 0.0760 | 9.4521 | 34 |
| 0.0074 | 0.0794 | 10.9738 | 0.4730 | 0.0760 | 9.3348 | 35 |
| 0.0032 | 0.0795 | 10.6370 | 0.4750 | 0.0762 | 8.8298 | 36 |
| 0.0020 | 0.0795 | 10.7428 | 0.4835 | 0.0762 | 9.0566 | 37 |
| 0.0014 | 0.0795 | 10.6908 | 0.4937 | 0.0761 | 9.2445 | 38 |
| 0.0035 | 0.0795 | 10.6833 | 0.5276 | 0.0757 | 8.9798 | 39 |
| 0.0120 | 0.0793 | 10.4810 | 0.4963 | 0.0760 | 8.9194 | 40 |
| 0.0045 | 0.0795 | 10.2251 | 0.5014 | 0.0761 | 8.5737 | 41 |
| 0.0028 | 0.0795 | 10.3174 | 0.4968 | 0.0762 | 8.8525 | 42 |
| 0.0023 | 0.0795 | 10.4871 | 0.5027 | 0.0762 | 8.6712 | 43 |
| 0.0024 | 0.0795 | 10.3731 | 0.5055 | 0.0762 | 8.6347 | 44 |
| 0.0041 | 0.0795 | 10.2751 | 0.5242 | 0.0760 | 8.3671 | 45 |
| 0.0070 | 0.0794 | 10.2166 | 0.5169 | 0.0760 | 8.8409 | 46 |
| 0.0037 | 0.0795 | 10.0455 | 0.5174 | 0.0762 | 8.2514 | 47 |
| 0.0023 | 0.0795 | 9.9201 | 0.5167 | 0.0763 | 8.9537 | 48 |
| 0.0008 | 0.0795 | 10.0022 | 0.5166 | 0.0764 | 8.4855 | 49 |
| 0.0006 | 0.0795 | 9.9494 | 0.5233 | 0.0763 | 8.5719 | 50 |
| 0.0069 | 0.0794 | 10.2037 | 0.5434 | 0.0759 | 8.5399 | 51 |
| 0.0083 | 0.0794 | 9.9557 | 0.5173 | 0.0762 | 8.2406 | 52 |
| 0.0032 | 0.0795 | 10.0283 | 0.5240 | 0.0763 | 9.0101 | 53 |
| 0.0018 | 0.0795 | 10.0694 | 0.5247 | 0.0763 | 8.5717 | 54 |
| 0.0008 | 0.0795 | 10.1079 | 0.5217 | 0.0764 | 8.5608 | 55 |
| 0.0005 | 0.0795 | 10.0546 | 0.5286 | 0.0764 | 8.8830 | 56 |
| 0.0007 | 0.0795 | 10.2557 | 0.5328 | 0.0764 | 8.5665 | 57 |
| 0.0006 | 0.0795 | 10.2165 | 0.5412 | 0.0763 | 8.4623 | 58 |
| 0.0124 | 0.0792 | 10.2304 | 0.5284 | 0.0762 | 9.1194 | 59 |
| 0.0044 | 0.0795 | 10.3884 | 0.5223 | 0.0764 | 8.8152 | 60 |
| 0.0015 | 0.0795 | 9.8557 | 0.5227 | 0.0764 | 8.3774 | 61 |
| 0.0005 | 0.0795 | 9.8123 | 0.5233 | 0.0765 | 8.5043 | 62 |
| 0.0003 | 0.0795 | 9.7631 | 0.5282 | 0.0765 | 8.3860 | 63 |
| 0.0003 | 0.0795 | 9.7593 | 0.5320 | 0.0765 | 8.4815 | 64 |
| 0.0002 | 0.0795 | 9.7663 | 0.5357 | 0.0765 | 8.4281 | 65 |
| 0.0034 | 0.0795 | 9.8382 | 0.5771 | 0.0758 | 8.8051 | 66 |
| 0.0123 | 0.0792 | 10.2575 | 0.5261 | 0.0763 | 9.3701 | 67 |
| 0.0027 | 0.0795 | 10.3802 | 0.5272 | 0.0764 | 8.8216 | 68 |
| 0.0011 | 0.0795 | 10.1683 | 0.5291 | 0.0764 | 8.5736 | 69 |
| 0.0012 | 0.0795 | 10.1305 | 0.5336 | 0.0765 | 8.6648 | 70 |
| 0.0008 | 0.0795 | 10.2545 | 0.5315 | 0.0765 | 9.0617 | 71 |
| 0.0006 | 0.0795 | 10.4562 | 0.5369 | 0.0765 | 9.6485 | 72 |
| 0.0032 | 0.0795 | 10.2347 | 0.5569 | 0.0763 | 8.4947 | 73 |
| 0.0062 | 0.0794 | 10.1654 | 0.5471 | 0.0763 | 8.8666 | 74 |
| 0.0029 | 0.0795 | 10.1320 | 0.5376 | 0.0765 | 8.7713 | 75 |
| 0.0012 | 0.0795 | 10.2943 | 0.5406 | 0.0765 | 8.6959 | 76 |
| 0.0006 | 0.0795 | 10.1888 | 0.5371 | 0.0767 | 8.9689 | 77 |
| 0.0005 | 0.0795 | 10.2138 | 0.5398 | 0.0766 | 8.7470 | 78 |
| 0.0016 | 0.0795 | 10.2173 | 0.5497 | 0.0764 | 8.9675 | 79 |
| 0.0065 | 0.0794 | 10.2806 | 0.5559 | 0.0763 | 9.4487 | 80 |
| 0.0028 | 0.0795 | 10.7728 | 0.5394 | 0.0766 | 8.9716 | 81 |
| 0.0012 | 0.0795 | 10.3247 | 0.5453 | 0.0765 | 8.9986 | 82 |
| 0.0013 | 0.0795 | 10.3174 | 0.5535 | 0.0765 | 8.9229 | 83 |
| 0.0011 | 0.0795 | 10.2846 | 0.5452 | 0.0766 | 9.1239 | 84 |
| 0.0007 | 0.0795 | 10.1996 | 0.5491 | 0.0766 | 8.9308 | 85 |
| 0.0034 | 0.0795 | 10.5048 | 0.5578 | 0.0764 | 8.9920 | 86 |
| 0.0038 | 0.0795 | 10.1430 | 0.5538 | 0.0765 | 9.1635 | 87 |
| 0.0019 | 0.0795 | 10.3176 | 0.5492 | 0.0766 | 8.5812 | 88 |
| 0.0007 | 0.0795 | 10.2569 | 0.5488 | 0.0766 | 8.9133 | 89 |
| 0.0006 | 0.0795 | 10.2538 | 0.5541 | 0.0766 | 8.7676 | 90 |
| 0.0029 | 0.0795 | 10.1412 | 0.5666 | 0.0764 | 9.0822 | 91 |
| 0.0042 | 0.0795 | 9.5603 | 0.5582 | 0.0765 | 7.6837 | 92 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
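The card itself stops at the framework versions; here is a hedged inference sketch (assuming the repository ships the Whisper processor files alongside the TF weights; if not, the `openai/whisper-tiny` processor should match, since that is the stated base model):
```python
import numpy as np
from transformers import TFWhisperForConditionalGeneration, WhisperProcessor

repo = "bigmorning/whisper_charsplit_new_0093"

# Assumption: processor files are present in the repo; fall back to
# "openai/whisper-tiny" for the processor if they are not.
processor = WhisperProcessor.from_pretrained(repo)
model = TFWhisperForConditionalGeneration.from_pretrained(repo)

# One second of 16 kHz silence stands in for real audio in this sketch.
audio = np.zeros(16000, dtype=np.float32)
inputs = processor(audio, sampling_rate=16000, return_tensors="tf")

generated = model.generate(inputs.input_features)
print(processor.batch_decode(generated, skip_special_tokens=True))
```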
|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e6_s108_v4_l4_r2
|
KingKazma
| 2023-08-13T16:00:59Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T16:00:57Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e7_s108_v4_l4_r4
|
KingKazma
| 2023-08-13T15:59:56Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T15:59:53Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
Sakil/llama2_finetuned_medical_assistant
|
Sakil
| 2023-08-13T15:59:29Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T13:39:26Z |
---
library_name: peft
---
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: False
- bnb_4bit_compute_dtype: float16
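The bullet list above maps one-to-one onto a `BitsAndBytesConfig`; as a sketch using the standard Transformers API (the class and keyword names are assumptions of that API, not taken from the card):
```python
import torch
from transformers import BitsAndBytesConfig

# Equivalent of the quantization settings listed above.
bnb_config = BitsAndBytesConfig(
    load_in_8bit=False,
    load_in_4bit=True,
    llm_int8_threshold=6.0,
    llm_int8_skip_modules=None,
    llm_int8_enable_fp32_cpu_offload=False,
    llm_int8_has_fp16_weight=False,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=False,
    bnb_4bit_compute_dtype=torch.float16,
)
```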
### Framework versions
- PEFT 0.4.0
|
fathyshalab/mdcsi-unterhaltung-kultur-freizeit-setfit
|
fathyshalab
| 2023-08-13T15:56:44Z | 5 | 0 |
sentence-transformers
|
[
"sentence-transformers",
"pytorch",
"roberta",
"setfit",
"text-classification",
"arxiv:2209.11055",
"license:apache-2.0",
"region:us"
] |
text-classification
| 2023-08-13T15:55:53Z |
---
license: apache-2.0
tags:
- setfit
- sentence-transformers
- text-classification
pipeline_tag: text-classification
---
# fathyshalab/mdcsi-unterhaltung-kultur-freizeit-setfit
This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
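A minimal training sketch of that two-step recipe (hedged: `SetFitTrainer`, `CosineSimilarityLoss`, the base checkpoint, and the toy data below are illustrative assumptions drawn from the standard setfit API, not details from this card):
```python
from datasets import Dataset
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import SetFitModel, SetFitTrainer

# Toy few-shot data; the card's actual training data is undocumented.
train_ds = Dataset.from_dict({
    "text": ["great concert last night", "the museum tour was dull"],
    "label": [1, 0],
})

# Placeholder base checkpoint; the card does not state which one was used.
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

trainer = SetFitTrainer(
    model=model,
    train_dataset=train_ds,
    loss_class=CosineSimilarityLoss,  # step 1: contrastive fine-tuning
    num_iterations=20,                # sentence pairs generated per example
)
trainer.train()  # fine-tunes the body, then fits the classification head (step 2)
```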
## Usage
To use this model for inference, first install the SetFit library:
```bash
python -m pip install setfit
```
You can then run inference as follows:
```python
from setfit import SetFitModel
# Download from Hub and run inference
model = SetFitModel.from_pretrained("fathyshalab/mdcsi-unterhaltung-kultur-freizeit-setfit")
# Run inference
preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"])
```
## BibTeX entry and citation info
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e2_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T15:55:31Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T15:55:30Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e5_s108_v4_l4_r2
|
KingKazma
| 2023-08-13T15:53:39Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T15:53:37Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e6_s108_v4_l4_r4
|
KingKazma
| 2023-08-13T15:53:04Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T15:53:02Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_0091
|
bigmorning
| 2023-08-13T15:52:18Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T15:52:10Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_0091
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_0091
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0006
- Train Accuracy: 0.0795
- Train Wermet: 10.2538
- Validation Loss: 0.5541
- Validation Accuracy: 0.0766
- Validation Wermet: 8.7676
- Epoch: 90
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.8733 | 0.0602 | 13.0686 | 0.6470 | 0.0676 | 11.4066 | 0 |
| 0.5740 | 0.0666 | 12.7778 | 0.5113 | 0.0706 | 11.1022 | 1 |
| 0.4553 | 0.0692 | 12.2404 | 0.4371 | 0.0723 | 10.9105 | 2 |
| 0.3813 | 0.0708 | 11.9157 | 0.3935 | 0.0733 | 9.4615 | 3 |
| 0.3292 | 0.0720 | 11.5732 | 0.3630 | 0.0740 | 9.9885 | 4 |
| 0.2886 | 0.0729 | 11.5171 | 0.3403 | 0.0745 | 9.8042 | 5 |
| 0.2561 | 0.0736 | 11.3173 | 0.3256 | 0.0749 | 9.9431 | 6 |
| 0.2282 | 0.0743 | 11.7308 | 0.3159 | 0.0752 | 9.2086 | 7 |
| 0.2036 | 0.0748 | 11.4503 | 0.3071 | 0.0754 | 9.5236 | 8 |
| 0.1820 | 0.0754 | 11.7175 | 0.3005 | 0.0756 | 10.0755 | 9 |
| 0.1628 | 0.0758 | 11.7056 | 0.2993 | 0.0757 | 9.9497 | 10 |
| 0.1450 | 0.0762 | 11.7637 | 0.2971 | 0.0758 | 10.1481 | 11 |
| 0.1287 | 0.0766 | 11.8509 | 0.3029 | 0.0759 | 10.2042 | 12 |
| 0.1140 | 0.0770 | 12.1100 | 0.3004 | 0.0760 | 10.3873 | 13 |
| 0.0998 | 0.0773 | 11.9502 | 0.3025 | 0.0761 | 10.7066 | 14 |
| 0.0872 | 0.0777 | 12.3196 | 0.3129 | 0.0759 | 10.7707 | 15 |
| 0.0760 | 0.0779 | 12.2637 | 0.3142 | 0.0761 | 10.2638 | 16 |
| 0.0651 | 0.0782 | 12.1215 | 0.3192 | 0.0761 | 10.0750 | 17 |
| 0.0547 | 0.0785 | 12.0551 | 0.3294 | 0.0761 | 10.4732 | 18 |
| 0.0463 | 0.0787 | 11.9677 | 0.3402 | 0.0760 | 10.2814 | 19 |
| 0.0386 | 0.0789 | 11.6855 | 0.3517 | 0.0760 | 10.0599 | 20 |
| 0.0318 | 0.0790 | 11.6314 | 0.3628 | 0.0760 | 9.6652 | 21 |
| 0.0262 | 0.0792 | 11.4603 | 0.3728 | 0.0760 | 10.0035 | 22 |
| 0.0224 | 0.0792 | 11.4330 | 0.3824 | 0.0760 | 9.1995 | 23 |
| 0.0181 | 0.0793 | 11.3124 | 0.3982 | 0.0759 | 9.8710 | 24 |
| 0.0142 | 0.0794 | 11.3562 | 0.4057 | 0.0760 | 9.6831 | 25 |
| 0.0118 | 0.0794 | 11.0532 | 0.4207 | 0.0759 | 9.7227 | 26 |
| 0.0101 | 0.0794 | 11.2963 | 0.4282 | 0.0760 | 9.5792 | 27 |
| 0.0114 | 0.0794 | 11.3093 | 0.4431 | 0.0758 | 9.5545 | 28 |
| 0.0109 | 0.0794 | 11.4214 | 0.4419 | 0.0760 | 9.4377 | 29 |
| 0.0084 | 0.0794 | 10.9143 | 0.4474 | 0.0760 | 9.3668 | 30 |
| 0.0043 | 0.0795 | 10.9497 | 0.4525 | 0.0761 | 9.3202 | 31 |
| 0.0036 | 0.0795 | 10.7759 | 0.4667 | 0.0761 | 9.0385 | 32 |
| 0.0047 | 0.0795 | 10.7613 | 0.4788 | 0.0759 | 9.4065 | 33 |
| 0.0130 | 0.0793 | 11.1022 | 0.4748 | 0.0760 | 9.4521 | 34 |
| 0.0074 | 0.0794 | 10.9738 | 0.4730 | 0.0760 | 9.3348 | 35 |
| 0.0032 | 0.0795 | 10.6370 | 0.4750 | 0.0762 | 8.8298 | 36 |
| 0.0020 | 0.0795 | 10.7428 | 0.4835 | 0.0762 | 9.0566 | 37 |
| 0.0014 | 0.0795 | 10.6908 | 0.4937 | 0.0761 | 9.2445 | 38 |
| 0.0035 | 0.0795 | 10.6833 | 0.5276 | 0.0757 | 8.9798 | 39 |
| 0.0120 | 0.0793 | 10.4810 | 0.4963 | 0.0760 | 8.9194 | 40 |
| 0.0045 | 0.0795 | 10.2251 | 0.5014 | 0.0761 | 8.5737 | 41 |
| 0.0028 | 0.0795 | 10.3174 | 0.4968 | 0.0762 | 8.8525 | 42 |
| 0.0023 | 0.0795 | 10.4871 | 0.5027 | 0.0762 | 8.6712 | 43 |
| 0.0024 | 0.0795 | 10.3731 | 0.5055 | 0.0762 | 8.6347 | 44 |
| 0.0041 | 0.0795 | 10.2751 | 0.5242 | 0.0760 | 8.3671 | 45 |
| 0.0070 | 0.0794 | 10.2166 | 0.5169 | 0.0760 | 8.8409 | 46 |
| 0.0037 | 0.0795 | 10.0455 | 0.5174 | 0.0762 | 8.2514 | 47 |
| 0.0023 | 0.0795 | 9.9201 | 0.5167 | 0.0763 | 8.9537 | 48 |
| 0.0008 | 0.0795 | 10.0022 | 0.5166 | 0.0764 | 8.4855 | 49 |
| 0.0006 | 0.0795 | 9.9494 | 0.5233 | 0.0763 | 8.5719 | 50 |
| 0.0069 | 0.0794 | 10.2037 | 0.5434 | 0.0759 | 8.5399 | 51 |
| 0.0083 | 0.0794 | 9.9557 | 0.5173 | 0.0762 | 8.2406 | 52 |
| 0.0032 | 0.0795 | 10.0283 | 0.5240 | 0.0763 | 9.0101 | 53 |
| 0.0018 | 0.0795 | 10.0694 | 0.5247 | 0.0763 | 8.5717 | 54 |
| 0.0008 | 0.0795 | 10.1079 | 0.5217 | 0.0764 | 8.5608 | 55 |
| 0.0005 | 0.0795 | 10.0546 | 0.5286 | 0.0764 | 8.8830 | 56 |
| 0.0007 | 0.0795 | 10.2557 | 0.5328 | 0.0764 | 8.5665 | 57 |
| 0.0006 | 0.0795 | 10.2165 | 0.5412 | 0.0763 | 8.4623 | 58 |
| 0.0124 | 0.0792 | 10.2304 | 0.5284 | 0.0762 | 9.1194 | 59 |
| 0.0044 | 0.0795 | 10.3884 | 0.5223 | 0.0764 | 8.8152 | 60 |
| 0.0015 | 0.0795 | 9.8557 | 0.5227 | 0.0764 | 8.3774 | 61 |
| 0.0005 | 0.0795 | 9.8123 | 0.5233 | 0.0765 | 8.5043 | 62 |
| 0.0003 | 0.0795 | 9.7631 | 0.5282 | 0.0765 | 8.3860 | 63 |
| 0.0003 | 0.0795 | 9.7593 | 0.5320 | 0.0765 | 8.4815 | 64 |
| 0.0002 | 0.0795 | 9.7663 | 0.5357 | 0.0765 | 8.4281 | 65 |
| 0.0034 | 0.0795 | 9.8382 | 0.5771 | 0.0758 | 8.8051 | 66 |
| 0.0123 | 0.0792 | 10.2575 | 0.5261 | 0.0763 | 9.3701 | 67 |
| 0.0027 | 0.0795 | 10.3802 | 0.5272 | 0.0764 | 8.8216 | 68 |
| 0.0011 | 0.0795 | 10.1683 | 0.5291 | 0.0764 | 8.5736 | 69 |
| 0.0012 | 0.0795 | 10.1305 | 0.5336 | 0.0765 | 8.6648 | 70 |
| 0.0008 | 0.0795 | 10.2545 | 0.5315 | 0.0765 | 9.0617 | 71 |
| 0.0006 | 0.0795 | 10.4562 | 0.5369 | 0.0765 | 9.6485 | 72 |
| 0.0032 | 0.0795 | 10.2347 | 0.5569 | 0.0763 | 8.4947 | 73 |
| 0.0062 | 0.0794 | 10.1654 | 0.5471 | 0.0763 | 8.8666 | 74 |
| 0.0029 | 0.0795 | 10.1320 | 0.5376 | 0.0765 | 8.7713 | 75 |
| 0.0012 | 0.0795 | 10.2943 | 0.5406 | 0.0765 | 8.6959 | 76 |
| 0.0006 | 0.0795 | 10.1888 | 0.5371 | 0.0767 | 8.9689 | 77 |
| 0.0005 | 0.0795 | 10.2138 | 0.5398 | 0.0766 | 8.7470 | 78 |
| 0.0016 | 0.0795 | 10.2173 | 0.5497 | 0.0764 | 8.9675 | 79 |
| 0.0065 | 0.0794 | 10.2806 | 0.5559 | 0.0763 | 9.4487 | 80 |
| 0.0028 | 0.0795 | 10.7728 | 0.5394 | 0.0766 | 8.9716 | 81 |
| 0.0012 | 0.0795 | 10.3247 | 0.5453 | 0.0765 | 8.9986 | 82 |
| 0.0013 | 0.0795 | 10.3174 | 0.5535 | 0.0765 | 8.9229 | 83 |
| 0.0011 | 0.0795 | 10.2846 | 0.5452 | 0.0766 | 9.1239 | 84 |
| 0.0007 | 0.0795 | 10.1996 | 0.5491 | 0.0766 | 8.9308 | 85 |
| 0.0034 | 0.0795 | 10.5048 | 0.5578 | 0.0764 | 8.9920 | 86 |
| 0.0038 | 0.0795 | 10.1430 | 0.5538 | 0.0765 | 9.1635 | 87 |
| 0.0019 | 0.0795 | 10.3176 | 0.5492 | 0.0766 | 8.5812 | 88 |
| 0.0007 | 0.0795 | 10.2569 | 0.5488 | 0.0766 | 8.9133 | 89 |
| 0.0006 | 0.0795 | 10.2538 | 0.5541 | 0.0766 | 8.7676 | 90 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_p_tuning_500_10_3000_8_e1_s108_v4_l4_v100
|
KingKazma
| 2023-08-13T15:46:55Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T15:46:54Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e4_s108_v4_l4_r2
|
KingKazma
| 2023-08-13T15:46:18Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T15:46:17Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e5_s108_v4_l4_r4
|
KingKazma
| 2023-08-13T15:46:12Z | 0 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T15:46:10Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_0089
|
bigmorning
| 2023-08-13T15:43:30Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T15:43:22Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_0089
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_0089
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0019
- Train Accuracy: 0.0795
- Train Wermet: 10.3176
- Validation Loss: 0.5492
- Validation Accuracy: 0.0766
- Validation Wermet: 8.5812
- Epoch: 88
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.8733 | 0.0602 | 13.0686 | 0.6470 | 0.0676 | 11.4066 | 0 |
| 0.5740 | 0.0666 | 12.7778 | 0.5113 | 0.0706 | 11.1022 | 1 |
| 0.4553 | 0.0692 | 12.2404 | 0.4371 | 0.0723 | 10.9105 | 2 |
| 0.3813 | 0.0708 | 11.9157 | 0.3935 | 0.0733 | 9.4615 | 3 |
| 0.3292 | 0.0720 | 11.5732 | 0.3630 | 0.0740 | 9.9885 | 4 |
| 0.2886 | 0.0729 | 11.5171 | 0.3403 | 0.0745 | 9.8042 | 5 |
| 0.2561 | 0.0736 | 11.3173 | 0.3256 | 0.0749 | 9.9431 | 6 |
| 0.2282 | 0.0743 | 11.7308 | 0.3159 | 0.0752 | 9.2086 | 7 |
| 0.2036 | 0.0748 | 11.4503 | 0.3071 | 0.0754 | 9.5236 | 8 |
| 0.1820 | 0.0754 | 11.7175 | 0.3005 | 0.0756 | 10.0755 | 9 |
| 0.1628 | 0.0758 | 11.7056 | 0.2993 | 0.0757 | 9.9497 | 10 |
| 0.1450 | 0.0762 | 11.7637 | 0.2971 | 0.0758 | 10.1481 | 11 |
| 0.1287 | 0.0766 | 11.8509 | 0.3029 | 0.0759 | 10.2042 | 12 |
| 0.1140 | 0.0770 | 12.1100 | 0.3004 | 0.0760 | 10.3873 | 13 |
| 0.0998 | 0.0773 | 11.9502 | 0.3025 | 0.0761 | 10.7066 | 14 |
| 0.0872 | 0.0777 | 12.3196 | 0.3129 | 0.0759 | 10.7707 | 15 |
| 0.0760 | 0.0779 | 12.2637 | 0.3142 | 0.0761 | 10.2638 | 16 |
| 0.0651 | 0.0782 | 12.1215 | 0.3192 | 0.0761 | 10.0750 | 17 |
| 0.0547 | 0.0785 | 12.0551 | 0.3294 | 0.0761 | 10.4732 | 18 |
| 0.0463 | 0.0787 | 11.9677 | 0.3402 | 0.0760 | 10.2814 | 19 |
| 0.0386 | 0.0789 | 11.6855 | 0.3517 | 0.0760 | 10.0599 | 20 |
| 0.0318 | 0.0790 | 11.6314 | 0.3628 | 0.0760 | 9.6652 | 21 |
| 0.0262 | 0.0792 | 11.4603 | 0.3728 | 0.0760 | 10.0035 | 22 |
| 0.0224 | 0.0792 | 11.4330 | 0.3824 | 0.0760 | 9.1995 | 23 |
| 0.0181 | 0.0793 | 11.3124 | 0.3982 | 0.0759 | 9.8710 | 24 |
| 0.0142 | 0.0794 | 11.3562 | 0.4057 | 0.0760 | 9.6831 | 25 |
| 0.0118 | 0.0794 | 11.0532 | 0.4207 | 0.0759 | 9.7227 | 26 |
| 0.0101 | 0.0794 | 11.2963 | 0.4282 | 0.0760 | 9.5792 | 27 |
| 0.0114 | 0.0794 | 11.3093 | 0.4431 | 0.0758 | 9.5545 | 28 |
| 0.0109 | 0.0794 | 11.4214 | 0.4419 | 0.0760 | 9.4377 | 29 |
| 0.0084 | 0.0794 | 10.9143 | 0.4474 | 0.0760 | 9.3668 | 30 |
| 0.0043 | 0.0795 | 10.9497 | 0.4525 | 0.0761 | 9.3202 | 31 |
| 0.0036 | 0.0795 | 10.7759 | 0.4667 | 0.0761 | 9.0385 | 32 |
| 0.0047 | 0.0795 | 10.7613 | 0.4788 | 0.0759 | 9.4065 | 33 |
| 0.0130 | 0.0793 | 11.1022 | 0.4748 | 0.0760 | 9.4521 | 34 |
| 0.0074 | 0.0794 | 10.9738 | 0.4730 | 0.0760 | 9.3348 | 35 |
| 0.0032 | 0.0795 | 10.6370 | 0.4750 | 0.0762 | 8.8298 | 36 |
| 0.0020 | 0.0795 | 10.7428 | 0.4835 | 0.0762 | 9.0566 | 37 |
| 0.0014 | 0.0795 | 10.6908 | 0.4937 | 0.0761 | 9.2445 | 38 |
| 0.0035 | 0.0795 | 10.6833 | 0.5276 | 0.0757 | 8.9798 | 39 |
| 0.0120 | 0.0793 | 10.4810 | 0.4963 | 0.0760 | 8.9194 | 40 |
| 0.0045 | 0.0795 | 10.2251 | 0.5014 | 0.0761 | 8.5737 | 41 |
| 0.0028 | 0.0795 | 10.3174 | 0.4968 | 0.0762 | 8.8525 | 42 |
| 0.0023 | 0.0795 | 10.4871 | 0.5027 | 0.0762 | 8.6712 | 43 |
| 0.0024 | 0.0795 | 10.3731 | 0.5055 | 0.0762 | 8.6347 | 44 |
| 0.0041 | 0.0795 | 10.2751 | 0.5242 | 0.0760 | 8.3671 | 45 |
| 0.0070 | 0.0794 | 10.2166 | 0.5169 | 0.0760 | 8.8409 | 46 |
| 0.0037 | 0.0795 | 10.0455 | 0.5174 | 0.0762 | 8.2514 | 47 |
| 0.0023 | 0.0795 | 9.9201 | 0.5167 | 0.0763 | 8.9537 | 48 |
| 0.0008 | 0.0795 | 10.0022 | 0.5166 | 0.0764 | 8.4855 | 49 |
| 0.0006 | 0.0795 | 9.9494 | 0.5233 | 0.0763 | 8.5719 | 50 |
| 0.0069 | 0.0794 | 10.2037 | 0.5434 | 0.0759 | 8.5399 | 51 |
| 0.0083 | 0.0794 | 9.9557 | 0.5173 | 0.0762 | 8.2406 | 52 |
| 0.0032 | 0.0795 | 10.0283 | 0.5240 | 0.0763 | 9.0101 | 53 |
| 0.0018 | 0.0795 | 10.0694 | 0.5247 | 0.0763 | 8.5717 | 54 |
| 0.0008 | 0.0795 | 10.1079 | 0.5217 | 0.0764 | 8.5608 | 55 |
| 0.0005 | 0.0795 | 10.0546 | 0.5286 | 0.0764 | 8.8830 | 56 |
| 0.0007 | 0.0795 | 10.2557 | 0.5328 | 0.0764 | 8.5665 | 57 |
| 0.0006 | 0.0795 | 10.2165 | 0.5412 | 0.0763 | 8.4623 | 58 |
| 0.0124 | 0.0792 | 10.2304 | 0.5284 | 0.0762 | 9.1194 | 59 |
| 0.0044 | 0.0795 | 10.3884 | 0.5223 | 0.0764 | 8.8152 | 60 |
| 0.0015 | 0.0795 | 9.8557 | 0.5227 | 0.0764 | 8.3774 | 61 |
| 0.0005 | 0.0795 | 9.8123 | 0.5233 | 0.0765 | 8.5043 | 62 |
| 0.0003 | 0.0795 | 9.7631 | 0.5282 | 0.0765 | 8.3860 | 63 |
| 0.0003 | 0.0795 | 9.7593 | 0.5320 | 0.0765 | 8.4815 | 64 |
| 0.0002 | 0.0795 | 9.7663 | 0.5357 | 0.0765 | 8.4281 | 65 |
| 0.0034 | 0.0795 | 9.8382 | 0.5771 | 0.0758 | 8.8051 | 66 |
| 0.0123 | 0.0792 | 10.2575 | 0.5261 | 0.0763 | 9.3701 | 67 |
| 0.0027 | 0.0795 | 10.3802 | 0.5272 | 0.0764 | 8.8216 | 68 |
| 0.0011 | 0.0795 | 10.1683 | 0.5291 | 0.0764 | 8.5736 | 69 |
| 0.0012 | 0.0795 | 10.1305 | 0.5336 | 0.0765 | 8.6648 | 70 |
| 0.0008 | 0.0795 | 10.2545 | 0.5315 | 0.0765 | 9.0617 | 71 |
| 0.0006 | 0.0795 | 10.4562 | 0.5369 | 0.0765 | 9.6485 | 72 |
| 0.0032 | 0.0795 | 10.2347 | 0.5569 | 0.0763 | 8.4947 | 73 |
| 0.0062 | 0.0794 | 10.1654 | 0.5471 | 0.0763 | 8.8666 | 74 |
| 0.0029 | 0.0795 | 10.1320 | 0.5376 | 0.0765 | 8.7713 | 75 |
| 0.0012 | 0.0795 | 10.2943 | 0.5406 | 0.0765 | 8.6959 | 76 |
| 0.0006 | 0.0795 | 10.1888 | 0.5371 | 0.0767 | 8.9689 | 77 |
| 0.0005 | 0.0795 | 10.2138 | 0.5398 | 0.0766 | 8.7470 | 78 |
| 0.0016 | 0.0795 | 10.2173 | 0.5497 | 0.0764 | 8.9675 | 79 |
| 0.0065 | 0.0794 | 10.2806 | 0.5559 | 0.0763 | 9.4487 | 80 |
| 0.0028 | 0.0795 | 10.7728 | 0.5394 | 0.0766 | 8.9716 | 81 |
| 0.0012 | 0.0795 | 10.3247 | 0.5453 | 0.0765 | 8.9986 | 82 |
| 0.0013 | 0.0795 | 10.3174 | 0.5535 | 0.0765 | 8.9229 | 83 |
| 0.0011 | 0.0795 | 10.2846 | 0.5452 | 0.0766 | 9.1239 | 84 |
| 0.0007 | 0.0795 | 10.1996 | 0.5491 | 0.0766 | 8.9308 | 85 |
| 0.0034 | 0.0795 | 10.5048 | 0.5578 | 0.0764 | 8.9920 | 86 |
| 0.0038 | 0.0795 | 10.1430 | 0.5538 | 0.0765 | 9.1635 | 87 |
| 0.0019 | 0.0795 | 10.3176 | 0.5492 | 0.0766 | 8.5812 | 88 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/xsum_gpt2_lora_500_10_3000_8_e4_s108_v4_l4_r4
|
KingKazma
| 2023-08-13T15:39:20Z | 1 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T15:39:18Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_0088
|
bigmorning
| 2023-08-13T15:39:10Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T15:39:03Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_0088
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_0088
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0038
- Train Accuracy: 0.0795
- Train Wermet: 10.1430
- Validation Loss: 0.5538
- Validation Accuracy: 0.0765
- Validation Wermet: 9.1635
- Epoch: 87
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.8733 | 0.0602 | 13.0686 | 0.6470 | 0.0676 | 11.4066 | 0 |
| 0.5740 | 0.0666 | 12.7778 | 0.5113 | 0.0706 | 11.1022 | 1 |
| 0.4553 | 0.0692 | 12.2404 | 0.4371 | 0.0723 | 10.9105 | 2 |
| 0.3813 | 0.0708 | 11.9157 | 0.3935 | 0.0733 | 9.4615 | 3 |
| 0.3292 | 0.0720 | 11.5732 | 0.3630 | 0.0740 | 9.9885 | 4 |
| 0.2886 | 0.0729 | 11.5171 | 0.3403 | 0.0745 | 9.8042 | 5 |
| 0.2561 | 0.0736 | 11.3173 | 0.3256 | 0.0749 | 9.9431 | 6 |
| 0.2282 | 0.0743 | 11.7308 | 0.3159 | 0.0752 | 9.2086 | 7 |
| 0.2036 | 0.0748 | 11.4503 | 0.3071 | 0.0754 | 9.5236 | 8 |
| 0.1820 | 0.0754 | 11.7175 | 0.3005 | 0.0756 | 10.0755 | 9 |
| 0.1628 | 0.0758 | 11.7056 | 0.2993 | 0.0757 | 9.9497 | 10 |
| 0.1450 | 0.0762 | 11.7637 | 0.2971 | 0.0758 | 10.1481 | 11 |
| 0.1287 | 0.0766 | 11.8509 | 0.3029 | 0.0759 | 10.2042 | 12 |
| 0.1140 | 0.0770 | 12.1100 | 0.3004 | 0.0760 | 10.3873 | 13 |
| 0.0998 | 0.0773 | 11.9502 | 0.3025 | 0.0761 | 10.7066 | 14 |
| 0.0872 | 0.0777 | 12.3196 | 0.3129 | 0.0759 | 10.7707 | 15 |
| 0.0760 | 0.0779 | 12.2637 | 0.3142 | 0.0761 | 10.2638 | 16 |
| 0.0651 | 0.0782 | 12.1215 | 0.3192 | 0.0761 | 10.0750 | 17 |
| 0.0547 | 0.0785 | 12.0551 | 0.3294 | 0.0761 | 10.4732 | 18 |
| 0.0463 | 0.0787 | 11.9677 | 0.3402 | 0.0760 | 10.2814 | 19 |
| 0.0386 | 0.0789 | 11.6855 | 0.3517 | 0.0760 | 10.0599 | 20 |
| 0.0318 | 0.0790 | 11.6314 | 0.3628 | 0.0760 | 9.6652 | 21 |
| 0.0262 | 0.0792 | 11.4603 | 0.3728 | 0.0760 | 10.0035 | 22 |
| 0.0224 | 0.0792 | 11.4330 | 0.3824 | 0.0760 | 9.1995 | 23 |
| 0.0181 | 0.0793 | 11.3124 | 0.3982 | 0.0759 | 9.8710 | 24 |
| 0.0142 | 0.0794 | 11.3562 | 0.4057 | 0.0760 | 9.6831 | 25 |
| 0.0118 | 0.0794 | 11.0532 | 0.4207 | 0.0759 | 9.7227 | 26 |
| 0.0101 | 0.0794 | 11.2963 | 0.4282 | 0.0760 | 9.5792 | 27 |
| 0.0114 | 0.0794 | 11.3093 | 0.4431 | 0.0758 | 9.5545 | 28 |
| 0.0109 | 0.0794 | 11.4214 | 0.4419 | 0.0760 | 9.4377 | 29 |
| 0.0084 | 0.0794 | 10.9143 | 0.4474 | 0.0760 | 9.3668 | 30 |
| 0.0043 | 0.0795 | 10.9497 | 0.4525 | 0.0761 | 9.3202 | 31 |
| 0.0036 | 0.0795 | 10.7759 | 0.4667 | 0.0761 | 9.0385 | 32 |
| 0.0047 | 0.0795 | 10.7613 | 0.4788 | 0.0759 | 9.4065 | 33 |
| 0.0130 | 0.0793 | 11.1022 | 0.4748 | 0.0760 | 9.4521 | 34 |
| 0.0074 | 0.0794 | 10.9738 | 0.4730 | 0.0760 | 9.3348 | 35 |
| 0.0032 | 0.0795 | 10.6370 | 0.4750 | 0.0762 | 8.8298 | 36 |
| 0.0020 | 0.0795 | 10.7428 | 0.4835 | 0.0762 | 9.0566 | 37 |
| 0.0014 | 0.0795 | 10.6908 | 0.4937 | 0.0761 | 9.2445 | 38 |
| 0.0035 | 0.0795 | 10.6833 | 0.5276 | 0.0757 | 8.9798 | 39 |
| 0.0120 | 0.0793 | 10.4810 | 0.4963 | 0.0760 | 8.9194 | 40 |
| 0.0045 | 0.0795 | 10.2251 | 0.5014 | 0.0761 | 8.5737 | 41 |
| 0.0028 | 0.0795 | 10.3174 | 0.4968 | 0.0762 | 8.8525 | 42 |
| 0.0023 | 0.0795 | 10.4871 | 0.5027 | 0.0762 | 8.6712 | 43 |
| 0.0024 | 0.0795 | 10.3731 | 0.5055 | 0.0762 | 8.6347 | 44 |
| 0.0041 | 0.0795 | 10.2751 | 0.5242 | 0.0760 | 8.3671 | 45 |
| 0.0070 | 0.0794 | 10.2166 | 0.5169 | 0.0760 | 8.8409 | 46 |
| 0.0037 | 0.0795 | 10.0455 | 0.5174 | 0.0762 | 8.2514 | 47 |
| 0.0023 | 0.0795 | 9.9201 | 0.5167 | 0.0763 | 8.9537 | 48 |
| 0.0008 | 0.0795 | 10.0022 | 0.5166 | 0.0764 | 8.4855 | 49 |
| 0.0006 | 0.0795 | 9.9494 | 0.5233 | 0.0763 | 8.5719 | 50 |
| 0.0069 | 0.0794 | 10.2037 | 0.5434 | 0.0759 | 8.5399 | 51 |
| 0.0083 | 0.0794 | 9.9557 | 0.5173 | 0.0762 | 8.2406 | 52 |
| 0.0032 | 0.0795 | 10.0283 | 0.5240 | 0.0763 | 9.0101 | 53 |
| 0.0018 | 0.0795 | 10.0694 | 0.5247 | 0.0763 | 8.5717 | 54 |
| 0.0008 | 0.0795 | 10.1079 | 0.5217 | 0.0764 | 8.5608 | 55 |
| 0.0005 | 0.0795 | 10.0546 | 0.5286 | 0.0764 | 8.8830 | 56 |
| 0.0007 | 0.0795 | 10.2557 | 0.5328 | 0.0764 | 8.5665 | 57 |
| 0.0006 | 0.0795 | 10.2165 | 0.5412 | 0.0763 | 8.4623 | 58 |
| 0.0124 | 0.0792 | 10.2304 | 0.5284 | 0.0762 | 9.1194 | 59 |
| 0.0044 | 0.0795 | 10.3884 | 0.5223 | 0.0764 | 8.8152 | 60 |
| 0.0015 | 0.0795 | 9.8557 | 0.5227 | 0.0764 | 8.3774 | 61 |
| 0.0005 | 0.0795 | 9.8123 | 0.5233 | 0.0765 | 8.5043 | 62 |
| 0.0003 | 0.0795 | 9.7631 | 0.5282 | 0.0765 | 8.3860 | 63 |
| 0.0003 | 0.0795 | 9.7593 | 0.5320 | 0.0765 | 8.4815 | 64 |
| 0.0002 | 0.0795 | 9.7663 | 0.5357 | 0.0765 | 8.4281 | 65 |
| 0.0034 | 0.0795 | 9.8382 | 0.5771 | 0.0758 | 8.8051 | 66 |
| 0.0123 | 0.0792 | 10.2575 | 0.5261 | 0.0763 | 9.3701 | 67 |
| 0.0027 | 0.0795 | 10.3802 | 0.5272 | 0.0764 | 8.8216 | 68 |
| 0.0011 | 0.0795 | 10.1683 | 0.5291 | 0.0764 | 8.5736 | 69 |
| 0.0012 | 0.0795 | 10.1305 | 0.5336 | 0.0765 | 8.6648 | 70 |
| 0.0008 | 0.0795 | 10.2545 | 0.5315 | 0.0765 | 9.0617 | 71 |
| 0.0006 | 0.0795 | 10.4562 | 0.5369 | 0.0765 | 9.6485 | 72 |
| 0.0032 | 0.0795 | 10.2347 | 0.5569 | 0.0763 | 8.4947 | 73 |
| 0.0062 | 0.0794 | 10.1654 | 0.5471 | 0.0763 | 8.8666 | 74 |
| 0.0029 | 0.0795 | 10.1320 | 0.5376 | 0.0765 | 8.7713 | 75 |
| 0.0012 | 0.0795 | 10.2943 | 0.5406 | 0.0765 | 8.6959 | 76 |
| 0.0006 | 0.0795 | 10.1888 | 0.5371 | 0.0767 | 8.9689 | 77 |
| 0.0005 | 0.0795 | 10.2138 | 0.5398 | 0.0766 | 8.7470 | 78 |
| 0.0016 | 0.0795 | 10.2173 | 0.5497 | 0.0764 | 8.9675 | 79 |
| 0.0065 | 0.0794 | 10.2806 | 0.5559 | 0.0763 | 9.4487 | 80 |
| 0.0028 | 0.0795 | 10.7728 | 0.5394 | 0.0766 | 8.9716 | 81 |
| 0.0012 | 0.0795 | 10.3247 | 0.5453 | 0.0765 | 8.9986 | 82 |
| 0.0013 | 0.0795 | 10.3174 | 0.5535 | 0.0765 | 8.9229 | 83 |
| 0.0011 | 0.0795 | 10.2846 | 0.5452 | 0.0766 | 9.1239 | 84 |
| 0.0007 | 0.0795 | 10.1996 | 0.5491 | 0.0766 | 8.9308 | 85 |
| 0.0034 | 0.0795 | 10.5048 | 0.5578 | 0.0764 | 8.9920 | 86 |
| 0.0038 | 0.0795 | 10.1430 | 0.5538 | 0.0765 | 9.1635 | 87 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|
KingKazma/cnn_dailymail_gpt2_lora_500_10_3000_8_e3_s108_v4_l4_r2
|
KingKazma
| 2023-08-13T15:38:58Z | 1 | 0 |
peft
|
[
"peft",
"region:us"
] | null | 2023-08-13T15:38:56Z |
---
library_name: peft
---
## Training procedure
### Framework versions
- PEFT 0.5.0.dev0
|
bigmorning/whisper_charsplit_new_0087
|
bigmorning
| 2023-08-13T15:34:44Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"whisper",
"automatic-speech-recognition",
"generated_from_keras_callback",
"base_model:openai/whisper-tiny",
"base_model:finetune:openai/whisper-tiny",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-08-13T15:34:36Z |
---
license: apache-2.0
base_model: openai/whisper-tiny
tags:
- generated_from_keras_callback
model-index:
- name: whisper_charsplit_new_0087
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# whisper_charsplit_new_0087
This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0034
- Train Accuracy: 0.0795
- Train Wermet: 10.5048
- Validation Loss: 0.5578
- Validation Accuracy: 0.0764
- Validation Wermet: 8.9920
- Epoch: 86
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 1e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
### Training results
| Train Loss | Train Accuracy | Train Wermet | Validation Loss | Validation Accuracy | Validation Wermet | Epoch |
|:----------:|:--------------:|:------------:|:---------------:|:-------------------:|:-----------------:|:-----:|
| 0.8733 | 0.0602 | 13.0686 | 0.6470 | 0.0676 | 11.4066 | 0 |
| 0.5740 | 0.0666 | 12.7778 | 0.5113 | 0.0706 | 11.1022 | 1 |
| 0.4553 | 0.0692 | 12.2404 | 0.4371 | 0.0723 | 10.9105 | 2 |
| 0.3813 | 0.0708 | 11.9157 | 0.3935 | 0.0733 | 9.4615 | 3 |
| 0.3292 | 0.0720 | 11.5732 | 0.3630 | 0.0740 | 9.9885 | 4 |
| 0.2886 | 0.0729 | 11.5171 | 0.3403 | 0.0745 | 9.8042 | 5 |
| 0.2561 | 0.0736 | 11.3173 | 0.3256 | 0.0749 | 9.9431 | 6 |
| 0.2282 | 0.0743 | 11.7308 | 0.3159 | 0.0752 | 9.2086 | 7 |
| 0.2036 | 0.0748 | 11.4503 | 0.3071 | 0.0754 | 9.5236 | 8 |
| 0.1820 | 0.0754 | 11.7175 | 0.3005 | 0.0756 | 10.0755 | 9 |
| 0.1628 | 0.0758 | 11.7056 | 0.2993 | 0.0757 | 9.9497 | 10 |
| 0.1450 | 0.0762 | 11.7637 | 0.2971 | 0.0758 | 10.1481 | 11 |
| 0.1287 | 0.0766 | 11.8509 | 0.3029 | 0.0759 | 10.2042 | 12 |
| 0.1140 | 0.0770 | 12.1100 | 0.3004 | 0.0760 | 10.3873 | 13 |
| 0.0998 | 0.0773 | 11.9502 | 0.3025 | 0.0761 | 10.7066 | 14 |
| 0.0872 | 0.0777 | 12.3196 | 0.3129 | 0.0759 | 10.7707 | 15 |
| 0.0760 | 0.0779 | 12.2637 | 0.3142 | 0.0761 | 10.2638 | 16 |
| 0.0651 | 0.0782 | 12.1215 | 0.3192 | 0.0761 | 10.0750 | 17 |
| 0.0547 | 0.0785 | 12.0551 | 0.3294 | 0.0761 | 10.4732 | 18 |
| 0.0463 | 0.0787 | 11.9677 | 0.3402 | 0.0760 | 10.2814 | 19 |
| 0.0386 | 0.0789 | 11.6855 | 0.3517 | 0.0760 | 10.0599 | 20 |
| 0.0318 | 0.0790 | 11.6314 | 0.3628 | 0.0760 | 9.6652 | 21 |
| 0.0262 | 0.0792 | 11.4603 | 0.3728 | 0.0760 | 10.0035 | 22 |
| 0.0224 | 0.0792 | 11.4330 | 0.3824 | 0.0760 | 9.1995 | 23 |
| 0.0181 | 0.0793 | 11.3124 | 0.3982 | 0.0759 | 9.8710 | 24 |
| 0.0142 | 0.0794 | 11.3562 | 0.4057 | 0.0760 | 9.6831 | 25 |
| 0.0118 | 0.0794 | 11.0532 | 0.4207 | 0.0759 | 9.7227 | 26 |
| 0.0101 | 0.0794 | 11.2963 | 0.4282 | 0.0760 | 9.5792 | 27 |
| 0.0114 | 0.0794 | 11.3093 | 0.4431 | 0.0758 | 9.5545 | 28 |
| 0.0109 | 0.0794 | 11.4214 | 0.4419 | 0.0760 | 9.4377 | 29 |
| 0.0084 | 0.0794 | 10.9143 | 0.4474 | 0.0760 | 9.3668 | 30 |
| 0.0043 | 0.0795 | 10.9497 | 0.4525 | 0.0761 | 9.3202 | 31 |
| 0.0036 | 0.0795 | 10.7759 | 0.4667 | 0.0761 | 9.0385 | 32 |
| 0.0047 | 0.0795 | 10.7613 | 0.4788 | 0.0759 | 9.4065 | 33 |
| 0.0130 | 0.0793 | 11.1022 | 0.4748 | 0.0760 | 9.4521 | 34 |
| 0.0074 | 0.0794 | 10.9738 | 0.4730 | 0.0760 | 9.3348 | 35 |
| 0.0032 | 0.0795 | 10.6370 | 0.4750 | 0.0762 | 8.8298 | 36 |
| 0.0020 | 0.0795 | 10.7428 | 0.4835 | 0.0762 | 9.0566 | 37 |
| 0.0014 | 0.0795 | 10.6908 | 0.4937 | 0.0761 | 9.2445 | 38 |
| 0.0035 | 0.0795 | 10.6833 | 0.5276 | 0.0757 | 8.9798 | 39 |
| 0.0120 | 0.0793 | 10.4810 | 0.4963 | 0.0760 | 8.9194 | 40 |
| 0.0045 | 0.0795 | 10.2251 | 0.5014 | 0.0761 | 8.5737 | 41 |
| 0.0028 | 0.0795 | 10.3174 | 0.4968 | 0.0762 | 8.8525 | 42 |
| 0.0023 | 0.0795 | 10.4871 | 0.5027 | 0.0762 | 8.6712 | 43 |
| 0.0024 | 0.0795 | 10.3731 | 0.5055 | 0.0762 | 8.6347 | 44 |
| 0.0041 | 0.0795 | 10.2751 | 0.5242 | 0.0760 | 8.3671 | 45 |
| 0.0070 | 0.0794 | 10.2166 | 0.5169 | 0.0760 | 8.8409 | 46 |
| 0.0037 | 0.0795 | 10.0455 | 0.5174 | 0.0762 | 8.2514 | 47 |
| 0.0023 | 0.0795 | 9.9201 | 0.5167 | 0.0763 | 8.9537 | 48 |
| 0.0008 | 0.0795 | 10.0022 | 0.5166 | 0.0764 | 8.4855 | 49 |
| 0.0006 | 0.0795 | 9.9494 | 0.5233 | 0.0763 | 8.5719 | 50 |
| 0.0069 | 0.0794 | 10.2037 | 0.5434 | 0.0759 | 8.5399 | 51 |
| 0.0083 | 0.0794 | 9.9557 | 0.5173 | 0.0762 | 8.2406 | 52 |
| 0.0032 | 0.0795 | 10.0283 | 0.5240 | 0.0763 | 9.0101 | 53 |
| 0.0018 | 0.0795 | 10.0694 | 0.5247 | 0.0763 | 8.5717 | 54 |
| 0.0008 | 0.0795 | 10.1079 | 0.5217 | 0.0764 | 8.5608 | 55 |
| 0.0005 | 0.0795 | 10.0546 | 0.5286 | 0.0764 | 8.8830 | 56 |
| 0.0007 | 0.0795 | 10.2557 | 0.5328 | 0.0764 | 8.5665 | 57 |
| 0.0006 | 0.0795 | 10.2165 | 0.5412 | 0.0763 | 8.4623 | 58 |
| 0.0124 | 0.0792 | 10.2304 | 0.5284 | 0.0762 | 9.1194 | 59 |
| 0.0044 | 0.0795 | 10.3884 | 0.5223 | 0.0764 | 8.8152 | 60 |
| 0.0015 | 0.0795 | 9.8557 | 0.5227 | 0.0764 | 8.3774 | 61 |
| 0.0005 | 0.0795 | 9.8123 | 0.5233 | 0.0765 | 8.5043 | 62 |
| 0.0003 | 0.0795 | 9.7631 | 0.5282 | 0.0765 | 8.3860 | 63 |
| 0.0003 | 0.0795 | 9.7593 | 0.5320 | 0.0765 | 8.4815 | 64 |
| 0.0002 | 0.0795 | 9.7663 | 0.5357 | 0.0765 | 8.4281 | 65 |
| 0.0034 | 0.0795 | 9.8382 | 0.5771 | 0.0758 | 8.8051 | 66 |
| 0.0123 | 0.0792 | 10.2575 | 0.5261 | 0.0763 | 9.3701 | 67 |
| 0.0027 | 0.0795 | 10.3802 | 0.5272 | 0.0764 | 8.8216 | 68 |
| 0.0011 | 0.0795 | 10.1683 | 0.5291 | 0.0764 | 8.5736 | 69 |
| 0.0012 | 0.0795 | 10.1305 | 0.5336 | 0.0765 | 8.6648 | 70 |
| 0.0008 | 0.0795 | 10.2545 | 0.5315 | 0.0765 | 9.0617 | 71 |
| 0.0006 | 0.0795 | 10.4562 | 0.5369 | 0.0765 | 9.6485 | 72 |
| 0.0032 | 0.0795 | 10.2347 | 0.5569 | 0.0763 | 8.4947 | 73 |
| 0.0062 | 0.0794 | 10.1654 | 0.5471 | 0.0763 | 8.8666 | 74 |
| 0.0029 | 0.0795 | 10.1320 | 0.5376 | 0.0765 | 8.7713 | 75 |
| 0.0012 | 0.0795 | 10.2943 | 0.5406 | 0.0765 | 8.6959 | 76 |
| 0.0006 | 0.0795 | 10.1888 | 0.5371 | 0.0767 | 8.9689 | 77 |
| 0.0005 | 0.0795 | 10.2138 | 0.5398 | 0.0766 | 8.7470 | 78 |
| 0.0016 | 0.0795 | 10.2173 | 0.5497 | 0.0764 | 8.9675 | 79 |
| 0.0065 | 0.0794 | 10.2806 | 0.5559 | 0.0763 | 9.4487 | 80 |
| 0.0028 | 0.0795 | 10.7728 | 0.5394 | 0.0766 | 8.9716 | 81 |
| 0.0012 | 0.0795 | 10.3247 | 0.5453 | 0.0765 | 8.9986 | 82 |
| 0.0013 | 0.0795 | 10.3174 | 0.5535 | 0.0765 | 8.9229 | 83 |
| 0.0011 | 0.0795 | 10.2846 | 0.5452 | 0.0766 | 9.1239 | 84 |
| 0.0007 | 0.0795 | 10.1996 | 0.5491 | 0.0766 | 8.9308 | 85 |
| 0.0034 | 0.0795 | 10.5048 | 0.5578 | 0.0764 | 8.9920 | 86 |
### Framework versions
- Transformers 4.32.0.dev0
- TensorFlow 2.12.0
- Tokenizers 0.13.3
|