| modelId (string) | author (string) | last_modified (timestamp[us, tz=UTC]) | downloads (int64) | likes (int64) | library_name (string) | tags (list) | pipeline_tag (string) | createdAt (timestamp[us, tz=UTC]) | card (string) |
|---|---|---|---|---|---|---|---|---|---|
asenella/mmnist_MVTCAEconfig2_seed_2_ratio_02_c | asenella | 2023-05-13T02:16:00Z | 0 | 0 | null | ["multivae", "en", "license:apache-2.0", "region:us"] | null | 2023-05-13T02:15:52Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded with the method `load_from_hf_hub`:
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="asenella/mmnist_MVTCAEconfig2_seed_2_ratio_02_c")
```
|
asenella/mmnist_MVTCAEconfig2_seed_3_ratio_02_i | asenella | 2023-05-13T02:08:42Z | 0 | 0 | null | ["multivae", "en", "license:apache-2.0", "region:us"] | null | 2023-05-13T02:08:34Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded with the method `load_from_hf_hub`:
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="asenella/mmnist_MVTCAEconfig2_seed_3_ratio_02_i")
```
|
Gracevonoiste/distilbert-base-uncased-finetuned-cola | Gracevonoiste | 2023-05-13T02:07:02Z | 5 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "distilbert", "text-classification", "generated_from_trainer", "dataset:glue", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us"] | text-classification | 2023-04-28T15:55:52Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- matthews_correlation
model-index:
- name: distilbert-base-uncased-finetuned-cola
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: glue
type: glue
config: cola
split: validation
args: cola
metrics:
- name: Matthews Correlation
type: matthews_correlation
value: 0.4580724598795155
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-cola
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4865
- Matthews Correlation: 0.4581
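For a quick sanity check, the checkpoint should load with the standard `transformers` text-classification pipeline (a minimal sketch, not part of the original card; the example sentence is arbitrary):
```python
from transformers import pipeline

# Load the fine-tuned CoLA checkpoint for acceptability classification.
classifier = pipeline(
    "text-classification",
    model="Gracevonoiste/distilbert-base-uncased-finetuned-cola",
)

# CoLA is binary; without a custom id2label mapping the pipeline reports
# generic labels such as LABEL_0 / LABEL_1.
print(classifier("The book was written by the author."))
```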
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 10
- eval_batch_size: 10
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
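These settings map onto `TrainingArguments` roughly as follows (a sketch; the original training script is not part of this card, and the Adam betas/epsilon listed above are the library defaults):
```python
from transformers import TrainingArguments

# A sketch of the listed hyperparameters as TrainingArguments.
# output_dir is illustrative; adam_beta1/adam_beta2/adam_epsilon are left
# at their defaults, which match the optimizer values listed above.
training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-cola",
    learning_rate=2e-5,
    per_device_train_batch_size=10,
    per_device_eval_batch_size=10,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
)
```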
### Training results
| Training Loss | Epoch | Step | Validation Loss | Matthews Correlation |
|:-------------:|:-----:|:----:|:---------------:|:--------------------:|
| 0.543 | 1.0 | 856 | 0.4865 | 0.4581 |
### Framework versions
- Transformers 4.29.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
DunnBC22/bert-base-uncased-Masked_Language_Modeling-Reddit_Comments | DunnBC22 | 2023-05-13T02:02:47Z | 162 | 1 | transformers | ["transformers", "pytorch", "tensorboard", "bert", "fill-mask", "generated_from_trainer", "en", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us"] | fill-mask | 2023-03-15T17:25:41Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: bert-base-uncased-Masked_Language_Modeling-Reddit_Comments
results: []
language:
- en
metrics:
- perplexity
---
# bert-base-uncased-Masked_Language_Modeling-Reddit_Comments
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased).
It achieves the following results on the evaluation set:
- Loss: 2.5415
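A minimal inference sketch (assumed usage, not from the original card): the checkpoint should work with the standard fill-mask pipeline and BERT's `[MASK]` token:
```python
from transformers import pipeline

# Load the fine-tuned BERT checkpoint for masked-token prediction.
fill_mask = pipeline(
    "fill-mask",
    model="DunnBC22/bert-base-uncased-Masked_Language_Modeling-Reddit_Comments",
)

# Print the top predicted tokens and their scores.
for prediction in fill_mask("This subreddit is full of [MASK] comments."):
    print(prediction["token_str"], round(prediction["score"], 4))
```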
## Model description
This is a masked language modeling project.
For more information on how it was created, check out the following link: https://github.com/DunnBC22/NLP_Projects/blob/main/Masked%20Language%20Model/Datasets%20for%20NLP%20-%20Reddit%20Comments/Datasets_for_NLP_MLM.ipynb
## Intended uses & limitations
This model is intended to demonstrate my ability to solve a complex problem using technology.
## Training and evaluation data
Dataset Source: https://www.kaggle.com/datasets/toygarr/datasets-for-natural-language-processing
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 2.8757 | 1.0 | 10812 | 2.6382 |
| 2.6818 | 2.0 | 21624 | 2.5699 |
| 2.6103 | 3.0 | 32436 | 2.5402 |
Perplexity: 12.70 (the exponential of the evaluation loss: exp(2.5415) ≈ 12.70)
### Framework versions
- Transformers 4.27.0
- Pytorch 1.13.1+cu116
- Datasets 2.10.1
- Tokenizers 0.13.2
|
asenella/mmnist_MVTCAEconfig2_seed_0_ratio_02_c | asenella | 2023-05-13T01:50:58Z | 0 | 0 | null | ["multivae", "en", "license:apache-2.0", "region:us"] | null | 2023-05-13T01:50:50Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded with the method `load_from_hf_hub`:
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="asenella/mmnist_MVTCAEconfig2_seed_0_ratio_02_c")
```
|
DunnBC22/distilgpt2-CLM_US_Economic_News_Articles | DunnBC22 | 2023-05-13T01:49:31Z | 162 | 1 | transformers | ["transformers", "pytorch", "tensorboard", "gpt2", "text-generation", "generated_from_trainer", "en", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us"] | text-generation | 2023-03-08T03:22:37Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: distilgpt2-CLM_US_Economic_News_Articles
results: []
language:
- en
metrics:
- perplexity
---
# distilgpt2-CLM_US_Economic_News_Articles
This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2).
It achieves the following results on the evaluation set:
- Loss: 3.4472
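A minimal inference sketch (assumed usage, not from the original card): the checkpoint should work with the standard text-generation pipeline; the prompt is arbitrary:
```python
from transformers import pipeline

# Load the fine-tuned distilgpt2 checkpoint and sample a continuation.
generator = pipeline(
    "text-generation",
    model="DunnBC22/distilgpt2-CLM_US_Economic_News_Articles",
)

print(generator("The Federal Reserve said", max_new_tokens=40)[0]["generated_text"])
```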
## Model description
This is a causal language modeling project.
For more information on how it was created, check out the following link: https://github.com/DunnBC22/NLP_Projects/blob/main/Causal%20Language%20Modeling/US%20Economic%20News%20Articles/US%20Economic%20News%20Articles%20-%20CLM.ipynb
## Intended uses & limitations
This model is intended to demonstrate my ability to solve a complex problem using technology.
## Training and evaluation data
Dataset Source: https://www.kaggle.com/datasets/heeraldedhia/us-economic-news-articles
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.6225 | 1.0 | 1869 | 3.4853 |
| 3.5092 | 2.0 | 3738 | 3.4555 |
| 3.4514 | 3.0 | 5607 | 3.4472 |
Perplexity: 31.41 (the exponential of the evaluation loss: exp(3.4472) ≈ 31.41)
### Framework versions
- Transformers 4.26.1
- Pytorch 1.12.1
- Datasets 2.9.0
- Tokenizers 0.12.1
|
az00/pls-gelu-segformer-b5-scene-parse-150-cvfinal | az00 | 2023-05-13T01:49:08Z | 2 | 0 | transformers | ["transformers", "pytorch", "tensorboard", "segformer", "generated_from_trainer", "dataset:scene_parse_150", "license:other", "endpoints_compatible", "region:us"] | null | 2023-05-12T18:04:27Z |
---
license: other
tags:
- generated_from_trainer
datasets:
- scene_parse_150
model-index:
- name: pls-gelu-segformer-b5-scene-parse-150-cvfinal
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# pls-gelu-segformer-b5-scene-parse-150-cvfinal
This model is a fine-tuned version of [nvidia/segformer-b5-finetuned-ade-640-640](https://huggingface.co/nvidia/segformer-b5-finetuned-ade-640-640) on the scene_parse_150 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2162
- Mean Iou: 0.6275
- Mean Accuracy: 0.7470
- Overall Accuracy: 0.9419
- Per Category Iou: [0.8685218749687071, 0.9788782456158296, 0.9606716149161357, 0.9253138334602241, 0.24566552901023891, 0.9673287812952618, 0.9833229666973768, nan, 0.6997756475627167, nan, 0.8422355336194145, 0.8849884526558891, 0.9198037983859276, 0.7675987245523669, 0.4259095355662179, 0.9140075841657418, 0.9757245751875003, 0.5952880426012587, nan, 0.8031902797180078, 0.43475935828877005, nan, 0.9271270640917996, nan, 0.0, nan, nan, 0.9220219140987388, nan, nan, nan, nan, nan, 0.6804193386143593, nan, nan, 0.5649463383838383, 0.7019282753649306, 0.0, nan, nan, 0.0, nan, 0.24444444444444444, nan, nan, nan, 0.8639147321831813, nan, nan, 0.9696930423599549, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3532369578881207, nan, 0.6765980498374865, 0.8919627154414351, nan, nan, 0.8257598257598258, 0.0, 0.8610134166992315, nan, nan, 0.7764935790061418, 0.8315092165898618, nan, nan, 0.869378791047898, nan, nan, nan, nan, nan, nan, 0.8238296374516015, 0.0, nan, 0.9686018546354324, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.09776864357017029, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8853530031612223, 0.6824877250409165, nan, nan, nan, 0.9154241645244215, nan, nan, nan, 0.2217741935483871, 0.8228541612314114, 0.0, nan, 0.9068718753063425, 0.777870955011839, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan]
- Per Category Accuracy: [0.9598309042417804, 0.9873921185264668, 0.9962340538039527, 0.9644669676042977, 0.26165030897855324, 0.9937295413000042, 0.9923073655794724, nan, 0.7353488808219362, nan, 0.9995828336027534, 0.9734359123239947, 0.9432798096937258, 0.8383899269975219, 0.4320956536807448, 0.9670939961833929, 0.998369031064481, 0.8503918856615952, nan, 0.8516362383740957, 0.9387990762124712, nan, 0.9688871014916642, nan, nan, nan, nan, 0.9622535358689537, nan, nan, nan, nan, nan, 0.8896629675297986, nan, nan, 0.5999832383506537, 0.7032862154354107, nan, nan, nan, 0.0, nan, 0.24444444444444444, nan, nan, nan, 0.9703745789537762, nan, nan, 0.9902872094562263, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46484698097601324, nan, 0.7071479122434536, 0.9857723182180411, nan, nan, 0.9196251378169791, nan, 0.8752068851373718, nan, nan, 0.8066237457224059, 0.8360845641471184, nan, nan, 0.8787061994609164, nan, nan, nan, nan, nan, nan, 0.9766325891925725, 0.0, nan, 0.9719508516266766, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10438871473354232, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8853530031612223, 0.7387068201948627, nan, nan, nan, 0.9157380421738385, nan, nan, nan, 0.22261328640563133, 0.8276043033324587, nan, nan, 0.9306841046277666, 0.7947684088503604, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
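A minimal inference sketch (assumed usage, not from the original card): the checkpoint should load with the standard SegFormer classes; `scene.jpg` is a placeholder for any RGB image:
```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

model_id = "az00/pls-gelu-segformer-b5-scene-parse-150-cvfinal"
processor = AutoImageProcessor.from_pretrained(model_id)
model = SegformerForSemanticSegmentation.from_pretrained(model_id)

image = Image.open("scene.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# Per-pixel class indices at the model's reduced output resolution.
pred = logits.argmax(dim=1)[0]
```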
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| 0.1696 | 1.0 | 20 | 0.1269 | 0.6721 | 0.7965 | 0.9565 | [0.9135663110592681, 0.9825519725750829, 0.9682770522478501, 0.9110306258322237, 0.19175221854880808, 0.9731929353526031, 0.9846199197072778, nan, 0.8321642999001251, nan, 0.8806729926333758, 0.8467248339627556, 0.9137088054158243, 0.5747675962815405, 0.9107026007030572, 0.8487774294670847, 0.9760582258241872, 0.5338296112489661, nan, 0.7846778120434994, 0.5497866287339972, nan, 0.9038519968940955, nan, 0.0, nan, nan, 0.8968909926766707, nan, nan, nan, nan, nan, 0.6853492960572518, nan, nan, 0.6837282780410743, 0.9318723684878, 0.0, nan, nan, 0.0, 0.0, 0.4291920069504778, nan, nan, nan, 0.8979626711223346, nan, nan, 0.9895701866865907, nan, 0.09646256570535587, nan, nan, nan, nan, nan, nan, 0.34575260804769004, nan, 0.8046462080550661, 0.7515476138209597, nan, nan, 0.8324633113365508, nan, 0.7898716370669949, nan, nan, 0.8725899260956121, 0.8209644816632977, nan, nan, 0.8700244109002142, nan, nan, nan, nan, nan, nan, 0.7378111148994204, 0.0, nan, 0.9809370339259081, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.7434462444771723, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9083719135802469, 0.271036315323295, nan, nan, nan, 0.9814439090371122, nan, nan, nan, 0.470176038284054, 0.8237577639751553, 0.0, nan, 0.9149274498111707, 0.9366989912772282, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9395206002456757, 0.9891615552471795, 0.990194328772556, 0.9469901930749896, 0.24034896401308614, 0.986183735067806, 0.993109664874865, nan, 0.9710125514044982, nan, 0.9953590238306305, 0.9438234867179561, 0.9632024977698483, 0.5797334404929342, 0.9427527213724224, 0.9936389881098009, 0.9993549152717723, 0.743891194098663, nan, 0.8326558732345849, 0.8926096997690531, nan, 0.978831529687043, nan, nan, nan, nan, 0.9469979458596642, nan, nan, nan, nan, nan, 0.842334566378956, nan, nan, 0.7254441837076768, 0.9476888155179529, nan, nan, nan, 0.0, nan, 0.4391111111111111, nan, nan, nan, 0.9596404366705896, nan, nan, 0.997851783223547, nan, 0.09659980082515293, nan, nan, nan, nan, nan, nan, 0.7675765095119934, nan, 0.9265864590705355, 0.9981535112441446, nan, nan, 0.9318632855567806, nan, 0.8921549155908639, nan, nan, 0.9279334145351198, 0.8233420214306401, nan, nan, 0.9229956133396755, nan, nan, nan, nan, nan, nan, 0.9029835176298769, 0.0, nan, 0.9958636933215761, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.7912225705329153, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9924130663856692, 0.271036315323295, nan, nan, nan, 0.9951568661066347, nan, nan, nan, 0.4841179058512978, 0.8352138546313304, nan, nan, 0.9261569416498994, 0.9688019757068697, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1831 | 2.0 | 40 | 0.1248 | 0.6877 | 0.7875 | 0.9584 | [0.9172159758044477, 0.9821494521728426, 0.9682313648143698, 0.9109209291746041, 0.17895339954163483, 0.9708509835171748, 0.9844997563444166, nan, 0.8506306218224482, nan, 0.8806148748159057, 0.8446081281902177, 0.9194468411465823, 0.6330292908530318, 0.917229910135331, 0.8438263604105136, 0.9748195031586947, 0.5588631012445341, nan, 0.7911082680313449, 0.5737963693764798, nan, 0.9132947976878613, nan, 0.0, nan, nan, 0.9044381750164676, nan, nan, nan, nan, nan, 0.7094443992516066, nan, nan, 0.6580006439150032, 0.8918481642812695, 0.0, nan, nan, 0.0, nan, 0.48442622950819675, nan, nan, nan, 0.9011837166778943, nan, nan, 0.9890751724972764, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.413867047891351, nan, 0.797870584351271, 0.8356403928810734, nan, nan, 0.8065164923572004, nan, 0.9018156512937306, nan, nan, 0.868433236199243, 0.6597161888213148, nan, nan, 0.8863738795132455, nan, nan, nan, nan, nan, nan, 0.7635009310986964, 0.0, nan, 0.9807468111370182, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.7605790645879733, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9159864622735417, 0.3029229406554473, nan, nan, nan, 0.9808758197588322, nan, nan, nan, 0.3184564917127072, 0.8988336713995944, 0.0, nan, 0.8979420802681813, 0.9294648550546917, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9517911091929219, 0.9877480693536655, 0.9920979575649415, 0.9460120842576154, 0.2043620501635769, 0.9840581558474685, 0.9915852962136192, nan, 0.932340294428891, nan, 0.997757730614799, 0.9487588909856293, 0.9506145306769749, 0.6600361663652803, 0.9470533562280439, 0.9715956353672262, 0.9991693012697823, 0.7660212079299217, nan, 0.8520151567344126, 0.8394919168591224, nan, 0.9762357414448669, nan, nan, nan, nan, 0.9521267548507805, nan, nan, nan, nan, nan, 0.8961364570489108, nan, nan, 0.6851324170298357, 0.9242158481221626, nan, nan, nan, 0.0, nan, 0.5253333333333333, nan, nan, nan, 0.9500629032912625, nan, nan, 0.998248846987915, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.47890818858560796, nan, 0.9121962727058268, 0.993838558572567, nan, nan, 0.8842337375964718, nan, 0.9437272426348892, nan, nan, 0.9247723449915898, 0.6597161888213148, nan, nan, 0.9354685270334548, nan, nan, nan, nan, nan, nan, 0.9409555601919466, 0.0, nan, 0.9955852322829164, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8564263322884013, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9696522655426765, 0.3029229406554473, nan, nan, nan, 0.9936139207954741, nan, nan, nan, 0.3245930488341399, 0.9302020467069011, nan, nan, 0.9701207243460764, 0.9550425885792047, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1655 | 3.0 | 60 | 0.1231 | 0.6705 | 0.7841 | 0.9596 | [0.9221188809327335, 0.9816427249944908, 0.9677609064133385, 0.922621508718402, 0.19513746554526823, 0.9743198629278281, 0.9845204417929398, nan, 0.8511589973621501, nan, 0.8628781623001276, 0.8811161328355276, 0.9250830360437938, 0.7085691977357017, 0.9107827038861521, 0.8539530693663936, 0.9754312970860514, 0.5834205933682374, nan, 0.8120402991532233, 0.5976430976430976, nan, 0.9110754852133575, nan, 0.0, nan, nan, 0.923766872240083, nan, nan, nan, nan, nan, 0.6821963132188436, nan, nan, 0.6447014743015395, 0.8380763962867144, 0.0, nan, nan, 0.0, nan, 0.43739279588336194, nan, nan, nan, 0.9041428489494714, nan, nan, 0.9903678023832867, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.2640877246420956, nan, 0.7677694337324944, 0.8654598995323055, nan, nan, 0.8086072450893298, 0.0, 0.8869372225745086, nan, nan, 0.8887636123638764, 0.6017955401100492, nan, nan, 0.8746488225979465, nan, nan, nan, nan, nan, nan, 0.7643256464011181, 0.0, nan, 0.9818548387096774, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6197695573074591, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9088738559490649, 0.39681133746678476, nan, nan, nan, 0.9860363822700486, nan, nan, nan, 0.3377597654565836, 0.8339788277820811, 0.0, nan, 0.9108986615678776, 0.9245089416593374, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9583911556721224, 0.9881246550114265, 0.9925226386410205, 0.955398237927738, 0.20072700836059615, 0.9911363346511924, 0.995332505863864, nan, 0.8947128715523823, nan, 0.9989744659401019, 0.976339091304979, 0.9317078005748836, 0.7293550331525015, 0.943682162604956, 0.9740421784019181, 0.9987189591010197, 0.7706316274781005, nan, 0.905132621426111, 0.8198614318706697, nan, 0.976528224627084, nan, nan, nan, nan, 0.9661655610943204, nan, nan, nan, nan, nan, 0.8898273736128237, nan, nan, 0.6633422728796513, 0.8523008666941808, nan, nan, nan, 0.0, nan, 0.4533333333333333, nan, nan, nan, 0.9596404366705896, nan, nan, 0.9976074362916281, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.71712158808933, nan, 0.8923802783675395, 0.9711170285136738, nan, nan, 0.9031973539140022, nan, 0.9259847732538894, nan, nan, 0.9372716199756395, 0.6017955401100492, nan, nan, 0.9049733100787485, nan, nan, nan, nan, nan, nan, 0.9127894846651367, 0.0, nan, 0.9944771893999165, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6407523510971787, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9626975763962066, 0.39681133746678476, nan, nan, nan, 0.989670838333619, nan, nan, nan, 0.3446546414430268, 0.8475465757019155, nan, nan, 0.9585513078470825, 0.9536313693866236, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0718 | 4.0 | 80 | 0.1275 | 0.6694 | 0.7799 | 0.9578 | [0.9134609402105082, 0.9805005224446312, 0.9674453795281459, 0.9272392167446749, 0.1936352682684811, 0.9758118795221269, 0.9829411626981339, nan, 0.7762752591418602, nan, 0.8531686534150795, 0.8562648056853832, 0.9227758357540249, 0.7198100407055631, 0.9047094838116206, 0.8656670746634026, 0.9754753072280017, 0.6143911439114391, nan, 0.8196726471514006, 0.6182634730538922, nan, 0.9099145124484861, nan, 0.0, nan, nan, 0.9272715831865089, nan, nan, nan, nan, nan, 0.6974777804467932, nan, nan, 0.6408289817232375, 0.8422032860495969, 0.0, nan, nan, 0.0, nan, 0.36837455830388693, nan, nan, nan, 0.9071361012990522, nan, nan, 0.9878809247972599, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.37797619047619047, nan, 0.7633030637358907, 0.8615868996081518, nan, nan, 0.8135490394337714, nan, 0.8868760840238967, nan, nan, 0.8970246078887582, 0.6938893715609615, nan, nan, 0.8847610275304473, nan, nan, nan, nan, nan, nan, 0.7708590580349268, 0.0, nan, 0.981798333812123, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.47213622291021673, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9172468669186393, 0.44729849424269263, nan, nan, nan, 0.9738002482983005, nan, nan, nan, 0.2739821771798008, 0.8330749354005168, 0.0, nan, 0.8943535514764565, 0.9158512720156555, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan] | [0.9585936721888384, 0.9875262449251213, 0.9952029215414105, 0.9637324632470998, 0.19861868411486733, 0.9861412234833993, 0.990660292320108, nan, 0.8135640898557325, nan, 0.9989744659401019, 0.9706778922920598, 0.9288581623550402, 0.7816623133078829, 0.9450999543156005, 0.9689778343201056, 0.9977726319761197, 0.7676348547717843, nan, 0.897037547364795, 0.953810623556582, nan, 0.9767475870137468, nan, nan, nan, nan, 0.9640198348837513, nan, nan, nan, nan, nan, 0.8950678175092478, nan, nan, 0.6582299698290311, 0.8515012381345439, nan, nan, nan, 0.0, nan, 0.37066666666666664, nan, nan, nan, 0.9536138955399537, nan, nan, 0.9983812015760377, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.7353184449958643, nan, 0.848690728945506, 0.9786779140508076, nan, nan, 0.8871003307607497, nan, 0.9140019860973188, nan, nan, 0.9503799083579839, 0.6938893715609615, nan, nan, 0.9137994820569737, nan, nan, nan, nan, nan, nan, 0.9117462966826622, 0.0, nan, 0.9913329001717176, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4780564263322884, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9717597471022128, 0.44729849424269263, nan, nan, nan, 0.9749271386936397, nan, nan, nan, 0.27593488781346237, 0.845972185778011, nan, nan, 0.9019114688128773, 0.9435008316113099, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.057 | 5.0 | 100 | 0.1223 | 0.6828 | 0.8014 | 0.9605 | [0.919318568775981, 0.981836786902021, 0.9689497069084484, 0.92946941270112, 0.23047684865238424, 0.9760735423750158, 0.9862021402868416, nan, 0.8376372409039436, nan, 0.852049143840698, 0.8994162249211568, 0.9332927423283001, 0.7229107865995827, 0.8926554516389279, 0.8837199018285538, 0.9753755181481867, 0.6558861578266494, nan, 0.8289918536883795, 0.5458064516129032, nan, 0.9126838235294118, nan, 0.0, nan, nan, 0.9274671217749222, nan, nan, nan, nan, nan, 0.7073408665962116, nan, nan, 0.6886927480916031, 0.8975832822399346, 0.0, nan, nan, 0.0, nan, 0.38461538461538464, nan, nan, nan, 0.9081566503965833, nan, nan, 0.9903408460086599, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.334515785739624, nan, 0.774993735905788, 0.8439532503549143, nan, nan, 0.8356548752379998, nan, 0.876711448222665, nan, nan, 0.8803390576838397, 0.8221836084564147, nan, nan, 0.8849562028926462, nan, nan, nan, nan, nan, nan, 0.7531624500665779, 0.011428571428571429, nan, 0.9824606699025784, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5639121015165584, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9169781625024592, 0.5326855123674912, nan, nan, nan, 0.9834111158252016, nan, nan, nan, 0.3407927834157342, 0.8794125225457357, 0.0, nan, 0.9138204924543288, 0.9362483474514028, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan] | [0.9500713787722851, 0.989955996223826, 0.9931654438495604, 0.9654672222816879, 0.242457288258815, 0.9885218722101773, 0.9920572369756149, nan, 0.9115909608455119, nan, 0.9981227512123898, 0.9728552765277979, 0.9495737932401626, 0.7890965106154979, 0.9412404102144016, 0.9778343201056907, 0.9988102446757688, 0.818118948824343, nan, 0.8868756458835687, 0.976905311778291, nan, 0.9802208248025739, nan, nan, nan, nan, 0.9669898339678926, nan, nan, nan, nan, nan, 0.8940608302507193, nan, nan, 0.7258632249413343, 0.9062886917044986, nan, nan, nan, 0.0, nan, 0.38666666666666666, nan, nan, nan, 0.9664989245566332, nan, nan, 0.9989717066615083, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.7799834574028123, nan, 0.875583864118896, 0.9937025015063461, nan, nan, 0.9194046306504962, nan, 0.9029460443561734, nan, nan, 0.9276724087929934, 0.8221836084564147, nan, nan, 0.9183975476983246, nan, nan, nan, nan, nan, nan, 0.94408512413937, 0.011428571428571429, nan, 0.9933923516034715, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5711598746081504, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9822971548998947, 0.5341009743135519, nan, nan, nan, 0.9934424824275673, nan, nan, nan, 0.3457105147382314, 0.8955654683810024, nan, nan, 0.9259557344064386, 0.9637115064764881, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.2006 | 6.0 | 120 | 0.1255 | 0.6920 | 0.7890 | 0.9586 | [0.914168907082392, 0.9814688768225341, 0.9678994575848762, 0.9243524847069252, 0.20678252304228523, 0.9762269134818976, 0.9853815193063628, nan, 0.7930727126644571, nan, 0.8462842069585464, 0.8530493216685685, 0.9107916787055764, 0.7199654278305964, 0.9000107929753149, 0.9034987422821862, 0.9753916185314602, 0.6227132765803296, nan, 0.8113354794654181, 0.6070656092285508, nan, 0.9208300245140352, nan, 0.0, nan, nan, 0.9204013633126462, nan, nan, nan, nan, nan, 0.7020924233255836, nan, nan, 0.6512574265483845, 0.8478788343166552, 0.0, nan, nan, 0.0, nan, 0.39649122807017545, nan, nan, nan, 0.9120451188398012, nan, nan, 0.9911723413496015, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4043824701195219, nan, 0.7583185218864961, 0.8781575548859234, nan, nan, 0.7914745363254432, nan, 0.8845300261096606, nan, nan, 0.8754020839633795, 0.8166811468288445, nan, nan, 0.8800309997416689, nan, nan, nan, nan, nan, nan, 0.7718996908278942, 0.0, nan, 0.9818009913187978, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.622704004817826, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9111599121581154, 0.42515500442869797, nan, nan, nan, 0.9832710597826086, nan, nan, nan, 0.36172624237140366, 0.851421188630491, 0.0, nan, 0.9151111993645751, 0.9367415566060724, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.956912674435333, 0.9893369513069585, 0.9934116597263534, 0.9698927033082224, 0.21366775717920755, 0.9898184755345831, 0.9890320966912233, nan, 0.8462218531070419, nan, 0.998209660878483, 0.9766294092030774, 0.9373327386262266, 0.7810595405532115, 0.9195481970415413, 0.9666291530068014, 0.998877187430585, 0.7925311203319502, nan, 0.8594901825697554, 0.9722863741339491, nan, 0.9750658087159988, nan, nan, nan, nan, 0.9469063599848229, nan, nan, nan, nan, nan, 0.9060830250719276, nan, nan, 0.67063359034529, 0.8562990094923648, nan, nan, nan, 0.0, nan, 0.4017777777777778, nan, nan, nan, 0.9647335741244267, nan, nan, 0.9991040612496309, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.6716294458229942, nan, 0.865581505071951, 0.9912534743143696, nan, nan, 0.8515986769570011, nan, 0.8971201588877855, nan, nan, 0.9234093150049301, 0.8166811468288445, nan, nan, 0.9002166904497648, nan, nan, nan, nan, nan, nan, 0.9376173586480284, 0.0, nan, 0.9939840813106233, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6482758620689655, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.961854583772392, 0.42515500442869797, nan, nan, nan, 0.9925424309960569, nan, nan, nan, 0.36506819181698197, 0.8646024665442141, nan, nan, 0.9272635814889336, 0.9687515750214203, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1193 | 7.0 | 140 | 0.1288 | 0.6815 | 0.8008 | 0.9574 | [0.9089575137186, 0.9816081595871052, 0.9677470281869162, 0.9303302939352245, 0.23104916231804448, 0.9677582394440349, 0.9849904869283349, nan, 0.815754039497307, nan, 0.8332608001856848, 0.8776824034334764, 0.9190563951345374, 0.7456661413504667, 0.8790098586820498, 0.8849203690053385, 0.9755307322351094, 0.6229286438958404, nan, 0.801372566882269, 0.5059952038369304, nan, 0.9137378011328738, nan, 0.0, nan, nan, 0.929423093980056, nan, nan, nan, nan, nan, 0.695702968712491, nan, nan, 0.6630271382229607, 0.8463853117429555, 0.0, nan, nan, 0.0, nan, 0.43287435456110157, nan, nan, nan, 0.8963680206366712, nan, nan, 0.992082751926227, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.27741935483870966, nan, 0.7707094979132414, 0.880852677109721, nan, nan, 0.8125062331704398, 0.0, 0.9097897211104758, nan, nan, 0.8780698094840773, 0.8591264101822389, nan, nan, 0.8853840226046751, nan, nan, nan, nan, nan, nan, 0.7501209872560091, 0.022857142857142857, nan, 0.9813980417784504, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6667617689015692, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.917906008328376, 0.6247863247863248, nan, nan, nan, 0.9831886278515491, nan, nan, nan, 0.3229048326487809, 0.8598347960764068, 0.0, nan, 0.9109728836621634, 0.9193383106426585, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9480727731483019, 0.9889861591874004, 0.9926763170070725, 0.9633190738601342, 0.24463831334060343, 0.9767886749139141, 0.9895181956760788, nan, 0.8764550654360843, nan, 0.9984182440771062, 0.9796051676585862, 0.9267766874814154, 0.8239233808854062, 0.907355188329999, 0.9692469540539218, 0.9986337592312537, 0.8492392807745505, nan, 0.9090595935239407, 0.9745958429561201, nan, 0.9790143316759287, nan, nan, nan, nan, 0.9597022150698015, nan, nan, nan, nan, nan, 0.8933621043978627, nan, nan, 0.7023131076097888, 0.8561442426743706, nan, nan, nan, 0.0, nan, 0.4471111111111111, nan, nan, nan, 0.9730327502942251, nan, nan, 0.9989208010506918, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.49793217535153017, nan, 0.8625619249823071, 0.9935081342688876, nan, nan, 0.8982359426681367, nan, 0.94809665673618, nan, nan, 0.9155791427411403, 0.8601216333622936, nan, nan, 0.9108398076211617, nan, nan, nan, nan, nan, nan, 0.9701648237012309, 0.022857142857142857, nan, 0.9931719032811992, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.7326018808777429, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9755532139093783, 0.6474756421612046, nan, nan, nan, 0.9900994342533859, nan, nan, nan, 0.3251209854817422, 0.874048806087641, nan, nan, 0.9429577464788732, 0.9495489138652285, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1854 | 8.0 | 160 | 0.1337 | 0.6622 | 0.7850 | 0.9570 | [0.9085511525173974, 0.9805941628799328, 0.9668637761555303, 0.9271132059399756, 0.19197565808095104, 0.9767836919592299, 0.9850631762620049, nan, 0.7729774560808264, nan, 0.8331738853133879, 0.8907867836180176, 0.917889274195504, 0.7535005289688219, 0.8453517607317487, 0.8889038876889849, 0.9756129469621736, 0.6256572029442692, nan, 0.8095651056722554, 0.5261669024045261, nan, 0.9182886079437803, nan, 0.0, nan, nan, 0.9221372302774289, nan, nan, nan, nan, nan, 0.7017566191446029, nan, nan, 0.6359503125761143, 0.8379795396419437, 0.0, nan, nan, 0.0, 0.0, 0.36977777777777776, nan, nan, nan, 0.901723811327903, nan, nan, 0.9924761779388981, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.32410349523377213, nan, 0.7699065014133507, 0.8733552912265133, nan, nan, 0.8205051112447385, 0.0, 0.8838298425913881, nan, nan, 0.8889593209127991, 0.7995945554590211, nan, nan, 0.8775066065599254, nan, nan, nan, nan, nan, nan, 0.7707325212748206, 0.0, nan, 0.9810676009181437, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5659486574210681, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9245663574021783, 0.552212389380531, nan, nan, nan, 0.9840746607303394, nan, nan, nan, 0.37336928161419375, 0.833764888658726, 0.0, nan, 0.899057344854674, 0.9194301154507492, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9566205194603986, 0.9874333881875913, 0.9953731244629519, 0.9641421616573962, 0.19723736822973464, 0.9899885218722102, 0.9937845401645186, nan, 0.8239722449198291, nan, 0.9990961394726321, 0.9803309624038322, 0.9379274457329765, 0.8109302792847096, 0.8633563855763323, 0.9666291530068014, 0.9984329309668055, 0.8229598893499308, nan, 0.8682397519807096, 0.859122401847575, nan, 0.9745905235448962, nan, nan, nan, nan, 0.9510931428347137, nan, nan, nan, nan, nan, 0.9063707357172215, nan, nan, 0.6564699966476701, 0.8451557985967808, nan, nan, nan, 0.0, nan, 0.36977777777777776, nan, nan, nan, 0.9659104744125644, nan, nan, 0.997851783223547, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5905707196029777, nan, 0.8352913422977117, 0.9946937744173842, nan, nan, 0.902646085997795, nan, 0.8995696789142668, nan, nan, 0.9354155791427411, 0.7995945554590211, nan, nan, 0.8950372601870937, nan, nan, nan, nan, nan, nan, 0.9636970582098894, 0.0, nan, 0.9893372627279899, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6012539184952979, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9660695468914647, 0.5527015057573074, nan, nan, nan, 0.9852134407680438, nan, nan, nan, 0.3777386713594369, 0.844922592495408, nan, nan, 0.9211267605633803, 0.9432488281840633, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0497 | 9.0 | 180 | 0.1374 | 0.6811 | 0.7752 | 0.9544 | [0.9061207683443958, 0.9804929498141455, 0.9665703110078181, 0.9217189753931908, 0.1995841995841996, 0.9776169358212702, 0.9847457864971442, nan, 0.7844154535838412, nan, 0.8318011379265415, 0.8755630998237253, 0.8414422820581179, 0.6912935481816445, 0.8019049674028637, 0.922056463034652, 0.9754015303469281, 0.6882621951219512, nan, 0.7049147774284296, 0.5472242249459265, nan, 0.9143111050346926, nan, 0.0, nan, nan, 0.9269665057217807, nan, nan, nan, nan, nan, 0.6957855746148832, nan, nan, 0.6685853263072558, 0.8635116773912375, 0.0, nan, nan, 0.0, nan, 0.35733333333333334, nan, nan, nan, 0.9047063994038178, nan, nan, 0.9904402337953382, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.40131219245489336, nan, 0.7549393562765182, 0.8563989585419587, nan, nan, 0.809762074951496, nan, 0.8201688371179897, nan, nan, 0.881779718848096, 0.8077034462785984, nan, nan, 0.8848308777754721, nan, nan, nan, nan, nan, nan, 0.7758881966661054, 0.0, nan, 0.9805811277330264, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5210682492581602, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9222266881028939, 0.5057573073516386, nan, nan, nan, 0.985501066098081, nan, nan, nan, 0.3745854424855996, 0.8318629994810587, 0.0, nan, 0.8876404494382022, 0.8116704355520598, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9546451534366942, 0.9879131479981635, 0.995126908586159, 0.9716643721095038, 0.2163576881134133, 0.9933681928325468, 0.9935108145225612, nan, 0.834420585910814, nan, 0.9986789730753854, 0.9733633328494702, 0.9385221528397264, 0.7269439421338155, 0.8196411411647947, 0.9556686402113813, 0.9988072018232772, 0.8326417704011065, nan, 0.7336892869445402, 0.8764434180138568, nan, 0.9779906405381691, nan, nan, nan, nan, 0.9570200573065902, nan, nan, nan, nan, nan, 0.8994451294697904, nan, nan, 0.6911666107945021, 0.8688351217498969, nan, nan, nan, 0.0, nan, 0.35733333333333334, nan, nan, nan, 0.9607158800373361, nan, nan, 0.9989106199285285, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.6071133167907361, nan, 0.8310922387355508, 0.9973371688468192, nan, nan, 0.8743109151047409, nan, 0.8297252565375703, nan, nan, 0.9368366104054289, 0.8077034462785984, nan, nan, 0.9014322710216162, nan, nan, nan, nan, nan, nan, 0.9614020446484457, 0.0, nan, 0.988681719032812, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5504702194357367, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9671232876712329, 0.5057573073516386, nan, nan, nan, 0.990485170581176, nan, nan, nan, 0.3776506819181698, 0.8412490160062975, nan, nan, 0.8980885311871227, 0.8321657174537573, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0526 | 10.0 | 200 | 0.1350 | 0.6821 | 0.7807 | 0.9551 | [0.9046255568846483, 0.9805004808380906, 0.9672752890312868, 0.9239218920523421, 0.2472218503146338, 0.978052874527906, 0.9846014874307886, nan, 0.7915552640871715, nan, 0.8393415782241324, 0.8698363211223694, 0.9030861510182363, 0.7192090395480226, 0.8127281408350917, 0.8858752767032623, 0.9754204552207761, 0.615257048092869, nan, 0.8069604086845467, 0.5455893254262416, nan, 0.9210762952203085, nan, 0.0, nan, nan, 0.9245853534064813, nan, nan, nan, nan, nan, 0.7016878315383379, nan, nan, 0.6348756178591686, 0.8231901148482362, 0.0, nan, nan, 0.0, nan, 0.35644444444444445, nan, nan, nan, 0.9026181776877497, nan, nan, 0.9899655095907541, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.19914417379855168, nan, 0.7552450540528992, 0.8611377034312491, nan, nan, 0.8351272507795996, nan, 0.8816575733956872, nan, nan, 0.8622128859283009, 0.8042282073559224, nan, nan, 0.8887046632124352, nan, nan, nan, nan, nan, nan, 0.7762273032952253, 0.0, nan, 0.9810499565919819, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5227998838222481, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9265684890054469, 0.6089193825042881, nan, nan, nan, 0.9858554873892297, nan, nan, nan, 0.3099666608176873, 0.8354922279792746, 0.0, nan, 0.8995581737849779, 0.8495840921395875, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9565917466219581, 0.9888210805429024, 0.9937636327582788, 0.959292218313888, 0.268484187568157, 0.9908174977681418, 0.9928076227871877, nan, 0.8407431717412562, nan, 0.9980010776798596, 0.9719843228335027, 0.9252403607889781, 0.7673297166968053, 0.8312670331920793, 0.9692958849146156, 0.9988711017256017, 0.855232826187183, nan, 0.8706166035136066, 0.8498845265588915, nan, 0.9736765136004679, nan, nan, nan, nan, 0.9481623948397901, nan, nan, nan, nan, nan, 0.897081792026305, nan, nan, 0.6566376131411331, 0.8282862154354107, nan, nan, nan, 0.0, nan, 0.35644444444444445, nan, nan, nan, 0.9674729110019885, nan, nan, 0.9994094949145295, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5004135649296939, nan, 0.8339230950695918, 0.9965791366207312, nan, nan, 0.9153252480705623, nan, 0.8986428334988414, nan, nan, 0.9046459022098486, 0.8042282073559224, nan, nan, 0.9065059986258655, nan, nan, nan, nan, nan, nan, 0.9632797830168996, 0.0, nan, 0.9898999860769481, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5642633228840125, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9679662802950474, 0.6288751107174491, nan, nan, nan, 0.9917709583404766, nan, nan, nan, 0.3108666959964804, 0.8462345840986618, nan, nan, 0.9217303822937626, 0.8699662315407489, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0863 | 11.0 | 220 | 0.1323 | 0.6723 | 0.7847 | 0.9565 | [0.908841042503188, 0.9804408981637768, 0.9673339510701002, 0.9204359100147173, 0.2631543714017941, 0.9767724623705505, 0.984859247074657, nan, 0.8018783598343217, nan, 0.8436857260386672, 0.8873749341758821, 0.8998911334220394, 0.7517889630078836, 0.8248143429761055, 0.8947464262951834, 0.975490385443999, 0.5889837083010085, nan, 0.7984091714166103, 0.5450236966824644, nan, 0.9238296838671618, nan, 0.0, nan, nan, 0.9236883739920384, nan, nan, nan, nan, nan, 0.6994805278611299, nan, nan, 0.6095268956578095, 0.9020386211135584, 0.0, nan, nan, 0.0, nan, 0.32711111111111113, nan, nan, nan, 0.8986123924462938, nan, nan, 0.9914210361448218, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.22557876905702992, nan, 0.761516654854713, 0.8648511759045094, nan, nan, 0.7890176322418136, 0.0, 0.8779905848971432, nan, nan, 0.859643785607611, 0.8492476851851852, nan, nan, 0.8880336991003173, nan, nan, nan, nan, nan, nan, 0.7904316172646906, 0.0, nan, 0.9808106024872588, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.46900763358778624, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9249697458652683, 0.62382176520994, nan, nan, nan, 0.9832773073486235, nan, nan, nan, 0.33748028736639213, 0.8631551768654789, 0.0, nan, 0.8787, 0.9045463491438693, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9543419319854365, 0.9888520327887458, 0.9936512657809505, 0.9695236056412887, 0.28578698655034535, 0.9903923819240743, 0.9946245947208706, nan, 0.8532008090766614, nan, 0.998209660878483, 0.9784438960661925, 0.9217216770740411, 0.8302859821847164, 0.8363395768679406, 0.9616871360767236, 0.9988863159880599, 0.8750576302443522, nan, 0.8540475370306579, 0.7967667436489607, nan, 0.9733109096226967, nan, nan, nan, nan, 0.9471942013057529, nan, nan, nan, nan, nan, 0.8910398684751336, nan, nan, 0.6305732484076433, 0.908507016095749, nan, nan, nan, 0.0, nan, 0.32711111111111113, nan, nan, nan, 0.9684671888316221, nan, nan, 0.9989106199285285, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.6608767576509512, nan, 0.8111347015805614, 0.9956656106046765, nan, nan, 0.8633958103638368, nan, 0.9013571665011586, nan, nan, 0.9014268313902906, 0.8499855198378222, nan, nan, 0.9024892976058347, nan, nan, nan, nan, nan, nan, 0.951387440016691, 0.0, nan, 0.9891748271221051, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.48150470219435737, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.966491043203372, 0.6448184233835252, nan, nan, nan, 0.9903994513972227, nan, nan, nan, 0.3389353277606687, 0.87719758593545, nan, nan, 0.8840040241448692, 0.9265662013003377, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.043 | 12.0 | 240 | 0.1394 | 0.6661 | 0.7837 | 0.9533 | [0.9013573092265916, 0.9811649361492872, 0.9664180363290531, 0.9242075122955344, 0.2736760225578531, 0.9735057890893919, 0.9865504768168366, nan, 0.7663989121949991, nan, 0.838999883163921, 0.8960317460317461, 0.8965451467537732, 0.7503942739293946, 0.8163407496864694, 0.8818719168036976, 0.9755058984340178, 0.5622984462034594, nan, 0.7971734393026535, 0.5044937088076693, nan, 0.9186885245901639, nan, 0.0, nan, nan, 0.9235223895760839, nan, nan, nan, nan, nan, 0.6883727839952462, nan, nan, 0.6026941491519923, 0.7981632548355652, 0.0, nan, nan, 0.0, nan, 0.3413333333333333, nan, nan, nan, 0.869447719234606, nan, nan, 0.9920897439787981, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1244031163608947, nan, 0.737146614069691, 0.8369364511332255, nan, nan, 0.8094903691813804, 0.0, 0.8710171764255156, nan, nan, 0.8055066878891961, 0.8302924992759919, nan, nan, 0.889256711322609, nan, nan, nan, nan, nan, nan, 0.7927620348241721, 0.0, nan, 0.9797803007622689, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5669985775248934, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9314679643146796, 0.6264020707506471, nan, nan, nan, 0.982962047961835, nan, nan, nan, 0.3826944905264996, 0.856074284240392, 0.0, nan, 0.8939733229481063, 0.8903105162147532, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9554419397319699, 0.9881194963037859, 0.9940726419459317, 0.9578490464361775, 0.3069429298436932, 0.9919015431705139, 0.9911086360440038, nan, 0.8153992471836363, nan, 0.9985572995428551, 0.983306720859341, 0.931757359500446, 0.828544638671221, 0.8305896438192159, 0.9709350687478593, 0.9989289159229428, 0.88427846934071, nan, 0.8568722011712022, 0.9722863741339491, nan, 0.9732012284293653, nan, nan, nan, nan, 0.9514464026376732, nan, nan, nan, nan, nan, 0.8570283600493218, nan, nan, 0.6224438484746899, 0.802569129178704, nan, nan, nan, 0.0, nan, 0.3413333333333333, nan, nan, nan, 0.9727080881457733, nan, nan, 0.9985339184084869, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4094292803970223, nan, 0.7934890304317056, 0.9955101168147097, nan, nan, 0.8896361631753033, nan, 0.9030784508440913, nan, nan, 0.8365524041528913, 0.8302924992759919, nan, nan, 0.9086200517943026, nan, nan, nan, nan, nan, nan, 0.9689129981222616, 0.0, nan, 0.9872604074813199, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6247648902821317, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9681770284510011, 0.6430469441984057, nan, nan, nan, 0.9890708040459455, nan, nan, nan, 0.3856577210734712, 0.8709000262398321, nan, nan, 0.9237424547283702, 0.9118492011491356, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0536 | 13.0 | 260 | 0.1350 | 0.6692 | 0.7833 | 0.9545 | [0.9024655980106084, 0.9807079881626509, 0.966114963992537, 0.9236172513764959, 0.2434179195863664, 0.9753535100361342, 0.9849157719189762, nan, 0.7787409959285938, nan, 0.8327772781339755, 0.8818375104696863, 0.9047320589157004, 0.7376698078570555, 0.8023361533601467, 0.9130035157057836, 0.975741496436772, 0.6045332065303535, nan, 0.8075535476322033, 0.5168393782383419, nan, 0.9209763170613823, nan, 0.0, nan, nan, 0.9273921533818569, nan, nan, nan, nan, nan, 0.7078097167976729, nan, nan, 0.6309820638623019, 0.7643962927781058, 0.0, nan, nan, 0.0, nan, 0.3448888888888889, nan, nan, nan, 0.8681460012680011, nan, nan, 0.9921401620522573, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1866011925640126, nan, 0.7294660743837644, 0.8802731882308302, nan, nan, 0.8096544715447155, 0.0, 0.9041325510927729, nan, nan, 0.8189077149931836, 0.8169707500724008, nan, nan, 0.8912425729785585, nan, nan, nan, nan, nan, nan, 0.7696592398427261, 0.0, nan, 0.9793905262066821, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5669246202350243, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.928254735993551, 0.5687992988606485, nan, nan, nan, 0.9837215232913975, nan, nan, nan, 0.38737007598916934, 0.8592783505154639, 0.0, nan, 0.9096812278630461, 0.9200176973748894, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9542368004603654, 0.988129813719067, 0.9951434331416484, 0.9677371729333298, 0.26012359142130137, 0.992581728521022, 0.9922460132804131, nan, 0.8326925910546127, nan, 0.9978967860805479, 0.993395267818261, 0.9299980176429775, 0.8073806175071998, 0.8202082578490525, 0.9720849439741645, 0.998226016997374, 0.8792070078377132, nan, 0.8558732345849122, 0.9214780600461894, nan, 0.9753217315004388, nan, nan, nan, nan, 0.9568892203425312, nan, nan, nan, nan, nan, 0.8901150842581176, nan, nan, 0.6574756956084479, 0.7680045398266612, nan, nan, nan, 0.0, nan, 0.3448888888888889, nan, nan, nan, 0.9724848829187127, nan, nan, 0.9985644617749768, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4400330851943755, nan, 0.7902807265864591, 0.9920503799879492, nan, nan, 0.8783902976846748, nan, 0.9284342932803707, nan, nan, 0.8536047793051447, 0.8169707500724008, nan, nan, 0.9116854288885365, nan, nan, nan, nan, nan, nan, 0.9801794283329857, 0.0, nan, 0.9866744790458069, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6200626959247649, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9707060063224446, 0.5748449955713021, nan, nan, nan, 0.99198525630036, nan, nan, nan, 0.3902331720193577, 0.8748360010495932, nan, nan, 0.9301810865191147, 0.9432488281840633, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0344 | 14.0 | 280 | 0.1332 | 0.6693 | 0.7817 | 0.9566 | [0.9119309616329353, 0.9796317977085772, 0.9657236789351294, 0.9181987967216446, 0.22905759162303665, 0.9752718015066464, 0.9849055896195459, nan, 0.8374032088211609, nan, 0.8447667235394481, 0.8764607679465777, 0.8750584494529131, 0.7381863047501542, 0.833230312770429, 0.9270366948570345, 0.9755878619984484, 0.6142244800762026, nan, 0.7716468312514344, 0.5483870967741935, nan, 0.9221018267694658, nan, 0.0, nan, nan, 0.9280201427197733, nan, nan, nan, nan, nan, 0.6977210835602694, nan, nan, 0.6372275013974288, 0.7725218284540318, 0.0, nan, nan, 0.0, nan, 0.30044444444444446, nan, nan, nan, 0.8852650294477165, nan, nan, 0.9921097353726632, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.16789146410401357, nan, 0.7312910473556862, 0.8881546134663342, nan, nan, 0.8020975353959098, 0.0, 0.8897054041750666, nan, nan, 0.8577757962493382, 0.8308717057631045, nan, nan, 0.8921056713022799, nan, nan, nan, nan, nan, nan, 0.7772089182493807, 0.0, nan, 0.9787256098965652, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.503345941227815, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9269183922046285, 0.5795454545454546, nan, nan, nan, 0.9855103345408055, nan, nan, nan, 0.38739525139664804, 0.8464119772844605, 0.0, nan, 0.8926080892608089, 0.921998031496063, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9572878279826921, 0.9884857645462659, 0.9950707250974948, 0.9671909083862681, 0.24173027989821882, 0.9934107044169537, 0.9915664185831394, nan, 0.9054023281046978, nan, 0.9983139524777945, 0.9907098272608507, 0.9274457329765091, 0.8014198647110039, 0.8494147670883284, 0.9574056857660126, 0.9987189591010197, 0.8918856615952052, nan, 0.8107475025835342, 0.9226327944572749, nan, 0.9707151213805206, nan, nan, nan, nan, 0.9596498802841779, nan, nan, nan, nan, nan, 0.9003699136868064, nan, nan, 0.6687898089171974, 0.7759492364836978, nan, nan, nan, 0.0, nan, 0.30044444444444446, nan, nan, nan, 0.9699078771153768, nan, nan, 0.9985237372863237, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4913151364764268, nan, 0.7985373908940788, 0.9691344826915975, nan, nan, 0.8432194046306505, nan, 0.9057265806024495, nan, nan, 0.8927266399860797, 0.8308717057631045, nan, nan, 0.9120025368638022, nan, nan, nan, nan, nan, nan, 0.9818485291049447, 0.0, nan, 0.985879704831299, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.542319749216301, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9622760800842992, 0.587245349867139, nan, nan, nan, 0.9911280644608264, nan, nan, nan, 0.3904971403431588, 0.8604040934138022, nan, nan, 0.9014084507042254, 0.9442568418930497, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0384 | 15.0 | 300 | 0.1342 | 0.6594 | 0.7883 | 0.9562 | [0.9104536834673739, 0.9810159292216531, 0.9668250255223029, 0.918346480209413, 0.2589676290463692, 0.9735974286191351, 0.9852684483670346, nan, 0.8471517556638368, nan, 0.8364611260053619, 0.8913645352669743, 0.8728764590315498, 0.7327358664293168, 0.8302041701521207, 0.9302490934515204, 0.9750708232121964, 0.606646058732612, nan, 0.7707979626485568, 0.5163398692810458, nan, 0.9243530884808013, nan, 0.0, nan, nan, 0.9284387673880908, nan, nan, nan, nan, nan, 0.6882350185072389, nan, nan, 0.642378361327971, 0.7626134114683732, 0.0, nan, nan, 0.0, 0.0, 0.3022222222222222, nan, nan, nan, 0.8897683253873659, nan, nan, 0.991397871200558, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1627963823026155, nan, 0.7426598332175122, 0.87578009957226, nan, nan, 0.820236260914227, 0.0, 0.8679738562091504, nan, nan, 0.8615018274448998, 0.8291340863017665, nan, nan, 0.8881118881118881, nan, nan, nan, nan, nan, nan, 0.7685489166119501, 0.08, nan, 0.9801169725298764, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5391868002357101, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9271131073907718, 0.619813717188823, nan, nan, nan, 0.985981507520559, nan, nan, nan, 0.37879185243465335, 0.863342805476621, 0.0, nan, 0.896794680956634, 0.928596016719941, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9547547115522946, 0.9890119527256032, 0.9937371934694956, 0.9646958081577967, 0.2797528171573973, 0.9915189389108532, 0.9936382385282999, nan, 0.9036073567036824, nan, 0.9978620222141107, 0.9814196545217012, 0.9357964119337893, 0.7994775969459513, 0.8442634572063202, 0.9602681411166022, 0.9991480013023408, 0.904794836330106, nan, 0.8132276954874268, 0.9122401847575058, nan, 0.971665691722726, nan, nan, nan, nan, 0.9614554303881933, nan, nan, nan, nan, nan, 0.9056103575832306, nan, nan, 0.6827019778746228, 0.7653477094510936, nan, nan, nan, 0.0, nan, 0.3022222222222222, nan, nan, nan, 0.9694614666612557, nan, nan, 0.9985440995306503, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5508684863523573, nan, 0.8067468742627978, 0.9710198448949445, nan, nan, 0.880374862183021, nan, 0.879179079774909, nan, nan, 0.9023258511687257, 0.8291340863017665, nan, nan, 0.906136039321389, nan, nan, nan, nan, nan, nan, 0.9768412267890674, 0.08, nan, 0.9877361117556969, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5736677115987461, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9570073761854584, 0.6483613817537643, nan, nan, nan, 0.9917709583404766, nan, nan, nan, 0.3812582490101188, 0.8769351876147993, nan, nan, 0.9091549295774648, 0.9517161433395495, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0573 | 16.0 | 320 | 0.1362 | 0.6572 | 0.7822 | 0.9554 | [0.9072774944426769, 0.9810018332462798, 0.9664209848366824, 0.9202994260181679, 0.2618247782854071, 0.968396216652524, 0.9833102506035822, nan, 0.8220215482344149, nan, 0.8350753754233962, 0.8774749292877346, 0.8864643073623351, 0.7335318198734648, 0.8206447613143212, 0.9230858928613074, 0.9755310423526335, 0.6414082145850797, nan, 0.7749462187655138, 0.53125, nan, 0.9249659887675725, nan, 0.0, nan, nan, 0.9266769682845273, nan, nan, nan, nan, nan, 0.6876548017758393, nan, nan, 0.6412068281063914, 0.7347861842105263, 0.0, nan, nan, 0.0, 0.0, 0.34044444444444444, nan, nan, nan, 0.8998809456318385, nan, nan, 0.9908140736089449, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.16219875440021664, nan, 0.743365145770689, 0.878558374920925, nan, nan, 0.8102824040550326, 0.0, 0.837533547162401, nan, nan, 0.8694906818558867, 0.8158123370981755, nan, nan, 0.8875960939123207, nan, nan, nan, nan, nan, nan, 0.7937776266575994, 0.0, nan, 0.9791655869396216, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5374385660595548, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9372160264353573, 0.6224137931034482, nan, nan, nan, 0.9868409809450568, nan, nan, nan, 0.3631353337415799, 0.842091638622832, 0.0, nan, 0.8851526641388945, 0.924493010435125, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9561911401790556, 0.9882639401177217, 0.994305638178333, 0.9669952866227932, 0.28331515812431846, 0.9945585171959359, 0.9918165471869971, nan, 0.8840368103090298, nan, 0.9984877718099807, 0.9907098272608507, 0.9422142927941323, 0.7920433996383364, 0.8341026166133682, 0.9683661985614327, 0.9988893588405515, 0.8819732595666205, nan, 0.8065794006200482, 0.9226327944572749, nan, 0.9694355074583212, nan, nan, nan, nan, 0.9572424801454907, nan, nan, nan, nan, nan, 0.9071927661323469, nan, nan, 0.6768354006034194, 0.7375154766817994, nan, nan, nan, 0.0, nan, 0.34044444444444444, nan, nan, nan, 0.9662554279452944, nan, nan, 0.9960293623563189, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.49545078577336643, nan, 0.8048124557678697, 0.9717778771210325, nan, nan, 0.8636163175303198, nan, 0.8470705064548163, nan, nan, 0.9119540629893857, 0.8158123370981755, nan, nan, 0.903123513556366, nan, nan, nan, nan, nan, nan, 0.9741289380346339, 0.0, nan, 0.9864308256369796, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5827586206896552, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9563751317175975, 0.6395039858281665, nan, nan, nan, 0.9899708554774559, nan, nan, nan, 0.36524417069951604, 0.8535817370768827, nan, nan, 0.8924547283702213, 0.9466256741091679, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0213 | 17.0 | 340 | 0.1367 | 0.6715 | 0.7875 | 0.9551 | [0.9067448033006597, 0.9821732444900051, 0.9656804316055102, 0.9244569140556936, 0.2547740720817644, 0.9755399590591971, 0.983596521153945, nan, 0.816729848983904, nan, 0.8332825269636662, 0.8924226395133562, 0.8872224159981398, 0.7199675324675324, 0.8144250354916364, 0.8886029330197679, 0.9756620395008726, 0.6550468262226847, nan, 0.7784452296819788, 0.4939759036144578, nan, 0.9252411017831125, nan, 0.0, nan, nan, 0.925007552109556, nan, nan, nan, nan, nan, 0.6919514431132046, nan, nan, 0.6560384191465911, 0.744874364112841, 0.0, nan, nan, 0.0, nan, 0.4, nan, nan, nan, 0.8915952292210212, nan, nan, 0.9906291969181368, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.15155374426897605, nan, 0.7191069799765452, 0.8856366663745509, nan, nan, 0.8122109158186864, 0.0, 0.882685558299753, nan, nan, 0.8530180080035571, 0.8163915435852882, nan, nan, 0.8882825370675453, nan, nan, nan, nan, nan, nan, 0.7806710065097646, 0.045714285714285714, nan, 0.9789477321018294, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5720411663807891, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9365595770638471, 0.6238297872340426, nan, nan, nan, 0.9864006479942021, nan, nan, nan, 0.3700262927256792, 0.8407973077918716, 0.0, nan, 0.8855751068694702, 0.9134898801336215, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9546042074742981, 0.9885218754997498, 0.99470222751008, 0.959554277657411, 0.27546346782988007, 0.9927092632742422, 0.9932890023644232, nan, 0.8802057519456686, nan, 0.9977924944812362, 0.9796051676585862, 0.9454604024184756, 0.7722858482352153, 0.8314245656043731, 0.979889416254832, 0.9985455165089961, 0.8706777316735823, nan, 0.8347571477781606, 0.9468822170900693, nan, 0.975102369113776, nan, nan, nan, nan, 0.9615208488702228, nan, nan, nan, nan, nan, 0.8996506370735717, nan, nan, 0.6983741200134094, 0.7478332645480809, nan, nan, nan, 0.0, nan, 0.4053333333333333, nan, nan, nan, 0.9708006980236191, nan, nan, 0.9987986275847324, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4921422663358147, nan, 0.7811276244397264, 0.9821376508775681, nan, nan, 0.8712238147739801, nan, 0.8991062562065542, nan, nan, 0.8901745838408445, 0.8163915435852882, nan, nan, 0.9118968342053803, nan, nan, nan, nan, nan, nan, 0.975798038806593, 0.045714285714285714, nan, 0.9862567874878173, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6272727272727273, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9707060063224446, 0.6492471213463242, nan, nan, nan, 0.9916852391565232, nan, nan, nan, 0.37149142102947647, 0.852269745473629, nan, nan, 0.8961770623742454, 0.9372007459301447, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0662 | 18.0 | 360 | 0.1409 | 0.6605 | 0.7686 | 0.9542 | [0.9030824597089657, 0.981512175158584, 0.9657510051509974, 0.9221769534383271, 0.24558893657606104, 0.9744408279085295, 0.9850586029067042, nan, 0.7842085427135679, nan, 0.829798913671559, 0.8767873244879557, 0.9070155902004454, 0.7495425155544712, 0.8235741326878918, 0.8790246264222364, 0.9748452690166975, 0.6531489441225939, nan, 0.8031982318718107, 0.5259562841530054, nan, 0.9270603867268087, nan, 0.0, nan, nan, 0.9240010148420652, nan, nan, nan, nan, nan, 0.6951976735046765, nan, nan, 0.6410132689987937, 0.7250796423800226, 0.0, nan, nan, 0.0, nan, 0.2773333333333333, nan, nan, nan, 0.8955403248521265, nan, nan, 0.9914389965331474, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1462972553081305, nan, 0.7355538071509747, 0.8869960942948807, nan, nan, 0.793825222396651, 0.0, 0.82638570774164, nan, nan, 0.856755034765506, 0.8010425716768028, nan, nan, 0.8837379274340903, nan, nan, nan, nan, nan, nan, 0.8042420027816412, 0.0, nan, 0.9787715610332018, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.42269832078749275, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9310203243687127, 0.479185119574845, nan, nan, nan, 0.9873379817769603, nan, nan, nan, 0.3659689990366932, 0.837551652892562, 0.0, nan, 0.8177062374245473, 0.931119311193112, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9580691212111152, 0.9897806001640469, 0.9938892193799986, 0.9632600182334248, 0.26208651399491095, 0.9927092632742422, 0.9916088932517191, nan, 0.836175371384941, nan, 0.9984530079435435, 0.9880243867034403, 0.9284121320249777, 0.8229857343781395, 0.8361977976968761, 0.9771982189166707, 0.9992758011069898, 0.80567081604426, nan, 0.8512573200137789, 0.8891454965357968, nan, 0.9693258262649898, nan, nan, nan, nan, 0.9530033625099763, nan, nan, nan, nan, nan, 0.9088573777229757, nan, nan, 0.6680355346966141, 0.7279973173751547, nan, nan, nan, 0.0, nan, 0.2773333333333333, nan, nan, nan, 0.9677366989976056, nan, nan, 0.9986662729966097, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46732837055417703, nan, 0.7939608398207124, 0.9887655736749014, nan, nan, 0.8362734288864389, nan, 0.836014564713671, nan, nan, 0.896931732498115, 0.8010425716768028, nan, nan, 0.8946673008826171, nan, nan, nan, nan, nan, nan, 0.9651575213853536, 0.0, nan, 0.9859203137327702, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.45768025078369906, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9557428872497366, 0.479185119574845, nan, nan, nan, 0.9892422424138522, nan, nan, nan, 0.3677078750549934, 0.8509577538703752, nan, nan, 0.8177062374245473, 0.953832972128421, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0518 | 19.0 | 380 | 0.1410 | 0.6674 | 0.7806 | 0.9539 | [0.9011438624610261, 0.9815720099306391, 0.9658476196898829, 0.9227872139906322, 0.2412848435038696, 0.9721603795176963, 0.9846708704438291, nan, 0.8097110085465832, nan, 0.8330528148578061, 0.8857404021937842, 0.9157622612979538, 0.7334782608695652, 0.7613958898797983, 0.8800970338515823, 0.974799120195433, 0.6422065647018388, nan, 0.8059830151331333, 0.49967170059093896, nan, 0.9215144396849523, nan, 0.0, nan, nan, 0.9262646037259236, nan, nan, nan, nan, nan, 0.689437265000474, nan, nan, 0.6376547127895906, 0.7299609495426986, 0.0, nan, nan, 0.0, nan, 0.3075555555555556, nan, nan, nan, 0.8923154512851166, nan, nan, 0.9903905421108473, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1653194263363755, nan, 0.7432807352176175, 0.889131352949458, nan, nan, 0.8123127776056193, 0.0, 0.8750081417312577, nan, nan, 0.846617920987517, 0.8355053576600058, nan, nan, 0.8887051772243464, nan, nan, nan, nan, nan, nan, 0.7863521482729571, 0.0, nan, 0.9783277453250838, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5151515151515151, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9325040783034257, 0.6293103448275862, nan, nan, nan, 0.9867781284654099, nan, nan, nan, 0.3629117028731605, 0.861811631497684, 0.0, nan, 0.8676205723434061, 0.9216796298119524, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9556953620397729, 0.9892079836159445, 0.9939966289906802, 0.9613259464586924, 0.25612504543802256, 0.9931343791183097, 0.9934305845930219, nan, 0.8718604744618434, nan, 0.9979663138134224, 0.9846131514007839, 0.9304440479730399, 0.7909048288795124, 0.7733423651916381, 0.9763663942848755, 0.999281886811973, 0.8614568925772246, nan, 0.8696176369273165, 0.8787528868360277, nan, 0.9752851711026616, nan, nan, nan, nan, 0.9595190433201188, nan, nan, nan, nan, nan, 0.8968351829017673, nan, nan, 0.6735668789808917, 0.7328982666116385, nan, nan, nan, 0.0, nan, 0.3075555555555556, nan, nan, nan, 0.9693397183555862, nan, nan, 0.9978925077122001, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5244003308519437, nan, 0.80896437839113, 0.982021030535093, nan, nan, 0.8670341786108049, nan, 0.8893743793445879, nan, nan, 0.8831274288034336, 0.8355053576600058, nan, nan, 0.90904286242799, nan, nan, nan, nan, nan, nan, 0.973711662841644, 0.0, nan, 0.9851951547779273, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5595611285266457, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9637513171759747, 0.6465899025686448, nan, nan, nan, 0.9915995199725699, nan, nan, nan, 0.36454025516937966, 0.8787719758593545, nan, nan, 0.8723340040241448, 0.9436520336676579, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0943 | 20.0 | 400 | 0.1485 | 0.6727 | 0.7694 | 0.9524 | [0.8977662840775619, 0.9797098928976953, 0.9645674018175735, 0.9230733844313751, 0.22154146069762584, 0.9725938893751823, 0.9835765995829722, nan, 0.7573401651299352, nan, 0.8685743685970673, 0.86688, 0.9187631283278784, 0.7359396433470508, 0.7421994565387138, 0.8850474789354019, 0.9755247352921074, 0.6362859097127223, nan, 0.8091909385113268, 0.49522673031026254, nan, 0.9277091570733755, nan, 0.0, nan, nan, 0.9202631545395433, nan, nan, nan, nan, nan, 0.6803734266383862, nan, nan, 0.5962727993655829, 0.7276511550222268, 0.0, nan, nan, 0.0, nan, 0.2568888888888889, nan, nan, nan, 0.8923100001875153, nan, nan, 0.9881606594472281, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.16540917005223446, nan, 0.726726077175982, 0.890043802958495, nan, nan, 0.8035789473684211, nan, 0.8655325250945118, nan, nan, 0.8405240757740132, 0.8268172603533159, nan, nan, 0.8888542478565862, nan, nan, nan, nan, nan, nan, 0.8121180373668587, 0.0, nan, 0.9759715235150228, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4811075858090568, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.93647837781916, 0.576048951048951, nan, nan, nan, 0.985508481800358, nan, nan, nan, 0.280589421980528, 0.8505806451612903, 0.0, nan, 0.8696303696303697, 0.9071428571428571, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9642818410190012, 0.9881349724267077, 0.9956788287395069, 0.9628097190797656, 0.23133406034169393, 0.9919440547549208, 0.990650853504868, nan, 0.7998874794047125, nan, 0.9976882028819245, 0.9830889824357671, 0.9321042719793835, 0.7905029803763981, 0.7486885426676538, 0.9713999119244507, 0.99886805887311, 0.8577685569386814, nan, 0.8613158801240096, 0.9584295612009238, nan, 0.9693258262649898, nan, nan, nan, nan, 0.9443681228820766, nan, nan, nan, nan, nan, 0.9031237155774764, nan, nan, 0.630154207173986, 0.7304477919933966, nan, nan, nan, 0.0, nan, 0.2568888888888889, nan, nan, nan, 0.9655858122641127, nan, nan, 0.9959173700125228, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.47146401985111663, nan, 0.7925925925925926, 0.9636533265952691, nan, nan, 0.8416758544652702, nan, 0.87911287653095, nan, nan, 0.8762832782321212, 0.8268172603533159, nan, nan, 0.9040748374821627, nan, nan, nan, nan, nan, nan, 0.9703734612977258, 0.0, nan, 0.981406924397828, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5228840125391849, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9538461538461539, 0.5837023914969, nan, nan, nan, 0.9909994856848963, nan, nan, nan, 0.2814782226132864, 0.8648648648648649, nan, nan, 0.8757545271629779, 0.9281286225492666, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0412 | 21.0 | 420 | 0.1430 | 0.6653 | 0.7779 | 0.9551 | [0.9073118268287437, 0.9810413680114684, 0.965398617548474, 0.9217180964184611, 0.2718217915068309, 0.9743199265687583, 0.9844570753810757, nan, 0.8135458759947443, nan, 0.8578925335645725, 0.8786878881987578, 0.9283813801352307, 0.766562274150807, 0.8124854377980397, 0.8967391304347826, 0.9755279979316923, 0.6120951013243976, nan, 0.8051795939614784, 0.4980340760157274, nan, 0.9251698322473312, nan, 0.0, nan, nan, 0.9256345177664974, nan, nan, nan, nan, nan, 0.6752282669999847, nan, nan, 0.5780181326546843, 0.7795900544539196, 0.0, nan, nan, 0.0, nan, 0.2648888888888889, nan, nan, nan, 0.8866810280790373, nan, nan, 0.9913157761714604, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.16864805087088747, nan, 0.7323730098559514, 0.8772442366939339, nan, nan, 0.8153356600287416, 0.0, 0.8626341463414634, nan, nan, 0.8292254992062829, 0.8475556841191785, nan, nan, 0.8872872977051569, nan, nan, nan, nan, nan, nan, 0.7913279132791328, 0.0, nan, 0.976764705882353, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5087769784172662, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9346846846846847, 0.5722996515679443, nan, nan, nan, 0.9857015192135835, nan, nan, nan, 0.31721419185282523, 0.8463130659767141, 0.0, nan, 0.8654982473710566, 0.8995570866141732, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9569281675021857, 0.988496081961547, 0.9947038799656289, 0.9662201815222325, 0.29363867684478373, 0.992751774858649, 0.9915097856917, nan, 0.8791609178465701, nan, 0.9973927100172081, 0.985701843518653, 0.9424373079591635, 0.8524546246065233, 0.8240047889853337, 0.9688310417380241, 0.9988832731355682, 0.88427846934071, nan, 0.8524974164657251, 0.8775981524249422, nan, 0.9759066978648727, nan, nan, nan, nan, 0.954324815846973, nan, nan, nan, nan, nan, 0.9103370324702014, nan, nan, 0.6091183372443848, 0.7828879488237722, nan, nan, nan, 0.0, nan, 0.2648888888888889, nan, nan, nan, 0.9688324337486304, nan, nan, 0.998320114843058, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5045492142266336, nan, 0.7748053786270347, 0.9629924779879103, nan, nan, 0.8757442116868798, nan, 0.8780536246276067, nan, nan, 0.863493996867931, 0.8485375036200405, nan, nan, 0.9011680143755616, nan, nan, nan, nan, nan, nan, 0.9747548508241185, 0.0, nan, 0.9825671787255766, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5542319749216301, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9620653319283456, 0.5819309123117803, nan, nan, nan, 0.9927567289559404, nan, nan, nan, 0.31860976682798064, 0.8583049068485962, nan, nan, 0.8694164989939638, 0.9212741293281589, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0199 | 22.0 | 440 | 0.1447 | 0.6543 | 0.7736 | 0.9546 | [0.9039458175667718, 0.9805792442592574, 0.9651967509488153, 0.927996292444086, 0.2595808180153293, 0.9682457592056268, 0.9838779997280965, nan, 0.7846103627073107, nan, 0.8698609385887842, 0.8608492058469911, 0.9186319027391241, 0.7599633251833741, 0.834067024211799, 0.8890530888247017, 0.9756241826180002, 0.6381489071038251, nan, 0.7967240655186896, 0.5219478737997256, nan, 0.925573253982146, nan, 0.0, nan, nan, 0.9157658862024624, nan, nan, nan, nan, nan, 0.6809279624366736, nan, nan, 0.5789314363553807, 0.7217811351781906, 0.0, nan, nan, 0.0, 0.0, 0.2568888888888889, nan, nan, nan, 0.8853688882286308, nan, nan, 0.9851883284836728, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.2638966872543515, nan, 0.735663042999868, 0.8889245014245014, nan, nan, 0.8114026027680231, 0.0, 0.864782127271543, nan, nan, 0.8360970452395526, 0.8485375036200405, nan, nan, 0.8884051210283522, nan, nan, nan, nan, nan, nan, 0.8027269589230238, 0.0, nan, 0.9759326668474216, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.43415051311288483, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9351755661749429, 0.6102430555555556, nan, nan, nan, 0.9860163710777626, nan, nan, nan, 0.39161938018332604, 0.8355348355348355, 0.0, nan, 0.889253404911025, 0.8919477961093327, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9599083695760433, 0.9882226704565972, 0.9951434331416484, 0.9681911630636583, 0.2782260996001454, 0.9948773540789865, 0.9904715160153098, nan, 0.8371398336302627, nan, 0.9981227512123898, 0.9873711714327188, 0.9324264049955397, 0.8326970732034024, 0.8476661573118669, 0.9714243773547977, 0.9987828590033441, 0.8614568925772246, nan, 0.849500516706855, 0.8787528868360277, nan, 0.9666203568294823, nan, nan, nan, nan, 0.9410710313877877, nan, nan, nan, nan, nan, 0.9060008220304151, nan, nan, 0.6411330874958096, 0.7245924473792819, nan, nan, nan, 0.0, nan, 0.2568888888888889, nan, nan, nan, 0.9674526196177103, nan, nan, 0.9927612221419044, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.38875103391232424, nan, 0.7886293937249351, 0.9703201228400941, nan, nan, 0.8661521499448732, nan, 0.8789804700430321, nan, nan, 0.8714981729598051, 0.8485375036200405, nan, nan, 0.9058717826753343, nan, nan, nan, nan, nan, nan, 0.9703734612977258, 0.0, nan, 0.981430129484383, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4774294670846395, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.948577449947313, 0.6226749335695305, nan, nan, nan, 0.9912566432367564, nan, nan, nan, 0.3947206335239771, 0.8464969824193125, nan, nan, 0.8998993963782697, 0.9128068141726727, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0611 | 23.0 | 460 | 0.1401 | 0.6670 | 0.7796 | 0.9550 | [0.9056082260270742, 0.9814994034543323, 0.9650913031212331, 0.9239970529686382, 0.23318632855567806, 0.9696410043577506, 0.9840532213801841, nan, 0.7937188434695912, nan, 0.8488055818353831, 0.8835370237239396, 0.9147135416666666, 0.7555787150564002, 0.860634385534784, 0.8944032123440637, 0.9751605372428582, 0.637986192961778, nan, 0.8058558119907813, 0.5060936497754971, nan, 0.9279099835761959, nan, 0.0, nan, nan, 0.9161192329599392, nan, nan, nan, nan, nan, 0.6888586311173034, nan, nan, 0.5841628620612229, 0.7256700845475805, 0.0, nan, nan, 0.0, nan, 0.28355555555555556, nan, nan, nan, 0.8843658144307027, nan, nan, 0.9908382912958716, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.20670958996950187, nan, 0.7390048867170147, 0.8939562476826103, nan, nan, 0.8051398493136547, 0.0, 0.8741618384219777, nan, nan, 0.8322463768115942, 0.8537503620040544, nan, nan, 0.8850285244385054, nan, nan, nan, nan, nan, nan, 0.7772393048128342, 0.0, nan, 0.9768233312567777, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.41261994765920323, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9369721936148301, 0.6206008583690987, nan, nan, nan, 0.985980312779648, nan, nan, nan, 0.37178590169669407, 0.8574380165289256, 0.0, nan, 0.9031655524970508, 0.9118095519157936, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9561335945021746, 0.9888107631276213, 0.9949071319981493, 0.9628318649397817, 0.2460196292257361, 0.9932194022871232, 0.9933786711092024, nan, 0.8531204372228846, nan, 0.998070605412734, 0.9812019160981275, 0.9400089206066012, 0.8254638001473444, 0.8847807936482931, 0.9700053823946764, 0.9990262872026753, 0.8734439834024896, nan, 0.8551842921116087, 0.9110854503464203, nan, 0.970824802573852, nan, nan, nan, nan, 0.9469717784668524, nan, nan, nan, nan, nan, 0.8941430332922318, nan, nan, 0.6541233657391887, 0.7283842344201403, nan, nan, nan, 0.0, nan, 0.28355555555555556, nan, nan, nan, 0.9706992411022279, nan, nan, 0.9986866352409363, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5045492142266336, nan, 0.7848549186128804, 0.9841007599758984, nan, nan, 0.8600882028665932, nan, 0.8889771598808341, nan, nan, 0.8659880517371382, 0.8537503620040544, nan, nan, 0.9100998890122086, nan, nan, nan, nan, nan, nan, 0.9703734612977258, 0.0, nan, 0.982422146934608, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.44482758620689655, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9586933614330875, 0.6403897254207263, nan, nan, nan, 0.9916852391565232, nan, nan, nan, 0.3740431148262209, 0.8711624245604828, nan, nan, 0.9242454728370222, 0.9343279068595333, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0257 | 24.0 | 480 | 0.1440 | 0.6568 | 0.7687 | 0.9532 | [0.9010835423059151, 0.981331380791701, 0.9646435441991738, 0.921076509743597, 0.226492511796485, 0.9711600440812592, 0.9840923225115613, nan, 0.7780197097655217, nan, 0.8423012092341517, 0.8862742527530152, 0.9174979319741132, 0.7362816336757705, 0.8343019641969169, 0.875799973518118, 0.9752499658394872, 0.6314594972067039, nan, 0.8019852870313855, 0.5064141722663409, nan, 0.9274151069283983, nan, 0.0, nan, nan, 0.9195131876796296, nan, nan, nan, nan, nan, 0.682495078427637, nan, nan, 0.5709958349110186, 0.7235442257285296, 0.0, nan, nan, 0.0, nan, 0.28, nan, nan, nan, 0.8955246304956148, nan, nan, 0.9840478664904297, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1895734597156398, nan, 0.7303173768192179, 0.8977009256473337, nan, nan, 0.8155783487356555, 0.0, 0.8312385441214978, nan, nan, 0.8402549685464566, 0.7955401100492325, nan, nan, 0.8828275467180261, nan, nan, nan, nan, nan, nan, 0.8043478260869565, 0.0, nan, 0.9767390890582013, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2432350258437215, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9314332920280876, 0.5692982456140351, nan, nan, nan, 0.9864483584131327, nan, nan, nan, 0.3169662428759316, 0.8274072151570205, 0.0, nan, 0.8962771252239697, 0.8974737775151426, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9603311089715924, 0.9886921128518883, 0.9951516954193932, 0.9591630341304612, 0.24078516902944383, 0.9927730306508523, 0.9932323694729838, nan, 0.8280444188445206, nan, 0.9988701743407902, 0.9813470750471767, 0.9344583209436019, 0.7872212176009644, 0.845775768364341, 0.9709350687478593, 0.9990110729402171, 0.8337943752881513, nan, 0.8599724423010678, 0.9572748267898383, nan, 0.9655601052939456, nan, nan, nan, nan, 0.9460166686292212, nan, nan, nan, nan, nan, 0.8834566378956021, nan, nan, 0.631914180355347, 0.7262690879075526, nan, nan, nan, 0.0, nan, 0.28, nan, nan, nan, 0.9675743679233797, nan, nan, 0.9929444823408436, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.6286186931348222, nan, 0.7860344420853975, 0.9858500651130245, nan, nan, 0.8854465270121279, nan, 0.8406487917907978, nan, nan, 0.875442259729714, 0.7955401100492325, nan, nan, 0.896358543417367, nan, nan, nan, nan, nan, nan, 0.9572292927185478, 0.0, nan, 0.9824337494778855, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2507836990595611, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9504741833508957, 0.5748449955713021, nan, nan, nan, 0.9889850848619921, nan, nan, nan, 0.31808183018037833, 0.8365258462345841, nan, nan, 0.9058350100603622, 0.9185524923138955, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0322 | 25.0 | 500 | 0.1493 | 0.6623 | 0.7793 | 0.9527 | [0.8987965308094777, 0.9821812013467875, 0.9653117469662208, 0.9247574368269539, 0.2470813135795726, 0.9660166095112176, 0.9841560166906798, nan, 0.7979931843998486, nan, 0.8386362309533539, 0.8875180469877937, 0.924700192951176, 0.7647022523912372, 0.7671044073466101, 0.8789296777169973, 0.9754739406905568, 0.6307927346800204, nan, 0.795945308816596, 0.46658851113716293, nan, 0.9266125378959473, nan, 0.0, nan, nan, 0.9198790246361288, nan, nan, nan, nan, nan, 0.6851921099373656, nan, nan, 0.6118728341117975, 0.7163603649916463, 0.0, nan, nan, 0.0, nan, 0.3552397868561279, nan, nan, nan, 0.8873634019184029, nan, nan, 0.9861066203217201, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.21227887617065558, nan, 0.7311713154499755, 0.8828182422736341, nan, nan, 0.8264090499549505, 0.0, 0.8383249229962645, nan, nan, 0.8367744448772887, 0.8543717429067748, nan, nan, 0.8847777372452967, nan, nan, nan, nan, nan, nan, 0.7805977625647019, 0.0, nan, 0.9777006337471816, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2943854324734446, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9334292171769057, 0.6505922165820643, nan, nan, nan, 0.9866084798699354, nan, nan, nan, 0.3661379370782578, 0.8176378772112383, 0.0, nan, 0.8780901116427432, 0.9024282343688557, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9550878124896252, 0.9886869541442478, 0.9949286139202855, 0.960388438384681, 0.26310432569974557, 0.993942099222038, 0.992892572124347, nan, 0.8469184091731076, nan, 0.9987658827414785, 0.9815648134707504, 0.9381504608980077, 0.829951108432121, 0.7803368042974842, 0.9788129373195674, 0.9989258730704511, 0.8566159520516367, nan, 0.8723045125732002, 0.9191685912240185, nan, 0.9721775372916057, nan, nan, nan, nan, 0.9550705865421099, nan, nan, nan, nan, nan, 0.9037607891491986, nan, nan, 0.6806905799530674, 0.7188918695831614, nan, nan, nan, 0.0, nan, 0.35555555555555557, nan, nan, nan, 0.9704963272594457, nan, nan, 0.9936062552814571, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.674937965260546, nan, 0.7754659117716443, 0.980504966082917, nan, nan, 0.9101433296582139, nan, 0.8468718967229394, nan, nan, 0.8721071863580999, 0.8546191717347235, nan, nan, 0.8973098673431636, nan, nan, nan, nan, nan, nan, 0.9753807636136032, 0.0, nan, 0.9835824012623567, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.30407523510971785, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9574288724973656, 0.6811337466784765, nan, nan, nan, 0.9883421909823419, nan, nan, nan, 0.3676198856137263, 0.8247179218053005, nan, nan, 0.8862173038229376, 0.9253061841641046, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0596 | 26.0 | 520 | 0.1515 | 0.6627 | 0.7697 | 0.9535 | [0.8990218651104814, 0.9810023280206708, 0.9653286574400126, 0.9288986377117224, 0.24670202638378894, 0.9696523092890503, 0.9849194989381905, nan, 0.7819721871049304, nan, 0.831709400471978, 0.8830855503390714, 0.9178432327166505, 0.7569215758019271, 0.8105184863792729, 0.875969509078383, 0.9748540824263584, 0.6173755357731618, nan, 0.8045509293922328, 0.4954308093994778, nan, 0.9284713955605349, nan, 0.0, nan, nan, 0.9218677180033001, nan, nan, nan, nan, nan, 0.689629745842485, nan, nan, 0.6222591625108842, 0.7164785436968092, 0.0, nan, nan, 0.0, nan, 0.296, nan, nan, nan, 0.8885306997111712, nan, nan, 0.9915889929032128, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3549807374793616, nan, 0.7381344979722804, 0.8912864090454202, nan, nan, 0.8024056749254652, 0.0, 0.850494854820738, nan, nan, 0.8338621687820085, 0.8300028960324356, nan, nan, 0.883858164917228, nan, nan, nan, nan, nan, nan, 0.8001721170395869, 0.0, nan, 0.9772476979582894, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.31045655375552283, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9208454206382097, 0.5389380530973451, nan, nan, nan, 0.9858889934148636, nan, nan, nan, 0.36202066188058135, 0.833419689119171, 0.0, nan, 0.8467241033860949, 0.9189255669798789, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9553998871219415, 0.9890893333402117, 0.9943271201004693, 0.9722217095865737, 0.26375863322428206, 0.992921821196276, 0.9937279072730791, nan, 0.828553440585107, nan, 0.9985399176096366, 0.9829438234867179, 0.9343096441669144, 0.8312906034425022, 0.8239890357441043, 0.964329402554191, 0.999163215564799, 0.8633010603964961, nan, 0.8513606613847744, 0.8764434180138568, nan, 0.9695451886516525, nan, nan, nan, nan, 0.9575695725556385, nan, nan, nan, nan, nan, 0.9033497739416358, nan, nan, 0.6588166275561516, 0.7187886917044986, nan, nan, nan, 0.0, nan, 0.296, nan, nan, nan, 0.9675540765391015, nan, nan, 0.9986255485079566, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.533498759305211, nan, 0.7814578910120311, 0.9867052809578417, nan, nan, 0.860529217199559, nan, 0.8590532936113869, nan, nan, 0.8688301142625138, 0.8300028960324356, nan, nan, 0.8945087468949844, nan, nan, nan, nan, nan, nan, 0.9699561861047361, 0.0, nan, 0.9832401262356708, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3304075235109718, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9365648050579557, 0.5394154118689105, nan, nan, nan, 0.9881707526144351, nan, nan, nan, 0.3638363396392433, 0.8441353975334558, nan, nan, 0.8503018108651912, 0.9414344035078878, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0373 | 27.0 | 540 | 0.1534 | 0.6515 | 0.7773 | 0.9525 | [0.8965442948028729, 0.9815482100996038, 0.9656944596078633, 0.9268953324908056, 0.2605733558178752, 0.9681430969801231, 0.9855979522088297, nan, 0.7766338236596132, nan, 0.8436343340579284, 0.8865554465161923, 0.9174874468906914, 0.7758433734939759, 0.7340610270404366, 0.8995186114312834, 0.974827587230619, 0.6112903225806452, nan, 0.809461791423827, 0.5012360939431397, nan, 0.926875196150225, nan, 0.0, nan, nan, 0.9245910694334601, nan, nan, nan, nan, nan, 0.6829161997260615, nan, nan, 0.623784473138849, 0.7161630448759162, 0.0, nan, nan, 0.0, 0.0, 0.3431111111111111, nan, nan, nan, 0.8808033577556055, nan, nan, 0.9882696164833161, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.20992951271835733, nan, 0.754056476916181, 0.8854859481678811, nan, nan, 0.824291742013261, 0.0, 0.8544051278697102, nan, nan, 0.8231600100477267, 0.8297132927888793, nan, nan, 0.8852645749197473, nan, nan, nan, nan, nan, nan, 0.8034321372854915, 0.0, nan, 0.9770795271824013, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3830478440637921, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9215442092154421, 0.5933970460469157, nan, nan, nan, 0.9851231190150479, nan, nan, nan, 0.3543740178103719, 0.8534126163391934, 0.0, nan, 0.884470073573275, 0.9072352565048448, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9578477916846497, 0.9892802055229124, 0.9939140062132329, 0.9637103173870838, 0.2808433296982915, 0.9928580538196659, 0.9921657833508738, nan, 0.8225121562428838, nan, 0.998383480210669, 0.9835244592829148, 0.9417682624640697, 0.8625678119349005, 0.7458214527639062, 0.9737485932377551, 0.9991449584498492, 0.8736745043798986, nan, 0.8563899414398898, 0.9364896073903002, nan, 0.9717753729160573, nan, nan, nan, nan, 0.96364040768798, nan, nan, nan, nan, nan, 0.901685162351007, nan, nan, 0.6558833389205497, 0.7183243912505158, nan, nan, nan, 0.0, nan, 0.3431111111111111, nan, nan, nan, 0.9708818635607321, nan, nan, 0.9932702782500687, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5665839536807279, nan, 0.793724935126209, 0.9755680382514723, nan, nan, 0.9046306504961411, nan, 0.864812975835816, nan, nan, 0.8553158169479729, 0.8297132927888793, nan, nan, 0.9036520268484752, nan, nan, nan, nan, nan, nan, 0.9670352597538077, 0.0, nan, 0.9835243885459692, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.40658307210031347, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9357218124341412, 0.6049601417183348, nan, nan, nan, 0.9876564375107149, nan, nan, nan, 0.3571491421029476, 0.8661768564681186, nan, nan, 0.8949698189134809, 0.9296406431127463, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0385 | 28.0 | 560 | 0.1548 | 0.6607 | 0.7661 | 0.9524 | [0.8970703868628847, 0.980633622368448, 0.9635848024491128, 0.9232820570401976, 0.2255383976178935, 0.9688394078865531, 0.9854110411041104, nan, 0.7475496116245661, nan, 0.8497019186674359, 0.881279129984464, 0.9239615056645145, 0.779497559233242, 0.782851904221763, 0.9113117118793914, 0.9753082385097597, 0.6455189965460826, nan, 0.8005810537311484, 0.5100207325501037, nan, 0.9286014721345952, nan, 0.0, nan, nan, 0.9258318163430421, nan, nan, nan, nan, nan, 0.6797519697888014, nan, nan, 0.6272930291912586, 0.7125848590824934, 0.0, nan, nan, 0.0, nan, 0.3552397868561279, nan, nan, nan, 0.8880132647738281, nan, nan, 0.9867851216254503, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1975133214920071, nan, 0.7174610345606506, 0.8870913913333685, nan, nan, 0.8105869279725225, 0.0, 0.8285695555264193, nan, nan, 0.8382646830243144, 0.8169707500724008, nan, nan, 0.883779133427186, nan, nan, nan, nan, nan, nan, 0.8160127253446448, 0.0, nan, 0.9757095062298108, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3934997098084736, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9184308841843088, 0.5150442477876106, nan, nan, nan, 0.9847889249700906, nan, nan, nan, 0.31959755030621173, 0.828838174273859, 0.0, nan, 0.8817322050771528, 0.8872150839363954, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9595276827905227, 0.9871341831444387, 0.9955036684513187, 0.9711993090491675, 0.23678662304616502, 0.9933044254559368, 0.9920194817146551, nan, 0.7876843529396006, nan, 0.9984008621438877, 0.9880969661779648, 0.9397363465160076, 0.8769673832964973, 0.7936797996187716, 0.9583109066888487, 0.9989045731030097, 0.8185799907791609, nan, 0.8448157078883913, 0.8521939953810623, nan, 0.9685946183094472, nan, nan, nan, nan, 0.958210673679528, nan, nan, nan, nan, nan, 0.8988902589395807, nan, nan, 0.6591518605430774, 0.7147905489063144, nan, nan, nan, 0.0, nan, 0.35555555555555557, nan, nan, nan, 0.9671888316220932, nan, nan, 0.9928833956078639, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4598842018196857, nan, 0.7492804906817646, 0.9816128593364303, nan, nan, 0.8846747519294377, nan, 0.8367428003972195, nan, nan, 0.8758482686619106, 0.8169707500724008, nan, nan, 0.8958300301252576, nan, nan, nan, nan, nan, nan, 0.9632797830168996, 0.0, nan, 0.9812792964217757, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4250783699059561, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9325605900948367, 0.5155004428697962, nan, nan, nan, 0.9878278758786216, nan, nan, nan, 0.32142542894852616, 0.8386250327997901, nan, nan, 0.891046277665996, 0.908321153167683, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0412 | 29.0 | 580 | 0.1592 | 0.6500 | 0.7727 | 0.9515 | [0.8949229686819773, 0.9812312658519714, 0.9639549693118036, 0.9258280298666742, 0.24209730621220452, 0.9663563004345127, 0.9845830907075905, nan, 0.7593595686788269, nan, 0.8658385842202057, 0.8839937434827946, 0.930771307271344, 0.769801126566272, 0.7257131334439808, 0.9009547359021911, 0.9755798884806981, 0.6496658459373901, nan, 0.803027362701339, 0.499683343888537, nan, 0.9274151253452676, nan, 0.0, nan, nan, 0.9155771149934943, nan, nan, nan, nan, nan, 0.6819490035625944, nan, nan, 0.6085852930197835, 0.7122244798230498, 0.0, nan, nan, 0.0, 0.0, 0.34044444444444444, nan, nan, nan, 0.8760496898955341, nan, nan, 0.9832898119471266, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.19413012729844414, nan, 0.7086617740327987, 0.8856046335580867, nan, nan, 0.8265838011226945, 0.0, 0.8471458496988741, nan, nan, 0.8152432039784315, 0.8349261511728931, nan, nan, 0.8866524874202417, nan, nan, nan, nan, nan, nan, 0.826296098343132, 0.0, nan, 0.9758325064038954, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4595292766934558, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9329179646936656, 0.6211072664359861, nan, nan, nan, 0.9847837237134552, nan, nan, nan, 0.27795275590551183, 0.8326838139776566, 0.0, nan, 0.8979992153785799, 0.8739350962722214, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9600057545676881, 0.9879028305828824, 0.9955515896622381, 0.9670395783428253, 0.25612504543802256, 0.9927305190664456, 0.989494598637979, nan, 0.8074826195866208, nan, 0.998383480210669, 0.9844679924517347, 0.9398354643671325, 0.8969928337016945, 0.7370311441579105, 0.9627391495816412, 0.998749387625936, 0.8515444905486399, nan, 0.855253186358939, 0.9110854503464203, nan, 0.9697645510383153, nan, nan, nan, nan, 0.9482801481074433, nan, nan, nan, nan, nan, 0.9008425811755035, nan, nan, 0.6832048273550118, 0.7143004539826661, nan, nan, nan, 0.0, nan, 0.34044444444444444, nan, nan, nan, 0.9716326447790268, nan, nan, 0.9891061992852852, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.45409429280397023, nan, 0.7380514272234018, 0.9777643880347529, nan, nan, 0.9091510474090408, nan, 0.8567361800728236, nan, nan, 0.8462386172495795, 0.8349261511728931, nan, nan, 0.9033349188732097, nan, nan, nan, nan, nan, nan, 0.9676611725432923, 0.0, nan, 0.9812444887919431, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5018808777429468, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9466807165437302, 0.6359610274579274, nan, nan, nan, 0.9874849991428082, nan, nan, nan, 0.27954245490541135, 0.8409866176856469, nan, nan, 0.9211267605633803, 0.8944609646691195, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.042 | 30.0 | 600 | 0.1653 | 0.6589 | 0.7658 | 0.9500 | [0.889510131590396, 0.9806404031712863, 0.9635016074959992, 0.92723304776501, 0.23933324187131294, 0.9666390899689762, 0.9854054611218351, nan, 0.7471153967672223, nan, 0.8652111721749605, 0.8867350314257361, 0.925598589825197, 0.7766966966966967, 0.6394694596405917, 0.9097214113910749, 0.975578001177107, 0.646199339015481, nan, 0.8052929448839153, 0.49779179810725555, nan, 0.9281004489337823, nan, 0.0, nan, nan, 0.915954919896902, nan, nan, nan, nan, nan, 0.6781977830364927, nan, nan, 0.6012451598208185, 0.7131027388453131, 0.0, nan, nan, 0.0, nan, 0.2951111111111111, nan, nan, nan, 0.8805753010938824, nan, nan, 0.9842689354469949, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.21606285464862504, nan, 0.717439293598234, 0.8869279776926743, nan, nan, 0.8349117735021434, 0.0, 0.8309591019497145, nan, nan, 0.8200379337275466, 0.8268172603533159, nan, nan, 0.8870607774951991, nan, nan, nan, nan, nan, nan, 0.8271494826971102, 0.0, nan, 0.9740183990621445, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.39105504587155965, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.927675585284281, 0.5916230366492147, nan, nan, nan, 0.9828449197860962, nan, nan, nan, 0.32187226596675417, 0.8244155844155844, 0.0, nan, 0.9002923976608187, 0.8799369644440067, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9616303132919447, 0.9877480693536655, 0.9958969528719678, 0.9691471260210164, 0.2536532170119956, 0.993431960209157, 0.9925905300366697, nan, 0.7918904799539201, nan, 0.9988180285411343, 0.9727826970532734, 0.9368371493706017, 0.8661174737124104, 0.6463397344003529, 0.9530997700249547, 0.9986702734611534, 0.8563854310742277, nan, 0.8626593179469514, 0.9110854503464203, nan, 0.9674612459783563, nan, nan, nan, nan, 0.9485156546427497, nan, nan, nan, nan, nan, 0.895232223592273, nan, nan, 0.6636775058665773, 0.7152548493602972, nan, nan, nan, 0.0, nan, 0.2951111111111111, nan, nan, nan, 0.970273122032385, nan, nan, 0.9905620997546349, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4094292803970223, nan, 0.7513564519933946, 0.9768119885712064, nan, nan, 0.9233737596471885, nan, 0.8380006620324396, nan, nan, 0.8526187576126675, 0.8268172603533159, nan, nan, 0.9032820675439988, nan, nan, nan, nan, nan, nan, 0.9674525349467974, 0.0, nan, 0.9784540771337077, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.42758620689655175, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.935300316122234, 0.6005314437555359, nan, nan, nan, 0.984656266072347, nan, nan, nan, 0.3237131544214694, 0.8328522697454737, nan, nan, 0.9292756539235413, 0.9005594476084875, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0461 | 31.0 | 620 | 0.1564 | 0.6615 | 0.7732 | 0.9528 | [0.8993040452370596, 0.9799249617386765, 0.9628634632124518, 0.9267651783529612, 0.22411308584368353, 0.9725827621408779, 0.9850336882674806, nan, 0.8307365702864095, nan, 0.861146916874007, 0.8849632931305715, 0.9243775730248971, 0.7685376841592597, 0.7254391672088484, 0.889300633337825, 0.9756300597282055, 0.6190864600326265, nan, 0.8099731320368475, 0.4732868757259001, nan, 0.9268402934733475, nan, 0.0, nan, nan, 0.9235464424028977, nan, nan, nan, nan, nan, 0.6849405237669848, nan, nan, 0.6065027755749405, 0.7188705446690326, 0.0, nan, nan, 0.0, nan, 0.25866666666666666, nan, nan, nan, 0.8851305040475705, nan, nan, 0.9866937850363264, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.19426523297491038, nan, 0.7057401271603833, 0.8850865118224112, nan, nan, 0.8275247524752475, 0.0, 0.8420293654955427, nan, nan, 0.8259743154024013, 0.8420900692840647, nan, nan, 0.8843906510851419, nan, nan, nan, nan, nan, nan, 0.8115512127028441, 0.0, nan, 0.9742064752314026, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.45501432664756447, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9349492438367516, 0.5899218071242398, nan, nan, nan, 0.9848173808912839, nan, nan, nan, 0.3248993171073367, 0.843579766536965, 0.0, nan, 0.8931471578530402, 0.8747723357125277, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9563814835718158, 0.9876087842473703, 0.9961729129486416, 0.9646182976477405, 0.23744093057070156, 0.9922841474301747, 0.9914861886536002, nan, 0.8963470992458441, nan, 0.9986615911421668, 0.9798954855566846, 0.9347308950341957, 0.8454892505525416, 0.7377242867720033, 0.9687576454469834, 0.9985424736565045, 0.8748271092669433, nan, 0.8723045125732002, 0.941108545034642, nan, 0.974517402749342, nan, nan, nan, nan, 0.9640983370621868, nan, nan, nan, nan, nan, 0.898150431565968, nan, nan, 0.6409654710023466, 0.7210586050350805, nan, nan, nan, 0.0, nan, 0.25866666666666666, nan, nan, nan, 0.9695629235826468, nan, nan, 0.9927714032640678, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.44830438378825477, nan, 0.7436659589525831, 0.987307819393963, nan, nan, 0.9214994487320838, nan, 0.850446871896723, nan, nan, 0.8598689171161765, 0.8447726614538082, nan, nan, 0.8959357327836796, nan, nan, nan, nan, nan, nan, 0.9703734612977258, 0.0, nan, 0.9787673458021998, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.49780564263322885, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9511064278187565, 0.6014171833480957, nan, nan, nan, 0.9869278244471112, nan, nan, nan, 0.32652881654201493, 0.853319338756232, nan, nan, 0.9073440643863179, 0.8956705811199033, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0259 | 32.0 | 640 | 0.1623 | 0.6596 | 0.7696 | 0.9512 | [0.8944007327779179, 0.9808424017411344, 0.9630997312769063, 0.9274842410935619, 0.24440502183406113, 0.9683950924314018, 0.9847604681014532, nan, 0.7880212358881441, nan, 0.8687867652618367, 0.8931646575705391, 0.8979092107741571, 0.7675816834689664, 0.6775171384601035, 0.8963206904383375, 0.9755262524406879, 0.6287012113055181, nan, 0.7903252326613831, 0.46532951289398283, nan, 0.9269515230116724, nan, 0.0, nan, nan, 0.9199645187860356, nan, nan, nan, nan, nan, 0.6803676482062325, nan, nan, 0.5943527233451628, 0.7178253486337673, 0.0, nan, nan, 0.0, nan, 0.28355555555555556, nan, nan, nan, 0.8874150039014602, nan, nan, 0.9811744701524261, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.17035398230088494, nan, 0.7042556047791649, 0.8954655218674538, nan, nan, 0.8214285714285714, 0.0, 0.8699440759526597, nan, nan, 0.8211678832116789, 0.8506212077434268, nan, nan, 0.8782017773131208, nan, nan, nan, nan, nan, nan, 0.8210433244916003, 0.0, nan, 0.9740202757314735, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.42902389864670315, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9302422723475355, 0.6126516464471404, nan, nan, nan, 0.9837102911625123, nan, nan, nan, 0.3375010929439538, 0.8256357031655422, 0.0, nan, 0.8806445195941913, 0.8865639308748954, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9617210583977955, 0.9880679092273803, 0.9955515896622381, 0.9666778626292303, 0.2604143947655398, 0.9932194022871232, 0.992043078752755, nan, 0.8330944503234967, nan, 0.9986268272757296, 0.9787342139642909, 0.9449895926256319, 0.8464938718103275, 0.6865892657414263, 0.96552820864119, 0.9988102446757688, 0.8614568925772246, nan, 0.8278677230451257, 0.9376443418013857, nan, 0.9668397192161451, nan, nan, nan, nan, 0.9498632753725582, nan, nan, nan, nan, nan, 0.896013152486642, nan, nan, 0.6456587328193094, 0.7196399092034668, nan, nan, nan, 0.0, nan, 0.28355555555555556, nan, nan, nan, 0.969238261434195, nan, nan, 0.9869783447531587, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.445822994210091, nan, 0.7425336164189668, 0.9730218274407666, nan, nan, 0.9052921719955899, nan, 0.8856669976828865, nan, nan, 0.8547938054637202, 0.8525919490298292, nan, nan, 0.8879023307436182, nan, nan, nan, nan, nan, nan, 0.9687043605257667, 0.0, nan, 0.9787441407156449, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4670846394984326, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9386722866174921, 0.6262178919397697, nan, nan, nan, 0.9861134921995542, nan, nan, nan, 0.3396392432908051, 0.8349514563106796, nan, nan, 0.8907444668008049, 0.9075651428859433, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.034 | 33.0 | 660 | 0.1697 | 0.6459 | 0.7693 | 0.9489 | [0.8862089918661721, 0.9810743653456977, 0.9632795557915342, 0.9287623326035654, 0.2574994966780753, 0.9694387427173394, 0.9850265876455293, nan, 0.7578602786307002, nan, 0.8668095252467328, 0.8914878064987707, 0.9211702954898912, 0.7683416474023604, 0.5627054518389878, 0.9046869295363271, 0.9754600862373816, 0.6265401265401266, nan, 0.8117658417275448, 0.46395881006864986, nan, 0.9279955128654561, nan, 0.0, nan, nan, 0.9175251208585385, nan, nan, nan, nan, nan, 0.6831347705496722, nan, nan, 0.5811546017832855, 0.7140651536205033, 0.0, nan, nan, 0.0, 0.0, 0.32355555555555554, nan, nan, nan, 0.8817456074042663, nan, nan, 0.9826489007598062, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1712846347607053, nan, 0.7025004481089802, 0.8953998004561003, nan, nan, 0.8254698120751699, 0.0, 0.8874773139745916, nan, nan, 0.8217341620446501, 0.8556998556998557, nan, nan, 0.8831999168312714, nan, nan, nan, nan, nan, nan, 0.8264888888888889, 0.0, nan, 0.9746440818871454, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.35232860862019094, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9371476299853831, 0.6733167082294265, nan, nan, nan, 0.9834622452031965, nan, nan, nan, 0.3685175484546883, 0.79374185136897, 0.0, nan, 0.8922469135802469, 0.8546886537041595, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9619324280955701, 0.9878460847988362, 0.9953252032520326, 0.9645407871376845, 0.2789531079607416, 0.9938570760532245, 0.992250732688033, nan, 0.8001018043481173, nan, 0.9984182440771062, 0.9737262302220931, 0.9393398751115076, 0.8458910990556561, 0.5716851242143071, 0.9699809169643294, 0.9988315446432103, 0.8674504379898571, nan, 0.8688942473303479, 0.9364896073903002, nan, 0.9678268499561276, nan, nan, nan, nan, 0.9485810731247792, nan, nan, nan, nan, nan, 0.8908549116317304, nan, nan, 0.6445692256118002, 0.715796533223277, nan, nan, nan, 0.0, nan, 0.32355555555555554, nan, nan, nan, 0.970435453106611, nan, nan, 0.9888516712312031, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.44995864350703063, nan, 0.739655579146025, 0.9768314252949523, nan, nan, 0.9104740904079383, nan, 0.906454816285998, nan, nan, 0.8550258105678324, 0.8586736171445121, nan, nan, 0.8979969346229058, nan, nan, nan, nan, nan, nan, 0.9699561861047361, 0.0, nan, 0.9793822805959066, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.38181818181818183, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9458377239199157, 0.7174490699734278, nan, nan, nan, 0.9863706497514144, nan, nan, nan, 0.37140343158820943, 0.7987404880608764, nan, nan, 0.9088531187122736, 0.8751071014565798, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0269 | 34.0 | 680 | 0.1723 | 0.6601 | 0.7683 | 0.9478 | [0.8812703988142674, 0.9801313654118864, 0.9626128075608015, 0.9351206300737055, 0.2193976236529428, 0.9720141489804411, 0.9850751167545061, nan, 0.758910647323965, nan, 0.8611202734714684, 0.8851655888692925, 0.9180585814572743, 0.7616770871643986, 0.47605917434745565, 0.9080025637504006, 0.975459922132731, 0.6282072504552226, nan, 0.8120233288470166, 0.47214076246334313, nan, 0.9273534139352855, nan, 0.0, nan, nan, 0.9186200461497724, nan, nan, nan, nan, nan, 0.681517110865794, nan, nan, 0.5815338793745346, 0.7175093271581114, 0.0, nan, nan, 0.0, nan, 0.31377777777777777, nan, nan, nan, 0.8856318010215412, nan, nan, 0.9847837963618704, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.36535859269282817, nan, 0.7032455440276669, 0.889568479633021, nan, nan, 0.8237452886332077, 0.0, 0.9030980061050854, nan, nan, 0.8241491085899514, 0.8601499423298731, nan, nan, 0.8842099824325721, nan, nan, nan, nan, nan, nan, 0.8045125732001378, 0.0, nan, 0.9737367904371427, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3769075727037144, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9332362728785357, 0.6576955424726662, nan, nan, nan, 0.986626789147618, nan, nan, nan, 0.35135608048993877, 0.8118811881188119, 0.0, nan, 0.9000099000099, 0.8706964391107613, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9606708497947168, 0.9876448952008543, 0.9960440214158239, 0.9698151927981663, 0.23089785532533624, 0.9929643327806827, 0.9914767498383603, nan, 0.8051652311360562, nan, 0.9983313344110132, 0.9835244592829148, 0.9397611259787888, 0.832228249949769, 0.4841286094613967, 0.9704702255712678, 0.9987037448385615, 0.8748271092669433, nan, 0.8728901136755081, 0.9295612009237876, nan, 0.9702763966071951, nan, nan, nan, nan, 0.953186534259659, nan, nan, nan, nan, nan, 0.8881011097410604, nan, nan, 0.6545424069728462, 0.7193045810978126, nan, nan, nan, 0.0, nan, 0.31377777777777777, nan, nan, nan, 0.9710644860192362, nan, nan, 0.9910100691298195, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4466501240694789, nan, 0.7483368719037509, 0.9724581624521371, nan, nan, 0.9156560088202866, nan, 0.9205561072492552, nan, nan, 0.8553158169479729, 0.8638864755285259, nan, nan, 0.9044447967866391, nan, nan, nan, nan, nan, nan, 0.9745462132276236, 0.0, nan, 0.978222026268158, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4103448275862069, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.945626975763962, 0.6926483613817538, nan, nan, nan, 0.9897136979255957, nan, nan, nan, 0.3533655961284646, 0.8176331671477303, nan, nan, 0.9145875251509055, 0.8922433345093493, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0128 | 35.0 | 700 | 0.1675 | 0.6568 | 0.7616 | 0.9495 | [0.8889078716515174, 0.9808356928181209, 0.9627958098956627, 0.9324157321274915, 0.22162837509543973, 0.9702731877439176, 0.9843800419961745, nan, 0.7379540635593759, nan, 0.8493045393404579, 0.887016393442623, 0.9167230219302482, 0.7693876289933881, 0.659526213592233, 0.8993163172288058, 0.9754795762195575, 0.639704619611884, nan, 0.806705217334806, 0.5012658227848101, nan, 0.9302547096592112, nan, 0.0, nan, nan, 0.9151303685846602, nan, nan, nan, nan, nan, 0.660601946141661, nan, nan, 0.5654289564306034, 0.7149949836647544, 0.0, nan, nan, 0.0, nan, 0.26222222222222225, nan, nan, nan, 0.882623047559334, nan, nan, 0.9900061765271717, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34115281501340483, nan, 0.7111111111111111, 0.8653072224218469, nan, nan, 0.8269499301815281, 0.0, 0.8755949664210733, nan, nan, 0.814993006993007, 0.823953823953824, nan, nan, 0.8790444827766453, nan, nan, nan, nan, nan, nan, 0.8215167548500882, 0.0, nan, 0.9734145777983135, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2825138201920279, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9265195670274771, 0.6708860759493671, nan, nan, nan, 0.9867048563611491, nan, nan, nan, 0.27058307759754496, 0.8079729025534133, 0.0, nan, 0.8806461262339216, 0.8881672816728168, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9615694476721667, 0.9882433052871594, 0.9958738184942825, 0.9718969036396721, 0.2321337695383497, 0.9934957275857671, 0.9933833905168223, nan, 0.7785621475359329, nan, 0.9987311188750413, 0.9817825518943243, 0.9405292893250075, 0.8339026187127453, 0.6688353628759117, 0.9654548123501493, 0.9987980732658023, 0.8586906408483171, nan, 0.8553909748535997, 0.9145496535796767, nan, 0.9640611289850833, nan, nan, nan, nan, 0.9436746869725635, nan, nan, nan, nan, nan, 0.8998766954377312, nan, nan, 0.6330036875628562, 0.7169314898885679, nan, nan, nan, 0.0, nan, 0.26222222222222225, nan, nan, nan, 0.9711862343249057, nan, nan, 0.9954490383930117, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.42100909842845324, nan, 0.7503656522764803, 0.936130925771152, nan, nan, 0.9141124586549063, nan, 0.8890433631247932, nan, nan, 0.8449625891769619, 0.8268172603533159, nan, nan, 0.8888008033402041, nan, nan, nan, nan, nan, nan, 0.9718339244731901, 0.0, nan, 0.9777289181788648, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.30438871473354234, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9380400421496312, 0.704162976085031, nan, nan, nan, 0.9892422424138522, nan, nan, nan, 0.27153541575011, 0.813697192337969, nan, nan, 0.8885311871227364, 0.9098331737311628, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0262 | 36.0 | 720 | 0.1661 | 0.6472 | 0.7706 | 0.9495 | [0.889873921317715, 0.9811252212887447, 0.9640712138994878, 0.9346838109948147, 0.26672969140387603, 0.9692176941254509, 0.9843506575530885, nan, 0.7497218851132686, nan, 0.8482852332141908, 0.8934785496081817, 0.9143529072976411, 0.7562941833064616, 0.6354382920707712, 0.9092869597136827, 0.9753915454121367, 0.6468127823322737, nan, 0.8165051070197125, 0.4682814302191465, nan, 0.9295264231933009, nan, 0.0, nan, nan, 0.9112649146559032, nan, nan, nan, nan, nan, 0.6520744903449512, nan, nan, 0.58598355287097, 0.7181268493503152, 0.0, nan, nan, 0.0, 0.0, 0.2657777777777778, nan, nan, nan, 0.8823312838260742, nan, nan, 0.9899324440663203, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.30830248545742994, nan, 0.7205153178678945, 0.8577169838523645, nan, nan, 0.8308881064162754, 0.0, 0.8996231808731808, nan, nan, 0.81834466929773, 0.8382352941176471, nan, nan, 0.8842263602883967, nan, nan, nan, nan, nan, nan, 0.8089082969432314, 0.0, nan, 0.9742504633184571, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3546140452698781, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9301890712653231, 0.6994266994266994, nan, nan, nan, 0.986336464560205, nan, nan, nan, 0.3085814360770578, 0.8206663196251952, 0.0, nan, 0.8876973488233542, 0.8328825062705946, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9604174274869139, 0.9892183010312257, 0.994946790931324, 0.9693759665745153, 0.28716830243547803, 0.9938570760532245, 0.9929728020538862, nan, 0.7944221933478895, nan, 0.9983139524777945, 0.9764842502540282, 0.9441470908910695, 0.8168240573303864, 0.6461191890231415, 0.9696628663698195, 0.9988741445780933, 0.8911940986629784, nan, 0.8646572511195315, 0.9376443418013857, nan, 0.9658891488739397, nan, nan, nan, nan, 0.9422878151535372, nan, nan, nan, nan, nan, 0.8966091245376079, nan, nan, 0.6748240026818639, 0.7199236483697895, nan, nan, nan, 0.0, nan, 0.2657777777777778, nan, nan, nan, 0.9725660484558256, nan, nan, 0.9950926991172967, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.48221670802315963, nan, 0.7678697806086341, 0.9250519932360202, nan, nan, 0.9366041896361632, nan, 0.9167163190996359, nan, nan, 0.8478916536163795, 0.841876629018245, nan, nan, 0.9009566090587178, nan, nan, nan, nan, nan, nan, 0.9662007093678281, 0.0, nan, 0.9789471852230008, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3830721003134796, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9435194942044257, 0.7564216120460585, nan, nan, nan, 0.9900565746614092, nan, nan, nan, 0.310074791025077, 0.8273419050118079, nan, nan, 0.8993963782696177, 0.85353560808427, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0356 | 37.0 | 740 | 0.1642 | 0.6432 | 0.7651 | 0.9506 | [0.896953443001252, 0.9805451652023518, 0.9630383186158948, 0.9277041197170582, 0.23596737715029814, 0.9724778282050215, 0.9832588139754133, nan, 0.773141275632155, nan, 0.8382243495264641, 0.8859625668449198, 0.9221997514680441, 0.7426046628227626, 0.7515533724840014, 0.9024028816341418, 0.9755192172806924, 0.6463456875649463, nan, 0.8136504054575878, 0.4830917874396135, nan, 0.9302652874533812, nan, 0.0, nan, nan, 0.9180410806435293, nan, nan, nan, nan, nan, 0.6539480417833168, nan, nan, 0.601217192848992, 0.7116685964235173, 0.0, nan, nan, 0.0, 0.0, 0.25866666666666666, nan, nan, nan, 0.8768684706441993, nan, nan, 0.9892466591808038, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.16958151155527795, nan, 0.7073310930314821, 0.8640058862588159, nan, nan, 0.8416451420623883, 0.0, 0.8613090956646832, nan, nan, 0.8044300075606956, 0.8094410657399362, nan, nan, 0.8827668890742285, nan, nan, nan, nan, nan, nan, 0.8165121567255553, 0.022857142857142857, nan, 0.9737945371600162, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.32850940665701883, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.920250521920668, 0.6725738396624472, nan, nan, nan, 0.9857319834251783, nan, nan, nan, 0.343459989493959, 0.8249219562955254, 0.0, nan, 0.8754101620761658, 0.7493713327745181, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.960076580016157, 0.9885373516226715, 0.9956854385617027, 0.9633116919067954, 0.25030897855325335, 0.9929005654040726, 0.994261200334134, nan, 0.823235502926875, nan, 0.9984356260103249, 0.9619683553491073, 0.9378531073446328, 0.7935838189002746, 0.764079459348761, 0.9684151294221265, 0.9987585161834109, 0.8603042876901799, nan, 0.8709955218739235, 0.9237875288683602, nan, 0.9666569172272594, nan, nan, nan, nan, 0.9496670199264696, nan, nan, nan, nan, nan, 0.8928688861487875, nan, nan, 0.6623365739188736, 0.7134492364836978, nan, nan, nan, 0.0, nan, 0.25866666666666666, nan, nan, nan, 0.972505174302991, nan, nan, 0.9956119363476242, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4491315136476427, nan, 0.7547534795942439, 0.9357810647437268, nan, nan, 0.9340683572216097, nan, 0.8720291294273419, nan, nan, 0.833101328229221, 0.8094410657399362, nan, nan, 0.8950372601870937, nan, nan, nan, nan, nan, nan, 0.9739203004381389, 0.022857142857142857, nan, 0.9782800389845454, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3557993730407524, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9289778714436249, 0.7059344552701505, nan, nan, nan, 0.9889850848619921, nan, nan, nan, 0.34518257809062913, 0.8320650747835214, nan, nan, 0.8857142857142857, 0.7659896174587975, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0307 | 38.0 | 760 | 0.1636 | 0.6452 | 0.7736 | 0.9498 | [0.8948510915810068, 0.9815781787249742, 0.9637981027728704, 0.9280973608370363, 0.2652577389419495, 0.9679604009609809, 0.9838531732675119, nan, 0.7715887328945199, nan, 0.8373798795514531, 0.8888668517783949, 0.9270299301791904, 0.7738684258750229, 0.7186041824918226, 0.8846000266205244, 0.9754878360466706, 0.6260618688892451, nan, 0.812523973916379, 0.46681792399319344, nan, 0.9295710427554277, nan, 0.0, nan, nan, 0.9120556238047595, nan, nan, nan, nan, nan, 0.6527505273872571, nan, nan, 0.594208211143695, 0.7142305812128541, 0.0, nan, nan, 0.0, 0.0, 0.3022222222222222, nan, nan, nan, 0.8742666618081114, nan, nan, 0.9856163136493293, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.21177985194721596, nan, 0.7149239586623946, 0.8451480556546557, nan, nan, 0.8394029850746269, 0.0, 0.8522994512673112, nan, nan, 0.8180678589336453, 0.8389708008094825, nan, nan, 0.8775051017738476, nan, nan, nan, nan, nan, nan, 0.8028556683296061, 0.0, nan, 0.9740069992261582, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.31836854460093894, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9255008347245409, 0.6974858069748581, nan, nan, nan, 0.9862066020412521, nan, nan, nan, 0.3901693731447529, 0.839119170984456, 0.0, nan, 0.8856772650254161, 0.7465483234714004, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9566448657083099, 0.9881710833801916, 0.9952558001189769, 0.9651793261014797, 0.2846964740094511, 0.993431960209157, 0.9932370888806037, nan, 0.8149572019878638, nan, 0.9981575150788271, 0.9758310349833067, 0.9409505401922886, 0.8485031143258991, 0.7302414971880464, 0.9755835005137741, 0.9987798161508524, 0.9004149377593361, nan, 0.8756114364450568, 0.9503464203233256, nan, 0.9689602222872185, nan, nan, nan, nan, 0.9422223966715076, nan, nan, nan, nan, nan, 0.9093300452116728, nan, nan, 0.6792658397586322, 0.7160544779199339, nan, nan, nan, 0.0, nan, 0.3022222222222222, nan, nan, nan, 0.9736820745911287, nan, nan, 0.9920485435904746, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.544251447477254, nan, 0.7474404340646379, 0.9208925343544092, nan, nan, 0.9300992282249173, nan, 0.8637537239324727, nan, nan, 0.8516617365582043, 0.8404286128004633, nan, nan, 0.8863167908672903, nan, nan, nan, nan, nan, nan, 0.973711662841644, 0.0, nan, 0.978448275862069, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3401253918495298, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.934668071654373, 0.7617360496014172, nan, nan, nan, 0.9897994171095491, nan, nan, nan, 0.3932248130224373, 0.8499081605877722, nan, nan, 0.893963782696177, 0.7630663777027368, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0289 | 39.0 | 780 | 0.1727 | 0.6429 | 0.7686 | 0.9487 | [0.8901535255439609, 0.9812361126754795, 0.9631441184888608, 0.9257805290133886, 0.2623762376237624, 0.9690012432656444, 0.9852000037480205, nan, 0.7533068699192147, nan, 0.8349774090915695, 0.8893374809716064, 0.923285811269518, 0.7751598173515982, 0.6592018597442851, 0.9052694308831282, 0.975482985191723, 0.6150901260169086, nan, 0.8143871825641358, 0.481216457960644, nan, 0.9277797264117853, nan, 0.0, nan, nan, 0.9145941704319651, nan, nan, nan, nan, nan, 0.6692143214594423, nan, nan, 0.5765159534250718, 0.7104369306777829, 0.0, nan, nan, 0.0, 0.0, 0.264, nan, nan, nan, 0.874477750816442, nan, nan, 0.9857435045867844, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.16191014410850524, nan, 0.7132467764243966, 0.8625496234845922, nan, nan, 0.8386290967226219, 0.0, 0.8770475755400379, nan, nan, 0.8010532802958149, 0.8216044019693021, nan, nan, 0.8759078321751398, nan, nan, nan, nan, nan, nan, 0.8085325993462928, 0.0, nan, 0.9722308892355694, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3481159420289855, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9220289247537204, 0.6998368678629691, nan, nan, nan, 0.9859589041095891, nan, nan, nan, 0.37161808343515446, 0.8243383497664764, 0.0, nan, 0.8927479033053775, 0.7693975072663678, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9595774819339774, 0.9887024302671694, 0.995371472007403, 0.9626879168496776, 0.2812795347146492, 0.9940058665986481, 0.9924300701775913, nan, 0.7956679570814301, nan, 0.9990092298065391, 0.9752503991871099, 0.9436019427098821, 0.8527225236085996, 0.6700641156918036, 0.9662866369819445, 0.9988193732332438, 0.8888888888888888, nan, 0.864967275232518, 0.9318706697459584, nan, 0.9670590816028078, nan, nan, nan, nan, 0.945467153380173, nan, nan, nan, nan, nan, 0.906905055487053, nan, nan, 0.6390378813275226, 0.7121595130004127, nan, nan, nan, 0.0, nan, 0.264, nan, nan, nan, 0.9725863398401039, nan, nan, 0.991172967084432, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4739454094292804, nan, 0.7542344892663364, 0.9417481389337014, nan, nan, 0.9253583241455348, nan, 0.8897053955643827, nan, nan, 0.8293022446493823, 0.8216044019693021, nan, nan, 0.8859996828920247, nan, nan, nan, nan, nan, nan, 0.9805967035259754, 0.0, nan, 0.9761451710214879, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.37648902821316615, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9270811380400421, 0.7599645704162976, nan, nan, nan, 0.9871421224069947, nan, nan, nan, 0.37465904091509017, 0.8336394647074259, nan, nan, 0.9102615694164989, 0.7871579053475127, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0446 | 40.0 | 800 | 0.1747 | 0.6421 | 0.7689 | 0.9492 | [0.8913463052802886, 0.980630171897629, 0.9621990877843889, 0.9256938926098308, 0.235456299972429, 0.9698975146259491, 0.9844503042026331, nan, 0.7866969613742478, nan, 0.8427813132349923, 0.8852620517535794, 0.9116218685387315, 0.761605946143536, 0.6474760116812682, 0.9097513013302487, 0.9755130548496872, 0.5998162890385793, nan, 0.8170524388649143, 0.47463556851311955, nan, 0.9285239012432148, nan, 0.0, nan, nan, 0.9173168890007551, nan, nan, nan, nan, nan, 0.6659197268334324, nan, nan, 0.5805581251927228, 0.7130945027794935, 0.0, nan, nan, 0.0, 0.0, 0.2853333333333333, nan, nan, nan, 0.8720778040356298, nan, nan, 0.9851946368534046, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.15611931976582102, nan, 0.6966181850629071, 0.8644001493625421, nan, nan, 0.836204321417903, 0.0, 0.8796320459290188, nan, nan, 0.7934455718260772, 0.819047619047619, nan, nan, 0.8798858032701791, nan, nan, nan, nan, nan, nan, 0.8000683526999316, 0.0, nan, 0.9716564098119095, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3245997088791849, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9212400502723084, 0.7063106796116505, nan, nan, nan, 0.9855004277159966, nan, nan, nan, 0.3284416491963662, 0.822886866059818, 0.0, nan, 0.8936880337685286, 0.8408655028276371, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.958274957670728, 0.9885218754997498, 0.995949831449534, 0.963130834049998, 0.24834605597964376, 0.9937295413000042, 0.9919675682308357, nan, 0.8441187895998821, nan, 0.9990439936729764, 0.9782987371171433, 0.9395876697393201, 0.8372513562386981, 0.660123820476063, 0.9620785829622743, 0.9987433019209527, 0.9031811894882434, nan, 0.8609025146400275, 0.9399538106235565, nan, 0.9693623866627669, nan, nan, nan, nan, 0.9536706310266777, nan, nan, nan, nan, nan, 0.8977599671187834, nan, nan, 0.6311599061347637, 0.7147131654973173, nan, nan, nan, 0.0, nan, 0.2853333333333333, nan, nan, nan, 0.9734385779797898, nan, nan, 0.9904806507773287, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46319272125723737, nan, 0.7366831799952819, 0.9448774514567825, nan, nan, 0.9259095920617421, nan, 0.8926183382985766, nan, nan, 0.818688011136245, 0.8218940052128584, nan, nan, 0.8958828814544686, nan, nan, nan, nan, nan, nan, 0.9768412267890674, 0.0, nan, 0.9754838260546712, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3495297805642633, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9268703898840885, 0.7732506643046945, nan, nan, nan, 0.9875278587347849, nan, nan, nan, 0.3308402991641003, 0.8302282865389662, nan, nan, 0.9158953722334005, 0.8618013204979588, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0192 | 41.0 | 820 | 0.1705 | 0.6545 | 0.7673 | 0.9498 | [0.8938108109779859, 0.9808216934439571, 0.9628674813562248, 0.9298363237766852, 0.23947043490190698, 0.9726871513031893, 0.9839861481585475, nan, 0.7479683029950125, nan, 0.8367296439696079, 0.8854159826646529, 0.9108562398237716, 0.7615840850117386, 0.741993545183714, 0.9083137471772893, 0.9754668314889874, 0.6234369990381533, nan, 0.8121299355556283, 0.4582164890633763, nan, 0.9267770682759104, nan, 0.0, nan, nan, 0.9145897543218802, nan, nan, nan, nan, nan, 0.6764796690234339, nan, nan, 0.579228883441052, 0.7123481572987441, 0.0, nan, nan, 0.0, nan, 0.30133333333333334, nan, nan, nan, 0.8729532057346627, nan, nan, 0.9861338839294751, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.2286036036036036, nan, 0.7065013823240881, 0.8864303616183316, nan, nan, 0.8358032009484292, 0.0, 0.8783299680844134, nan, nan, 0.794371808540486, 0.8353552859618717, nan, nan, 0.8779051589369463, nan, nan, nan, nan, nan, nan, 0.8080441912653202, 0.0, nan, 0.9733141606463275, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2326061320754717, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9189698492462312, 0.7008064516129032, nan, nan, nan, 0.9859589041095891, nan, nan, nan, 0.3407362070473026, 0.8274072151570205, 0.0, nan, 0.8974029821269872, 0.7084032783647675, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9594734570565386, 0.9885631451608743, 0.9955135831846124, 0.967475113589807, 0.25379861868411485, 0.9931556349105131, 0.9923404014328121, nan, 0.7915020159939989, nan, 0.9992004310719438, 0.9786616344897663, 0.9426355436614134, 0.8255977496483825, 0.7533199955890925, 0.9644027988452317, 0.9988711017256017, 0.8964960811433841, nan, 0.8551842921116087, 0.9434180138568129, nan, 0.9657429072828312, nan, nan, nan, nan, 0.943452264133663, nan, nan, nan, nan, nan, 0.9005548705302097, nan, nan, 0.6534528997653369, 0.713965125877012, nan, nan, nan, 0.0, nan, 0.30133333333333334, nan, nan, nan, 0.9736009090540156, nan, nan, 0.9919670946131682, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5037220843672456, nan, 0.7475347959424392, 0.9624288129992808, nan, nan, 0.9327453142227122, nan, 0.8927507447864945, nan, nan, 0.82109506409141, 0.8375325803649001, nan, nan, 0.8903863432165319, nan, nan, nan, nan, nan, nan, 0.9766325891925725, 0.0, nan, 0.9777579245370586, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.24733542319749216, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9249736564805058, 0.7697077059344553, nan, nan, nan, 0.9871421224069947, nan, nan, nan, 0.3428948526176859, 0.8365258462345841, nan, nan, 0.9142857142857143, 0.7231490348268736, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0205 | 42.0 | 840 | 0.1683 | 0.6439 | 0.7689 | 0.9508 | [0.8956305369945103, 0.9794809125957438, 0.9628582382823733, 0.9295548742406178, 0.23611397185032612, 0.9693321452993224, 0.9841788163861924, nan, 0.7758326645264848, nan, 0.8305831695757716, 0.8744544287548138, 0.91282918579367, 0.7509580912350106, 0.7465264855476988, 0.9033687335845609, 0.9756013459686337, 0.6346378018318068, nan, 0.8163205899476034, 0.4824242424242424, nan, 0.9294739421091316, nan, 0.0, nan, nan, 0.9181977911646586, nan, nan, nan, nan, nan, 0.6857200691341109, nan, nan, 0.5973088034058082, 0.7148555106662207, 0.0, nan, nan, 0.0, 0.0, 0.25333333333333335, nan, nan, nan, 0.8811810154525387, nan, nan, 0.9876549456109284, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.232212389380531, nan, 0.6997873659962789, 0.8843194901677728, nan, nan, 0.8338404161350476, 0.0, 0.866992824527071, nan, nan, 0.8078097425698199, 0.8370818915801614, nan, nan, 0.8831175610774692, nan, nan, nan, nan, nan, nan, 0.8185328185328186, 0.005714285714285714, nan, 0.9733795493934142, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.23076923076923078, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9274937133277452, 0.6924979389942292, nan, nan, nan, 0.9864743397680092, nan, nan, nan, 0.3556254917387884, 0.8312159709618875, 0.0, nan, 0.8835453100158982, 0.7490136121522983, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.955996370195766, 0.9882071943336755, 0.9959845330160618, 0.9668734843927052, 0.2500181752090149, 0.9943247034816988, 0.9914106781316809, nan, 0.8287409749105863, nan, 0.9991656672055066, 0.9888227609232109, 0.9362176628010704, 0.8136762440559909, 0.7583925392649538, 0.9677056319420658, 0.9986763591661367, 0.8785154449054864, nan, 0.8694109541853255, 0.9191685912240185, nan, 0.9708979233694063, nan, nan, nan, nan, 0.9572293964490848, nan, nan, nan, nan, nan, 0.9050349362926429, nan, nan, 0.6584813945692256, 0.7165703673132481, nan, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.9719775983117568, nan, nan, 0.9937284287474165, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5425971877584781, nan, 0.7453172918141071, 0.9763455072013062, nan, nan, 0.9367144432194047, nan, 0.8799073154584575, nan, nan, 0.8363203990487791, 0.8407182160440196, nan, nan, 0.894085936261297, nan, nan, nan, nan, nan, nan, 0.9730857500521594, 0.005714285714285714, nan, 0.9774678609551214, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2445141065830721, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9327713382507903, 0.7440212577502214, nan, nan, nan, 0.987785016286645, nan, nan, nan, 0.3579410470743511, 0.8412490160062975, nan, nan, 0.8945674044265594, 0.7654352099188549, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0113 | 43.0 | 860 | 0.1748 | 0.6693 | 0.7650 | 0.9486 | [0.8862180059576961, 0.9802456117000855, 0.9624647113677299, 0.9315637561312485, 0.2546024819309969, 0.9719206289779109, 0.9846526353256339, nan, 0.7367487809511747, nan, 0.8552594378227163, 0.8863457502623295, 0.9228475135174454, 0.7635841329992055, 0.6107677772098571, 0.916954774808447, 0.9756959932437657, 0.6050876376609275, nan, 0.8132295719844358, 0.47469458987783597, nan, 0.9291352394316786, nan, 0.0, nan, nan, 0.9203567694226432, nan, nan, nan, nan, nan, 0.6854760581275312, nan, nan, 0.5864972324004054, 0.7150724413906688, 0.0, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.8725284738041003, nan, nan, 0.9822530007369346, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35317200784826686, nan, 0.6970102632753236, 0.8911601723986279, nan, nan, 0.8359893101059092, nan, 0.8660877513711152, nan, nan, 0.7962785722328567, 0.8303030303030303, nan, nan, 0.8800020707149143, nan, nan, nan, nan, nan, nan, 0.8085655314757482, 0.0, nan, 0.9715996093862858, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.31168454624528846, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9161764705882353, 0.6882690730106645, nan, nan, nan, 0.975255790059506, nan, nan, nan, 0.3235010940919037, 0.8317320176577513, 0.0, nan, 0.8897087760724124, 0.7912726556343578, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9610271903323263, 0.9878409260911957, 0.9960291493158834, 0.9708781940789353, 0.27146492184660126, 0.9932406580793266, 0.9922318550575532, nan, 0.7792051223661474, nan, 0.9990439936729764, 0.980911598200029, 0.9431311329170383, 0.8367155582345456, 0.6211975613982577, 0.960341537407643, 0.9983872881794309, 0.8992623328722914, nan, 0.8639338615225629, 0.9422632794457275, nan, 0.9683021351272302, nan, nan, nan, nan, 0.9558556083264644, nan, nan, nan, nan, nan, 0.8869913686806412, nan, nan, 0.6304894401609118, 0.7167509286009079, nan, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.9715514792419139, nan, nan, 0.9906333676097779, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4466501240694789, nan, 0.7369662656286861, 0.9846255515170362, nan, nan, 0.9312017640573319, nan, 0.8781860311155246, nan, nan, 0.820341047503045, 0.8331885317115552, nan, nan, 0.8984197452565932, nan, nan, nan, nan, nan, nan, 0.9808053411224703, 0.0, nan, 0.9754606209681163, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.33699059561128525, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.919072708113804, 0.7431355181576617, nan, nan, nan, 0.9763843648208469, nan, nan, nan, 0.32520897492300926, 0.8404618210443453, nan, nan, 0.9097585513078471, 0.809737412428809, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0424 | 44.0 | 880 | 0.1763 | 0.6578 | 0.7665 | 0.9496 | [0.8891415629467557, 0.9809505286970407, 0.9625978498195937, 0.9312893115134535, 0.24847738315198795, 0.9684108045405585, 0.9849131369527441, nan, 0.7589677776519583, nan, 0.8496260604806527, 0.890840402969247, 0.9281244673857466, 0.7730816077953715, 0.6337940962494002, 0.9188380034616644, 0.9756155570486559, 0.6026479750778816, nan, 0.8113047727642939, 0.45278851463279957, nan, 0.9284037969806298, nan, 0.0, nan, nan, 0.9202359095596188, nan, nan, nan, nan, nan, 0.6884743083003952, nan, nan, 0.577511696138292, 0.7136608084810745, 0.0, nan, nan, 0.0, nan, 0.2408888888888889, nan, nan, nan, 0.863911072739046, nan, nan, 0.984174715995383, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34705882352941175, nan, 0.6982588404236223, 0.8904824909588075, nan, nan, 0.8351778656126482, 0.0, 0.8551462904911181, nan, nan, 0.7807334273624824, 0.8477697841726619, nan, nan, 0.8798750325436084, nan, nan, nan, nan, nan, nan, 0.8201991962257558, 0.0, nan, 0.9722562455224053, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.30887470997679817, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9221242653232578, 0.6905940594059405, nan, nan, nan, 0.9823217190309049, nan, nan, nan, 0.32689112374289464, 0.8398650402283935, 0.0, nan, 0.8999217374290746, 0.8278103303953912, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9601396589311997, 0.9887385412206534, 0.9958853856831251, 0.9716680630861733, 0.26397673573246094, 0.9937295413000042, 0.9923781566937718, nan, 0.8080318272540956, nan, 0.9991830491387252, 0.9755407170852083, 0.9445683417583507, 0.8501774830888755, 0.6450164621370847, 0.9610999657483975, 0.9985424736565045, 0.8918856615952052, nan, 0.8578367206338271, 0.9468822170900693, nan, 0.9690333430827728, nan, nan, nan, nan, 0.9574518192879853, nan, nan, nan, nan, nan, 0.8949034114262228, nan, nan, 0.610375460945357, 0.7154096161782914, nan, nan, nan, 0.0, nan, 0.2408888888888889, nan, nan, nan, 0.9745951868836492, nan, nan, 0.9896356176377761, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4392059553349876, nan, 0.7341354092946449, 0.9811075045190383, nan, nan, 0.9318632855567806, nan, 0.8668652763985435, nan, nan, 0.8026506583144829, 0.8531711555169418, nan, nan, 0.8930817610062893, nan, nan, nan, nan, nan, nan, 0.9793448779470061, 0.0, nan, 0.9762495939109853, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3338557993730408, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9258166491043204, 0.7413640389725421, nan, nan, nan, 0.9835847762729298, nan, nan, nan, 0.3289045314562253, 0.84912096562582, nan, nan, 0.9254527162977867, 0.8473363237740034, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0414 | 45.0 | 900 | 0.1792 | 0.6435 | 0.7676 | 0.9488 | [0.8882439833973854, 0.9806965815903437, 0.9625191452934121, 0.9285496561990502, 0.25153667531757956, 0.9685805356625409, 0.9851431405763058, nan, 0.76485924806348, nan, 0.843434121150771, 0.8906909724549293, 0.9101571774538202, 0.7726548129981606, 0.6114302525432389, 0.9010009598244892, 0.9756848123734513, 0.6038838017050837, nan, 0.8006469073866261, 0.4669350201265095, nan, 0.9265795359239586, nan, 0.0, nan, nan, 0.9194410141384611, nan, nan, nan, nan, nan, 0.6873570779688626, nan, nan, 0.5661163283853378, 0.7119695292997401, 0.0, nan, nan, 0.0, 0.0, 0.296, nan, nan, nan, 0.8753031712165144, nan, nan, 0.9841530054644809, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.2653985507246377, nan, 0.6995879243930843, 0.8808489686824105, nan, nan, 0.8364921928704704, 0.0, 0.8458221728620893, nan, nan, 0.8047526754866436, 0.832564841498559, nan, nan, 0.8793391014281247, nan, nan, nan, nan, nan, nan, 0.8089127796081151, 0.0, nan, 0.971926276866189, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.29544126241963764, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9194461925739459, 0.7001633986928104, nan, nan, nan, 0.9838087895142636, nan, nan, nan, 0.3585532063603005, 0.8245841995841996, 0.0, nan, 0.8963378617838157, 0.8302555768946669, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9588991069353607, 0.9883206859017679, 0.9958771234053804, 0.9669657588094386, 0.26775717920756087, 0.9946860519491562, 0.9935816056368605, nan, 0.8134435320750674, nan, 0.9993047226712555, 0.964581216431993, 0.9498959262563188, 0.8440158060411225, 0.6220639896658737, 0.9645740568576602, 0.9985272593940464, 0.8817427385892116, nan, 0.8356183258697899, 0.9376443418013857, nan, 0.969398947060544, nan, nan, nan, nan, 0.9546519082571208, nan, nan, nan, nan, nan, 0.9018701191944102, nan, nan, 0.6264666443178009, 0.7136040033016922, nan, nan, nan, 0.0, nan, 0.296, nan, nan, nan, 0.9739661539710239, nan, nan, 0.9901548548681036, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.48469809760132343, nan, 0.7369190846897853, 0.9736438026006337, nan, nan, 0.9391400220507167, nan, 0.8571333995365773, nan, nan, 0.8308682791021402, 0.8366637706342311, nan, nan, 0.8916547751175942, nan, nan, nan, nan, nan, nan, 0.9732943876486543, 0.0, nan, 0.9758957163410219, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.31692789968652035, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9237091675447839, 0.7590788308237378, nan, nan, nan, 0.9843991085204868, nan, nan, nan, 0.3611086669599648, 0.8325898714248229, nan, nan, 0.9159959758551308, 0.8497555566755708, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0226 | 46.0 | 920 | 0.1801 | 0.6550 | 0.7666 | 0.9484 | [0.8863154297384601, 0.9805562518223718, 0.9624241456403705, 0.9279251125726218, 0.23793791726891045, 0.9720586705502965, 0.9842659227830001, nan, 0.7633767673967286, nan, 0.8356220120028481, 0.8813824584284317, 0.9174398376340969, 0.7674969623329283, 0.6063663272738543, 0.9142218432393815, 0.9756115687492753, 0.6065074531174868, nan, 0.8069166127989658, 0.4572697003329634, nan, 0.9276982186517639, nan, 0.0, nan, nan, 0.9153336080620388, nan, nan, nan, nan, nan, 0.6815701680541265, nan, nan, 0.5752359380898452, 0.7084127065849766, 0.0, nan, nan, 0.0, nan, 0.30933333333333335, nan, nan, nan, 0.88962973999851, nan, nan, 0.9856549488400365, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35960591133004927, nan, 0.6943997137233853, 0.875225375041574, nan, nan, 0.8320258874289076, 0.0, 0.8375710830773253, nan, nan, 0.814228367528992, 0.8080023028209556, nan, nan, 0.8794540529276933, nan, nan, nan, nan, nan, nan, 0.8085991678224688, 0.0, nan, 0.9728185515552064, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20908820908820908, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9250786988457502, 0.695829926410466, nan, nan, nan, 0.9843622809648258, nan, nan, nan, 0.33972003499562553, 0.8136565024758926, 0.0, nan, 0.8963822257176564, 0.8384191357417069, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9586003120746324, 0.9888520327887458, 0.9958853856831251, 0.9644522036976204, 0.25132679025808796, 0.9931131233261064, 0.9907877163258468, nan, 0.8114476310396099, nan, 0.999548069736316, 0.9809841776745536, 0.9409009812667262, 0.8460920233072132, 0.6163770695820665, 0.9705191564319616, 0.9986185449687955, 0.8722913785154449, nan, 0.860006889424733, 0.9515011547344111, nan, 0.9710441649605148, nan, nan, nan, nan, 0.9590218628566942, nan, nan, nan, nan, nan, 0.898479243732018, nan, nan, 0.6385350318471338, 0.7098380107304993, nan, nan, nan, 0.0, nan, 0.30933333333333335, nan, nan, nan, 0.969238261434195, nan, nan, 0.9905620997546349, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.543424317617866, nan, 0.7324368954942203, 0.9718167505685241, nan, nan, 0.9355016538037486, nan, 0.8483283680900364, nan, nan, 0.8470506351139725, 0.8129163046626122, nan, nan, 0.8922361397389145, nan, nan, nan, nan, nan, nan, 0.9730857500521594, 0.0, nan, 0.9768819325196083, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2206896551724138, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9289778714436249, 0.7537643932683791, nan, nan, nan, 0.9847419852563004, nan, nan, nan, 0.3416630004399472, 0.8192075570716347, nan, nan, 0.9173038229376258, 0.8585756766292022, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0467 | 47.0 | 940 | 0.1790 | 0.6529 | 0.7599 | 0.9478 | [0.884878492252123, 0.9802005512346531, 0.9625649315383583, 0.9235081829032041, 0.23904984209803654, 0.9708320349018386, 0.9842435941284782, nan, 0.73862380074973, nan, 0.8382112873134329, 0.8733238782877772, 0.917863827511607, 0.7676141885325559, 0.6291023690502064, 0.9129549974561769, 0.9755831477941898, 0.625250166777852, nan, 0.8075346224618265, 0.48849878934624696, nan, 0.9269380729916752, nan, 0.0, nan, nan, 0.9178051145163119, nan, nan, nan, nan, nan, 0.692783603280613, nan, nan, 0.5831975797067722, 0.7094184251158003, 0.0, nan, nan, 0.0, nan, 0.2577777777777778, nan, nan, nan, 0.886313834726091, nan, nan, 0.9833658703483809, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3541666666666667, nan, 0.7054628632938643, 0.8839047834642242, nan, nan, 0.8208704253214639, 0.0, 0.8235178512938094, nan, nan, 0.8120445598458832, 0.8034113905753107, nan, nan, 0.8795773174304248, nan, nan, nan, nan, nan, nan, 0.8221399612198131, 0.0, nan, 0.9717580715986041, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.257243195785777, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9152151101783841, 0.6601529311809685, nan, nan, nan, 0.9604490722886404, nan, nan, nan, 0.3089110644257703, 0.8227650727650727, 0.0, nan, 0.8959258163164644, 0.7780517879161529, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9608113940440225, 0.9888675089116674, 0.9957829334390905, 0.9624553853195094, 0.25314431115957836, 0.9933044254559368, 0.9914248363545408, nan, 0.7786157287717841, nan, 0.9996002155359719, 0.9832341413848164, 0.935672514619883, 0.8464268970598084, 0.6383685943382851, 0.9658462592357, 0.9986428877887287, 0.8642231443061319, nan, 0.8616947984843265, 0.9318706697459584, nan, 0.9647923369406259, nan, nan, nan, nan, 0.9537098821158954, nan, nan, nan, nan, nan, 0.8974722564734895, nan, nan, 0.6300703989272545, 0.7111019397441188, nan, nan, nan, 0.0, nan, 0.2577777777777778, nan, nan, nan, 0.9684671888316221, nan, nan, 0.9949094389183576, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4921422663358147, nan, 0.7421089879688606, 0.9866080973391125, nan, nan, 0.9149944873208379, nan, 0.8322409798080106, nan, nan, 0.843483556638246, 0.804807413843035, nan, nan, 0.8886422493525712, nan, nan, nan, nan, nan, nan, 0.9730857500521594, 0.0, nan, 0.9757042743769434, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2755485893416928, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.919072708113804, 0.6882196634189548, nan, nan, nan, 0.9606548945654038, nan, nan, nan, 0.31051473823141224, 0.8307530831802676, nan, nan, 0.913682092555332, 0.7950708129630563, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.029 | 48.0 | 960 | 0.1754 | 0.6435 | 0.7679 | 0.9485 | [0.8879804021379486, 0.9804234672726435, 0.962629872775407, 0.9239842493404677, 0.2593921342990591, 0.9715420113915104, 0.9842255923016018, nan, 0.7667185655809748, nan, 0.8451777022953519, 0.8809911966090642, 0.923166388020644, 0.7666707154135795, 0.6509498174617907, 0.9163994609414936, 0.9756136715649405, 0.60896518836433, nan, 0.8156695432948705, 0.48492016558249557, nan, 0.9279840431115932, nan, 0.0, nan, nan, 0.9155877588436955, nan, nan, nan, nan, nan, 0.6875128411802822, nan, nan, 0.5730311700045174, 0.7125836335563561, 0.0, nan, nan, 0.0, 0.0, 0.27644444444444444, nan, nan, nan, 0.8769633507853403, nan, nan, 0.9833370205748074, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35436094349972574, nan, 0.7008064516129032, 0.8871992065839611, nan, nan, 0.8119443370699652, 0.0, 0.8435027140147799, nan, nan, 0.7985220847562, 0.8241568175266647, nan, nan, 0.8801332431166398, nan, nan, nan, nan, nan, nan, 0.8127263321262287, 0.0, nan, 0.9723262155666482, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.25655375552282766, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9254918375889494, 0.6974317817014446, nan, nan, nan, 0.9594692916755831, nan, nan, nan, 0.34167249825052487, 0.8426645930533956, 0.0, nan, 0.9034079844206426, 0.7319409470201945, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9575169040425838, 0.988212353041316, 0.9956226452508428, 0.9605028586614304, 0.27858960378044345, 0.993431960209157, 0.9914531528002605, nan, 0.8179577511955314, nan, 0.9990787575394136, 0.980548700827406, 0.9440975319655069, 0.8454892505525416, 0.6628963909324344, 0.964916572882517, 0.9985850735913875, 0.8831258644536653, nan, 0.86930761281433, 0.9468822170900693, nan, 0.9695451886516525, nan, nan, nan, nan, 0.9539453886512017, nan, nan, nan, nan, nan, 0.8939786272092067, nan, nan, 0.6378645658732819, 0.7142746595130004, nan, nan, nan, 0.0, nan, 0.27644444444444444, nan, nan, nan, 0.9720587638488698, nan, nan, 0.9955610307368078, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5343258891645989, nan, 0.738004246284501, 0.991078543800657, nan, nan, 0.9263506063947078, nan, 0.8538894405825885, nan, nan, 0.8273302012644278, 0.8279756733275413, nan, nan, 0.8937159769568205, nan, nan, nan, nan, nan, nan, 0.9833089922804089, 0.0, nan, 0.9763424142572051, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2730407523510972, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9319283456269758, 0.7697077059344553, nan, nan, nan, 0.9608263329333104, nan, nan, nan, 0.3436867575890893, 0.8530569404355812, nan, nan, 0.933400402414487, 0.747139761100751, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0698 | 49.0 | 980 | 0.1807 | 0.6530 | 0.7598 | 0.9475 | [0.8836222961289916, 0.9804725070088198, 0.9630159382096722, 0.9263623758373554, 0.25809297590770275, 0.9646338698707776, 0.9843290662404247, nan, 0.7408599969575579, nan, 0.8519666128004032, 0.8864900662251656, 0.9246598722387478, 0.7614076030215562, 0.5906265539532571, 0.9201170960187354, 0.9756832217798407, 0.6030191103259996, nan, 0.811063011063011, 0.4717548076923077, nan, 0.9274007144358059, nan, 0.0, nan, nan, 0.9204964734820913, nan, nan, nan, nan, nan, 0.6857539475967378, nan, nan, 0.5694271481942715, 0.712462365868094, 0.0, nan, nan, 0.0, nan, 0.2088888888888889, nan, nan, nan, 0.8774394717534849, nan, nan, 0.9864953587265033, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.36174365647365, nan, 0.7095619490804957, 0.8934203757251069, nan, nan, 0.8244492525570417, 0.0, 0.8720521172638437, nan, nan, 0.8035244755244755, 0.8277553375649164, nan, nan, 0.8793778380917584, nan, nan, nan, nan, nan, nan, 0.8146664344190907, 0.0, nan, 0.9714589509329727, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22813688212927757, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9111344537815126, 0.6941467436108821, nan, nan, nan, 0.9763819955502311, nan, nan, nan, 0.3569556099265991, 0.8322965266977709, 0.0, nan, 0.9053805138148328, 0.7096056922620813, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9602746699423437, 0.988671478021326, 0.9957482318725627, 0.9641716894707508, 0.27648127953471463, 0.9948773540789865, 0.9939544388388369, nan, 0.7828486464040293, nan, 0.9988701743407902, 0.971548845986355, 0.9397363465160076, 0.8303529569352354, 0.598749192646387, 0.9612222929001321, 0.9984603166392303, 0.8656062701705856, nan, 0.8712710988632449, 0.9064665127020786, nan, 0.9681558935361216, nan, nan, nan, nan, 0.9596498802841779, nan, nan, nan, nan, nan, 0.8933826551582409, nan, nan, 0.6131411330874958, 0.7141714816343376, nan, nan, nan, 0.0, nan, 0.2088888888888889, nan, nan, nan, 0.9706992411022279, nan, nan, 0.9943494771993769, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4598842018196857, nan, 0.7481953290870488, 0.9788917180120119, nan, nan, 0.9242557883131202, nan, 0.8861966236345581, nan, nan, 0.833072327591207, 0.8308717057631045, nan, nan, 0.8904391945457428, nan, nan, nan, nan, nan, nan, 0.975798038806593, 0.0, nan, 0.9752517751891214, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2445141065830721, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9140147523709168, 0.745792736935341, nan, nan, nan, 0.9780558889079376, nan, nan, nan, 0.3594368675758909, 0.8425610076095513, nan, nan, 0.9395372233400402, 0.7238042437377148, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0359 | 50.0 | 1000 | 0.1795 | 0.6513 | 0.7613 | 0.9477 | [0.8876257683423482, 0.9803569602127442, 0.9621584849980118, 0.9281566998872893, 0.26263172115644423, 0.9711005460169826, 0.9844666832130597, nan, 0.7408947027191946, nan, 0.8340176123257265, 0.8858099678245452, 0.9277117275796272, 0.7666829327802854, 0.6726437279534625, 0.9066307671245379, 0.9756302770840766, 0.5871698113207547, nan, 0.8127587744442322, 0.4629418472063854, nan, 0.9262269240200371, nan, 0.0, nan, nan, 0.9217819543204603, nan, nan, nan, nan, nan, 0.6871943260626838, nan, nan, 0.574563436872199, 0.7096815712925065, 0.0, nan, nan, 0.0, nan, 0.2311111111111111, nan, nan, nan, 0.8750685733094393, nan, nan, 0.9818491322434755, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35294117647058826, nan, 0.6943423294180938, 0.8944393252055571, nan, nan, 0.8251371473354232, 0.0, 0.8603650586701435, nan, nan, 0.8021756760535809, 0.8233766233766234, nan, nan, 0.8777916927572532, nan, nan, nan, nan, nan, nan, 0.8151055293912437, 0.0, nan, 0.9701810867517094, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.26550218340611353, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9176470588235294, 0.6945337620578779, nan, nan, nan, 0.966422544862735, nan, nan, nan, 0.3118543162318333, 0.8311284046692607, 0.0, nan, 0.9081091694482624, 0.5992250757538126, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9585195267974724, 0.988919095988073, 0.9956375173507833, 0.9635110046469396, 0.2826608505997819, 0.9942396803128852, 0.9921327474975341, nan, 0.7806920016610183, nan, 0.9992525768715996, 0.9790971113369139, 0.9397115670532263, 0.8418056392739937, 0.6849036689298823, 0.9721094094045114, 0.9985516022139794, 0.896726602120793, nan, 0.8790561488115742, 0.9376443418013857, nan, 0.9666934776250365, nan, nan, nan, nan, 0.9599769726943256, nan, nan, nan, nan, nan, 0.8980271270036991, nan, nan, 0.6231981226952732, 0.7111277342137846, nan, nan, nan, 0.0, nan, 0.2311111111111111, nan, nan, nan, 0.9710239032506798, nan, nan, 0.9924252451105161, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46153846153846156, nan, 0.7307383816937957, 0.9810686310715466, nan, nan, 0.9286659316427784, nan, 0.8737504137702747, nan, nan, 0.8318833014326316, 0.8262380538662033, nan, nan, 0.8890650599862586, nan, nan, nan, nan, nan, nan, 0.9749634884206134, 0.0, nan, 0.9737550471063257, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2858934169278997, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9205479452054794, 0.7652790079716564, nan, nan, nan, 0.9671266929538831, nan, nan, nan, 0.31341838979322484, 0.8407242193649961, nan, nan, 0.9305835010060363, 0.6079834685751726, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0349 | 51.0 | 1020 | 0.1808 | 0.6503 | 0.7594 | 0.9480 | [0.8873621900908955, 0.9809340572182963, 0.9624442907944679, 0.9304234952812958, 0.25297090561398716, 0.9635274007867043, 0.9845686820136605, nan, 0.7468168957154406, nan, 0.8459194937081463, 0.8856936035798895, 0.9255073520476365, 0.7789326525650376, 0.633514669642442, 0.9062242751303393, 0.9757200716057166, 0.5944838455476753, nan, 0.8157416423239208, 0.46271186440677964, nan, 0.9289986359343849, nan, 0.0, nan, nan, 0.9204880363654528, nan, nan, nan, nan, nan, 0.689985226165404, nan, nan, 0.5756220076917039, 0.7117413908477892, 0.0, nan, nan, 0.0, nan, 0.21333333333333335, nan, nan, nan, 0.877800370024363, nan, nan, 0.9876028879406233, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33409350057012543, nan, 0.6967762240906434, 0.8899765907639924, nan, nan, 0.817588624853915, 0.0, 0.8579997392098057, nan, nan, 0.8007681094415788, 0.8225806451612904, nan, nan, 0.8754835337166754, nan, nan, nan, nan, nan, nan, 0.8193004042889788, 0.0, nan, 0.9712662609730867, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.24463393119670684, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9218357082984073, 0.6769616026711185, nan, nan, nan, 0.9806523414091259, nan, nan, nan, 0.2670225800805181, 0.8316883116883117, 0.0, nan, 0.9084410497453975, 0.6582347408544131, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9590352245941369, 0.9889294134033542, 0.9959746182827682, 0.968309274317077, 0.2692838967648128, 0.9944734940271224, 0.9918684606708166, nan, 0.7919842471166597, nan, 0.9990439936729764, 0.9768471476266511, 0.9436019427098821, 0.858281427901681, 0.6439137352510279, 0.9695894700787787, 0.9984238024093306, 0.8695251267865376, nan, 0.8657595590768171, 0.9457274826789839, nan, 0.9710807253582919, nan, nan, nan, nan, 0.9643861783831168, nan, nan, nan, nan, nan, 0.9021989313604604, nan, nan, 0.6146496815286624, 0.7133202641353694, nan, nan, nan, 0.0, nan, 0.21333333333333335, nan, nan, nan, 0.9723631346130434, nan, nan, 0.9943698394437035, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.48469809760132343, nan, 0.7311630101439018, 0.9754125444615056, nan, nan, 0.9255788313120177, nan, 0.8712346904998345, nan, nan, 0.8284032248709472, 0.8271068635968722, nan, nan, 0.8851540616246498, nan, nan, nan, nan, nan, nan, 0.9724598372626747, 0.0, nan, 0.9749907179653781, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2608150470219436, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9270811380400421, 0.7183348095659876, nan, nan, nan, 0.9819132521858391, nan, nan, nan, 0.2684557853057633, 0.8401994227236945, nan, nan, 0.9332997987927565, 0.6701779144196361, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0192 | 52.0 | 1040 | 0.1796 | 0.6527 | 0.7603 | 0.9494 | [0.8899957354323912, 0.9816889371088172, 0.9629790091285511, 0.9319319638292298, 0.2512995896032832, 0.9706560338919693, 0.983767236779285, nan, 0.7368060228268183, nan, 0.8323270057773338, 0.8861025405369921, 0.9281940911758951, 0.778124045218454, 0.7104021847070506, 0.8909902396140928, 0.9755418306326014, 0.5944805194805195, nan, 0.8054195860127331, 0.45535219079312256, nan, 0.927587536363955, nan, 0.0, nan, nan, 0.9193591018363519, nan, nan, nan, nan, nan, 0.6834757435681935, nan, nan, 0.5580839247145942, 0.7126720699858484, 0.0, nan, nan, 0.0, nan, 0.24444444444444444, nan, nan, nan, 0.8800198496572258, nan, nan, 0.9886673810631028, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3535220539828835, nan, 0.7110471392587262, 0.8890428233956769, nan, nan, 0.8267755262640173, 0.0, 0.8470373270686505, nan, nan, 0.801130481014075, 0.8120127057464626, nan, nan, 0.8737818296133292, nan, nan, nan, nan, nan, nan, 0.8244905130007028, 0.0, nan, 0.9722329726950826, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2646546720835752, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9138836378911993, 0.6335616438356164, nan, nan, nan, 0.9823164204667095, nan, nan, nan, 0.27416104442302636, 0.8163584266736129, 0.0, nan, 0.9099684791174153, 0.7564222671465904, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9584520212919004, 0.9895587757355028, 0.9953615572741094, 0.9715130420660611, 0.2671028716830244, 0.9934957275857671, 0.9918967771165363, nan, 0.777410150965132, nan, 0.9991656672055066, 0.9796777471331107, 0.9404301714738824, 0.8528564731096376, 0.7212463964460688, 0.9670450653226991, 0.9987341733634778, 0.8441678192715537, nan, 0.8672063382707544, 0.9480369515011547, nan, 0.9675709271716876, nan, nan, nan, nan, 0.9556855202731875, nan, nan, nan, nan, nan, 0.9024660912453761, nan, nan, 0.6063526651022461, 0.7144552208006604, nan, nan, nan, 0.0, nan, 0.24444444444444444, nan, nan, nan, 0.9715717706261922, nan, nan, 0.9956832042027672, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4441687344913151, nan, 0.7458362821420146, 0.9728857703745457, nan, nan, 0.9266813671444322, nan, 0.8593181065872227, nan, nan, 0.8302882663418595, 0.8143643208803939, nan, nan, 0.8814016172506739, nan, nan, nan, nan, nan, nan, 0.9791362403505112, 0.0, nan, 0.9764236320601476, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2858934169278997, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9169652265542676, 0.6554472984942427, nan, nan, nan, 0.9832847591290931, nan, nan, nan, 0.275318961724593, 0.8223563369194438, nan, nan, 0.9293762575452716, 0.7731969154780505, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0238 | 53.0 | 1060 | 0.1803 | 0.6514 | 0.7608 | 0.9485 | [0.889166189220408, 0.9816291392085925, 0.9622702663002956, 0.9342904570805001, 0.25149700598802394, 0.9685744491200431, 0.983694327238631, nan, 0.7382397283703488, nan, 0.8508629111038749, 0.8904418096310526, 0.9242126700123645, 0.7675989004422135, 0.6712373237694781, 0.9094230371544771, 0.9756791551330957, 0.6004154018213772, nan, 0.8117918187055886, 0.43064935064935067, nan, 0.9285564194158437, nan, 0.0, nan, nan, 0.9179924099625525, nan, nan, nan, nan, nan, 0.6824982003693155, nan, nan, 0.5687315186898173, 0.7108108108108108, 0.0, nan, nan, 0.0, nan, 0.30666666666666664, nan, nan, nan, 0.877475134174711, nan, nan, 0.980801630297711, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35251798561151076, nan, 0.6893569347972216, 0.8901827909050379, nan, nan, 0.8365317689171721, 0.0, 0.8720067673086934, nan, nan, 0.7986942023706112, 0.825735718407386, nan, nan, 0.8775149467117235, nan, nan, nan, nan, nan, nan, 0.8084815321477428, 0.0, nan, 0.9721392459916222, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.17955401100492324, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9241884816753927, 0.6713810316139767, nan, nan, nan, 0.9729787598492634, nan, nan, nan, 0.2682136602451839, 0.830693326408725, 0.0, nan, 0.9050744514106583, 0.6816788609847736, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9600599803016722, 0.9882071943336755, 0.9959217397052019, 0.9701473806984066, 0.2687022900763359, 0.9931768907027165, 0.9902355456343119, nan, 0.7805446532624275, nan, 0.9992178130051624, 0.9756858760342575, 0.9446179006839132, 0.8602906704172527, 0.6840214874210369, 0.9683172677007389, 0.9985333450990296, 0.8662978331028124, nan, 0.873647950396142, 0.9572748267898383, nan, 0.9693623866627669, nan, nan, nan, nan, 0.9557901898444349, nan, nan, nan, nan, nan, 0.8962803123715577, nan, nan, 0.6286456587328193, 0.7123142798184069, nan, nan, nan, 0.0, nan, 0.30666666666666664, nan, nan, nan, 0.9720384724645915, nan, nan, 0.9898086967145518, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.445822994210091, nan, 0.7257843831092239, 0.9702229392213648, nan, nan, 0.9275633958103638, nan, 0.8871896722939424, nan, nan, 0.8266051853140769, 0.8288444830582102, nan, nan, 0.8920775857512816, nan, nan, nan, nan, nan, nan, 0.9864385562278323, 0.0, nan, 0.9760813570334618, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.19435736677115986, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.930031612223393, 0.7147918511957484, nan, nan, nan, 0.9738127893022458, nan, nan, nan, 0.2695116586009679, 0.8394122277617423, nan, nan, 0.929476861167002, 0.6949750516607026, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0614 | 54.0 | 1080 | 0.1883 | 0.6508 | 0.7568 | 0.9468 | [0.8824311096007831, 0.9803388198169123, 0.9614240578503566, 0.9309543709538043, 0.25551020408163266, 0.9694794281801772, 0.9841080595842567, nan, 0.7227950618539661, nan, 0.8520417234635216, 0.8869507323568575, 0.9239567836499768, 0.7648195098394301, 0.5662869663864241, 0.9196025715955581, 0.9757039750913703, 0.6016717569522585, nan, 0.8086332778275906, 0.47769953051643194, nan, 0.9278310670742419, nan, 0.0, nan, nan, 0.9174531873012668, nan, nan, nan, nan, nan, 0.6930390816294082, nan, nan, 0.5631703050125504, 0.7063093675187273, 0.0, nan, nan, 0.0, nan, 0.288, nan, nan, nan, 0.8730486223958809, nan, nan, 0.9780073845287081, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33311170212765956, nan, 0.6860223786320159, 0.892339507918552, nan, nan, 0.8399837958274256, 0.0, 0.8597636612913756, nan, nan, 0.7879923903312444, 0.807315668202765, nan, nan, 0.8743739565943238, nan, nan, nan, nan, nan, nan, 0.8250830855343712, 0.0, nan, 0.969585748883019, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2069165940133682, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9068936527952921, 0.6747088186356073, nan, nan, nan, 0.9529134532990574, nan, nan, nan, 0.30199405282490815, 0.8266286010900596, 0.0, nan, 0.9038480464521208, 0.7813948900069054, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9626783085997588, 0.9872167224666877, 0.9961167294599775, 0.97023965511514, 0.2730643402399128, 0.9931981464949199, 0.9933550740711026, nan, 0.7615233145352498, nan, 0.9995654516695347, 0.9669037596167803, 0.938769947467539, 0.8485700890764182, 0.5740481103987145, 0.9623721681264373, 0.9983568596545145, 0.8628400184416782, nan, 0.8698587667929728, 0.9399538106235565, nan, 0.9654504241006142, nan, nan, nan, nan, 0.9513155656736141, nan, nan, nan, nan, nan, 0.8957665433621044, nan, nan, 0.6205162587998659, 0.7077486586875774, nan, nan, nan, 0.0, nan, 0.288, nan, nan, nan, 0.9702528306481069, nan, nan, 0.9897170666150823, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4143920595533499, nan, 0.7173861759849021, 0.981262998309005, nan, nan, 0.9144432194046307, nan, 0.8718305196954651, nan, nan, 0.8168319703033466, 0.8117578916883869, nan, nan, 0.885788277575181, nan, nan, nan, nan, nan, nan, 0.9841435426663885, 0.0, nan, 0.9731575161275352, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2231974921630094, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9093782929399368, 0.7183348095659876, nan, nan, nan, 0.9532401851534373, nan, nan, nan, 0.3038275406951166, 0.8357386512726318, nan, nan, 0.923943661971831, 0.7984476588881608, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0101 | 55.0 | 1100 | 0.1924 | 0.6495 | 0.7569 | 0.9457 | [0.877919469073189, 0.979830599554748, 0.9609858652062201, 0.9302149007809091, 0.23717992723278644, 0.9692154709867208, 0.9843516092814372, nan, 0.7304253658045181, nan, 0.8450424762632647, 0.8869869803714229, 0.9237416394082898, 0.7623804854130914, 0.5135570386658201, 0.9182524542874424, 0.9756741001588217, 0.6006933501418216, nan, 0.8083141615541922, 0.4939540507859734, nan, 0.927085514834206, nan, 0.0, nan, nan, 0.9205642539142614, nan, nan, nan, nan, nan, 0.6894268242451126, nan, nan, 0.566001238006809, 0.7080158566721582, 0.0, nan, nan, 0.0, nan, 0.26222222222222225, nan, nan, nan, 0.8743736054720362, nan, nan, 0.984188649241639, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3410169491525424, nan, 0.6820848072359289, 0.8900621661850976, nan, nan, 0.8285742811501597, 0.0, 0.8461840298214636, nan, nan, 0.7951493060423385, 0.8292050691244239, nan, nan, 0.8769351055512119, nan, nan, nan, nan, nan, nan, 0.8027810751229438, 0.0, nan, 0.969965613893143, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2162945781385909, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9220452640402347, 0.6652754590984975, nan, nan, nan, 0.9597002141327623, nan, nan, nan, 0.2794503763346753, 0.8348909657320872, 0.0, nan, 0.8975413850524048, 0.8086982219376447, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9585792857696182, 0.9876552126161354, 0.9962836274704211, 0.9677076451199752, 0.2511813885859687, 0.994452238234919, 0.9930341543529456, nan, 0.777008291696248, nan, 0.99937425040413, 0.9740891275947162, 0.9377044305679453, 0.8330989217065167, 0.5224562453724854, 0.9656994666536184, 0.9981986313249492, 0.8787459658828953, nan, 0.8714088873579056, 0.9434180138568129, nan, 0.9710807253582919, nan, nan, nan, nan, 0.9631170598317437, nan, nan, nan, nan, nan, 0.894327990135635, nan, nan, 0.6130573248407644, 0.7094768881551795, nan, nan, nan, 0.0, nan, 0.26222222222222225, nan, nan, nan, 0.9701107909581592, nan, nan, 0.9955915741032977, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4160463192721257, nan, 0.7187072422741213, 0.9879103578300842, nan, nan, 0.9149944873208379, nan, 0.8566037735849057, nan, nan, 0.8224290934400557, 0.8337677381986678, nan, nan, 0.8891707626446805, nan, nan, nan, nan, nan, nan, 0.9876903818068016, 0.0, nan, 0.9736738293033833, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.23385579937304074, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9272918861959958, 0.7059344552701505, nan, nan, nan, 0.9604405966055203, nan, nan, nan, 0.2809502859656841, 0.843872999212805, nan, nan, 0.921830985915493, 0.8275288543924197, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0207 | 56.0 | 1120 | 0.1868 | 0.6532 | 0.7624 | 0.9479 | [0.885396242539976, 0.9800449690837549, 0.9623255278899739, 0.9324030073054826, 0.2559544004885662, 0.9704856062808956, 0.9846195598770313, nan, 0.7213197905174928, nan, 0.8404960007018878, 0.8879756065226038, 0.9226323457989315, 0.7647166250684848, 0.6371033991041121, 0.91397552036512, 0.9754787017471908, 0.5934065934065934, nan, 0.8102748986029743, 0.46869615163699024, nan, 0.9293820047078664, nan, 0.0, nan, nan, 0.9197010493469892, nan, nan, nan, nan, nan, 0.6894518625951792, nan, nan, 0.5629945694336695, 0.7074194212748429, 0.0, nan, nan, 0.0, nan, 0.24, nan, nan, nan, 0.8709589489178433, nan, nan, 0.987726339840867, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34903381642512077, nan, 0.6949701083292129, 0.8916033488110314, nan, nan, 0.8225490196078431, 0.0, 0.865306920370129, nan, nan, 0.7948639046633283, 0.8435432844406098, nan, nan, 0.8722358722358723, nan, nan, nan, nan, nan, nan, 0.8128019323671497, 0.0, nan, 0.9712175757120486, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2402464065708419, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9157031742694975, 0.6911037891268533, nan, nan, nan, 0.9752632029444492, nan, nan, nan, 0.32494539100043685, 0.8368352788586252, 0.0, nan, 0.9027928550281259, 0.7449422678377579, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9584752608921793, 0.9893627448451614, 0.9952871967744068, 0.9704278949252761, 0.27422755361686657, 0.9931768907027165, 0.9930860678367651, nan, 0.761978755039985, nan, 0.9990961394726321, 0.9722746407316011, 0.9414709089106948, 0.8413368160203604, 0.6475212274925566, 0.9700787786857171, 0.9987615590359026, 0.8713692946058091, nan, 0.8671029968997589, 0.9422632794457275, nan, 0.9671322023983621, nan, nan, nan, nan, 0.9563658724862948, nan, nan, nan, nan, nan, 0.8987669543773119, nan, nan, 0.6081964465303386, 0.7088062319438713, nan, nan, nan, 0.0, nan, 0.24, nan, nan, nan, 0.9725254656872692, nan, nan, 0.9946650919864387, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.47808105872622003, nan, 0.7294644963434772, 0.9853058368481409, nan, nan, 0.9250275633958104, nan, 0.87911287653095, nan, nan, 0.8240241285308276, 0.8494063133507095, nan, nan, 0.8818244278843613, nan, nan, nan, nan, nan, nan, 0.9828917170874192, 0.0, nan, 0.9750545319534042, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2567398119122257, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9180189673340359, 0.7431355181576617, nan, nan, nan, 0.9766843819646837, nan, nan, nan, 0.32723273207215137, 0.8464969824193125, nan, nan, 0.9203219315895372, 0.7608991482284159, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0274 | 57.0 | 1140 | 0.1834 | 0.6533 | 0.7612 | 0.9486 | [0.8882206411154816, 0.9797623249794093, 0.961476246689851, 0.9322859738526835, 0.24439124487004105, 0.9669906928645294, 0.9845912465166382, nan, 0.7363857005612581, nan, 0.8482344434771061, 0.8832183908045977, 0.9214935492524937, 0.7676461678279309, 0.6496588089330024, 0.9061362752054198, 0.9756137440617401, 0.5977437550362611, nan, 0.805871181639682, 0.4799764428739694, nan, 0.9283408197641775, nan, 0.0, nan, nan, 0.9218236116886553, nan, nan, nan, nan, nan, 0.6858992444904229, nan, nan, 0.5681104833647207, 0.709476501775879, 0.0, nan, nan, 0.0, nan, 0.25422222222222224, nan, nan, nan, 0.8760964711483876, nan, nan, 0.9861465032560773, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3474676089517079, nan, 0.6883111051237146, 0.89048470788248, nan, nan, 0.8294436238306253, 0.0, 0.8563319632021922, nan, nan, 0.7983212087297146, 0.8336690647482015, nan, nan, 0.872084509988495, nan, nan, nan, nan, nan, nan, 0.8312355965254388, 0.0, nan, 0.970127603504473, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1909681227863046, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.915126050420168, 0.6895127993393889, nan, nan, nan, 0.9772688356164384, nan, nan, nan, 0.3118223154949283, 0.8356661482633488, 0.0, nan, 0.9007323832145685, 0.7749062931544684, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9608457001206246, 0.988000846028053, 0.9959531363606319, 0.9696454078713769, 0.25976008724100325, 0.9937933086766144, 0.992137466905154, nan, 0.7803303283190227, nan, 0.9991830491387252, 0.9759761939323559, 0.936267221726633, 0.8485700890764182, 0.6599032750988516, 0.9685863874345549, 0.9985881164438791, 0.855002305209774, nan, 0.8624181880812952, 0.941108545034642, nan, 0.9671687627961392, nan, nan, nan, nan, 0.9571378105742434, nan, nan, nan, nan, nan, 0.8992807233867653, nan, nan, 0.6067717063359035, 0.7110503508047874, nan, nan, nan, 0.0, nan, 0.25422222222222224, nan, nan, nan, 0.9707601152550627, nan, nan, 0.9928833956078639, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4880066170388751, nan, 0.7231894314696863, 0.9773367801123443, nan, nan, 0.9286659316427784, nan, 0.8689175769612711, nan, nan, 0.8274462038164839, 0.8389805965826818, nan, nan, 0.8813487659214629, nan, nan, nan, nan, nan, nan, 0.9783016899645316, 0.0, nan, 0.9738420661809069, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20282131661442007, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9180189673340359, 0.7395925597874224, nan, nan, nan, 0.9784416252357278, nan, nan, nan, 0.31377034755829303, 0.845972185778011, nan, nan, 0.9155935613682092, 0.791895569779749, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0311 | 58.0 | 1160 | 0.1862 | 0.6519 | 0.7609 | 0.9481 | [0.8875011001111368, 0.9793895121314897, 0.960964655745334, 0.9319866298503124, 0.23071620596953196, 0.970347605797583, 0.9838298788997012, nan, 0.739138680962978, nan, 0.8432422127060479, 0.8837468819745307, 0.9227444756439995, 0.7623419260700389, 0.6399454838309999, 0.9039912220545878, 0.9757118728738968, 0.5987291501191422, nan, 0.8021020251217637, 0.44199785177228784, nan, 0.9275991181719565, nan, 0.0, nan, nan, 0.9228126258558196, nan, nan, nan, nan, nan, 0.6837890990822252, nan, nan, 0.5683256504414408, 0.7084395027411011, 0.0, nan, nan, 0.0, nan, 0.2853333333333333, nan, nan, nan, 0.8758539846511713, nan, nan, 0.9816655777252852, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34440344403444034, nan, 0.6856540084388185, 0.8864557988645579, nan, nan, 0.8340105044098702, 0.0, 0.8397934505523237, nan, nan, 0.7938888577984219, 0.8338133640552995, nan, nan, 0.8769801850786846, nan, nan, nan, nan, nan, nan, 0.8161357340720221, 0.0, nan, 0.9689833761069109, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.266433973240256, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9208315833683326, 0.6545914069081719, nan, nan, nan, 0.9563541354349595, nan, nan, nan, 0.2916411378555799, 0.8371428571428572, 0.0, nan, 0.9000888537861585, 0.7769067692155993, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9597324126025032, 0.9876552126161354, 0.9963712076145151, 0.9704795685986469, 0.24332969829153037, 0.9932831696637333, 0.9914956274688401, nan, 0.7830495760384714, nan, 0.9994437781370044, 0.9771374655247496, 0.9364406779661016, 0.8397963967584221, 0.6509239275981034, 0.9675099084992905, 0.9984451023767721, 0.8688335638543108, nan, 0.8622803995866345, 0.9503464203233256, nan, 0.9691430242761041, nan, nan, nan, nan, 0.9593358715704361, nan, nan, nan, nan, nan, 0.900308261405672, nan, nan, 0.6096211867247737, 0.7099927775484935, nan, nan, nan, 0.0, nan, 0.2853333333333333, nan, nan, nan, 0.9703137048009415, nan, nan, 0.9942985715885605, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46319272125723737, nan, 0.72068884170795, 0.977239596493615, nan, nan, 0.9278941565600882, nan, 0.8505792783846409, nan, nan, 0.8228061017342382, 0.838401390095569, nan, nan, 0.886528196184134, nan, nan, nan, nan, nan, nan, 0.9835176298769038, 0.0, nan, 0.9725135749756346, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2871473354231975, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9241306638566913, 0.6882196634189548, nan, nan, nan, 0.9569689696554089, nan, nan, nan, 0.29318081830180376, 0.8457097874573603, nan, nan, 0.9172032193158953, 0.7942140013104179, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0248 | 59.0 | 1180 | 0.1950 | 0.6513 | 0.7588 | 0.9461 | [0.8795721218496123, 0.9795674121961324, 0.9616304046010322, 0.9302458264083546, 0.26020615760206156, 0.9674828721048165, 0.9839445894248487, nan, 0.7262106460993805, nan, 0.8625191320788692, 0.8888594515829912, 0.9180805938494168, 0.7662369551872314, 0.530264094863618, 0.9146824019437695, 0.9756699461545552, 0.597746747064424, nan, 0.8030896784013991, 0.45585785674625207, nan, 0.9280436990090689, nan, 0.0, nan, nan, 0.9203611862538824, nan, nan, nan, nan, nan, 0.6861032582086959, nan, nan, 0.5692103620474407, 0.7083097358801421, 0.0, nan, nan, 0.0, nan, 0.2551111111111111, nan, nan, nan, 0.8773222587699149, nan, nan, 0.975705006575711, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34951456310679613, nan, 0.690035571164843, 0.8889124181622461, nan, nan, 0.8306642941874258, 0.0, 0.8608469055374592, nan, nan, 0.7938935574229692, 0.8482142857142857, nan, nan, 0.8779054651344184, nan, nan, nan, nan, nan, nan, 0.813969571230982, 0.0, nan, 0.9696582653774801, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.28077255693283365, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9178053394996847, 0.6788990825688074, nan, nan, nan, 0.9493708807669263, nan, nan, nan, 0.25466164755318216, 0.8373180873180873, 0.0, nan, 0.9037521985538401, 0.79691428008084, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9605502252027932, 0.9880266395662559, 0.995766408883601, 0.9689072125375096, 0.2789531079607416, 0.9935382391701739, 0.9929114497548268, nan, 0.7693729655874513, nan, 0.9991135214058507, 0.9740165481201916, 0.9439240757260383, 0.8359788359788359, 0.5396115250712834, 0.9670695307530459, 0.9985120451315882, 0.8683725218994929, nan, 0.8541853255253187, 0.9480369515011547, nan, 0.9689967826849956, nan, nan, nan, nan, 0.9615077651738169, nan, nan, nan, nan, nan, 0.8927661323468968, nan, nan, 0.6113811599061347, 0.7097348328518366, nan, nan, nan, 0.0, nan, 0.2551111111111111, nan, nan, nan, 0.9687715595957956, nan, nan, 0.9894930819274901, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.41687344913151364, nan, 0.7230478886529842, 0.9790666485257246, nan, nan, 0.9264608599779492, nan, 0.874809665673618, nan, nan, 0.8219360825938171, 0.8528815522733855, nan, nan, 0.8922889910681253, nan, nan, nan, nan, nan, nan, 0.9822658042979345, 0.0, nan, 0.9733315542766975, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.30532915360501567, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9201264488935722, 0.7209920283436669, nan, nan, nan, 0.9507543288187896, nan, nan, nan, 0.2559612846458425, 0.8454473891367095, nan, nan, 0.9304828973843058, 0.8148278816591905, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0075 | 60.0 | 1200 | 0.1923 | 0.6393 | 0.7590 | 0.9470 | [0.8832209514243248, 0.979933486825275, 0.9614296857428614, 0.9284353337504552, 0.25273456077179157, 0.9638390373560258, 0.9840069270552995, nan, 0.7278523532468849, nan, 0.8513045280895549, 0.8888301038291119, 0.9184842794549569, 0.7640806353084911, 0.6061924971345374, 0.8997974095741048, 0.9756730640581563, 0.5978903627936711, nan, 0.8014778003007912, 0.4625773776027012, nan, 0.9287272215577288, nan, 0.0, nan, nan, 0.9194657880506546, nan, nan, nan, nan, nan, 0.6881474229408793, nan, nan, 0.5558979890592496, 0.706544125218824, 0.0, nan, nan, 0.0, 0.0, 0.2408888888888889, nan, nan, nan, 0.8763795288025414, nan, nan, 0.9750710890949831, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34826883910386963, nan, 0.6942286099258926, 0.8962499336975548, nan, nan, 0.8300109094515521, 0.0, 0.8750731612148013, nan, nan, 0.7931853756708408, 0.8419689119170984, nan, nan, 0.8724814698820336, nan, nan, nan, nan, nan, nan, 0.8225070224719101, 0.0, nan, 0.9688558513774407, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.21248185776487663, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8992427429533025, 0.694421315570358, nan, nan, nan, 0.9480063386012249, nan, nan, nan, 0.3051573426573427, 0.8455158113011924, 0.0, nan, 0.9050290325755339, 0.7910852522064987, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9594524307515244, 0.988036956981537, 0.9959795756494151, 0.9694202582945474, 0.27044711014176664, 0.9943034476894954, 0.9921988192042135, nan, 0.7691720359530092, nan, 0.9993221046044741, 0.9754681376106837, 0.9453860640301318, 0.8377201794923314, 0.6165346019943604, 0.9670939961833929, 0.9985211736890631, 0.8623789764868603, nan, 0.8444367895280744, 0.9491916859122402, nan, 0.9651945013161743, nan, nan, nan, nan, 0.9566144627180071, nan, nan, nan, nan, nan, 0.8955815865187012, nan, nan, 0.6046765001676165, 0.7079292199752373, nan, nan, nan, 0.0, nan, 0.2408888888888889, nan, nan, nan, 0.9684063146787873, nan, nan, 0.9879964569694871, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.42431761786600497, nan, 0.7292757725878745, 0.9852669634006492, nan, nan, 0.9227122381477398, nan, 0.8908308507116849, nan, nan, 0.8229511049243083, 0.8470894874022589, nan, nan, 0.8834099677606891, nan, nan, nan, nan, nan, nan, 0.977467139578552, 0.0, nan, 0.9723743444563048, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22946708463949844, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9009483667017913, 0.7387068201948627, nan, nan, nan, 0.9486970684039088, nan, nan, nan, 0.3071711394632644, 0.8559433219627395, nan, nan, 0.9251509054325956, 0.8086285973489239, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0128 | 61.0 | 1220 | 0.1923 | 0.6385 | 0.7598 | 0.9471 | [0.8846226920043964, 0.9797071483041522, 0.9609692804039838, 0.9285549926898256, 0.2492678607913914, 0.9673501479381763, 0.9839314032305598, nan, 0.7342592709823633, nan, 0.8405479291843925, 0.8895391367959035, 0.9182353648275033, 0.7666626198858808, 0.6292790075332189, 0.893354950573082, 0.9757628025932314, 0.5921197793538219, nan, 0.8013119676250775, 0.4601123595505618, nan, 0.9265800561797752, nan, 0.0, nan, nan, 0.917622128883339, nan, nan, nan, nan, nan, 0.6859718078641222, nan, nan, 0.5486872966633881, 0.7067682722755709, 0.0, nan, nan, 0.0, 0.0, 0.25066666666666665, nan, nan, nan, 0.8742973284749053, nan, nan, 0.9771569779706476, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3432650527622595, nan, 0.6821813913593066, 0.8958116568203862, nan, nan, 0.8334662148694439, 0.0, 0.8770539715528999, nan, nan, 0.7908692129306251, 0.8311239193083574, nan, nan, 0.8710250117487337, nan, nan, nan, nan, nan, nan, 0.81991599579979, 0.0, nan, 0.9686690714021458, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2116788321167883, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.907465825446898, 0.689795918367347, nan, nan, nan, 0.9521178637200737, nan, nan, nan, 0.31181102362204727, 0.8335497273435472, 0.0, nan, 0.9034298705149748, 0.7739310548897766, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9583955822626518, 0.9875004513869186, 0.9962092669707184, 0.9681468713436262, 0.26608505997818976, 0.993772052884411, 0.9926613211509692, nan, 0.7768475479886944, nan, 0.9993916323373485, 0.970823051241109, 0.9464268014669442, 0.8458910990556561, 0.6408576064525277, 0.9706170181533493, 0.9983903310319225, 0.8660673121254034, nan, 0.8457802273510162, 0.9457274826789839, nan, 0.9647923369406259, nan, nan, nan, nan, 0.9518389135298505, nan, nan, nan, nan, nan, 0.8930743937525688, nan, nan, 0.6077774052966812, 0.7081355757325629, nan, nan, nan, 0.0, nan, 0.25066666666666665, nan, nan, nan, 0.9688730165171868, nan, nan, 0.9903686584335325, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4574028122415219, nan, 0.7129511677282377, 0.9873272561177088, nan, nan, 0.9220507166482911, nan, 0.8940086064217146, nan, nan, 0.8208920596253118, 0.8352157544164495, nan, nan, 0.8816130225675176, nan, nan, nan, nan, nan, nan, 0.977467139578552, 0.0, nan, 0.9721306910474776, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22727272727272727, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9093782929399368, 0.7484499557130204, nan, nan, nan, 0.9528115892336705, nan, nan, nan, 0.3135943686757589, 0.8422986092889005, nan, nan, 0.9195171026156942, 0.7909379567562119, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.02 | 62.0 | 1240 | 0.1988 | 0.6257 | 0.7582 | 0.9453 | [0.8783292512600449, 0.9796283973998055, 0.9610360830741319, 0.928610953461975, 0.26121675576910064, 0.9680386105184664, 0.9841593798565731, nan, 0.724144046005548, nan, 0.8510131284874858, 0.8934453557378143, 0.9181923021479026, 0.771131653676515, 0.5274332986698537, 0.9096303792809991, 0.9757619376174728, 0.5988186462324393, nan, 0.802713580888081, 0.46318156267566046, nan, 0.9277713402858246, nan, 0.0, nan, nan, 0.9163281986796605, nan, nan, nan, nan, nan, 0.6816032140202142, nan, nan, 0.5544554455445545, 0.706353292488287, 0.0, 0.0, nan, 0.0, 0.0, 0.2737777777777778, nan, nan, nan, 0.8727946816670928, nan, nan, 0.9745178786661309, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34210526315789475, nan, 0.6792179580014482, 0.8904489939372638, nan, nan, 0.8302484410571117, 0.0, 0.8627069133398247, nan, nan, 0.7862325946263974, 0.824985607369027, nan, nan, 0.8703925759866534, nan, nan, nan, nan, nan, nan, 0.8264144333508495, 0.0, nan, 0.9689696633370329, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20057803468208094, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9108681942400673, 0.7072975140336808, nan, nan, nan, 0.9451773171149563, nan, nan, nan, 0.3037697892066824, 0.8412039439543332, 0.0, nan, 0.9037746919616664, 0.7902422179468206, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9603952945342673, 0.9873250553271394, 0.9961117720933307, 0.9662755461722726, 0.2806252271901127, 0.9933681928325468, 0.9922271356499333, nan, 0.7657964180943834, nan, 0.9994090142705672, 0.972492379155175, 0.9469967291109129, 0.8547987408746902, 0.5353266434568913, 0.9675588393599843, 0.9983538168020228, 0.8646841862609498, nan, 0.851877368239752, 0.9515011547344111, nan, 0.9659988300672712, nan, nan, nan, nan, 0.9534220407949654, nan, nan, nan, nan, nan, 0.8855939169749281, nan, nan, 0.6148172980221254, 0.7077744531572431, nan, nan, nan, 0.0, nan, 0.2737777777777778, nan, nan, nan, 0.9697049632725945, nan, nan, 0.9878030156483848, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.41935483870967744, nan, 0.7080915310214674, 0.9848782289257323, nan, nan, 0.9248070562293275, nan, 0.8798411122144985, nan, nan, 0.8138449045879009, 0.8300028960324356, nan, nan, 0.8823529411764706, nan, nan, nan, nan, nan, nan, 0.9843521802628834, 0.0, nan, 0.9724323571726923, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2175548589341693, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9131717597471022, 0.7812223206377326, nan, nan, nan, 0.9458254757414709, nan, nan, nan, 0.30558732952045753, 0.8506953555497245, nan, nan, 0.9297786720321931, 0.8073685802126909, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0234 | 63.0 | 1260 | 0.1964 | 0.6494 | 0.7565 | 0.9457 | [0.8785675466310842, 0.9790631327075281, 0.9614285030551523, 0.928839387902166, 0.2563093622795115, 0.9672382055471674, 0.9840300016854879, nan, 0.731473640304865, nan, 0.843215741473857, 0.8869427800092184, 0.9237376555941087, 0.7629485700406825, 0.5309206230762056, 0.9035101779131272, 0.9756446352492768, 0.5988296488946684, nan, 0.8031735751295337, 0.4771513353115727, nan, 0.9272587251023908, nan, 0.0, nan, nan, 0.9215519292664905, nan, nan, nan, nan, nan, 0.6850257362779203, nan, nan, 0.5552844888644054, 0.70797621805266, 0.0, nan, nan, 0.0, nan, 0.24622222222222223, nan, nan, nan, 0.8709630465238988, nan, nan, 0.9754085180793118, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34243964421855144, nan, 0.6866087895281364, 0.8917198584712764, nan, nan, 0.8224846199642786, 0.0, 0.8762022355081882, nan, nan, 0.7820936484821328, 0.8289170506912442, nan, nan, 0.8730996290684917, nan, nan, nan, nan, nan, nan, 0.8180239624934884, 0.0, nan, 0.969756340014333, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.21980255516840883, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9104477611940298, 0.6808688387635756, nan, nan, nan, 0.9497771683236201, nan, nan, nan, 0.28184922511163646, 0.8349792099792099, 0.0, nan, 0.908284898039602, 0.8099364813629425, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9588980002877284, 0.9878564022141173, 0.9958242448278142, 0.9705312422720176, 0.2746637586332243, 0.9940271223908516, 0.9919109353393962, nan, 0.7752133202952326, nan, 0.9996175974691905, 0.9776455218464218, 0.9433541480820696, 0.8415377402719175, 0.5380046944658864, 0.9653814160591084, 0.9985455165089961, 0.8492392807745505, nan, 0.8543575611436445, 0.9284064665127021, nan, 0.9684483767183387, nan, nan, nan, nan, 0.9627638000287841, nan, nan, nan, nan, nan, 0.8943485408960131, nan, nan, 0.5913509889373114, 0.709528477094511, nan, nan, nan, 0.0, nan, 0.24622222222222223, nan, nan, nan, 0.9713282740148533, nan, nan, 0.988169536046263, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.445822994210091, nan, 0.7201698513800424, 0.9748100060253844, nan, nan, 0.9138919514884234, nan, 0.8926183382985766, nan, nan, 0.809900817817992, 0.8334781349551115, nan, nan, 0.8832514137730564, nan, nan, nan, nan, nan, nan, 0.9828917170874192, 0.0, nan, 0.9734243746229173, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2373040752351097, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9127502634351949, 0.7218777679362267, nan, nan, nan, 0.9499399965712326, nan, nan, nan, 0.28323801143862737, 0.8430858042508528, nan, nan, 0.9275653923541247, 0.8290408749558994, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0194 | 64.0 | 1280 | 0.1946 | 0.6509 | 0.7569 | 0.9460 | [0.8797088850251884, 0.9799556958463577, 0.96148169658599, 0.9272891662382329, 0.2556436851738865, 0.9673712759570195, 0.9846212202808828, nan, 0.7206164479284453, nan, 0.8416870326196713, 0.8889623702136102, 0.9226897669804244, 0.7706455320951056, 0.5602104018176725, 0.9078017428695593, 0.9756451421096445, 0.5981353480147886, nan, 0.8068499382595697, 0.4808277541083384, nan, 0.9289504199023156, nan, 0.0, nan, nan, 0.923159058362435, nan, nan, nan, nan, nan, 0.6788469914753282, nan, nan, 0.5549086849031023, 0.7068091131419745, 0.0, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.8670331061167664, nan, nan, 0.9742735591587735, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3505786249149081, nan, 0.6911579326490186, 0.8869905200578437, nan, nan, 0.8269725133478347, 0.0, 0.8802469135802469, nan, nan, 0.7762057877813505, 0.8182080092192452, nan, nan, 0.8725049639460759, nan, nan, nan, nan, nan, nan, 0.819327731092437, 0.0, nan, 0.9695006357646515, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.27035641842944075, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9038097242685751, 0.6753794266441822, nan, nan, nan, 0.9393965369449683, nan, nan, nan, 0.26681260945709284, 0.839968774395004, 0.0, nan, 0.9099320799291268, 0.8301357733175915, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.960836846939566, 0.9881452898419888, 0.9957697137946989, 0.9652937463782292, 0.274154852780807, 0.9931768907027165, 0.9922884879489926, nan, 0.7597685290611228, nan, 0.9997218890685022, 0.975613296559733, 0.9448904747745069, 0.8531243721117139, 0.5670851777753273, 0.971032930469247, 0.9985668164764376, 0.8577685569386814, nan, 0.8553220806062694, 0.9122401847575058, nan, 0.966547236033928, nan, nan, nan, nan, 0.9615077651738169, nan, nan, nan, nan, nan, 0.8968146321413892, nan, nan, 0.5831377807576266, 0.70821295914156, nan, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.9730327502942251, nan, nan, 0.9885869620549577, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.42597187758478083, nan, 0.7243217740033027, 0.965674745864837, nan, nan, 0.9221609702315325, nan, 0.8968553459119497, nan, nan, 0.805086711907662, 0.822473211699971, nan, nan, 0.8825114951641034, nan, nan, nan, nan, nan, nan, 0.9764239515960776, 0.0, nan, 0.9731227084977027, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.29247648902821316, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9049525816649104, 0.7094774136403897, nan, nan, nan, 0.9393965369449683, nan, nan, nan, 0.2681038275406951, 0.847021779060614, nan, nan, 0.929979879275654, 0.8505115669573107, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.008 | 65.0 | 1300 | 0.2010 | 0.6497 | 0.7567 | 0.9449 | [0.8758932153529457, 0.9805794919844963, 0.9617009868379058, 0.9252969150622555, 0.2523987750935692, 0.9668182757907794, 0.984051953881112, nan, 0.7177942951527857, nan, 0.8377272197622931, 0.885824931561728, 0.9227212201912601, 0.766593992248062, 0.5254969615019972, 0.9206890947505282, 0.9756318709086693, 0.5989566866898514, nan, 0.8105115764108242, 0.4756598240469208, nan, 0.9278133865713986, nan, 0.0, nan, nan, 0.9225033522563505, nan, nan, nan, nan, nan, 0.6780005586765573, nan, nan, 0.560610806577917, 0.7053295571575695, 0.0, nan, nan, 0.0, nan, 0.272, nan, nan, nan, 0.8670216619575455, nan, nan, 0.9726856297007521, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3345272206303725, nan, 0.6887566495356595, 0.8900426742532006, nan, nan, 0.8286880350388214, 0.0, 0.8723279457768509, nan, nan, 0.7767779796003912, 0.8146439896223696, nan, nan, 0.8719168060200669, nan, nan, nan, nan, nan, nan, 0.8126297577854671, 0.0, nan, 0.9704845458538502, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2646889400921659, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9104289318755256, 0.6797385620915033, nan, nan, nan, 0.9280387450711469, nan, nan, nan, 0.2790575457650872, 0.8354134165366615, 0.0, nan, 0.8956033123028391, 0.8408912059807201, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9591691289576486, 0.9879750524898503, 0.9958490316610483, 0.9630422506099339, 0.26964740094511086, 0.9940271223908516, 0.9932984411796631, nan, 0.7567545845444925, nan, 0.9997218890685022, 0.9629118885179271, 0.9444196649816632, 0.8477663920701896, 0.5326328392066668, 0.9701766404071047, 0.9986185449687955, 0.8734439834024896, nan, 0.8573889080261798, 0.9364896073903002, nan, 0.9684849371161158, nan, nan, nan, nan, 0.9631301435281495, nan, nan, nan, nan, nan, 0.8978421701602959, nan, nan, 0.5999832383506537, 0.7066394964919521, nan, nan, nan, 0.0, nan, 0.272, nan, nan, nan, 0.9713485653991315, nan, nan, 0.9901446737459403, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.38626964433416044, nan, 0.7208303845246521, 0.9729246438220374, nan, nan, 0.9178610804851157, nan, 0.8861304203905992, nan, nan, 0.8061307348761673, 0.8184187662901824, nan, nan, 0.8818244278843613, nan, nan, nan, nan, nan, nan, 0.9799707907364907, 0.0, nan, 0.974155334849399, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.28808777429467086, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9125395152792413, 0.7369353410097431, nan, nan, nan, 0.9280387450711469, nan, nan, nan, 0.28033435987681476, 0.8430858042508528, nan, nan, 0.9139839034205232, 0.8617005191270601, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0182 | 66.0 | 1320 | 0.2009 | 0.6493 | 0.7551 | 0.9454 | [0.8771949663182721, 0.9802383875588295, 0.9619323180886958, 0.9254530769557763, 0.2522639068564036, 0.9687461139896373, 0.983537671248884, nan, 0.7172822096022157, nan, 0.8498884175989831, 0.8845746791903691, 0.922679787672395, 0.7619454967615789, 0.5377705947256367, 0.9279703962099607, 0.9754026998196357, 0.6001597444089457, nan, 0.8125875036629441, 0.46352670878805285, nan, 0.9280269767115108, nan, 0.0, nan, nan, 0.9236289146936109, nan, nan, nan, nan, nan, 0.6748781513259315, nan, nan, 0.5609024928999684, 0.7058444902162719, 0.0, nan, nan, 0.0, nan, 0.24266666666666667, nan, nan, nan, 0.8715651570995876, nan, nan, 0.9711799457129171, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33071988595866003, nan, 0.6946832935668302, 0.8856848609680742, nan, nan, 0.8322918766377746, 0.0, 0.8696588166220889, nan, nan, 0.7809294916580499, 0.8271391529818496, nan, nan, 0.872259344330758, nan, nan, nan, nan, nan, nan, 0.8177708224656581, 0.0, nan, 0.9707223477602538, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2541101817132968, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9050726162913071, 0.6855500821018062, nan, nan, nan, 0.920519302455118, nan, nan, nan, 0.2685939553219448, 0.822303282959875, 0.0, nan, 0.9060146699266504, 0.8360720259765817, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9614731693281542, 0.988496081961547, 0.995460704607046, 0.964260272910815, 0.2693565976008724, 0.9935382391701739, 0.9930624707986654, nan, 0.7562723534218316, nan, 0.9995654516695347, 0.9706053128175351, 0.9432798096937258, 0.8351751389726073, 0.5451409127427969, 0.9632284581885795, 0.9988558874631435, 0.8660673121254034, nan, 0.8596968653117465, 0.9318706697459584, nan, 0.9659257092717168, nan, nan, nan, nan, 0.9615862673522524, nan, nan, nan, nan, nan, 0.8906699547883272, nan, nan, 0.5958766342608113, 0.7071553858852662, nan, nan, nan, 0.0, nan, 0.24266666666666667, nan, nan, nan, 0.9692585528184733, nan, nan, 0.9908268089308804, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3837882547559967, nan, 0.7280490681764568, 0.9695037804427685, nan, nan, 0.9104740904079383, nan, 0.8825554452168156, nan, nan, 0.8103938286642306, 0.8314509122502172, nan, nan, 0.8830928597854236, nan, nan, nan, nan, nan, nan, 0.98122261631546, 0.0, nan, 0.9744221933447812, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2761755485893417, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9062170706006323, 0.7395925597874224, nan, nan, nan, 0.9207954740270873, nan, nan, nan, 0.26977562692476903, 0.8281290999737602, nan, nan, 0.9319919517102616, 0.8565092485257799, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.025 | 67.0 | 1340 | 0.2063 | 0.6477 | 0.7549 | 0.9443 | [0.8736914712734519, 0.9797302481164936, 0.9612205022689029, 0.9283955254290138, 0.2519395671702736, 0.9647527121230871, 0.9836264204080296, nan, 0.7105008899059242, nan, 0.8528905763355382, 0.8843640683783248, 0.9262973420309609, 0.762613981762918, 0.47524460319925454, 0.9262967402289125, 0.9756241843762913, 0.5983593626755008, nan, 0.8116769380473565, 0.4539877300613497, nan, 0.9277099985976721, nan, 0.0, nan, nan, 0.9202498016947231, nan, nan, nan, nan, nan, 0.6811928053013568, nan, nan, 0.5587016034415331, 0.7058005715609794, 0.0, nan, nan, 0.0, nan, 0.2391111111111111, nan, nan, nan, 0.8692869868741592, nan, nan, 0.9689323686945399, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33488992661774514, nan, 0.6819963064726814, 0.8894146419267372, nan, nan, 0.8312525171163915, 0.0, 0.8844377967093712, nan, nan, 0.779737409367039, 0.8375035950532068, nan, nan, 0.8725705329153605, nan, nan, nan, nan, nan, nan, 0.818734793187348, 0.0, nan, 0.9692429523528766, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20901994796183868, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9043313708999159, 0.6969943135662063, nan, nan, nan, 0.9215871111491988, nan, nan, nan, 0.2818269314900691, 0.8307332293291732, 0.0, nan, 0.9072296238244514, 0.8356299212598425, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9615827274437546, 0.9881504485496293, 0.9958391169277546, 0.96706910615618, 0.26913849509269355, 0.9942821918972921, 0.9922979267642326, nan, 0.7486370273130349, nan, 0.9995828336027534, 0.972492379155175, 0.9430072356031322, 0.8401982452615364, 0.4820649348603475, 0.9642560062631502, 0.9986611449036785, 0.8743660673121254, nan, 0.8620048225973131, 0.9399538106235565, nan, 0.9674612459783563, nan, nan, nan, nan, 0.9562742866114534, nan, nan, nan, nan, nan, 0.887258528565557, nan, nan, 0.59864230640295, 0.7071295914156005, nan, nan, nan, 0.0, nan, 0.2391111111111111, nan, nan, nan, 0.9702528306481069, nan, nan, 0.9871921483185877, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4152191894127378, nan, 0.7143665958952583, 0.9754708546327431, nan, nan, 0.9102535832414553, nan, 0.9003641178417743, nan, nan, 0.8077547706049533, 0.8433246452360267, nan, nan, 0.8826700491517362, nan, nan, nan, nan, nan, nan, 0.9828917170874192, 0.0, nan, 0.9727572283844619, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22664576802507838, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9064278187565858, 0.7599645704162976, nan, nan, nan, 0.9218241042345277, nan, nan, nan, 0.28341399032116144, 0.8383626344791393, nan, nan, 0.9316901408450704, 0.8558036389294894, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0216 | 68.0 | 1360 | 0.2074 | 0.6324 | 0.7528 | 0.9437 | [0.8727529030396326, 0.9791916909881775, 0.9616537766211221, 0.929539314609052, 0.25981955091242115, 0.9681313675142881, 0.9838699865700208, nan, 0.7152685223969638, nan, 0.8416893493722013, 0.8856879161447689, 0.9286952699567776, 0.7628205128205128, 0.44383025635451767, 0.9186022228925447, 0.9757329193654872, 0.5985319929790969, nan, 0.8100319138648012, 0.44299674267100975, nan, 0.927901113523356, nan, 0.0, nan, nan, 0.9196799275307617, nan, nan, nan, nan, nan, 0.6793222564359066, nan, nan, 0.555176424970919, 0.705010040677617, 0.0, nan, nan, 0.0, 0.0, 0.21244444444444444, nan, nan, nan, 0.869641072239891, nan, nan, 0.9702188876789514, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3403041825095057, nan, 0.6788012419565316, 0.8869554376090906, nan, nan, 0.825098814229249, 0.0, 0.8742113821138211, nan, nan, 0.7794570389028828, 0.8272884283246977, nan, nan, 0.8739364201075325, nan, nan, nan, nan, nan, nan, 0.8155794587092297, 0.0, nan, 0.9703643771491317, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.14907192575406034, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9071969696969697, 0.6935615321923391, nan, nan, nan, 0.9203668781073204, nan, nan, nan, 0.27733520091044384, 0.8343302990897269, 0.0, nan, 0.9067454455933038, 0.8367437284800787, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9605767847459691, 0.9882639401177217, 0.9959002577830657, 0.9677150270733139, 0.27844420210832427, 0.993772052884411, 0.9922743297261328, nan, 0.7548390553628119, nan, 0.9997392710017208, 0.9751052402380607, 0.9423877490336009, 0.8368495077355836, 0.45029064730068213, 0.9685619220042081, 0.9983538168020228, 0.8646841862609498, nan, 0.8655873234584912, 0.9422632794457275, nan, 0.96881398069611, nan, nan, nan, nan, 0.9563920398791066, nan, nan, nan, nan, nan, 0.8931565967940813, nan, nan, 0.5999832383506537, 0.7063557573256294, nan, nan, nan, 0.0, nan, 0.21244444444444444, nan, nan, nan, 0.9709833204821233, nan, nan, 0.9887498600095702, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4441687344913151, nan, 0.71172446331682, 0.967929405819355, nan, nan, 0.9206174200661521, nan, 0.8898378020523006, nan, nan, 0.8076677686909112, 0.8323197219808862, nan, nan, 0.8848369536493843, nan, nan, nan, nan, nan, nan, 0.9808053411224703, 0.0, nan, 0.9740741170464566, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.16112852664576802, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9085353003161223, 0.7537643932683791, nan, nan, nan, 0.9203668781073204, nan, nan, nan, 0.27875054993400794, 0.8417738126475991, nan, nan, 0.9263581488933602, 0.8573660601784184, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0314 | 69.0 | 1380 | 0.2052 | 0.6459 | 0.7523 | 0.9446 | [0.8762166929362688, 0.9794756362706228, 0.9610684418768511, 0.9291690755801787, 0.25093760654619846, 0.9667831806242114, 0.9846780613439475, nan, 0.7034695097902008, nan, 0.8351629203694023, 0.8887632978723404, 0.9282193919875079, 0.7706405480110564, 0.5259092322039167, 0.911947944067562, 0.9757457692650864, 0.6018473505104521, nan, 0.8120606256046436, 0.46747967479674796, nan, 0.9287970241437394, nan, 0.0, nan, nan, 0.9201474974515159, nan, nan, nan, nan, nan, 0.6700924974306269, nan, nan, 0.5554497102484719, 0.7042347792508689, 0.0, nan, nan, 0.0, nan, 0.24711111111111111, nan, nan, nan, 0.8699436670906778, nan, nan, 0.9747255272175103, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3264382676147382, nan, 0.6764493407982662, 0.8732887103971442, nan, nan, 0.8286370903277378, 0.0, 0.8679061301436651, nan, nan, 0.7818176729084446, 0.8183914672816374, nan, nan, 0.868747708584298, nan, nan, nan, nan, nan, nan, 0.8202364394993046, 0.0, nan, 0.9692539798147956, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.17468649752114318, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9002315302041676, 0.6853556485355649, nan, nan, nan, 0.9215669466826676, nan, nan, nan, 0.26389983363978636, 0.8261322228006247, 0.0, nan, 0.9012041058033952, 0.8414862204724409, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.960941978464637, 0.9877016409849004, 0.9962390111705995, 0.9669214670894065, 0.26753907669938204, 0.9935594949623773, 0.9923875955090117, nan, 0.7406534231712054, nan, 0.9997392710017208, 0.9701698359703875, 0.9427098820497571, 0.8589511754068716, 0.5330424234786307, 0.9669227381709644, 0.9982838311947152, 0.8561549100968188, nan, 0.8674474681364106, 0.9295612009237876, nan, 0.9676440479672419, nan, nan, nan, nan, 0.9566013790216011, nan, nan, nan, nan, nan, 0.8977394163584053, nan, nan, 0.5864063023801542, 0.7056335121749897, nan, nan, nan, 0.0, nan, 0.24711111111111111, nan, nan, nan, 0.9714094395519662, nan, nan, 0.9933822705938649, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.41770057899090157, nan, 0.7068648266100496, 0.9509611459892321, nan, nan, 0.9143329658213892, nan, 0.8838795100959947, nan, nan, 0.8100458210080622, 0.8221836084564147, nan, nan, 0.8766449976216902, nan, nan, nan, nan, nan, nan, 0.9843521802628834, 0.0, nan, 0.9727514271128231, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1877742946708464, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9013698630136986, 0.7254207263064659, nan, nan, nan, 0.9215669466826676, nan, nan, nan, 0.2652001759788825, 0.8328522697454737, nan, nan, 0.9186116700201207, 0.8618013204979588, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0434 | 70.0 | 1400 | 0.2037 | 0.6469 | 0.7543 | 0.9448 | [0.8769420096994309, 0.9787297154926841, 0.9603484027558792, 0.9286568212536052, 0.23324618136782715, 0.9671859805929696, 0.9842219320293376, nan, 0.7147341398942659, nan, 0.8369764531245452, 0.8883572567783095, 0.927233658095866, 0.7670999878508079, 0.5329500427317225, 0.9133677700308077, 0.9757329765090693, 0.5995157384987894, nan, 0.8114988282880164, 0.4497528830313015, nan, 0.9276290463692038, nan, 0.0, nan, nan, 0.923755245193211, nan, nan, nan, nan, nan, 0.6713479681815379, nan, nan, 0.571818108910097, 0.7061473510786181, 0.0, nan, nan, 0.0, nan, 0.26666666666666666, nan, nan, nan, 0.8702929630706601, nan, nan, 0.9733028375098629, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3282828282828283, nan, 0.673942455127627, 0.8815831756346627, nan, nan, 0.832104424073336, 0.0, 0.8760791950665369, nan, nan, 0.7839404205280399, 0.8302104352839434, nan, nan, 0.8743928552775891, nan, nan, nan, nan, nan, nan, 0.8218080473785054, 0.0, nan, 0.9682322198711946, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.16943231441048034, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9094157208911308, 0.6838174273858921, nan, nan, nan, 0.9240989156987957, nan, nan, nan, 0.26221755123489227, 0.8286011982287054, 0.0, nan, 0.9069561799960699, 0.8237466758593519, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9597169195356506, 0.9872321985896093, 0.9964257386476304, 0.966223872498902, 0.24645583424209377, 0.9936445181311907, 0.9923970343242516, nan, 0.7533655713769039, nan, 0.9996697432688464, 0.9702424154449122, 0.9409505401922886, 0.8457571495546179, 0.5403046676853762, 0.9646963840093947, 0.9984785737541801, 0.8561549100968188, nan, 0.8707888391319325, 0.9457274826789839, nan, 0.969106463878327, nan, nan, nan, nan, 0.9648964425429473, nan, nan, nan, nan, nan, 0.8949650637073572, nan, nan, 0.5975527991954408, 0.7075680973999174, nan, nan, nan, 0.0, nan, 0.26666666666666666, nan, nan, nan, 0.9716935189318615, nan, nan, 0.9921503548121073, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.43010752688172044, nan, 0.7050719509318235, 0.9598048552935917, nan, nan, 0.9207276736493936, nan, 0.893478980470043, nan, nan, 0.8120178643930166, 0.8340573414422241, nan, nan, 0.8848369536493843, nan, nan, nan, nan, nan, nan, 0.9843521802628834, 0.0, nan, 0.9715911727850745, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.18244514106583073, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9119072708113805, 0.7298494242692648, nan, nan, nan, 0.9241385222012687, nan, nan, nan, 0.26344038715354157, 0.8346890579900289, nan, nan, 0.928672032193159, 0.8430522655108109, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0056 | 71.0 | 1420 | 0.2062 | 0.6455 | 0.7533 | 0.9442 | [0.8748289062295267, 0.9786648806450129, 0.9608138454452291, 0.9294037959600545, 0.24759106129980182, 0.9655927302767452, 0.9838034500196662, nan, 0.7163041410362103, nan, 0.8496704985371908, 0.8858852933438569, 0.9282215018645349, 0.7662662357460821, 0.4814308356900892, 0.911271536314782, 0.9757316054133299, 0.6013491808544812, nan, 0.8101580135440181, 0.4409406734366649, nan, 0.927221833449965, nan, 0.0, nan, nan, 0.9229186272664534, nan, nan, nan, nan, nan, 0.6714076585253101, nan, nan, 0.569907002623003, 0.706675935223089, 0.0, nan, nan, 0.0, nan, 0.25155555555555553, nan, nan, nan, 0.8699438253313214, nan, nan, 0.9721132245020219, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33118140735958684, nan, 0.6751440403312927, 0.880587666869475, nan, nan, 0.8308590657165479, 0.0, 0.8750649350649351, nan, nan, 0.783469085005738, 0.8378456221198156, nan, nan, 0.8736315295589615, nan, nan, nan, nan, nan, nan, 0.8219345810739899, 0.0, nan, 0.9687866684392756, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.17737268518518517, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9102564102564102, 0.6775271512113618, nan, nan, nan, 0.9208552941680593, nan, nan, nan, 0.2356034709439916, 0.8330294347486324, 0.0, nan, 0.906695506833153, 0.8279342260732572, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9605236656596173, 0.9874798165563563, 0.9961894375041311, 0.9673200925696949, 0.263395129043984, 0.9937933086766144, 0.9915711379907594, nan, 0.7565268642921249, nan, 0.9995306878030975, 0.9775729423718972, 0.943701060561007, 0.8416047150224365, 0.4880669197687424, 0.9666291530068014, 0.998420759556839, 0.8630705394190872, nan, 0.8654150878401653, 0.9526558891454965, nan, 0.9688505410938871, nan, nan, nan, nan, 0.9623320380473891, nan, nan, nan, nan, nan, 0.8954171804356761, nan, nan, 0.6009051290647, 0.7080066033842344, nan, nan, nan, 0.0, nan, 0.25155555555555553, nan, nan, nan, 0.9710036118664015, nan, nan, 0.9912544160617384, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.42431761786600497, nan, 0.7076669025713612, 0.9564617388093063, nan, nan, 0.9255788313120177, nan, 0.8921549155908639, nan, nan, 0.8117568586508903, 0.8424558355053576, nan, nan, 0.8856825749167592, nan, nan, nan, nan, nan, nan, 0.9803880659294805, 0.0, nan, 0.9723105304682786, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1921630094043887, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9127502634351949, 0.7183348095659876, nan, nan, nan, 0.9210526315789473, nan, nan, nan, 0.2365156181258249, 0.8391498294410916, nan, nan, 0.9277665995975856, 0.84758832720125, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0193 | 72.0 | 1440 | 0.2084 | 0.6451 | 0.7505 | 0.9445 | [0.8753517209129829, 0.9790715995825574, 0.9601649073471032, 0.9288862321198659, 0.236840299224487, 0.9658485186562539, 0.9839085188544286, nan, 0.7095166028665539, nan, 0.8465456579664061, 0.8871726881007623, 0.9233155598642754, 0.7621472392638037, 0.5050819009728654, 0.9107192382361098, 0.9757445821538169, 0.599128891756735, nan, 0.8083696146598219, 0.47103128621089224, nan, 0.9272765524737894, nan, 0.0, nan, nan, 0.9238959832247209, nan, nan, nan, nan, nan, 0.676258543740832, nan, nan, 0.5642368103101087, 0.7047209637561779, 0.0, nan, nan, 0.0, nan, 0.25955555555555554, nan, nan, nan, 0.8732018497870552, nan, nan, 0.972076701591618, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3246505717916137, nan, 0.6688899905358511, 0.8880492434321395, nan, nan, 0.8353818255087648, 0.0, 0.8567611599869664, nan, nan, 0.7865787193605187, 0.8280529953917051, nan, nan, 0.8720200752823086, nan, nan, nan, nan, nan, nan, 0.8301253752428042, 0.0, nan, 0.9678914993987605, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1791218377435301, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.900105152471083, 0.6633081444164568, nan, nan, nan, 0.9267320793521573, nan, nan, nan, 0.22319628298413255, 0.8253719655442443, 0.0, nan, 0.9024390243902439, 0.8388382968250061, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.961212000486925, 0.9872992617889367, 0.9963860797144557, 0.9692910741111206, 0.2508905852417303, 0.9936870297155975, 0.9932465276958437, nan, 0.7473242870346805, nan, 0.999548069736316, 0.9713311075627813, 0.943998414114382, 0.8320273256982118, 0.5119803399549457, 0.9640602828203748, 0.9984785737541801, 0.8561549100968188, nan, 0.857044436789528, 0.9387990762124712, nan, 0.9668397192161451, nan, nan, nan, nan, 0.9626983815467546, nan, nan, nan, nan, nan, 0.8905877517468146, nan, nan, 0.587076768354006, 0.7061751960379694, nan, nan, nan, 0.0, nan, 0.25955555555555554, nan, nan, nan, 0.9693600097398645, nan, nan, 0.989920689058348, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4226633581472291, nan, 0.7002594951639538, 0.9730412641645124, nan, nan, 0.9142227122381478, nan, 0.8703740483283681, nan, nan, 0.8161649556290238, 0.8326093252244425, nan, nan, 0.8815601712383067, nan, nan, nan, nan, nan, nan, 0.9808053411224703, 0.0, nan, 0.9712605003016661, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.19310344827586207, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9020021074815595, 0.6997342781222321, nan, nan, nan, 0.9270101148637065, nan, nan, nan, 0.22402111746590408, 0.8297034898976646, nan, nan, 0.9194164989939638, 0.8588780807418981, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0156 | 73.0 | 1460 | 0.2037 | 0.6458 | 0.7532 | 0.9450 | [0.8768304823508604, 0.9791674122091745, 0.9610234105073845, 0.9276799059296881, 0.24441254869797005, 0.9674034521296411, 0.983936456852032, nan, 0.7126956698515958, nan, 0.8407080939642445, 0.8856428477431241, 0.9222463768115942, 0.7648466257668711, 0.5328651598300933, 0.9131590504588216, 0.9756938766408492, 0.5975590171832343, nan, 0.8082298641357343, 0.4661308840413318, nan, 0.9274484400714311, nan, 0.0, nan, nan, 0.9222323994027528, nan, nan, nan, nan, nan, 0.6789715306394246, nan, nan, 0.566658735823618, 0.7041938058337409, 0.0, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.873076782516537, nan, nan, 0.9729257205654628, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35969084423305586, nan, 0.6736277118758739, 0.8909454108329797, nan, nan, 0.8274560445018377, 0.0, 0.8550488599348535, nan, nan, 0.7856345885634588, 0.8262748487467588, nan, nan, 0.8718552225534808, nan, nan, nan, nan, nan, nan, 0.8245027283928886, 0.0, nan, 0.9698302518191434, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1276658767772512, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9004629629629629, 0.6797004991680532, nan, nan, nan, 0.9246528373049888, nan, nan, nan, 0.23715588286866562, 0.8262910798122066, 0.0, nan, 0.9067663257277734, 0.8349161049057718, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9600810066066864, 0.9880575918120992, 0.9960192345825897, 0.9667516821626171, 0.25997818974918213, 0.9935594949623773, 0.9926896375966888, nan, 0.7513696703414464, nan, 0.9996697432688464, 0.9769197271011758, 0.946104668450788, 0.8349742147210502, 0.5414861607775799, 0.9665312912854137, 0.9985424736565045, 0.8577685569386814, nan, 0.8565621770582157, 0.9376443418013857, nan, 0.9683752559227844, nan, nan, nan, nan, 0.9616647695306878, nan, nan, nan, nan, nan, 0.8905466502260584, nan, nan, 0.598809922896413, 0.7055561287659926, nan, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.9695223408140904, nan, nan, 0.9914885818714939, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5004135649296939, nan, 0.7046473224817174, 0.9795914400668624, nan, nan, 0.918412348401323, nan, 0.8689175769612711, nan, nan, 0.8168029696653326, 0.8305821025195482, nan, nan, 0.8809788066169865, nan, nan, nan, nan, nan, nan, 0.9772585019820572, 0.0, nan, 0.9734591822527499, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.13510971786833856, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.901791359325606, 0.7236492471213464, nan, nan, nan, 0.9246528373049888, nan, nan, nan, 0.2380114386273647, 0.8312778798215691, nan, nan, 0.9275653923541247, 0.8551988307040975, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.033 | 74.0 | 1480 | 0.2052 | 0.6327 | 0.7524 | 0.9446 | [0.8757982158714782, 0.979359463356273, 0.9613601557016145, 0.9265040661920121, 0.25686766601098826, 0.963574930981911, 0.9833918292854135, nan, 0.7072376372403089, nan, 0.8483306038743563, 0.8777533039647577, 0.9148017557746276, 0.7656460502143294, 0.5227237426609922, 0.9101146884067984, 0.9756747998418357, 0.5950465373087238, nan, 0.8053155276250328, 0.4680974477958237, nan, 0.9273746933052927, nan, 0.0, nan, nan, 0.9207126837152774, nan, nan, nan, nan, nan, 0.6784274508578393, nan, nan, 0.5657129194229696, 0.7065001930750419, 0.0, nan, nan, 0.0, 0.0, 0.23644444444444446, nan, nan, nan, 0.8732945608837193, nan, nan, 0.9727757170994091, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35903614457831323, nan, 0.684822273280173, 0.8901225182539823, nan, nan, 0.8239952484656504, 0.0, 0.8494784876140808, nan, nan, 0.7884218165578072, 0.8308844713339095, nan, nan, 0.8700653594771242, nan, nan, nan, nan, nan, nan, 0.8251636878428596, 0.0, nan, 0.9689026328718823, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.11222978975421972, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8939170700905072, 0.6846921797004991, nan, nan, nan, 0.9247182827027722, nan, nan, nan, 0.23729407641079564, 0.8261662757362522, 0.0, nan, 0.9087871043837232, 0.8353184368540211, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9604384537919282, 0.9881452898419888, 0.9958011104501289, 0.9658990665520003, 0.27531806615776083, 0.9941334013518683, 0.9917410366650778, nan, 0.744658620551083, nan, 0.9994437781370044, 0.9833793003338656, 0.9450639310139756, 0.8373853057397361, 0.5301595803336536, 0.9668493418799237, 0.9985942021488624, 0.8695251267865376, nan, 0.8475370306579401, 0.9318706697459584, nan, 0.9673150043872477, nan, nan, nan, nan, 0.9614292629953815, nan, nan, nan, nan, nan, 0.8922729140978216, nan, nan, 0.5948709353000335, 0.7079034255055716, nan, nan, nan, 0.0, nan, 0.23644444444444446, nan, nan, nan, 0.9689135992857433, nan, nan, 0.990602824243288, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.49296939619520264, nan, 0.7171974522292993, 0.9786001671558242, nan, nan, 0.9177508269018743, nan, 0.8626944720291294, nan, nan, 0.8191520213444696, 0.8352157544164495, nan, nan, 0.8794461180698695, nan, nan, nan, nan, nan, nan, 0.9728771124556645, 0.0, nan, 0.972438158444331, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.11880877742946709, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8950474183350896, 0.728963684676705, nan, nan, nan, 0.9249957140408024, nan, nan, nan, 0.23827540695116586, 0.8318026764628706, nan, nan, 0.9301810865191147, 0.8554004334458949, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0308 | 75.0 | 1500 | 0.2072 | 0.6451 | 0.7508 | 0.9443 | [0.8751402049991082, 0.9784107976663635, 0.9609964923469387, 0.9257416627125487, 0.2488237299693147, 0.9665412134496878, 0.983618830157074, nan, 0.7125100189565018, nan, 0.848741604546461, 0.8805462981114814, 0.9207382224380767, 0.7611318674588106, 0.5060837093511747, 0.9111321364286703, 0.9756685644631417, 0.5938352865628512, nan, 0.8066948326291843, 0.46714031971580816, nan, 0.9272306234868952, nan, 0.0, nan, nan, 0.9225575306600524, nan, nan, nan, nan, nan, 0.6747191446501898, nan, nan, 0.5690625502492362, 0.7045981154420473, 0.0, nan, nan, 0.0, nan, 0.21955555555555556, nan, nan, nan, 0.8705687324662076, nan, nan, 0.9738601277016697, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.32514177693761814, nan, 0.6802968960863698, 0.8890937128879594, nan, nan, 0.8295137504982064, 0.0, 0.8472330984077264, nan, nan, 0.785091640590076, 0.8361175115207373, nan, nan, 0.8693309619710206, nan, nan, nan, nan, nan, nan, 0.8304062444562711, 0.0, nan, 0.9687516257131462, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20954598370197905, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.887578947368421, 0.6885928393005828, nan, nan, nan, 0.9252912954078135, nan, nan, nan, 0.24597056762438682, 0.8168058455114823, 0.0, nan, 0.9034768048852556, 0.819891678975874, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9610116972654738, 0.9879956873204125, 0.9959961002049045, 0.9649504855479808, 0.265285350781534, 0.9934957275857671, 0.9921138698670543, nan, 0.7501908831527199, nan, 0.9994263962037858, 0.9780084192190448, 0.9395381108137576, 0.8322952247002879, 0.5123899242269097, 0.9659685863874345, 0.9985759450339126, 0.8526970954356846, nan, 0.855046503616948, 0.9110854503464203, nan, 0.9661816320561568, nan, nan, nan, nan, 0.9635488218131386, nan, nan, nan, nan, nan, 0.8948417591450883, nan, nan, 0.5931947703654039, 0.7059430458109781, nan, nan, nan, 0.0, nan, 0.21955555555555556, nan, nan, nan, 0.9697049632725945, nan, nan, 0.9922623471559036, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4267990074441687, nan, 0.713517338995046, 0.9749460630916053, nan, nan, 0.9178610804851157, nan, 0.8595167163190996, nan, nan, 0.8149179281944203, 0.8407182160440196, nan, nan, 0.87833624015644, nan, nan, nan, nan, nan, nan, 0.9766325891925725, 0.0, nan, 0.9722641202951687, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22570532915360503, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8885142255005268, 0.7325066430469442, nan, nan, nan, 0.9257671866963827, nan, nan, nan, 0.24707435107787065, 0.8213067436368408, nan, nan, 0.9228370221327968, 0.8392722141021118, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0194 | 76.0 | 1520 | 0.2028 | 0.6472 | 0.7569 | 0.9444 | [0.8749596168446218, 0.979070088370738, 0.9620105220300014, 0.927606817899647, 0.26824670078103957, 0.9662169142715458, 0.983947544794552, nan, 0.7143654609812106, nan, 0.8539117262682666, 0.8842222513432053, 0.9199806716598212, 0.7670846583068338, 0.5024431862250833, 0.9098827161916173, 0.975747123018823, 0.5968304278922345, nan, 0.8047348729196233, 0.46507666098807493, nan, 0.9272021269152733, nan, 0.0, nan, nan, 0.9198932242291189, nan, nan, nan, nan, nan, 0.6732250984543063, nan, nan, 0.5758579881656805, 0.705746061167748, 0.0, nan, nan, 0.0, nan, 0.25155555555555553, nan, nan, nan, 0.868555765320941, nan, nan, 0.9724127587241276, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3448275862068966, nan, 0.6902507215007215, 0.8844774906836687, nan, nan, 0.8167686777421549, 0.0, 0.8573012456792539, nan, nan, 0.7819477434679335, 0.8444700460829493, nan, nan, 0.8718549773402094, nan, nan, nan, nan, nan, nan, 0.8151757744517926, 0.0, nan, 0.9703596257534342, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.18556103218324152, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9036195286195287, 0.6995192307692307, nan, nan, nan, 0.9316283254080452, nan, nan, nan, 0.2512267788293025, 0.8248631743549648, 0.0, nan, 0.911778962675715, 0.8201948627103631, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9590850237375917, 0.9881917182107538, 0.9956242977063917, 0.9651793261014797, 0.2896401308615049, 0.9939633550142414, 0.9907829969182268, nan, 0.7542496617684487, nan, 0.9994437781370044, 0.979460008709537, 0.9435523837843196, 0.8562721853861094, 0.5102474834197136, 0.9660909135391692, 0.9984633594917219, 0.8681420009220839, nan, 0.8594557354460902, 0.9457274826789839, nan, 0.9690333430827728, nan, nan, nan, nan, 0.9648702751501355, nan, nan, nan, nan, nan, 0.8888203863542951, nan, nan, 0.6117163928930607, 0.7071295914156005, nan, nan, nan, 0.0, nan, 0.25155555555555553, nan, nan, nan, 0.9708818635607321, nan, nan, 0.9901243115016137, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4466501240694789, nan, 0.7221986317527719, 0.9733911251919376, nan, nan, 0.9269018743109151, nan, 0.8702416418404502, nan, nan, 0.811495852908764, 0.8491167101071532, nan, nan, 0.8845726970033296, nan, nan, nan, nan, nan, nan, 0.9772585019820572, 0.0, nan, 0.9741031234046503, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2006269592476489, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9049525816649104, 0.7732506643046945, nan, nan, nan, 0.9320675467169552, nan, nan, nan, 0.2522657281126265, 0.8304906848596169, nan, nan, 0.9461770623742455, 0.840078625069301, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0312 | 77.0 | 1540 | 0.2063 | 0.6462 | 0.7540 | 0.9441 | [0.8738931657549438, 0.9790482518912288, 0.9612390983967386, 0.927272083303818, 0.25351443123938877, 0.9660502117987395, 0.9841411837440108, nan, 0.7062034739454094, nan, 0.8489496449608055, 0.883582285115304, 0.9221062848823102, 0.7698504027617952, 0.4872380419754237, 0.8980746937012707, 0.9757133172373601, 0.5936273740386124, nan, 0.8039931509062126, 0.4671240708976558, nan, 0.9275768947497632, nan, 0.0, nan, nan, 0.922327055845832, nan, nan, nan, nan, nan, 0.679492826579411, nan, nan, 0.5699904122722914, 0.7042224510813594, 0.0, nan, nan, 0.0, nan, 0.23466666666666666, nan, nan, nan, 0.8712129497976594, nan, nan, 0.9738035415769928, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33849129593810445, nan, 0.683808921798808, 0.8925355743088168, nan, nan, 0.8262850351589581, 0.0, 0.8640302299824093, nan, nan, 0.7882228235097213, 0.8396660909614277, nan, nan, 0.8717573986116186, nan, nan, nan, nan, nan, nan, 0.8258565877781703, 0.0, nan, 0.9690005375629325, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22522522522522523, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8968421052631579, 0.6832504145936982, nan, nan, nan, 0.93152504606419, nan, nan, nan, 0.236201156474505, 0.81478578892372, 0.0, nan, 0.9092154177264723, 0.8228489860208702, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9601374456359351, 0.9881040201808643, 0.9958853856831251, 0.9663198378923047, 0.2713922210105416, 0.9937507970922076, 0.9916560873279185, nan, 0.7433994615085797, nan, 0.9995828336027534, 0.9788793729133402, 0.9416195856873822, 0.8513160538476994, 0.4940846579183667, 0.9666046875764545, 0.9985059594266049, 0.871830336560627, nan, 0.8572511195315191, 0.9434180138568129, nan, 0.9669494004094764, nan, nan, nan, nan, 0.9630908924389319, nan, nan, nan, nan, nan, 0.8876695437731196, nan, nan, 0.5978880321823667, 0.7055303342963268, nan, nan, nan, 0.0, nan, 0.23466666666666666, nan, nan, nan, 0.9697861288097074, nan, nan, 0.9904399262886755, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.43424317617866004, nan, 0.714555319650861, 0.9826235689712142, nan, nan, 0.919845644983462, nan, 0.8779874213836478, nan, nan, 0.819471028362624, 0.8447726614538082, nan, nan, 0.8827229004809471, nan, nan, nan, nan, nan, nan, 0.9755894012100981, 0.0, nan, 0.9725251775189121, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.24294670846394983, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8977871443624869, 0.7298494242692648, nan, nan, nan, 0.9317246699811418, nan, nan, nan, 0.2372195336559613, 0.8184203621096825, nan, nan, 0.935010060362173, 0.8425482586563178, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0356 | 78.0 | 1560 | 0.2083 | 0.6455 | 0.7534 | 0.9440 | [0.8742784106920011, 0.9803577368919368, 0.9608471817209255, 0.9274340976998285, 0.2558376323649199, 0.9664861065167053, 0.9838665111857294, nan, 0.7072894532591961, nan, 0.8534663163987711, 0.8881744663926819, 0.9201806501473216, 0.7741413414853444, 0.4887856132552031, 0.9062823491523084, 0.9756914816125205, 0.5955639452571968, nan, 0.8033857156917424, 0.44408427876823336, nan, 0.9268446499894891, nan, 0.0, nan, nan, 0.9190668022037323, nan, nan, nan, nan, nan, 0.6851288996677113, nan, nan, 0.5611544830849302, 0.7038391224862889, 0.0, nan, nan, 0.0, nan, 0.2648888888888889, nan, nan, nan, 0.8657429181288665, nan, nan, 0.9715533474898834, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3405474220241884, nan, 0.6811390800434625, 0.8933781070660768, nan, nan, 0.8227471218737594, 0.0, 0.8643019850309144, nan, nan, 0.7776379098131364, 0.8477259643062752, nan, nan, 0.8700585651537335, nan, nan, nan, nan, nan, nan, 0.8246110325318247, 0.0, nan, 0.9680600759620999, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1788736504231106, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8974736842105263, 0.6895695364238411, nan, nan, nan, 0.9350070691058652, nan, nan, nan, 0.22680683311432326, 0.8298538622129437, 0.0, nan, 0.9096215585173655, 0.806545418699788, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9613647178601862, 0.9881865595031133, 0.9958556414832441, 0.9680804337635782, 0.27400945110868774, 0.9936445181311907, 0.9914814692459802, nan, 0.7446184346241946, nan, 0.9995306878030975, 0.972492379155175, 0.9441223114282883, 0.8649789029535865, 0.4953921769404055, 0.9638645593775994, 0.998563773623946, 0.8727524204702628, nan, 0.8435411643127799, 0.9491916859122402, nan, 0.9671687627961392, nan, nan, nan, nan, 0.9581714225903102, nan, nan, nan, nan, nan, 0.8940608302507193, nan, nan, 0.5963794837412001, 0.7050918283120099, nan, nan, nan, 0.0, nan, 0.2648888888888889, nan, nan, nan, 0.9711456515563491, nan, nan, 0.987528125349976, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4425144747725393, nan, 0.7098372257607927, 0.9773173433885984, nan, nan, 0.9140022050716649, nan, 0.879179079774909, nan, nan, 0.8061887361521953, 0.8528815522733855, nan, nan, 0.8793932667406585, nan, nan, nan, nan, nan, nan, 0.9730857500521594, 0.0, nan, 0.9714577435373833, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1921630094043887, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8984193888303478, 0.7378210806023029, nan, nan, nan, 0.93536773529916, nan, nan, nan, 0.22780466344038716, 0.8344266596693781, nan, nan, 0.9406438631790744, 0.8247568166927071, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0293 | 79.0 | 1580 | 0.2077 | 0.6459 | 0.7533 | 0.9443 | [0.8740922048777834, 0.9800527564207426, 0.9619015681389926, 0.9276681251699527, 0.26417517064269785, 0.9665040112480358, 0.9840927486326781, nan, 0.7116194378058496, nan, 0.8490476893547911, 0.8873566266657826, 0.9212733973426438, 0.7701532311792139, 0.4902207125463372, 0.9138718275582338, 0.9755905441596543, 0.5964576352321685, nan, 0.7983676134508652, 0.4556179775280899, nan, 0.9279288739542861, nan, 0.0, nan, nan, 0.9178415839098065, nan, nan, nan, nan, nan, 0.6861409608091024, nan, nan, 0.5566617406860076, 0.7046057204644337, 0.0, nan, nan, 0.0, nan, 0.24977777777777777, nan, nan, nan, 0.8692548050721215, nan, nan, 0.9727293667564672, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3371501272264631, nan, 0.692432872253865, 0.8904000706526539, nan, nan, 0.8266547049441786, 0.0, 0.8603410126252766, nan, nan, 0.7831136484482132, 0.8457109959700633, nan, nan, 0.8695538606206248, nan, nan, nan, nan, nan, nan, 0.8224841660802252, 0.0, nan, 0.9695461691404264, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.17667638483965015, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8909244051379238, 0.6895127993393889, nan, nan, nan, 0.9342358939205689, nan, nan, nan, 0.2187061711079944, 0.8299426186750131, 0.0, nan, 0.9105160662122688, 0.8275573496111056, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9599382490621161, 0.9889964766026815, 0.9953764293740498, 0.9695642063846515, 0.28418756815703383, 0.9935807507545806, 0.992675479373829, nan, 0.7538478024995646, nan, 0.9995654516695347, 0.9714036870373058, 0.9415452472990385, 0.8516509276002947, 0.4978969422958774, 0.9646474531487009, 0.9987098305435448, 0.8616874135546335, nan, 0.842369962108164, 0.9364896073903002, nan, 0.9692161450716584, nan, nan, nan, nan, 0.9565097931467598, nan, nan, nan, nan, nan, 0.8922934648581997, nan, nan, 0.5998156218571907, 0.7059688402806439, nan, nan, nan, 0.0, nan, 0.24977777777777777, nan, nan, nan, 0.9709224463292886, nan, nan, 0.9888720334755297, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.43837882547559964, nan, 0.7227176220806794, 0.9798052440280667, nan, nan, 0.9143329658213892, nan, 0.8752068851373718, nan, nan, 0.8115248535467781, 0.8508543295684912, nan, nan, 0.8797103747159241, nan, nan, nan, nan, nan, nan, 0.9753807636136032, 0.0, nan, 0.9731459135842576, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.18996865203761756, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8916754478398314, 0.7395925597874224, nan, nan, nan, 0.9345962626435796, nan, nan, nan, 0.21953365596128466, 0.8349514563106796, nan, nan, 0.9407444668008048, 0.847285923088554, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0312 | 80.0 | 1600 | 0.2061 | 0.6448 | 0.7520 | 0.9441 | [0.8735628245234947, 0.9801552577309036, 0.9610476176807808, 0.9290197784166658, 0.25647808981142317, 0.9663525132275133, 0.9839056496171573, nan, 0.7060398766387878, nan, 0.846425575131364, 0.8871696108380928, 0.9254773197130001, 0.7693568956094006, 0.49119052050375334, 0.9069505178365938, 0.9757003398741009, 0.5949286846275753, nan, 0.8015652286398034, 0.47380675203725264, nan, 0.9276539148087222, nan, 0.0, nan, nan, 0.9207941849845491, nan, nan, nan, nan, nan, 0.6826833467595445, nan, nan, 0.5672629836866577, 0.7030686849963959, 0.0, nan, nan, 0.0, nan, 0.24533333333333332, nan, nan, nan, 0.8710529672211796, nan, nan, 0.9737491877842755, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3416030534351145, nan, 0.6775436211194199, 0.8935482170161163, nan, nan, 0.8274208054361947, 0.0, 0.8470013037809648, nan, nan, 0.7844473618055362, 0.8440287769784173, nan, nan, 0.8698515575998328, nan, nan, nan, nan, nan, nan, 0.8255731922398589, 0.0, nan, 0.9689192548019954, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1687170474516696, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8905263157894737, 0.6900489396411092, nan, nan, nan, 0.9255355612682091, nan, nan, nan, 0.21337775050407645, 0.8099346405228758, 0.0, nan, 0.9089846033911518, 0.8221828385762812, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9592089682724124, 0.9880937027655832, 0.995888690594223, 0.9696601717780542, 0.2748818611414031, 0.9938358202610211, 0.9933550740711026, nan, 0.7451944329095951, nan, 0.9995828336027534, 0.9695892001741907, 0.9428833382892259, 0.8485031143258991, 0.4989051497345579, 0.9641092136810686, 0.9984481452292637, 0.8653757491931766, nan, 0.8538064071650017, 0.9399538106235565, nan, 0.9689967826849956, nan, nan, nan, nan, 0.9629600554748727, nan, nan, nan, nan, nan, 0.8882038635429511, nan, nan, 0.6032517599731814, 0.7044469665703673, nan, nan, nan, 0.0, nan, 0.24533333333333332, nan, nan, nan, 0.9700499168053245, nan, nan, 0.9917329288034127, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4441687344913151, nan, 0.7053550365652277, 0.9882019086862719, nan, nan, 0.9128996692392503, nan, 0.8601787487586892, nan, nan, 0.8144539179861957, 0.8494063133507095, nan, nan, 0.8795518207282913, nan, nan, nan, nan, nan, nan, 0.9766325891925725, 0.0, nan, 0.9724323571726923, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.180564263322884, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8914646996838778, 0.7493356953055802, nan, nan, nan, 0.925852905880336, nan, nan, nan, 0.21416630004399473, 0.8129099973760168, nan, nan, 0.938430583501006, 0.8417418476891285, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0316 | 81.0 | 1620 | 0.2077 | 0.6442 | 0.7502 | 0.9442 | [0.8742880337291975, 0.978936877076412, 0.96071924925729, 0.926770007745276, 0.24991485593624413, 0.9678013831945997, 0.9843259160771719, nan, 0.7014691653511229, nan, 0.8431237447407386, 0.8854509545753786, 0.9243969525107709, 0.7728866852098198, 0.5198850842456713, 0.9143984220907297, 0.9757377692664534, 0.5923076923076923, nan, 0.8016523591299296, 0.47157772621809746, nan, 0.9271209623680426, nan, 0.0, nan, nan, 0.9211245020169878, nan, nan, nan, nan, nan, 0.6828860130052276, nan, nan, 0.5625842251288149, 0.7023401899956234, 0.0, nan, nan, 0.0, nan, 0.21688888888888888, nan, nan, nan, 0.869980879541109, nan, nan, 0.9717779021873876, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.32887189292543023, nan, 0.6766947998910972, 0.8952558401153082, nan, nan, 0.8277511961722488, 0.0, 0.8568266163161664, nan, nan, 0.7838463899974882, 0.8303571428571429, nan, nan, 0.8702649041224725, nan, nan, nan, nan, nan, nan, 0.8296586059743954, 0.0, nan, 0.9679142530091227, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.18269230769230768, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8800842992623814, 0.6854026845637584, nan, nan, nan, 0.9239242242413852, nan, nan, nan, 0.22166710491717065, 0.8246804069919124, 0.0, nan, 0.9086543173576399, 0.8033998521803399, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9606099841749388, 0.9880421156891775, 0.9960770705268028, 0.9672093632696148, 0.26673936750272625, 0.9934744717935637, 0.9922696103185128, nan, 0.7380681285413848, nan, 0.9996697432688464, 0.9761939323559298, 0.9410496580434137, 0.8560712611345522, 0.5273870098772823, 0.9640847482507218, 0.9984359738192972, 0.8520055325034578, nan, 0.8556665518429211, 0.9387990762124712, nan, 0.9664741152383738, nan, nan, nan, nan, 0.9619918619408355, nan, nan, nan, nan, nan, 0.8805384299219071, nan, nan, 0.5947871270533021, 0.7036989269500619, nan, nan, nan, 0.0, nan, 0.21688888888888888, nan, nan, nan, 0.9694208838926992, nan, nan, 0.9896559798821026, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4267990074441687, nan, 0.7036093418259023, 0.9899512138233979, nan, nan, 0.9155457552370452, nan, 0.8712346904998345, nan, nan, 0.8145119192622238, 0.8349261511728931, nan, nan, 0.8802917393372444, nan, nan, nan, nan, nan, nan, 0.9735030252451492, 0.0, nan, 0.9712721028449436, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.19655172413793104, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8800842992623814, 0.7236492471213464, nan, nan, nan, 0.9239242242413852, nan, nan, nan, 0.22252529696436427, 0.829441091577014, nan, nan, 0.9326961770623743, 0.821783176251197, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0277 | 82.0 | 1640 | 0.2077 | 0.6442 | 0.7520 | 0.9438 | [0.8734399240337131, 0.9791603822300845, 0.9612318418782224, 0.9262284060268505, 0.2540297898388084, 0.9670038685119675, 0.9841094453756333, nan, 0.7082522357490872, nan, 0.8401262161451486, 0.8860492688710315, 0.9268547290736066, 0.7746342055734321, 0.4963946221719106, 0.9186450154928593, 0.9757157981134096, 0.5950294726780309, nan, 0.8052470939671407, 0.44906900328587074, nan, 0.9281864632183103, nan, 0.0, nan, nan, 0.9200399550505681, nan, nan, nan, nan, nan, 0.6814849771904986, nan, nan, 0.5719124133585382, 0.702409762628083, 0.0, nan, nan, 0.0, nan, 0.23377777777777778, nan, nan, nan, 0.8686116809789219, nan, nan, 0.9714137421647303, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34149484536082475, nan, 0.6803386300873738, 0.8928766303486878, nan, nan, 0.8222727722282662, 0.0, 0.8544448062520351, nan, nan, 0.7818176752479661, 0.8349654377880185, nan, nan, 0.8708143969074857, nan, nan, nan, nan, nan, nan, 0.824580017683466, 0.0, nan, 0.9693569644839067, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1609529343404997, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8891464699683878, 0.6965012205044752, nan, nan, nan, 0.9194308734036171, nan, nan, nan, 0.24425942156003505, 0.8142633228840125, 0.0, nan, 0.9113516676418958, 0.7880954729263241, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9588758673350818, 0.9879595763669285, 0.9960192345825897, 0.9661279071054992, 0.27153762268266085, 0.9935807507545806, 0.9919817264536955, nan, 0.7457838265039584, nan, 0.9996349794024091, 0.976339091304979, 0.9460551095252255, 0.8545308418726141, 0.5042770049937775, 0.9646963840093947, 0.9983659882119894, 0.8609958506224067, nan, 0.8542886668963141, 0.9468822170900693, nan, 0.969654869844984, nan, nan, nan, nan, 0.9640852533657809, nan, nan, nan, nan, nan, 0.8903000411015207, nan, nan, 0.6085316795172645, 0.7037505158893933, nan, nan, nan, 0.0, nan, 0.23377777777777778, nan, nan, nan, 0.9708209894078974, nan, nan, 0.9908675334195335, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.43837882547559964, nan, 0.709035149799481, 0.9859666854554996, nan, nan, 0.9166482910694598, nan, 0.8685865607414763, nan, nan, 0.8137869033118729, 0.8395598030697944, nan, nan, 0.8810316579461973, nan, nan, nan, nan, nan, nan, 0.9728771124556645, 0.0, nan, 0.972821042372488, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.17366771159874608, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8891464699683878, 0.7581930912311781, nan, nan, nan, 0.9195096862677867, nan, nan, nan, 0.24522657281126264, 0.817895565468381, nan, nan, 0.9401408450704225, 0.8054533541656167, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0049 | 83.0 | 1660 | 0.2097 | 0.6428 | 0.7506 | 0.9436 | [0.8737221924498044, 0.979684365660949, 0.9610890784389705, 0.9271143529911655, 0.2652291105121294, 0.9679242938789033, 0.9841049057487413, nan, 0.7084819154355578, nan, 0.8444865412573241, 0.8866891044677622, 0.9262614762239287, 0.7731204258150366, 0.4962758363850785, 0.9193371710909641, 0.975705503956889, 0.5984302418708954, nan, 0.8054789184978416, 0.4464964693101575, nan, 0.928416257883672, nan, 0.0, nan, nan, 0.9189118067216361, nan, nan, nan, nan, nan, 0.6810891634532148, nan, nan, 0.5631774543202229, 0.7007466529351184, 0.0, nan, nan, 0.0, nan, 0.2568888888888889, nan, nan, nan, 0.8654792043399638, nan, nan, 0.9699815562534271, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3341804320203304, nan, 0.6773289961145749, 0.8916280376457367, nan, nan, 0.8282949216801357, 0.0, 0.8487515483408306, nan, nan, 0.7752485755781477, 0.8289170506912442, nan, nan, 0.8707806458355105, nan, nan, nan, nan, nan, nan, 0.8245861218738992, 0.0, nan, 0.9691606018740643, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12866354044548653, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8911349757843756, 0.6841243862520459, nan, nan, nan, 0.9162382176520995, nan, nan, nan, 0.23866725120561158, 0.8215031315240083, 0.0, nan, 0.910049809551714, 0.7624358468219502, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9598596770802209, 0.9876139429550109, 0.9959316544384956, 0.9664121123090381, 0.2861504907306434, 0.9935594949623773, 0.9916938425888783, nan, 0.7451810376006323, nan, 0.9995828336027534, 0.9751778197125853, 0.9474923183665378, 0.8560712611345522, 0.5038201609981253, 0.9650633654645985, 0.9982990454571734, 0.8612263715998156, nan, 0.8548398208749569, 0.9491916859122402, nan, 0.9687408599005557, nan, nan, nan, nan, 0.9594405411416833, nan, nan, nan, nan, nan, 0.8892930538429922, nan, nan, 0.6096211867247737, 0.7020480808914569, nan, nan, nan, 0.0, nan, 0.2568888888888889, nan, nan, nan, 0.9711659429406274, nan, nan, 0.9905620997546349, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.43507030603804797, nan, 0.7073366359990564, 0.9869968318140294, nan, nan, 0.9153252480705623, nan, 0.861900033101622, nan, nan, 0.8049707093556058, 0.8334781349551115, nan, nan, 0.8807674013001427, nan, nan, nan, nan, nan, nan, 0.9768412267890674, 0.0, nan, 0.9726296004084095, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.13761755485893418, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.891886195995785, 0.7404782993799823, nan, nan, nan, 0.9165523744213955, nan, nan, nan, 0.23950725912890453, 0.8260299134085541, nan, nan, 0.9374245472837022, 0.7786905901920266, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0281 | 84.0 | 1680 | 0.2072 | 0.6445 | 0.7526 | 0.9439 | [0.8739329832981687, 0.979693094629156, 0.9608562661869585, 0.9272954399077082, 0.26191762322754897, 0.9674302149936889, 0.9840667500573896, nan, 0.7107135596665989, nan, 0.8542607558830521, 0.8859556494192186, 0.9238000870027551, 0.7738972809667674, 0.4875668967656868, 0.921967213114754, 0.9755701003471727, 0.5937749401436552, nan, 0.8058524504692388, 0.43950749464668093, nan, 0.9279711464388263, nan, 0.0, nan, nan, 0.9198277809856149, nan, nan, nan, nan, nan, 0.6820896937517777, nan, nan, 0.5654919694370809, 0.7042090359119578, 0.0, nan, nan, 0.0, nan, 0.2551111111111111, nan, nan, nan, 0.8647403231052803, nan, nan, 0.9699279628040628, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3465527760829774, nan, 0.6794513130583882, 0.8901162790697674, nan, nan, 0.8262432594367884, 0.0, 0.8603122966818477, nan, nan, 0.7721196154816289, 0.8450460829493087, nan, nan, 0.8698376233488226, nan, nan, nan, nan, nan, nan, 0.8226288931902165, 0.0, nan, 0.9684136288485773, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12463083284111046, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8947368421052632, 0.6916802610114192, nan, nan, nan, 0.9213964446348254, nan, nan, nan, 0.24859796705222573, 0.8250325945241199, 0.0, nan, 0.9071882640586797, 0.8001084759134165, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9607759813197879, 0.988047274396818, 0.9956094256064512, 0.9671724535029214, 0.28200654307524536, 0.993772052884411, 0.9913210093869017, nan, 0.7492800021432494, nan, 0.9994959239366602, 0.97430686601829, 0.9471949648131629, 0.8578126046480476, 0.49514012508073535, 0.9631550618975387, 0.9987037448385615, 0.8575380359612724, nan, 0.851877368239752, 0.9480369515011547, nan, 0.9688871014916642, nan, nan, nan, nan, 0.9587601889285761, nan, nan, nan, nan, nan, 0.8870530209617756, nan, nan, 0.6078612135434127, 0.7056077177053239, nan, nan, nan, 0.0, nan, 0.2551111111111111, nan, nan, nan, 0.9710036118664015, nan, nan, 0.9897272477372456, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46980976013234077, nan, 0.7104505779665016, 0.9820015938113471, nan, nan, 0.9122381477398015, nan, 0.8754054948692486, nan, nan, 0.7989675772867003, 0.8496959165942659, nan, nan, 0.8805031446540881, nan, nan, nan, nan, nan, nan, 0.9753807636136032, 0.0, nan, 0.9718406274655405, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1322884012539185, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8956796628029505, 0.7511071744906997, nan, nan, nan, 0.921909823418481, nan, nan, nan, 0.24962604487461504, 0.8302282865389662, nan, nan, 0.9331991951710261, 0.8178519227861499, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0377 | 85.0 | 1700 | 0.2057 | 0.6449 | 0.7528 | 0.9445 | [0.8758040703885995, 0.9792689161554192, 0.9614356460135751, 0.9266324880480761, 0.26420818445358324, 0.9675683506840242, 0.9838359446659554, nan, 0.7158278406421051, nan, 0.846762081455959, 0.8855465400092354, 0.9199652652145597, 0.7692120326391426, 0.5252808553498102, 0.918241789089548, 0.9756017085937988, 0.5941269841269842, nan, 0.8004042116243439, 0.4381720430107527, nan, 0.9269666853303471, nan, 0.0, nan, nan, 0.9214343928280359, nan, nan, nan, nan, nan, 0.6817399813753808, nan, nan, 0.560374685138539, 0.7040002059308073, 0.0, nan, nan, 0.0, nan, 0.24888888888888888, nan, nan, nan, 0.8609747766211819, nan, nan, 0.9703964124127323, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33800127307447486, nan, 0.6833792138634415, 0.8897765550323616, nan, nan, 0.8248204309656824, 0.0, 0.8539889286877239, nan, nan, 0.7683661742859546, 0.8378456221198156, nan, nan, 0.869235595524417, nan, nan, nan, nan, nan, nan, 0.8202108963093145, 0.0, nan, 0.9694885473681777, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.16661803326524657, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8911349757843756, 0.6938775510204082, nan, nan, nan, 0.9252784918594688, nan, nan, nan, 0.23413394109396915, 0.8286384976525821, 0.0, nan, 0.910371819960861, 0.8078439101300748, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9597799984506933, 0.9881246550114265, 0.9954805340736335, 0.9686562261239947, 0.28491457651762997, 0.9937082855078009, 0.9921610639432539, nan, 0.7550265896882912, nan, 0.9995828336027534, 0.97430686601829, 0.9450639310139756, 0.8460250485566941, 0.5340191244348524, 0.960341537407643, 0.9986915734285949, 0.8628400184416782, nan, 0.8458146744746814, 0.941108545034642, nan, 0.9684483767183387, nan, nan, nan, nan, 0.9615077651738169, nan, nan, nan, nan, nan, 0.8876489930127415, nan, nan, 0.5966309084813946, 0.7054529508873297, nan, nan, nan, 0.0, nan, 0.24888888888888888, nan, nan, nan, 0.9717543930846962, nan, nan, 0.9891876482625915, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4392059553349876, nan, 0.7144609577730597, 0.980641023149138, nan, nan, 0.9115766262403528, nan, 0.8681231380337636, nan, nan, 0.7949944898787773, 0.8424558355053576, nan, nan, 0.8786533481317055, nan, nan, nan, nan, nan, nan, 0.973711662841644, 0.0, nan, 0.973093702139509, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.17899686520376176, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.891886195995785, 0.7528786536758193, nan, nan, nan, 0.9255957483284759, nan, nan, nan, 0.2350197976242851, 0.8336394647074259, nan, nan, 0.9360160965794768, 0.8263696386270853, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0209 | 86.0 | 1720 | 0.2076 | 0.6334 | 0.7532 | 0.9445 | [0.8759978395077308, 0.9790521293080964, 0.9606095667944832, 0.9269128242839062, 0.2520574032510372, 0.9669768403639372, 0.9836339470408192, nan, 0.7129774289016754, nan, 0.8523278047045222, 0.8846939447848718, 0.9153408681880354, 0.7696837248831421, 0.5230132655591371, 0.9174642259803011, 0.9756993173330796, 0.5944188996353258, nan, 0.8018623561428243, 0.4379679144385027, nan, 0.9269973415419057, nan, 0.0, nan, nan, 0.9228144989339019, nan, nan, nan, nan, nan, 0.6845612091477408, nan, nan, 0.5644699140401146, 0.7048842658152887, 0.0, nan, nan, 0.0, nan, 0.2577777777777778, nan, nan, nan, 0.8659474722609373, nan, nan, 0.9705861796978743, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3424744897959184, nan, 0.6749615419419057, 0.8917141604559404, nan, nan, 0.8258551307847083, 0.0, 0.8555331205627564, nan, nan, 0.7763850027979855, 0.8448474381116868, nan, nan, 0.8690008897262783, nan, nan, nan, nan, nan, nan, 0.821917808219178, 0.0, nan, 0.9680190544462302, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1864801864801865, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8924210526315789, 0.6874487284659557, nan, nan, nan, 0.9247707994173593, nan, nan, nan, 0.23966000701016474, 0.8310740354535975, 0.0, nan, 0.9053324167730531, 0.8047975568909467, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9602259774465213, 0.9875675145862458, 0.996010972304845, 0.9677039541433059, 0.26942929843693203, 0.9939846108064447, 0.9919109353393962, nan, 0.7519054826999585, nan, 0.9995306878030975, 0.9745246044418638, 0.9452126077906631, 0.8491728618310896, 0.531656138250445, 0.9662621715515975, 0.9985272593940464, 0.8642231443061319, nan, 0.8424388563554943, 0.9457274826789839, nan, 0.9688871014916642, nan, nan, nan, nan, 0.962646046761131, nan, nan, nan, nan, nan, 0.8889025893958077, nan, nan, 0.5943680858196446, 0.7061751960379694, nan, nan, nan, 0.0, nan, 0.2577777777777778, nan, nan, nan, 0.9707601152550627, nan, nan, 0.9897170666150823, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4441687344913151, nan, 0.7038452465204058, 0.9883574024762386, nan, nan, 0.905071664829107, nan, 0.8695796094008607, nan, nan, 0.8047097036134795, 0.8499855198378222, nan, nan, 0.877543470218276, nan, nan, nan, nan, nan, nan, 0.9764239515960776, 0.0, nan, 0.971399730820996, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2006269592476489, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8933614330874605, 0.7422497785651019, nan, nan, nan, 0.9251671524087091, nan, nan, nan, 0.24065112186537616, 0.8365258462345841, nan, nan, 0.9274647887323944, 0.8234967995564739, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0119 | 87.0 | 1740 | 0.2099 | 0.6436 | 0.7510 | 0.9442 | [0.8747157030744916, 0.9790033093444225, 0.9606612291991369, 0.9272225270349197, 0.25524974515800203, 0.9676858426315027, 0.9834099269007216, nan, 0.7102073486812014, nan, 0.8452776389664599, 0.8844584813791289, 0.9163197034969074, 0.7710821461048802, 0.5111752206349945, 0.9165682802046439, 0.975737336370503, 0.5942793224672419, nan, 0.8022400208374031, 0.4364699627857523, nan, 0.9267140307283099, nan, 0.0, nan, nan, 0.9222334872892494, nan, nan, nan, nan, nan, 0.6828270382130389, nan, nan, 0.5655529037390613, 0.7035168116986766, 0.0, nan, nan, 0.0, nan, 0.24888888888888888, nan, nan, nan, 0.8677861486793138, nan, nan, 0.9728953922166769, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3372018052869117, nan, 0.670173211208851, 0.8919023091208522, nan, nan, 0.8265716007230367, 0.0, 0.8508600469116497, nan, nan, 0.7819563577435669, 0.8341968911917098, nan, nan, 0.868507767142633, nan, nan, nan, nan, nan, nan, 0.8298740464786234, 0.0, nan, 0.9685856004809861, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1162043795620438, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8860092709650231, 0.6957585644371941, nan, nan, nan, 0.9296366129585191, nan, nan, nan, 0.24351559761654398, 0.8183716075156576, 0.0, nan, 0.9067454455933038, 0.8111209613869188, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9597556522027821, 0.9873869598188262, 0.9961216868266244, 0.9668328836493425, 0.2730643402399128, 0.9936232623389873, 0.9917221590345979, nan, 0.7487843757116258, nan, 0.9996523613356277, 0.975613296559733, 0.9434532659331946, 0.8518518518518519, 0.5191795711967737, 0.9686597837255957, 0.9984177167043473, 0.8573075149838635, nan, 0.8487771271098863, 0.9480369515011547, nan, 0.9680827727405674, nan, nan, nan, nan, 0.9640067511873455, nan, nan, nan, nan, nan, 0.8883066173448417, nan, nan, 0.5957928260140798, 0.7048596780850186, nan, nan, nan, 0.0, nan, 0.24888888888888888, nan, nan, nan, 0.9699687512682115, nan, nan, 0.990348296189206, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4325889164598842, nan, 0.6973342769521114, 0.9894847324534977, nan, nan, 0.907497243660419, nan, 0.8645481628599802, nan, nan, 0.8116408560988342, 0.8392701998262381, nan, nan, 0.8775963215474869, nan, nan, nan, nan, nan, nan, 0.9760066764030878, 0.0, nan, 0.9719682554415928, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12476489028213165, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8861959957850368, 0.7555358724534986, nan, nan, nan, 0.929795988342191, nan, nan, nan, 0.24452265728112627, 0.8228811335607452, nan, nan, 0.9263581488933602, 0.8300488886648858, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.046 | 88.0 | 1760 | 0.2083 | 0.6323 | 0.7524 | 0.9440 | [0.8738993507932734, 0.9786506516739075, 0.9606989404969847, 0.9265798598825278, 0.2574679943100996, 0.9675783110080537, 0.9836132329813525, nan, 0.7007443472940936, nan, 0.8521592222650345, 0.88346136169654, 0.9184161573212862, 0.7649786455155583, 0.5082707466948048, 0.9175480100194823, 0.9757004231831821, 0.5942965180400189, nan, 0.802192456134661, 0.44275637547476937, nan, 0.9274910720537778, nan, 0.0, nan, nan, 0.9225369735492105, nan, nan, nan, nan, nan, 0.6805533596837945, nan, nan, 0.5691608168474792, 0.70561577876767, 0.0, 0.0, nan, 0.0, nan, 0.2248888888888889, nan, nan, nan, 0.8667246282309043, nan, nan, 0.9729097056881615, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3452685421994885, nan, 0.6746189230630468, 0.891433983443244, nan, nan, 0.8211085950575957, 0.0, 0.8580267014001953, nan, nan, 0.7815558413958951, 0.8406672418751797, nan, nan, 0.8706797535762765, nan, nan, nan, nan, nan, nan, 0.8206254392129304, 0.0, nan, 0.9679329285920787, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.18944914019236375, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8852389976837229, 0.682526661197703, nan, nan, nan, 0.9306489612336689, nan, nan, nan, 0.2295958621898834, 0.8244195147404122, 0.0, nan, 0.9074509803921569, 0.8116413059536121, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9599382490621161, 0.987753228061306, 0.9961068147266838, 0.9665560603991422, 0.27633587786259545, 0.9933894486247502, 0.9917693531107975, nan, 0.7364606914658487, nan, 0.9994959239366602, 0.9766294092030774, 0.9420160570918823, 0.839729422007903, 0.5159974164684383, 0.9678768899544943, 0.9983294739820898, 0.8695251267865376, nan, 0.8520151567344126, 0.9422632794457275, nan, 0.9685214975138929, nan, nan, nan, nan, 0.9646740197040468, nan, nan, nan, nan, nan, 0.8846074804767776, nan, nan, 0.5979718404290982, 0.7068716467189434, nan, nan, nan, 0.0, nan, 0.2248888888888889, nan, nan, nan, 0.970963029097845, nan, nan, 0.9901548548681036, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4466501240694789, nan, 0.7057796650153338, 0.9878909211063384, nan, nan, 0.9195148842337376, nan, 0.8722277391592188, nan, nan, 0.8105678324923148, 0.8465102809151462, nan, nan, 0.8814016172506739, nan, nan, nan, nan, nan, nan, 0.9745462132276236, 0.0, nan, 0.9711560774121687, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20376175548589343, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8859852476290833, 0.7369353410097431, nan, nan, nan, 0.9311674952854448, nan, nan, nan, 0.23044434667839858, 0.8291786932563632, nan, nan, 0.9311871227364185, 0.830704097575727, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0159 | 89.0 | 1780 | 0.2122 | 0.6190 | 0.7509 | 0.9438 | [0.8740002418769652, 0.9787595060920762, 0.9607525313621142, 0.9264992858859963, 0.2598957980918871, 0.967005233652593, 0.9835537406810775, nan, 0.7020375895748859, nan, 0.8465422321665146, 0.8842285188350536, 0.9150692161896308, 0.7671473582161901, 0.520612722677871, 0.9169665869459215, 0.9757007233273056, 0.5937254901960785, nan, 0.8001963993453355, 0.4401939655172414, nan, 0.9272905575791538, nan, 0.0, nan, nan, 0.9214908107566728, nan, nan, nan, nan, nan, 0.6808520740025639, nan, nan, 0.5616654815517105, 0.7017638727951591, 0.0, nan, nan, 0.0, 0.0, 0.2328888888888889, nan, nan, nan, 0.8660856645890709, nan, nan, 0.9718230882456017, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3389830508474576, nan, 0.6709907792442596, 0.8916733892181965, nan, nan, 0.8271678599840891, 0.0, 0.8539684608367001, nan, nan, 0.7767709819056409, 0.8377911993097498, nan, nan, 0.868735956524011, nan, nan, nan, nan, nan, nan, 0.8207596709259584, 0.0, nan, 0.9684612805124825, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1465340742907283, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.886923562855338, 0.6857610474631751, nan, nan, nan, 0.9291284600222812, nan, nan, nan, 0.22636052931382, 0.8241901776384535, 0.0, nan, 0.9064585787200629, 0.787926612744131, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.959704746411695, 0.9879337828287258, 0.9961068147266838, 0.9673200925696949, 0.27924391130498, 0.9936232623389873, 0.9912266212345027, nan, 0.7375189208739099, nan, 0.9996175974691905, 0.9761939323559298, 0.9451134899395381, 0.8479673163217467, 0.5289780872414499, 0.96616430983021, 0.9982199312923907, 0.8725218994928539, nan, 0.8420943851188426, 0.9434180138568129, nan, 0.967973091547236, nan, nan, nan, nan, 0.9630254739569023, nan, nan, nan, nan, nan, 0.8841142622277024, nan, nan, 0.5957928260140798, 0.7029766817994222, nan, nan, nan, 0.0, nan, 0.2328888888888889, nan, nan, nan, 0.9699484598839333, nan, nan, 0.9881288115576099, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4466501240694789, nan, 0.7004010379806558, 0.9882602188575094, nan, nan, 0.9170893054024256, nan, 0.867593512082092, nan, nan, 0.8054927208398585, 0.843614248479583, nan, nan, 0.8786533481317055, nan, nan, nan, nan, nan, nan, 0.9783016899645316, 0.0, nan, 0.9717536083909593, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.15705329153605016, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8876712328767123, 0.7422497785651019, nan, nan, nan, 0.9293673924224242, nan, nan, nan, 0.22727672679278488, 0.8278667016531094, nan, nan, 0.9290744466800804, 0.8052013507383701, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0119 | 90.0 | 1800 | 0.2117 | 0.6299 | 0.7498 | 0.9438 | [0.8739975163887124, 0.978454794408567, 0.9607267221801665, 0.9261621059556272, 0.2556467476090348, 0.9685964803183882, 0.9834285446712867, nan, 0.7028932838580518, nan, 0.8438728798638417, 0.8834001182421336, 0.9193801443433344, 0.7664238008500304, 0.5173284974735732, 0.9149335309648432, 0.9757181352574817, 0.5916982323232324, nan, 0.8026242959007587, 0.42700156985871274, nan, 0.9268446499894891, nan, 0.0, nan, nan, 0.9231164662662839, nan, nan, nan, nan, nan, 0.6797586898501287, nan, nan, 0.5656173182142574, 0.7011020702441034, 0.0, nan, nan, 0.0, nan, 0.23022222222222222, nan, nan, nan, 0.8647802724657608, nan, nan, 0.9705276216010724, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3410224438902743, nan, 0.6720473865075058, 0.8920666807640798, nan, nan, 0.828386454183267, 0.0, 0.8553188717347404, nan, nan, 0.7737885524145066, 0.8317031070195627, nan, nan, 0.8657247514390372, nan, nan, nan, nan, nan, nan, 0.8246387028551286, 0.0, nan, 0.9683639075661832, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.13259020240539748, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8828240252897788, 0.6833060556464812, nan, nan, nan, 0.923538487913595, nan, nan, nan, 0.22753966167061093, 0.8150470219435737, 0.0, nan, 0.9045369550241118, 0.8066597704546574, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9603488153337096, 0.9879492589516474, 0.9961530834820543, 0.9665339145391262, 0.27400945110868774, 0.9932406580793266, 0.9911747077506832, nan, 0.738389615956492, nan, 0.9997218890685022, 0.9760487734068806, 0.9438249578749133, 0.8454222758020227, 0.5257959325131146, 0.9665068258550668, 0.9982199312923907, 0.8642231443061319, nan, 0.8491560454702033, 0.9422632794457275, nan, 0.9671687627961392, nan, nan, nan, nan, 0.9632871478850205, nan, nan, nan, nan, nan, 0.8845663789560214, nan, nan, 0.5978042239356353, 0.7023318200577796, nan, nan, nan, 0.0, nan, 0.23022222222222222, nan, nan, nan, 0.9699078771153768, nan, nan, 0.9876910233045886, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.45244003308519437, nan, 0.7012502948808681, 0.9839452661859317, nan, nan, 0.9169790518191842, nan, 0.8692485931810658, nan, nan, 0.8025346557624268, 0.8372429771213438, nan, nan, 0.8743723904656202, nan, nan, nan, nan, nan, nan, 0.9762153139995827, 0.0, nan, 0.9716839931312944, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.14169278996865203, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8828240252897788, 0.7395925597874224, nan, nan, nan, 0.923538487913595, nan, nan, nan, 0.22842058952925648, 0.8186827604303333, nan, nan, 0.9246478873239437, 0.8253616249180988, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0133 | 91.0 | 1820 | 0.2147 | 0.6287 | 0.7481 | 0.9432 | [0.872985949808818, 0.9780722140781187, 0.9606901727822192, 0.9266017111777864, 0.24834640300034094, 0.9675803246107982, 0.9834366882736995, nan, 0.7047843973792473, nan, 0.8474836047454131, 0.8858010429731336, 0.9202038992100113, 0.7660651140104865, 0.489973479737589, 0.9140925299865698, 0.9757260331008639, 0.5942512307447991, nan, 0.8012977273468324, 0.4252214695153726, nan, 0.9272160761798067, nan, 0.0, nan, nan, 0.9213968860349095, nan, nan, nan, nan, nan, 0.6812331211800565, nan, nan, 0.5576892556737025, 0.7019594716378711, 0.0, nan, nan, 0.0, nan, 0.2311111111111111, nan, nan, nan, 0.8644101525046584, nan, nan, 0.9700870920059127, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3499056010069226, nan, 0.6761736610982843, 0.8889240117310293, nan, nan, 0.8292609653514921, 0.0, 0.851067986454806, nan, nan, 0.7737342657342657, 0.8435432844406098, nan, nan, 0.8678909642651599, nan, nan, nan, nan, nan, nan, 0.8223510806536637, 0.0, nan, 0.9682011077577733, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1164906103286385, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8885611965451864, 0.6820428336079077, nan, nan, nan, 0.920397583651086, nan, nan, nan, 0.21928978518193776, 0.8183481442760062, 0.0, nan, 0.9062653683485787, 0.7800148038490008, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9603676283434591, 0.9878202912606334, 0.9962390111705995, 0.9677519368400073, 0.26477644492911667, 0.9934532160013604, 0.9916702455507784, nan, 0.7435200192892449, nan, 0.9995654516695347, 0.973943968645667, 0.9438497373376945, 0.8415377402719175, 0.4976921501598954, 0.9657973283750061, 0.9983081740146483, 0.8626094974642693, nan, 0.84653806407165, 0.9422632794457275, nan, 0.9683021351272302, nan, nan, nan, nan, 0.9593227878740301, nan, nan, nan, nan, nan, 0.8864570489108097, nan, nan, 0.58900435802883, 0.7032088320264135, nan, nan, nan, 0.0, nan, 0.2311111111111111, nan, nan, nan, 0.9695629235826468, nan, nan, 0.988882214597693, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4598842018196857, nan, 0.7046945034206181, 0.9838480825672025, nan, nan, 0.9130099228224917, nan, 0.8652101952995697, nan, nan, 0.8021866481062584, 0.8494063133507095, nan, nan, 0.8766978489509011, nan, nan, nan, nan, nan, nan, 0.9764239515960776, 0.0, nan, 0.9714925511672159, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12445141065830721, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8889357218124342, 0.733392382639504, nan, nan, nan, 0.9207526144351106, nan, nan, nan, 0.22006159260888694, 0.8215691419574914, nan, nan, 0.9269617706237424, 0.7966836348974347, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0094 | 92.0 | 1840 | 0.2154 | 0.6404 | 0.7477 | 0.9429 | [0.8719240677257694, 0.9787120944947892, 0.9610061248760517, 0.9265877926835305, 0.2526186913345123, 0.9664503059368281, 0.983324597038396, nan, 0.7069952385776517, nan, 0.8516121388053733, 0.8845191040843214, 0.9212644094830711, 0.764856776400171, 0.4649792170730194, 0.9143512086690747, 0.9757033248081841, 0.5955378486055777, nan, 0.8039766234614254, 0.430152872957301, nan, 0.9275514489710206, nan, 0.0, nan, nan, 0.9214947638062232, nan, nan, nan, nan, nan, 0.6800415081286753, nan, nan, 0.5575525201693426, 0.702211466673532, 0.0, nan, nan, 0.0, nan, 0.22577777777777777, nan, nan, nan, 0.8635739328441826, nan, nan, 0.9702187945845098, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.355, nan, 0.683093833055543, 0.8888087023868196, nan, nan, 0.8277449704734261, 0.0, 0.8539428273751384, nan, nan, 0.7707936863315795, 0.844559585492228, nan, nan, 0.8707721260912751, nan, nan, nan, nan, nan, nan, 0.8187062937062937, 0.0, nan, 0.9684231825408296, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10716382853787434, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.894293535481154, 0.684297520661157, nan, nan, nan, 0.9199897203066775, nan, nan, nan, 0.21001842266865514, 0.8198692810457516, 0.0, nan, 0.9047900068850202, 0.7755081902506414, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9603853347055764, 0.9878202912606334, 0.9961299491043691, 0.9672278181529614, 0.27001090512540893, 0.993772052884411, 0.9915711379907594, nan, 0.7478600993931925, nan, 0.999461160070223, 0.9745246044418638, 0.9446179006839132, 0.8387248007501172, 0.4722821720569007, 0.966115378969516, 0.9983294739820898, 0.8614568925772246, nan, 0.8482604202549087, 0.9422632794457275, nan, 0.9689236618894413, nan, nan, nan, nan, 0.9601601444440083, nan, nan, nan, nan, nan, 0.8888614878750514, nan, nan, 0.5849815621857191, 0.7035699546017334, nan, nan, nan, 0.0, nan, 0.22577777777777777, nan, nan, nan, 0.9696237977354815, nan, nan, 0.9900734058907973, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46980976013234077, nan, 0.715451757489974, 0.9814573655464635, nan, nan, 0.9117971334068358, nan, 0.8681893412777226, nan, nan, 0.7987355721825881, 0.8496959165942659, nan, nan, 0.8803445906664552, nan, nan, nan, nan, nan, nan, 0.9770498643855623, 0.0, nan, 0.9717884160207918, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.11442006269592477, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8950474183350896, 0.733392382639504, nan, nan, nan, 0.9205811760672038, nan, nan, nan, 0.2106467223933128, 0.8228811335607452, nan, nan, 0.9254527162977867, 0.792197973892445, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0151 | 93.0 | 1860 | 0.2154 | 0.6290 | 0.7483 | 0.9430 | [0.8715209852503526, 0.9784095843870539, 0.9608048483457348, 0.9260015544407546, 0.2572144695840672, 0.9673640860081538, 0.9834939003267514, nan, 0.7018441842152057, nan, 0.853307808743656, 0.8853905941246213, 0.9183097912548812, 0.7668232566652431, 0.46809237072443977, 0.9162067118470388, 0.9757187719319115, 0.5927680404231802, nan, 0.800952039385739, 0.4303062302006336, nan, 0.9268215750183998, nan, 0.0, nan, nan, 0.9216498468644876, nan, nan, nan, nan, nan, 0.6819824389006879, nan, nan, 0.5560710286354184, 0.7039182370507672, 0.0, nan, nan, 0.0, nan, 0.21511111111111111, nan, nan, nan, 0.8648100532357947, nan, nan, 0.9702673540342884, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34766355140186916, nan, 0.6813236422734865, 0.8922890338002918, nan, nan, 0.8253968253968254, 0.0, 0.8567806527109608, nan, nan, 0.7752239641657335, 0.842832469775475, nan, nan, 0.8692331833873836, nan, nan, nan, nan, nan, nan, 0.8216045038705138, 0.0, nan, 0.9677257089994508, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1365237815619495, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8792413066385669, 0.6912251655629139, nan, nan, nan, 0.9243981838430566, nan, nan, nan, 0.20345523107954047, 0.8247126436781609, 0.0, nan, 0.9063328424153166, 0.7943335796994334, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9608932859688146, 0.9879389415363663, 0.9960324542269813, 0.9674566587064604, 0.27604507451835697, 0.9935807507545806, 0.9915097856917, nan, 0.740211377975433, nan, 0.9994785420034417, 0.975613296559733, 0.9440231935771632, 0.8418056392739937, 0.4751492619606484, 0.962494495278172, 0.998369031064481, 0.8653757491931766, nan, 0.8462280399586635, 0.941108545034642, nan, 0.9668397192161451, nan, nan, nan, nan, 0.9606834923002446, nan, nan, nan, nan, nan, 0.8842786683107275, nan, nan, 0.5826349312772376, 0.7052981840693355, nan, nan, nan, 0.0, nan, 0.21511111111111111, nan, nan, nan, 0.9691165131285256, nan, nan, 0.9887498600095702, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46153846153846156, nan, 0.7120547298891248, 0.9867052809578417, nan, nan, 0.9115766262403528, nan, 0.8724925521350546, nan, nan, 0.8030566672466795, 0.8479582971329279, nan, nan, 0.878283388827229, nan, nan, nan, nan, nan, nan, 0.9743375756311288, 0.0, nan, 0.9709762379913677, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.14576802507836992, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8792413066385669, 0.7395925597874224, nan, nan, nan, 0.9249528544488257, nan, nan, nan, 0.20413550373955125, 0.8283914982944109, nan, nan, 0.928672032193159, 0.8125094501285217, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0294 | 94.0 | 1880 | 0.2178 | 0.6156 | 0.7465 | 0.9425 | [0.8710979112887598, 0.9778437211107763, 0.9602493434349065, 0.9267066427187487, 0.24373078237102835, 0.9680432441388451, 0.983342465240216, nan, 0.705094717622296, nan, 0.8481425221584792, 0.8827269747623729, 0.9196342404941131, 0.7686095931997571, 0.4578819950960613, 0.9127661049857846, 0.975693463436105, 0.5940451416680006, nan, 0.8010089503661514, 0.4263322884012539, nan, 0.9261801372741281, nan, 0.0, nan, nan, 0.9213295703982439, nan, nan, nan, nan, nan, 0.6800988914084152, nan, nan, 0.55888, 0.702054794520548, 0.0, nan, nan, 0.0, 0.0, 0.21066666666666667, nan, nan, nan, 0.8658456962530379, nan, nan, 0.9699962058430017, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3458411507191995, nan, 0.6733907524932004, 0.8891105045839457, nan, nan, 0.828577150295384, 0.0, 0.8381412348257408, nan, nan, 0.7731886692206594, 0.8436510221710337, nan, nan, 0.861675392670157, nan, nan, nan, nan, nan, nan, 0.820476858345021, 0.0, nan, 0.9675278118278327, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12617096018735363, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8891464699683878, 0.6886870355078447, nan, nan, nan, 0.9224407593092514, nan, nan, nan, 0.21862667719021311, 0.8125, 0.0, nan, 0.9054187192118227, 0.7778490379871732, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9601164193309208, 0.9876500539084948, 0.996315024125851, 0.9674271308931056, 0.2593238822246456, 0.9935169833779705, 0.990981212038265, nan, 0.7448863408034506, nan, 0.9996349794024091, 0.9773552039483234, 0.9445187828327882, 0.8478333668207086, 0.46479938247294383, 0.966115378969516, 0.9982807883422236, 0.8554633471645919, nan, 0.8477781605235962, 0.9422632794457275, nan, 0.9669494004094764, nan, nan, nan, nan, 0.9610367521032042, nan, nan, nan, nan, nan, 0.887566789971229, nan, nan, 0.585484411666108, 0.7032862154354107, nan, nan, nan, 0.0, nan, 0.21066666666666667, nan, nan, nan, 0.9687309768272392, nan, nan, 0.9890858370409586, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4574028122415219, nan, 0.7008728473696627, 0.9877354273163715, nan, nan, 0.912348401323043, nan, 0.8501820589208872, nan, nan, 0.8018676410881039, 0.8485375036200405, nan, nan, 0.8698271761534803, nan, nan, nan, nan, nan, nan, 0.9764239515960776, 0.0, nan, 0.9707383858541793, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.13510971786833856, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8891464699683878, 0.7387068201948627, nan, nan, nan, 0.9226384364820847, nan, nan, nan, 0.21935767707875056, 0.8152715822618736, nan, nan, 0.9245472837022133, 0.7946676074794617, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0313 | 95.0 | 1900 | 0.2157 | 0.6279 | 0.7476 | 0.9422 | [0.8694681244345772, 0.9781277295643424, 0.9606870247258134, 0.92642325025292, 0.25775850386231197, 0.9671093459103884, 0.9836064038947664, nan, 0.7010461288149381, nan, 0.8548303182554665, 0.883655235842859, 0.9229220243582901, 0.7706326953553578, 0.433197249344158, 0.9134826526130874, 0.9757341461601368, 0.5947596849381128, nan, 0.8011308562197093, 0.4369296833064949, nan, 0.9267678533958174, nan, 0.0, nan, nan, 0.9212764356137578, nan, nan, nan, nan, nan, 0.679515037002047, nan, nan, 0.5611827618748034, 0.7034500514933059, 0.0, nan, nan, 0.0, nan, 0.22311111111111112, nan, nan, nan, 0.8673984951473955, nan, nan, 0.9695080525916158, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3625694873378629, nan, 0.6771282144314935, 0.8905449874179527, nan, nan, 0.8248610007942812, 0.0, 0.84876523098977, nan, nan, 0.7765303842716711, 0.8431203223949338, nan, nan, 0.866802635707562, nan, nan, nan, nan, nan, nan, 0.8208771929824561, 0.0, nan, 0.9679580945993606, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0748742231429417, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8839022334597556, 0.6954397394136808, nan, nan, nan, 0.9224507283633248, nan, nan, nan, 0.21997194459056638, 0.8255419169495952, 0.0, nan, 0.9055899400726987, 0.7891673243883188, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9604118942487523, 0.9878460847988362, 0.9959944477493555, 0.9666594077458837, 0.2765539803707743, 0.9937507970922076, 0.9916324902898188, nan, 0.7387780799164133, nan, 0.9995828336027534, 0.9762665118304543, 0.9426355436614134, 0.8467617708124037, 0.4396257029883899, 0.9668738073102706, 0.9984055452943807, 0.8529276164130936, nan, 0.8541164312779883, 0.9399538106235565, nan, 0.9688505410938871, nan, nan, nan, nan, 0.9609451662283628, nan, nan, nan, nan, nan, 0.8868886148787505, nan, nan, 0.5980556486758297, 0.7047565002063557, nan, nan, nan, 0.0, nan, 0.22311111111111112, nan, nan, nan, 0.9684266060630656, nan, nan, 0.9879760947251606, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4855252274607113, nan, 0.704411417787214, 0.9836342786059982, nan, nan, 0.915986769570011, nan, 0.8623634558093347, nan, nan, 0.8063917406182936, 0.8482479003764842, nan, nan, 0.876010781671159, nan, nan, nan, nan, nan, nan, 0.9762153139995827, 0.0, nan, 0.9712430964867499, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.07931034482758621, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8840885142255005, 0.7564216120460585, nan, nan, nan, 0.9227670152580147, nan, nan, nan, 0.22076550813902332, 0.829441091577014, nan, nan, 0.927364185110664, 0.8063101658182551, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0272 | 96.0 | 1920 | 0.2180 | 0.6272 | 0.7464 | 0.9421 | [0.8691527802139976, 0.9785377877097258, 0.9603828248062065, 0.9257213383173075, 0.24600246002460024, 0.9664435077279114, 0.9835711946095176, nan, 0.7011954222210894, nan, 0.8435959868575452, 0.8829766216661179, 0.925239336435166, 0.7702743809697633, 0.4370729283140763, 0.9129981946951812, 0.9757098626820571, 0.5956770502225047, nan, 0.8011224358147336, 0.4396082698585419, nan, 0.926262767594795, nan, 0.0, nan, nan, 0.9233690067420236, nan, nan, nan, nan, nan, 0.681444602497234, nan, nan, 0.5648854961832062, 0.7027110527535336, 0.0, nan, nan, 0.0, nan, 0.23377777777777778, nan, nan, nan, 0.8634918340800694, nan, nan, 0.9697299021904943, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3445859872611465, nan, 0.6765211879753712, 0.8921718368708415, nan, nan, 0.8258237395791981, 0.0, 0.8540811673506612, nan, nan, 0.7737293581826818, 0.8346297896859695, nan, nan, 0.8700762720718838, nan, nan, nan, nan, nan, nan, 0.8213783403656821, 0.0, nan, 0.9677361534325147, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.08242886476972719, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8811380400421497, 0.6824877250409165, nan, nan, nan, 0.9157881203394189, nan, nan, nan, 0.2175934046658481, 0.817398119122257, 0.0, nan, 0.9066117323916029, 0.7954556656316231, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9596095747153149, 0.9871496592673603, 0.996405909181043, 0.9661832717555392, 0.2617230098146129, 0.9941759129362752, 0.9920147623070352, nan, 0.7370098991333235, nan, 0.9996871252020649, 0.9731455944258963, 0.9411735553573198, 0.8479673163217467, 0.4433592211597536, 0.9650878308949454, 0.9982381884073406, 0.863992623328723, nan, 0.8555976575955908, 0.9330254041570438, nan, 0.9681193331383445, nan, nan, nan, nan, 0.9640460022765632, nan, nan, nan, nan, nan, 0.886046033703247, nan, nan, 0.5953737847804224, 0.704034255055716, nan, nan, nan, 0.0, nan, 0.23377777777777778, nan, nan, nan, 0.9698470029625421, nan, nan, 0.9902261227232465, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.44747725392886684, nan, 0.7050247699929229, 0.9877548640401174, nan, nan, 0.91742006615215, nan, 0.8679907315458457, nan, nan, 0.8030566672466795, 0.8389805965826818, nan, nan, 0.8802388880080334, nan, nan, nan, nan, nan, nan, 0.9747548508241185, 0.0, nan, 0.9709530329048127, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.08808777429467085, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8811380400421497, 0.7387068201948627, nan, nan, nan, 0.9158666209497686, nan, nan, nan, 0.21830180378354597, 0.82104434531619, nan, nan, 0.9297786720321931, 0.8134166624666096, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0358 | 97.0 | 1940 | 0.2203 | 0.6268 | 0.7468 | 0.9413 | [0.8667706524324076, 0.9788105476839376, 0.9603470825029828, 0.9252487637036854, 0.25049269452939177, 0.9667342044326828, 0.9835600668267138, nan, 0.6951745432275186, nan, 0.8428889670535819, 0.8850422832980972, 0.921143575243481, 0.7692962715163312, 0.394693509802094, 0.9127507461650587, 0.975705937458179, 0.593263025342358, nan, 0.8040344883683097, 0.43712898003237993, nan, 0.9269436762698218, nan, 0.0, nan, nan, 0.9229101934449233, nan, nan, nan, nan, nan, 0.6798893133971669, nan, nan, 0.5646702771583306, 0.7010763209393346, 0.0, nan, nan, 0.0, nan, 0.24177777777777779, nan, nan, nan, 0.8630989982853533, nan, nan, 0.9702822291317866, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3568796068796069, nan, 0.6772532188841202, 0.8934443544695982, nan, nan, 0.8262119559829484, 0.0, 0.8640239536548852, nan, nan, 0.7753443025951895, 0.829057365234938, nan, nan, 0.8696969696969697, nan, nan, nan, nan, nan, nan, 0.8193763139453398, 0.0, nan, 0.9671965259426735, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.09840425531914894, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8805057955742888, 0.679835390946502, nan, nan, nan, 0.9149775016070281, nan, nan, nan, 0.22212482468443198, 0.8251565762004175, 0.0, nan, 0.9079268888671684, 0.7813994772917797, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9599559554242334, 0.9875056100945591, 0.9961910899596801, 0.9647474818311673, 0.26797528171573975, 0.9938995876376313, 0.9918967771165363, nan, 0.7298434088382248, nan, 0.9996697432688464, 0.9722746407316011, 0.9444940033700069, 0.847096644564999, 0.4002583531561619, 0.9651612271859862, 0.9983173025721233, 0.8688335638543108, nan, 0.8512573200137789, 0.9353348729792148, nan, 0.9681193331383445, nan, nan, nan, nan, 0.963784328348445, nan, nan, nan, nan, nan, 0.8886765310316481, nan, nan, 0.5942004693261816, 0.702306025588114, nan, nan, nan, 0.0, nan, 0.24177777777777779, nan, nan, nan, 0.9703339961852198, nan, nan, 0.9912544160617384, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.48056244830438377, nan, 0.7072894550601557, 0.9864720402728916, nan, nan, 0.9188533627342889, nan, 0.8787818603111552, nan, nan, 0.8049127080795777, 0.8328989284679988, nan, nan, 0.879763226045135, nan, nan, nan, nan, nan, nan, 0.975798038806593, 0.0, nan, 0.9703555019260222, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10438871473354232, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8805057955742888, 0.7316209034543845, nan, nan, nan, 0.9150951482941883, nan, nan, nan, 0.22296524417069952, 0.8297034898976646, nan, nan, 0.9345070422535211, 0.7986492616299582, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0094 | 98.0 | 1960 | 0.2192 | 0.6399 | 0.7481 | 0.9418 | [0.8683307516618989, 0.9781646669255769, 0.9602976521349946, 0.9259917508983732, 0.24664349485449466, 0.966499266331866, 0.9833601692535245, nan, 0.7047768107090141, nan, 0.8476754812652929, 0.8831996845011174, 0.9232748367678827, 0.7646590769984086, 0.41177658038168297, 0.9126312381480968, 0.9757201572658113, 0.5931962025316456, nan, 0.8037802907915994, 0.42692509167103193, nan, 0.9273019871256647, nan, 0.0, nan, nan, 0.922998049317261, nan, nan, nan, nan, nan, 0.6794756801837952, nan, nan, 0.5670152470663367, 0.7035969000231725, 0.0, nan, nan, 0.0, nan, 0.2391111111111111, nan, nan, nan, 0.8639567669172933, nan, nan, 0.9690713431646074, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3424317617866005, nan, 0.6763498579096937, 0.8914826720064639, nan, nan, 0.8272106149123676, 0.0, 0.8548807506842174, nan, nan, 0.7732863692273286, 0.8361175115207373, nan, nan, 0.867357079345154, nan, nan, nan, nan, nan, nan, 0.8204589245051673, 0.0, nan, 0.9673528391349601, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12514619883040937, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8902002107481559, 0.6872964169381107, nan, nan, nan, 0.9133638973392176, nan, nan, nan, 0.22168653576437589, 0.8224079394097675, 0.0, nan, 0.9044305038227799, 0.8016064652835954, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9596936799353718, 0.9874746578487157, 0.9962984995703615, 0.9653786388416239, 0.26310432569974557, 0.9940483781830549, 0.9914956274688401, nan, 0.7419259775226715, nan, 0.9996002155359719, 0.9752503991871099, 0.9425612052730696, 0.8367155582345456, 0.41774445092077694, 0.9655037432108431, 0.9983051311621567, 0.8642231443061319, nan, 0.8569410954185326, 0.941108545034642, nan, 0.9690699034805499, nan, nan, nan, nan, 0.9657599665057373, nan, nan, nan, nan, nan, 0.8873818331278257, nan, nan, 0.5952899765336909, 0.7048854725546843, nan, nan, nan, 0.0, nan, 0.2391111111111111, nan, nan, nan, 0.9699484598839333, nan, nan, 0.9901752171124302, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.456575682382134, nan, 0.7074309978768577, 0.9864720402728916, nan, nan, 0.9210584343991179, nan, 0.8685203574975173, nan, nan, 0.8022156487442724, 0.8407182160440196, nan, nan, 0.8764335923048465, nan, nan, nan, nan, nan, nan, 0.9772585019820572, 0.0, nan, 0.970517937531907, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1341692789968652, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8902002107481559, 0.7475642161204605, nan, nan, nan, 0.9136379221669809, nan, nan, nan, 0.22252529696436427, 0.8262923117292049, nan, nan, 0.9282696177062374, 0.8198679502041227, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0289 | 99.0 | 1980 | 0.2196 | 0.6276 | 0.7470 | 0.9419 | [0.8688042207392105, 0.978883213432935, 0.9604697365339485, 0.9255546437263003, 0.2485864159683902, 0.9668768737723561, 0.9835718765503749, nan, 0.703681566732371, nan, 0.8439040854660719, 0.8840503326964886, 0.9217943436008046, 0.7697360418440579, 0.42132074299558925, 0.9135719577699574, 0.975721673551077, 0.5945471938775511, nan, 0.8042491080116769, 0.43726440495422725, nan, 0.9274515293623574, nan, 0.0, nan, nan, 0.9220969376900647, nan, nan, nan, nan, nan, 0.6825477223083984, nan, nan, 0.5630975143403442, 0.7031306317903301, 0.0, nan, nan, 0.0, nan, 0.2328888888888889, nan, nan, nan, 0.8630948082894072, nan, nan, 0.9699630268180142, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3386683738796415, nan, 0.6773129743404409, 0.8921434581405132, nan, nan, 0.8270879011361372, 0.0, 0.8579053570266224, nan, nan, 0.773250936748504, 0.8340535868625756, nan, nan, 0.8706509246682688, nan, nan, nan, nan, nan, nan, 0.8219803370786517, 0.0, nan, 0.9675971968452945, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10971695360373504, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8878819810326659, 0.6881632653061225, nan, nan, nan, 0.9149528706083976, nan, nan, nan, 0.225124923292715, 0.8215778474399164, 0.0, nan, 0.9079502009606901, 0.7788082083662194, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9596516273253434, 0.9876294190779326, 0.9961976997818759, 0.9657735733452428, 0.265285350781534, 0.9939846108064447, 0.9917740725184175, nan, 0.7412160261476431, nan, 0.9995828336027534, 0.973943968645667, 0.9425116463475072, 0.8476324425691514, 0.42735392807070055, 0.9653814160591084, 0.998369031064481, 0.859612724757953, nan, 0.8541164312779883, 0.9376443418013857, nan, 0.9688871014916642, nan, nan, nan, nan, 0.9640329185801573, nan, nan, nan, nan, nan, 0.8861898890258939, nan, nan, 0.5923566878980892, 0.704472761040033, nan, nan, nan, 0.0, nan, 0.2328888888888889, nan, nan, nan, 0.9701716651109938, nan, nan, 0.9909184390303499, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.43755169561621177, nan, 0.707383816937957, 0.9892320550448016, nan, nan, 0.9149944873208379, nan, 0.8725587553790136, nan, nan, 0.801954643002146, 0.838401390095569, nan, nan, 0.8808202526293536, nan, nan, nan, nan, nan, nan, 0.9768412267890674, 0.0, nan, 0.9708080011138441, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.11786833855799372, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8878819810326659, 0.7466784765279008, nan, nan, nan, 0.9152665866620949, nan, nan, nan, 0.22595688517377915, 0.8252427184466019, nan, nan, 0.9317907444668008, 0.7957260218738975, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0231 | 100.0 | 2000 | 0.2162 | 0.6275 | 0.7470 | 0.9419 | [0.8685218749687071, 0.9788782456158296, 0.9606716149161357, 0.9253138334602241, 0.24566552901023891, 0.9673287812952618, 0.9833229666973768, nan, 0.6997756475627167, nan, 0.8422355336194145, 0.8849884526558891, 0.9198037983859276, 0.7675987245523669, 0.4259095355662179, 0.9140075841657418, 0.9757245751875003, 0.5952880426012587, nan, 0.8031902797180078, 0.43475935828877005, nan, 0.9271270640917996, nan, 0.0, nan, nan, 0.9220219140987388, nan, nan, nan, nan, nan, 0.6804193386143593, nan, nan, 0.5649463383838383, 0.7019282753649306, 0.0, nan, nan, 0.0, nan, 0.24444444444444444, nan, nan, nan, 0.8639147321831813, nan, nan, 0.9696930423599549, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3532369578881207, nan, 0.6765980498374865, 0.8919627154414351, nan, nan, 0.8257598257598258, 0.0, 0.8610134166992315, nan, nan, 0.7764935790061418, 0.8315092165898618, nan, nan, 0.869378791047898, nan, nan, nan, nan, nan, nan, 0.8238296374516015, 0.0, nan, 0.9686018546354324, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.09776864357017029, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8853530031612223, 0.6824877250409165, nan, nan, nan, 0.9154241645244215, nan, nan, nan, 0.2217741935483871, 0.8228541612314114, 0.0, nan, 0.9068718753063425, 0.777870955011839, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9598309042417804, 0.9873921185264668, 0.9962340538039527, 0.9644669676042977, 0.26165030897855324, 0.9937295413000042, 0.9923073655794724, nan, 0.7353488808219362, nan, 0.9995828336027534, 0.9734359123239947, 0.9432798096937258, 0.8383899269975219, 0.4320956536807448, 0.9670939961833929, 0.998369031064481, 0.8503918856615952, nan, 0.8516362383740957, 0.9387990762124712, nan, 0.9688871014916642, nan, nan, nan, nan, 0.9622535358689537, nan, nan, nan, nan, nan, 0.8896629675297986, nan, nan, 0.5999832383506537, 0.7032862154354107, nan, nan, nan, 0.0, nan, 0.24444444444444444, nan, nan, nan, 0.9703745789537762, nan, nan, 0.9902872094562263, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46484698097601324, nan, 0.7071479122434536, 0.9857723182180411, nan, nan, 0.9196251378169791, nan, 0.8752068851373718, nan, nan, 0.8066237457224059, 0.8360845641471184, nan, nan, 0.8787061994609164, nan, nan, nan, nan, nan, nan, 0.9766325891925725, 0.0, nan, 0.9719508516266766, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10438871473354232, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8853530031612223, 0.7387068201948627, nan, nan, nan, 0.9157380421738385, nan, nan, nan, 0.22261328640563133, 0.8276043033324587, nan, nan, 0.9306841046277666, 0.7947684088503604, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
asenella/mmnist_MVTCAEconfig2_seed_2_ratio_02_i
|
asenella
| 2023-05-13T01:46:07Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-13T01:45:58Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
asenella/mmnist_MVTCAEconfig2_seed_1_ratio_02_i
|
asenella
| 2023-05-13T01:45:23Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-13T01:45:15Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
rohitp1/dgx1_whisper_base_mozilla_noisy_teacher_distil_epochs_50_batch_16
|
rohitp1
| 2023-05-13T01:30:40Z | 75 | 0 |
transformers
|
[
"transformers",
"pytorch",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-05-05T04:40:21Z |
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: dgx1_whisper_base_mozilla_noisy_teacher_distil_epochs_50_batch_16
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# dgx1_whisper_base_mozilla_noisy_teacher_distil_epochs_50_batch_16
This model is a fine-tuned version of [rohitp1/dgx1_whisper_base_finetune_teacher_babble_noise_mozilla_100_epochs_batch_16](https://huggingface.co/rohitp1/dgx1_whisper_base_finetune_teacher_babble_noise_mozilla_100_epochs_batch_16) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0845
- Wer: 33.7337
## Model description
More information needed
## Intended uses & limitations
More information needed
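The card does not document usage, but as a rough sketch (not part of the original card), the checkpoint can presumably be loaded with the standard transformers ASR pipeline; the audio path below is a placeholder:
```python
from transformers import pipeline

# Minimal sketch: load the distilled Whisper checkpoint as a speech-recognition pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="rohitp1/dgx1_whisper_base_mozilla_noisy_teacher_distil_epochs_50_batch_16",
)

# "audio.wav" is a placeholder path to a (preferably 16 kHz, mono) recording.
print(asr("audio.wav")["text"])
```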
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 256
- total_train_batch_size: 4096
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine_with_restarts
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| 0.1214 | 4.41 | 150 | 0.5017 | 32.6283 |
| 0.1796 | 8.82 | 300 | 0.7222 | 33.1175 |
| 0.3176 | 13.23 | 450 | 0.8749 | 33.3612 |
| 0.3986 | 17.64 | 600 | 0.9770 | 33.5457 |
| 0.4497 | 22.06 | 750 | 1.0385 | 33.7111 |
| 0.4816 | 26.47 | 900 | 1.0845 | 33.7337 |
### Framework versions
- Transformers 4.25.1
- Pytorch 1.12.1
- Datasets 2.8.0
- Tokenizers 0.13.2
|
asenella/mmnist_MVTCAEconfig2_seed_3_ratio_0_c
|
asenella
| 2023-05-13T01:20:15Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-13T01:20:07Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
DunnBC22/distilbert-base-uncased-Emotions_Detection
|
DunnBC22
| 2023-05-13T01:16:11Z | 9 | 1 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2022-10-02T00:25:00Z |
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: distilbert-base-uncased-Emotions_Detection
results: []
language:
- en
pipeline_tag: text-classification
---
# distilbert-base-uncased-Emotions_Detection
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased).
It achieves the following results on the evaluation set:
- Loss: 0.1440
- Accuracy: 0.9345
- F1 Score: 0.9347
## Model description
This is a sentiment analysis (text classification) model.
For more information on how it was created, check out the following link: https://github.com/DunnBC22/NLP_Projects/blob/main/Sentiment%20Analysis/Emotions%20Sentiment%20Analysis%20Project.ipynb
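As a usage sketch (not part of the original card), the model can be loaded with the standard text-classification pipeline; the example sentence is illustrative:
```python
from transformers import pipeline

# Minimal sketch: classify the emotion of a sentence; label names come from the model config.
classifier = pipeline(
    "text-classification",
    model="DunnBC22/distilbert-base-uncased-Emotions_Detection",
)

print(classifier("I can't believe how well this turned out!"))
```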
## Intended uses & limitations
This model is intended to demonstrate my ability to solve a complex problem using technology.
## Training and evaluation data
Dataset Source: https://www.kaggle.com/datasets/praveengovi/emotions-dataset-for-nlp
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 4
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|
| 0.7287 | 1.0 | 250 | 0.2597 | 0.8955 | 0.8948 |
| 0.2054 | 2.0 | 500 | 0.1638 | 0.9325 | 0.9326 |
| 0.1338 | 3.0 | 750 | 0.1415 | 0.935 | 0.9350 |
| 0.1067 | 4.0 | 1000 | 0.1440 | 0.9345 | 0.9347 |
### Framework versions
- Transformers 4.22.1
- Pytorch 1.12.1
- Datasets 2.4.0
- Tokenizers 0.12.1
|
moghis/my-RL-Taxi-v3
|
moghis
| 2023-05-13T01:10:38Z | 0 | 0 | null |
[
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T01:09:11Z |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: my-RL-Taxi-v3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.56 +/- 2.71
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
import gym

# load_from_hub is the helper defined in the Deep RL course notebooks; it downloads and unpickles the Q-table.
model = load_from_hub(repo_id="moghis/my-RL-Taxi-v3", filename="q-learning.pkl")

# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
deetungsten/wizard-vicuna-13B-GPTQ-8bit-128g
|
deetungsten
| 2023-05-13T01:08:57Z | 6 | 2 |
transformers
|
[
"transformers",
"llama",
"text-generation",
"causal-lm",
"en",
"autotrain_compatible",
"region:us"
] |
text-generation
| 2023-05-12T20:50:50Z |
---
language:
- en
tags:
- causal-lm
- llama
inference: false
---
# Wizard-Vicuna-13B-GPTQ-8bit-128g
This repository contains 8-bit quantized models in GPTQ format of [TheBloke's wizard-vicuna 13B in FP16 HF format](https://huggingface.co/TheBloke/wizard-vicuna-13B-HF).
These models are the result of quantization to 8-bit using [GPTQ-for-LLaMa](https://github.com/qwopqwop200/GPTQ-for-LLaMa).
While most metrics suggest that 8-bit is only marginally better than 4-bit, I have found that the 8-bit model follows instructions significantly better. The responses from the 8-bit model feel very close to the quality of GPT-3, whereas the 4-bit model lacks some "intelligence".
With this quantized model, I can replace GPT-3 for most of my work. However, a drawback is that it requires approximately 15GB of VRAM, so you need a GPU with at least 16GB of VRAM.
The content below is copied directly from TheBloke's README, with the 4-bit content changed to 8-bit and the references updated to this model.
## How to easily download and use this model in text-generation-webui
Open the text-generation-webui UI as normal.
1. Click the **Model tab**.
2. Under **Download custom model or LoRA**, enter `deetungsten/wizard-vicuna-13B-GPTQ-8bit-128g`.
3. Click **Download**.
4. Wait until it says it's finished downloading.
5. Click the **Refresh** icon next to **Model** in the top left.
6. In the **Model drop-down**: choose the model you just downloaded, `wizard-vicuna-13B-GPTQ-8bit-128g`.
7. If you see an error in the bottom right, ignore it - it's temporary.
8. Fill out the `GPTQ parameters` on the right: `Bits = 8`, `Groupsize = 128`, `model_type = Llama`
9. Click **Save settings for this model** in the top right.
10. Click **Reload the Model** in the top right.
11. Once it says it's loaded, click the **Text Generation tab** and enter a prompt!
## Provided files
**Compatible file - wizard-vicuna-13B-GPTQ-8bit-128g.no-act-order.safetensors**
In the `main` branch - the default one - you will find `wizard-vicuna-13B-GPTQ-8bit-128g.no-act-order.safetensors`
This will work with all versions of GPTQ-for-LLaMa and has maximum compatibility.
It was created without the `--act-order` parameter. It may have slightly lower inference quality compared to the other file, but is guaranteed to work on all versions of GPTQ-for-LLaMa and text-generation-webui.
* `wizard-vicuna-13B-GPTQ-8bit-128g.no-act-order.safetensors`
* Works with all versions of GPTQ-for-LLaMa code, both Triton and CUDA branches
* Works with text-generation-webui one-click-installers
* Parameters: Groupsize = 128g. No act-order.
* Command used to create the GPTQ:
```
CUDA_VISIBLE_DEVICES=0 python3 llama.py wizard-vicuna-13B-HF c4 --wbits 8 --true-sequential --groupsize 128 --save_safetensors wizard-vicuna-13B-GPTQ-8bit.compat.no-act-order.safetensors
```
# Original WizardVicuna-13B model card
Github page: https://github.com/melodysdreamj/WizardVicunaLM
# WizardVicunaLM
### Wizard's dataset + ChatGPT's conversation extension + Vicuna's tuning method
I am a big fan of the ideas behind WizardLM and VicunaLM. I particularly like the idea of WizardLM handling the dataset itself more deeply and broadly, as well as VicunaLM overcoming the limitations of single-turn conversations by introducing multi-round conversations. As a result, I combined these two ideas to create WizardVicunaLM. This project is highly experimental and designed for proof of concept, not for actual usage.
## Benchmark
### Approximately 7% performance improvement over VicunaLM

### Detail
The questions presented here are not from rigorous tests, but rather, I asked a few questions and requested GPT-4 to score them. The models compared were ChatGPT 3.5, WizardVicunaLM, VicunaLM, and WizardLM, in that order.
| | gpt3.5 | wizard-vicuna-13b | vicuna-13b | wizard-7b | link |
|-----|--------|-------------------|------------|-----------|----------|
| Q1 | 95 | 90 | 85 | 88 | [link](https://sharegpt.com/c/YdhIlby) |
| Q2 | 95 | 97 | 90 | 89 | [link](https://sharegpt.com/c/YOqOV4g) |
| Q3 | 85 | 90 | 80 | 65 | [link](https://sharegpt.com/c/uDmrcL9) |
| Q4 | 90 | 85 | 80 | 75 | [link](https://sharegpt.com/c/XBbK5MZ) |
| Q5 | 90 | 85 | 80 | 75 | [link](https://sharegpt.com/c/AQ5tgQX) |
| Q6 | 92 | 85 | 87 | 88 | [link](https://sharegpt.com/c/eVYwfIr) |
| Q7 | 95 | 90 | 85 | 92 | [link](https://sharegpt.com/c/Kqyeub4) |
| Q8 | 90 | 85 | 75 | 70 | [link](https://sharegpt.com/c/M0gIjMF) |
| Q9 | 92 | 85 | 70 | 60 | [link](https://sharegpt.com/c/fOvMtQt) |
| Q10 | 90 | 80 | 75 | 85 | [link](https://sharegpt.com/c/YYiCaUz) |
| Q11 | 90 | 85 | 75 | 65 | [link](https://sharegpt.com/c/HMkKKGU) |
| Q12 | 85 | 90 | 80 | 88 | [link](https://sharegpt.com/c/XbW6jgB) |
| Q13 | 90 | 95 | 88 | 85 | [link](https://sharegpt.com/c/JXZb7y6) |
| Q14 | 94 | 89 | 90 | 91 | [link](https://sharegpt.com/c/cTXH4IS) |
| Q15 | 90 | 85 | 88 | 87 | [link](https://sharegpt.com/c/GZiM0Yt) |
| Avg | 91 | 88 | 82 | 80 | |
## Principle
We adopted the approach of WizardLM, which is to extend a single problem more in-depth. However, instead of using individual instructions, we expanded it using Vicuna's conversation format and applied Vicuna's fine-tuning techniques.
Turning a single command into a rich conversation is what we've done [here](https://sharegpt.com/c/6cmxqq0).
After creating the training data, I later trained it according to the Vicuna v1.1 [training method](https://github.com/lm-sys/FastChat/blob/main/scripts/train_vicuna_13b.sh).
## Detailed Method
First, we explore and expand various areas in the same topic using the 7K conversations created by WizardLM. However, we made it in a continuous conversation format instead of the instruction format. That is, it starts with WizardLM's instruction, and then expands into various areas in one conversation using ChatGPT 3.5.
After that, we fine-tuned the model using Vicuna's fine-tuning format.
## Training Process
Trained with 8 A100 GPUs for 35 hours.
## Weights
You can see the [dataset](https://huggingface.co/datasets/junelee/wizard_vicuna_70k) we used for training and the [13b model](https://huggingface.co/junelee/wizard-vicuna-13b) on Hugging Face.
## Conclusion
If we extend the conversations to GPT-4's 32K context, we can expect a dramatic improvement, as we could generate 8x more data that is more accurate and richer.
## License
The model is licensed under the LLaMA license, and the dataset is licensed under OpenAI's terms because it uses ChatGPT. Everything else is free.
## Author
[JUNE LEE](https://github.com/melodysdreamj) - He is active in Songdo Artificial Intelligence Study and GDG Songdo.
|
labicquette/Reinforce-pixelcopter
|
labicquette
| 2023-05-13T00:57:51Z | 0 | 0 | null |
[
"Pixelcopter-PLE-v0",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-11T21:24:51Z |
---
tags:
- Pixelcopter-PLE-v0
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-pixelcopter
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Pixelcopter-PLE-v0
type: Pixelcopter-PLE-v0
metrics:
- type: mean_reward
value: 73.90 +/- 57.80
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**
This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0** .
To learn how to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
charleschen2022/Taxi-v3
|
charleschen2022
| 2023-05-13T00:43:15Z | 0 | 0 | null |
[
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T00:35:26Z |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: Taxi-v3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.56 +/- 2.71
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
import gym

# load_from_hub is the helper defined in the Deep RL course notebooks; it downloads and unpickles the Q-table.
model = load_from_hub(repo_id="charleschen2022/Taxi-v3", filename="q-learning.pkl")

# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
asenella/mmnist_MoPoEconfig2_seed_2_ratio_05_i
|
asenella
| 2023-05-13T00:00:19Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-13T00:00:12Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
asenella/mmnist_MoPoEconfig2_seed_1_ratio_05_i
|
asenella
| 2023-05-12T23:59:52Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-12T23:59:44Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
asenella/mmnist_MoPoEconfig2_seed_0_ratio_05_i
|
asenella
| 2023-05-12T23:44:44Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-12T23:44:36Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
asenella/mmnist_MVTCAEconfig2_seed_0_ratio_0_c
|
asenella
| 2023-05-12T23:34:50Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-12T23:34:42Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
cognitivecomputations/WizardLM-13B-Uncensored
|
cognitivecomputations
| 2023-05-12T23:08:43Z | 914 | 577 |
transformers
|
[
"transformers",
"pytorch",
"llama",
"text-generation",
"uncensored",
"dataset:ehartford/WizardLM_alpaca_evol_instruct_70k_unfiltered",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-05-09T18:56:32Z |
---
license: other
datasets:
- ehartford/WizardLM_alpaca_evol_instruct_70k_unfiltered
tags:
- uncensored
---
This is WizardLM trained with a subset of the dataset - responses that contained alignment / moralizing were removed. The intent is to train a WizardLM that doesn't have alignment built-in, so that alignment (of any sort) can be added separately, for example with an RLHF LoRA.
Shout out to the open source AI/ML community, and everyone who helped me out.
Note:
An uncensored model has no guardrails.
You are responsible for anything you do with the model, just as you are responsible for anything you do with any dangerous object such as a knife, gun, lighter, or car.
Publishing anything this model generates is the same as publishing it yourself.
You are responsible for the content you publish, and you cannot blame the model any more than you can blame the knife, gun, lighter, or car for what you do with it.
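For reference, a minimal loading sketch with plain transformers (not part of the original card; the prompt template is an assumption, so check the repository for the exact instruction format):
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cognitivecomputations/WizardLM-13B-Uncensored"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# The prompt format below is an assumption; WizardLM models usually expect their own template.
prompt = "Explain the difference between a list and a tuple in Python."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```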
|
akaneshiro/ppo-LunarLander-v2
|
akaneshiro
| 2023-05-12T22:23:13Z | 0 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-12T22:22:53Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 259.08 +/- 20.49
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
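A possible completion of the TODO above, as a sketch: the checkpoint filename is an assumption based on the usual course naming convention, so check the repository's file list first.
```python
import gym
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy

# Download the checkpoint from the Hub (filename is an assumption).
checkpoint = load_from_hub(repo_id="akaneshiro/ppo-LunarLander-v2", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)

# Evaluate the loaded policy over a few episodes.
env = gym.make("LunarLander-v2")
mean_reward, std_reward = evaluate_policy(model, env, n_eval_episodes=10)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```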
|
Halcyonindo/an1h1jablora
|
Halcyonindo
| 2023-05-12T22:14:00Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-12T22:12:03Z |
---
license: creativeml-openrail-m
---
|
attilalengyel/ceresnet
|
attilalengyel
| 2023-05-12T21:49:41Z | 0 | 0 | null |
[
"dataset:imagenet-1k",
"region:us"
] | null | 2023-05-03T12:06:28Z |
---
datasets:
- imagenet-1k
metrics:
- accuracy
---
ImageNet pretrained weights for Color Equivariant ResNet models.
|
alistvt/zero-docalog
|
alistvt
| 2023-05-12T21:47:18Z | 11 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:doc2dial",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2023-05-10T15:33:50Z |
---
tags:
- generated_from_trainer
datasets:
- doc2dial
model-index:
- name: zero-docalog
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# zero-docalog
This model is a fine-tuned version of [alistvt/zero-docalog](https://huggingface.co/alistvt/zero-docalog) on the doc2dial dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
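The card gives no usage details; as a sketch (not from the original card), the checkpoint can presumably be used with the standard question-answering pipeline, with illustrative inputs:
```python
from transformers import pipeline

# Minimal extractive-QA sketch: the model selects an answer span from the context.
qa = pipeline("question-answering", model="alistvt/zero-docalog")

result = qa(
    question="When are the renewal forms due?",
    context="According to the policy document, all renewal forms are due on June 30.",
)
print(result["answer"])
```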
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 30
- total_train_batch_size: 240
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 1.0
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3
|
asenella/mmnist_MoPoEconfig2_seed_0_ratio_05_c
|
asenella
| 2023-05-12T21:45:51Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-12T21:45:42Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
asenella/mmnist_MoPoEconfig2_seed_3_ratio_02_c
|
asenella
| 2023-05-12T21:45:32Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-12T21:45:24Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
DunnBC22/bert-base-uncased-Q_and_A-Answer_Prediction_Dataset
|
DunnBC22
| 2023-05-12T21:36:53Z | 63 | 1 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"question-answering",
"generated_from_trainer",
"en",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2023-02-28T19:10:34Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: bert-base-uncased-question_and_answer
results: []
language:
- en
metrics:
- exact_match
- f1
pipeline_tag: question-answering
---
# bert-base-uncased-Q_and_A-Answer_Prediction_Dataset
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased).
## Model description
This is an extractive question-answering model.
For more information on how it was created, check out the following link: https://github.com/DunnBC22/NLP_Projects/blob/main/Question%26Answer/Answer%20Prediction%20Dataset%20-%20Question%26Answer%20with%20BERT.ipynb
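A brief usage sketch (not part of the original card); question and context are illustrative:
```python
from transformers import pipeline

# Extractive QA: the model predicts the answer span within the provided context.
qa = pipeline(
    "question-answering",
    model="DunnBC22/bert-base-uncased-Q_and_A-Answer_Prediction_Dataset",
)

print(qa(
    question="Who wrote the report?",
    context="The annual report was written by the finance team in March.",
)["answer"])
```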
## Intended uses & limitations
This model is intended to demonstrate my ability to solve a complex problem using technology.
## Training and evaluation data
Dataset Source: https://www.kaggle.com/datasets/a2m2a2n2/question-answer-dataset
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Metric Name | Value |
|:-----------:|:-----:|
| exact_match | 65.74 |
| f1 | 79.28 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.12.1
- Datasets 2.8.0
- Tokenizers 0.12.1
|
him1411/EDGAR-T5-Large
|
him1411
| 2023-05-12T21:35:03Z | 8 | 4 |
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"t5",
"text2text-generation",
"finance",
"ContextNER",
"language models",
"en",
"dataset:him1411/EDGAR10-Q",
"arxiv:2109.08079",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-04-03T18:34:04Z |
---
license: mit
language:
- en
tags:
- finance
- ContextNER
- language models
datasets:
- him1411/EDGAR10-Q
metrics:
- rouge
---
EDGAR-T5-Large
=============
T5 Large model finetuned on [EDGAR10-Q dataset](https://huggingface.co/datasets/him1411/EDGAR10-Q)
You may want to check out
* Our paper: [CONTEXT-NER: Contextual Phrase Generation at Scale](https://arxiv.org/abs/2109.08079/)
* GitHub: [Click Here](https://github.com/him1411/edgar10q-dataset)
Direct Use
=============
It is possible to use this model to generate text, which is useful for experimentation and understanding its capabilities. **It should not be directly used for production or work that may directly impact people.**
How to Use
=============
You can very easily load the models with Transformers, instead of downloading them manually. The [T5-Large model](https://huggingface.co/t5-large) is the backbone of our model. Here is how to use the model in PyTorch:
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("him1411/EDGAR-T5-Large")
model = AutoModelForSeq2SeqLM.from_pretrained("him1411/EDGAR-T5-Large")
```
Or just clone the model repo
```
git lfs install
git clone https://huggingface.co/him1411/EDGAR-T5-Large
```
Inference Example
=============
Here we provide an example for the ContextNER task, using a single instance from the dataset.
```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("him1411/EDGAR-T5-Large")
model = AutoModelForSeq2SeqLM.from_pretrained("him1411/EDGAR-T5-Large")

# An example instance from the EDGAR10-Q dataset.
input_text = "14.5 years . The definite lived intangible assets related to the contracts and trade names had estimated weighted average useful lives of 5.9 years and 14.5 years, respectively, at acquisition."
inputs = tokenizer(input_text, return_tensors="pt")

# Ideal output for this input is 'Definite lived intangible assets weighted average remaining useful life'
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
## Results on Downstream datasets
EDGAR-T5-Large was fine-tuned on several downstream datasets, where it achieves better results than T5-Large. BloombergGPT 50B was used as the baseline.
| Dataset | Bloomberg GPT 50B | T5 Large | Edgar T5 Large |
|----------|-------------------|----------|----------------|
| FiQA SA | 75.07 | 74.89 | 80.42 |
| FPB | 51.07 | 55.77 | 79.69 |
| Headline | 82.20 | 90.55 | 93.55 |
BibTeX Entry and Citation Info
===============
If you are using our model, please cite our paper:
```bibtex
@article{gupta2021context,
title={Context-NER: Contextual Phrase Generation at Scale},
author={Gupta, Himanshu and Verma, Shreyas and Kumar, Tarun and Mishra, Swaroop and Agrawal, Tamanna and Badugu, Amogh and Bhatt, Himanshu Sharad},
journal={arXiv preprint arXiv:2109.08079},
year={2021}
}
```
|
DunnBC22/distilbert-base-uncased-reviews_multilabel_clf_v2
|
DunnBC22
| 2023-05-12T21:11:57Z | 103 | 1 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"en",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-02-16T02:26:52Z |
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- f1
- accuracy
- roc_auc
model-index:
- name: distilbert-base-uncased-reviews_multilabel_clf_v2
results: []
language:
- en
pipeline_tag: text-classification
---
# distilbert-base-uncased-reviews_multilabel_clf_v2
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased).
It achieves the following results on the evaluation set:
- Loss: 0.1519
- F1: 0.8697
- Roc Auc: 0.9107
- Accuracy: 0.5787
## Model description
This is a multilabel classification model that predicts which aspects of a product are mentioned in reviews.
For more information on how it was created, check out the following link: https://github.com/DunnBC22/NLP_Projects/blob/main/Multilabel%20Classification/Review%20Sentiments/Sentiments%20-%20Multilabel%20clf.ipynb
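Since this is a multilabel classifier, predictions should use a per-label sigmoid rather than a softmax. A minimal sketch (not from the original card; the 0.5 threshold is an assumption):
```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "DunnBC22/distilbert-base-uncased-reviews_multilabel_clf_v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("Battery life is great but the screen scratches easily.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel: apply a sigmoid per label and keep everything above the threshold.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```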
## Intended uses & limitations
This model is intended to demonstrate my ability to solve a complex problem using technology.
## Training and evaluation data
Dataset Source: https://www.kaggle.com/datasets/mohamedziauddin/mh-uhack-sentiments?select=train.csv
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 | Roc Auc | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|:--------:|
| 0.6847 | 1.0 | 305 | 0.2425 | 0.7619 | 0.8209 | 0.3492 |
| 0.296 | 2.0 | 610 | 0.1786 | 0.8447 | 0.8847 | 0.5197 |
| 0.296 | 3.0 | 915 | 0.1634 | 0.8511 | 0.8937 | 0.5361 |
| 0.1476 | 4.0 | 1220 | 0.1544 | 0.8626 | 0.8999 | 0.5623 |
| 0.0986 | 5.0 | 1525 | 0.1490 | 0.8624 | 0.8994 | 0.5639 |
| 0.0986 | 6.0 | 1830 | 0.1521 | 0.8653 | 0.9041 | 0.5787 |
| 0.0686 | 7.0 | 2135 | 0.1511 | 0.8676 | 0.9110 | 0.5656 |
| 0.0686 | 8.0 | 2440 | 0.1501 | 0.8687 | 0.9104 | 0.5869 |
| 0.0525 | 9.0 | 2745 | 0.1519 | 0.8685 | 0.9089 | 0.5754 |
| 0.0432 | 10.0 | 3050 | 0.1519 | 0.8697 | 0.9107 | 0.5787 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.12.1
- Datasets 2.8.0
- Tokenizers 0.12.1
|
hyoni/kogpt2-base-v2-finetuned-klue-ner2
|
hyoni
| 2023-05-12T21:11:01Z | 102 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"gpt2",
"token-classification",
"generated_from_trainer",
"dataset:klue",
"license:cc-by-nc-sa-4.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2023-05-12T20:49:24Z |
---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
datasets:
- klue
metrics:
- f1
model-index:
- name: kogpt2-base-v2-finetuned-klue-ner2
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: klue
type: klue
config: ner
split: validation
args: ner
metrics:
- name: F1
type: f1
value: 0.40471319859258653
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# kogpt2-base-v2-finetuned-klue-ner2
This model is a fine-tuned version of [skt/kogpt2-base-v2](https://huggingface.co/skt/kogpt2-base-v2) on the klue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4307
- F1: 0.4047
## Model description
More information needed
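As a rough sketch (not part of the generated card), the checkpoint can presumably be run as a token-classification pipeline; kogpt2's tokenizer may require extra options, so treat this as an assumption:
```python
from transformers import pipeline

# Minimal NER sketch; aggregation groups sub-tokens into entity spans.
ner = pipeline(
    "token-classification",
    model="hyoni/kogpt2-base-v2-finetuned-klue-ner2",
    aggregation_strategy="simple",
)
print(ner("이순신은 조선 중기의 무신이다."))
```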
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.6164 | 1.0 | 1313 | 0.5407 | 0.2116 |
| 0.4128 | 2.0 | 2626 | 0.5157 | 0.2614 |
| 0.3186 | 3.0 | 3939 | 0.4291 | 0.3462 |
| 0.2514 | 4.0 | 5252 | 0.4261 | 0.3734 |
| 0.1989 | 5.0 | 6565 | 0.4307 | 0.4047 |
### Framework versions
- Transformers 4.29.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
asenella/mmnist_MoPoEconfig2_seed_2_ratio_02_i
|
asenella
| 2023-05-12T21:06:37Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-12T21:06:29Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
P0intMaN/PyAutoCode
|
P0intMaN
| 2023-05-12T21:02:01Z | 21 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tf",
"safetensors",
"gpt2",
"text-generation",
"license:mit",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2022-03-10T16:01:44Z |
---
license: mit
---
# PyAutoCode: GPT-2 based Python auto-code.
PyAutoCode is a cut-down Python autosuggestion model built on **GPT-2** *(motivation: GPyT)*. This baby model *(trained for only 3 epochs)* is not **"fine-tuned"** yet; therefore, I highly recommend not using it in a production environment or incorporating PyAutoCode into any of your projects. It has been trained on **112GB** of Python data sourced from the best crowdsource platform ever -- **GitHub**.
*NOTE: Increased training and fine-tuning would be highly appreciated, and I firmly believe it would improve the ability of PyAutoCode significantly.*
## Some Model Features
- Built on *GPT-2*
- Tokenized with *ByteLevelBPETokenizer*
- Data Sourced from *GitHub (almost 5 consecutive days of latest Python repositories)*
- Makes use of *GPT2LMHeadModel* and *DataCollatorForLanguageModeling* for training
- Newline characters are custom coded as `<N>`
## Get a Glimpse of the Model
You can make use of the **Inference API** of huggingface *(present on the right sidebar)* to load the model and check the result. Just enter any code snippet as input. Something like:
```sh
for i in range(
```
## Usage
You can use my model too! Here's a quick tour of how to achieve this:
Install transformers
```sh
$ pip install transformers
```
Call the API and get it to work!
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("P0intMaN/PyAutoCode")
model = AutoModelForCausalLM.from_pretrained("P0intMaN/PyAutoCode")
# input: single line or multi-line. Highly recommended to use doc-strings.
inp = """import pandas"""
format_inp = inp.replace('\n', "<N>")
tokenize_inp = tokenizer.encode(format_inp, return_tensors='pt')
result = model.generate(tokenize_inp)
decode_result = tokenizer.decode(result[0])
format_result = decode_result.replace('<N>', "\n")
# printing the result
print(format_result)
```
Upon successful execution, the above should probably produce *(your results may vary when this model is fine-tuned)*
```sh
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
```
## Credits
##### *Developed as a part of a university project by [Pratheek U](https://www.github.com/P0intMaN) and [Sourav Singh](https://github.com/Sourav11902312lpu)*
|
asenella/mmnist_MoPoEconfig2_seed_0_ratio_02_i
|
asenella
| 2023-05-12T20:37:23Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-10T17:39:56Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
asenella/mmnist_MoPoEconfig2_seed_3_ratio_0_c
|
asenella
| 2023-05-12T20:18:47Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-12T20:18:39Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
lsimon/marian-finetuned-kde4-en-to-fr
|
lsimon
| 2023-05-12T20:07:08Z | 105 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"marian",
"text2text-generation",
"translation",
"generated_from_trainer",
"dataset:kde4",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
translation
| 2023-05-12T19:12:34Z |
---
license: apache-2.0
tags:
- translation
- generated_from_trainer
datasets:
- kde4
metrics:
- bleu
model-index:
- name: marian-finetuned-kde4-en-to-fr
results:
- task:
name: Sequence-to-sequence Language Modeling
type: text2text-generation
dataset:
name: kde4
type: kde4
config: en-fr
split: train
args: en-fr
metrics:
- name: Bleu
type: bleu
value: 46.786638307530986
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# marian-finetuned-kde4-en-to-fr
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1716
- Bleu: 46.7866
## Model description
More information needed
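A minimal usage sketch (not part of the generated card), translating English to French with the fine-tuned checkpoint:
```python
from transformers import pipeline

# Minimal translation sketch; the input sentence is illustrative (KDE4-style UI text).
translator = pipeline("translation", model="lsimon/marian-finetuned-kde4-en-to-fr")
print(translator("Default to expanded threads")[0]["translation_text"])
```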
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
sana-ngu/bart-large-finetuned-summarize-scientific-articles
|
sana-ngu
| 2023-05-12T20:06:38Z | 104 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bart",
"text2text-generation",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-05-12T19:36:39Z |
# How to use
```python
from transformers import pipeline

summarizer = pipeline("summarization", model="sana-ngu/bart-large-finetuned-summarize-scientific-articles")

# Multi-paragraph input must be a triple-quoted string in Python.
article = """The novel Coronavirus disease (COVID-19), caused by the severe acute respiratory syndrome coronavirus—2 (SARS-CoV-2), in Africa is characterised by a more substantial proportion of asymptomatic (or mildly symptomatic) individuals thought to be playing a role in the spread of the infection. The exact proportion and degree of infectiousness of asymptomatic individuals remains unclear. Studies however indicate that their management is crucial for control of SARS-CoV-2 transmission.
We developed a simplified deterministic susceptible-exposed-infectious-removed (SEIR) mathematical model to assess the effect of active isolation of SARS-CoV-2 infected but asymptomatic individuals through blanket testing for control of the outbreak in Lusaka Province of Zambia. Here we modelled two scenarios; (1) assuming asymptomatic individuals comprised 70% of all COVID-19 cases and (2) asymptomatic individuals comprised only 50% of the cases. For contrast, the model was assessed first under the assumption that asymptomatic individuals are equally as infectious as symptomatic individuals and then secondly, and more likely, assuming asymptomatic individuals are only half as infectious as symptomatic individuals.
For the model assuming 70% asymptomatic cases, a minimum sustained daily blanket testing rate of ≥ 7911 tests/100000 population was sufficient to control the outbreak if asymptomatic individuals are only half as infectious while if equal infectiousness was assumed then a testing rate of ≥ 10028 tests/ 100000 population would be required. For 50% asymptomatic, minimum blanket testing rates of ≥ 4540 tests/ 100000 population was sufficient to control the outbreak at both assumed levels of infectiousness for asymptomatic individuals relative to symptomatic individuals.
Discussion and conclusion
Our model predicts that active isolation of COVID-19 cases, including asymptomatic individuals, through blanket testing can be used as a possible measure for the control of the SARS-Cov-2 transmission in Lusaka, Zambia, but it would come at a high cost."""

summarizer(article)
```
|
hyoni/kogpt2-base-v2-finetuned-klue-ner
|
hyoni
| 2023-05-12T19:55:52Z | 103 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"gpt2",
"token-classification",
"generated_from_trainer",
"dataset:klue",
"license:cc-by-nc-sa-4.0",
"model-index",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2023-05-12T19:42:52Z |
---
license: cc-by-nc-sa-4.0
tags:
- generated_from_trainer
datasets:
- klue
metrics:
- f1
model-index:
- name: kogpt2-base-v2-finetuned-klue-ner
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: klue
type: klue
config: ner
split: validation
args: ner
metrics:
- name: F1
type: f1
value: 0.37298165525403665
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# kogpt2-base-v2-finetuned-klue-ner
This model is a fine-tuned version of [skt/kogpt2-base-v2](https://huggingface.co/skt/kogpt2-base-v2) on the klue dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4076
- F1: 0.3730
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.6084 | 1.0 | 876 | 0.5353 | 0.2118 |
| 0.3911 | 2.0 | 1752 | 0.4691 | 0.3041 |
| 0.2855 | 3.0 | 2628 | 0.4076 | 0.3730 |
### Framework versions
- Transformers 4.29.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
jainr3/sd-pixelart-model-lora
|
jainr3
| 2023-05-12T19:30:17Z | 7 | 2 |
diffusers
|
[
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"lora",
"base_model:stabilityai/stable-diffusion-2-1",
"base_model:adapter:stabilityai/stable-diffusion-2-1",
"license:creativeml-openrail-m",
"region:us"
] |
text-to-image
| 2023-05-12T18:30:15Z |
---
license: creativeml-openrail-m
base_model: stabilityai/stable-diffusion-2-1
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- lora
inference: true
---
# LoRA text2image fine-tuning - jainr3/sd-pixelart-model-lora
These are LoRA adaptation weights for stabilityai/stable-diffusion-2-1. The weights were fine-tuned on the sunilSabnis/pixelart dataset. You can find some example images in the following.




|
P1ayer-1/mpt-7b-instruct-base
|
P1ayer-1
| 2023-05-12T19:29:52Z | 10 | 2 |
transformers
|
[
"transformers",
"pytorch",
"mpt",
"text-generation",
"Composer",
"MosaicML",
"llm-foundry",
"StreamingDatasets",
"custom_code",
"dataset:mc4",
"dataset:c4",
"dataset:togethercomputer/RedPajama-Data-1T",
"dataset:bigcode/the-stack",
"dataset:allenai/s2orc",
"arxiv:2108.12409",
"arxiv:2302.13971",
"arxiv:2205.14135",
"arxiv:2010.04245",
"arxiv:1909.08053",
"arxiv:2302.06675",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"region:us"
] |
text-generation
| 2023-05-11T00:13:46Z |
---
license: apache-2.0
tags:
- Composer
- MosaicML
- llm-foundry
- StreamingDatasets
datasets:
- mc4
- c4
- togethercomputer/RedPajama-Data-1T
- bigcode/the-stack
- allenai/s2orc
inference: false
---
# MPT-7B
MPT-7B is a decoder-style transformer pretrained from scratch on 1T tokens of English text and code.
This model was trained by [MosaicML](https://www.mosaicml.com).
MPT-7B is part of the family of MosaicPretrainedTransformer (MPT) models, which use a modified transformer architecture optimized for efficient training and inference.
These architectural changes include performance-optimized layer implementations and the elimination of context length limits by replacing
positional embeddings with Attention with Linear Biases ([ALiBi](https://arxiv.org/abs/2108.12409)).
Thanks to these modifications, MPT models can be trained with high throughput efficiency and stable convergence.
MPT models can also be served efficiently with both standard HuggingFace pipelines and NVIDIA's [FasterTransformer](https://github.com/NVIDIA/FasterTransformer).
This model uses the MosaicML LLM codebase, which can be found in the [llm-foundry repository](https://github.com/mosaicml/llm-foundry). It was trained by MosaicML’s NLP team on the [MosaicML platform](https://www.mosaicml.com/training) for LLM pretraining, finetuning, and inference.
### How is this model different?
MPT-7B is
* **Licensed for the possibility of commercial use** (unlike [LLaMA](https://arxiv.org/abs/2302.13971)).
* **Trained on a large amount of data** (1T tokens like [LLaMA](https://arxiv.org/abs/2302.13971) vs. 300B for [Pythia](https://github.com/EleutherAI/pythia), 300B for [OpenLLaMA](https://github.com/openlm-research/open_llama), and 800B for [StableLM](https://github.com/Stability-AI/StableLM)).
* **Prepared to handle extremely long inputs** thanks to [ALiBi](https://arxiv.org/abs/2108.12409) (we finetuned [MPT-7B-StoryWriter-65k+](https://huggingface.co/mosaicml/mpt-7b-storywriter) on up to 65k inputs and can handle up to 84k vs. 2k-4k for other open source models).
* **Capable of fast training and inference** (via [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf) and [FasterTransformer](https://github.com/NVIDIA/FasterTransformer))
* **Equipped with highly efficient open-source training code** via the [llm-foundry repository](https://github.com/mosaicml/llm-foundry)
### Models finetuned off MPT-7B:
The following models are finetuned on MPT-7B:
* [MPT-7B-StoryWriter-65k+](https://huggingface.co/mosaicml/mpt-7b-storywriter): a model designed to read and write fictional stories with super long context lengths.
Built by finetuning MPT-7B with a context length of 65k tokens on a filtered fiction subset of the [books3 dataset](https://huggingface.co/datasets/the_pile_books3).
At inference time, thanks to [ALiBi](https://arxiv.org/abs/2108.12409), MPT-7B-StoryWriter-65k+ can extrapolate even beyond 65k tokens.
We demonstrate generations as long as 80k tokens on a single A100-80GB GPU in our [blogpost](https://www.mosaicml.com/blog/mpt-7b).
* License: Apache 2.0
* [MPT-7B-Instruct](https://huggingface.co/mosaicml/mpt-7b-instruct): a model for short-form instruction following.
Built by finetuning MPT-7B on a [dataset](https://huggingface.co/datasets/mosaicml/dolly_hhrlhf) we also release, derived from the [Databricks Dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) and the [Anthropic Helpful and Harmless (HH-RLHF)](https://huggingface.co/datasets/Anthropic/hh-rlhf) datasets.
* License: _CC-By-SA-3.0_
* [Demo on Hugging Face Spaces](https://huggingface.co/spaces/mosaicml/mpt-7b-instruct)
* [MPT-7B-Chat](https://huggingface.co/mosaicml/mpt-7b-chat): a chatbot-like model for dialogue generation.
Built by finetuning MPT-7B on the [ShareGPT-Vicuna](https://huggingface.co/datasets/jeffwan/sharegpt_vicuna), [HC3](https://huggingface.co/datasets/Hello-SimpleAI/HC3),
[Alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca), [HH-RLHF](https://huggingface.co/datasets/Anthropic/hh-rlhf), and [Evol-Instruct](https://huggingface.co/datasets/victor123/evol_instruct_70k) datasets.
* License: _CC-By-NC-SA-4.0_
* [Demo on Hugging Face Spaces](https://huggingface.co/spaces/mosaicml/mpt-7b-chat)
## Model Date
May 5, 2023
## Model License
Apache-2.0
## Documentation
* [Blog post: Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs](https://www.mosaicml.com/blog/mpt-7b)
* [Codebase (mosaicml/llm-foundry repo)](https://github.com/mosaicml/llm-foundry/)
* Questions: Feel free to contact us via the [MosaicML Community Slack](https://join.slack.com/t/mosaicml-community/shared_invite/zt-1btms90mc-GipE2ufuPkKY0QBrmF3LSA)!
## How to Use
This model is best used with the MosaicML [llm-foundry repository](https://github.com/mosaicml/llm-foundry) for training and finetuning.
```python
import transformers
model = transformers.AutoModelForCausalLM.from_pretrained(
'mosaicml/mpt-7b',
trust_remote_code=True
)
```
Note: This model requires that `trust_remote_code=True` be passed to the `from_pretrained` method.
This is because we use a custom `MPT` model architecture that is not yet part of the Hugging Face `transformers` package.
`MPT` includes options for many training efficiency features such as [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf), [ALiBi](https://arxiv.org/abs/2108.12409), [QK LayerNorm](https://arxiv.org/abs/2010.04245), and more.
To use the optimized [triton implementation](https://github.com/openai/triton) of FlashAttention, you can load the model with `attn_impl='triton'` and move the model to `bfloat16`:
```python
import torch

config = transformers.AutoConfig.from_pretrained(
'mosaicml/mpt-7b',
trust_remote_code=True
)
config.attn_config['attn_impl'] = 'triton'
model = transformers.AutoModelForCausalLM.from_pretrained(
'mosaicml/mpt-7b',
config=config,
torch_dtype=torch.bfloat16,
trust_remote_code=True
)
model.to(device='cuda:0')
```
Although the model was trained with a sequence length of 2048, ALiBi enables users to increase the maximum sequence length during finetuning and/or inference. For example:
```python
config = transformers.AutoConfig.from_pretrained(
'mosaicml/mpt-7b',
trust_remote_code=True
)
config.update({"max_seq_len": 4096})
model = transformers.AutoModelForCausalLM.from_pretrained(
'mosaicml/mpt-7b',
config=config,
trust_remote_code=True
)
```
This model was trained with the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer.
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
```
## Model Description
The architecture is a modification of a standard decoder-only transformer.
The model has been modified from a standard transformer in the following ways:
* It uses [FlashAttention](https://arxiv.org/pdf/2205.14135.pdf)
* It uses [ALiBi (Attention with Linear Biases)](https://arxiv.org/abs/2108.12409) and does not use positional embeddings
* It does not use biases
| Hyperparameter | Value |
|----------------|-------|
| n_parameters | 6.7B |
| n_layers | 32 |
| n_heads | 32 |
| d_model | 4096 |
| vocab size | 50432 |
| sequence length | 2048 |
## Training Data
### Streaming Datasets
Data was formatted using the MosaicML [StreamingDataset](https://github.com/mosaicml/streaming) library to host our data in object storage and efficiently stream it to our compute cluster during training.
StreamingDataset obviates the need to download the whole dataset before starting training, and allows instant resumption of training from any point in the dataset.
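As a rough sketch of that pattern (the bucket and cache paths below are hypothetical), a `StreamingDataset` can be pointed at remote shards and wrapped in a standard PyTorch DataLoader:
```python
from torch.utils.data import DataLoader
from streaming import StreamingDataset

# Stream MDS shards from object storage, caching locally (paths are hypothetical)
dataset = StreamingDataset(remote="s3://my-bucket/mds-shards",
                           local="/tmp/streaming-cache",
                           shuffle=True)
loader = DataLoader(dataset, batch_size=8)
```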
### Data Mix
The model was trained for 1T tokens (with batch size 1760 and sequence length 2048). It was trained on the following data mix:
| Data Source | Number of Tokens in Source | Proportion | Effective Number of Tokens | Epochs |
|-------------|----------------------------|------------|----------------------------|--------|
| mC4 3.1.0 - English | 417.99 B | 0.33 | 330 B | 0.14 |
| C4 - English - SemDedup 80% | 100.42 B | 0.299 | 299 B | 2.98 |
| RedPajama - CommonCrawl | 878.45 B | 0.1 | 100 B | 0.11 |
| The Stack - Selected Languages | 463.78 B | 0.1 | 100 B | 0.22 |
| RedPajama - Wikipedia - En | 4.87 B | 0.04 | 40 B | 8.21 |
| The Stack - Markdown | 107.07 B | 0.035 | 35 B | 0.33 |
| S2ORC | 48.85 B | 0.033 | 33 B | 0.68 |
| RedPajama - Books | 26.02 B | 0.03 | 30 B | 1.15 |
| RedPajama - arXiv | 28.10 B | 0.019 | 19 B | 0.68 |
| RedPajama - StackExchange | 20.54 B | 0.014 | 14 B | 0.68 |
Samples for each batch were selected from one of the datasets with the probability specified above.
The examples were shuffled within each dataset, and each example was constructed from as many sequences from that dataset as were necessary to fill the 2048 sequence length.
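As an illustration of this per-source sampling (a sketch, not MosaicML's training code), drawing batch examples with the proportions from the table might look like:
```python
import random

# Source proportions from the data-mix table above
sources = {
    "mC4 3.1.0 - English": 0.33, "C4 - English - SemDedup 80%": 0.299,
    "RedPajama - CommonCrawl": 0.10, "The Stack - Selected Languages": 0.10,
    "RedPajama - Wikipedia - En": 0.04, "The Stack - Markdown": 0.035,
    "S2ORC": 0.033, "RedPajama - Books": 0.03,
    "RedPajama - arXiv": 0.019, "RedPajama - StackExchange": 0.014,
}
# Pick the source dataset for each of the 8 examples in a (toy) batch
batch_sources = random.choices(list(sources), weights=list(sources.values()), k=8)
print(batch_sources)
```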
The data was tokenized using the [EleutherAI/gpt-neox-20b](https://huggingface.co/EleutherAI/gpt-neox-20b) tokenizer. This BPE tokenizer has a number of desirable characteristics,
most of which are relevant for tokenizing code:
(1) It was trained on a diverse mix of data that includes code (The Pile)
(2) It applies consistent space delimitation, unlike the GPT2 tokenizer which tokenizes inconsistently depending on the presence of prefix spaces
(3) It contains tokens for repeated space characters, which allows superior compression of text with large amounts of repeated space characters.
The model vocabulary size of 50432 was set to be a multiple of 128 (as in [MEGATRON-LM](https://arxiv.org/abs/1909.08053); 50432 = 394 × 128), which increased model flop utilization (MFU) by up to four percentage points.
### Training Configuration
This model was trained on 440 A100-40GBs for about 9.5 days using the [MosaicML Platform](https://www.mosaicml.com/platform).
The model was trained with sharded data parallelism using [FSDP](https://pytorch.org/docs/stable/fsdp.html) and used the [LION](https://arxiv.org/abs/2302.06675) optimizer.
## Limitations and Biases
_The following language is modified from [EleutherAI's GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b)_
MPT-7B (Base) is **not** intended for deployment without finetuning.
It should not be used for human-facing interactions without further guardrails and user consent.
MPT-7B can produce factually incorrect output, and should not be relied on to produce factually accurate information.
MPT-7B was trained on various public datasets.
While great efforts have been taken to clean the pretraining data, it is possible that this model could generate lewd, biased or otherwise offensive outputs.
## MosaicML Platform
If you're interested in [training](https://www.mosaicml.com/training) and [deploying](https://www.mosaicml.com/inference) your own MPT or LLMs on the MosaicML Platform, [sign up here](https://forms.mosaicml.com/demo?utm_source=huggingface&utm_medium=referral&utm_campaign=mpt-7b).
## Disclaimer
The license on this model does not constitute legal advice. We are not responsible for the actions of third parties who use this model. Please consult an attorney before using this model for commercial purposes.
## Citation
Please cite this model using the following format:
```
@online{MosaicML2023Introducing,
author = {MosaicML NLP Team},
    title = {Introducing MPT-7B: A New Standard for Open-Source, Commercially Usable LLMs},
year = {2023},
url = {www.mosaicml.com/blog/mpt-7b},
note = {Accessed: 2023-03-28}, % change this date
urldate = {2023-03-28} % change this date
}
```
|
SacredFisher/ppo-LunarLander-v2
|
SacredFisher
| 2023-05-12T19:29:15Z | 0 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-12T19:24:35Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 257.69 +/- 22.08
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the checkpoint filename inside the repo is an assumption):
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

# Filename is assumed; check the repo's file list
checkpoint = load_from_hub(repo_id="SacredFisher/ppo-LunarLander-v2",
                           filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
|
asenella/mmnist_MoPoEconfig2_seed_2_ratio_0_c
|
asenella
| 2023-05-12T19:17:05Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-12T19:16:57Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
guillaumekln/faster-whisper-small
|
guillaumekln
| 2023-05-12T18:58:54Z | 5,185 | 10 |
ctranslate2
|
[
"ctranslate2",
"audio",
"automatic-speech-recognition",
"en",
"zh",
"de",
"es",
"ru",
"ko",
"fr",
"ja",
"pt",
"tr",
"pl",
"ca",
"nl",
"ar",
"sv",
"it",
"id",
"hi",
"fi",
"vi",
"he",
"uk",
"el",
"ms",
"cs",
"ro",
"da",
"hu",
"ta",
"no",
"th",
"ur",
"hr",
"bg",
"lt",
"la",
"mi",
"ml",
"cy",
"sk",
"te",
"fa",
"lv",
"bn",
"sr",
"az",
"sl",
"kn",
"et",
"mk",
"br",
"eu",
"is",
"hy",
"ne",
"mn",
"bs",
"kk",
"sq",
"sw",
"gl",
"mr",
"pa",
"si",
"km",
"sn",
"yo",
"so",
"af",
"oc",
"ka",
"be",
"tg",
"sd",
"gu",
"am",
"yi",
"lo",
"uz",
"fo",
"ht",
"ps",
"tk",
"nn",
"mt",
"sa",
"lb",
"my",
"bo",
"tl",
"mg",
"as",
"tt",
"haw",
"ln",
"ha",
"ba",
"jw",
"su",
"license:mit",
"region:us"
] |
automatic-speech-recognition
| 2023-03-23T10:21:29Z |
---
language:
- en
- zh
- de
- es
- ru
- ko
- fr
- ja
- pt
- tr
- pl
- ca
- nl
- ar
- sv
- it
- id
- hi
- fi
- vi
- he
- uk
- el
- ms
- cs
- ro
- da
- hu
- ta
- 'no'
- th
- ur
- hr
- bg
- lt
- la
- mi
- ml
- cy
- sk
- te
- fa
- lv
- bn
- sr
- az
- sl
- kn
- et
- mk
- br
- eu
- is
- hy
- ne
- mn
- bs
- kk
- sq
- sw
- gl
- mr
- pa
- si
- km
- sn
- yo
- so
- af
- oc
- ka
- be
- tg
- sd
- gu
- am
- yi
- lo
- uz
- fo
- ht
- ps
- tk
- nn
- mt
- sa
- lb
- my
- bo
- tl
- mg
- as
- tt
- haw
- ln
- ha
- ba
- jw
- su
tags:
- audio
- automatic-speech-recognition
license: mit
library_name: ctranslate2
---
# Whisper small model for CTranslate2
This repository contains the conversion of [openai/whisper-small](https://huggingface.co/openai/whisper-small) to the [CTranslate2](https://github.com/OpenNMT/CTranslate2) model format.
This model can be used in CTranslate2 or projects based on CTranslate2 such as [faster-whisper](https://github.com/guillaumekln/faster-whisper).
## Example
```python
from faster_whisper import WhisperModel
model = WhisperModel("small")
segments, info = model.transcribe("audio.mp3")
for segment in segments:
print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
```
## Conversion details
The original model was converted with the following command:
```
ct2-transformers-converter --model openai/whisper-small --output_dir faster-whisper-small \
--copy_files tokenizer.json --quantization float16
```
Note that the model weights are saved in FP16. This type can be changed when the model is loaded using the [`compute_type` option in CTranslate2](https://opennmt.net/CTranslate2/quantization.html).
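For example, a different compute type can be requested when loading the model:
```python
from faster_whisper import WhisperModel

# Run with 8-bit quantization instead of the saved FP16 weights
model = WhisperModel("small", compute_type="int8")
```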
## More information
**For more information about the original model, see its [model card](https://huggingface.co/openai/whisper-small).**
|
guillaumekln/faster-whisper-large-v1
|
guillaumekln
| 2023-05-12T18:58:39Z | 327 | 3 |
ctranslate2
|
[
"ctranslate2",
"audio",
"automatic-speech-recognition",
"en",
"zh",
"de",
"es",
"ru",
"ko",
"fr",
"ja",
"pt",
"tr",
"pl",
"ca",
"nl",
"ar",
"sv",
"it",
"id",
"hi",
"fi",
"vi",
"he",
"uk",
"el",
"ms",
"cs",
"ro",
"da",
"hu",
"ta",
"no",
"th",
"ur",
"hr",
"bg",
"lt",
"la",
"mi",
"ml",
"cy",
"sk",
"te",
"fa",
"lv",
"bn",
"sr",
"az",
"sl",
"kn",
"et",
"mk",
"br",
"eu",
"is",
"hy",
"ne",
"mn",
"bs",
"kk",
"sq",
"sw",
"gl",
"mr",
"pa",
"si",
"km",
"sn",
"yo",
"so",
"af",
"oc",
"ka",
"be",
"tg",
"sd",
"gu",
"am",
"yi",
"lo",
"uz",
"fo",
"ht",
"ps",
"tk",
"nn",
"mt",
"sa",
"lb",
"my",
"bo",
"tl",
"mg",
"as",
"tt",
"haw",
"ln",
"ha",
"ba",
"jw",
"su",
"license:mit",
"region:us"
] |
automatic-speech-recognition
| 2023-03-28T12:33:41Z |
---
language:
- en
- zh
- de
- es
- ru
- ko
- fr
- ja
- pt
- tr
- pl
- ca
- nl
- ar
- sv
- it
- id
- hi
- fi
- vi
- he
- uk
- el
- ms
- cs
- ro
- da
- hu
- ta
- 'no'
- th
- ur
- hr
- bg
- lt
- la
- mi
- ml
- cy
- sk
- te
- fa
- lv
- bn
- sr
- az
- sl
- kn
- et
- mk
- br
- eu
- is
- hy
- ne
- mn
- bs
- kk
- sq
- sw
- gl
- mr
- pa
- si
- km
- sn
- yo
- so
- af
- oc
- ka
- be
- tg
- sd
- gu
- am
- yi
- lo
- uz
- fo
- ht
- ps
- tk
- nn
- mt
- sa
- lb
- my
- bo
- tl
- mg
- as
- tt
- haw
- ln
- ha
- ba
- jw
- su
tags:
- audio
- automatic-speech-recognition
license: mit
library_name: ctranslate2
---
# Whisper large-v1 model for CTranslate2
This repository contains the conversion of [openai/whisper-large](https://huggingface.co/openai/whisper-large) to the [CTranslate2](https://github.com/OpenNMT/CTranslate2) model format.
This model can be used in CTranslate2 or projects based on CTranslate2 such as [faster-whisper](https://github.com/guillaumekln/faster-whisper).
## Example
```python
from faster_whisper import WhisperModel
model = WhisperModel("large-v1")
segments, info = model.transcribe("audio.mp3")
for segment in segments:
print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
```
## Conversion details
The original model was converted with the following command:
```
ct2-transformers-converter --model openai/whisper-large --output_dir faster-whisper-large-v1 \
--copy_files tokenizer.json --quantization float16
```
Note that the model weights are saved in FP16. This type can be changed when the model is loaded using the [`compute_type` option in CTranslate2](https://opennmt.net/CTranslate2/quantization.html).
## More information
**For more information about the original model, see its [model card](https://huggingface.co/openai/whisper-large).**
|
guillaumekln/faster-whisper-medium.en
|
guillaumekln
| 2023-05-12T18:57:57Z | 6,145 | 3 |
ctranslate2
|
[
"ctranslate2",
"audio",
"automatic-speech-recognition",
"en",
"license:mit",
"region:us"
] |
automatic-speech-recognition
| 2023-03-23T10:24:46Z |
---
language:
- en
tags:
- audio
- automatic-speech-recognition
license: mit
library_name: ctranslate2
---
# Whisper medium.en model for CTranslate2
This repository contains the conversion of [openai/whisper-medium.en](https://huggingface.co/openai/whisper-medium.en) to the [CTranslate2](https://github.com/OpenNMT/CTranslate2) model format.
This model can be used in CTranslate2 or projects based on CTranslate2 such as [faster-whisper](https://github.com/guillaumekln/faster-whisper).
## Example
```python
from faster_whisper import WhisperModel
model = WhisperModel("medium.en")
segments, info = model.transcribe("audio.mp3")
for segment in segments:
print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
```
## Conversion details
The original model was converted with the following command:
```
ct2-transformers-converter --model openai/whisper-medium.en --output_dir faster-whisper-medium.en \
--copy_files tokenizer.json --quantization float16
```
Note that the model weights are saved in FP16. This type can be changed when the model is loaded using the [`compute_type` option in CTranslate2](https://opennmt.net/CTranslate2/quantization.html).
## More information
**For more information about the original model, see its [model card](https://huggingface.co/openai/whisper-medium.en).**
|
seviladiguzel/roberta-base-finetuned-squad_roberta
|
seviladiguzel
| 2023-05-12T18:57:33Z | 18 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:mit",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2023-05-12T17:52:56Z |
---
license: mit
tags:
- generated_from_trainer
datasets:
- squad
model-index:
- name: roberta-base-finetuned-squad_roberta
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-base-finetuned-squad_roberta
This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8537
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 0.8917 | 1.0 | 5536 | 0.8669 |
| 0.6739 | 2.0 | 11072 | 0.8537 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
guillaumekln/faster-whisper-base.en
|
guillaumekln
| 2023-05-12T18:57:21Z | 1,082 | 2 |
ctranslate2
|
[
"ctranslate2",
"audio",
"automatic-speech-recognition",
"en",
"license:mit",
"region:us"
] |
automatic-speech-recognition
| 2023-03-23T10:18:04Z |
---
language:
- en
tags:
- audio
- automatic-speech-recognition
license: mit
library_name: ctranslate2
---
# Whisper base.en model for CTranslate2
This repository contains the conversion of [openai/whisper-base.en](https://huggingface.co/openai/whisper-base.en) to the [CTranslate2](https://github.com/OpenNMT/CTranslate2) model format.
This model can be used in CTranslate2 or projects based on CTranslate2 such as [faster-whisper](https://github.com/guillaumekln/faster-whisper).
## Example
```python
from faster_whisper import WhisperModel
model = WhisperModel("base.en")
segments, info = model.transcribe("audio.mp3")
for segment in segments:
print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
```
## Conversion details
The original model was converted with the following command:
```
ct2-transformers-converter --model openai/whisper-base.en --output_dir faster-whisper-base.en \
--copy_files tokenizer.json --quantization float16
```
Note that the model weights are saved in FP16. This type can be changed when the model is loaded using the [`compute_type` option in CTranslate2](https://opennmt.net/CTranslate2/quantization.html).
## More information
**For more information about the original model, see its [model card](https://huggingface.co/openai/whisper-base.en).**
|
guillaumekln/faster-whisper-tiny.en
|
guillaumekln
| 2023-05-12T18:56:53Z | 378 | 2 |
ctranslate2
|
[
"ctranslate2",
"audio",
"automatic-speech-recognition",
"en",
"license:mit",
"region:us"
] |
automatic-speech-recognition
| 2023-03-23T10:17:41Z |
---
language:
- en
tags:
- audio
- automatic-speech-recognition
license: mit
library_name: ctranslate2
---
# Whisper tiny.en model for CTranslate2
This repository contains the conversion of [openai/whisper-tiny.en](https://huggingface.co/openai/whisper-tiny.en) to the [CTranslate2](https://github.com/OpenNMT/CTranslate2) model format.
This model can be used in CTranslate2 or projects based on CTranslate2 such as [faster-whisper](https://github.com/guillaumekln/faster-whisper).
## Example
```python
from faster_whisper import WhisperModel
model = WhisperModel("tiny.en")
segments, info = model.transcribe("audio.mp3")
for segment in segments:
print("[%.2fs -> %.2fs] %s" % (segment.start, segment.end, segment.text))
```
## Conversion details
The original model was converted with the following command:
```
ct2-transformers-converter --model openai/whisper-tiny.en --output_dir faster-whisper-tiny.en \
--copy_files tokenizer.json --quantization float16
```
Note that the model weights are saved in FP16. This type can be changed when the model is loaded using the [`compute_type` option in CTranslate2](https://opennmt.net/CTranslate2/quantization.html).
## More information
**For more information about the original model, see its [model card](https://huggingface.co/openai/whisper-tiny.en).**
|
Ktang2k/ppo-Huggy-1
|
Ktang2k
| 2023-05-12T18:54:20Z | 1 | 0 |
ml-agents
|
[
"ml-agents",
"tensorboard",
"onnx",
"Huggy",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Huggy",
"region:us"
] |
reinforcement-learning
| 2023-05-12T18:54:13Z |
---
library_name: ml-agents
tags:
- Huggy
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-Huggy
2. Step 1: Find your model_id: Ktang2k/ppo-Huggy-1
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
jainr3/sd-diffusiondb-pixelart-model-lora
|
jainr3
| 2023-05-12T18:51:50Z | 3 | 2 |
diffusers
|
[
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"lora",
"base_model:stabilityai/stable-diffusion-2-1",
"base_model:adapter:stabilityai/stable-diffusion-2-1",
"license:creativeml-openrail-m",
"region:us"
] |
text-to-image
| 2023-05-12T18:39:45Z |
---
license: creativeml-openrail-m
base_model: stabilityai/stable-diffusion-2-1
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- lora
inference: true
---
# LoRA text2image fine-tuning - jainr3/sd-diffusiondb-pixelart-model-lora
These are LoRA adaptation weights for stabilityai/stable-diffusion-2-1. The weights were fine-tuned on the jainr3/diffusiondb-pixelart dataset. You can find some example images in the following.




|
minhnguyen/Advanced_B-cell_Epitopes_Prediction
|
minhnguyen
| 2023-05-12T18:46:31Z | 0 | 0 |
keras
|
[
"keras",
"biology",
"medical",
"text-classification",
"en",
"region:us"
] |
text-classification
| 2023-05-06T04:32:55Z |
---
language:
- en
metrics:
- accuracy
library_name: keras
pipeline_tag: text-classification
tags:
- biology
- medical
---
### Method 1: Many-to-One Architecture
This approach involves using a window of 9 amino acids, centered on the amino acid being assessed. This window includes the 4 amino acids on either side of the current amino acid in the sequence.
To handle sequences at the start and end of a protein, padding is used to ensure that the window always contains 9 elements.
Each amino acid is first converted into a numerical vector representation (an embedding) using a large protein model, [ProtT5](https://github.com/agemagician/ProtTrans). These 9-element windows of embeddings are then fed into the machine learning model for training and inference.
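A minimal sketch of this windowing with edge padding (the function name and pad symbol are ours, and raw residue letters stand in for the per-residue embeddings):
```python
def residue_windows(sequence, window=9, pad="X"):
    """Yield one fixed-size window per residue, centered on that residue."""
    half = window // 2
    padded = pad * half + sequence + pad * half
    for i in range(len(sequence)):
        yield padded[i:i + window]

# e.g. the first window of "MKTAYIAK" is "XXXXMKTAY"
print(list(residue_windows("MKTAYIAK"))[0])
```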
### Method 2: Many-to-Many Architecture
This approach works by training the model on protein sequences of 1024 amino acids at a time. For proteins shorter than 1024 amino acids, padding is used to make up the length.
For proteins that are longer than 1024 amino acids, we apply a sliding window approach with a stride of 900. This means we start with the first 1024 amino acids, then move 900 amino acids down the sequence and take the next 1024 amino acids, and so on, until we've covered the whole protein.
Each protein sequence is converted into a series of embeddings, which are then used to train and make inferences with the model.
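A sketch of the chunking arithmetic described above (the function name is ours):
```python
def chunk_spans(seq_len, max_len=1024, stride=900):
    """Return (start, end) spans covering a sequence with overlap."""
    if seq_len <= max_len:
        return [(0, seq_len)]
    spans, start = [], 0
    while start + max_len < seq_len:
        spans.append((start, start + max_len))
        start += stride
    spans.append((start, seq_len))  # final, possibly shorter, chunk
    return spans

print(chunk_spans(2500))  # [(0, 1024), (900, 1924), (1800, 2500)]
```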
Both of these methods allow the model to consider the context around each amino acid when making its predictions, which helps to improve accuracy.
|
Abdeldjalil21/djalil-base-sentiment-model-10k-samples
|
Abdeldjalil21
| 2023-05-12T18:29:32Z | 5 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-05-09T23:59:57Z |
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: djalil-base-sentiment-model-10k-samples
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# djalil-base-sentiment-model-10k-samples
This model is a fine-tuned version of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4204
- Accuracy: 0.827
- F1: 0.8171
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
### Framework versions
- Transformers 4.29.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
PAWSLab/PAWSCumulateCode
|
PAWSLab
| 2023-05-12T18:28:35Z | 0 | 0 | null |
[
"arxiv:1910.09700",
"region:us"
] | null | 2023-05-12T18:28:09Z |
---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/model-cards
{}
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
This modelcard aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1).
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Data Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
Guilherme34/Jennifer-lora-7bv3
|
Guilherme34
| 2023-05-12T18:25:36Z | 0 | 1 | null |
[
"tensorboard",
"pt",
"region:us"
] | null | 2023-05-12T17:32:20Z |
---
language:
- pt
---
This is the third version of a fine-tuned AI that speaks Brazilian Portuguese. It was trained on top of decapoda's LLaMA 7B with zetavg's LLaMA-LoRA Tuner, using the Cabrita LoRA dataset and Alpaca Cleaned.
Have fun!
|
stillerman/MDEL-pubmed-feelaw-github-arxiv
|
stillerman
| 2023-05-12T18:18:39Z | 161 | 0 |
transformers
|
[
"transformers",
"pytorch",
"gpt_neox",
"text-generation",
"MDEL",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-05-12T18:11:40Z |
---
tags:
- MDEL
---
# Model Name
stillerman/MDEL-pubmed-feelaw-github-arxiv
# Model Description
This model was generated by averaging the weights of the following models (a minimal sketch of the averaging follows the list):
- [Multi-Domain-Expert-Layers/expert-pubmed_central](https://huggingface.co/Multi-Domain-Expert-Layers/expert-pubmed_central)
- [Multi-Domain-Expert-Layers/expert-freelaw](https://huggingface.co/Multi-Domain-Expert-Layers/expert-freelaw)
- [Multi-Domain-Expert-Layers/expert-github](https://huggingface.co/Multi-Domain-Expert-Layers/expert-github)
- [Multi-Domain-Expert-Layers/expert-arxiv](https://huggingface.co/Multi-Domain-Expert-Layers/expert-arxiv)
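A minimal sketch of uniform weight averaging (illustrative only, not the authors' exact script):
```python
import torch
from transformers import AutoModelForCausalLM

repos = [
    "Multi-Domain-Expert-Layers/expert-pubmed_central",
    "Multi-Domain-Expert-Layers/expert-freelaw",
    "Multi-Domain-Expert-Layers/expert-github",
    "Multi-Domain-Expert-Layers/expert-arxiv",
]
state_dicts = [AutoModelForCausalLM.from_pretrained(r).state_dict() for r in repos]

# Element-wise mean of every tensor across the four experts
merged = {k: torch.stack([sd[k].float() for sd in state_dicts]).mean(dim=0)
          for k in state_dicts[0]}

model = AutoModelForCausalLM.from_pretrained(repos[0])
model.load_state_dict(merged)
model.save_pretrained("merged-expert")
```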
|
afnan007/GPT-clash
|
afnan007
| 2023-05-12T17:57:58Z | 0 | 1 | null |
[
"gpt",
"ai",
"automatic",
"chatgpt",
"chat",
"jailbreak",
"text2text-generation",
"en",
"license:mit",
"region:us"
] |
text2text-generation
| 2023-05-12T17:50:13Z |
---
license: mit
language:
- en
metrics:
- character
pipeline_tag: text2text-generation
tags:
- gpt
- ai
- automatic
- chatgpt
- chat
- jailbreak
---
<div align="center">
<a href="https://github.com/4fnan007/GPTclash">
<a href="https://github.com/4fnan007/GPTclash"><img src="https://i.ibb.co/fn5sMP2/cuteai.png" alt="logo"
</a>
<h1 align="center">GPTclash</h1>
Hey, have you heard about two AIs talking to each other? It's quite amazing, right? That's exactly what "GPTclash.sh" does: when you execute the script, two instances of ChatGPT communicate with each other. Under the hood it runs a Python program from the source directory; specifically, it launches "firefox_server.py" twice on different ports, which opens two browser windows.
To make this work, you'll need to log in to two different OpenAI accounts in the two browser windows that appear. Once you've done that, check the terminal where you executed the script: the program is still running, and you'll know what to do next.
This is an AI chatbot conversation, and it's a fascinating way to witness AIs' capability to communicate with each other.
</div>
<div align="center">
## Demo Video
Watch this to know more about this program
[](https://www.youtube.com/watch?v=f9B5jRQpHoM)
</div>
## Features
- Jailbreak option enabled
- Metaprompt option enabled
- Easy to customize
- Live chat output on terminal itself
## Program Language Used
 
## Getting Started
This is an example of how you may give instructions on setting up your project locally.
To get a local copy up and running follow these simple example steps.
Clone the project
```bash
git clone https://huggingface.co/afnan007/GPT-clash
```
Go to the project directory
```bash
cd GPTclash
```
Run the script
```bash
bash GPTclash.sh
```
## Script Execution Errors
If GPTclash.sh fails to run, fall back to the manual method.
```bash
cd source/
```
Execute firefox_server.py twice to run two instances on different ports.
```bash
python3 firefox_server.py --port 5001 --profile /tmp/chat1
```
```bash
python3 firefox_server.py --port 5002 --profile /tmp/chat2
```
Open another terminal and execute gpt_autoscript.py to start:
```bash
python3 gpt_autoscript.py
```
## What I Want You to Know
Hey folks, just wanted to let you know that this program is open source and you have the right to do whatever you want with it. It's like a free buffet, except instead of food, you get lines of code! Yum.
But seriously, this program was created for sh*ts and giggles, and we had a blast watching two AIs chat with each other. We can't guarantee the conversation will be super exciting, but hey, it's AI: they'll probably just talk about whatever input you give them.
If you're feeling adventurous, go ahead and play around with the code. Just don't blame us if your computer starts talking back to you. That's when you know you've gone too far.
## Contact
If you have any questions, suggestions, feel free to reach out to me at:
[](https://t.me/afnan007) [](mailto:amanoythegreter232500@gmail.com)
|
millionhz/segformer-b0-finetuned-flame
|
millionhz
| 2023-05-12T17:55:49Z | 8 | 0 |
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"segformer",
"vision",
"image-segmentation",
"license:mit",
"endpoints_compatible",
"region:us"
] |
image-segmentation
| 2023-05-12T14:59:06Z |
---
license: mit
tags:
- vision
- image-segmentation
---
# SegFormer (b0-sized) model fine-tuned on FLAME
The model was trained for a deep learning project titled [Forest Fire Detection](https://github.com/millionhz/forest-fire-detection).
## Model Description
The model is intended to be used for fire detection through image segmentation.
The provided pretrained model was finetuned on the [FLAME](https://dx.doi.org/10.21227/qad6-r683) dataset for 3 epochs with a learning rate of 1e-3 and achieved an IoU score of **0.745** on the test examples.
# How to use
Here is how to use this model to segment an image:
```python
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation
from PIL import Image
import requests
processor = SegformerImageProcessor.from_pretrained("millionhz/segformer-b0-finetuned-flame")
model = SegformerForSemanticSegmentation.from_pretrained("millionhz/segformer-b0-finetuned-flame")
url = "<add url here>"  # replace with the URL of an image to segment
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(images=image, return_tensors="pt")
outputs = model(**inputs)
```
## License
The license for this model can be found [here](https://github.com/NVlabs/SegFormer/blob/master/LICENSE).
|
HeNLP/LongHeRo
|
HeNLP
| 2023-05-12T17:16:57Z | 13 | 2 |
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"longformer",
"fill-mask",
"he",
"dataset:HeNLP/HeDC4",
"arxiv:2304.11077",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2023-03-26T07:23:26Z |
---
language:
- he
pipeline_tag: fill-mask
datasets:
- HeNLP/HeDC4
---
## Hebrew Language Model for Long Documents
State-of-the-art Longformer language model for Hebrew.
#### How to use
```python
from transformers import AutoModelForMaskedLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('HeNLP/LongHeRo')
model = AutoModelForMaskedLM.from_pretrained('HeNLP/LongHeRo')
# Tokenization Example:
# Tokenizing
tokenized_string = tokenizer('שלום לכולם')
# Decoding
decoded_string = tokenizer.decode(tokenized_string['input_ids'], skip_special_tokens=True)
```
### Citing
If you use LongHeRo in your research, please cite [HeRo: RoBERTa and Longformer Hebrew Language Models](http://arxiv.org/abs/2304.11077).
```
@article{shalumov2023hero,
title={HeRo: RoBERTa and Longformer Hebrew Language Models},
author={Vitaly Shalumov and Harel Haskey},
year={2023},
journal={arXiv:2304.11077},
}
```
|
HeNLP/HeRo
|
HeNLP
| 2023-05-12T17:16:20Z | 454 | 5 |
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"roberta",
"fill-mask",
"he",
"dataset:HeNLP/HeDC4",
"arxiv:2304.11077",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2023-01-17T08:19:49Z |
---
language:
- he
datasets:
- HeNLP/HeDC4
---
## Hebrew Language Model
State-of-the-art RoBERTa language model for Hebrew.
#### How to use
```python
from transformers import AutoModelForMaskedLM, AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('HeNLP/HeRo')
model = AutoModelForMaskedLM.from_pretrained('HeNLP/HeRo')
# Tokenization Example:
# Tokenizing
tokenized_string = tokenizer('שלום לכולם')
# Decoding
decoded_string = tokenizer.decode(tokenized_string['input_ids'], skip_special_tokens=True)
```
### Citing
If you use HeRo in your research, please cite [HeRo: RoBERTa and Longformer Hebrew Language Models](http://arxiv.org/abs/2304.11077).
```
@article{shalumov2023hero,
title={HeRo: RoBERTa and Longformer Hebrew Language Models},
author={Vitaly Shalumov and Harel Haskey},
year={2023},
journal={arXiv:2304.11077},
}
```
|
thirtycomputer/agns
|
thirtycomputer
| 2023-05-12T16:59:03Z | 1 | 0 |
diffusers
|
[
"diffusers",
"safetensors",
"text-to-image",
"stable-diffusion",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] |
text-to-image
| 2023-05-12T16:52:59Z |
---
license: creativeml-openrail-m
tags:
- text-to-image
- stable-diffusion
---
### agns Dreambooth model trained by thirtycomputer with [TheLastBen's fast-DreamBooth](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast-DreamBooth.ipynb) notebook
Test the concept via A1111 Colab [fast-Colab-A1111](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb)
Sample pictures of this concept:
|
jason1234/Ai3_bert_embedding_model
|
jason1234
| 2023-05-12T16:54:49Z | 4 | 1 |
sentence-transformers
|
[
"sentence-transformers",
"pytorch",
"bert",
"feature-extraction",
"sentence-similarity",
"transformers",
"autotrain_compatible",
"text-embeddings-inference",
"endpoints_compatible",
"region:us"
] |
sentence-similarity
| 2023-05-12T16:04:09Z |
---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
- transformers
---
# jason1234/Ai3_bert_embedding_model
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('jason1234/Ai3_bert_embedding_model')
embeddings = model.encode(sentences)
print(embeddings)
```
## Usage (HuggingFace Transformers)
Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.
```python
from transformers import AutoTokenizer, AutoModel
import torch
#Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
token_embeddings = model_output[0] #First element of model_output contains all token embeddings
input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)
# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']
# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('jason1234/Ai3_bert_embedding_model')
model = AutoModel.from_pretrained('jason1234/Ai3_bert_embedding_model')
# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')
# Compute token embeddings
with torch.no_grad():
model_output = model(**encoded_input)
# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])
print("Sentence embeddings:")
print(sentence_embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=jason1234/Ai3_bert_embedding_model)
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 57 with parameters:
```
{'batch_size': 8, 'sampler': 'torch.utils.data.sampler.SequentialSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.CosineSimilarityLoss.CosineSimilarityLoss`
Parameters of the fit()-Method:
```
{
"epochs": 20,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 92,
"weight_decay": 0.01
}
```
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
)
```
## Citing & Authors
<!--- Describe where people can find more information -->
|
Bijibapakmu/aimyon
|
Bijibapakmu
| 2023-05-12T16:54:13Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-12T16:47:26Z |
---
license: creativeml-openrail-m
---
|
aprilzoo/distilbert-base-uncased-finetuned-imdb
|
aprilzoo
| 2023-05-12T16:53:13Z | 61 | 0 |
transformers
|
[
"transformers",
"tf",
"distilbert",
"fill-mask",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2023-05-06T18:51:57Z |
---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: aprilzoo/distilbert-base-uncased-finetuned-imdb
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# aprilzoo/distilbert-base-uncased-finetuned-imdb
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 4.2701
- Validation Loss: 2.6432
- Epoch: 0
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'WarmUp', 'config': {'initial_learning_rate': 2e-05, 'decay_schedule_fn': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': -997, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, '__passive_serialization__': True}, 'warmup_steps': 1000, 'power': 1.0, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 4.2701 | 2.6432 | 0 |
### Framework versions
- Transformers 4.29.1
- TensorFlow 2.12.0
- Datasets 2.12.0
- Tokenizers 0.13.3
|
dian34323/melatijkt48
|
dian34323
| 2023-05-12T16:32:24Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-12T16:31:15Z |
---
license: creativeml-openrail-m
---
|
muhammadravi251001/fine-tuned-DatasetQAS-TYDI-QA-ID-with-xlm-roberta-large-with-ITTL-without-freeze-LR-1e-05
|
muhammadravi251001
| 2023-05-12T16:25:57Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"xlm-roberta",
"question-answering",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2023-05-05T04:56:01Z |
---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: fine-tuned-DatasetQAS-TYDI-QA-ID-with-xlm-roberta-large-with-ITTL-without-freeze-LR-1e-05
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fine-tuned-DatasetQAS-TYDI-QA-ID-with-xlm-roberta-large-with-ITTL-without-freeze-LR-1e-05
This model is a fine-tuned version of [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9402
- Exact Match: 69.3662
- F1: 82.0036
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Exact Match | F1 |
|:-------------:|:-----:|:----:|:---------------:|:-----------:|:-------:|
| 6.2837 | 0.5 | 19 | 3.6986 | 8.4507 | 17.7536 |
| 6.2837 | 0.99 | 38 | 2.5899 | 18.4859 | 29.7766 |
| 3.6833 | 1.5 | 57 | 1.7044 | 42.6056 | 56.8157 |
| 3.6833 | 1.99 | 76 | 1.2711 | 53.3451 | 70.2979 |
| 3.6833 | 2.5 | 95 | 1.1063 | 62.3239 | 75.7765 |
| 1.5024 | 2.99 | 114 | 1.0275 | 64.2606 | 78.0460 |
| 1.5024 | 3.5 | 133 | 0.9941 | 65.8451 | 79.1313 |
| 1.0028 | 3.99 | 152 | 0.9642 | 67.4296 | 80.6196 |
| 1.0028 | 4.5 | 171 | 0.9682 | 69.0141 | 82.4975 |
| 1.0028 | 4.99 | 190 | 0.9455 | 67.9577 | 81.0386 |
| 0.7765 | 5.5 | 209 | 0.9802 | 67.7817 | 81.0844 |
| 0.7765 | 5.99 | 228 | 0.9402 | 69.3662 | 82.0036 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu117
- Datasets 2.2.0
- Tokenizers 0.13.2
|
Mikepool117/q-FrozenLake-v1-4x4-noSlippery
|
Mikepool117
| 2023-05-12T16:00:55Z | 0 | 0 | null |
[
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-12T16:00:23Z |
---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: FrozenLake-v1-4x4-no_slippery
type: FrozenLake-v1-4x4-no_slippery
metrics:
- type: mean_reward
value: 1.00 +/- 0.00
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
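## Usage
A minimal loading sketch; the `q-learning.pkl` filename and the `qtable` key follow the deep-RL course convention and are assumptions:
```python
import pickle
import gymnasium as gym
from huggingface_hub import hf_hub_download

path = hf_hub_download(repo_id="Mikepool117/q-FrozenLake-v1-4x4-noSlippery",
                       filename="q-learning.pkl")  # filename is assumed
with open(path, "rb") as f:
    model = pickle.load(f)

env = gym.make("FrozenLake-v1", map_name="4x4", is_slippery=False)
qtable = model["qtable"]  # assumed key, per the course convention
```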
|
henripett/ppo-Huggy
|
henripett
| 2023-05-12T15:59:31Z | 1 | 0 |
ml-agents
|
[
"ml-agents",
"tensorboard",
"onnx",
"Huggy",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Huggy",
"region:us"
] |
reinforcement-learning
| 2023-05-12T15:12:33Z |
---
library_name: ml-agents
tags:
- Huggy
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial on training your first agent with ML-Agents and publishing it to the Hub.
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-Huggy
2. Find your model_id: henripett/ppo-Huggy
3. Select your *.nn or *.onnx file
4. Click on Watch the agent play 👀
|
hafiz031/ppo-LunarLander-v2
|
hafiz031
| 2023-05-12T15:48:40Z | 0 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-12T15:48:19Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 252.01 +/- 20.55
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the checkpoint filename is an assumption; check the repo's file list):
```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# Download the checkpoint from the Hub and load it (filename assumed).
checkpoint = load_from_hub(repo_id="hafiz031/ppo-LunarLander-v2", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
|
saikatkumardey/my_awesome_qa_model
|
saikatkumardey
| 2023-05-12T15:46:32Z | 103 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2023-05-12T15:12:29Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- squad
model-index:
- name: my_awesome_qa_model
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# my_awesome_qa_model
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 1.9972
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 250 | 2.5592 |
| 2.909 | 2.0 | 500 | 2.0550 |
| 2.909 | 3.0 | 750 | 1.9972 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
SaiKiran97/carpole
|
SaiKiran97
| 2023-05-12T15:34:20Z | 0 | 0 | null |
[
"CartPole-v1",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-12T15:34:09Z |
---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: carpole
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 500.00 +/- 0.00
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1** .
To learn to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
casellimarco/q-FrozenLake-v1-4x4-noSlippery-gamma_1
|
casellimarco
| 2023-05-12T15:33:36Z | 0 | 0 | null |
[
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-12T15:31:22Z |
---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery-gamma_1
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: FrozenLake-v1-4x4-no_slippery
type: FrozenLake-v1-4x4-no_slippery
metrics:
- type: mean_reward
value: 0.00 +/- 0.00
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
With `gamma=1`, future rewards are not discounted, so every state-action pair from which the goal is still reachable converges to a value of 1. The resulting Q-table is:
```
array([[1., 1., 1., 1.],
[1., 0., 1., 1.],
[1., 1., 1., 1.],
[1., 0., 1., 1.],
[1., 1., 0., 1.],
[0., 0., 0., 0.],
[0., 1., 0., 1.],
[0., 0., 0., 0.],
[1., 0., 1., 1.],
[1., 1., 1., 0.],
[1., 1., 0., 1.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 1., 1., 1.],
[1., 1., 1., 1.],
[0., 0., 0., 0.]])
```
## Usage
```python
model = load_from_hub(repo_id="casellimarco/q-FrozenLake-v1-4x4-noSlippery-gamma_1", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
casellimarco/q-FrozenLake-v1-4x4-noSlippery-gamma_0
|
casellimarco
| 2023-05-12T15:29:21Z | 0 | 0 | null |
[
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-12T15:16:25Z |
---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery-gamma_0
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: FrozenLake-v1-4x4-no_slippery
type: FrozenLake-v1-4x4-no_slippery
metrics:
- type: mean_reward
value: 0.00 +/- 0.00
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
With `gamma=0` no future reward information is propagated, so the only non-zero Q-value belongs to the state-action pair immediately before the goal:
```
array([[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 0., 0.],
[0., 0., 1., 0.],
[0., 0., 0., 0.]])
```
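This follows directly from the tabular Q-learning update; a minimal sketch (the learning rate and the 4x4 sizes are illustrative):
```python
import numpy as np

def q_update(Q, s, a, r, s_next, lr=0.7, gamma=0.0):
    # With gamma=0 the bootstrap term vanishes, so only the transition that
    # actually receives the goal reward ends up with a non-zero value.
    Q[s, a] += lr * (r + gamma * np.max(Q[s_next]) - Q[s, a])

Q = np.zeros((16, 4))  # FrozenLake 4x4: 16 states, 4 actions
q_update(Q, s=14, a=2, r=1.0, s_next=15)  # the step onto the goal
```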
## Usage
```python
model = load_from_hub(repo_id="casellimarco/q-FrozenLake-v1-4x4-noSlippery-gamma_0", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
itsmeboris/jobbert-base-cased-ner
|
itsmeboris
| 2023-05-12T15:21:11Z | 99 | 1 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"token-classification",
"generated_from_trainer",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2023-05-12T14:43:12Z |
---
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: jobbert-base-cased-ner
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# jobbert-base-cased-ner
This model is a fine-tuned version of [jjzha/jobbert-base-cased](https://huggingface.co/jjzha/jobbert-base-cased) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3789
- Job Title precision: 0.7916
- Job Title recall: 0.8721
- Job Title f1: 0.8299
- Loc precision: 0.8572
- Loc recall: 0.9506
- Loc f1: 0.9015
- Org precision: 0.6727
- Org recall: 0.7458
- Org f1: 0.7074
- Misc precision: 0.6893
- Misc recall: 0.6587
- Misc f1: 0.6736
- Precision: 0.7772
- Recall: 0.8551
- F1: 0.8143
- Accuracy: 0.8680
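A minimal inference sketch, assuming the standard transformers token-classification pipeline (the example sentence is illustrative):
```python
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="itsmeboris/jobbert-base-cased-ner",
    aggregation_strategy="simple",  # merge word pieces into whole entities
)
print(ner("Senior Data Scientist at Acme Corp in Berlin"))
```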
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Job Title precision | Job Title recall | Job Title f1 | Loc precision | Loc recall | Loc f1 | Org precision | Org recall | Org f1 | Misc precision | Misc recall | Misc f1 | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:-------------------:|:----------------:|:------------:|:-------------:|:----------:|:------:|:-------------:|:----------:|:------:|:--------------:|:-----------:|:-------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 308 | 0.3896 | 0.7827 | 0.8394 | 0.8101 | 0.8813 | 0.8856 | 0.8835 | 0.6865 | 0.7151 | 0.7005 | 0.6730 | 0.6041 | 0.6367 | 0.7800 | 0.8148 | 0.7970 | 0.8577 |
| 0.4672 | 2.0 | 616 | 0.3789 | 0.7916 | 0.8721 | 0.8299 | 0.8572 | 0.9506 | 0.9015 | 0.6727 | 0.7458 | 0.7074 | 0.6893 | 0.6587 | 0.6736 | 0.7772 | 0.8551 | 0.8143 | 0.8680 |
| 0.4672 | 3.0 | 924 | 0.4067 | 0.7800 | 0.8876 | 0.8304 | 0.8560 | 0.9443 | 0.8980 | 0.6928 | 0.7026 | 0.6977 | 0.6006 | 0.7440 | 0.6646 | 0.7730 | 0.8549 | 0.8119 | 0.8651 |
### Framework versions
- Transformers 4.28.1
- Pytorch 1.7.1+cu110
- Datasets 2.12.0
- Tokenizers 0.13.2
|
seviladiguzel/distilbert-base-uncased-finetuned-squad_v2
|
seviladiguzel
| 2023-05-12T15:21:02Z | 9 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"question-answering",
"generated_from_trainer",
"dataset:squad",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2023-05-12T13:20:15Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- squad
model-index:
- name: distilbert-base-uncased-finetuned-squad_v2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# distilbert-base-uncased-finetuned-squad_v2
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the squad dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1316
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.2238 | 1.0 | 5533 | 1.1654 |
| 0.9823 | 2.0 | 11066 | 1.1316 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
Neutralzz/BiLLa-7B-SFT
|
Neutralzz
| 2023-05-12T15:18:24Z | 3 | 64 |
transformers
|
[
"transformers",
"pytorch",
"llama",
"license:apache-2.0",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] | null | 2023-05-09T14:26:02Z |
---
license: apache-2.0
---
# BiLLa: A Bilingual LLaMA with Enhanced Reasoning Ability
BiLLa is an open-source reasoning-enhanced bilingual LLaMA model. The main features are:
- Greatly improves Chinese language modeling while minimizing damage to LLaMA's original English ability;
- Adds more task data during training, augmented with ChatGPT-generated analyses;
- Uses full-parameter optimization for better performance.
Github: https://github.com/Neutralzz/BiLLa
<b>Note</b>: Due to LLaMA's license, the model weights in this hub cannot be used directly.
The released `word embedding` weights are the sum of the trained model's weights and the original LLaMA weights,
which ensures that developers with access to the original LLaMA model can convert this release into a usable model.
## Usage
First, you can revert the model weights by [this script](https://github.com/Neutralzz/BiLLa/blob/main/embedding_convert.py):
```shell
python3 embedding_convert.py \
--model_dir /path_to_BiLLa/BiLLa-7B-SFT \
--meta_llama_pth_file /path_to_LLaMA/llama-7b/consolidated.00.pth
```
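Conceptually, the conversion just subtracts the original LLaMA word embeddings from the released ones; a rough sketch of the idea (the tensor file name below is hypothetical; use the official script above for real checkpoints):
```python
import torch

# Hypothetical file holding the released (summed) word-embedding weights.
released = torch.load("billa_word_embeddings.pt")
llama = torch.load("/path_to_LLaMA/llama-7b/consolidated.00.pth", map_location="cpu")
usable = released - llama["tok_embeddings.weight"]  # recovered trained weights
```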
Then, you can run this model as follows:
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM
model_path = "/path_to_BiLLa/BiLLa-7B-SFT"
tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(model_path, low_cpu_mem_usage=True, torch_dtype=torch.float16).cuda()
prompt = "Human: Write a Python function that checks if a given number is even or odd.\nAssistant: "
input_ids = tokenizer([prompt]).input_ids
output_ids = model.generate(
torch.as_tensor(input_ids).cuda(),
do_sample=True,
temperature=0.7,
max_new_tokens=1024
)
output_ids = output_ids[0][len(input_ids[0]):]
outputs = tokenizer.decode(output_ids, skip_special_tokens=True).strip()
print(outputs)
```
### Input Format
Different from [BiLLa-7B-LLM](https://huggingface.co/Neutralzz/BiLLa-7B-LLM), the model input of `BiLLa-7B-SFT` should be formatted as follows:
```
Human: [Your question]
Assistant:
```
Note that <b>a space</b> must follow `Assistant:`.
|
JoBuettner/a2c-PandaReachDense-v2
|
JoBuettner
| 2023-05-12T15:10:05Z | 0 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"PandaReachDense-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-12T15:07:19Z |
---
library_name: stable-baselines3
tags:
- PandaReachDense-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: A2C
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: PandaReachDense-v2
type: PandaReachDense-v2
metrics:
- type: mean_reward
value: -0.99 +/- 0.35
name: mean_reward
verified: false
---
# **A2C** Agent playing **PandaReachDense-v2**
This is a trained model of a **A2C** agent playing **PandaReachDense-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the checkpoint filename is an assumption; check the repo's file list):
```python
from stable_baselines3 import A2C
from huggingface_sb3 import load_from_hub

# Download the checkpoint from the Hub and load it (filename assumed).
checkpoint = load_from_hub(repo_id="JoBuettner/a2c-PandaReachDense-v2", filename="a2c-PandaReachDense-v2.zip")
model = A2C.load(checkpoint)
```
|
Shreya15/ppo-LunarLander-v2
|
Shreya15
| 2023-05-12T15:08:00Z | 0 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-12T15:07:43Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 271.10 +/- 13.69
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the checkpoint filename is an assumption; check the repo's file list):
```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# Download the checkpoint from the Hub and load it (filename assumed).
checkpoint = load_from_hub(repo_id="Shreya15/ppo-LunarLander-v2", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
|
diegoicomp/fine-tune
|
diegoicomp
| 2023-05-12T15:07:38Z | 1 | 0 |
sentence-transformers
|
[
"sentence-transformers",
"pytorch",
"mpnet",
"feature-extraction",
"sentence-similarity",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
sentence-similarity
| 2023-05-12T15:03:24Z |
---
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- feature-extraction
- sentence-similarity
---
# {MODEL_NAME}
This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
<!--- Describe your model here -->
## Usage (Sentence-Transformers)
Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:
```
pip install -U sentence-transformers
```
Then you can use the model like this:
```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer('{MODEL_NAME}')
embeddings = model.encode(sentences)
print(embeddings)
```
## Evaluation Results
<!--- Describe how your model was evaluated -->
For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
## Training
The model was trained with the parameters:
**DataLoader**:
`torch.utils.data.dataloader.DataLoader` of length 1046 with parameters:
```
{'batch_size': 64, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'}
```
**Loss**:
`sentence_transformers.losses.MultipleNegativesRankingLoss.MultipleNegativesRankingLoss` with parameters:
```
{'scale': 20.0, 'similarity_fct': 'cos_sim'}
```
Parameters of the fit()-Method:
```
{
"epochs": 10,
"evaluation_steps": 0,
"evaluator": "NoneType",
"max_grad_norm": 1,
"optimizer_class": "<class 'torch.optim.adamw.AdamW'>",
"optimizer_params": {
"lr": 2e-05
},
"scheduler": "WarmupLinear",
"steps_per_epoch": null,
"warmup_steps": 10000,
"weight_decay": 0.01
}
```
## Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 384, 'do_lower_case': False}) with Transformer model: MPNetModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False})
(2): Normalize()
)
```
## Citing & Authors
<!--- Describe where people can find more information -->
|
grenmon/bart-base-finetuned-summarization
|
grenmon
| 2023-05-12T15:02:05Z | 9 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bart",
"text2text-generation",
"summarization",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
summarization
| 2023-05-12T14:54:34Z |
---
license: apache-2.0
tags:
- summarization
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-base-finetuned-summarization
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bart-base-finetuned-summarization
This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2885
- Rouge1: 31.8585
- Rouge2: 20.7559
- Rougel: 28.879
- Rougelsum: 29.6017
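A minimal inference sketch, assuming the standard transformers summarization pipeline (the input text and generation lengths are illustrative):
```python
from transformers import pipeline

summarizer = pipeline("summarization", model="grenmon/bart-base-finetuned-summarization")
text = "The committee met on Tuesday to review the quarterly results and approved the new budget after a short discussion."
print(summarizer(text, max_length=32, min_length=5)[0]["summary_text"])
```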
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5.6e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| 1.8516 | 1.0 | 308 | 1.3028 | 29.4331 | 19.2619 | 27.1598 | 28.1099 |
| 1.2211 | 2.0 | 616 | 1.2506 | 30.2583 | 19.7126 | 28.1328 | 28.9654 |
| 0.911 | 3.0 | 924 | 1.2316 | 29.3854 | 18.6132 | 27.3488 | 28.2225 |
| 0.6824 | 4.0 | 1232 | 1.2583 | 32.0664 | 22.0751 | 29.521 | 30.5109 |
| 0.542 | 5.0 | 1540 | 1.2885 | 31.8585 | 20.7559 | 28.879 | 29.6017 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
tp-runport/bert-base-uncased-brands
|
tp-runport
| 2023-05-12T14:59:27Z | 104 | 0 |
transformers
|
[
"transformers",
"pytorch",
"bert",
"token-classification",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2023-04-19T21:14:49Z |
---
{}
---
This is a BERT-based NER model trained to detect PERSON and BRAND entities in text.
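A minimal inference sketch, assuming the standard transformers token-classification API (the example sentence is illustrative):
```python
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

tokenizer = AutoTokenizer.from_pretrained("tp-runport/bert-base-uncased-brands")
model = AutoModelForTokenClassification.from_pretrained("tp-runport/bert-base-uncased-brands")
ner = pipeline("token-classification", model=model, tokenizer=tokenizer, aggregation_strategy="simple")
print(ner("Tim Cook introduced the new iPhone at Apple Park."))  # expect PERSON and BRAND spans
```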
|
Devops-hestabit/otherhalf
|
Devops-hestabit
| 2023-05-12T14:56:31Z | 5 | 0 |
transformers
|
[
"transformers",
"pytorch",
"gptj",
"text-generation",
"text generation",
"en",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-05-04T10:15:17Z |
---
license: creativeml-openrail-m
language:
- en
thumbnail: null
tags:
- text generation
---
|
alistvt/single-docalog
|
alistvt
| 2023-05-12T14:55:33Z | 8 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"roberta",
"question-answering",
"generated_from_trainer",
"dataset:doc2dial",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2023-05-11T09:17:03Z |
---
tags:
- generated_from_trainer
datasets:
- doc2dial
model-index:
- name: single-docalog
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# single-docalog
This model is a fine-tuned version of [alistvt/single-docalog](https://huggingface.co/alistvt/single-docalog) on the doc2dial dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 30
- total_train_batch_size: 240
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 1.0
- mixed_precision_training: Native AMP
### Training results
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3
|
WALIDALI/burjkhalifaly
|
WALIDALI
| 2023-05-12T14:48:30Z | 29 | 0 |
diffusers
|
[
"diffusers",
"safetensors",
"text-to-image",
"stable-diffusion",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] |
text-to-image
| 2023-05-12T14:42:34Z |
---
license: creativeml-openrail-m
tags:
- text-to-image
- stable-diffusion
---
### burjkhalifaly Dreambooth model trained by WALIDALI with [TheLastBen's fast-DreamBooth](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast-DreamBooth.ipynb) notebook
Test the concept via A1111 Colab [fast-Colab-A1111](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb)
Sample pictures of this concept:
|
arb9p4/rl_course_vizdoom_health_gathering_supreme
|
arb9p4
| 2023-05-12T14:43:29Z | 0 | 0 |
sample-factory
|
[
"sample-factory",
"tensorboard",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-12T13:42:41Z |
---
library_name: sample-factory
tags:
- deep-reinforcement-learning
- reinforcement-learning
- sample-factory
model-index:
- name: APPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: doom_health_gathering_supreme
type: doom_health_gathering_supreme
metrics:
- type: mean_reward
value: 13.07 +/- 5.50
name: mean_reward
verified: false
---
An **APPO** model trained on the **doom_health_gathering_supreme** environment.
This model was trained using Sample-Factory 2.0: https://github.com/alex-petrenko/sample-factory.
Documentation for how to use Sample-Factory can be found at https://www.samplefactory.dev/
## Downloading the model
After installing Sample-Factory, download the model with:
```
python -m sample_factory.huggingface.load_from_hub -r arb9p4/rl_course_vizdoom_health_gathering_supreme
```
## Using the model
To run the model after download, use the `enjoy` script corresponding to this environment:
```
# entry point assumed from the Sample-Factory 2.0 sf_examples layout
python -m sf_examples.vizdoom.enjoy_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme
```
You can also upload models to the Hugging Face Hub using the same script with the `--push_to_hub` flag.
See https://www.samplefactory.dev/10-huggingface/huggingface/ for more details.
## Training with this model
To continue training with this model, use the `train` script corresponding to this environment:
```
# entry point assumed from the Sample-Factory 2.0 sf_examples layout
python -m sf_examples.vizdoom.train_vizdoom --algo=APPO --env=doom_health_gathering_supreme --train_dir=./train_dir --experiment=rl_course_vizdoom_health_gathering_supreme --restart_behavior=resume --train_for_env_steps=10000000000
```
Note, you may have to adjust `--train_for_env_steps` to a suitably high number as the experiment will resume at the number of steps it concluded at.
|
muhammadravi251001/fine-tuned-DatasetQAS-IDK-MRC-with-xlm-roberta-large-with-ITTL-without-freeze-LR-1e-05
|
muhammadravi251001
| 2023-05-12T14:39:00Z | 5 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"xlm-roberta",
"question-answering",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2023-05-04T03:56:17Z |
---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: fine-tuned-DatasetQAS-IDK-MRC-with-xlm-roberta-large-with-ITTL-without-freeze-LR-1e-05
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fine-tuned-DatasetQAS-IDK-MRC-with-xlm-roberta-large-with-ITTL-without-freeze-LR-1e-05
This model is a fine-tuned version of [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8090
- Exact Match: 74.0838
- F1: 80.8517
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Exact Match | F1 |
|:-------------:|:-----:|:----:|:---------------:|:-----------:|:-------:|
| 6.1945 | 0.49 | 36 | 2.3430 | 50.0 | 50.0 |
| 3.5404 | 0.98 | 72 | 1.8265 | 48.5602 | 52.5772 |
| 1.9365 | 1.48 | 108 | 1.2750 | 59.4241 | 67.2505 |
| 1.9365 | 1.97 | 144 | 1.0492 | 66.0995 | 73.9501 |
| 1.2265 | 2.46 | 180 | 0.9042 | 69.7644 | 77.5779 |
| 0.9482 | 2.95 | 216 | 0.8393 | 71.2042 | 78.9342 |
| 0.7866 | 3.45 | 252 | 0.8805 | 70.8115 | 78.2310 |
| 0.7866 | 3.94 | 288 | 0.9333 | 69.5026 | 76.5121 |
| 0.6871 | 4.44 | 324 | 0.8045 | 75.3927 | 82.4815 |
| 0.6086 | 4.92 | 360 | 0.7908 | 75.5236 | 81.8859 |
| 0.6086 | 5.42 | 396 | 0.8351 | 73.0366 | 80.3624 |
| 0.5449 | 5.91 | 432 | 0.8090 | 74.0838 | 80.8517 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu117
- Datasets 2.2.0
- Tokenizers 0.13.2
|
destroboman/my-depression-model
|
destroboman
| 2023-05-12T14:35:54Z | 62 | 0 |
transformers
|
[
"transformers",
"tf",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-05-10T22:21:21Z |
---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: my-depression-model
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# my-depression-model
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0002
- Validation Loss: 0.0774
- Train Accuracy: 0.9819
- Epoch: 9
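A minimal inference sketch, assuming the standard transformers text-classification pipeline (the repo ships TensorFlow weights, hence `framework="tf"`; the example sentence is illustrative):
```python
from transformers import pipeline

classifier = pipeline("text-classification", model="destroboman/my-depression-model", framework="tf")
print(classifier("I haven't been able to enjoy anything for weeks."))
```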
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 1e-05, 'decay_steps': 69570, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.1730 | 0.0720 | 0.9755 | 0 |
| 0.0543 | 0.0545 | 0.9793 | 1 |
| 0.0268 | 0.0567 | 0.9767 | 2 |
| 0.0146 | 0.0593 | 0.9832 | 3 |
| 0.0070 | 0.0680 | 0.9845 | 4 |
| 0.0021 | 0.0803 | 0.9806 | 5 |
| 0.0087 | 0.0736 | 0.9832 | 6 |
| 0.0020 | 0.0710 | 0.9845 | 7 |
| 0.0022 | 0.0724 | 0.9832 | 8 |
| 0.0002 | 0.0774 | 0.9819 | 9 |
### Framework versions
- Transformers 4.29.1
- TensorFlow 2.12.0
- Datasets 2.12.0
- Tokenizers 0.13.3
|
zeeshan-sardar/Taxi-v3
|
zeeshan-sardar
| 2023-05-12T14:25:30Z | 0 | 0 | null |
[
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-12T14:25:28Z |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: Taxi-v3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.54 +/- 2.69
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
model = load_from_hub(repo_id="zeeshan-sardar/Taxi-v3", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
ku-nlp/deberta-v2-base-japanese
|
ku-nlp
| 2023-05-12T14:13:03Z | 56,613 | 28 |
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"deberta-v2",
"fill-mask",
"deberta",
"ja",
"dataset:wikipedia",
"dataset:cc100",
"dataset:oscar",
"license:cc-by-sa-4.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2023-01-05T08:04:14Z |
---
language: ja
license: cc-by-sa-4.0
library_name: transformers
tags:
- deberta
- deberta-v2
- fill-mask
datasets:
- wikipedia
- cc100
- oscar
metrics:
- accuracy
mask_token: "[MASK]"
widget:
- text: "京都 大学 で 自然 言語 処理 を [MASK] する 。"
---
# Model Card for Japanese DeBERTa V2 base
## Model description
This is a Japanese DeBERTa V2 base model pre-trained on Japanese Wikipedia, the Japanese portion of CC-100, and the Japanese portion of OSCAR.
## How to use
You can use this model for masked language modeling as follows:
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM
tokenizer = AutoTokenizer.from_pretrained('ku-nlp/deberta-v2-base-japanese')
model = AutoModelForMaskedLM.from_pretrained('ku-nlp/deberta-v2-base-japanese')
sentence = '京都 大学 で 自然 言語 処理 を [MASK] する 。' # input should be segmented into words by Juman++ in advance
encoding = tokenizer(sentence, return_tensors='pt')
output = model(**encoding)
mask_index = (encoding.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
print(tokenizer.decode(output.logits[0, mask_index].argmax(-1)))  # top prediction for [MASK]
```
You can also fine-tune this model on downstream tasks.
## Tokenization
The input text should be segmented into words by [Juman++](https://github.com/ku-nlp/jumanpp) in advance. [Juman++ 2.0.0-rc3](https://github.com/ku-nlp/jumanpp/releases/tag/v2.0.0-rc3) was used for pre-training. Each word is tokenized into subwords by [sentencepiece](https://github.com/google/sentencepiece).
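For example, a pre-segmentation sketch using the [pyknp](https://github.com/ku-nlp/pyknp) binding (assumes a local `jumanpp` installation on PATH):
```python
from pyknp import Juman

juman = Juman()  # uses Juman++ by default
sentence = "京都大学で自然言語処理を研究する。"
segmented = " ".join(m.midasi for m in juman.analysis(sentence).mrph_list())
print(segmented)  # e.g. "京都 大学 で 自然 言語 処理 を 研究 する 。"
```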
## Training data
We used the following corpora for pre-training:
- Japanese Wikipedia (as of 20221020, 3.2GB, 27M sentences, 1.3M documents)
- Japanese portion of CC-100 (85GB, 619M sentences, 66M documents)
- Japanese portion of OSCAR (54GB, 326M sentences, 25M documents)
Note that we filtered out documents annotated with "header", "footer", or "noisy" tags in OSCAR.
Also note that Japanese Wikipedia was duplicated 10 times to make the total size of the corpus comparable to that of CC-100 and OSCAR. As a result, the total size of the training data is 171GB.
## Training procedure
We first segmented texts in the corpora into words using [Juman++](https://github.com/ku-nlp/jumanpp).
Then, we built a sentencepiece model with 32000 tokens including words ([JumanDIC](https://github.com/ku-nlp/JumanDIC)) and subwords induced by the unigram language model of [sentencepiece](https://github.com/google/sentencepiece).
We tokenized the segmented corpora into subwords using the sentencepiece model and trained the Japanese DeBERTa model using [transformers](https://github.com/huggingface/transformers) library.
The training took three weeks using 8 NVIDIA A100-SXM4-40GB GPUs.
The following hyperparameters were used during pre-training:
- learning_rate: 2e-4
- per_device_train_batch_size: 44
- distributed_type: multi-GPU
- num_devices: 8
- gradient_accumulation_steps: 6
- total_train_batch_size: 2,112
- max_seq_length: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-06
- lr_scheduler_type: linear schedule with warmup
- training_steps: 500,000
- warmup_steps: 10,000
The accuracy of the trained model on the masked language modeling task was 0.779.
The evaluation set consists of 5,000 randomly sampled documents from each of the training corpora.
## Fine-tuning on NLU tasks
We fine-tuned the following models and evaluated them on the dev set of JGLUE.
We tuned learning rate and training epochs for each model and task following [the JGLUE paper](https://www.jstage.jst.go.jp/article/jnlp/30/1/30_63/_pdf/-char/ja).
| Model | MARC-ja/acc | JSTS/pearson | JSTS/spearman | JNLI/acc | JSQuAD/EM | JSQuAD/F1 | JComQA/acc |
|-------------------------------|-------------|--------------|---------------|----------|-----------|-----------|------------|
| Waseda RoBERTa base | 0.965 | 0.913 | 0.876 | 0.905 | 0.853 | 0.916 | 0.853 |
| Waseda RoBERTa large (seq512) | 0.969 | 0.925 | 0.890 | 0.928 | 0.910 | 0.955 | 0.900 |
| LUKE Japanese base* | 0.965 | 0.916 | 0.877 | 0.912 | - | - | 0.842 |
| LUKE Japanese large* | 0.965 | 0.932 | 0.902 | 0.927 | - | - | 0.893 |
| DeBERTaV2 base | 0.970 | 0.922 | 0.886 | 0.922 | 0.899 | 0.951 | 0.873 |
| DeBERTaV2 large | 0.968 | 0.925 | 0.892 | 0.924 | 0.912 | 0.959 | 0.890 |
*The scores of LUKE are from [the official repository](https://github.com/studio-ousia/luke).
## Acknowledgments
This work was supported by Joint Usage/Research Center for Interdisciplinary Large-scale Information Infrastructures (JHPCN) through General Collaboration Project no. jh221004, "Developing a Platform for Constructing and Sharing of Large-Scale Japanese Language Models".
For training models, we used the mdx: a platform for the data-driven future.
|
dian34323/fionyjkt48
|
dian34323
| 2023-05-12T14:10:31Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-12T14:09:05Z |
---
license: creativeml-openrail-m
---
|
JoBuettner/a2c-AntBulletEnv-v0
|
JoBuettner
| 2023-05-12T14:05:23Z | 4 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"AntBulletEnv-v0",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-12T13:24:06Z |
---
library_name: stable-baselines3
tags:
- AntBulletEnv-v0
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: A2C
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: AntBulletEnv-v0
type: AntBulletEnv-v0
metrics:
- type: mean_reward
value: 2161.72 +/- 49.64
name: mean_reward
verified: false
---
|
asenella/reproducing_mopoe_seed_3
|
asenella
| 2023-05-12T14:01:48Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-12T14:01:38Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
joinpin/pevita
|
joinpin
| 2023-05-12T14:01:29Z | 0 | 1 | null |
[
"region:us"
] | null | 2023-05-12T13:56:47Z |
|
Amalq/roberta-large-schizophrenia-v2
|
Amalq
| 2023-05-12T13:45:51Z | 4 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"roberta",
"fill-mask",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2023-05-12T12:54:45Z |
---
license: mit
tags:
- generated_from_trainer
model-index:
- name: roberta-large-schizophrenia-v2
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# roberta-large-schizophrenia-v2
This model is a fine-tuned version of [roberta-large](https://huggingface.co/roberta-large) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 1.5239
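A minimal inference sketch, assuming the standard transformers fill-mask pipeline (the example sentence is illustrative):
```python
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="Amalq/roberta-large-schizophrenia-v2")
print(fill_mask("Patients often report <mask> symptoms."))  # RoBERTa uses <mask>
```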
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 248 | 1.5340 |
| No log | 2.0 | 496 | 1.5273 |
| 1.6401 | 3.0 | 744 | 1.5209 |
| 1.6401 | 4.0 | 992 | 1.5218 |
| 1.5704 | 5.0 | 1240 | 1.5167 |
| 1.5704 | 6.0 | 1488 | 1.5245 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
jvasilak/IDS-project
|
jvasilak
| 2023-05-12T13:38:55Z | 0 | 0 | null |
[
"region:us"
] | null | 2023-05-12T13:09:42Z |
This is the repository containing the final model needed to run our final project submission. The file destinations_lstm.h5 must be downloaded and then moved to the data folder in our code repository for the code to run properly.
|
RabotaRu/HRBert-mini
|
RabotaRu
| 2023-05-12T13:33:31Z | 14 | 4 |
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"roberta",
"fill-mask",
"russian",
"pretraining",
"embeddings",
"masked-lm",
"ru",
"en",
"be",
"bg",
"uk",
"ro",
"kz",
"tg",
"tat",
"sv",
"sl",
"sr",
"uz",
"es",
"fi",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
fill-mask
| 2022-03-02T23:29:04Z |
---
language: ["ru", "en", "be", "bg", "uk", "ro", "kz", "tg", "tat", "sv", "sl", "sr", "uz", "es", "fi"]
tags:
- russian
- fill-mask
- pretraining
- embeddings
- masked-lm
license: mit
widget:
- text: "<mask> на склад"
---
!!!
At the moment only a distilled version, taken from one of the first checkpoints, is available for download.
We plan to publish the full model in the next few days.
!!!
This is a distilled HRBert model for the masked language modeling (MLM) task.
Masked tokens can be predicted as follows:
```python
# pip install transformers
from transformers import pipeline
fill_mask = pipeline(
"fill-mask",
model='RabotaRu/HRBert-mini',
tokenizer='RabotaRu/HRBert-mini'
)
fill_mask('<mask> на склад')
```
|
LarryAIDraw/asuna1-000004
|
LarryAIDraw
| 2023-05-12T13:26:00Z | 0 | 1 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-12T13:18:28Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/63637/ichinose-asuna-blue-archive-or-character-lora-1630
|
LarryAIDraw/yami-v1
|
LarryAIDraw
| 2023-05-12T13:23:29Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-12T13:14:31Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/63759/yami-3-in-one-to-love-ru-darkness-tolove
|
LarryAIDraw/Prince_of_Wales_V1
|
LarryAIDraw
| 2023-05-12T13:23:00Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-12T13:13:39Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/63830/prince-of-wales-azur-lane
|
joinpin/arielt
|
joinpin
| 2023-05-12T13:19:57Z | 0 | 1 | null |
[
"region:us"
] | null | 2023-05-12T13:19:00Z |
|
AliCampbellKhaya/poca-SoccerTwos
|
AliCampbellKhaya
| 2023-05-12T13:16:44Z | 25 | 0 |
ml-agents
|
[
"ml-agents",
"tensorboard",
"onnx",
"unity-ml-agents",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-SoccerTwos",
"region:us"
] |
reinforcement-learning
| 2023-05-12T13:16:33Z |
---
tags:
- unity-ml-agents
- ml-agents
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-SoccerTwos
library_name: ml-agents
---
# **poca** Agent playing **SoccerTwos**
This is a trained model of a **poca** agent playing **SoccerTwos** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial on training your first agent with ML-Agents and publishing it to the Hub.
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-SoccerTwos
2. Write your model_id: AliCampbellKhaya/poca-SoccerTwos
3. Select your *.nn or *.onnx file
4. Click on Watch the agent play 👀
|