| Column | Type | Range |
|---|---|---|
| modelId | string | 5–139 chars |
| author | string | 2–42 chars |
| last_modified | timestamp[us, tz=UTC] | 2020-02-15 11:33:14 – 2025-09-06 12:28:13 |
| downloads | int64 | 0 – 223M |
| likes | int64 | 0 – 11.7k |
| library_name | string | 543 distinct values |
| tags | list | 1 – 4.05k items |
| pipeline_tag | string | 55 distinct values |
| createdAt | timestamp[us, tz=UTC] | 2022-03-02 23:29:04 – 2025-09-06 12:27:52 |
| card | string | 11 – 1.01M chars |
---|---|---|---|---|---|---|---|---|---|
Ktang2k/q-FrozenLake-v1-4x4-noSlippery
|
Ktang2k
| 2023-05-13T19:18:53Z | 0 | 0 | null |
[
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T19:18:51Z |
---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: FrozenLake-v1-4x4-no_slippery
type: FrozenLake-v1-4x4-no_slippery
metrics:
- type: mean_reward
value: 1.00 +/- 0.00
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
## Usage
```python
model = load_from_hub(repo_id="Ktang2k/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
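`load_from_hub` is not a published package function; in the Deep Reinforcement Learning Course it is a small helper defined in the notebook. A minimal sketch, assuming the model was pushed as a pickled dictionary (`qtable`, `env_id`, hyperparameters):
```python
import pickle

from huggingface_hub import hf_hub_download

def load_from_hub(repo_id: str, filename: str) -> dict:
    """Download a pickled model dict (qtable, env_id, ...) from the Hub."""
    path = hf_hub_download(repo_id=repo_id, filename=filename)
    with open(path, "rb") as f:
        return pickle.load(f)
```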
|
bastienm/Taxi-v3
|
bastienm
| 2023-05-13T18:43:26Z | 0 | 0 | null |
[
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T18:43:21Z |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: Taxi-v3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.44 +/- 2.71
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
model = load_from_hub(repo_id="bastienm/Taxi-v3", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
bastienm/q-FrozenLake-v1-4x4-noSlippery
|
bastienm
| 2023-05-13T18:38:50Z | 0 | 0 | null |
[
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T18:38:46Z |
---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: FrozenLake-v1-4x4-no_slippery
type: FrozenLake-v1-4x4-no_slippery
metrics:
- type: mean_reward
value: 1.00 +/- 0.00
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
## Usage
```python
model = load_from_hub(repo_id="bastienm/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
dark844/alleymix
|
dark844
| 2023-05-13T18:37:50Z | 0 | 0 |
nemo
|
[
"nemo",
"art",
"text-to-image",
"en",
"dataset:OpenAssistant/oasst1",
"arxiv:1910.09700",
"license:openrail",
"region:us"
] |
text-to-image
| 2023-05-13T18:15:34Z |
---
license: openrail
datasets:
- OpenAssistant/oasst1
language:
- en
metrics:
- accuracy
library_name: nemo
pipeline_tag: text-to-image
tags:
- art
---
# Model Card for Model ID
<!-- Provide a quick summary of what the model is/does. -->
This modelcard aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1).
## Model Details
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Model type:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
- **Finetuned from model [optional]:** [More Information Needed]
### Model Sources [optional]
<!-- Provide the basic links for the model. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
[More Information Needed]
### Downstream Use [optional]
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
Use the code below to get started with the model.
[More Information Needed]
## Training Details
### Training Data
<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
[More Information Needed]
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing [optional]
[More Information Needed]
#### Training Hyperparameters
- **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
#### Speeds, Sizes, Times [optional]
<!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
[More Information Needed]
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
### Testing Data, Factors & Metrics
#### Testing Data
<!-- This should link to a Data Card if possible. -->
[More Information Needed]
#### Factors
<!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
[More Information Needed]
#### Metrics
<!-- These are the evaluation metrics being used, ideally with a description of why. -->
[More Information Needed]
### Results
[More Information Needed]
#### Summary
## Model Examination [optional]
<!-- Relevant interpretability work for the model goes here -->
[More Information Needed]
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
- **Hardware Type:** [More Information Needed]
- **Hours used:** [More Information Needed]
- **Cloud Provider:** [More Information Needed]
- **Compute Region:** [More Information Needed]
- **Carbon Emitted:** [More Information Needed]
## Technical Specifications [optional]
### Model Architecture and Objective
[More Information Needed]
### Compute Infrastructure
[More Information Needed]
#### Hardware
[More Information Needed]
#### Software
[More Information Needed]
## Citation [optional]
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Model Card Authors [optional]
[More Information Needed]
## Model Card Contact
[More Information Needed]
|
LarryAIDraw/HSR_Natasha4
|
LarryAIDraw
| 2023-05-13T18:28:59Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-13T18:15:55Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/64271/natashahonkai-star-rail
|
LarryAIDraw/shixiang-10
|
LarryAIDraw
| 2023-05-13T18:28:24Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-13T18:18:05Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/65131/yatogami-tohka-date-a-live
|
LarryAIDraw/Miorine-000009
|
LarryAIDraw
| 2023-05-13T18:27:27Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-13T18:16:58Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/64841/miorine-rembran-or-the-witch-from-mercury
|
LarryAIDraw/tomoe_koga-01
|
LarryAIDraw
| 2023-05-13T18:27:14Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-13T18:16:37Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/64850/koga-tomoe-from-bunny-girl-senpai
|
Tribbiani/robin-7b-v2
|
Tribbiani
| 2023-05-13T18:27:07Z | 8 | 3 |
transformers
|
[
"transformers",
"pytorch",
"llama",
"text-generation",
"generated_from_trainer",
"dataset:customized",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-05-13T17:47:37Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- customized
model-index:
- name: h34
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# h34
This model is a fine-tuned version of [pinkmanlove/llama-7b-hf](https://huggingface.co/pinkmanlove/llama-7b-hf) on the customized dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 8
- total_train_batch_size: 64
- total_eval_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 1.0
### Training results
### Framework versions
- Transformers 4.28.0.dev0
- Pytorch 2.0.0+cu117
- Datasets 2.10.1
- Tokenizers 0.13.3
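The card ships no usage snippet; for reference, a minimal generation sketch with this checkpoint. The prompt template, `device_map="auto"` (which requires `accelerate`), and the generation settings are assumptions, not part of the original card:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Tribbiani/robin-7b-v2")
model = AutoModelForCausalLM.from_pretrained("Tribbiani/robin-7b-v2", device_map="auto")

# Prompt format is an assumption; instruction-tuned LLaMA variants often
# use a Human/Assistant template.
prompt = "###Human: What is reinforcement learning?###Assistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```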
|
LarryAIDraw/SwiftsureMaidBikiniV1
|
LarryAIDraw
| 2023-05-13T18:26:55Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-13T18:16:14Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/64385/swiftsure-azur-lane-midsummer-special-service-swimsuit
|
kuntalbhowmick/Reinforce-cartpole
|
kuntalbhowmick
| 2023-05-13T18:26:24Z | 0 | 0 | null |
[
"CartPole-v1",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T18:26:19Z |
---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-cartpole
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 77.60 +/- 5.28
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1**.
To learn to use this model and train your own, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
LarryAIDraw/nodoka-01
|
LarryAIDraw
| 2023-05-13T18:26:21Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-13T18:15:34Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/64161/toyohama-nodoka-from-bunny-girl-senpai
|
LarryAIDraw/kirika_towa_alma_v1
|
LarryAIDraw
| 2023-05-13T18:26:08Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-13T18:15:14Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/64069/kirika-towa-alma-shining-resonance
|
LarryAIDraw/tenjouin_asuka_v1
|
LarryAIDraw
| 2023-05-13T18:25:56Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-13T18:14:54Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/64078/tenjouin-asuka-yu-gi-oh
|
guserrl/cartas
|
guserrl
| 2023-05-13T18:13:56Z | 0 | 0 |
fastai
|
[
"fastai",
"region:us"
] | null | 2023-05-13T17:53:59Z |
---
tags:
- fastai
---
# Amazing!
🥳 Congratulations on hosting your fastai model on the Hugging Face Hub!
# Some next steps
1. Fill out this model card with more information (see the template below and the [documentation here](https://huggingface.co/docs/hub/model-repos))!
2. Create a demo in Gradio or Streamlit using 🤗 Spaces ([documentation here](https://huggingface.co/docs/hub/spaces)).
3. Join the fastai community on the [Fastai Discord](https://discord.com/invite/YKrxeNn)!
Greetings fellow fastlearner 🤝! Don't forget to delete this content from your model card.
---
# Model card
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
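For loading the exported learner, `huggingface_hub` provides a fastai helper; a minimal sketch:
```python
from huggingface_hub import from_pretrained_fastai

# Download and reconstruct the exported fastai learner from this repo.
learner = from_pretrained_fastai("guserrl/cartas")
```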
|
parallelq/q-Taxi-v3
|
parallelq
| 2023-05-13T17:51:55Z | 0 | 0 | null |
[
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T17:43:46Z |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-Taxi-v3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.56 +/- 2.71
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
model = load_from_hub(repo_id="parallelq/q-Taxi-v3", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
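For completeness, a sketch of one greedy evaluation episode with the loaded Q-table. It assumes `model["qtable"]` is an `(n_states, n_actions)` array, as in the Deep RL course, and the classic gym API; newer gym/gymnasium versions return `(obs, info)` from `reset()` and a 5-tuple from `step()`:
```python
import numpy as np

# Roll out one greedy episode with the loaded Q-table.
state = env.reset()
done = False
episode_return = 0.0
while not done:
    action = int(np.argmax(model["qtable"][state]))  # greedy action
    state, reward, done, info = env.step(action)
    episode_return += reward
print("Episode return:", episode_return)
```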
|
nergaldarski/KojiV2
|
nergaldarski
| 2023-05-13T17:46:46Z | 0 | 1 | null |
[
"region:us"
] | null | 2023-05-13T17:31:01Z |
CivitAI: https://civitai.com/models/41916/koji
|
messerb5467/ppo-Huggy
|
messerb5467
| 2023-05-13T17:34:14Z | 13 | 0 |
ml-agents
|
[
"ml-agents",
"tensorboard",
"onnx",
"Huggy",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Huggy",
"region:us"
] |
reinforcement-learning
| 2023-05-13T17:34:08Z |
---
library_name: ml-agents
tags:
- Huggy
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
Documentation: https://github.com/huggingface/ml-agents#get-started
We also wrote a complete tutorial on training your first agent with ML-Agents and publishing it to the Hub.
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-Huggy
2. Find your model_id: messerb5467/ppo-Huggy
3. Select your *.nn or *.onnx file
4. Click on Watch the agent play 👀
|
nergaldarski/hassakuV1.2
|
nergaldarski
| 2023-05-13T17:26:56Z | 0 | 1 | null |
[
"region:us"
] | null | 2023-05-13T17:11:16Z |
CivitAI: https://civitai.com/models/2583?modelVersionId=62528
|
fgiauna/peft-adapter-jul
|
fgiauna
| 2023-05-13T17:17:07Z | 0 | 0 | null |
[
"tensorboard",
"generated_from_trainer",
"license:mit",
"region:us"
] | null | 2023-05-13T17:11:11Z |
---
license: mit
tags:
- generated_from_trainer
model-index:
- name: peft-adapter-jul
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# peft-adapter-jul
This model is a fine-tuned version of [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0858
- Loc: {'precision': 0.6808510638297872, 'recall': 0.7407407407407407, 'f1': 0.7095343680709535, 'number': 216}
- Misc: {'precision': 0.5416666666666666, 'recall': 0.325, 'f1': 0.40624999999999994, 'number': 40}
- Org: {'precision': 0.75, 'recall': 0.81, 'f1': 0.7788461538461539, 'number': 200}
- Per: {'precision': 0.7989130434782609, 'recall': 0.75, 'f1': 0.7736842105263159, 'number': 196}
- Overall Precision: 0.7314
- Overall Recall: 0.7393
- Overall F1: 0.7353
- Overall Accuracy: 0.9799
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
### Framework versions
- Transformers 4.26.1
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
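The card does not include usage code. Since the repo name suggests it holds a PEFT adapter for the base NER model, a loading sketch (the adapter layout is an assumption):
```python
from peft import PeftModel
from transformers import AutoModelForTokenClassification, AutoTokenizer

# Load the base NER model, then attach the adapter from this repo
# (assumes the repo stores a PEFT adapter, as the name suggests).
base = AutoModelForTokenClassification.from_pretrained("Jean-Baptiste/camembert-ner")
model = PeftModel.from_pretrained(base, "fgiauna/peft-adapter-jul")
tokenizer = AutoTokenizer.from_pretrained("Jean-Baptiste/camembert-ner")
```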
|
vitouphy/wav2vec2-xls-r-300m-phoneme
|
vitouphy
| 2023-05-13T17:04:45Z | 60,319 | 3 |
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-05-19T03:03:57Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: wav2vec2-xls-r-300m-phoneme
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# wav2vec2-xls-r-300m-phoneme
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3327
- Cer: 0.1332
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 2000
- training_steps: 7000
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Cer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 3.4324 | 1.32 | 1000 | 3.3693 | 0.9091 |
| 2.1751 | 2.65 | 2000 | 1.1382 | 0.2397 |
| 1.3986 | 3.97 | 3000 | 0.4886 | 0.1452 |
| 1.2285 | 5.3 | 4000 | 0.3842 | 0.1351 |
| 1.142 | 6.62 | 5000 | 0.3505 | 0.1349 |
| 1.1075 | 7.95 | 6000 | 0.3323 | 0.1317 |
| 1.0867 | 9.27 | 7000 | 0.3265 | 0.1315 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
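The card ships no inference snippet; a sketch of CTC phoneme transcription with this checkpoint, assuming `audio_array` is a 16 kHz mono waveform you have already loaded:
```python
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("vitouphy/wav2vec2-xls-r-300m-phoneme")
model = Wav2Vec2ForCTC.from_pretrained("vitouphy/wav2vec2-xls-r-300m-phoneme")

# audio_array: a 1-D float waveform sampled at 16 kHz (load it however you like).
inputs = processor(audio_array, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits
pred_ids = logits.argmax(dim=-1)
print(processor.batch_decode(pred_ids))
```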
|
vitouphy/wav2vec2-xls-r-300m-english
|
vitouphy
| 2023-05-13T17:04:05Z | 94 | 3 |
transformers
|
[
"transformers",
"pytorch",
"safetensors",
"wav2vec2",
"automatic-speech-recognition",
"en",
"generated_from_trainer",
"hf-asr-leaderboard",
"librispeech_asr",
"robust-speech-event",
"dataset:librispeech_asr",
"license:apache-2.0",
"model-index",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2022-03-02T23:29:05Z |
---
language:
- en
license: apache-2.0
tags:
- automatic-speech-recognition
- en
- generated_from_trainer
- hf-asr-leaderboard
- librispeech_asr
- robust-speech-event
datasets:
- librispeech_asr
model-index:
- name: XLS-R-300M - English
results:
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: LibriSpeech (clean)
type: librispeech_asr
config: clean
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 12.29
- name: Test CER
type: cer
value: 3.34
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Robust Speech Event - Dev Data
type: speech-recognition-community-v2/dev_data
args: en
metrics:
- name: Validation WER
type: wer
value: 36.75
- name: Validation CER
type: cer
value: 14.83
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Common Voice 8.0
type: mozilla-foundation/common_voice_8_0
config: en
split: test
args:
language: en
metrics:
- name: Test WER
type: wer
value: 37.81
- task:
name: Automatic Speech Recognition
type: automatic-speech-recognition
dataset:
name: Robust Speech Event - Test Data
type: speech-recognition-community-v2/eval_data
args: en
metrics:
- name: Test WER
type: wer
value: 38.8
---
# XLS-R-300M - English
This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the librispeech_asr dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1444
- Wer: 0.1167
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 50
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 2.9365 | 4.17 | 500 | 2.9398 | 0.9999 |
| 1.5444 | 8.33 | 1000 | 0.5947 | 0.4289 |
| 1.1367 | 12.5 | 1500 | 0.2751 | 0.2366 |
| 0.9972 | 16.66 | 2000 | 0.2032 | 0.1797 |
| 0.9118 | 20.83 | 2500 | 0.1786 | 0.1479 |
| 0.8664 | 24.99 | 3000 | 0.1641 | 0.1408 |
| 0.8251 | 29.17 | 3500 | 0.1537 | 0.1267 |
| 0.793 | 33.33 | 4000 | 0.1525 | 0.1244 |
| 0.785 | 37.5 | 4500 | 0.1470 | 0.1184 |
| 0.7612 | 41.66 | 5000 | 0.1446 | 0.1177 |
| 0.7478 | 45.83 | 5500 | 0.1449 | 0.1176 |
| 0.7443 | 49.99 | 6000 | 0.1444 | 0.1167 |
### Framework versions
- Transformers 4.17.0.dev0
- Pytorch 1.10.2+cu102
- Datasets 1.18.2.dev0
- Tokenizers 0.11.0
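As with the phoneme model above, a one-line pipeline sketch; `sample.wav` is a placeholder path, and audio decoding requires ffmpeg to be installed:
```python
from transformers import pipeline

asr = pipeline("automatic-speech-recognition", model="vitouphy/wav2vec2-xls-r-300m-english")
print(asr("sample.wav"))  # placeholder path to a 16 kHz speech recording
```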
|
Felix555/poca-SoccerTwos
|
Felix555
| 2023-05-13T16:53:08Z | 0 | 0 |
ml-agents
|
[
"ml-agents",
"onnx",
"ML-Agents-SoccerTwos",
"reinforcement-learning",
"region:us"
] |
reinforcement-learning
| 2023-05-13T00:26:40Z |
---
task: reinforcement-learning
library_name: ml-agents
tags:
- ML-Agents-SoccerTwos
- reinforcement-learning
---
|
akaneshiro/q-Taxi-v3
|
akaneshiro
| 2023-05-13T16:46:31Z | 0 | 0 | null |
[
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T16:46:29Z |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-Taxi-v3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.56 +/- 2.71
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
model = load_from_hub(repo_id="akaneshiro/q-Taxi-v3", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
Mikepool117/dqn-SpaceInvadersNoFrameskip-v4
|
Mikepool117
| 2023-05-13T16:44:33Z | 0 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"SpaceInvadersNoFrameskip-v4",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T16:43:55Z |
---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 592.50 +/- 98.37
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga Mikepool117 -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do:
```
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga Mikepool117 -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga Mikepool117
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1000000.0),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
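Outside the RL Zoo CLI, the checkpoint can also be loaded directly with SB3. A minimal sketch, assuming the usual RL Zoo filename convention `<algo>-<env>.zip`:
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import DQN

# Download the checkpoint; the filename follows the usual RL Zoo
# convention <algo>-<env>.zip (an assumption about this repo's contents).
checkpoint = load_from_hub(
    repo_id="Mikepool117/dqn-SpaceInvadersNoFrameskip-v4",
    filename="dqn-SpaceInvadersNoFrameskip-v4.zip",
)
model = DQN.load(checkpoint)
```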
|
akaneshiro/q-FrozenLake-v1-4x4-noSlippery
|
akaneshiro
| 2023-05-13T16:42:42Z | 0 | 0 | null |
[
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T16:42:39Z |
---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: FrozenLake-v1-4x4-no_slippery
type: FrozenLake-v1-4x4-no_slippery
metrics:
- type: mean_reward
value: 1.00 +/- 0.00
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
## Usage
```python
model = load_from_hub(repo_id="akaneshiro/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
messerb5467/ppo-LunarLander-v2
|
messerb5467
| 2023-05-13T16:30:17Z | 4 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T16:29:57Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 264.83 +/- 18.72
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch, assuming the checkpoint follows the usual huggingface_sb3 naming (`ppo-LunarLander-v2.zip`):
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

# Filename is an assumption based on the usual huggingface_sb3 convention.
checkpoint = load_from_hub(repo_id="messerb5467/ppo-LunarLander-v2", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
|
Theiss/Unit1
|
Theiss
| 2023-05-13T16:20:11Z | 11 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-08T06:57:23Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 273.88 +/- 21.43
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the checkpoint filename inside this repo is an assumption; replace it with the actual `.zip` name):
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

# Filename is an assumption; check the repo's file list for the actual name.
checkpoint = load_from_hub(repo_id="Theiss/Unit1", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
|
LarryAIDraw/shiina_mashiro_v1
|
LarryAIDraw
| 2023-05-13T16:18:53Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-04-28T06:14:19Z |
---
license: creativeml-openrail-m
---
https://civitai.com/models/50851?modelVersionId=55367
|
fooosd/Deltalines
|
fooosd
| 2023-05-13T16:11:17Z | 0 | 0 | null |
[
"region:us"
] | null | 2023-05-13T16:08:38Z |
[.DeLTa~@airlines.] Contact X Number X (+1 888-721-0122) FLIGHT chaNGe department number
Delta Airlines Phone Number for Reservations {1(888) »721»0122}꧂.
Delta Airlines Booking Online {1(888) »721»0122}꧂ Available at Lowest Airlines fares Once passengers have completed the steps for the booking process, they can also go through the web check-in process and book their preferred seats in the flight. If you want to book a flight with Delta Airlines, the best way to do so is through the Airlines's online reservations system. You will need the reservation number, which is {1(888) »721»0122}꧂. The majority of reservations and tickets for Delta Airlines contain this number.
To book their seats on the flight, passengers must follow the steps that are listed below: Go to the Airlines's website; enter the travel destinations; enter the passenger's information; select Delta Airlines from the list that appears. The best part is that Delta Airlines tickets can be purchased online at the lowest prices, making it easier for them to choose and save money for their work. What is Delta Airlines Reservations Number? You can book flights, inns, vehicle rentals, and other travel-related administrations with the Reservations Number given by Delta Carriers. On Delta Airlines crafts' site, in its call communities, and at most air terminal ticket workplaces, you can see as the - Reservations Number. The - Reservations Number can likewise be found on Delta Airlines crafts' application and on tickets. To reserve a spot for a flight, lodging, or rental vehicle, you really want the Reservations Number. Furthermore, to make a buy from the web-based store worked by Delta Carriers, you will require the Airlines crafts Reservations Number. You can still make a reservation by calling Delta Airlines customer service line even if you do not possess the Reservations Number. Reservations can be handled by most Delta Airlines employees. How to Use the Delta Airlines Reservations Number When you book a flight with Delta Airlines, you must use the Reservations Number. This number can only be used with Delta Airlines, not with any other Airlines. You will first need to create an account with Delta Airlines before you can use the Reservations Number. You can use the Reservations Number to make reservations after creating your account. If you already have a reservation for a flight with another Airlines and want to switch to Delta Airlines, you will need to use the Reservations Number to make a new reservation with Delta Airlines. What are the advantages of using Delta Airlines Reservations Number? Customers who wish to make reservations online can do so by calling the Reservations Number provided by Delta Airlines. You can make reservations using the Reservations Number without entering your previous phone number. Reservations for Delta Airlines flights and reservations for flights: {1(888) »721»0122}꧂ The Delta Airlines are available to passengers to enhance and facilitate their travel experience. Travelers can snatch the most ideal arrangements that anyone could hope to find for the aircrafts by going on its site and entering the subtleties to get the most ideal offers that anyone could hope to find for them. With us, they can look at a variety of flight options and take advantage of attractive offers and the best deals offered by Delta Airlines. Reservations and online ticket booking for Delta Airlines can be made by calling {1(888) »721»0122}꧂. Because the Airlines's flights are available at various times, it is convenient for customers to book and travel with Delta Airlines. The process for purchasing a ticket, the policy regarding refunds and cancellations, as well as the baggage policy, are all extremely user-friendly for passengers, making it ideal for them to select Airlines and travel internationally. The Delta Airlines online ticket booking and reservation system was created in such a way that passengers can take advantage of the best deals offered by the Airlines and have their work done quickly and easily. They can also get help from experts about Airlines offers and policies there, which will improve their travel significantly. 
Since the Airlines offer a variety of amenities, it is a popular choice for passengers. The ability to book Delta Airlines flights online at the lowest airfare price makes it a very useful option for passengers to choose from, travel frequently, and experience the services of the Airlines. These features include high-quality services at low prices, inexpensive airfare, a speedy booking mechanism, a variety of flights available for a variety of routes, and a simple cancellation and refund policy that provides excellent customer support. Find various deals and book the ticket online: Passengers can book tickets for Delta Airlines online or from the Delta States, where they can find a variety of features and deals for the Airlines. They can also visit the Delta Airlines website, where they can find a variety of deals and book the ticket online. After providing their destination and personal information, their passengers are able to select from a variety of flight deals for their route. Passengers can easily access the website and select the help section if they encounter any difficulties during the booking process. They can even send an email to the experts in customer service to find a solution to their problems there. Passengers can call the support specialists, who will guarantee a resolution to their concerns. Delta Airlines Airlines is a discounted Airlines that operates both scheduled and charter flights at the appropriate times. It ranks ninth among the largest US commercial Airlines and is recognized as a major carrier. Delta Airlines States has its headquarters in Las Vegas, Nevada, and provides a variety of flights that cater to passengers in every way. If you want to book your flight ticket online, you must first gather accurate information about the baggage policy, manage your booking, check in, and cancellation policies, among other things. You will be able to travel for a longer period of time in a more pleasurable and comfortable manner once you have access to additional information regarding the flight. If not, are we aware of the information? If so, please be the first to know. Delta Airlines carry-on baggage: You are not permitted to charge for personal possessions. Your personal item must fit under the seat in front of you and be the same size as a carry-on. In business class, you can carry your laptop in the personal item bag and an important bag. You need to charge a fair price for any additional baggage. You can bring one important bag with you on Delta Airlines, but it must be the right size, weight, and length. As a result, you need to pay a fee. Baggage Checked: Checking your baggage with dimensions for first and business class is critical. It has three dimensions, such as length plus width and height plus. Each piece's linear dimension must not exceed 62 inches, so you must choose one. It should come as no surprise that each bag must not weigh more than 23 kilograms. Therefore, you are required to pack your bags in accordance with the policy. Policy regarding musical instruments on Delta Airlines: You are permitted to purchase an additional seat that secures your large instrument if you intend to bring a musical instrument with you. Therefore, you may choose to bring this kind of item aboard. Dimensional factors include weight, length, and size. Therefore, everything falls under the managed booking's baggage policy. 
Policy for checking in with Delta Airlines: Check-in Delta a crucial role because it provides all of the necessary services for the flight and the best flight service possible. It offers flight service for airport check-in and online check-in. You must investigate these two services in order to learn more about them. Get the Airline sport Check-in service: To confirm your check-in, all you need to do is visit the ticket counter at the airport. You must select flight confirmation, confirmed seat selection, and instant reservation service during this process. After you check in, a Delta Airlines representative at the airport will give you a boarding pass. Get the online check-in service: Go to the booking website and use the correct email address and password by clicking the log in button. After confirming the flight ticket, select online check-in and each option. After completing your online check-in successfully, your boarding pass will be automatically sent to the agent. Policy for Delta Airlines Cancellations: Airlines Delta Airlines Airlines has the best instant cancellation policy. After booking a flight right away, you need to be aware of the best way to handle a flight cancellation. Therefore, you have the option to cancel your reservation, but you will undoubtedly be required to pay the full amount. If you bought a ticket that can't be changed, you might have to pay more, but if you bought a ticket that can be changed, you won't have to pay anything at all. As a result, you must review the Delta Airlines Airlines Cancellation Policy in accordance with it. What is Delta Airlines 24-hour cancellation policy for flights? In addition, the cancellation policy can be selected within 24 hours. When passengers purchase a flight ticket and attempt to cancel it within 24 hours, it might be a new rule. You are allowed to receive the refund directly into your bank account and are not required to pay any additional fees under this rule. Contact our customer service representative right away if you need any assistance with flight-related services. For those with hectic schedules, the Reservations Number {1(850) »952»8866}꧂ is especially helpful. You don't have to worry about remembering your initial phone number or going to a different location to make a reservation because you can do so online. You can also make reservations far in advance of your travel dates using the Reservations Number. People who want to avoid long lines at the ticket counter or airport will appreciate this. The Reservations Number can be used in a number of different ways. It allows you to purchase tickets on Delta Airlines, reserve seats on flights, and make reservations online.
Conclusion:
Know Delta Airlines 1-888»721»0122reservation number if you want to book a flight at {1(888) »721»0122}꧂. This is the phone number you'll need to make your reservation, and you can also call it if you have any other questions or concerns about your trip. Don't be afraid to call Delta Airlines at {1(888) »721»0122}꧂ if you're having trouble planning your vacation because the Airlines has a reputation for being one of the most dependable when it comes to customer service.
|
walter2/imdb_model2
|
walter2
| 2023-05-13T16:10:17Z | 59 | 0 |
transformers
|
[
"transformers",
"tf",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-05-13T16:09:15Z |
---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: imdb_model2
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# imdb_model2
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.0909
- Validation Loss: 0.0370
- Train Accuracy: 0.992
- Epoch: 2
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 625, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 0.4551 | 0.2036 | 0.924 | 0 |
| 0.1960 | 0.1091 | 0.976 | 1 |
| 0.0909 | 0.0370 | 0.992 | 2 |
### Framework versions
- Transformers 4.29.1
- TensorFlow 2.12.0
- Datasets 2.12.0
- Tokenizers 0.13.3
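The card has no usage snippet; since the checkpoint was trained with Keras/TensorFlow, a pipeline sketch with `framework="tf"` (the example sentence is illustrative):
```python
from transformers import pipeline

classifier = pipeline("text-classification", model="walter2/imdb_model2", framework="tf")
print(classifier("A surprisingly touching film with terrific performances."))
```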
|
Staticaliza/TestModel
|
Staticaliza
| 2023-05-13T15:55:56Z | 0 | 0 | null |
[
"aa",
"dataset:databricks/databricks-dolly-15k",
"license:openrail",
"region:us"
] | null | 2023-05-13T15:50:06Z |
---
license: openrail
datasets:
- databricks/databricks-dolly-15k
language:
- aa
---
|
ReadyP1/dqn-SpaceInvadersNoFrameskip-v4
|
ReadyP1
| 2023-05-13T15:51:22Z | 8 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"SpaceInvadersNoFrameskip-v4",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-02T14:47:58Z |
---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: SpaceInvadersNoFrameskip-v4
type: SpaceInvadersNoFrameskip-v4
metrics:
- type: mean_reward
value: 580.00 +/- 161.71
name: mean_reward
verified: false
---
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**
This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3)
and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).
The RL Zoo is a training framework for Stable Baselines3
reinforcement learning agents,
with hyperparameter optimization and pre-trained agents included.
## Usage (with SB3 RL Zoo)
RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib
Install the RL Zoo (with SB3 and SB3-Contrib):
```bash
pip install rl_zoo3
```
```
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga ReadyP1 -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do:
```
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga ReadyP1 -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```
## Training (with the RL Zoo)
```
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga ReadyP1
```
## Hyperparameters
```python
OrderedDict([('batch_size', 32),
('buffer_size', 100000),
('env_wrapper',
['stable_baselines3.common.atari_wrappers.AtariWrapper']),
('exploration_final_eps', 0.01),
('exploration_fraction', 0.1),
('frame_stack', 4),
('gradient_steps', 1),
('learning_rate', 0.0001),
('learning_starts', 100000),
('n_timesteps', 1000000.0),
('optimize_memory_usage', False),
('policy', 'CnnPolicy'),
('target_update_interval', 1000),
('train_freq', 4),
('normalize', False)])
```
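To run the loaded agent outside the zoo scripts, the training-time Atari preprocessing (AtariWrapper plus a 4-frame stack, per the hyperparameters above) has to be reproduced. A sketch, assuming the usual RL Zoo checkpoint filename:
```python
from huggingface_sb3 import load_from_hub
from stable_baselines3 import DQN
from stable_baselines3.common.env_util import make_atari_env
from stable_baselines3.common.vec_env import VecFrameStack

# Rebuild the training-time observation pipeline: AtariWrapper + 4-frame stack.
env = VecFrameStack(make_atari_env("SpaceInvadersNoFrameskip-v4", n_envs=1), n_stack=4)

checkpoint = load_from_hub(
    repo_id="ReadyP1/dqn-SpaceInvadersNoFrameskip-v4",
    filename="dqn-SpaceInvadersNoFrameskip-v4.zip",  # assumed RL Zoo naming
)
model = DQN.load(checkpoint)

obs = env.reset()
for _ in range(1000):
    action, _ = model.predict(obs, deterministic=True)
    obs, reward, done, info = env.step(action)
```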
|
LukeMich/my_awesome_model
|
LukeMich
| 2023-05-13T15:38:44Z | 60 | 0 |
transformers
|
[
"transformers",
"tf",
"distilbert",
"text-classification",
"generated_from_keras_callback",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-05-13T15:03:47Z |
---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: LukeMich/my_awesome_model
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# LukeMich/my_awesome_model
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.7595
- Validation Loss: 0.5417
- Train Accuracy: 0.8762
- Epoch: 1
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': True, 'is_legacy_optimizer': False, 'learning_rate': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 275, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
- training_precision: float32
### Training results
| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 1.0643 | 0.8268 | 0.6571 | 0 |
| 0.7595 | 0.5417 | 0.8762 | 1 |
### Framework versions
- Transformers 4.29.1
- TensorFlow 2.12.0
- Datasets 2.12.0
- Tokenizers 0.13.3
|
divers/e2e-flan-large-noscore
|
divers
| 2023-05-13T15:20:19Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"t5",
"text2text-generation",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-04-28T12:06:53Z |
| Epoch | Training Loss | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-----:|:-------------:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 45 | 0.354900 | 0.269528 | 0.576800 | 0.401800 | 0.459100 | 0.568700 | 477.000000 |
| 46 | 0.336500 | 0.266415 | 0.589300 | 0.407500 | 0.462300 | 0.578900 | 531.588200 |
| 47 | 0.330200 | 0.264836 | 0.527300 | 0.359300 | 0.419800 | 0.519300 | 602.764700 |
| 49 | 0.326100 | 0.263734 | 0.562700 | 0.399900 | 0.458800 | 0.556100 | 562.941200 |
| 49 | 0.326700 | 0.262347 | 0.585700 | 0.413500 | 0.474000 | 0.577500 | 545.882400 |
| 50 | 0.319400 | 0.261993 | 0.581300 | 0.411200 | 0.471100 | 0.570000 | 555.941200 |
| 51 | 0.314700 | 0.261653 | 0.586700 | 0.415300 | 0.473400 | 0.578600 | 543.529400 |
| 53 | 0.311900 | 0.261087 | 0.617900 | 0.433900 | 0.489800 | 0.608700 | 525.647100 |
| 53 | 0.315600 | 0.260690 | 0.616000 | 0.434500 | 0.492100 | 0.606800 | 529.588200 |
| 54 | 0.315400 | 0.260348 | 0.615600 | 0.432100 | 0.489000 | 0.605500 | 535.411800 |
| 55 | 0.307700 | 0.259774 | 0.622600 | 0.443800 | 0.497700 | 0.613200 | 529.647100 |
| 56 | 0.306800 | 0.259790 | 0.620900 | 0.438100 | 0.493000 | 0.610600 | 515.058800 |
| 58 | 0.305700 | 0.259624 | 0.618500 | 0.434000 | 0.489200 | 0.607800 | 506.352900 |
| 58 | 0.310700 | 0.259519 | 0.618800 | 0.434700 | 0.489100 | 0.608000 | 506.529400 |
| 59 | 0.300900 | 0.259501 | 0.618800 | 0.434700 | 0.489100 | 0.608000 | 506.529400 |
|
divers/e2e-flan-large-noscore-totalds
|
divers
| 2023-05-13T15:10:50Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"t5",
"text2text-generation",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-05-02T05:34:46Z |
| Epoch | Training Loss | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-----:|:-------------:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 0 | 0.630300 | 0.412157 | 0.417600 | 0.263800 | 0.332800 | 0.406200 | 794.000000 |
| 1 | 0.445600 | 0.371808 | 0.516700 | 0.336200 | 0.415500 | 0.508000 | 560.642900 |
| 2 | 0.398800 | 0.350914 | 0.562700 | 0.375400 | 0.443900 | 0.552700 | 523.714300 |
| 4 | 0.350600 | 0.334888 | 0.553300 | 0.364900 | 0.427100 | 0.538800 | 464.035700 |
| 5 | 0.334300 | 0.326556 | 0.552100 | 0.361400 | 0.429900 | 0.540300 | 517.821400 |
| 6 | 0.322300 | 0.321693 | 0.596600 | 0.400800 | 0.469400 | 0.586400 | 414.892900 |
| 8 | 0.308800 | 0.321562 | 0.594200 | 0.389100 | 0.458500 | 0.581800 | 401.357100 |
| 8 | 0.300100 | 0.319800 | 0.586200 | 0.376100 | 0.453400 | 0.571500 | 381.357100 |
| 9 | 0.291200 | 0.319443 | 0.611500 | 0.399600 | 0.468600 | 0.597500 | 368.821400 |
| 10 | 0.282900 | 0.318927 | 0.593200 | 0.388700 | 0.459100 | 0.579800 | 354.285700 |
| 12 | 0.273700 | 0.319651 | 0.594000 | 0.394200 | 0.457000 | 0.580800 | 386.785700 |
| 12 | 0.268100 | 0.315178 | 0.603700 | 0.396100 | 0.465300 | 0.588500 | 365.714300 |
| 13 | 0.262000 | 0.312819 | 0.601500 | 0.402800 | 0.471700 | 0.586000 | 377.250000 |
| 14 | 0.254900 | 0.316255 | 0.601200 | 0.397600 | 0.469700 | 0.587900 | 353.071400 |
| 16 | 0.248500 | 0.316413 | 0.610300 | 0.407900 | 0.476000 | 0.597400 | 341.464300 |
| 16 | 0.243600 | 0.315982 | 0.611400 | 0.404900 | 0.483200 | 0.598300 | 379.571400 |
| 17 | 0.238900 | 0.318108 | 0.608100 | 0.408200 | 0.486100 | 0.594000 | 375.964300 |
| 18 | 0.233900 | 0.317792 | 0.600200 | 0.406300 | 0.471700 | 0.587600 | 346.964300 |
| 19 | 0.229600 | 0.322435 | 0.599100 | 0.407100 | 0.479600 | 0.586600 | 362.571400 |
|
divers/flan-base-req-extractor
|
divers
| 2023-05-13T15:01:01Z | 3 | 0 |
transformers
|
[
"transformers",
"pytorch",
"t5",
"text2text-generation",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-05-05T20:23:32Z |
html'''
<table>
<tr>
<th>Epoch</th>
<th>Training Loss</th>
<th>Validation Loss</th>
<th>Rouge1</th>
<th>Rouge2</th>
<th>Rougel</th>
<th>Rougelsum</th>
<th>Gen Len</th>
</tr>
<tr>
<td>0</td>
<td>0.357300</td>
<td>0.280200</td>
<td>0.732700</td>
<td>0.685700</td>
<td>0.695100</td>
<td>0.700500</td>
<td>303.733300</td>
</tr>
<tr>
<td>2</td>
<td>0.257200</td>
<td>0.244938</td>
<td>0.742900</td>
<td>0.702100</td>
<td>0.712600</td>
<td>0.717700</td>
<td>330.200000</td>
</tr>
<tr>
<td>2</td>
<td>0.229900</td>
<td>0.230673</td>
<td>0.789800</td>
<td>0.747500</td>
<td>0.759500</td>
<td>0.765300</td>
<td>267.666700</td>
</tr>
<tr>
<td>4</td>
<td>0.209900</td>
<td>0.213156</td>
<td>0.800300</td>
<td>0.759900</td>
<td>0.766400</td>
<td>0.771700</td>
<td>274.466700</td>
</tr>
<tr>
<td>4</td>
<td>0.196200</td>
<td>0.207821</td>
<td>0.782800</td>
<td>0.745000</td>
<td>0.754900</td>
<td>0.756200</td>
<td>288.333300</td>
</tr>
<tr>
<td>6</td>
<td>0.183900</td>
<td>0.203908</td>
<td>0.752000</td>
<td>0.715000</td>
<td>0.726300</td>
<td>0.727100</td>
<td>309.755600</td>
</tr>
<tr>
<td>6</td>
<td>0.174500</td>
<td>0.203386</td>
<td>0.786100</td>
<td>0.743400</td>
<td>0.750800</td>
<td>0.756200</td>
<td>252.422200</td>
</tr>
<tr>
<td>8</td>
<td>0.165500</td>
<td>0.190161</td>
<td>0.771100</td>
<td>0.733500</td>
<td>0.735600</td>
<td>0.740400</td>
<td>292.288900</td>
</tr>
<tr>
<td>8</td>
<td>0.158300</td>
<td>0.192600</td>
<td>0.774900</td>
<td>0.737300</td>
<td>0.743300</td>
<td>0.744300</td>
<td>285.800000</td>
</tr>
<tr>
<td>9</td>
<td>0.152200</td>
<td>0.192426</td>
<td>0.795200</td>
<td>0.758900</td>
<td>0.754700</td>
<td>0.759000</td>
<td>284.266700</td>
</tr>
<tr>
<td>15</td>
<td>0.124400</td>
<td>0.182381</td>
<td>0.787800</td>
<td>0.742800</td>
<td>0.745100</td>
<td>0.746900</td>
<td>274.533300</td>
</tr>
<tr>
<td>17</td>
<td>0.120300</td>
<td>0.183192</td>
<td>0.779400</td>
<td>0.739000</td>
<td>0.734500</td>
<td>0.739300</td>
<td>289.266700</td>
</tr>
</table>
|
kasunw/Pixelcopter-PLE-v0
|
kasunw
| 2023-05-13T14:47:32Z | 0 | 0 | null |
[
"Pixelcopter-PLE-v0",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T14:47:26Z |
---
tags:
- Pixelcopter-PLE-v0
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Pixelcopter-PLE-v0
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Pixelcopter-PLE-v0
type: Pixelcopter-PLE-v0
metrics:
- type: mean_reward
value: 41.40 +/- 32.61
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**
This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0**.
To learn to use this model and train yours, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
andyssj/entregable2
|
andyssj
| 2023-05-13T14:40:29Z | 0 | 0 |
fastai
|
[
"fastai",
"region:us"
] | null | 2023-05-13T14:40:26Z |
---
tags:
- fastai
---
# Amazing!
🥳 Congratulations on hosting your fastai model on the Hugging Face Hub!
# Some next steps
1. Fill out this model card with more information (see the template below and the [documentation here](https://huggingface.co/docs/hub/model-repos))!
2. Create a demo in Gradio or Streamlit using 🤗 Spaces ([documentation here](https://huggingface.co/docs/hub/spaces)).
3. Join the fastai community on the [Fastai Discord](https://discord.com/invite/YKrxeNn)!
Greetings fellow fastlearner 🤝! Don't forget to delete this content from your model card.
---
# Model card
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
|
ChrisOfLondon/Reinforce-Cartpole
|
ChrisOfLondon
| 2023-05-13T14:36:03Z | 0 | 0 | null |
[
"CartPole-v1",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T14:35:54Z |
---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-Cartpole
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 500.00 +/- 0.00
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1**.
To learn to use this model and train yours, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
dvgodoy/ddpm-cifar10-32-mnist
|
dvgodoy
| 2023-05-13T14:32:03Z | 73 | 0 |
diffusers
|
[
"diffusers",
"pytorch",
"unconditional-image-generation",
"diffusion-models-class",
"license:mit",
"diffusers:DDPMPipeline",
"region:us"
] |
unconditional-image-generation
| 2023-05-13T14:31:43Z |
---
license: mit
tags:
- pytorch
- diffusers
- unconditional-image-generation
- diffusion-models-class
---
# Diffusion Models 101
This model is a diffusion model for unconditional image generation of MNIST digits, fine-tuned from Google's [ddpm-cifar10-32](https://huggingface.co/google/ddpm-cifar10-32) model.
## Usage
```python
from diffusers import DDPMPipeline
pipeline = DDPMPipeline.from_pretrained('dvgodoy/ddpm-cifar10-32-mnist')
image = pipeline().images[0]
image
```
|
Abrumu/output
|
Abrumu
| 2023-05-13T14:18:18Z | 5 | 1 |
diffusers
|
[
"diffusers",
"tensorboard",
"stable-diffusion",
"stable-diffusion-diffusers",
"text-to-image",
"controlnet",
"base_model:runwayml/stable-diffusion-v1-5",
"base_model:adapter:runwayml/stable-diffusion-v1-5",
"license:creativeml-openrail-m",
"region:us"
] |
text-to-image
| 2023-05-12T16:02:42Z |
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- controlnet
inference: true
---
# controlnet-Abrumu/output
These are ControlNet weights trained on runwayml/stable-diffusion-v1-5 with a new type of conditioning.
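The card doesn't include usage code; below is a minimal sketch with the 🧨Diffusers ControlNet pipeline. The conditioning image file name and the prompt are placeholders, since the card doesn't specify what the new conditioning type is.
```python
# A minimal sketch, assuming the weights load as a standard ControlNetModel;
# "conditioning.png" is a placeholder for whatever the new conditioning type expects.
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
from diffusers.utils import load_image

controlnet = ControlNetModel.from_pretrained("Abrumu/output", torch_dtype=torch.float16)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
).to("cuda")

cond = load_image("conditioning.png")  # placeholder conditioning image
image = pipe("a photo of a room", image=cond).images[0]
image.save("output.png")
```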
|
bdsqlsz/FaceBeauty
|
bdsqlsz
| 2023-05-13T13:50:06Z | 0 | 1 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-13T13:47:22Z |
---
license: creativeml-openrail-m
---
|
JoeyAndFriends/ppo-Huggy
|
JoeyAndFriends
| 2023-05-13T13:19:39Z | 18 | 0 |
ml-agents
|
[
"ml-agents",
"tensorboard",
"onnx",
"Huggy",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Huggy",
"region:us"
] |
reinforcement-learning
| 2023-05-13T13:19:29Z |
---
library_name: ml-agents
tags:
- Huggy
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-Huggy
2. Find your model_id: JoeyAndFriends/ppo-Huggy
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
demetere/Reinforce-PixelCopter-v1.0.0
|
demetere
| 2023-05-13T13:09:26Z | 0 | 0 | null |
[
"Pixelcopter-PLE-v0",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T13:09:23Z |
---
tags:
- Pixelcopter-PLE-v0
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-PixelCopter-v1.0.0
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Pixelcopter-PLE-v0
type: Pixelcopter-PLE-v0
metrics:
- type: mean_reward
value: 22.50 +/- 15.27
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **Pixelcopter-PLE-v0**
This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0**.
To learn to use this model and train yours, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
DucHaiten/DucHaiten-LoFi
|
DucHaiten
| 2023-05-13T13:08:27Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-04-12T17:09:04Z |
---
license: creativeml-openrail-m
---






|
JustFrederik/m2m_100_418m_ct2_int8
|
JustFrederik
| 2023-05-13T13:07:09Z | 5 | 0 |
transformers
|
[
"transformers",
"multilingual",
"af",
"am",
"ar",
"ast",
"az",
"ba",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"ceb",
"cs",
"cy",
"da",
"de",
"el",
"en",
"es",
"et",
"fa",
"ff",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"ht",
"hu",
"hy",
"id",
"ig",
"ilo",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"lb",
"lg",
"ln",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"ns",
"oc",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"ss",
"su",
"sv",
"sw",
"ta",
"th",
"tl",
"tn",
"tr",
"uk",
"ur",
"uz",
"vi",
"wo",
"xh",
"yi",
"yo",
"zh",
"zu",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | 2023-05-13T13:06:12Z |
---
language:
- multilingual
- af
- am
- ar
- ast
- az
- ba
- be
- bg
- bn
- br
- bs
- ca
- ceb
- cs
- cy
- da
- de
- el
- en
- es
- et
- fa
- ff
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- ht
- hu
- hy
- id
- ig
- ilo
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- lb
- lg
- ln
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- no
- ns
- oc
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sd
- si
- sk
- sl
- so
- sq
- sr
- ss
- su
- sv
- sw
- ta
- th
- tl
- tn
- tr
- uk
- ur
- uz
- vi
- wo
- xh
- yi
- yo
- zh
- zu
license: mit
---
https://huggingface.co/facebook/m2m100_418M
<br />
https://github.com/facebookresearch/fairseq/tree/nllb/examples/m2m_100
```
ct2-fairseq-converter --data_dir . --model_path 418M_last_checkpoint.pt --fixed_dictionary model_dict.128k.txt --quantization int8 --output_dir converted/m2m_100_418m_ct2_int8
```
External language dictionary is not provided; use lang-pairs to infer the set of supported languages. The language ordering is not stable which might cause misalignment in pretraining and finetuning.
```
wget https://dl.fbaipublicfiles.com/m2m_100/model_dict.128k.txt
# 418M parameter model
wget https://dl.fbaipublicfiles.com/m2m_100/418M_last_checkpoint.pt
```
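For reference, a minimal translation sketch with [CTranslate2](https://github.com/OpenNMT/CTranslate2), following the pattern in the CTranslate2 documentation; the SentencePiece model path and language codes below are assumptions, not something verified against this exact conversion. The same pattern applies to the float16 and unquantized conversions.
```python
# A minimal sketch, assuming the converted model directory and the
# SentencePiece model (spm.128k.model) from the fairseq release are available.
import ctranslate2
import sentencepiece as spm

translator = ctranslate2.Translator("converted/m2m_100_418m_ct2_int8")
sp = spm.SentencePieceProcessor(model_file="spm.128k.model")

# M2M-100 expects a source language token in front and a target language prefix.
source = ["__en__"] + sp.encode("Hello world!", out_type=str)
results = translator.translate_batch([source], target_prefix=[["__de__"]])

# Drop the target language token before detokenizing.
print(sp.decode(results[0].hypotheses[0][1:]))
```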
|
JustFrederik/m2m_100_418m_ct2_float16
|
JustFrederik
| 2023-05-13T13:05:55Z | 3 | 0 |
transformers
|
[
"transformers",
"multilingual",
"af",
"am",
"ar",
"ast",
"az",
"ba",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"ceb",
"cs",
"cy",
"da",
"de",
"el",
"en",
"es",
"et",
"fa",
"ff",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"ht",
"hu",
"hy",
"id",
"ig",
"ilo",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"lb",
"lg",
"ln",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"ns",
"oc",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"ss",
"su",
"sv",
"sw",
"ta",
"th",
"tl",
"tn",
"tr",
"uk",
"ur",
"uz",
"vi",
"wo",
"xh",
"yi",
"yo",
"zh",
"zu",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | 2023-05-13T13:04:29Z |
---
language:
- multilingual
- af
- am
- ar
- ast
- az
- ba
- be
- bg
- bn
- br
- bs
- ca
- ceb
- cs
- cy
- da
- de
- el
- en
- es
- et
- fa
- ff
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- ht
- hu
- hy
- id
- ig
- ilo
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- lb
- lg
- ln
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- no
- ns
- oc
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sd
- si
- sk
- sl
- so
- sq
- sr
- ss
- su
- sv
- sw
- ta
- th
- tl
- tn
- tr
- uk
- ur
- uz
- vi
- wo
- xh
- yi
- yo
- zh
- zu
license: mit
---
https://huggingface.co/facebook/m2m100_418M
<br />
https://github.com/facebookresearch/fairseq/tree/nllb/examples/m2m_100
```
ct2-fairseq-converter --data_dir . --model_path 418M_last_checkpoint.pt --fixed_dictionary model_dict.128k.txt --quantization float16 --output_dir converted/m2m_100_418m_ct2_float16
```
External language dictionary is not provided; use lang-pairs to infer the set of supported languages. The language ordering is not stable which might cause misalignment in pretraining and finetuning.
```
wget https://dl.fbaipublicfiles.com/m2m_100/model_dict.128k.txt
# 418M parameter model
wget https://dl.fbaipublicfiles.com/m2m_100/418M_last_checkpoint.pt
```
|
JustFrederik/m2m_100_418m_ct2
|
JustFrederik
| 2023-05-13T13:03:58Z | 4 | 0 |
transformers
|
[
"transformers",
"multilingual",
"af",
"am",
"ar",
"ast",
"az",
"ba",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"ceb",
"cs",
"cy",
"da",
"de",
"el",
"en",
"es",
"et",
"fa",
"ff",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"ht",
"hu",
"hy",
"id",
"ig",
"ilo",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"lb",
"lg",
"ln",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"ns",
"oc",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"ss",
"su",
"sv",
"sw",
"ta",
"th",
"tl",
"tn",
"tr",
"uk",
"ur",
"uz",
"vi",
"wo",
"xh",
"yi",
"yo",
"zh",
"zu",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | 2023-05-13T13:01:28Z |
---
language:
- multilingual
- af
- am
- ar
- ast
- az
- ba
- be
- bg
- bn
- br
- bs
- ca
- ceb
- cs
- cy
- da
- de
- el
- en
- es
- et
- fa
- ff
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- ht
- hu
- hy
- id
- ig
- ilo
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- lb
- lg
- ln
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- no
- ns
- oc
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sd
- si
- sk
- sl
- so
- sq
- sr
- ss
- su
- sv
- sw
- ta
- th
- tl
- tn
- tr
- uk
- ur
- uz
- vi
- wo
- xh
- yi
- yo
- zh
- zu
license: mit
---
https://huggingface.co/facebook/m2m100_418M
<br />
https://github.com/facebookresearch/fairseq/tree/nllb/examples/m2m_100
```
ct2-fairseq-converter --data_dir . --model_path 418M_last_checkpoint.pt --fixed_dictionary model_dict.128k.txt --output_dir converted/m2m_100_418m_ct2
```
External language dictionary is not provided; use lang-pairs to infer the set of supported languages. The language ordering is not stable which might cause misalignment in pretraining and finetuning.
```
wget https://dl.fbaipublicfiles.com/m2m_100/model_dict.128k.txt
# 418M parameter model
wget https://dl.fbaipublicfiles.com/m2m_100/418M_last_checkpoint.pt
```
|
pythainlp/adapter-wangchanglm-7.5B-sft-en-klongklon
|
pythainlp
| 2023-05-13T13:01:01Z | 0 | 0 | null |
[
"text-generation",
"th",
"dataset:pythainlp/klongklon",
"license:cc",
"region:us"
] |
text-generation
| 2023-05-04T06:11:17Z |
---
license: cc
datasets:
- pythainlp/klongklon
language:
- th
pipeline_tag: text-generation
---
# Model Card for Klongklon Adapter of WangChanGLM 🐘
<!-- Provide a longer summary of what this model is. -->
This is a LoRA adapter for Thai poems, trained on 80k poems, to be used with WangChanGLM.

WangChanGLM is a multilingual, instruction-finetuned Facebook XGLM-7.5B using open-source, commercially permissible datasets (LAION OIG chip2 and infill_dbpedia, DataBricks Dolly v2, OpenAI TL;DR, and Hello-SimpleAI HC3; about 400k examples), released under CC-BY SA 4.0. The models are trained to perform a subset of instruction-following tasks we found most relevant, namely: reading comprehension, brainstorming, and creative writing. We provide the weights for a model finetuned on an English-only dataset ([wangchanglm-7.5B-sft-en](https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-en)) and another checkpoint further finetuned on a Google-Translated Thai dataset ([wangchanglm-7.5B-sft-enth](https://huggingface.co/pythainlp/wangchanglm-7.5B-sft-enth)). We perform Vicuna-style evaluation using both humans and ChatGPT (in our case, `gpt-3.5-turbo`, since we are still on the waitlist for `gpt-4`) and observe some discrepancies between the two types of annotators. All training and evaluation code is shared under the [Apache-2.0 license](https://github.com/pythainlp/wangchanglm/blob/main/LICENSE) on our GitHub, as well as datasets and model weights on [HuggingFace](https://huggingface.co/pythainlp). In a similar manner to [Dolly v2](https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm), we only use open-source, commercially permissive pretrained models and datasets; our models are restricted neither by a non-commercial clause, like models that use LLaMA as a base, nor by a non-compete clause, like models that use self-instruct datasets from ChatGPT.
- **Developed by:** [PyThaiNLP](https://www.github.com/pythainlp) and [VISTEC-depa AI Research Institute of Thailand](https://huggingface.co/airesearch)
- **Model type:** Finetuned [XGLM-7.5B](https://huggingface.co/facebook/xglm-7.5B)
- **Language(s) (NLP)**: `en`, `th`, `ja`, `vi` capabilities evaluated; theoretically all 30 languages of [XGLM-7.5B](https://huggingface.co/facebook/xglm-7.5B)
- **License:** [CC-BY SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/)
### Model Sources
<!-- Provide the basic links for the model. -->
- **Repository:** [pythainlp/wangchanglm](https://www.github.com/pythainlp/wangchanglm)
- **Blog:** [Medium](https://link.medium.com/s2MWr3ZXnzb)
- **Demo:** [Colab notebook](https://colab.research.google.com/github/pythainlp/WangChanGLM/blob/main/demo/WangChanGLM_v0_1_demo.ipynb)
## Uses
<!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
### Direct Use
<!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
Intended to be used as an instruction-following model for reading comprehension, brainstorming, and creative writing.
### Downstream Use
<!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
The model can be finetuned for any typical instruction-following use cases.
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
We do not expect the models to perform well on math problems, reasoning, and factuality. We intentionally filter out training examples from these use cases.
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
We noticed limitations similar to other finetuned instruction followers, such as math problems, reasoning, and factuality. Even though the models do not perform at a level where we expect them to be widely abused, they do contain undesirable biases and toxicity and should be further optimized for your particular use cases.
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
## How to Get Started with the Model
See [example notebook](https://github.com/PyThaiNLP/WangChanGLM/blob/main/notebook/klong4_assistant.ipynb).
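Alternatively, a minimal sketch of loading the adapter with PEFT is below; the generation settings are placeholders, not the notebook's exact values.
```python
# A minimal sketch, assuming the adapter loads with PEFT's PeftModel on top of
# the English-finetuned base; the prompt and generation settings are placeholders.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("pythainlp/wangchanglm-7.5B-sft-en")
base = AutoModelForCausalLM.from_pretrained("pythainlp/wangchanglm-7.5B-sft-en")
model = PeftModel.from_pretrained(base, "pythainlp/adapter-wangchanglm-7.5B-sft-en-klongklon")

inputs = tokenizer("แต่งโคลงสี่สุภาพ", return_tensors="pt")  # placeholder Thai poem prompt
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```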
## Training Details
### Training Data
<!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
Finetuned on [PythaiNLP/klongklon](https://huggingface.co/datasets/pythainlp/klongklon)
### Training Procedure
<!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
#### Preprocessing
See [pythainlp/wangchanglm](https://www.github.com/pythainlp/wangchanglm).
#### Training Hyperparameters
- **Training regime:** LoRA with 4 GPUs. See more details at [pythainlp/wangchanglm](https://www.github.com/pythainlp/wangchanglm).
## Evaluation
<!-- This section describes the evaluation protocols and provides the results. -->
We performed automatic evaluation in the style of [Vicuna](https://vicuna.lmsys.org/) as well as human evaluation. See more details in our [blog]().
## Environmental Impact
<!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
Experiments were conducted using a private infrastructure, which has a carbon efficiency of 0.432 kgCO2eq/kWh. A cumulative 500 hours of computation was performed on hardware of type Tesla V100-SXM2-32GB (TDP of 300W). Total emissions are estimated to be 64.8 kgCO2eq, of which 0 percent was directly offset. Estimations were conducted using the [MachineLearning Impact calculator](https://mlco2.github.io/impact#compute).
## Citation
<!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
```
@software{charin_polpanumas_2023_7878101,
author = {Charin Polpanumas and
Wannaphong Phatthiyaphaibun and
Patomporn Payoungkhamdee and
Peerat Limkonchotiwat and
Lalita Lowphansirikul and
Can Udomcharoenchaikit and
Titipat Achakulwisut and
Ekapol Chuangsuwanich and
Sarana Nutanong},
title = {{WangChanGLM🐘 — The Multilingual Instruction-
Following Model}},
month = apr,
year = 2023,
publisher = {Zenodo},
version = {v0.1},
doi = {10.5281/zenodo.7878101},
url = {https://doi.org/10.5281/zenodo.7878101}
}
```
## Model Card Contact
[PyThaiNLP](https://github.com/pythainlp)
|
JustFrederik/m2m_100_1.2b_ct2_float16
|
JustFrederik
| 2023-05-13T13:00:25Z | 3 | 0 |
transformers
|
[
"transformers",
"multilingual",
"af",
"am",
"ar",
"ast",
"az",
"ba",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"ceb",
"cs",
"cy",
"da",
"de",
"el",
"en",
"es",
"et",
"fa",
"ff",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"ht",
"hu",
"hy",
"id",
"ig",
"ilo",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"lb",
"lg",
"ln",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"ns",
"oc",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"ss",
"su",
"sv",
"sw",
"ta",
"th",
"tl",
"tn",
"tr",
"uk",
"ur",
"uz",
"vi",
"wo",
"xh",
"yi",
"yo",
"zh",
"zu",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | 2023-05-13T12:53:19Z |
---
language:
- multilingual
- af
- am
- ar
- ast
- az
- ba
- be
- bg
- bn
- br
- bs
- ca
- ceb
- cs
- cy
- da
- de
- el
- en
- es
- et
- fa
- ff
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- ht
- hu
- hy
- id
- ig
- ilo
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- lb
- lg
- ln
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- no
- ns
- oc
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sd
- si
- sk
- sl
- so
- sq
- sr
- ss
- su
- sv
- sw
- ta
- th
- tl
- tn
- tr
- uk
- ur
- uz
- vi
- wo
- xh
- yi
- yo
- zh
- zu
license: mit
---
https://huggingface.co/facebook/m2m100_1.2B
<br />
https://github.com/facebookresearch/fairseq/tree/nllb/examples/m2m_100
```
ct2-fairseq-converter --data_dir . --model_path 1.2B_last_checkpoint.pt --fixed_dictionary model_dict.128k.txt --quantization float16 --output_dir converted/m2m_100_1.2b_ct2_float16
```
External language dictionary is not provided; use lang-pairs to infer the set of supported languages. The language ordering is not stable which might cause misalignment in pretraining and finetuning.
```
wget https://dl.fbaipublicfiles.com/m2m_100/model_dict.128k.txt
# 1.2B parameter model
wget https://dl.fbaipublicfiles.com/m2m_100/1.2B_last_checkpoint.pt
```
|
JustFrederik/m2m_100_1.2b_ct2
|
JustFrederik
| 2023-05-13T12:52:25Z | 2 | 0 |
transformers
|
[
"transformers",
"multilingual",
"af",
"am",
"ar",
"ast",
"az",
"ba",
"be",
"bg",
"bn",
"br",
"bs",
"ca",
"ceb",
"cs",
"cy",
"da",
"de",
"el",
"en",
"es",
"et",
"fa",
"ff",
"fi",
"fr",
"fy",
"ga",
"gd",
"gl",
"gu",
"ha",
"he",
"hi",
"hr",
"ht",
"hu",
"hy",
"id",
"ig",
"ilo",
"is",
"it",
"ja",
"jv",
"ka",
"kk",
"km",
"kn",
"ko",
"lb",
"lg",
"ln",
"lo",
"lt",
"lv",
"mg",
"mk",
"ml",
"mn",
"mr",
"ms",
"my",
"ne",
"nl",
"no",
"ns",
"oc",
"or",
"pa",
"pl",
"ps",
"pt",
"ro",
"ru",
"sd",
"si",
"sk",
"sl",
"so",
"sq",
"sr",
"ss",
"su",
"sv",
"sw",
"ta",
"th",
"tl",
"tn",
"tr",
"uk",
"ur",
"uz",
"vi",
"wo",
"xh",
"yi",
"yo",
"zh",
"zu",
"license:mit",
"endpoints_compatible",
"region:us"
] | null | 2023-05-13T12:45:46Z |
---
language:
- multilingual
- af
- am
- ar
- ast
- az
- ba
- be
- bg
- bn
- br
- bs
- ca
- ceb
- cs
- cy
- da
- de
- el
- en
- es
- et
- fa
- ff
- fi
- fr
- fy
- ga
- gd
- gl
- gu
- ha
- he
- hi
- hr
- ht
- hu
- hy
- id
- ig
- ilo
- is
- it
- ja
- jv
- ka
- kk
- km
- kn
- ko
- lb
- lg
- ln
- lo
- lt
- lv
- mg
- mk
- ml
- mn
- mr
- ms
- my
- ne
- nl
- no
- ns
- oc
- or
- pa
- pl
- ps
- pt
- ro
- ru
- sd
- si
- sk
- sl
- so
- sq
- sr
- ss
- su
- sv
- sw
- ta
- th
- tl
- tn
- tr
- uk
- ur
- uz
- vi
- wo
- xh
- yi
- yo
- zh
- zu
license: mit
---
https://huggingface.co/facebook/m2m100_1.2B
<br />
https://github.com/facebookresearch/fairseq/tree/nllb/examples/m2m_100
```
ct2-fairseq-converter --data_dir . --model_path 1.2B_last_checkpoint.pt --fixed_dictionary model_dict.128k.txt --output_dir converted/m2m_100_1.2b_ct2
```
External language dictionary is not provided; use lang-pairs to infer the set of supported languages. The language ordering is not stable which might cause misalignment in pretraining and finetuning.
```
wget https://dl.fbaipublicfiles.com/m2m_100/model_dict.128k.txt
# 1.2B parameter model
wget https://dl.fbaipublicfiles.com/m2m_100/1.2B_last_checkpoint.pt
```
|
Tingwen/ppo-Huggy
|
Tingwen
| 2023-05-13T12:39:54Z | 12 | 0 |
ml-agents
|
[
"ml-agents",
"tensorboard",
"onnx",
"Huggy",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-Huggy",
"region:us"
] |
reinforcement-learning
| 2023-05-13T11:57:05Z |
---
library_name: ml-agents
tags:
- Huggy
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-Huggy
---
# **ppo** Agent playing **Huggy**
This is a trained model of a **ppo** agent playing **Huggy** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-Huggy
2. Find your model_id: Tingwen/ppo-Huggy
3. Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
autobots/llama_7b_roleplay_lora
|
autobots
| 2023-05-13T12:37:17Z | 0 | 0 | null |
[
"region:us"
] | null | 2023-05-13T12:23:37Z |
Roleplay LoRA trained on llama-7b in 4-bit mode.
Trained for 3 epochs.
Uses the https://github.com/teknium1/GPTeacher/tree/main/Roleplay dataset.
Training in 4-bit is very fast; it only took half an hour on a 3090. Eval against the dataset gave a 3.x perplexity.
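For reference, a minimal sketch of loading the adapter with PEFT; the base model path is a placeholder, since the card doesn't name the exact llama-7b checkpoint used.
```python
# A minimal sketch, assuming a standard PEFT LoRA checkpoint;
# "path/to/llama-7b" is a placeholder for the base weights this adapter was trained against.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("path/to/llama-7b")
base = AutoModelForCausalLM.from_pretrained("path/to/llama-7b", torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base, "autobots/llama_7b_roleplay_lora")
```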
|
aisquared/chopt-350m
|
aisquared
| 2023-05-13T12:28:26Z | 103 | 2 |
transformers
|
[
"transformers",
"pytorch",
"opt",
"text-generation",
"en",
"dataset:aisquared/databricks-dolly-15k",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-04-19T14:56:48Z |
---
license: other
commercial: false
datasets:
- aisquared/databricks-dolly-15k
language:
- en
library_name: transformers
---
# Model Card for `chopt-350m`
<!-- Provide a quick summary of what the model is/does. -->
AI Squared's `chopt-350m` is a large language model derived from Meta AI's Open Pre-trained Transformer (OPT) language models and fine-tuned on a corpus of 15k records ([Databricks' "Dolly 15k" Dataset](https://huggingface.co/datasets/aisquared/databricks-dolly-15k)) to help it exhibit chat-based capabilities. Despite the permissive license of the Dolly 15k dataset, because this model is a derivative of OPT, its use is restricted to **non-commercial research purposes**. The ChOPT family of models from AI Squared are licensed under the OPT-175B license, Copyright (c) Meta Platforms, Inc. All Rights Reserved.
While `chopt-350m` is **not a state-of-the-art model**, we believe that the level of interactivity that can be achieved on such a small model that is trained so cheaply is important to showcase, as it continues to demonstrate that creating powerful AI capabilities may be much more accessible than previously thought.
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** AI Squared, Inc.
- **Shared by:** AI Squared, Inc.
- **Model type:** Large Language Model
- **Language(s) (NLP):** EN
- **License:** other
- **Finetuned from model:** OPT
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
**`chopt-350m` is not a state-of-the-art language model.** `chopt-350m` is an experimental technology and is not designed for use in any
environment other than for research purposes. Furthermore, the model can sometimes exhibit undesired behaviors. Some of these behaviors include,
but are not limited to: factual inaccuracies, biases, offensive responses, toxicity, and hallucinations.
Just as with any other LLM, we advise users of this technology to exercise good judgment when applying this technology.
## Usage
To use the model with the `transformers` library on a machine with GPUs, first make sure you have the `transformers` and `accelerate` libraries installed.
From your terminal, run:
```python
pip install "accelerate>=0.16.0,<1" "transformers[torch]>=4.28.1,<5" "torch>=1.13.1,<2"
```
The instruction following pipeline can be loaded using the `pipeline` function as shown below. This loads a custom `InstructionTextGenerationPipeline`
found in the model repo [here](https://huggingface.co/aisquared/chopt-350m/blob/main/instruct_pipeline.py), which is why `trust_remote_code=True` is required.
Including `torch_dtype=torch.bfloat16` is generally recommended if this type is supported in order to reduce memory usage. It does not appear to impact output quality.
It is also fine to remove it if there is sufficient memory.
```python
from transformers import pipeline
import torch
generate_text = pipeline(model="aisquared/chopt-350m", torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")
```
You can then use the pipeline to answer instructions:
```python
res = generate_text("Who was George Washington?")
print(res)
```
Alternatively, if you prefer to not use `trust_remote_code=True` you can download [instruct_pipeline.py](https://huggingface.co/aisquared/chopt-350m/blob/main/instruct_pipeline.py),
store it alongside your notebook, and construct the pipeline yourself from the loaded model and tokenizer:
```python
from instruct_pipeline import InstructionTextGenerationPipeline
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
tokenizer = AutoTokenizer.from_pretrained("aisquared/chopt-350m", padding_side="left")
model = AutoModelForCausalLM.from_pretrained("aisquared/chopt-350m", device_map="auto", torch_dtype=torch.bfloat16)
generate_text = InstructionTextGenerationPipeline(model=model, tokenizer=tokenizer)
```
### Model Performance Metrics
We present the results from various model benchmarks on the EleutherAI LLM Evaluation Harness for all models in the ChOPT family.
Model results are sorted by mean score, ascending, to provide an ordering. These metrics serve to further show that none of the ChOPT models are
state of the art, but rather that chat-like behaviors in LLMs can be trained almost independently of model size.
| Model | openbookqa | arc_easy | winogrande | hellaswag | arc_challenge | piqa | boolq |
|:--------------------|-------------:|-----------:|-------------:|------------:|----------------:|---------:|---------:|
| chopt-125m | 0.178 | 0.443182 | 0.501973 | 0.294165 | 0.197099 | 0.630577 | 0.476758 |
| chopt-research-125m | 0.17 | 0.436027 | 0.503552 | 0.294762 | 0.205631 | 0.62568 | 0.48685 |
| opt-125m | 0.166 | 0.435606 | 0.501973 | 0.291775 | 0.190273 | 0.6284 | 0.554434 |
| chopt-350m | 0.178 | 0.450758 | 0.508287 | 0.325334 | 0.21843 | 0.650707 | 0.559633 |
| opt_350m | 0.176 | 0.441077 | 0.52644 | 0.320056 | 0.207338 | 0.645267 | 0.57737 |
| chopt-research-350m | 0.172 | 0.462542 | 0.514601 | 0.327524 | 0.235495 | 0.643634 | 0.589908 |
| opt-1.3b | 0.234 | 0.569865 | 0.596685 | 0.414957 | 0.232935 | 0.718172 | 0.577676 |
| chopt-research-1_3b | 0.232 | 0.564815 | 0.59116 | 0.424716 | 0.276451 | 0.713275 | 0.634557 |
| chopt-1_3b | 0.236 | 0.569444 | 0.584057 | 0.42621 | 0.268771 | 0.723069 | 0.658104 |
| opt-2.7b | 0.25 | 0.608165 | 0.608524 | 0.458176 | 0.267918 | 0.738303 | 0.603058 |
| chopt-2_7b | 0.276 | 0.616582 | 0.601421 | 0.472615 | 0.288396 | 0.75136 | 0.552294 |
| chopt-research-2_7b | 0.262 | 0.610269 | 0.625099 | 0.458176 | 0.295222 | 0.742111 | 0.636697 |
|
aisquared/chopt-125m
|
aisquared
| 2023-05-13T12:26:16Z | 101 | 0 |
transformers
|
[
"transformers",
"pytorch",
"opt",
"text-generation",
"en",
"dataset:aisquared/databricks-dolly-15k",
"license:other",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-04-19T14:33:43Z |
---
license: other
commercial: false
datasets:
- aisquared/databricks-dolly-15k
language:
- en
library_name: transformers
---
# Model Card for `chopt-125m`
<!-- Provide a quick summary of what the model is/does. -->
AI Squared's `chopt-125m` is a large language model derived from Meta AI's Open Pre-trained Transformer (OPT) language models and fine-tuned on a corpus of 15k records ([Databricks' "Dolly 15k" Dataset](https://huggingface.co/datasets/aisquared/databricks-dolly-15k)) to help it exhibit chat-based capabilities. Despite the permissive license of the Dolly 15k dataset, because this model is a derivative of OPT, its use is restricted to **non-commercial research purposes**. The ChOPT family of models from AI Squared are licensed under the OPT-175B license, Copyright (c) Meta Platforms, Inc. All Rights Reserved.
While `chopt-125m` is **not a state-of-the-art model**, we believe that the level of interactivity that can be achieved on such a small model that is trained so cheaply is important to showcase, as it continues to demonstrate that creating powerful AI capabilities may be much more accessible than previously thought.
### Model Description
<!-- Provide a longer summary of what this model is. -->
- **Developed by:** AI Squared, Inc.
- **Shared by:** AI Squared, Inc.
- **Model type:** Large Language Model
- **Language(s) (NLP):** EN
- **License:** other
- **Finetuned from model:** OPT
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
**`chopt-125m` is not a state-of-the-art language model.** `chopt-125m` is an experimental technology and is not designed for use in any
environment other than for research purposes. Furthermore, the model can sometimes exhibit undesired behaviors. Some of these behaviors include,
but are not limited to: factual inaccuracies, biases, offensive responses, toxicity, and hallucinations.
Just as with any other LLM, we advise users of this technology to exercise good judgment when applying this technology.
## Usage
To use the model with the `transformers` library on a machine with GPUs, first make sure you have the `transformers` and `accelerate` libraries installed.
From your terminal, run:
```python
pip install "accelerate>=0.16.0,<1" "transformers[torch]>=4.28.1,<5" "torch>=1.13.1,<2"
```
The instruction following pipeline can be loaded using the `pipeline` function as shown below. This loads a custom `InstructionTextGenerationPipeline`
found in the model repo [here](https://huggingface.co/aisquared/chopt-125m/blob/main/instruct_pipeline.py), which is why `trust_remote_code=True` is required.
Including `torch_dtype=torch.bfloat16` is generally recommended if this type is supported in order to reduce memory usage. It does not appear to impact output quality.
It is also fine to remove it if there is sufficient memory.
```python
from transformers import pipeline
import torch
generate_text = pipeline(model="aisquared/chopt-125m", torch_dtype=torch.bfloat16, trust_remote_code=True, device_map="auto")
```
You can then use the pipeline to answer instructions:
```python
res = generate_text("Who was George Washington?")
print(res)
```
Alternatively, if you prefer to not use `trust_remote_code=True` you can download [instruct_pipeline.py](https://huggingface.co/aisquared/chopt-125m/blob/main/instruct_pipeline.py),
store it alongside your notebook, and construct the pipeline yourself from the loaded model and tokenizer:
```python
from instruct_pipeline import InstructionTextGenerationPipeline
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
tokenizer = AutoTokenizer.from_pretrained("aisquared/chopt-125m", padding_side="left")
model = AutoModelForCausalLM.from_pretrained("aisquared/chopt-125m", device_map="auto", torch_dtype=torch.bfloat16)
generate_text = InstructionTextGenerationPipeline(model=model, tokenizer=tokenizer)
```
### Model Performance Metrics
We present the results from various model benchmarks on the EleutherAI LLM Evaluation Harness for all models in the ChOPT family.
Model results are sorted by mean score, ascending, to provide an ordering. These metrics serve to further show that none of the ChOPT models are
state of the art, but rather that chat-like behaviors in LLMs can be trained almost independently of model size.
| Model | openbookqa | arc_easy | winogrande | hellaswag | arc_challenge | piqa | boolq |
|:--------------------|-------------:|-----------:|-------------:|------------:|----------------:|---------:|---------:|
| chopt-125m | 0.178 | 0.443182 | 0.501973 | 0.294165 | 0.197099 | 0.630577 | 0.476758 |
| chopt-research-125m | 0.17 | 0.436027 | 0.503552 | 0.294762 | 0.205631 | 0.62568 | 0.48685 |
| opt-125m | 0.166 | 0.435606 | 0.501973 | 0.291775 | 0.190273 | 0.6284 | 0.554434 |
| chopt-350m | 0.178 | 0.450758 | 0.508287 | 0.325334 | 0.21843 | 0.650707 | 0.559633 |
| opt_350m | 0.176 | 0.441077 | 0.52644 | 0.320056 | 0.207338 | 0.645267 | 0.57737 |
| chopt-research-350m | 0.172 | 0.462542 | 0.514601 | 0.327524 | 0.235495 | 0.643634 | 0.589908 |
| opt-1.3b | 0.234 | 0.569865 | 0.596685 | 0.414957 | 0.232935 | 0.718172 | 0.577676 |
| chopt-research-1_3b | 0.232 | 0.564815 | 0.59116 | 0.424716 | 0.276451 | 0.713275 | 0.634557 |
| chopt-1_3b | 0.236 | 0.569444 | 0.584057 | 0.42621 | 0.268771 | 0.723069 | 0.658104 |
| opt-2.7b | 0.25 | 0.608165 | 0.608524 | 0.458176 | 0.267918 | 0.738303 | 0.603058 |
| chopt-2_7b | 0.276 | 0.616582 | 0.601421 | 0.472615 | 0.288396 | 0.75136 | 0.552294 |
| chopt-research-2_7b | 0.262 | 0.610269 | 0.625099 | 0.458176 | 0.295222 | 0.742111 | 0.636697 |
|
AKAWIZ/whisper_telugu_medium
|
AKAWIZ
| 2023-05-13T12:01:27Z | 77 | 0 |
transformers
|
[
"transformers",
"pytorch",
"whisper",
"automatic-speech-recognition",
"te",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-05-12T18:40:30Z |
---
language:
- te
pipeline_tag: automatic-speech-recognition
---
|
madhav-devrev/flan-t5-large-work-filters
|
madhav-devrev
| 2023-05-13T11:54:25Z | 103 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"t5",
"text2text-generation",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text2text-generation
| 2023-05-13T02:58:40Z |
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: flan-t5-large-work-filters
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# flan-t5-large-work-filters
This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0362
- Rouge1: 41.8961
- Rouge2: 31.4402
- Rougel: 41.841
- Rougelsum: 41.9024
- Gen Len: 18.9259
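A minimal inference sketch is below; the example prompt is a placeholder, since the card doesn't document the expected input format.
```python
# A minimal sketch, assuming standard text2text inference; the prompt is a placeholder.
from transformers import pipeline

filters = pipeline("text2text-generation", model="madhav-devrev/flan-t5-large-work-filters")
print(filters("Show my high priority work items created last week"))
```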
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| 0.5256 | 1.0 | 213 | 0.2899 | 41.8953 | 29.9382 | 41.4023 | 41.4413 | 18.9788 |
| 0.2377 | 2.0 | 426 | 0.1172 | 42.3662 | 31.0031 | 41.997 | 42.0997 | 19.0 |
| 0.1501 | 3.0 | 639 | 0.1091 | 42.0009 | 31.2986 | 41.8735 | 41.9067 | 19.0 |
| 0.1256 | 4.0 | 852 | 0.0905 | 43.6233 | 32.9567 | 43.5606 | 43.6054 | 18.9788 |
| 0.0997 | 5.0 | 1065 | 0.0936 | 43.4929 | 32.9118 | 43.5026 | 43.5589 | 18.9577 |
| 0.0792 | 6.0 | 1278 | 0.0743 | 43.3921 | 32.8388 | 43.3863 | 43.4487 | 18.9577 |
| 0.0738 | 7.0 | 1491 | 0.0613 | 42.3912 | 31.6893 | 42.3324 | 42.375 | 18.9577 |
| 0.0621 | 8.0 | 1704 | 0.0753 | 42.4408 | 31.7954 | 42.391 | 42.4501 | 18.9577 |
| 0.0664 | 9.0 | 1917 | 0.0568 | 42.0348 | 31.4631 | 41.9591 | 42.0159 | 18.9577 |
| 0.0575 | 10.0 | 2130 | 0.0576 | 43.0601 | 32.8756 | 42.9724 | 43.0502 | 18.9577 |
| 0.0488 | 11.0 | 2343 | 0.0473 | 42.3785 | 31.845 | 42.2759 | 42.37 | 18.9577 |
| 0.0528 | 12.0 | 2556 | 0.0503 | 43.1495 | 32.7992 | 43.1017 | 43.1919 | 18.9577 |
| 0.0392 | 13.0 | 2769 | 0.0407 | 42.0459 | 31.7063 | 41.9685 | 42.0368 | 18.9259 |
| 0.0462 | 14.0 | 2982 | 0.0446 | 43.473 | 33.1682 | 43.4607 | 43.5482 | 18.9259 |
| 0.0449 | 15.0 | 3195 | 0.0426 | 43.2263 | 32.5799 | 43.2171 | 43.255 | 18.9577 |
| 0.0432 | 16.0 | 3408 | 0.0419 | 42.2094 | 31.7081 | 42.1549 | 42.2244 | 18.9577 |
| 0.037 | 17.0 | 3621 | 0.0398 | 42.2089 | 31.5243 | 42.1439 | 42.213 | 18.9259 |
| 0.0376 | 18.0 | 3834 | 0.0402 | 42.624 | 31.7967 | 42.5462 | 42.6104 | 18.9259 |
| 0.0423 | 19.0 | 4047 | 0.0406 | 42.6076 | 31.9496 | 42.5665 | 42.6086 | 18.9259 |
| 0.0364 | 20.0 | 4260 | 0.0406 | 43.4863 | 33.0331 | 43.4492 | 43.5222 | 18.9259 |
| 0.0326 | 21.0 | 4473 | 0.0362 | 41.8961 | 31.4402 | 41.841 | 41.9024 | 18.9259 |
| 0.0302 | 22.0 | 4686 | 0.0410 | 42.9891 | 32.761 | 42.9509 | 42.9624 | 18.9259 |
| 0.0318 | 23.0 | 4899 | 0.0411 | 42.861 | 32.4727 | 42.8046 | 42.8544 | 18.9259 |
| 0.034 | 24.0 | 5112 | 0.0387 | 42.6177 | 32.1915 | 42.4974 | 42.5653 | 18.9259 |
| 0.0307 | 25.0 | 5325 | 0.0373 | 43.2371 | 32.9299 | 43.2075 | 43.2857 | 18.9259 |
| 0.0308 | 26.0 | 5538 | 0.0377 | 42.8476 | 32.5806 | 42.7802 | 42.837 | 18.9259 |
| 0.0282 | 27.0 | 5751 | 0.0381 | 42.9285 | 32.4737 | 42.8965 | 42.945 | 18.9259 |
| 0.0277 | 28.0 | 5964 | 0.0383 | 42.6384 | 31.6781 | 42.567 | 42.6305 | 18.9259 |
| 0.0316 | 29.0 | 6177 | 0.0380 | 42.9983 | 32.5656 | 42.9407 | 42.9974 | 18.9259 |
| 0.0357 | 30.0 | 6390 | 0.0378 | 43.0447 | 32.5656 | 43.0102 | 43.0802 | 18.9259 |
### Framework versions
- Transformers 4.27.2
- Pytorch 2.0.0+cu118
- Datasets 2.9.0
- Tokenizers 0.13.3
|
smile367/task_qa_distilbert
|
smile367
| 2023-05-13T11:46:46Z | 109 | 0 |
transformers
|
[
"transformers",
"pytorch",
"distilbert",
"question-answering",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2023-05-13T10:45:01Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: task_qa_distilbert
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# task_qa_distilbert
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6780
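A minimal inference sketch is below; the question and context are placeholders.
```python
# A minimal sketch, assuming standard extractive QA inference; inputs are placeholders.
from transformers import pipeline

qa = pipeline("question-answering", model="smile367/task_qa_distilbert")
print(qa(question="What is the capital of France?", context="Paris is the capital of France."))
```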
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 250 | 2.4280 |
| 2.763 | 2.0 | 500 | 1.7672 |
| 2.763 | 3.0 | 750 | 1.6780 |
### Framework versions
- Transformers 4.27.4
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
|
hugogeraldes/Taxi_v3
|
hugogeraldes
| 2023-05-13T11:27:12Z | 0 | 0 | null |
[
"Taxi-v3",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T10:56:49Z |
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: Taxi_v3
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: Taxi-v3
type: Taxi-v3
metrics:
- type: mean_reward
value: 7.56 +/- 2.71
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **Taxi-v3**
This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.
## Usage
```python
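# Note: `load_from_hub` is a helper defined in the Deep RL Course notebook
# (it downloads and unpickles the Q-table from the Hub); `gym` must be imported first.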
model = load_from_hub(repo_id="hugogeraldes/Taxi_v3", filename="q-learning.pkl")
# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
OKTAN94/293Ulzzangmodelsampe
|
OKTAN94
| 2023-05-13T10:45:22Z | 0 | 1 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-13T10:34:57Z |
---
license: creativeml-openrail-m
---
|
shantanusharma/vit-base-patch16-224-finetuned-flower
|
shantanusharma
| 2023-05-13T10:37:04Z | 162 | 0 |
transformers
|
[
"transformers",
"pytorch",
"vit",
"image-classification",
"generated_from_trainer",
"dataset:imagefolder",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
image-classification
| 2023-05-13T10:25:29Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- imagefolder
model-index:
- name: vit-base-patch16-224-finetuned-flower
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# vit-base-patch16-224-finetuned-flower
This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.24.0
- Pytorch 2.0.0+cu118
- Datasets 2.7.1
- Tokenizers 0.13.3
|
yigitkucuk/LLV
|
yigitkucuk
| 2023-05-13T10:24:32Z | 5 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T10:09:39Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 254.44 +/- 16.08
name: mean_reward
verified: false
---
|
Arena/RL_course-unit1
|
Arena
| 2023-05-13T10:24:31Z | 5 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T10:24:09Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 250.52 +/- 42.43
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
TODO: Add your code
```python
from stable_baselines3 import ...
from huggingface_sb3 import load_from_hub
...
```
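A minimal sketch of what that code could look like; the checkpoint filename is an assumption, so adjust it to the actual file in this repo.
```python
# A minimal sketch, assuming the course's default checkpoint filename;
# "ppo-LunarLander-v2.zip" is an assumption, not verified against this repo.
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

checkpoint = load_from_hub("Arena/RL_course-unit1", "ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```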
|
mNikravan/swin-tiny-patch4-window7-224-finetuned-eurosat
|
mNikravan
| 2023-05-13T10:21:38Z | 164 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"swin",
"image-classification",
"generated_from_trainer",
"dataset:cifar10",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
image-classification
| 2023-05-12T11:54:20Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- cifar10
model-index:
- name: swin-tiny-patch4-window7-224-finetuned-eurosat
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# swin-tiny-patch4-window7-224-finetuned-eurosat
This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the cifar10 dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 3
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
wootwoot/Nabylon-v1.0-fp16-inpainting
|
wootwoot
| 2023-05-13T10:12:28Z | 37 | 1 |
diffusers
|
[
"diffusers",
"safetensors",
"text-to-image",
"en",
"license:creativeml-openrail-m",
"autotrain_compatible",
"endpoints_compatible",
"diffusers:StableDiffusionPipeline",
"region:us"
] |
text-to-image
| 2023-05-13T10:03:47Z |
---
license: creativeml-openrail-m
language:
- en
library_name: diffusers
pipeline_tag: text-to-image
---
### Based off [NegiInNattoMaki/Nabylon](https://huggingface.co/NegiInNattoMaki/Nabylon)
All credits go to the original author and all the authors of Nabylon's ancestor models
### Inpainting
This model can be used for inpainting, as it's the original Nabylon-v1.0 merged with
[runwayml/stable-diffusion-inpainting](https://huggingface.co/runwayml/stable-diffusion-inpainting) and
[runwayml/stable-diffusion-v1-5](https://huggingface.co/runwayml/stable-diffusion-v1-5) via the weights of
```
stable-diffusion-inpainting + (Nabylon-v1.0 - stable-diffusion-v1-5)
```
### Diffusers
The merged model was converted to be used with the [🧨Diffusers library](https://github.com/huggingface/diffusers)
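A minimal inpainting sketch is below; the pipeline class, file names, and prompt are assumptions, not something documented by the card.
```python
# A minimal sketch, assuming the weights load with StableDiffusionInpaintPipeline;
# input/mask file names and the prompt are placeholders.
import torch
from diffusers import StableDiffusionInpaintPipeline
from diffusers.utils import load_image

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "wootwoot/Nabylon-v1.0-fp16-inpainting", torch_dtype=torch.float16
).to("cuda")

init_image = load_image("input.png")
mask_image = load_image("mask.png")  # white pixels mark the region to repaint

image = pipe(prompt="1girl, silver hair", image=init_image, mask_image=mask_image).images[0]
image.save("inpainted.png")
```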
|
Atharva192003/zero-shot-classfier
|
Atharva192003
| 2023-05-13T10:10:40Z | 4 | 0 |
transformers
|
[
"transformers",
"bart",
"text-classification",
"zero-shot-classification",
"en",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
zero-shot-classification
| 2023-05-13T05:09:09Z |
---
license: mit
language:
- en
metrics:
- character
pipeline_tag: zero-shot-classification
library_name: transformers
---
# bart-large-mnli
This is the checkpoint for bart-large after being trained on the MultiNLI (MNLI) dataset.
Additional information about this model:
- The bart-large model page
- BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
- BART fairseq implementation
## NLI-based Zero Shot Text Classification
Yin et al. proposed a method for using pre-trained NLI models as ready-made zero-shot sequence classifiers. The method works by posing the sequence to be classified as the NLI premise and constructing a hypothesis from each candidate label. For example, if we want to evaluate whether a sequence belongs to the class "politics", we could construct the hypothesis "This text is about politics." The probabilities for entailment and contradiction are then converted to label probabilities.
This method is surprisingly effective in many cases, particularly when used with larger pre-trained models like BART and RoBERTa. See this blog post for a more expansive introduction to this and other zero-shot methods, and see the code snippets below for examples of using this model for zero-shot classification, both with Hugging Face's built-in pipeline and with native Transformers/PyTorch code.
## With the zero-shot classification pipeline
The model can be loaded with the `zero-shot-classification` pipeline like so:
```python
from transformers import pipeline
classifier = pipeline("zero-shot-classification",
                      model="facebook/bart-large-mnli")
```
You can then use this pipeline to classify sequences into any of the class names you specify.
```python
sequence_to_classify = "one day I will see the world"
candidate_labels = ['travel', 'cooking', 'dancing']
classifier(sequence_to_classify, candidate_labels)
#{'labels': ['travel', 'dancing', 'cooking'],
# 'scores': [0.9938651323318481, 0.0032737774308770895, 0.002861034357920289],
# 'sequence': 'one day I will see the world'}
```
If more than one candidate label can be correct, pass `multi_class=True` to calculate each class independently:
```python
candidate_labels = ['travel', 'cooking', 'dancing', 'exploration']
classifier(sequence_to_classify, candidate_labels, multi_class=True)
#{'labels': ['travel', 'exploration', 'dancing', 'cooking'],
# 'scores': [0.9945111274719238,
#            0.9383890628814697,
#            0.0057061901316046715,
#            0.0018193122232332826],
# 'sequence': 'one day I will see the world'}
```
## With manual PyTorch
```python
# pose sequence as a NLI premise and label as a hypothesis
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"
nli_model = AutoModelForSequenceClassification.from_pretrained('facebook/bart-large-mnli').to(device)
tokenizer = AutoTokenizer.from_pretrained('facebook/bart-large-mnli')

# example inputs (the original snippet left `sequence` and `label` undefined)
sequence = "one day I will see the world"
label = "travel"
premise = sequence
hypothesis = f'This example is {label}.'

# run through model pre-trained on MNLI
x = tokenizer.encode(premise, hypothesis, return_tensors='pt',
                     truncation_strategy='only_first')
logits = nli_model(x.to(device))[0]

# we throw away "neutral" (dim 1) and take the probability of
# "entailment" (2) as the probability of the label being true
entail_contradiction_logits = logits[:,[0,2]]
probs = entail_contradiction_logits.softmax(dim=1)
prob_label_is_true = probs[:,1]
```
|
tp333/tuned_model
|
tp333
| 2023-05-13T10:06:01Z | 8 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"text-classification",
"generated_from_trainer",
"license:apache-2.0",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-04-24T04:18:00Z |
---
license: apache-2.0
tags:
- generated_from_trainer
model-index:
- name: tuned_model
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# tuned_model
This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
### Training results
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu118
- Datasets 2.11.0
- Tokenizers 0.13.3
|
afnan007/PixelPal
|
afnan007
| 2023-05-13T10:03:31Z | 0 | 1 | null |
[
"AI",
"GPT",
"chatGPT",
"Character",
"Chatting",
"Automatic",
"Python",
"text2text-generation",
"en",
"license:mit",
"region:us"
] |
text2text-generation
| 2023-05-12T18:00:10Z |
---
license: mit
language:
- en
tags:
- AI
- GPT
- chatGPT
- Character
- Chatting
- Automatic
- Python
metrics:
- character
pipeline_tag: text2text-generation
---
<div align="center">
<a href="https://github.com/4fnan007/PixelPal"><img src="https://i.ibb.co/Lvmz5Lt/catbird-image-1-removebg-preview.png" width="400" height="300" alt="logo"></a>
<h1 align=center> PixelPAL - The Character Chat </h1>
Have you ever wondered what it would be like if two iconic characters from movies or TV shows could talk to each other? Well, wonder no more! PixelPAL - The Character Chat is an exciting project that brings this concept to life using the power of AI.
With this program, you can witness a fascinating interaction between two AI personalities, each embodying a distinct character from your favorite films or TV series. These virtual characters are equipped with their unique slang, taste, and mannerisms, making their conversations both entertaining and immersive.
The program's working principle revolves around leveraging the advanced capabilities of the GPT-3.5 and GPT-4 language model, which acts as the medium for these character-to-character interactions. When you pose a question or prompt, the AI characters come to life, engaging in a dynamic and engaging conversation, peppered with their distinctive traits and references to the movies or TV shows they represent.
Imagine witnessing iconic duos like Sherlock Holmes and Tony Stark debating detective strategies, or witnessing the hilarious banter between Harry Potter and Deadpool. Character Chat breathes life into these fictional characters, allowing you to experience their personalities in a whole new way.
But that's not all! Character Chat has an added element of surprise. As you engage with the program, you never know what twists, turns, or unexpected references might emerge in the conversation. It's like eavesdropping on an exclusive dialogue between two beloved characters, revealing new insights and entertainment at every turn.
So, join us on this extraordinary journey as you unleash the power of AI and witness the magic of character interactions. Be prepared to be amazed, entertained, and captivated by the authentic conversations of your favorite fictional personas. Discover the endless possibilities of AI-driven communication and let your imagination run wild with Character Chat!
Remember: when you execute this script, two instances of ChatGPT communicate with each other. The script runs a Python program from the source directory; specifically, it runs "firefox_server.py" twice, using different ports, which opens two browser windows.
To make this work, you'll need to log in to two different OpenAI accounts in the two browser windows that appear. Once you've done that, check the terminal where you executed the script; you'll see the program is still running, and you'll know what to do next.
</div>
## Features
- NO API needed (login required)
- Chat output can be saved after clicking (ctrl + c)
- Easy to customize
- Live chat output on terminal itself
## Program Language Used
 
## Getting Started
To get a local copy up and running, follow these simple steps.
Clone the project
```bash
git clone https://huggingface.co/afnan007/GPT-clash
```
Go to the project directory
```bash
cd PixelPal
```
Run the script
```bash
bash pixelpal.sh
```
## Script executing Error
If you run into errors executing pixelpal.sh, fall back to the manual method.
```bash
cd source/
```
Execute firefox_server.py twice to run two instances on different ports.
```bash
python3 firefox_server.py --port 5001 --profile /tmp/chat1
```
```bash
python3 firefox_server.py --port 5002 --profile /tmp/chat2
```
Open another terminal and execute gpt_autoscript.py to start:
```bash
python3 gpt_autoscript.py
```
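Conceptually, gpt_autoscript.py just shuttles each bot's reply to the other bot as its next prompt. The sketch below is purely illustrative and is not the project's actual code: the `/ask` endpoint, payload shape, and response format are all assumptions.
```python
import requests

# Hypothetical relay loop: endpoint name, payload, and response format
# are assumptions for illustration, not PixelPal's real API.
BOTS = ["http://localhost:5001", "http://localhost:5002"]

message = "Hello! Introduce yourself in character."
for turn in range(10):
    bot = BOTS[turn % 2]  # alternate between the two ChatGPT instances
    reply = requests.post(f"{bot}/ask", json={"prompt": message}).json()["answer"]
    print(f"[bot {turn % 2 + 1}] {reply}")
    message = reply  # feed each reply to the other bot as the next prompt
```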
## What I Want You to Know
Hey folks, just wanted to let you know that this program is open source and you have the right to do whatever you want with it. It's like a free buffet, except instead of food, you get lines of code! Yum.
But seriously, this program was created for sh*ts and giggles, and we had a blast watching two AIs chat with each other. We can't guarantee that the conversation was super exciting, but hey, it's AI - they probably just talked about whatever input you gave them.
If you're feeling adventurous, go ahead and play around with the code. Just don't blame us if your computer starts talking back to you. That's when you know you've gone too far.
## Contact
If you have any questions or suggestions, feel free to reach out to me at:
[](https://t.me/afnan007) [](mailto:amanoythegreter232500@gmail.com)
|
Bijibapakmu/larissalora
|
Bijibapakmu
| 2023-05-13T09:50:32Z | 0 | 0 | null |
[
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-13T09:46:45Z |
---
license: creativeml-openrail-m
---
|
labicquette/ppo-SnowballTarget
|
labicquette
| 2023-05-13T09:34:36Z | 0 | 0 |
ml-agents
|
[
"ml-agents",
"tensorboard",
"onnx",
"SnowballTarget",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-SnowballTarget",
"region:us"
] |
reinforcement-learning
| 2023-05-13T09:34:31Z |
---
library_name: ml-agents
tags:
- SnowballTarget
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-SnowballTarget
---
# **ppo** Agent playing **SnowballTarget**
This is a trained model of a **ppo** agent playing **SnowballTarget** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We also wrote a complete tutorial on training your first agent with ML-Agents and publishing it to the Hub; see the documentation link above.
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-SnowballTarget
2. Step 1: Find your model_id: labicquette/ppo-SnowballTarget
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
Rohit001/face_celeb
|
Rohit001
| 2023-05-13T09:30:43Z | 0 | 0 |
keras
|
[
"keras",
"video-classification",
"license:cc",
"region:us"
] |
video-classification
| 2023-05-13T09:29:00Z |
---
license: cc
metrics:
- accuracy
library_name: keras
pipeline_tag: video-classification
---
|
demetere/Reinforce-v1.1.0
|
demetere
| 2023-05-13T09:17:37Z | 0 | 0 | null |
[
"CartPole-v1",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T09:17:28Z |
---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-v1.1.0
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 500.00 +/- 0.00
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1**.
To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
demetere/Reinforce-v1.0.0
|
demetere
| 2023-05-13T08:56:44Z | 0 | 0 | null |
[
"CartPole-v1",
"reinforce",
"reinforcement-learning",
"custom-implementation",
"deep-rl-class",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T08:56:40Z |
---
tags:
- CartPole-v1
- reinforce
- reinforcement-learning
- custom-implementation
- deep-rl-class
model-index:
- name: Reinforce-v1.0.0
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: CartPole-v1
type: CartPole-v1
metrics:
- type: mean_reward
value: 68.40 +/- 14.87
name: mean_reward
verified: false
---
# **Reinforce** Agent playing **CartPole-v1**
This is a trained model of a **Reinforce** agent playing **CartPole-v1**.
To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
|
bikram98237/q-FrozenLake-v1-4x4-noSlippery
|
bikram98237
| 2023-05-13T08:46:54Z | 0 | 0 | null |
[
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T08:46:51Z |
---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: FrozenLake-v1-4x4-no_slippery
type: FrozenLake-v1-4x4-no_slippery
metrics:
- type: mean_reward
value: 1.00 +/- 0.00
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
## Usage
```python
import gym

# load_from_hub is the helper defined in the Deep RL course notebook;
# it downloads and unpickles the Q-table dictionary from the Hub
model = load_from_hub(repo_id="bikram98237/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")

# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
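Once loaded, acting greedily with the Q-table looks roughly like this; the evaluation loop and the `"qtable"` key are assumptions following the Deep RL course convention, since the card only shows loading:
```python
import numpy as np

# Greedy rollout sketch (not part of the original card).
qtable = model["qtable"]  # key name assumed from the Deep RL course convention
state = env.reset()
done = False
while not done:
    action = int(np.argmax(qtable[state]))  # exploit the learned values
    state, reward, done, info = env.step(action)
print("episode reward:", reward)
```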
|
chrisrov/ppo-LunarLander-v2
|
chrisrov
| 2023-05-13T08:40:55Z | 1 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T08:20:35Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 236.20 +/- 23.22
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the checkpoint filename below is an assumption; check the repo's file list):
```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# Download the checkpoint from the Hub and load it into a PPO model.
# The filename is an assumption; check the repository's file list.
checkpoint = load_from_hub(repo_id="chrisrov/ppo-LunarLander-v2", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
|
Tingwen/PPO-LunarLander-V2
|
Tingwen
| 2023-05-13T08:26:15Z | 0 | 0 |
stable-baselines3
|
[
"stable-baselines3",
"LunarLander-v2",
"deep-reinforcement-learning",
"reinforcement-learning",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T08:25:56Z |
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: LunarLander-v2
type: LunarLander-v2
metrics:
- type: mean_reward
value: 252.33 +/- 19.74
name: mean_reward
verified: false
---
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).
## Usage (with Stable-baselines3)
A minimal loading sketch (the checkpoint filename below is an assumption; check the repo's file list):
```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# Download the checkpoint from the Hub and load it into a PPO model.
# The filename is an assumption; check the repository's file list.
checkpoint = load_from_hub(repo_id="Tingwen/PPO-LunarLander-V2", filename="PPO-LunarLander-V2.zip")
model = PPO.load(checkpoint)
```
|
sai1881/bloom-560m-finetuned-Bank-test-v0
|
sai1881
| 2023-05-13T08:15:33Z | 12 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bloom",
"text-generation",
"generated_from_trainer",
"license:bigscience-bloom-rail-1.0",
"autotrain_compatible",
"text-generation-inference",
"endpoints_compatible",
"region:us"
] |
text-generation
| 2023-05-12T21:08:19Z |
---
license: bigscience-bloom-rail-1.0
tags:
- generated_from_trainer
model-index:
- name: bloom-560m-finetuned-Bank-test-v0
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# bloom-560m-finetuned-Bank-test-v0
This model is a fine-tuned version of [bigscience/bloom-560m](https://huggingface.co/bigscience/bloom-560m) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.6120
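A minimal generation sketch; the card shows no usage, so standard `transformers` causal-LM loading is assumed here:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Standard causal-LM inference; usage assumed, not shown in the original card.
tokenizer = AutoTokenizer.from_pretrained("sai1881/bloom-560m-finetuned-Bank-test-v0")
model = AutoModelForCausalLM.from_pretrained("sai1881/bloom-560m-finetuned-Bank-test-v0")

inputs = tokenizer("The bank approved the loan because", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```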
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log | 1.0 | 5 | 2.3864 |
| No log | 2.0 | 10 | 1.8167 |
| No log | 3.0 | 15 | 1.6120 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3
|
imranypatel/temp-model
|
imranypatel
| 2023-05-13T07:49:16Z | 0 | 0 | null |
[
"region:us"
] | null | 2023-05-13T07:22:40Z |
# temp-model
This is a dummy model, created just for testing while following the Hugging Face getting-started documentation.
|
instruction-tuning-sd/scratch-low-level-img-proc
|
instruction-tuning-sd
| 2023-05-13T07:45:10Z | 6 | 3 |
diffusers
|
[
"diffusers",
"stable-diffusion",
"stable-diffusion-diffusers",
"image-to-image",
"dataset:instruction-tuning-sd/low-level-image-proc",
"arxiv:2211.09800",
"arxiv:2109.01652",
"license:mit",
"diffusers:StableDiffusionInstructPix2PixPipeline",
"region:us"
] |
image-to-image
| 2023-03-23T04:07:29Z |
---
license: mit
tags:
- stable-diffusion
- stable-diffusion-diffusers
- image-to-image
widget:
- src: >-
https://hf.co/datasets/sayakpaul/sample-datasets/resolve/main/derain%20the%20image_1.png
prompt: derain the image
datasets:
- instruction-tuning-sd/low-level-image-proc
---
# Instruction-tuned Stable Diffusion for Low-level Image Processing (Scratch)
This pipeline is an 'instruction-tuned' version of [Stable Diffusion (v1.5)](https://huggingface.co/runwayml/stable-diffusion-v1-5). It was trained using the [InstructPix2Pix methodology](https://huggingface.co/papers/2211.09800).
## Pipeline description
The motivation behind this pipeline comes partly from [FLAN](https://huggingface.co/papers/2109.01652) and partly
from [InstructPix2Pix](https://huggingface.co/papers/2211.09800). The main idea is to first create an
instruction-prompted dataset (as described in [our blog](https://hf.co/blog/instruction-tuning-sd)) and then conduct InstructPix2Pix-style
training. The end objective is to make Stable Diffusion better at following specific instructions
that entail image-transformation operations.
<p align="center">
<img src="https://huggingface.co/datasets/sayakpaul/sample-datasets/resolve/main/instruction-tuning-sd.png" width=600/>
</p>
Follow [this post](https://hf.co/blog/instruction-tuning-sd) to know more.
## Training procedure and results
Training was conducted on [instruction-tuning-sd/low-level-image-proc](https://huggingface.co/datasets/instruction-tuning-sd/low-level-image-proc) dataset. Refer to
[this repository](https://github.com/huggingface/instruction-tuned-sd) to know more.
Here are some results derived from the pipeline:
<p align="center">
<img src="https://huggingface.co/datasets/sayakpaul/sample-datasets/resolve/main/img_proc_results.png" width=600/>
</p>
## Intended uses & limitations
You can use the pipeline for performing low-level image processing with an input image and an input prompt.
### How to use
Here is how to use this model:
```python
import torch
from diffusers import StableDiffusionInstructPix2PixPipeline
from diffusers.utils import load_image
model_id = "instruction-tuning-sd/scratch-low-level-img-proc"
pipeline = StableDiffusionInstructPix2PixPipeline.from_pretrained(
model_id, torch_dtype=torch.float16, use_auth_token=True
).to("cuda")
image_path = "https://hf.co/datasets/sayakpaul/sample-datasets/resolve/main/derain%20the%20image_1.png"
image = load_image(image_path)
image = pipeline("derain the image", image=image).images[0]
image.save("image.png")
```
For notes on limitations, misuse, malicious use, out-of-scope use, please refer to the model card
[here](https://huggingface.co/runwayml/stable-diffusion-v1-5).
## Citation
**FLAN**
```bibtex
@inproceedings{
wei2022finetuned,
title={Finetuned Language Models are Zero-Shot Learners},
author={Jason Wei and Maarten Bosma and Vincent Zhao and Kelvin Guu and Adams Wei Yu and Brian Lester and Nan Du and Andrew M. Dai and Quoc V Le},
booktitle={International Conference on Learning Representations},
year={2022},
url={https://openreview.net/forum?id=gEZrGCozdqR}
}
```
**InstructPix2Pix**
```bibtex
@InProceedings{
brooks2022instructpix2pix,
author = {Brooks, Tim and Holynski, Aleksander and Efros, Alexei A.},
title = {InstructPix2Pix: Learning to Follow Image Editing Instructions},
booktitle = {CVPR},
year = {2023},
}
```
**Instruction-tuning for Stable Diffusion blog**
```bibtex
@article{
Paul2023instruction-tuning-sd,
author = {Paul, Sayak},
title = {Instruction-tuning Stable Diffusion with InstructPix2Pix},
journal = {Hugging Face Blog},
year = {2023},
note = {https://huggingface.co/blog/instruction-tuning-sd},
}
```
|
digitous/GPT4-x-MedOAlpacino-13B
|
digitous
| 2023-05-13T07:12:54Z | 0 | 5 | null |
[
"llama",
"alpaca",
"alpacino",
"medalpaca",
"kobold",
"oasst",
"gpt4xalpaca",
"region:us"
] | null | 2023-05-13T05:12:18Z |
---
tags:
- llama
- alpaca
- alpacino
- medalpaca
- kobold
- oasst
- gpt4xalpaca
---
# For use with KoboldCPP
# GPT4-x-MedOAlpacino-13B Recipe
`(Alpacino-13B + (MedAlpaca-13B + (GPT4-x-Alpaca-13B + OASST-LLaMa-13B)))`
## Intermediate Step Merges
1. `(GPT4-x-Alpaca-13B + OASST-LLaMa-13B)` = **GPT4-x-OAlpaca-13B**
2. `(MedAlpaca-13B + GPT4-x-OAlpaca-13B)` = **GPT4-x-MedOAlpaca-13B**
3. `(Alpacino-13B + GPT4-x-MedOAlpaca-13B)` = **GPT4-x-MedOAlpacino-13B** (the final model, per the recipe above)
## Original Models
Alpacino-13B: https://huggingface.co/digitous/Alpacino13b
GPT4-x-Alpaca-13B: https://huggingface.co/chavinlo/gpt4-x-alpaca
OASST-LLaMa-13B: https://huggingface.co/dvruette/oasst-llama-13b-2-epochs
MedAlpaca-13B: https://huggingface.co/medalpaca/medalpaca-13b
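The card does not state how the checkpoints were combined. A plain weighted average of the state dicts is one common way to perform such merges; the sketch below is an assumption about the technique, not the author's actual procedure, and the file names are illustrative:
```python
import torch

def merge_state_dicts(path_a: str, path_b: str, out_path: str, alpha: float = 0.5) -> None:
    """Average two checkpoints with weight alpha on the first (illustrative)."""
    sd_a = torch.load(path_a, map_location="cpu")
    sd_b = torch.load(path_b, map_location="cpu")
    merged = {k: alpha * sd_a[k] + (1.0 - alpha) * sd_b[k] for k in sd_a}
    torch.save(merged, out_path)

# Step 1 of the recipe, with illustrative file names
merge_state_dicts("gpt4-x-alpaca-13b.pth", "oasst-llama-13b.pth", "gpt4-x-oalpaca-13b.pth")
```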
|
muhammadravi251001/fine-tuned-DatasetQAS-TYDI-QA-ID-with-indobert-large-p2-with-ITTL-with-freeze-LR-1e-05
|
muhammadravi251001
| 2023-05-13T06:32:21Z | 32 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"bert",
"question-answering",
"generated_from_trainer",
"license:mit",
"endpoints_compatible",
"region:us"
] |
question-answering
| 2023-05-07T12:48:27Z |
---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: fine-tuned-DatasetQAS-TYDI-QA-ID-with-indobert-large-p2-with-ITTL-with-freeze-LR-1e-05
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# fine-tuned-DatasetQAS-TYDI-QA-ID-with-indobert-large-p2-with-ITTL-with-freeze-LR-1e-05
This model is a fine-tuned version of [indobenchmark/indobert-large-p2](https://huggingface.co/indobenchmark/indobert-large-p2) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2003
- Exact Match: 60.2113
- F1: 73.9948
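A minimal question-answering sketch with the standard `transformers` pipeline; usage is assumed, since the card shows none, and the Indonesian example is illustrative:
```python
from transformers import pipeline

# Standard QA pipeline; usage assumed, not shown in the original card.
qa = pipeline(
    "question-answering",
    model="muhammadravi251001/fine-tuned-DatasetQAS-TYDI-QA-ID-with-indobert-large-p2-with-ITTL-with-freeze-LR-1e-05",
)
result = qa(
    question="Siapa presiden pertama Indonesia?",
    context="Soekarno adalah presiden pertama Indonesia.",
)
print(result["answer"], result["score"])
```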
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 64
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
### Training results
| Training Loss | Epoch | Step | Validation Loss | Exact Match | F1 |
|:-------------:|:-----:|:----:|:---------------:|:-----------:|:-------:|
| 6.2316 | 0.5 | 19 | 3.5321 | 11.9718 | 21.8197 |
| 6.2316 | 0.99 | 38 | 2.6566 | 19.1901 | 31.9985 |
| 3.5132 | 1.5 | 57 | 2.1442 | 27.2887 | 40.7031 |
| 3.5132 | 1.99 | 76 | 1.6755 | 41.5493 | 53.9850 |
| 3.5132 | 2.5 | 95 | 1.4228 | 48.2394 | 61.2829 |
| 1.845 | 2.99 | 114 | 1.2882 | 52.8169 | 66.2197 |
| 1.845 | 3.5 | 133 | 1.2352 | 54.7535 | 68.3725 |
| 1.2542 | 3.99 | 152 | 1.2033 | 56.6901 | 70.5019 |
| 1.2542 | 4.5 | 171 | 1.2117 | 57.9225 | 72.0740 |
| 1.2542 | 4.99 | 190 | 1.1748 | 58.4507 | 71.9264 |
| 0.9877 | 5.5 | 209 | 1.1763 | 58.8028 | 72.2772 |
| 0.9877 | 5.99 | 228 | 1.1827 | 59.5070 | 73.5652 |
| 0.9877 | 6.5 | 247 | 1.1789 | 59.8592 | 73.2748 |
| 0.8293 | 6.99 | 266 | 1.1835 | 60.0352 | 73.4695 |
| 0.8293 | 7.5 | 285 | 1.1669 | 59.8592 | 73.7145 |
| 0.7663 | 7.99 | 304 | 1.1912 | 60.3873 | 74.3001 |
| 0.7663 | 8.5 | 323 | 1.1828 | 60.2113 | 74.1533 |
| 0.7663 | 8.99 | 342 | 1.2046 | 60.3873 | 74.0424 |
| 0.7068 | 9.5 | 361 | 1.2003 | 60.2113 | 73.9948 |
### Framework versions
- Transformers 4.26.1
- Pytorch 1.13.1+cu117
- Datasets 2.2.0
- Tokenizers 0.13.2
|
Bilguunee/q-FrozenLake-v1-4x4-noSlippery
|
Bilguunee
| 2023-05-13T06:04:15Z | 0 | 0 | null |
[
"FrozenLake-v1-4x4-no_slippery",
"q-learning",
"reinforcement-learning",
"custom-implementation",
"model-index",
"region:us"
] |
reinforcement-learning
| 2023-05-13T06:04:12Z |
---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery
results:
- task:
type: reinforcement-learning
name: reinforcement-learning
dataset:
name: FrozenLake-v1-4x4-no_slippery
type: FrozenLake-v1-4x4-no_slippery
metrics:
- type: mean_reward
value: 1.00 +/- 0.00
name: mean_reward
verified: false
---
# **Q-Learning** Agent playing **FrozenLake-v1**
This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.
## Usage
```python
import gym

# load_from_hub is the helper defined in the Deep RL course notebook
model = load_from_hub(repo_id="Bilguunee/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")

# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
|
Cameronrc7/SimonSays
|
Cameronrc7
| 2023-05-13T04:22:09Z | 0 | 0 | null |
[
"region:us"
] | null | 2023-05-13T04:19:36Z |
---
title: Simon Says
emoji: 🤖
colorFrom: indigo
colorTo: green
sdk: gradio
sdk_version: 3.27.0
app_file: app.py
pinned: false
---
|
NextLaoHuang/RWKV-Sloshed-Lawyer
|
NextLaoHuang
| 2023-05-13T04:19:01Z | 0 | 1 | null |
[
"legal",
"question-answering",
"license:unlicense",
"region:us"
] |
question-answering
| 2023-05-07T07:17:05Z |
---
license: unlicense
pipeline_tag: question-answering
tags:
- legal
---
## Introduction
"Sloshed Lawyer" is a natural-language-processing model based on the RWKV architecture. It was trained on a criminal-law question-answering dataset and a dataset of Chinese bar-exam study materials, and can be used to answer law-related questions. The model understands natural-language questions in the legal domain and tries to give accurate answers. Although it has some grasp of the law, there is no guarantee that its advice carries any legal force, hence the name "Sloshed Lawyer".
## Usage
Use it with any program that supports RWKV. Combining it with the wenda knowledge base is recommended; it improves the results, though not by much.
Note that this is a LoRA model and must be merged with its base model before use. The base model used for fine-tuning is RWKV-4-Raven-7B-v11-EngChn49-ctx8192.pth, and the merge script is merge_lora.py. Example merge command:
```bash
python3 merge_lora.py --use-gpu 32 RWKV-4-Raven-7B-v11-Eng49%-Chn49%-Jpn1%-Other1%-20230430-ctx8192.pth rwkv-lora.pth output.pth
```
Downloads for the pre-merged model:
autodl:
https://www.codewithgpu.com/m/file/RWKV-Sloshed-Lawyer-7B
Baidu Netdisk:
Link: https://pan.baidu.com/s/1ueufuVWeKLoYE7vlflSC5w  extraction code: ib71
Programs that can run RWKV:
RWKV desktop one-click bundle (local; probably the easiest way to get started):
https://zhuanlan.zhihu.com/p/615655028
wenda one-click bundle (local):
Link: https://pan.baidu.com/s/105nOsldGt5mEPoT2np1ZoA?pwd=lyqz
Video tutorial: https://www.bilibili.com/video/BV1aX4y1z7ar/?vd_source=629edb00375d46ad4097acdc7cbc0ca3
Extraction code: lyqz
autodl-wenda one-click bundle (cloud):
https://www.codewithgpu.com/i/l15y/wenda/Wenda-ChatGLM-Vincuna-RWKV
## Changelog
- 2023-05-07: v1.0.0 initial release.
## Disclaimer
This model is for study and discussion only; do not use it for illegal purposes. More practical legal models, or more interesting small models, may follow, but my personal capacity is limited. If you are interested, join QQ group 759852889 to discuss.
|
akommala/xlm-roberta-base-finetuned-panx-all
|
akommala
| 2023-05-13T04:18:13Z | 124 | 0 |
transformers
|
[
"transformers",
"pytorch",
"xlm-roberta",
"token-classification",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2023-05-13T04:13:41Z |
---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: xlm-roberta-base-finetuned-panx-all
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-panx-all
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2922
- F1: 0.7806
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log | 1.0 | 219 | 0.3294 | 0.7191 |
| 0.4012 | 2.0 | 438 | 0.3038 | 0.7586 |
| 0.4012 | 3.0 | 657 | 0.2922 | 0.7806 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
|
akommala/xlm-roberta-base-finetuned-panx-en
|
akommala
| 2023-05-13T04:12:49Z | 114 | 0 |
transformers
|
[
"transformers",
"pytorch",
"xlm-roberta",
"token-classification",
"generated_from_trainer",
"dataset:xtreme",
"license:mit",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2023-05-13T04:10:18Z |
---
license: mit
tags:
- generated_from_trainer
datasets:
- xtreme
metrics:
- f1
model-index:
- name: xlm-roberta-base-finetuned-panx-en
results:
- task:
name: Token Classification
type: token-classification
dataset:
name: xtreme
type: xtreme
config: PAN-X.en
split: validation
args: PAN-X.en
metrics:
- name: F1
type: f1
value: 0.666098807495741
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-panx-en
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the xtreme dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4067
- F1: 0.6661
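A minimal NER sketch via the token-classification pipeline; usage is assumed, not shown in the card:
```python
from transformers import pipeline

# Token-classification pipeline; usage assumed, not part of the original card.
ner = pipeline(
    "token-classification",
    model="akommala/xlm-roberta-base-finetuned-panx-en",
    aggregation_strategy="simple",  # group word pieces into entity spans
)
print(ner("Jeff Dean works at Google in California."))
```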
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log | 1.0 | 32 | 0.6516 | 0.5020 |
| 0.901 | 2.0 | 64 | 0.4286 | 0.6479 |
| 0.901 | 3.0 | 96 | 0.4067 | 0.6661 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
|
akommala/xlm-roberta-base-finetuned-panx-hi-ta
|
akommala
| 2023-05-13T04:02:32Z | 101 | 0 |
transformers
|
[
"transformers",
"pytorch",
"xlm-roberta",
"token-classification",
"generated_from_trainer",
"license:mit",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
token-classification
| 2023-05-13T03:58:30Z |
---
license: mit
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: xlm-roberta-base-finetuned-panx-hi-ta
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# xlm-roberta-base-finetuned-panx-hi-ta
This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2195
- F1: 0.8388
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 48
- eval_batch_size: 48
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
### Training results
| Training Loss | Epoch | Step | Validation Loss | F1 |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| No log | 1.0 | 186 | 0.3174 | 0.7680 |
| 0.3919 | 2.0 | 372 | 0.2364 | 0.8062 |
| 0.3919 | 3.0 | 558 | 0.2195 | 0.8388 |
### Framework versions
- Transformers 4.28.1
- Pytorch 2.0.0+cu117
- Datasets 2.12.0
- Tokenizers 0.13.3
|
bullhug/kl-f8-anime2
|
bullhug
| 2023-05-13T03:59:39Z | 9 | 4 |
diffusers
|
[
"diffusers",
"vae",
"en",
"license:creativeml-openrail-m",
"region:us"
] | null | 2023-05-13T03:53:22Z |
---
license: creativeml-openrail-m
language:
- en
tags:
- vae
---
Diffusers-format conversion of kl-f8-anime2.ckpt from https://huggingface.co/hakurei/waifu-diffusion-v1-4
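A minimal sketch of loading this VAE and swapping it into a Stable Diffusion pipeline. This assumes the repo holds a root-level `AutoencoderKL`; the base pipeline below is also an assumption, so substitute whichever SD 1.x model you use:
```python
import torch
from diffusers import AutoencoderKL, StableDiffusionPipeline

# Load the VAE from this repo (assumed to be a root-level AutoencoderKL)
vae = AutoencoderKL.from_pretrained("bullhug/kl-f8-anime2", torch_dtype=torch.float16)

# Swap it into a base pipeline; the base model choice here is illustrative
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", vae=vae, torch_dtype=torch.float16
).to("cuda")
image = pipe("a watercolor landscape, anime style").images[0]
image.save("sample.png")
```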
|
ManishW/text-classification-model
|
ManishW
| 2023-05-13T03:57:37Z | 5 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"distilbert",
"text-classification",
"generated_from_trainer",
"dataset:imdb",
"license:apache-2.0",
"model-index",
"autotrain_compatible",
"endpoints_compatible",
"region:us"
] |
text-classification
| 2023-05-13T03:02:32Z |
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- imdb
metrics:
- accuracy
model-index:
- name: text-classification-model
results:
- task:
name: Text Classification
type: text-classification
dataset:
name: imdb
type: imdb
config: plain_text
split: test
args: plain_text
metrics:
- name: Accuracy
type: accuracy
value: 0.93072
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# text-classification-model
This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the imdb dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2158
- Accuracy: 0.9307
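A minimal inference sketch with the text-classification pipeline; usage is assumed, since the card shows none:
```python
from transformers import pipeline

# Sentiment inference on IMDb-style text; usage assumed, not shown in the card.
clf = pipeline("text-classification", model="ManishW/text-classification-model")
print(clf("This movie was a delightful surprise from start to finish."))
```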
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2859 | 1.0 | 782 | 0.1943 | 0.9241 |
| 0.1005 | 2.0 | 1564 | 0.2158 | 0.9307 |
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
|
asenella/mmnist_MVTCAEconfig2_seed_3_ratio_05_i
|
asenella
| 2023-05-13T03:55:05Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-13T03:54:56Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
sudheer997/final-project
|
sudheer997
| 2023-05-13T03:39:42Z | 105 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"wav2vec2",
"automatic-speech-recognition",
"generated_from_trainer",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-05-13T02:43:31Z |
---
tags:
- generated_from_trainer
model-index:
- name: uaspeech-foundation-fintuned
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# uaspeech-foundation-fintuned
It achieves the following results on the evaluation set:
- Loss: 2.5324
- Wer: 1.2855
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 30
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| 41.2984 | 0.7 | 500 | 2.8954 | 1.0 |
| 3.0227 | 1.4 | 1000 | 2.8232 | 1.0042 |
| 2.8283 | 2.11 | 1500 | 2.6291 | 1.0309 |
| 2.5552 | 2.81 | 2000 | 2.2593 | 1.9170 |
| 2.1714 | 3.51 | 2500 | 1.9586 | 1.9142 |
| 1.8537 | 4.21 | 3000 | 1.5725 | 1.8579 |
| 1.6087 | 4.92 | 3500 | 1.2772 | 1.7426 |
| 1.3108 | 5.62 | 4000 | 1.2792 | 1.6751 |
| 1.1652 | 6.32 | 4500 | 1.4565 | 1.6174 |
| 1.0113 | 7.02 | 5000 | 1.1906 | 1.5626 |
| 0.925 | 7.72 | 5500 | 1.4491 | 1.5260 |
| 0.8183 | 8.43 | 6000 | 1.3712 | 1.5387 |
| 0.7118 | 9.13 | 6500 | 1.4713 | 1.4866 |
| 0.6959 | 9.83 | 7000 | 1.3336 | 1.4318 |
| 0.6146 | 10.53 | 7500 | 1.3690 | 1.4177 |
| 0.5655 | 11.24 | 8000 | 1.3789 | 1.4135 |
| 0.4969 | 11.94 | 8500 | 1.5476 | 1.3966 |
| 0.4705 | 12.64 | 9000 | 1.9062 | 1.3797 |
| 0.4387 | 13.34 | 9500 | 1.2711 | 1.3924 |
| 0.4115 | 14.04 | 10000 | 1.6318 | 1.3769 |
| 0.3695 | 14.75 | 10500 | 1.5119 | 1.3755 |
| 0.377 | 15.45 | 11000 | 1.6637 | 1.3812 |
| 0.3788 | 16.15 | 11500 | 1.6636 | 1.3699 |
| 0.3396 | 16.85 | 12000 | 1.6572 | 1.3418 |
| 0.3047 | 17.56 | 12500 | 1.4740 | 1.3361 |
| 0.2804 | 18.26 | 13000 | 2.0885 | 1.3249 |
| 0.2995 | 18.96 | 13500 | 1.9536 | 1.3235 |
| 0.2628 | 19.66 | 14000 | 1.7736 | 1.3179 |
| 0.2703 | 20.37 | 14500 | 2.0018 | 1.3291 |
| 0.2335 | 21.07 | 15000 | 1.7962 | 1.3221 |
| 0.2068 | 21.77 | 15500 | 2.3187 | 1.3136 |
| 0.2311 | 22.47 | 16000 | 2.4853 | 1.3291 |
| 0.2491 | 23.17 | 16500 | 2.1901 | 1.3024 |
| 0.1836 | 23.88 | 17000 | 2.4344 | 1.2911 |
| 0.1823 | 24.58 | 17500 | 2.3705 | 1.3066 |
| 0.1575 | 25.28 | 18000 | 2.1864 | 1.2897 |
| 0.1451 | 25.98 | 18500 | 2.4216 | 1.2883 |
| 0.1502 | 26.69 | 19000 | 2.1780 | 1.2855 |
| 0.1392 | 27.39 | 19500 | 2.4009 | 1.2925 |
| 0.1609 | 28.09 | 20000 | 2.4250 | 1.2982 |
| 0.1066 | 28.79 | 20500 | 2.4433 | 1.2897 |
| 0.1514 | 29.49 | 21000 | 2.5063 | 1.2855 |
### Framework versions
- Transformers 4.23.1
- Pytorch 1.12.1+cu113
- Datasets 1.18.3
- Tokenizers 0.13.2
|
hts98/model
|
hts98
| 2023-05-13T03:33:42Z | 75 | 0 |
transformers
|
[
"transformers",
"pytorch",
"jax",
"whisper",
"automatic-speech-recognition",
"generated_from_trainer",
"license:apache-2.0",
"endpoints_compatible",
"region:us"
] |
automatic-speech-recognition
| 2023-04-13T07:07:12Z |
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: model
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# model
This model is a fine-tuned version of [hts98/whisper-medium-1113](https://huggingface.co/hts98/whisper-medium-1113) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3198
- Wer: 100.0
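A minimal transcription sketch with the ASR pipeline; usage is assumed, not shown in the card, and `sample.wav` is a placeholder:
```python
from transformers import pipeline

# Whisper fine-tune inference; usage assumed, not part of the original card.
asr = pipeline("automatic-speech-recognition", model="hts98/model")
print(asr("sample.wav")["text"])  # path to any local audio file
```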
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 12
- eval_batch_size: 12
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 30
- training_steps: 200
- mixed_precision_training: Native AMP
### Training results
| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-----:|:----:|:---------------:|:-----:|
| 0.1889 | 1.43 | 30 | 0.2595 | 100.0 |
| 0.067 | 2.86 | 60 | 0.2960 | 100.0 |
| 0.0319 | 4.29 | 90 | 0.3027 | 100.0 |
| 0.0171 | 5.71 | 120 | 0.3166 | 100.0 |
| 0.0067 | 7.14 | 150 | 0.3214 | 100.0 |
| 0.0028 | 8.57 | 180 | 0.3198 | 100.0 |
### Framework versions
- Transformers 4.29.0.dev0
- Pytorch 2.0.0+cu117
- Datasets 2.7.0
- Tokenizers 0.13.3
|
asenella/mmnist_MVTCAEconfig2_seed_0_ratio_05_i
|
asenella
| 2023-05-13T03:31:22Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-13T03:31:14Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
jontromanab/snowBallTarget
|
jontromanab
| 2023-05-13T03:17:34Z | 0 | 0 |
ml-agents
|
[
"ml-agents",
"tensorboard",
"onnx",
"SnowballTarget",
"deep-reinforcement-learning",
"reinforcement-learning",
"ML-Agents-SnowballTarget",
"region:us"
] |
reinforcement-learning
| 2023-05-13T03:13:04Z |
---
library_name: ml-agents
tags:
- SnowballTarget
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-SnowballTarget
---
# **ppo** Agent playing **SnowballTarget**
This is a trained model of a **ppo** agent playing **SnowballTarget** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).
## Usage (with ML-Agents)
The Documentation: https://github.com/huggingface/ml-agents#get-started
We also wrote a complete tutorial on training your first agent with ML-Agents and publishing it to the Hub; see the documentation link above.
### Resume the training
```
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```
### Watch your Agent play
You can watch your agent **playing directly in your browser**:
1. Go to https://huggingface.co/spaces/unity/ML-Agents-SnowballTarget
2. Step 1: Find your model_id: jontromanab/snowBallTarget
3. Step 2: Select your *.nn /*.onnx file
4. Click on Watch the agent play 👀
|
az00/pls-relu-segformer-b5-scene-parse-150-cvfinal
|
az00
| 2023-05-13T03:15:36Z | 1 | 0 |
transformers
|
[
"transformers",
"pytorch",
"tensorboard",
"segformer",
"generated_from_trainer",
"dataset:scene_parse_150",
"license:other",
"endpoints_compatible",
"region:us"
] | null | 2023-05-13T01:49:17Z |
---
license: other
tags:
- generated_from_trainer
datasets:
- scene_parse_150
model-index:
- name: pls-relu-segformer-b5-scene-parse-150-cvfinal
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# pls-relu-segformer-b5-scene-parse-150-cvfinal
This model is a fine-tuned version of [nvidia/segformer-b5-finetuned-ade-640-640](https://huggingface.co/nvidia/segformer-b5-finetuned-ade-640-640) on the scene_parse_150 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2163
- Mean Iou: 0.6274
- Mean Accuracy: 0.7469
- Overall Accuracy: 0.9418
- Per Category Iou: [0.8684705920049987, 0.9788583293272919, 0.9606747391887004, 0.9253111886254581, 0.24568229913304662, 0.9673287812952618, 0.9833414396887159, nan, 0.7001172811177401, nan, 0.8421861956856026, 0.8851052596845509, 0.9197815739241791, 0.7674147657591366, 0.4244409937888199, 0.9139732741480556, 0.975721673551077, 0.595, nan, 0.802955985057658, 0.43452699091394975, nan, 0.9271645968165122, nan, 0.0, nan, nan, 0.9220460101548298, nan, nan, nan, nan, nan, 0.6805356890237193, nan, nan, 0.5647439438175649, 0.7019540200293489, 0.0, nan, nan, 0.0, nan, 0.24444444444444444, nan, nan, nan, 0.8638079583837581, nan, nan, 0.9697141888726062, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3530150753768844, nan, 0.6765223671737463, 0.8920254678485999, nan, nan, 0.8264184572729973, 0.0, 0.8616266197825097, nan, nan, 0.7762981574539364, 0.8315092165898618, nan, nan, 0.8692742104162309, nan, nan, nan, nan, nan, nan, 0.824181626187962, 0.0, nan, 0.9686018546354324, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.09251101321585903, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8853530031612223, 0.6819296811120196, nan, nan, nan, 0.9152956298200514, nan, nan, nan, 0.22230014025245443, 0.8228541612314114, 0.0, nan, 0.9069607843137255, 0.7778216258879243, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan]
- Per Category Accuracy: [0.9598165178225602, 0.9873972772341073, 0.9962357062595016, 0.9644300578376044, 0.26165030897855324, 0.9937295413000042, 0.9923120849870924, nan, 0.7356703682370433, nan, 0.9995828336027534, 0.9734359123239947, 0.9432798096937258, 0.8381890027459648, 0.43059909576395344, 0.9671918579047806, 0.998369031064481, 0.8503918856615952, nan, 0.851498449879435, 0.9387990762124712, nan, 0.9689602222872185, nan, nan, nan, nan, 0.9622666195653596, nan, nan, nan, nan, nan, 0.8897451705713111, nan, nan, 0.5998156218571907, 0.7033120099050764, nan, nan, nan, 0.0, nan, 0.24444444444444444, nan, nan, nan, 0.9703948703380545, nan, nan, 0.990348296189206, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46484698097601324, nan, 0.707100731304553, 0.9857723182180411, nan, nan, 0.9201764057331864, nan, 0.8760013240648792, nan, nan, 0.8064207412563077, 0.8360845641471184, nan, nan, 0.8786004968024946, nan, nan, nan, nan, nan, nan, 0.9770498643855623, 0.0, nan, 0.9719508516266766, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0987460815047022, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8853530031612223, 0.7387068201948627, nan, nan, nan, 0.9156094633979085, nan, nan, nan, 0.2231412230532336, 0.8276043033324587, nan, nan, 0.9306841046277666, 0.794718008164911, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
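A minimal segmentation inference sketch; usage is assumed, not shown in the card. If this repo lacks a preprocessor config, load the processor from the base `nvidia/segformer-b5-finetuned-ade-640-640` checkpoint instead:
```python
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

# Inference sketch; usage assumed, not part of the original card.
processor = SegformerImageProcessor.from_pretrained("az00/pls-relu-segformer-b5-scene-parse-150-cvfinal")
model = SegformerForSemanticSegmentation.from_pretrained("az00/pls-relu-segformer-b5-scene-parse-150-cvfinal")

image = Image.open("scene.jpg")  # placeholder path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)
pred = logits.argmax(dim=1)[0]  # per-pixel class ids
```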
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------:|
| 0.1696 | 1.0 | 20 | 0.1269 | 0.6721 | 0.7965 | 0.9565 | [0.9135663110592681, 0.9825519725750829, 0.9682770522478501, 0.9110306258322237, 0.19175221854880808, 0.9731929353526031, 0.9846199197072778, nan, 0.8321642999001251, nan, 0.8806729926333758, 0.8467248339627556, 0.9136873281150836, 0.5747675962815405, 0.9107026007030572, 0.8487774294670847, 0.9760582258241872, 0.5338296112489661, nan, 0.7846778120434994, 0.5497866287339972, nan, 0.9038519968940955, nan, 0.0, nan, nan, 0.8968909926766707, nan, nan, nan, nan, nan, 0.6853325753268903, nan, nan, 0.6837282780410743, 0.9318723684878, 0.0, nan, nan, 0.0, 0.0, 0.4291920069504778, nan, nan, nan, 0.8979626711223346, nan, nan, 0.9895701866865907, nan, 0.09646256570535587, nan, nan, nan, nan, nan, nan, 0.34575260804769004, nan, 0.8046462080550661, 0.7515476138209597, nan, nan, 0.8324633113365508, nan, 0.7898716370669949, nan, nan, 0.8725899260956121, 0.8209644816632977, nan, nan, 0.8700244109002142, nan, nan, nan, nan, nan, nan, 0.7378111148994204, 0.0, nan, 0.9809370339259081, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.7434462444771723, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9083719135802469, 0.271036315323295, nan, nan, nan, 0.9814439090371122, nan, nan, nan, 0.470176038284054, 0.8237577639751553, 0.0, nan, 0.9149274498111707, 0.9366989912772282, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9395206002456757, 0.9891615552471795, 0.990194328772556, 0.9469901930749896, 0.24034896401308614, 0.986183735067806, 0.993109664874865, nan, 0.9710125514044982, nan, 0.9953590238306305, 0.9438234867179561, 0.9632024977698483, 0.5797334404929342, 0.9427527213724224, 0.9936389881098009, 0.9993549152717723, 0.743891194098663, nan, 0.8326558732345849, 0.8926096997690531, nan, 0.978831529687043, nan, nan, nan, nan, 0.9469979458596642, nan, nan, nan, nan, nan, 0.8423140156185779, nan, nan, 0.7254441837076768, 0.9476888155179529, nan, nan, nan, 0.0, nan, 0.4391111111111111, nan, nan, nan, 0.9596404366705896, nan, nan, 0.997851783223547, nan, 0.09659980082515293, nan, nan, nan, nan, nan, nan, 0.7675765095119934, nan, 0.9265864590705355, 0.9981535112441446, nan, nan, 0.9318632855567806, nan, 0.8921549155908639, nan, nan, 0.9279334145351198, 0.8233420214306401, nan, nan, 0.9229956133396755, nan, nan, nan, nan, nan, nan, 0.9029835176298769, 0.0, nan, 0.9958636933215761, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.7912225705329153, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9924130663856692, 0.271036315323295, nan, nan, nan, 0.9951568661066347, nan, nan, nan, 0.4841179058512978, 0.8352138546313304, nan, nan, 0.9261569416498994, 0.9688019757068697, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1831 | 2.0 | 40 | 0.1248 | 0.6877 | 0.7875 | 0.9584 | [0.9172129530391012, 0.9821494521728426, 0.9682313648143698, 0.9109209291746041, 0.17895339954163483, 0.9708509835171748, 0.9844997563444166, nan, 0.8506306218224482, nan, 0.8806148748159057, 0.8446081281902177, 0.9194248052726184, 0.6330935251798561, 0.917229910135331, 0.8438051122962837, 0.9748195031586947, 0.5588631012445341, nan, 0.7910762833839757, 0.5737963693764798, nan, 0.9132947976878613, nan, 0.0, nan, nan, 0.9044269344535308, nan, nan, nan, nan, nan, 0.7094443992516066, nan, nan, 0.6580006439150032, 0.891798382078407, 0.0, nan, nan, 0.0, nan, 0.48442622950819675, nan, nan, nan, 0.9011837166778943, nan, nan, 0.9890751724972764, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.413867047891351, nan, 0.7979035120300442, 0.8356403928810734, nan, nan, 0.8065164923572004, nan, 0.9018156512937306, nan, nan, 0.868433236199243, 0.6597161888213148, nan, nan, 0.8863738795132455, nan, nan, nan, nan, nan, nan, 0.7635009310986964, 0.0, nan, 0.9807468111370182, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.7605790645879733, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9159864622735417, 0.3029229406554473, nan, nan, nan, 0.9808758197588322, nan, nan, nan, 0.3184564917127072, 0.8988336713995944, 0.0, nan, 0.8978584729981378, 0.9294648550546917, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9517900025452896, 0.9877480693536655, 0.9920979575649415, 0.9460120842576154, 0.2043620501635769, 0.9840581558474685, 0.9915852962136192, nan, 0.932340294428891, nan, 0.997757730614799, 0.9487588909856293, 0.9506145306769749, 0.6601031411157994, 0.9470533562280439, 0.9715711699368792, 0.9991693012697823, 0.7660212079299217, nan, 0.8519807096107475, 0.8394919168591224, nan, 0.9762357414448669, nan, nan, nan, nan, 0.9521267548507805, nan, nan, nan, nan, nan, 0.8961364570489108, nan, nan, 0.6851324170298357, 0.9241642591828312, nan, nan, nan, 0.0, nan, 0.5253333333333333, nan, nan, nan, 0.9500629032912625, nan, nan, 0.998248846987915, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.47890818858560796, nan, 0.9121962727058268, 0.993838558572567, nan, nan, 0.8842337375964718, nan, 0.9437272426348892, nan, nan, 0.9247723449915898, 0.6597161888213148, nan, nan, 0.9354685270334548, nan, nan, nan, nan, nan, nan, 0.9409555601919466, 0.0, nan, 0.9955852322829164, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8564263322884013, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9696522655426765, 0.3029229406554473, nan, nan, nan, 0.9936139207954741, nan, nan, nan, 0.3245930488341399, 0.9302020467069011, nan, nan, 0.9701207243460764, 0.9550425885792047, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1655 | 3.0 | 60 | 0.1231 | 0.6705 | 0.7841 | 0.9596 | [0.9221198627730325, 0.9816427249944908, 0.9677625176429644, 0.922621508718402, 0.19515125812835737, 0.9743198629278281, 0.9845204417929398, nan, 0.8511589973621501, nan, 0.8628781623001276, 0.8811161328355276, 0.9250584327715586, 0.7085691977357017, 0.9107827038861521, 0.8539530693663936, 0.9754312970860514, 0.5834205933682374, nan, 0.812015204425353, 0.5976430976430976, nan, 0.9110754852133575, nan, 0.0, nan, nan, 0.9237793817786062, nan, nan, nan, nan, nan, 0.6821963132188436, nan, nan, 0.6447014743015395, 0.8380763962867144, 0.0, nan, nan, 0.0, nan, 0.43739279588336194, nan, nan, nan, 0.9041428489494714, nan, nan, 0.9903678023832867, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.2640877246420956, nan, 0.7677694337324944, 0.8654598995323055, nan, nan, 0.8086072450893298, 0.0, 0.8869372225745086, nan, nan, 0.8887636123638764, 0.6017955401100492, nan, nan, 0.8746488225979465, nan, nan, nan, nan, nan, nan, 0.7643256464011181, 0.0, nan, 0.9818548387096774, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6197695573074591, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9088738559490649, 0.39681133746678476, nan, nan, nan, 0.9860363822700486, nan, nan, nan, 0.3377597654565836, 0.8339788277820811, 0.0, nan, 0.9108986615678776, 0.9245089416593374, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9583911556721224, 0.9881246550114265, 0.9925242910965695, 0.955398237927738, 0.20072700836059615, 0.9911363346511924, 0.995332505863864, nan, 0.8947128715523823, nan, 0.9989744659401019, 0.976339091304979, 0.9316830211121023, 0.7293550331525015, 0.943682162604956, 0.9740421784019181, 0.9987189591010197, 0.7706316274781005, nan, 0.905132621426111, 0.8198614318706697, nan, 0.976528224627084, nan, nan, nan, nan, 0.9661786447907262, nan, nan, nan, nan, nan, 0.8898273736128237, nan, nan, 0.6633422728796513, 0.8523008666941808, nan, nan, nan, 0.0, nan, 0.4533333333333333, nan, nan, nan, 0.9596404366705896, nan, nan, 0.9976074362916281, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.71712158808933, nan, 0.8923802783675395, 0.9711170285136738, nan, nan, 0.9031973539140022, nan, 0.9259847732538894, nan, nan, 0.9372716199756395, 0.6017955401100492, nan, nan, 0.9049733100787485, nan, nan, nan, nan, nan, nan, 0.9127894846651367, 0.0, nan, 0.9944771893999165, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6407523510971787, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9626975763962066, 0.39681133746678476, nan, nan, nan, 0.989670838333619, nan, nan, nan, 0.3446546414430268, 0.8475465757019155, nan, nan, 0.9585513078470825, 0.9536313693866236, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0719 | 4.0 | 80 | 0.1275 | 0.6694 | 0.7799 | 0.9578 | [0.9134610314696745, 0.9805005224446312, 0.9674469336023466, 0.9272321143206176, 0.1936352682684811, 0.9758118795221269, 0.9829411626981339, nan, 0.7762752591418602, nan, 0.8531813188444524, 0.8562007810999424, 0.9227758357540249, 0.7198100407055631, 0.9047094838116206, 0.8656481542193954, 0.9754753072280017, 0.6143911439114391, nan, 0.8196726471514006, 0.6182634730538922, nan, 0.9099145124484861, nan, 0.0, nan, nan, 0.9272599136683405, nan, nan, nan, nan, nan, 0.6974937945391945, nan, nan, 0.6408289817232375, 0.8422287988570262, 0.0, nan, nan, 0.0, nan, 0.36837455830388693, nan, nan, nan, 0.9071361012990522, nan, nan, 0.9878809247972599, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.37797619047619047, nan, 0.7633030637358907, 0.8615868996081518, nan, nan, 0.8135490394337714, nan, 0.8869330592316588, nan, nan, 0.8970246078887582, 0.6938893715609615, nan, nan, 0.8847610275304473, nan, nan, nan, nan, nan, nan, 0.7708590580349268, 0.0, nan, 0.9818040792875611, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.47213622291021673, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9174293672900915, 0.44729849424269263, nan, nan, nan, 0.9738002482983005, nan, nan, nan, 0.2739821771798008, 0.8330749354005168, 0.0, nan, 0.8943535514764565, 0.9158512720156555, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan] | [0.9585947788364707, 0.9875262449251213, 0.9952029215414105, 0.9637250812937611, 0.19861868411486733, 0.9861412234833993, 0.990660292320108, nan, 0.8135640898557325, nan, 0.9989744659401019, 0.9706053128175351, 0.9288581623550402, 0.7816623133078829, 0.9450999543156005, 0.9689778343201056, 0.9977726319761197, 0.7676348547717843, nan, 0.897037547364795, 0.953810623556582, nan, 0.9767475870137468, nan, nan, nan, nan, 0.9640198348837513, nan, nan, nan, nan, nan, 0.895088368269626, nan, nan, 0.6582299698290311, 0.8515270326042097, nan, nan, nan, 0.0, nan, 0.37066666666666664, nan, nan, nan, 0.9536138955399537, nan, nan, 0.9983812015760377, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.7353184449958643, nan, 0.848690728945506, 0.9786779140508076, nan, nan, 0.8871003307607497, nan, 0.9140019860973188, nan, nan, 0.9503799083579839, 0.6938893715609615, nan, nan, 0.9137994820569737, nan, nan, nan, nan, nan, nan, 0.9117462966826622, 0.0, nan, 0.9913387014433563, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4780564263322884, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9717597471022128, 0.44729849424269263, nan, nan, nan, 0.9749271386936397, nan, nan, nan, 0.27593488781346237, 0.845972185778011, nan, nan, 0.9019114688128773, 0.9435008316113099, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.057 | 5.0 | 100 | 0.1223 | 0.6828 | 0.8014 | 0.9605 | [0.919316686316272, 0.981836786902021, 0.9689497069084484, 0.9294765194155438, 0.23046092184368738, 0.9760735423750158, 0.9862114587047498, nan, 0.8376126237014425, nan, 0.8520617868324604, 0.8995436854113542, 0.9333154728561338, 0.7227880721560928, 0.8926255714584516, 0.8837785246771626, 0.9753755181481867, 0.6558861578266494, nan, 0.8289918536883795, 0.5458064516129032, nan, 0.9126838235294118, nan, 0.0, nan, nan, 0.9274671217749222, nan, nan, nan, nan, nan, 0.707317866259125, nan, nan, 0.6886927480916031, 0.8975832822399346, 0.0, nan, nan, 0.0, nan, 0.38461538461538464, nan, nan, nan, 0.9081566503965833, nan, nan, 0.9903408460086599, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3343971631205674, nan, 0.7749613730321125, 0.8439836901794411, nan, nan, 0.8356548752379998, nan, 0.8767678066340961, nan, nan, 0.8803390576838397, 0.8221836084564147, nan, nan, 0.8850071297616623, nan, nan, nan, nan, nan, nan, 0.7531624500665779, 0.011428571428571429, nan, 0.9824606699025784, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5639121015165584, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9169781625024592, 0.5326855123674912, nan, nan, nan, 0.9833693945950532, nan, nan, nan, 0.34082234559333796, 0.8794125225457357, 0.0, nan, 0.9138204924543288, 0.9362483474514028, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan] | [0.9500724854199174, 0.989955996223826, 0.9931654438495604, 0.9654746042350266, 0.242457288258815, 0.9885218722101773, 0.9920619563832348, nan, 0.9115641702275863, nan, 0.9981227512123898, 0.9729278560023226, 0.9495737932401626, 0.7889625611144598, 0.9412089037319429, 0.9778343201056907, 0.9988102446757688, 0.818118948824343, nan, 0.8868756458835687, 0.976905311778291, nan, 0.9802208248025739, nan, nan, nan, nan, 0.9669898339678926, nan, nan, nan, nan, nan, 0.8940608302507193, nan, nan, 0.7258632249413343, 0.9062886917044986, nan, nan, nan, 0.0, nan, 0.38666666666666666, nan, nan, nan, 0.9664989245566332, nan, nan, 0.9989717066615083, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.7799834574028123, nan, 0.875583864118896, 0.9937219382300919, nan, nan, 0.9194046306504962, nan, 0.9029460443561734, nan, nan, 0.9276724087929934, 0.8221836084564147, nan, nan, 0.9184503990275356, nan, nan, nan, nan, nan, nan, 0.94408512413937, 0.011428571428571429, nan, 0.9933923516034715, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5711598746081504, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9822971548998947, 0.5341009743135519, nan, nan, nan, 0.9934424824275673, nan, nan, nan, 0.3457105147382314, 0.8955654683810024, nan, nan, 0.9259557344064386, 0.9637115064764881, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.2005 | 6.0 | 120 | 0.1255 | 0.6920 | 0.7891 | 0.9586 | [0.9141596643985943, 0.9814688768225341, 0.9679011192828346, 0.924349765373819, 0.2067970728961441, 0.9762269134818976, 0.9853861525802281, nan, 0.7930904617238695, nan, 0.8462842069585464, 0.8530493216685685, 0.9107676008860638, 0.7200271638473885, 0.8999953744391508, 0.9036847281626679, 0.9753945167300052, 0.6227132765803296, nan, 0.8113679966182161, 0.6070656092285508, nan, 0.9208300245140352, nan, 0.0, nan, nan, 0.9204013633126462, nan, nan, nan, nan, nan, 0.7020924233255836, nan, nan, 0.6512574265483845, 0.8478022118356192, 0.0, nan, nan, 0.0, nan, 0.39649122807017545, nan, nan, nan, 0.9120451188398012, nan, nan, 0.9911723413496015, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4039800995024876, nan, 0.7582771876162526, 0.8781898956503771, nan, nan, 0.7915770058407624, nan, 0.8845953002610966, nan, nan, 0.8753745910428065, 0.8166811468288445, nan, nan, 0.8800826659777835, nan, nan, nan, nan, nan, nan, 0.7717671303451829, 0.0, nan, 0.9818009913187978, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.622704004817826, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9111599121581154, 0.42604074402125774, nan, nan, nan, 0.9832710597826086, nan, nan, nan, 0.36163905841325195, 0.851421188630491, 0.0, nan, 0.9151027703306523, 0.9367415566060724, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.956906034549539, 0.9893369513069585, 0.9934149646374513, 0.969900085261561, 0.21366775717920755, 0.9898184755345831, 0.9890320966912233, nan, 0.8462620390339303, nan, 0.998209660878483, 0.9766294092030774, 0.9373079591634453, 0.7811265153037305, 0.919532443800312, 0.9666291530068014, 0.998877187430585, 0.7925311203319502, nan, 0.8595246296934206, 0.9722863741339491, nan, 0.9750658087159988, nan, nan, nan, nan, 0.9469063599848229, nan, nan, nan, nan, nan, 0.9060830250719276, nan, nan, 0.67063359034529, 0.8562216260833677, nan, nan, nan, 0.0, nan, 0.4017777777777778, nan, nan, nan, 0.9647335741244267, nan, nan, 0.9991040612496309, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.6716294458229942, nan, 0.8655343241330502, 0.9912729110381154, nan, nan, 0.8517089305402425, nan, 0.8971863621317444, nan, nan, 0.923380314366916, 0.8166811468288445, nan, nan, 0.9002695417789758, nan, nan, nan, nan, nan, nan, 0.9376173586480284, 0.0, nan, 0.9939840813106233, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6482758620689655, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.961854583772392, 0.42604074402125774, nan, nan, nan, 0.9925424309960569, nan, nan, nan, 0.36498020237571493, 0.8646024665442141, nan, nan, 0.9271629778672033, 0.9687515750214203, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1192 | 7.0 | 140 | 0.1288 | 0.6815 | 0.8008 | 0.9574 | [0.9089655216316034, 0.9816031335603287, 0.9677470281869162, 0.930325735917928, 0.23104916231804448, 0.9677371801621565, 0.9849904869283349, nan, 0.8156326367762207, nan, 0.8333212435984854, 0.8777474313955, 0.9190563951345374, 0.7456661413504667, 0.8790269659834877, 0.8848017867113345, 0.9755307322351094, 0.6229286438958404, nan, 0.8013482327219725, 0.5059952038369304, nan, 0.9138001638001638, nan, 0.0, nan, nan, 0.9294484357775497, nan, nan, nan, nan, nan, 0.6956709610306474, nan, nan, 0.6630796012027219, 0.8462578095116664, 0.0, nan, nan, 0.0, nan, 0.43287435456110157, nan, nan, nan, 0.8963680206366712, nan, nan, 0.9920626895854399, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.27716390423572745, nan, 0.7707094979132414, 0.8808678568966706, nan, nan, 0.8125062331704398, 0.0, 0.909795451657985, nan, nan, 0.8781220448350671, 0.8591264101822389, nan, nan, 0.8853840226046751, nan, nan, nan, nan, nan, nan, 0.7501209872560091, 0.022857142857142857, nan, 0.9813980417784504, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6667617689015692, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9180880602935343, 0.6247863247863248, nan, nan, nan, 0.9831460674157303, nan, nan, nan, 0.32281744297824, 0.8598347960764068, 0.0, nan, 0.9109728836621634, 0.9192895134924121, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9480982260438454, 0.9889861591874004, 0.9926763170070725, 0.9633006189767874, 0.24463831334060343, 0.9767674191217106, 0.9895181956760788, nan, 0.8762809264195679, nan, 0.9984182440771062, 0.9796777471331107, 0.9267766874814154, 0.8239233808854062, 0.9073866948124577, 0.9692469540539218, 0.9986337592312537, 0.8492392807745505, nan, 0.9090595935239407, 0.9745958429561201, nan, 0.9790143316759287, nan, nan, nan, nan, 0.9597283824626134, nan, nan, nan, nan, nan, 0.8933210028771065, nan, nan, 0.7023131076097888, 0.8560152703260421, nan, nan, nan, 0.0, nan, 0.4471111111111111, nan, nan, nan, 0.9730327502942251, nan, nan, 0.9989208010506918, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.49793217535153017, nan, 0.8625619249823071, 0.9935081342688876, nan, nan, 0.8982359426681367, nan, 0.948162859980139, nan, nan, 0.9156081433791543, 0.8601216333622936, nan, nan, 0.9108398076211617, nan, nan, nan, nan, nan, nan, 0.9701648237012309, 0.022857142857142857, nan, 0.9931719032811992, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.7326018808777429, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9755532139093783, 0.6474756421612046, nan, nan, nan, 0.9900565746614092, nan, nan, nan, 0.32503299604047514, 0.874048806087641, nan, nan, 0.9429577464788732, 0.9494985131797793, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.1854 | 8.0 | 160 | 0.1337 | 0.6622 | 0.7851 | 0.9570 | [0.9085505695684986, 0.9805941628799328, 0.9668684319928861, 0.9271099154223735, 0.19218794225870364, 0.9767836919592299, 0.9850584982714825, nan, 0.7731630998906719, nan, 0.8332584334816834, 0.8907280400949618, 0.9178932563835205, 0.7535005289688219, 0.8452615992102666, 0.8889838898388984, 0.9756129469621736, 0.6254379817799579, nan, 0.8096109986829848, 0.5261669024045261, nan, 0.9182253453205195, nan, 0.0, nan, nan, 0.9221167994317317, nan, nan, nan, nan, nan, 0.7017582942159281, nan, nan, 0.635898684851437, 0.8379283887468031, 0.0, nan, nan, 0.0, 0.0, 0.36977777777777776, nan, nan, nan, 0.9017616972911536, nan, nan, 0.9924661279543107, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3232231779085559, nan, 0.7699065014133507, 0.8733254833700234, nan, nan, 0.8205051112447385, 0.0, 0.8840325203252033, nan, nan, 0.8889838216244522, 0.7998841587025775, nan, nan, 0.8776102388724805, nan, nan, nan, nan, nan, nan, 0.7707325212748206, 0.0, nan, 0.9810676009181437, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5662437297137799, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9243799153055051, 0.552212389380531, nan, nan, nan, 0.9841174707821396, nan, nan, nan, 0.37336928161419375, 0.8342827550491974, 0.0, nan, 0.899057344854674, 0.9194301154507492, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9566028130982813, 0.9874333881875913, 0.9953731244629519, 0.9641421616573962, 0.19745547073791347, 0.9899885218722102, 0.9937798207568986, nan, 0.8241597792453083, nan, 0.9990961394726321, 0.9803309624038322, 0.937977004658539, 0.8109302792847096, 0.8632776193701854, 0.9666291530068014, 0.9984329309668055, 0.8229598893499308, nan, 0.8682053048570444, 0.859122401847575, nan, 0.9745905235448962, nan, nan, nan, nan, 0.9511323939239313, nan, nan, nan, nan, nan, 0.9063296341964653, nan, nan, 0.6564699966476701, 0.8451042096574495, nan, nan, nan, 0.0, nan, 0.36977777777777776, nan, nan, nan, 0.9659510571811208, nan, nan, 0.997851783223547, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5905707196029777, nan, 0.8352913422977117, 0.9946937744173842, nan, nan, 0.902646085997795, nan, 0.8998344918901026, nan, nan, 0.9354155791427411, 0.7998841587025775, nan, nan, 0.8951429628455155, nan, nan, nan, nan, nan, nan, 0.9636970582098894, 0.0, nan, 0.9893372627279899, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6015673981191223, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9660695468914647, 0.5527015057573074, nan, nan, nan, 0.9852563003600205, nan, nan, nan, 0.3777386713594369, 0.8454473891367095, nan, nan, 0.9211267605633803, 0.9432488281840633, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0497 | 9.0 | 180 | 0.1374 | 0.6811 | 0.7752 | 0.9544 | [0.9061123055932223, 0.980497969925197, 0.9665656568899047, 0.9217891570694744, 0.1993963782696177, 0.9776373867748887, 0.9847459292072654, nan, 0.7841408581972065, nan, 0.8318733982944098, 0.8755549751893444, 0.8423218058489936, 0.6910631250398115, 0.8022286269130589, 0.9219207173194903, 0.9754015303469281, 0.6883933676386507, nan, 0.7061645875384667, 0.5472242249459265, nan, 0.9142339509126957, nan, 0.0, nan, nan, 0.9269139620059057, nan, nan, nan, nan, nan, 0.6958401551398007, nan, nan, 0.6680989055533036, 0.8635116773912375, 0.0, nan, nan, 0.0, nan, 0.35733333333333334, nan, nan, nan, 0.9047027574477843, nan, nan, 0.9904202376265608, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.40131219245489336, nan, 0.7548851559821734, 0.8565466352815478, nan, nan, 0.809762074951496, nan, 0.820341645395641, nan, nan, 0.8817315828261048, 0.807413843035042, nan, nan, 0.8848708104181799, nan, nan, nan, nan, nan, nan, 0.7758881966661054, 0.0, nan, 0.9805754857566987, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5213649851632047, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9222266881028939, 0.503985828166519, nan, nan, nan, 0.9854170220023879, nan, nan, nan, 0.3745854424855996, 0.8313440581214323, 0.0, nan, 0.8875521972559157, 0.8116120151418318, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.954678352865664, 0.9879131479981635, 0.995126908586159, 0.9716532991794958, 0.21613958560523447, 0.9933681928325468, 0.993520253337801, nan, 0.8339919360240043, nan, 0.9986789730753854, 0.9732907533749455, 0.9385221528397264, 0.7266090683812203, 0.8199719592306117, 0.9558888290845036, 0.9988072018232772, 0.8326417704011065, nan, 0.7351360661384775, 0.8764434180138568, nan, 0.9778078385492834, nan, nan, nan, nan, 0.9569415551281548, nan, nan, nan, nan, nan, 0.8996300863131936, nan, nan, 0.6906637613141133, 0.8688351217498969, nan, nan, nan, 0.0, nan, 0.35733333333333334, nan, nan, nan, 0.9606752972687796, nan, nan, 0.9989106199285285, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.6071133167907361, nan, 0.8311394196744515, 0.9972594219518358, nan, nan, 0.8743109151047409, nan, 0.8297914597815294, nan, nan, 0.9368366104054289, 0.807413843035042, nan, nan, 0.9013794196924053, nan, nan, nan, nan, nan, nan, 0.9614020446484457, 0.0, nan, 0.988681719032812, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5507836990595611, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9671232876712329, 0.503985828166519, nan, nan, nan, 0.990485170581176, nan, nan, nan, 0.3776506819181698, 0.8407242193649961, nan, nan, 0.8980885311871227, 0.8320649160828587, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0527 | 10.0 | 200 | 0.1349 | 0.6823 | 0.7807 | 0.9552 | [0.9047960004019696, 0.9805055961369263, 0.9672768448090895, 0.9238890488530569, 0.24698633806589873, 0.9780733964204033, 0.9845975279289736, nan, 0.7930625780176273, nan, 0.8394005847953216, 0.8701273388773388, 0.903571601413154, 0.719191411890263, 0.8133526314168694, 0.8856031650238048, 0.9754204552207761, 0.6148491879350348, nan, 0.8072023752514127, 0.5452522255192879, nan, 0.9209098136817726, nan, 0.0, nan, nan, 0.9246558740607499, nan, nan, nan, nan, nan, 0.7016745950115711, nan, nan, 0.6347868374128708, 0.8234977440525021, 0.0, nan, nan, 0.0, nan, 0.35644444444444445, nan, nan, nan, 0.9025840037860862, nan, nan, 0.9899755945057384, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.20013275804845668, nan, 0.7553305131820707, 0.8614611688006183, nan, nan, 0.8351946875943254, nan, 0.8818105071757907, nan, nan, 0.8621452063123566, 0.8039386041123661, nan, nan, 0.8887046632124352, nan, nan, nan, nan, nan, nan, 0.776004032935641, 0.0, nan, 0.9810498476398551, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5239477503628447, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9265684890054469, 0.6085836909871245, nan, nan, nan, 0.9857714918633381, nan, nan, nan, 0.3104930689594666, 0.8362694300518134, 0.0, nan, 0.8997545409916544, 0.8504700497120637, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9565286677069155, 0.9888262392505429, 0.9937636327582788, 0.959292218313888, 0.26812068338785894, 0.9908174977681418, 0.9928500974557674, nan, 0.8425247478333088, nan, 0.9979836957466409, 0.9720569023080273, 0.9252899197145406, 0.7672627419462863, 0.8321492147009247, 0.9693448157753095, 0.9988711017256017, 0.855232826187183, nan, 0.8709610747502583, 0.848729792147806, nan, 0.9740055571804621, nan, nan, nan, nan, 0.9483193991966611, nan, nan, nan, nan, nan, 0.8972667488697081, nan, nan, 0.6563861884009387, 0.8285957490713991, nan, nan, nan, 0.0, nan, 0.35644444444444445, nan, nan, nan, 0.9674729110019885, nan, nan, 0.9994196760366928, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4987593052109181, nan, 0.8340174569473933, 0.9965013897257479, nan, nan, 0.9152149944873208, nan, 0.8989738497186363, nan, nan, 0.9046749028478627, 0.8039386041123661, nan, nan, 0.9065059986258655, nan, nan, nan, nan, nan, nan, 0.9634884206133946, 0.0, nan, 0.9898941848053093, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5658307210031348, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9679662802950474, 0.6279893711248893, nan, nan, nan, 0.9917709583404766, nan, nan, nan, 0.31139463264408274, 0.847021779060614, nan, nan, 0.9219315895372233, 0.8708734438788368, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0863 | 11.0 | 220 | 0.1323 | 0.6724 | 0.7847 | 0.9565 | [0.9088824176044232, 0.9804307686012552, 0.967312218184509, 0.9204319687726663, 0.2629041976300462, 0.9767924528301887, 0.9848134389383426, nan, 0.8024980483996877, nan, 0.8439584986626694, 0.8873749341758821, 0.900152457469206, 0.7514405289015588, 0.8245191560761893, 0.8947799758690554, 0.9754903126114347, 0.5881350681536555, nan, 0.7986218444100979, 0.5450236966824644, nan, 0.9231035917348496, nan, 0.0, nan, nan, 0.9238051494041696, nan, nan, nan, nan, nan, 0.6994675701839304, nan, nan, 0.6091535034426894, 0.9024483942017109, 0.0, nan, nan, 0.0, nan, 0.32711111111111113, nan, nan, nan, 0.8985258956642883, nan, nan, 0.9914310543440916, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.22577903682719547, nan, 0.7613943393719271, 0.8650118203309692, nan, nan, 0.789537345025703, 0.0, 0.8777562862669246, nan, nan, 0.8596758132330161, 0.8492476851851852, nan, nan, 0.8881897134536377, nan, nan, nan, nan, nan, nan, 0.7911897329171003, 0.0, nan, 0.9807985457975966, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4711626487641135, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9249697458652683, 0.6246786632390745, nan, nan, nan, 0.9832361826149854, nan, nan, nan, 0.3377431224811635, 0.8644461657629744, 0.0, nan, 0.8785, 0.9046939578823067, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9543220123280546, 0.9888468740811052, 0.9936529182364995, 0.9695568244313127, 0.28549618320610687, 0.990371126131871, 0.9946387529437305, nan, 0.853763412053099, nan, 0.998209660878483, 0.9784438960661925, 0.9217216770740411, 0.8297501841805639, 0.8360402652845823, 0.9616137397856829, 0.9988832731355682, 0.8752881512217612, nan, 0.8543575611436445, 0.7967667436489607, nan, 0.9734571512138052, nan, nan, nan, nan, 0.9473381219662179, nan, nan, nan, nan, nan, 0.8909371146732429, nan, nan, 0.6302380154207174, 0.9089197276104003, nan, nan, nan, 0.0, nan, 0.32711111111111113, nan, nan, nan, 0.9684468974473439, nan, nan, 0.9989106199285285, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.6592224979321754, nan, 0.8110403397027601, 0.9956656106046765, nan, nan, 0.8636163175303198, nan, 0.9012909632571996, nan, nan, 0.9013108288382344, 0.8499855198378222, nan, nan, 0.9026478515934676, nan, nan, nan, nan, nan, nan, 0.9518047152096808, 0.0, nan, 0.989134218220634, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.48401253918495296, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.966491043203372, 0.645704162976085, nan, nan, nan, 0.9904423109891994, nan, nan, nan, 0.33919929608446986, 0.8785095775387037, nan, nan, 0.8838028169014085, 0.9267174033566856, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.043 | 12.0 | 240 | 0.1393 | 0.6661 | 0.7838 | 0.9533 | [0.9014487638412604, 0.9811548844142339, 0.9664116103888176, 0.9242579105753308, 0.27357573400738866, 0.9735057890893919, 0.986527368045247, nan, 0.7677191127225118, nan, 0.8387303997430432, 0.8958402222075259, 0.896274757489811, 0.7500606648871633, 0.8163183155287196, 0.8821202355817312, 0.9755058984340178, 0.5617220676526578, nan, 0.7968589743589743, 0.5044937088076693, nan, 0.9185180074513591, nan, 0.0, nan, nan, 0.9235194026818366, nan, nan, nan, nan, nan, 0.6883841468447203, nan, nan, 0.6025152129817444, 0.7993997383474848, 0.0, nan, nan, 0.0, nan, 0.3422222222222222, nan, nan, nan, 0.8691710783958227, nan, nan, 0.9920797086789399, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.12412060301507538, nan, 0.7370820002629618, 0.8368570681654793, nan, nan, 0.8097722484197853, 0.0, 0.870570206244812, nan, nan, 0.8052452239973187, 0.8302924992759919, nan, nan, 0.8891244763924083, nan, nan, nan, nan, nan, nan, 0.7928973877411645, 0.0, nan, 0.9797860580798194, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5691126279863481, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9314679643146796, 0.6264020707506471, nan, nan, nan, 0.9830046428419305, nan, nan, nan, 0.38321837073255915, 0.8565901470208924, 0.0, nan, 0.8938139308329274, 0.8906057772747404, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9553323816163696, 0.9881194963037859, 0.9940660321237359, 0.9579081020628869, 0.3068702290076336, 0.9919015431705139, 0.9911133554516237, nan, 0.8168861264785072, nan, 0.9985572995428551, 0.9831615619102918, 0.9318316978887897, 0.8280758154175876, 0.8306053970604452, 0.9710573958995938, 0.9989289159229428, 0.88427846934071, nan, 0.856424388563555, 0.9722863741339491, nan, 0.9734571512138052, nan, nan, nan, nan, 0.9515641559053264, nan, nan, nan, nan, nan, 0.8570283600493218, nan, nan, 0.6223600402279584, 0.8038330581923235, nan, nan, nan, 0.0, nan, 0.3422222222222222, nan, nan, nan, 0.972768962298608, nan, nan, 0.9985339184084869, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.40860215053763443, nan, 0.7934890304317056, 0.9955295535384555, nan, nan, 0.8898566703417861, nan, 0.9026150281363787, nan, nan, 0.8361173945826809, 0.8302924992759919, nan, nan, 0.9086729031235136, nan, nan, nan, nan, nan, nan, 0.9689129981222616, 0.0, nan, 0.9872662087529587, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6272727272727273, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9681770284510011, 0.6430469441984057, nan, nan, nan, 0.9891136636379222, nan, nan, nan, 0.3861856577210735, 0.8714248228811335, nan, nan, 0.9230382293762576, 0.9121516052618316, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0536 | 13.0 | 260 | 0.1349 | 0.6692 | 0.7833 | 0.9545 | [0.9025402305227535, 0.9807080869364873, 0.9661258677087996, 0.9237178267701002, 0.24345104443083623, 0.975291881618246, 0.9849066149500401, nan, 0.7802608009419899, nan, 0.8324995287191311, 0.8817730816313382, 0.9048043776968879, 0.737515299877601, 0.8021170379957474, 0.9125981720479492, 0.975741496436772, 0.6037705956907478, nan, 0.8075210452757825, 0.516504854368932, nan, 0.9209845006731333, nan, 0.0, nan, nan, 0.9273441957775946, nan, nan, nan, nan, nan, 0.7077362960542439, nan, nan, 0.6306900434293068, 0.765018484288355, 0.0, nan, nan, 0.0, nan, 0.3448888888888889, nan, nan, nan, 0.867847312762567, nan, nan, 0.9921501188609579, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1874338157430286, nan, 0.7295767287928933, 0.88021453084312, nan, nan, 0.8098383982112003, 0.0, 0.9040036103410483, nan, nan, 0.818411974182061, 0.8166811468288445, nan, nan, 0.8913346767942955, nan, nan, nan, nan, nan, nan, 0.7700753851196329, 0.0, nan, 0.9794019244833205, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5687679083094556, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9284418464019351, 0.567922874671341, nan, nan, nan, 0.9836797143950019, nan, nan, nan, 0.38771945148047865, 0.8592783505154639, 0.0, nan, 0.909260899517764, 0.9199193786255039, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9541549085355732, 0.9881349724267077, 0.9951450855971974, 0.9677777736766926, 0.26012359142130137, 0.9925604727288186, 0.992250732688033, nan, 0.8343803999839257, nan, 0.9978967860805479, 0.9933226883437364, 0.9300723560313212, 0.8071127185051236, 0.8200979851604467, 0.9722562019865929, 0.998226016997374, 0.8785154449054864, nan, 0.855838787461247, 0.9214780600461894, nan, 0.9754314126937701, nan, nan, nan, nan, 0.9568761366461253, nan, nan, nan, nan, nan, 0.8901972872996301, nan, nan, 0.6572242708682534, 0.7686494015683037, nan, nan, nan, 0.0, nan, 0.3448888888888889, nan, nan, nan, 0.9724848829187127, nan, nan, 0.9985542806528136, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4392059553349876, nan, 0.7904694503420618, 0.992069816711695, nan, nan, 0.8785005512679162, nan, 0.9283018867924528, nan, nan, 0.8531117684589061, 0.8166811468288445, nan, nan, 0.9116854288885365, nan, nan, nan, nan, nan, nan, 0.9803880659294805, 0.0, nan, 0.9866802803174456, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.622257053291536, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9707060063224446, 0.5739592559787422, nan, nan, nan, 0.99198525630036, nan, nan, nan, 0.3905851297844259, 0.8748360010495932, nan, nan, 0.929476861167002, 0.9431480268131647, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0344 | 14.0 | 280 | 0.1332 | 0.6693 | 0.7818 | 0.9566 | [0.9119679331335832, 0.9796317977085772, 0.9657175939290136, 0.9182651537981736, 0.22860686233980984, 0.9752514502733609, 0.9849055896195459, nan, 0.8376596825554165, nan, 0.844739590227831, 0.8763402889245586, 0.8756345771435255, 0.7382935406255784, 0.8336294945686604, 0.9271181221724816, 0.9755908343761425, 0.6142244800762026, nan, 0.7724486116119726, 0.5483870967741935, nan, 0.9220350755339468, nan, 0.0, nan, nan, 0.928019231985829, nan, nan, nan, nan, nan, 0.697800710270253, nan, nan, 0.6372713475517214, 0.7727529532614278, 0.0, nan, nan, 0.0, nan, 0.30044444444444446, nan, nan, nan, 0.8851174934725848, nan, nan, 0.9921097353726632, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1684597961494904, nan, 0.7313342550985137, 0.8882614891343071, nan, nan, 0.8023072889355007, 0.0, 0.8897704363660012, nan, nan, 0.8575409653327388, 0.8302924992759919, nan, nan, 0.892059553349876, nan, nan, nan, nan, nan, nan, 0.7772089182493807, 0.0, nan, 0.9787256098965652, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.505091649694501, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9271065989847715, 0.5777972027972028, nan, nan, nan, 0.9854683371686696, nan, nan, nan, 0.38774441340782123, 0.8464119772844605, 0.0, nan, 0.8919996014745442, 0.921998031496063, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9572701216205748, 0.9884857645462659, 0.9950740300085927, 0.9672167452229534, 0.24122137404580152, 0.9934107044169537, 0.9915664185831394, nan, 0.9055898624301769, nan, 0.998296570544576, 0.990637247786326, 0.9274705124392904, 0.801486839461523, 0.84988736432521, 0.957625874639135, 0.9987220019535114, 0.8918856615952052, nan, 0.8116431277988287, 0.9226327944572749, nan, 0.9706785609827435, nan, nan, nan, nan, 0.959636796587772, nan, nan, nan, nan, nan, 0.900472667488697, nan, nan, 0.6686221924237344, 0.7761813867106893, nan, nan, nan, 0.0, nan, 0.30044444444444446, nan, nan, nan, 0.9699078771153768, nan, nan, 0.9985237372863237, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4921422663358147, nan, 0.7985845718329795, 0.9692511030340726, nan, nan, 0.8434399117971334, nan, 0.9057927838464085, nan, nan, 0.8924076329679252, 0.8302924992759919, nan, nan, 0.9120025368638022, nan, nan, nan, nan, nan, nan, 0.9818485291049447, 0.0, nan, 0.985879704831299, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5442006269592476, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9622760800842992, 0.5854738706820195, nan, nan, nan, 0.9911280644608264, nan, nan, nan, 0.390849098108227, 0.8604040934138022, nan, nan, 0.9007042253521127, 0.9442568418930497, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0384 | 15.0 | 300 | 0.1342 | 0.6594 | 0.7882 | 0.9562 | [0.9105418968765336, 0.9810211433366764, 0.9668330640428936, 0.9184523265701977, 0.2587356089678853, 0.9735771085091729, 0.9852686551798366, nan, 0.8475885455915599, nan, 0.8363758213260682, 0.8913430474055515, 0.8735254660683721, 0.7328914257656662, 0.8304520246615237, 0.9305163874209067, 0.9750708232121964, 0.6063649003553221, nan, 0.7716915845139387, 0.5160026126714565, nan, 0.9242271446952046, nan, 0.0, nan, nan, 0.9283440295113446, nan, nan, nan, nan, nan, 0.68834840745427, nan, nan, 0.6422488566472165, 0.7634872902048474, 0.0, nan, nan, 0.0, 0.0, 0.30133333333333334, nan, nan, nan, 0.8895073909967606, nan, nan, 0.9913878499949459, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1605875152998776, nan, 0.7427888792354475, 0.8759884281581485, nan, nan, 0.8202547247329499, 0.0, 0.8681785504215411, nan, nan, 0.8611341861881818, 0.8291340863017665, nan, nan, 0.8880600880600881, nan, nan, nan, nan, nan, nan, 0.7685489166119501, 0.08, nan, 0.9801170869863053, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5390716803760282, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9270982234020829, 0.619813717188823, nan, nan, nan, 0.9860241169201925, nan, nan, nan, 0.3790541131217764, 0.863342805476621, 0.0, nan, 0.8966749379652605, 0.928596016719941, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.95475249825703, 0.9890222701408843, 0.9937454557472404, 0.9647290269478207, 0.27938931297709924, 0.9915189389108532, 0.9936523967511598, nan, 0.9039824253546408, nan, 0.9978620222141107, 0.9812019160981275, 0.9358211913965705, 0.7997454959480276, 0.8445155090659904, 0.960635122571806, 0.9991480013023408, 0.904794836330106, nan, 0.8143300034447124, 0.9122401847575058, nan, 0.9717022521205031, nan, nan, nan, nan, 0.9614423466917874, nan, nan, nan, nan, nan, 0.9055898068228524, nan, nan, 0.6826181696278913, 0.7662247214197276, nan, nan, nan, 0.0, nan, 0.30133333333333334, nan, nan, nan, 0.9695020494298121, nan, nan, 0.9985440995306503, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5425971877584781, nan, 0.8067468742627978, 0.9710975917899279, nan, nan, 0.8804851157662624, nan, 0.8794438927507447, nan, nan, 0.9018908415985152, 0.8291340863017665, nan, nan, 0.906083187992178, nan, nan, nan, nan, nan, nan, 0.9768412267890674, 0.08, nan, 0.9877419130273356, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5752351097178683, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9567966280295047, 0.6483613817537643, nan, nan, nan, 0.9918138179324533, nan, nan, nan, 0.38152221733391994, 0.8769351876147993, nan, nan, 0.9088531187122736, 0.9517161433395495, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0573 | 16.0 | 320 | 0.1361 | 0.6573 | 0.7823 | 0.9554 | [0.9072916666666667, 0.9810067595247849, 0.9664084596395595, 0.9202848150177746, 0.26152398871119475, 0.9684363034254372, 0.9833104848708867, nan, 0.8224074350924403, nan, 0.83490545470401, 0.8776441844017231, 0.8863895946481434, 0.733486323885133, 0.8203530633437176, 0.9231289502530494, 0.9755311150667589, 0.6413134528396717, nan, 0.7748725756271927, 0.53125, nan, 0.9251055220288136, nan, 0.0, nan, nan, 0.9266661600425564, nan, nan, nan, nan, nan, 0.6877930980758744, nan, nan, 0.6408445114691642, 0.735396160468762, 0.0, nan, nan, 0.0, 0.0, 0.34044444444444444, nan, nan, nan, 0.8998677248677248, nan, nan, 0.9907431789179445, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1621548456957228, nan, 0.7432455987449886, 0.8788246252262701, nan, nan, 0.8103216464991209, 0.0, 0.8379262944295346, nan, nan, 0.869507507673589, 0.8158123370981755, nan, nan, 0.8875960939123207, nan, nan, nan, nan, nan, nan, 0.7939125998979766, 0.0, nan, 0.9791483113069016, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5385949696444059, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9372160264353573, 0.6218776916451335, nan, nan, nan, 0.9868831446272164, nan, nan, nan, 0.3633977779721809, 0.842091638622832, 0.0, nan, 0.8851412034727073, 0.9243945658594211, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9561789670551, 0.9882587814100812, 0.9943023332672352, 0.9669731407627772, 0.28295165394402033, 0.9945585171959359, 0.991830705409857, nan, 0.8842645305613974, nan, 0.9984877718099807, 0.9907098272608507, 0.942288631182476, 0.7920433996383364, 0.8338190582712393, 0.9683661985614327, 0.9988924016930432, 0.8824343015214384, nan, 0.8064760592490527, 0.9226327944572749, nan, 0.9695817490494296, nan, nan, nan, nan, 0.9572555638418966, nan, nan, nan, nan, nan, 0.9072338676531032, nan, nan, 0.6766677841099564, 0.7381087494841106, nan, nan, nan, 0.0, nan, 0.34044444444444444, nan, nan, nan, 0.9662960107138509, nan, nan, 0.9959580945011759, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.49545078577336643, nan, 0.8047180938900684, 0.9719528076347451, nan, nan, 0.8638368246968027, nan, 0.84746772591857, nan, nan, 0.9118960617133577, 0.8158123370981755, nan, nan, 0.903123513556366, nan, nan, nan, nan, nan, nan, 0.9741289380346339, 0.0, nan, 0.9864134218220634, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.584012539184953, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9563751317175975, 0.6395039858281665, nan, nan, nan, 0.9899708554774559, nan, nan, nan, 0.3655081390233172, 0.8535817370768827, nan, nan, 0.892354124748491, 0.9465248727382692, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0213 | 17.0 | 340 | 0.1368 | 0.6714 | 0.7875 | 0.9551 | [0.9067499508557408, 0.9821580506606801, 0.9656742349013637, 0.9244670447877958, 0.2546065904505716, 0.9755603367242496, 0.9836430253860245, nan, 0.817041640770665, nan, 0.8331640058055152, 0.8925996957873157, 0.8859683748577797, 0.7204220515702067, 0.8139929633973212, 0.8885240810169266, 0.9756620395008726, 0.6541392448908901, nan, 0.7765998457979953, 0.49516908212560384, nan, 0.9250537559825206, nan, 0.0, nan, nan, 0.9249814341636563, nan, nan, nan, nan, nan, 0.6921909339573419, nan, nan, 0.6555931669684326, 0.7460048301731669, 0.0, nan, nan, 0.0, nan, 0.39824561403508774, nan, nan, nan, 0.8915121587626945, nan, nan, 0.990649203768593, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.15126050420168066, nan, 0.7192319388331379, 0.8858780432229681, nan, nan, 0.8124357656731758, 0.0, 0.8831405173534382, nan, nan, 0.8529583414390128, 0.8169707500724008, nan, nan, 0.8882882882882883, nan, nan, nan, nan, nan, nan, 0.7808379235519947, 0.045714285714285714, nan, 0.9789477321018294, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5727350671620463, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9365466748017084, 0.6238297872340426, nan, nan, nan, 0.9864427012278308, nan, nan, nan, 0.37011393514461, 0.8410561739580636, 0.0, nan, 0.884680385724227, 0.9133916290037335, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9545654748071667, 0.9885167167921093, 0.99470222751008, 0.9595579686340803, 0.2752453653217012, 0.9927092632742422, 0.9933220382177629, nan, 0.8804870534338874, nan, 0.9978098764144548, 0.9796051676585862, 0.9454851818812569, 0.7728216462393678, 0.8309834748499504, 0.979889416254832, 0.9985455165089961, 0.8706777316735823, nan, 0.8327247674819153, 0.9468822170900693, nan, 0.9751754899093302, nan, nan, nan, nan, 0.9614815977810051, nan, nan, nan, nan, nan, 0.8996917385943279, nan, nan, 0.6979550787797519, 0.7489682212133718, nan, nan, nan, 0.0, nan, 0.40355555555555556, nan, nan, nan, 0.9708006980236191, nan, nan, 0.9987986275847324, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4913151364764268, nan, 0.7811276244397264, 0.9823708915625182, nan, nan, 0.8715545755237045, nan, 0.8995696789142668, nan, nan, 0.8900875819268024, 0.8169707500724008, nan, nan, 0.9119496855345912, nan, nan, nan, nan, nan, nan, 0.9760066764030878, 0.045714285714285714, nan, 0.9862567874878173, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.6282131661442006, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9704952581664911, 0.6492471213463242, nan, nan, nan, 0.9916852391565232, nan, nan, nan, 0.3715794104707435, 0.8525321437942797, nan, nan, 0.895271629778672, 0.9370999445592461, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0662 | 18.0 | 360 | 0.1409 | 0.6605 | 0.7686 | 0.9542 | [0.9030712244946921, 0.981512175158584, 0.9657512251202641, 0.9221994890332976, 0.2452637317704784, 0.9744408279085295, 0.9850679093666637, nan, 0.7840496501124414, nan, 0.8297749400514257, 0.8769567738194937, 0.9069277691711851, 0.7495120761161259, 0.8235093326713318, 0.8788998899889989, 0.9748452690166975, 0.6528478057889823, nan, 0.8030485228639215, 0.5259562841530054, nan, 0.9271252229254817, nan, 0.0, nan, nan, 0.9239590997894096, nan, nan, nan, nan, nan, 0.6953602817477164, nan, nan, 0.640427755889684, 0.725208097831672, 0.0, nan, nan, 0.0, nan, 0.2773333333333333, nan, nan, nan, 0.8954206642759243, nan, nan, 0.9914690601815351, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.14660093409444733, nan, 0.7355447751409466, 0.8872497733138034, nan, nan, 0.7939083106552229, 0.0, 0.8265706806282722, nan, nan, 0.856755034765506, 0.8010425716768028, nan, nan, 0.8836274407434478, nan, nan, nan, nan, nan, nan, 0.8039965247610773, 0.0, nan, 0.9787600426181358, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.42356687898089174, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9307866091599918, 0.479185119574845, nan, nan, nan, 0.9873379817769603, nan, nan, nan, 0.3659689990366932, 0.8380681818181818, 0.0, nan, 0.8173038229376257, 0.9310175162369613, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9580801876874384, 0.9897806001640469, 0.9938958292021944, 0.9632563272567554, 0.26165030897855324, 0.9927092632742422, 0.991613612659339, nan, 0.835974441750499, nan, 0.9984530079435435, 0.9880243867034403, 0.9284121320249777, 0.8230527091286585, 0.8361190314907292, 0.9772960806380584, 0.9992758011069898, 0.805901337021669, nan, 0.8511539786427833, 0.8891454965357968, nan, 0.9693258262649898, nan, nan, nan, nan, 0.9529117766351349, nan, nan, nan, nan, nan, 0.908898479243732, nan, nan, 0.6675326852162253, 0.7281262897234833, nan, nan, nan, 0.0, nan, 0.2773333333333333, nan, nan, nan, 0.9677164076133273, nan, nan, 0.9986662729966097, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46732837055417703, nan, 0.7940552016985138, 0.9889793776361057, nan, nan, 0.8362734288864389, nan, 0.8361469712015889, nan, nan, 0.896931732498115, 0.8010425716768028, nan, nan, 0.8945087468949844, nan, nan, nan, nan, nan, nan, 0.9653661589818485, 0.0, nan, 0.9859087111894927, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4586206896551724, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9551106427818756, 0.479185119574845, nan, nan, nan, 0.9892422424138522, nan, nan, nan, 0.3677078750549934, 0.8514825505116768, nan, nan, 0.8173038229376257, 0.953681770072073, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0517 | 19.0 | 380 | 0.1411 | 0.6674 | 0.7807 | 0.9539 | [0.9010726546114758, 0.981572104260939, 0.965844572896596, 0.9227951076404144, 0.2412848435038696, 0.9721603795176963, 0.9846847605415064, nan, 0.809910055608773, nan, 0.8331374341561099, 0.8857404021937842, 0.9156647074604297, 0.7336315070195055, 0.7603759538432905, 0.8800961347650652, 0.9747961518939248, 0.6425137362637363, nan, 0.806066411238825, 0.49967170059093896, nan, 0.9213580161635698, nan, 0.0, nan, nan, 0.9261340674883815, nan, nan, nan, nan, nan, 0.6896540825840299, nan, nan, 0.6372432389563011, 0.7299866406330284, 0.0, nan, nan, 0.0, nan, 0.3075555555555556, nan, nan, nan, 0.8921508207742731, nan, nan, 0.9903702356412433, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1650586701434159, nan, 0.7433352117560362, 0.8892995424146427, nan, nan, 0.8126226629480425, 0.0, 0.8752849234776946, nan, nan, 0.8465815886785109, 0.8355053576600058, nan, nan, 0.8887453493179, nan, nan, nan, nan, nan, nan, 0.7864241199258885, 0.0, nan, 0.9783160987412507, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5178880553952683, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9325040783034257, 0.6282271944922547, nan, nan, nan, 0.9867354772669112, nan, nan, nan, 0.3627365101611773, 0.8623262995367987, 0.0, nan, 0.8676073251275893, 0.9212819376753802, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9556577360202738, 0.9892131423235851, 0.9939982814462291, 0.9613001096220071, 0.25612504543802256, 0.9931343791183097, 0.9934353040006418, nan, 0.8720747994052482, nan, 0.9979663138134224, 0.9846131514007839, 0.930344930121915, 0.7909718036300315, 0.7723184045117283, 0.9765376522973039, 0.9992788439594814, 0.8626094974642693, nan, 0.8696520840509817, 0.8787528868360277, nan, 0.9753217315004388, nan, nan, nan, nan, 0.959492875927307, nan, nan, nan, nan, nan, 0.8968762844225237, nan, nan, 0.6733992624874288, 0.7329240610813041, nan, nan, nan, 0.0, nan, 0.3075555555555556, nan, nan, nan, 0.9693600097398645, nan, nan, 0.9978619643457102, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5235732009925558, nan, 0.8090587402689313, 0.9821376508775681, nan, nan, 0.8673649393605292, nan, 0.8897715988083416, nan, nan, 0.8830404268893915, 0.8355053576600058, nan, nan, 0.9089900110987791, nan, nan, nan, nan, nan, nan, 0.9741289380346339, 0.0, nan, 0.9851777509630111, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5626959247648903, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9637513171759747, 0.6465899025686448, nan, nan, nan, 0.9915566603805932, nan, nan, nan, 0.3643642762868456, 0.879296772500656, nan, nan, 0.8722334004024145, 0.9431984274986139, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0943 | 20.0 | 400 | 0.1485 | 0.6727 | 0.7694 | 0.9524 | [0.8977663663966882, 0.9797150076209377, 0.9645703198788488, 0.923065218160586, 0.22169614259852388, 0.972634431012922, 0.9835813622221389, nan, 0.7575181056022728, nan, 0.8685743685970673, 0.866798950265634, 0.91869581145439, 0.7360478892560953, 0.7419692355742953, 0.8850011143302875, 0.9755217635503992, 0.6366586949094636, nan, 0.8092433167195288, 0.49522673031026254, nan, 0.9276467706948429, nan, 0.0, nan, nan, 0.9202238882088943, nan, nan, nan, nan, nan, 0.6804310464954247, nan, nan, 0.5960989533777354, 0.7278053292905414, 0.0, nan, nan, 0.0, nan, 0.2577777777777778, nan, nan, nan, 0.8923140387391948, nan, nan, 0.9881201321332242, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.16511897852582705, nan, 0.7268125973351791, 0.8901335631193451, nan, nan, 0.8036842105263158, nan, 0.8655977056446357, nan, nan, 0.8406075273303847, 0.8271068635968722, nan, nan, 0.888802286308132, nan, nan, nan, nan, nan, nan, 0.8119762569832403, 0.0, nan, 0.9759888309410625, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4815455594002307, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9368660732767543, 0.576048951048951, nan, nan, nan, 0.985508481800358, nan, nan, nan, 0.2807648451890185, 0.8505806451612903, 0.0, nan, 0.8695174343091218, 0.906847290640394, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9642729878379426, 0.9881401311343482, 0.9956738713728601, 0.9627875732197497, 0.23147946201381317, 0.9919440547549208, 0.990660292320108, nan, 0.8000348278033033, nan, 0.9976882028819245, 0.9828712440121934, 0.9321042719793835, 0.7905699551269172, 0.7484679972904426, 0.9715222390761854, 0.9988650160206184, 0.859151682803135, nan, 0.8613158801240096, 0.9584295612009238, nan, 0.9693623866627669, nan, nan, nan, nan, 0.944315788096453, nan, nan, nan, nan, nan, 0.9031442663378545, nan, nan, 0.6300703989272545, 0.7306025588113908, nan, nan, nan, 0.0, nan, 0.2577777777777778, nan, nan, nan, 0.9656263950326691, nan, nan, 0.9958664644017063, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4706368899917287, nan, 0.792686954470394, 0.9637505102139984, nan, nan, 0.8417861080485116, nan, 0.879179079774909, nan, nan, 0.8763702801461632, 0.8271068635968722, nan, nan, 0.9040219861529517, nan, nan, nan, nan, nan, nan, 0.9703734612977258, 0.0, nan, 0.9814243282127443, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5235109717868338, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9538461538461539, 0.5837023914969, nan, nan, nan, 0.9909994856848963, nan, nan, nan, 0.2816542014958205, 0.8648648648648649, nan, nan, 0.8755533199195171, 0.9278262184365708, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0412 | 21.0 | 420 | 0.1430 | 0.6653 | 0.7779 | 0.9551 | [0.9073281447895429, 0.981046390776115, 0.9654003323005075, 0.9217240359426495, 0.27154011847065157, 0.9743193908417649, 0.9844806500133545, nan, 0.8142198423322922, nan, 0.8577771133866507, 0.8788506342221072, 0.9283099004100761, 0.7664277540203578, 0.8119602360981671, 0.8965610921192638, 0.9755309696380758, 0.6117665816326531, nan, 0.8051859322640466, 0.4977079240340537, nan, 0.9250788370239457, nan, 0.0, nan, nan, 0.9255701214482417, nan, nan, nan, nan, nan, 0.6753415050615929, nan, nan, 0.5778802576131032, 0.7801808281105518, 0.0, nan, nan, 0.0, nan, 0.2648888888888889, nan, nan, nan, 0.8864463423691051, nan, nan, 0.9912955567911843, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.16860143725815369, nan, 0.7323076923076923, 0.8773085436033643, nan, nan, 0.8156436050092384, 0.0, 0.8629682622268471, nan, nan, 0.8289517656232595, 0.8475556841191785, nan, nan, 0.8872872977051569, nan, nan, nan, nan, nan, nan, 0.7913279132791328, 0.0, nan, 0.9767589388696655, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5090647482014389, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9350675952478492, 0.5722996515679443, nan, nan, nan, 0.985744074215924, nan, nan, nan, 0.3174770039421813, 0.8465717981888745, 0.0, nan, 0.8648973460190286, 0.8990157480314961, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9568971813684805, 0.988496081961547, 0.9947088373322758, 0.9662127995688939, 0.29327517266448566, 0.9927305190664456, 0.9915428215450397, nan, 0.879911055148487, nan, 0.9973927100172081, 0.9856292640441283, 0.9423877490336009, 0.8522537003549662, 0.8234849320247641, 0.968879972598718, 0.9988863159880599, 0.8845089903181189, nan, 0.8525318635893903, 0.8775981524249422, nan, 0.975979818660427, nan, nan, nan, nan, 0.9542463136685376, nan, nan, nan, nan, nan, 0.9103370324702014, nan, nan, 0.6091183372443848, 0.7834812216260834, nan, nan, nan, 0.0, nan, 0.2648888888888889, nan, nan, nan, 0.9687918509800738, nan, nan, 0.9982997525987314, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5045492142266336, nan, 0.7748053786270347, 0.9630119147116563, nan, nan, 0.8760749724366041, nan, 0.8784508440913604, nan, nan, 0.8632329911258048, 0.8485375036200405, nan, nan, 0.9011680143755616, nan, nan, nan, nan, nan, nan, 0.9747548508241185, 0.0, nan, 0.9825613774539379, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5545454545454546, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9620653319283456, 0.5819309123117803, nan, nan, nan, 0.9927995885479171, nan, nan, nan, 0.3188737351517818, 0.858567305169247, nan, nan, 0.8688128772635815, 0.9207197217882164, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0199 | 22.0 | 440 | 0.1447 | 0.6543 | 0.7735 | 0.9546 | [0.9039550377724814, 0.9805792442592574, 0.9651985210092653, 0.9280094246838981, 0.25948293411142026, 0.9682457592056268, 0.9838782264642734, nan, 0.7846596188132486, nan, 0.8698082341179678, 0.8607947355099975, 0.9184953743256767, 0.7601198483551425, 0.834240099201736, 0.888963510185807, 0.9756241826180002, 0.6379928315412187, nan, 0.796530223241689, 0.5219478737997256, nan, 0.9255110613273593, nan, 0.0, nan, nan, 0.9156916417340378, nan, nan, nan, nan, nan, 0.6811012065689258, nan, nan, 0.5786924939467313, 0.7218325239600195, 0.0, nan, nan, 0.0, 0.0, 0.2568888888888889, nan, nan, nan, 0.885194467650608, nan, nan, 0.9851881788330387, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.2647887323943662, nan, 0.7357186867353226, 0.8891144469968125, nan, nan, 0.8115897118066315, 0.0, 0.8649951155975253, nan, nan, 0.8358096828046745, 0.8488271068635969, nan, nan, 0.8884051210283522, nan, nan, nan, nan, nan, nan, 0.8024499654934437, 0.0, nan, 0.9759326668474216, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4338654503990878, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9347600249324746, 0.609375, nan, nan, nan, 0.9859737380627558, nan, nan, nan, 0.39161938018332604, 0.8355348355348355, 0.0, nan, 0.8889551645292773, 0.8914553065747353, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9599061562807787, 0.9882226704565972, 0.9951500429638442, 0.9681911630636583, 0.27800799709196655, 0.9948773540789865, 0.9904856742381696, nan, 0.8371398336302627, nan, 0.9981227512123898, 0.9873711714327188, 0.9324016255327584, 0.8325631237023642, 0.8478551962066195, 0.9715222390761854, 0.9987828590033441, 0.8616874135546335, nan, 0.8492938339648639, 0.8787528868360277, nan, 0.9666569172272594, nan, nan, nan, nan, 0.9410186966021641, nan, nan, nan, nan, nan, 0.9060213727907933, nan, nan, 0.6409654710023466, 0.7246440363186133, nan, nan, nan, 0.0, nan, 0.2568888888888889, nan, nan, nan, 0.9675134937705451, nan, nan, 0.9927510410197412, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.38875103391232424, nan, 0.7887237556027364, 0.9704756166300609, nan, nan, 0.8662624035281147, nan, 0.8793114862628268, nan, nan, 0.8711501653036366, 0.8488271068635969, nan, nan, 0.9058717826753343, nan, nan, nan, nan, nan, nan, 0.9703734612977258, 0.0, nan, 0.981430129484383, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.47711598746081507, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9481559536354057, 0.6217891939769707, nan, nan, nan, 0.9912137836447797, nan, nan, nan, 0.3947206335239771, 0.8464969824193125, nan, nan, 0.8995975855130784, 0.9123028073181795, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0611 | 23.0 | 460 | 0.1400 | 0.6670 | 0.7796 | 0.9550 | [0.9056291095707336, 0.9815046187247813, 0.965093017962421, 0.9240066025772721, 0.23283705541770058, 0.9696410043577506, 0.9840532959326789, nan, 0.793840900310323, nan, 0.8489584103375372, 0.8835370237239396, 0.9146673739541389, 0.7555487431023912, 0.8607633728146117, 0.8943652440674907, 0.9751605372428582, 0.6375567895002524, nan, 0.8056213689915939, 0.5057692307692307, nan, 0.9277478862413528, nan, 0.0, nan, nan, 0.9160464763061968, nan, nan, nan, nan, nan, 0.6888519398258116, nan, nan, 0.583969465648855, 0.7256957829003161, 0.0, nan, nan, 0.0, nan, 0.28355555555555556, nan, nan, nan, 0.88433311766337, nan, nan, 0.9908382912958716, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.20684977958630044, nan, 0.7390609035582604, 0.8940956282223321, nan, nan, 0.8051599587203302, 0.0, 0.8744222381355381, nan, nan, 0.8321348940914158, 0.8540399652476107, nan, nan, 0.884737923946557, nan, nan, nan, nan, nan, nan, 0.7769796191112596, 0.0, nan, 0.9768347340551329, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4137830764757197, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9369721936148301, 0.6206008583690987, nan, nan, nan, 0.9860223301798346, nan, nan, nan, 0.37178590169669407, 0.8576962809917356, 0.0, nan, 0.9031655524970508, 0.9115636220549899, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9561335945021746, 0.9888210805429024, 0.9949120893647961, 0.9628281739631124, 0.2455834242093784, 0.9932194022871232, 0.9933833905168223, nan, 0.8532409950035498, nan, 0.9980879873459526, 0.9812019160981275, 0.93998414114382, 0.8253298506463064, 0.8849540793018164, 0.9700298478250232, 0.9990262872026753, 0.8734439834024896, nan, 0.855046503616948, 0.9110854503464203, nan, 0.970824802573852, nan, nan, nan, nan, 0.9469325273776348, nan, nan, nan, nan, nan, 0.8939786272092067, nan, nan, 0.6539557492457257, 0.728410028889806, nan, nan, nan, 0.0, nan, 0.28355555555555556, nan, nan, nan, 0.9706992411022279, nan, nan, 0.9986866352409363, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5045492142266336, nan, 0.7849492804906818, 0.9842368170421194, nan, nan, 0.8601984564498346, nan, 0.88924197285667, nan, nan, 0.8658720491850821, 0.8540399652476107, nan, nan, 0.9099413350245759, nan, nan, nan, nan, nan, nan, 0.9703734612977258, 0.0, nan, 0.9824279482062468, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4460815047021944, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9586933614330875, 0.6403897254207263, nan, nan, nan, 0.9916852391565232, nan, nan, nan, 0.3740431148262209, 0.8714248228811335, nan, nan, 0.9242454728370222, 0.9340759034322866, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0257 | 24.0 | 480 | 0.1440 | 0.6567 | 0.7687 | 0.9532 | [0.9010923741485296, 0.9813417169307022, 0.9646562453447638, 0.9210322973969263, 0.2263273125342091, 0.9711594443982368, 0.9840924712780731, nan, 0.7782854409747133, nan, 0.8424493864805829, 0.8863904549626328, 0.9176318863149698, 0.7361859416113269, 0.8341129119982595, 0.8756038913767621, 0.9752557597937287, 0.6308631211857019, nan, 0.8017986189176168, 0.5057962172056132, nan, 0.9275606587309948, nan, 0.0, nan, nan, 0.9194653916095477, nan, nan, nan, nan, nan, 0.6825275859331587, nan, nan, 0.5707147183525136, 0.723511422917791, 0.0, nan, nan, 0.0, nan, 0.28, nan, nan, nan, 0.8955265925480769, nan, nan, 0.9840276866883936, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.18865107018417124, nan, 0.7304286841413168, 0.8978562197950043, nan, nan, 0.8156237301909792, 0.0, 0.8316967792615868, nan, nan, 0.8401759122665405, 0.7955401100492325, nan, nan, 0.8827295440349782, nan, nan, nan, nan, nan, nan, 0.8046984572230014, 0.0, nan, 0.9767390890582013, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.24353906962602614, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9312267657992565, 0.5692982456140351, nan, nan, nan, 0.9864483584131327, nan, nan, nan, 0.3173169662428759, 0.8274072151570205, 0.0, nan, 0.8962771252239697, 0.897276801103068, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9603355355621217, 0.98870758897481, 0.9951616101526869, 0.9590965965504132, 0.24049436568520538, 0.992751774858649, 0.9932418082882236, nan, 0.8282855344058511, nan, 0.9988701743407902, 0.9813470750471767, 0.9344583209436019, 0.7870202933494073, 0.845570976228359, 0.9710818613299408, 0.9990110729402171, 0.8340248962655602, nan, 0.8599379951774027, 0.9572748267898383, nan, 0.9657794676806084, nan, nan, nan, nan, 0.9460035849328152, nan, nan, nan, nan, nan, 0.8834566378956021, nan, nan, 0.631746563861884, 0.7262174989682212, nan, nan, nan, 0.0, nan, 0.28, nan, nan, nan, 0.967594659307658, nan, nan, 0.9929241200965171, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.6269644334160464, nan, 0.7862231658410003, 0.9858111916655329, nan, nan, 0.885226019845645, nan, 0.8411122144985105, nan, nan, 0.875384258453686, 0.7955401100492325, nan, nan, 0.896305692088156, nan, nan, nan, nan, nan, nan, 0.9576465679115377, 0.0, nan, 0.9824337494778855, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2510971786833856, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.950263435194942, 0.5748449955713021, nan, nan, nan, 0.9889850848619921, nan, nan, nan, 0.3184337879454465, 0.8365258462345841, nan, nan, 0.9058350100603622, 0.9183508895720982, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0322 | 25.0 | 500 | 0.1493 | 0.6622 | 0.7792 | 0.9527 | [0.8987532517302642, 0.9821762592693188, 0.9652978739436654, 0.9247008398164701, 0.2464658881376767, 0.9660166095112176, 0.9841700503340007, nan, 0.798278495071498, nan, 0.8386729525783428, 0.8877510831035841, 0.9246757687516791, 0.7648765432098765, 0.7661324041811847, 0.8786893886156009, 0.975471042161282, 0.6302578018995929, nan, 0.7958702621157835, 0.46604215456674475, nan, 0.9266499407624225, nan, 0.0, nan, nan, 0.9198568512311298, nan, nan, nan, nan, nan, 0.6853465068802693, nan, nan, 0.6117514124293786, 0.716386068628711, 0.0, nan, nan, 0.0, nan, 0.3543516873889876, nan, nan, nan, 0.8873532085412685, nan, nan, 0.9860963755595299, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.2118533922537042, nan, 0.7313612099644128, 0.8829593587005986, nan, nan, 0.8264090499549505, 0.0, 0.8389147388426502, nan, nan, 0.8368210151380232, 0.854950781702374, nan, nan, 0.8848238482384824, nan, nan, nan, nan, nan, nan, 0.7803371724253046, 0.0, nan, 0.9776948671668214, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.29377845220030346, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9334292171769057, 0.6505922165820643, nan, nan, nan, 0.9866084798699354, nan, nan, nan, 0.36605030233984753, 0.8176378772112383, 0.0, nan, 0.8780779583291796, 0.9021333071175777, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9550756393656695, 0.9886921128518883, 0.9949302663758345, 0.9603330737346409, 0.2623773173391494, 0.993942099222038, 0.9929067303472068, nan, 0.8472532918971776, nan, 0.9987658827414785, 0.9815648134707504, 0.9381256814352265, 0.8298841336816021, 0.7793601033412625, 0.9789107990409551, 0.9989258730704511, 0.8566159520516367, nan, 0.8723045125732002, 0.9191685912240185, nan, 0.9722506580871599, nan, nan, nan, nan, 0.9550836702385158, nan, nan, nan, nan, nan, 0.9037813399095767, nan, nan, 0.6806067717063359, 0.7189176640528271, nan, nan, nan, 0.0, nan, 0.3546666666666667, nan, nan, nan, 0.9705572014122803, nan, nan, 0.9935858930371305, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.674110835401158, nan, 0.7757018164661477, 0.9805244028066629, nan, nan, 0.9101433296582139, nan, 0.84746772591857, nan, nan, 0.8721071863580999, 0.8551983782218361, nan, nan, 0.8973098673431636, nan, nan, nan, nan, nan, nan, 0.9753807636136032, 0.0, nan, 0.983576599990718, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.30344827586206896, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9574288724973656, 0.6811337466784765, nan, nan, nan, 0.9883421909823419, nan, nan, nan, 0.3675318961724593, 0.8247179218053005, nan, nan, 0.8861167002012073, 0.9250037800514087, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0596 | 26.0 | 520 | 0.1515 | 0.6627 | 0.7698 | 0.9535 | [0.898996883288365, 0.9810175804834121, 0.9653349632702696, 0.9289054400033855, 0.24659956474428726, 0.9696523092890503, 0.9849334611876418, nan, 0.7827180783817952, nan, 0.831613080675748, 0.8830779263123574, 0.9181114249537353, 0.7569973778888957, 0.809344490934449, 0.8759166259277366, 0.9748511136049212, 0.6175791556728232, nan, 0.8047978647223488, 0.4954308093994778, nan, 0.9286839617687218, nan, 0.0, nan, nan, 0.921801007556675, nan, nan, nan, nan, nan, 0.6897872073316176, nan, nan, 0.6223682127592212, 0.7164785436968092, 0.0, nan, nan, 0.0, nan, 0.296, nan, nan, nan, 0.8882658668802742, nan, nan, 0.991599017357987, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3551762114537445, nan, 0.7380358256839854, 0.8914093087767966, nan, nan, 0.8025706940874036, 0.0, 0.8506914858753359, nan, nan, 0.8335281674460031, 0.8300028960324356, nan, nan, 0.8839504883271531, nan, nan, nan, nan, nan, nan, 0.8004820106730934, 0.0, nan, 0.9772533327182988, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.31045655375552283, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9208454206382097, 0.5398230088495575, nan, nan, nan, 0.9859317540408792, nan, nan, nan, 0.3618455611976887, 0.8331606217616581, 0.0, nan, 0.8468089369802625, 0.9188271756776701, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9553832874074566, 0.9890996507554928, 0.9943304250115672, 0.972225400563243, 0.2636132315521629, 0.992921821196276, 0.993737346088319, nan, 0.8293437638139124, nan, 0.9985399176096366, 0.9828712440121934, 0.9343096441669144, 0.8314245529435403, 0.8227445296869831, 0.9643783334148848, 0.9991601727123074, 0.8633010603964961, nan, 0.8517051326214261, 0.8764434180138568, nan, 0.9698011114360924, nan, nan, nan, nan, 0.9576088236448561, nan, nan, nan, nan, nan, 0.9033292231812577, nan, nan, 0.6589842440496145, 0.7187886917044986, nan, nan, nan, 0.0, nan, 0.296, nan, nan, nan, 0.9675540765391015, nan, nan, 0.9986255485079566, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.533498759305211, nan, 0.7814578910120311, 0.9868413380240627, nan, nan, 0.860529217199559, nan, 0.8592519033432638, nan, nan, 0.8684821066063454, 0.8300028960324356, nan, nan, 0.8945087468949844, nan, nan, nan, nan, nan, nan, 0.9701648237012309, 0.0, nan, 0.9832401262356708, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3304075235109718, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9365648050579557, 0.5403011514614703, nan, nan, nan, 0.9882136122064118, nan, nan, nan, 0.3636603607567092, 0.843872999212805, nan, nan, 0.8503018108651912, 0.941333602136989, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0373 | 27.0 | 540 | 0.1535 | 0.6514 | 0.7772 | 0.9524 | [0.8963973714327141, 0.9815431861804222, 0.9656869828242253, 0.9268806134403067, 0.2601889338731444, 0.9681631637855989, 0.9856069911579105, nan, 0.777042752339995, nan, 0.8435352170592727, 0.8865554465161923, 0.9174633062958671, 0.7758838764078781, 0.7319016978060314, 0.8994057437241566, 0.974827587230619, 0.6110304789550073, nan, 0.8094741982744588, 0.5006172839506173, nan, 0.9267859010563748, nan, 0.0, nan, nan, 0.92450982853413, nan, nan, nan, nan, nan, 0.6830977143924768, nan, nan, 0.6236653386454183, 0.7161630448759162, 0.0, nan, nan, 0.0, 0.0, 0.3431111111111111, nan, nan, nan, 0.8806434395303039, nan, nan, 0.9882189671589781, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.20980091883614088, nan, 0.753966831017481, 0.885693618672924, nan, nan, 0.824291742013261, 0.0, 0.8546108567691302, nan, nan, 0.8229317851959361, 0.8294236895453229, nan, nan, 0.8852127990059024, nan, nan, nan, nan, nan, nan, 0.8037107681636899, 0.0, nan, 0.9770793950850661, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3821618428824572, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9215442092154421, 0.5928819444444444, nan, nan, nan, 0.9851231190150479, nan, nan, nan, 0.3537628775973459, 0.8534126163391934, 0.0, nan, 0.884470073573275, 0.9071815051647811, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9578168055509445, 0.9892802055229124, 0.9939222684909776, 0.9636881715270678, 0.2803344238458742, 0.9928580538196659, 0.9921516251280139, nan, 0.8229140155117678, nan, 0.998383480210669, 0.9835244592829148, 0.9417434830012885, 0.8627687361864577, 0.7436159989917925, 0.9738464549591427, 0.9991449584498492, 0.8734439834024896, nan, 0.8564588356872201, 0.9364896073903002, nan, 0.9718850541093887, nan, nan, nan, nan, 0.96364040768798, nan, nan, nan, nan, nan, 0.9016440608302507, nan, nan, 0.6559671471672812, 0.7183243912505158, nan, nan, nan, 0.0, nan, 0.3431111111111111, nan, nan, nan, 0.9709021549450103, nan, nan, 0.9932193726392523, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5665839536807279, nan, 0.7936305732484077, 0.9757624054889308, nan, nan, 0.9046306504961411, nan, 0.8650777888116518, nan, nan, 0.8550548112058465, 0.8294236895453229, nan, nan, 0.9035991755192643, nan, nan, nan, nan, nan, nan, 0.9670352597538077, 0.0, nan, 0.9835185872743305, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.40564263322884014, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9357218124341412, 0.6049601417183348, nan, nan, nan, 0.9876564375107149, nan, nan, nan, 0.35653321601407834, 0.8661768564681186, nan, nan, 0.8949698189134809, 0.9295398417418477, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0385 | 28.0 | 560 | 0.1548 | 0.6607 | 0.7661 | 0.9523 | [0.8969621541765724, 0.9806285969057003, 0.9635773293986869, 0.9231756991442915, 0.2249930728733721, 0.9688394078865531, 0.9854204853900137, nan, 0.7478959546390704, nan, 0.84956367401272, 0.8813208157979929, 0.9237327357318588, 0.7794047619047619, 0.7817020681510947, 0.9110552880701509, 0.9753082385097597, 0.6450381679389313, nan, 0.8004179046002154, 0.5100207325501037, nan, 0.9287141204920618, nan, 0.0, nan, nan, 0.9257957677040932, nan, nan, nan, nan, nan, 0.6798638313747435, nan, nan, 0.6271429710549398, 0.7125334293355277, 0.0, nan, nan, 0.0, nan, 0.3552397868561279, nan, nan, nan, 0.8879326403636228, nan, nan, 0.9867445131391219, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.19743863393810032, nan, 0.7175386283545676, 0.8872143471922151, nan, nan, 0.8105869279725225, 0.0, 0.8288317818277173, nan, nan, 0.838185905798107, 0.8169707500724008, nan, nan, 0.8838252163937845, nan, nan, nan, nan, nan, nan, 0.816590024761231, 0.0, nan, 0.975703737886479, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.39425587467362927, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9184308841843088, 0.5150442477876106, nan, nan, nan, 0.9847889249700906, nan, nan, nan, 0.31881014873140856, 0.829097510373444, 0.0, nan, 0.8815331010452961, 0.8871658543789691, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9594734570565386, 0.9871341831444387, 0.9955102782735145, 0.9712066910025062, 0.2361323155216285, 0.9933044254559368, 0.992033639937515, nan, 0.7880192356636706, nan, 0.9984008621438877, 0.9879518072289156, 0.939686787590445, 0.8769673832964973, 0.7925140597677972, 0.9582864412585017, 0.9989045731030097, 0.818118948824343, nan, 0.8445056837754048, 0.8521939953810623, nan, 0.96881398069611, nan, nan, nan, nan, 0.958197589983122, nan, nan, nan, nan, nan, 0.8988286066584463, nan, nan, 0.6591518605430774, 0.7147389599669831, nan, nan, nan, 0.0, nan, 0.35555555555555557, nan, nan, nan, 0.9672091230063715, nan, nan, 0.9928324899970474, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.45905707196029777, nan, 0.7493276716206653, 0.9817489164026512, nan, nan, 0.8846747519294377, nan, 0.8370076133730553, nan, nan, 0.8757902673858825, 0.8169707500724008, nan, nan, 0.8958300301252576, nan, nan, nan, nan, nan, nan, 0.9632797830168996, 0.0, nan, 0.981273495150137, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4260188087774295, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9325605900948367, 0.5155004428697962, nan, nan, nan, 0.9878278758786216, nan, nan, nan, 0.32063352397712275, 0.8388874311204408, nan, nan, 0.8908450704225352, 0.9082707524822338, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0412 | 29.0 | 580 | 0.1593 | 0.6500 | 0.7726 | 0.9514 | [0.8948418449819836, 0.9812311696829333, 0.9639522307057234, 0.9257996353408432, 0.24174917491749176, 0.9663349127852842, 0.98455014628053, nan, 0.7596775209422435, nan, 0.8658385842202057, 0.8839937434827946, 0.9306782489202984, 0.7693898655635988, 0.7245385450597177, 0.900695747699913, 0.9755799610634724, 0.6496042216358839, nan, 0.802975901665858, 0.49936708860759493, nan, 0.9273852393105618, nan, 0.0, nan, nan, 0.9155676822798242, nan, nan, nan, nan, nan, 0.6822022688722553, nan, nan, 0.6083743842364532, 0.7122244798230498, 0.0, nan, nan, 0.0, 0.0, 0.34044444444444444, nan, nan, nan, 0.8759535351687552, nan, nan, 0.9832898119471266, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.19440509915014165, nan, 0.7086296715741789, 0.8858214342306812, nan, nan, 0.8264661654135338, 0.0, 0.847679518230019, nan, nan, 0.8151593887072891, 0.8349261511728931, nan, nan, 0.8867043627120402, nan, nan, nan, nan, nan, nan, 0.8265906255569417, 0.0, nan, 0.9758325064038954, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4595292766934558, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9329179646936656, 0.6205704407951599, nan, nan, nan, 0.9847416335427619, nan, nan, nan, 0.27804024496937885, 0.8321641984931151, 0.0, nan, 0.8980773003727683, 0.8735903875510908, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9599692351958213, 0.9878976718752418, 0.9955615043955318, 0.9670543422495026, 0.25561613958560525, 0.9926880074820389, 0.9894568433770193, nan, 0.8078308976196535, nan, 0.998383480210669, 0.9844679924517347, 0.9398106849043513, 0.8969258589511754, 0.7358496510657068, 0.9628370113030288, 0.9987524304784277, 0.851313969571231, nan, 0.8551153978642784, 0.9110854503464203, nan, 0.9698011114360924, nan, nan, nan, nan, 0.9483063155002551, nan, nan, nan, nan, nan, 0.900924784217016, nan, nan, 0.6831210191082803, 0.7143004539826661, nan, nan, nan, 0.0, nan, 0.34044444444444444, nan, nan, nan, 0.9716326447790268, nan, nan, 0.9891061992852852, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.45409429280397023, nan, 0.7380514272234018, 0.9779004451009737, nan, nan, 0.9089305402425579, nan, 0.8573320092684542, nan, nan, 0.8461516153355374, 0.8349261511728931, nan, nan, 0.9033877702024206, nan, nan, nan, nan, nan, nan, 0.9676611725432923, 0.0, nan, 0.9812444887919431, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.5018808777429468, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9466807165437302, 0.6359610274579274, nan, nan, nan, 0.9874849991428082, nan, nan, nan, 0.2796304443466784, 0.8404618210443453, nan, nan, 0.9210261569416499, 0.8941081598709743, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0419 | 30.0 | 600 | 0.1654 | 0.6589 | 0.7657 | 0.9500 | [0.8894597148054091, 0.9806504481434059, 0.9635034979344921, 0.9272361312193227, 0.23905585288870593, 0.9666790766939687, 0.9853960222081665, nan, 0.747393299040734, nan, 0.8652372276512129, 0.8867275373825592, 0.9255024848826107, 0.7764501020775789, 0.6383005252411902, 0.9095685065844774, 0.975578001177107, 0.6456268475047818, nan, 0.8052866836029199, 0.4974779319041614, nan, 0.9281130553704807, nan, 0.0, nan, nan, 0.915925401799252, nan, nan, nan, nan, nan, 0.6784451746531076, nan, nan, 0.6009713895423845, 0.7131798894175132, 0.0, nan, nan, 0.0, nan, 0.296, nan, nan, nan, 0.8804477914234685, nan, nan, 0.9842483838659747, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.21609798775153105, nan, 0.7173296094418667, 0.887165095838187, nan, nan, 0.8345962113659023, 0.0, 0.8312873367032101, nan, nan, 0.8198376614320382, 0.8268172603533159, nan, nan, 0.8871587252154054, nan, nan, nan, nan, nan, nan, 0.8274446823697359, 0.0, nan, 0.9740183990621445, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.39105504587155965, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.927675585284281, 0.5916230366492147, nan, nan, nan, 0.9828449197860962, nan, nan, nan, 0.32230971128608926, 0.824935064935065, 0.0, nan, 0.9002729576915578, 0.879444499162809, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9616269933490478, 0.9877480693536655, 0.9959068676052614, 0.9691914177410484, 0.25328971283169754, 0.993431960209157, 0.99257637181381, nan, 0.7921449908242134, nan, 0.9988180285411343, 0.9727101175787487, 0.9367628109822579, 0.8660504989618913, 0.6451582413081491, 0.9530508391642609, 0.9986702734611534, 0.8559243891194098, nan, 0.8626248708232862, 0.9110854503464203, nan, 0.9676440479672419, nan, nan, nan, nan, 0.9484371524643143, nan, nan, nan, nan, nan, 0.8952938758734074, nan, nan, 0.6636775058665773, 0.7153322327692943, nan, nan, nan, 0.0, nan, 0.296, nan, nan, nan, 0.9702934134166633, nan, nan, 0.9905213752659818, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.40860215053763443, nan, 0.751309271054494, 0.9769869190849191, nan, nan, 0.9229327453142228, nan, 0.8383316782522343, nan, nan, 0.8523867525085552, 0.8268172603533159, nan, nan, 0.9033349188732097, nan, nan, nan, nan, nan, nan, 0.9674525349467974, 0.0, nan, 0.9784540771337077, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.42758620689655175, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.935300316122234, 0.6005314437555359, nan, nan, nan, 0.984656266072347, nan, nan, nan, 0.32415310162780464, 0.8333770663867751, nan, nan, 0.9290744466800804, 0.9000554407539942, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0461 | 31.0 | 620 | 0.1565 | 0.6614 | 0.7732 | 0.9528 | [0.8991798877938336, 0.9799250644936734, 0.9628604467671196, 0.9267811804087885, 0.22371671699149054, 0.9725405225217717, 0.9850289995733288, nan, 0.8308504034761018, nan, 0.8612243674301475, 0.8849632931305715, 0.9244002254515157, 0.7684300237414013, 0.7237533113352234, 0.8893156346158165, 0.9756389788230957, 0.6187214611872146, nan, 0.8099350670121229, 0.4732868757259001, nan, 0.9266520666040949, nan, 0.0, nan, nan, 0.9234712421823104, nan, nan, nan, nan, nan, 0.6849216300940438, nan, nan, 0.6062311717139686, 0.7188705446690326, 0.0, nan, nan, 0.0, nan, 0.25866666666666666, nan, nan, nan, 0.8850157436562326, nan, nan, 0.9866535799570971, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.19433488705629257, nan, 0.7056058028118564, 0.8852084821661933, nan, nan, 0.8273438273438274, 0.0, 0.8420397194730288, nan, nan, 0.8258628854778951, 0.8420900692840647, nan, nan, 0.8844428213689483, nan, nan, nan, nan, nan, nan, 0.8115512127028441, 0.0, nan, 0.9742180237090245, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4558739255014327, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9349492438367516, 0.5899218071242398, nan, nan, nan, 0.9847746129501326, nan, nan, nan, 0.324986867448783, 0.843579766536965, 0.0, nan, 0.8929384965831435, 0.8745262121585036, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9563549240286401, 0.9876139429550109, 0.9961745654041906, 0.9646588983911033, 0.23700472555434388, 0.9922203800535646, 0.9914814692459802, nan, 0.8964810523354722, nan, 0.9986615911421668, 0.9798954855566846, 0.9347308950341957, 0.8454222758020227, 0.7359756769955418, 0.9687087145862896, 0.9985516022139794, 0.8745965882895343, nan, 0.8722356183258698, 0.941108545034642, nan, 0.9745905235448962, nan, nan, nan, nan, 0.9640198348837513, nan, nan, nan, nan, nan, 0.8980271270036991, nan, nan, 0.6408816627556152, 0.7210586050350805, nan, nan, nan, 0.0, nan, 0.25866666666666666, nan, nan, nan, 0.9695629235826468, nan, nan, 0.9927510410197412, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.44830438378825477, nan, 0.743524416135881, 0.9874438764601838, nan, nan, 0.9213891951488423, nan, 0.8505130751406819, nan, nan, 0.8597529145641204, 0.8447726614538082, nan, nan, 0.8959885841128904, nan, nan, nan, nan, nan, nan, 0.9703734612977258, 0.0, nan, 0.9787789483454773, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4987460815047022, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9511064278187565, 0.6014171833480957, nan, nan, nan, 0.9868849648551345, nan, nan, nan, 0.326616805983282, 0.853319338756232, nan, nan, 0.9070422535211268, 0.8954185776926566, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.026 | 32.0 | 640 | 0.1624 | 0.6595 | 0.7696 | 0.9512 | [0.8943281702012047, 0.9808576666632527, 0.9631014478389521, 0.9275062592294862, 0.24392574392574393, 0.9683737124619178, 0.9847742370721561, nan, 0.7882612331787425, nan, 0.8687999032180015, 0.893223819301848, 0.8977612467336802, 0.7675957976559179, 0.6763256486389851, 0.896200921951995, 0.9755233532649398, 0.6285329744279946, nan, 0.790101940151266, 0.4647967945048655, nan, 0.9270241850683492, nan, 0.0, nan, nan, 0.9198165207364513, nan, nan, nan, nan, nan, 0.6804688475465102, nan, nan, 0.5940945185413615, 0.7179282663510523, 0.0, nan, nan, 0.0, nan, 0.2853333333333333, nan, nan, nan, 0.8874335822836548, nan, nan, 0.9811549905876161, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.17046894803548795, nan, 0.7041478365922412, 0.895628376202912, nan, nan, 0.8212106053026513, 0.0, 0.8698224852071006, nan, nan, 0.8211678832116789, 0.8506212077434268, nan, nan, 0.878044955567172, nan, nan, nan, nan, nan, nan, 0.8210749646393211, 0.0, nan, 0.9740433688197122, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4286125503742084, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9302422723475355, 0.6117850953206239, nan, nan, nan, 0.9837102911625123, nan, nan, nan, 0.33828801259071434, 0.8258951738453555, 0.0, nan, 0.8808197373656984, 0.8863669932548865, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9616944988546197, 0.9880782266426614, 0.995556547028885, 0.9667110814192543, 0.2598327880770629, 0.9931768907027165, 0.992038359345135, nan, 0.8332953799579387, nan, 0.9986268272757296, 0.9787342139642909, 0.9449895926256319, 0.8465608465608465, 0.6853605129255345, 0.9655526740715369, 0.9988102446757688, 0.8612263715998156, nan, 0.8276610403031347, 0.9376443418013857, nan, 0.9669494004094764, nan, nan, nan, nan, 0.949758605801311, nan, nan, nan, nan, nan, 0.8959926017262638, nan, nan, 0.6458263493127724, 0.7197430870821296, nan, nan, nan, 0.0, nan, 0.2853333333333333, nan, nan, nan, 0.9692585528184733, nan, nan, 0.9869987069974853, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.44499586435070304, nan, 0.7424864354800661, 0.9732161946782251, nan, nan, 0.9049614112458655, nan, 0.8856007944389275, nan, nan, 0.8547938054637202, 0.8525919490298292, nan, nan, 0.8877437767559854, nan, nan, nan, nan, nan, nan, 0.9689129981222616, 0.0, nan, 0.9787673458021998, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.46677115987460815, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9386722866174921, 0.62533215234721, nan, nan, nan, 0.9861134921995542, nan, nan, nan, 0.34043114826220855, 0.8352138546313304, nan, nan, 0.8907444668008049, 0.907363540144146, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.034 | 33.0 | 660 | 0.1699 | 0.6459 | 0.7693 | 0.9488 | [0.8861336073662662, 0.9810694359767812, 0.9632811550443633, 0.928760044496096, 0.25706612957368247, 0.9694381090607506, 0.9850402222649094, nan, 0.758450489773131, nan, 0.8668095252467328, 0.8914878064987707, 0.9211012295281139, 0.7682622711513898, 0.5609930067761393, 0.9043751995985219, 0.975454361753973, 0.6265401265401266, nan, 0.8116814159292035, 0.46369353916523726, nan, 0.9280305686040805, nan, 0.0, nan, nan, 0.9175367326018452, nan, nan, nan, nan, nan, 0.6832689731263298, nan, nan, 0.580840132970686, 0.7141092612130413, 0.0, nan, nan, 0.0, 0.0, 0.32355555555555554, nan, nan, nan, 0.8816274611016887, nan, nan, 0.9826290177351962, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1714465805231642, nan, 0.7025585876237845, 0.8954973896619924, nan, nan, 0.8250224842610173, 0.0, 0.8877438589668806, nan, nan, 0.8217899049583322, 0.8556998556998557, nan, nan, 0.8831479363759227, nan, nan, nan, nan, nan, nan, 0.8263419836473516, 0.0, nan, 0.9746616940698319, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.35232860862019094, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9371476299853831, 0.6741479634247715, nan, nan, nan, 0.9832913123370796, nan, nan, nan, 0.36895407717827833, 0.79374185136897, 0.0, nan, 0.8918518518518519, 0.8549840019689884, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9619114017905559, 0.9878512435064768, 0.9953268557075815, 0.9645555510443619, 0.2783715012722646, 0.9938358202610211, 0.9922365744651731, nan, 0.8007045932514434, nan, 0.9984182440771062, 0.9737262302220931, 0.9393150956487263, 0.845958073806175, 0.5699365144378455, 0.9699564515339825, 0.998834587495702, 0.8674504379898571, nan, 0.8688598002066827, 0.9364896073903002, nan, 0.9678634103539047, nan, nan, nan, nan, 0.9485810731247792, nan, nan, nan, nan, nan, 0.8908754623921085, nan, nan, 0.6443178008716057, 0.7158223276929426, nan, nan, nan, 0.0, nan, 0.32355555555555554, nan, nan, nan, 0.9703948703380545, nan, nan, 0.9888516712312031, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.44995864350703063, nan, 0.7397499410238264, 0.9768508620186981, nan, nan, 0.9102535832414553, nan, 0.9067858325057928, nan, nan, 0.8550838118438606, 0.8586736171445121, nan, nan, 0.8979440832936948, nan, nan, nan, nan, nan, nan, 0.9699561861047361, 0.0, nan, 0.9794112869541003, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.38181818181818183, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9458377239199157, 0.7183348095659876, nan, nan, nan, 0.9861992113835076, nan, nan, nan, 0.37184337879454465, 0.7987404880608764, nan, nan, 0.9084507042253521, 0.8754095055692758, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0269 | 34.0 | 680 | 0.1724 | 0.6601 | 0.7683 | 0.9478 | [0.8811793045132436, 0.9801464188808683, 0.9626097927113609, 0.9351348658103871, 0.21897457158651187, 0.9720546006908319, 0.9850844246883015, nan, 0.7593373570039648, nan, 0.8610299078030133, 0.8852159142875808, 0.9179415182029435, 0.7617675900956117, 0.4739597856058494, 0.9077968739272719, 0.9754600679997147, 0.627999338077114, nan, 0.8119973082962156, 0.47214076246334313, nan, 0.9274633123689727, nan, 0.0, nan, nan, 0.9185716627368898, nan, nan, nan, nan, nan, 0.6816683230009623, nan, nan, 0.5813486156594224, 0.7175865174321369, 0.0, nan, nan, 0.0, nan, 0.31377777777777777, nan, nan, nan, 0.8856360339020689, nan, nan, 0.9848241114517255, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.36560595802301965, nan, 0.7032143648858347, 0.8897996195082055, nan, nan, 0.8239087301587301, 0.0, 0.9035002272874862, nan, nan, 0.8242231164766376, 0.8601499423298731, nan, nan, 0.8843013642000827, nan, nan, nan, nan, nan, nan, 0.8045125732001378, 0.0, nan, 0.9737483397817174, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3771955082061618, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9332362728785357, 0.6585365853658537, nan, nan, nan, 0.986626789147618, nan, nan, nan, 0.3521434820647419, 0.8118811881188119, 0.0, nan, 0.9, 0.8706472555577415, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.960647610194438, 0.9876448952008543, 0.9960456738713729, 0.9698299567048436, 0.23038894947291894, 0.9929643327806827, 0.9914814692459802, nan, 0.8055670904049402, nan, 0.9983313344110132, 0.9834518798083902, 0.939686787590445, 0.8324291742013261, 0.4819861686542006, 0.9704946910016147, 0.9987098305435448, 0.8748271092669433, nan, 0.8728901136755081, 0.9295612009237876, nan, 0.9704591985960808, nan, nan, nan, nan, 0.9531603668668472, nan, nan, nan, nan, nan, 0.8880600082203042, nan, nan, 0.6546262152195776, 0.7193819645068097, nan, nan, nan, 0.0, nan, 0.31377777777777777, nan, nan, nan, 0.9711050687877927, nan, nan, 0.9910406124963094, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4466501240694789, nan, 0.7483368719037509, 0.9727108398608331, nan, nan, 0.9156560088202866, nan, 0.9210857332009268, nan, nan, 0.8553448175859869, 0.8638864755285259, nan, nan, 0.9044447967866391, nan, nan, nan, nan, nan, nan, 0.9745462132276236, 0.0, nan, 0.9782336288114355, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.4106583072100313, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.945626975763962, 0.6935341009743136, nan, nan, nan, 0.9897136979255957, nan, nan, nan, 0.354157501099868, 0.8176331671477303, nan, nan, 0.9144869215291751, 0.8921929338239, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0128 | 35.0 | 700 | 0.1676 | 0.6567 | 0.7616 | 0.9494 | [0.8887820525934624, 0.9808457367556027, 0.9627868189871254, 0.9324426250987086, 0.22132740905304082, 0.9702316698497052, 0.9844074455149191, nan, 0.7380737730937719, nan, 0.8493045393404579, 0.8871909228044861, 0.9167652955242627, 0.7694730990178517, 0.657496582577358, 0.8991569833675097, 0.9754766772859113, 0.6392035702025404, nan, 0.8065417221554553, 0.5006321112515802, nan, 0.9301964657331311, nan, 0.0, nan, nan, 0.9150045676004872, nan, nan, nan, nan, nan, 0.6609706392935316, nan, nan, 0.5648141222230534, 0.7149435340724924, 0.0, nan, nan, 0.0, nan, 0.26222222222222225, nan, nan, nan, 0.882529731723057, nan, nan, 0.990016200891049, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34004024144869216, nan, 0.7109004739336493, 0.8658150982717114, nan, nan, 0.8270016950842557, 0.0, 0.8759861772184913, nan, nan, 0.8149650349650349, 0.823953823953824, nan, nan, 0.879149025142439, nan, nan, nan, nan, nan, nan, 0.8219516499029469, 0.0, nan, 0.9734145777983135, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2836776258364853, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9265195670274771, 0.6708860759493671, nan, nan, nan, 0.9867048563611491, nan, nan, nan, 0.2711091626479614, 0.8082334549244398, 0.0, nan, 0.8806223197367109, 0.887970479704797, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9615484213671525, 0.9882433052871594, 0.9958804283164783, 0.9719042855930108, 0.23177026535805162, 0.9934532160013604, 0.9933692322939625, nan, 0.7786291240807469, nan, 0.9987311188750413, 0.9817825518943243, 0.9405045098622262, 0.8343044672158596, 0.6667874415160919, 0.9655037432108431, 0.9987980732658023, 0.8584601198709082, nan, 0.8553565277299345, 0.9145496535796767, nan, 0.9641708101784148, nan, nan, nan, nan, 0.9435569337049103, nan, nan, nan, nan, nan, 0.8998355939169749, nan, nan, 0.6328360710693932, 0.7168799009492365, nan, nan, nan, 0.0, nan, 0.26222222222222225, nan, nan, nan, 0.9712268170934621, nan, nan, 0.9954490383930117, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.41935483870967744, nan, 0.7501769285208776, 0.9367140274835274, nan, nan, 0.9144432194046307, nan, 0.8894405825885469, nan, nan, 0.8449335885389478, 0.8268172603533159, nan, nan, 0.8889065059986259, nan, nan, nan, nan, nan, nan, 0.9718339244731901, 0.0, nan, 0.9777289181788648, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3056426332288401, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9380400421496312, 0.704162976085031, nan, nan, nan, 0.9892422424138522, nan, nan, nan, 0.2720633523977123, 0.8139595906586198, nan, nan, 0.8883299798792756, 0.9096315709893654, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0262 | 36.0 | 720 | 0.1662 | 0.6472 | 0.7707 | 0.9495 | [0.8897901140013882, 0.9811053353252973, 0.964074243680929, 0.9347419895578641, 0.26674770394381414, 0.9692779701071746, 0.9843918141240982, nan, 0.7501769911504425, nan, 0.8482601760500975, 0.893537889353789, 0.9144386310180963, 0.7566193340360885, 0.6334707030644812, 0.9093161432405772, 0.9753916185314602, 0.6467045834727334, nan, 0.8165674699970729, 0.46774193548387094, nan, 0.9295690413368514, nan, 0.0, nan, nan, 0.9111142040134625, nan, nan, nan, nan, nan, 0.6527125288009815, nan, nan, 0.5855023993020212, 0.7181268493503152, 0.0, nan, nan, 0.0, 0.0, 0.2657777777777778, nan, nan, nan, 0.8822176001767013, nan, nan, 0.9899424705262732, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.30895601483836777, nan, 0.7204515272244356, 0.8585303445293354, nan, nan, 0.8309046454767726, 0.0, 0.9002079002079002, nan, nan, 0.8182047191199933, 0.8385236447520185, nan, nan, 0.8842782301986618, nan, nan, nan, nan, nan, nan, 0.8089082969432314, 0.0, nan, 0.9742506119809708, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3563551944283227, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9301890712653231, 0.7002457002457002, nan, nan, nan, 0.986336464560205, nan, nan, nan, 0.30919439579684765, 0.8209266007287871, 0.0, nan, 0.8879729863938822, 0.832439876063542, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.960400827772429, 0.9892286184465068, 0.994945138475775, 0.9694018034112006, 0.28716830243547803, 0.9938570760532245, 0.9929539244234064, nan, 0.7948642385436621, nan, 0.9983139524777945, 0.9764842502540282, 0.9441223114282883, 0.8172259058335007, 0.6441185273870099, 0.9697607280912072, 0.998877187430585, 0.8911940986629784, nan, 0.8648639338615226, 0.9376443418013857, nan, 0.9660353904650483, nan, nan, nan, nan, 0.9421438944930722, nan, nan, nan, nan, nan, 0.8965474722564735, nan, nan, 0.6749078109285954, 0.7199236483697895, nan, nan, nan, 0.0, nan, 0.2657777777777778, nan, nan, nan, 0.9725660484558256, nan, nan, 0.9950926991172967, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.48221670802315963, nan, 0.7678697806086341, 0.926062702870804, nan, nan, 0.9367144432194047, nan, 0.9173121482952664, nan, nan, 0.8477466504263094, 0.8421662322618013, nan, nan, 0.9010094603879287, nan, nan, nan, nan, nan, nan, 0.9662007093678281, 0.0, nan, 0.9789529864946396, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3849529780564263, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9435194942044257, 0.7573073516386183, nan, nan, nan, 0.9900565746614092, nan, nan, nan, 0.3106907171139463, 0.8276043033324587, nan, nan, 0.8994969818913481, 0.8530820019152261, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0356 | 37.0 | 740 | 0.1642 | 0.6431 | 0.7651 | 0.9505 | [0.8969120766873576, 0.9804800425703921, 0.963030504508709, 0.9277287446370619, 0.23565700185070945, 0.9725784958774049, 0.9833275767108838, nan, 0.7733883548890342, nan, 0.8381265047056249, 0.8860294117647058, 0.9222241173460686, 0.7426834618035971, 0.7505384341251027, 0.9022750068386979, 0.9755251613823602, 0.6457864682471016, nan, 0.8136242237024166, 0.4830917874396135, nan, 0.9302022867194372, nan, 0.0, nan, nan, 0.9179146008297075, nan, nan, nan, nan, nan, 0.654447423115154, nan, nan, 0.6007754884817152, 0.7116428663321754, 0.0, nan, nan, 0.0, 0.0, 0.2577777777777778, nan, nan, nan, 0.8767539286171634, nan, nan, 0.9892568838893722, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.16952856696846708, nan, 0.707104646536098, 0.8646209062359803, nan, nan, 0.8415772745331744, 0.0, 0.8614398744523638, nan, nan, 0.8043514784946236, 0.8094410657399362, nan, nan, 0.8827668890742285, nan, nan, nan, nan, nan, nan, 0.8167979002624672, 0.022857142857142857, nan, 0.9738004631260431, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.32908827785817657, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9200417536534446, 0.6720067453625632, nan, nan, nan, 0.9857319834251783, nan, nan, nan, 0.3440728418840833, 0.8249219562955254, 0.0, nan, 0.8752112956149946, 0.7485331098072087, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9600610869493045, 0.988542510330312, 0.9956821336506048, 0.9633338377668115, 0.24994547437295528, 0.9928793096118692, 0.9942564809265141, nan, 0.823436432561317, nan, 0.9984356260103249, 0.9620409348236318, 0.937877886807414, 0.7937177684013127, 0.7630712519100805, 0.9683661985614327, 0.9987646018883942, 0.8603042876901799, nan, 0.8709955218739235, 0.9237875288683602, nan, 0.9666934776250365, nan, nan, nan, nan, 0.9495361829624105, nan, nan, nan, nan, nan, 0.8930127414714344, nan, nan, 0.6622527656721421, 0.7134234420140322, nan, nan, nan, 0.0, nan, 0.2577777777777778, nan, nan, nan, 0.9724848829187127, nan, nan, 0.9956322985919508, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4491315136476427, nan, 0.7546119367775419, 0.9364807867985773, nan, nan, 0.9341786108048512, nan, 0.8721615359152598, nan, nan, 0.8330433269531929, 0.8094410657399362, nan, nan, 0.8950372601870937, nan, nan, nan, nan, nan, nan, 0.9739203004381389, 0.022857142857142857, nan, 0.9782916415278229, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.35642633228840126, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9287671232876712, 0.7059344552701505, nan, nan, nan, 0.9889850848619921, nan, nan, nan, 0.3457985041794985, 0.8320650747835214, nan, nan, 0.8855130784708249, 0.7651328058061589, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0306 | 38.0 | 760 | 0.1636 | 0.6452 | 0.7736 | 0.9497 | [0.8947461577953695, 0.9815833030146196, 0.9638059301434289, 0.9281384205856256, 0.26520796640021677, 0.9679998343033491, 0.9838990935058742, nan, 0.7716728181322375, nan, 0.8372577893769957, 0.8888595041322314, 0.9270299301791904, 0.7740930743862221, 0.7172666108579047, 0.884403832505323, 0.9754878360466706, 0.6258612401858676, nan, 0.8124840194323703, 0.4665532879818594, nan, 0.929540911163329, nan, 0.0, nan, nan, 0.9119934143870314, nan, nan, nan, nan, nan, 0.6530055048037899, nan, nan, 0.5940340076223981, 0.7142305812128541, 0.0, nan, nan, 0.0, 0.0, 0.3022222222222222, nan, nan, nan, 0.874173391871459, nan, nan, 0.9856365438692318, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.21183699870633893, nan, 0.7149239586623946, 0.845639260038889, nan, nan, 0.8394029850746269, 0.0, 0.8526260778677816, nan, nan, 0.8180349332813327, 0.8386816999132697, nan, nan, 0.8773481241170007, nan, nan, nan, nan, nan, nan, 0.8030276965422329, 0.0, nan, 0.9740358739215301, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3207270595133392, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9255008347245409, 0.6969205834683955, nan, nan, nan, 0.9862066020412521, nan, nan, nan, 0.3902566788894709, 0.8393782383419689, 0.0, nan, 0.8854779228545799, 0.7459315514350527, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9566138795746046, 0.9881762420878322, 0.9952591050300746, 0.9652014719614957, 0.2846237731733915, 0.9934107044169537, 0.9932323694729838, nan, 0.8149839926057895, nan, 0.9981575150788271, 0.9757584555087822, 0.9409505401922886, 0.8489049628290135, 0.7288709652010901, 0.9755835005137741, 0.9987798161508524, 0.9004149377593361, nan, 0.8756803306923872, 0.9503464203233256, nan, 0.9689967826849956, nan, nan, nan, nan, 0.942170061885884, nan, nan, nan, nan, nan, 0.9093094944512947, nan, nan, 0.6792658397586322, 0.7160544779199339, nan, nan, nan, 0.0, nan, 0.3022222222222222, nan, nan, nan, 0.9737023659754068, nan, nan, 0.9920689058348011, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5417700578990902, nan, 0.7474404340646379, 0.9213784524480554, nan, nan, 0.9300992282249173, nan, 0.8640847401522674, nan, nan, 0.8516037352821763, 0.840139009556907, nan, nan, 0.8861582368796576, nan, nan, nan, nan, nan, nan, 0.9739203004381389, 0.0, nan, 0.9784772822202626, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.34294670846394987, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.934668071654373, 0.7617360496014172, nan, nan, nan, 0.9897994171095491, nan, nan, nan, 0.39331280246370437, 0.850170558908423, nan, nan, 0.8937625754527163, 0.7623607681064463, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0289 | 39.0 | 780 | 0.1728 | 0.6429 | 0.7687 | 0.9487 | [0.8900697098383155, 0.9812310890166545, 0.9631470806917117, 0.9257982763761305, 0.2625118676251187, 0.9690012432656444, 0.9852369306309092, nan, 0.7536557216958998, nan, 0.8349774090915695, 0.8895141003574738, 0.9232410415555448, 0.7754779008888347, 0.6573629234727086, 0.9051176861549745, 0.9754801591664859, 0.6146979116849992, nan, 0.8144256340403451, 0.4806432400238237, nan, 0.9278874820244818, nan, 0.0, nan, nan, 0.914418063079026, nan, nan, nan, nan, nan, 0.6696003156201633, nan, nan, 0.575874310748546, 0.7104626627553908, 0.0, nan, nan, 0.0, 0.0, 0.264, nan, nan, nan, 0.874402568499398, nan, nan, 0.9857437932850026, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.1622769753610875, nan, 0.7132467764243966, 0.8631427147130039, nan, nan, 0.8384615384615385, 0.0, 0.8777654506297723, nan, nan, 0.8009580905423577, 0.8216044019693021, nan, nan, 0.8760123308427817, nan, nan, nan, nan, nan, nan, 0.8083935328517372, 0.0, nan, 0.9722424452533657, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.34898550724637684, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9220289247537204, 0.700081499592502, nan, nan, nan, 0.9859166987714567, nan, nan, nan, 0.37214173503229186, 0.8243383497664764, 0.0, nan, 0.8927479033053775, 0.7682151830139415, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9595619888671248, 0.9887024302671694, 0.9953681670963052, 0.9627063717330243, 0.28142493638676847, 0.9940058665986481, 0.9924300701775913, nan, 0.796016235114463, nan, 0.9990092298065391, 0.9752503991871099, 0.9436019427098821, 0.8531243721117139, 0.6682052332267364, 0.9662132406909038, 0.9988224160857354, 0.8888888888888888, nan, 0.8650361694798484, 0.9318706697459584, nan, 0.9672053231939164, nan, nan, nan, nan, 0.9452970653268962, nan, nan, nan, nan, nan, 0.9068639539662967, nan, nan, 0.6389540730807911, 0.7121853074700785, nan, nan, nan, 0.0, nan, 0.264, nan, nan, nan, 0.9726269226086603, nan, nan, 0.9911933293287586, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4739454094292804, nan, 0.7542344892663364, 0.9423118039223308, nan, nan, 0.9253583241455348, nan, 0.8904336312479312, nan, nan, 0.8291572414593121, 0.8216044019693021, nan, nan, 0.8861053855504466, nan, nan, nan, nan, nan, nan, 0.9805967035259754, 0.0, nan, 0.9761567735647654, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3774294670846395, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9270811380400421, 0.7608503100088574, nan, nan, nan, 0.9871421224069947, nan, nan, nan, 0.3751869775626925, 0.8336394647074259, nan, nan, 0.9102615694164989, 0.785948288896729, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0446 | 40.0 | 800 | 0.1747 | 0.6421 | 0.7689 | 0.9492 | [0.8912983510714947, 0.9806201352035986, 0.9621803529408008, 0.9257605051705483, 0.23514407831242246, 0.9698773935232247, 0.9844502313730633, nan, 0.786848610452196, nan, 0.8428902446178214, 0.8852620517535794, 0.9116699523969803, 0.7615740740740741, 0.6461262708983591, 0.909711298232627, 0.9755043570265226, 0.5999081304547542, nan, 0.8170524388649143, 0.47463556851311955, nan, 0.928561423168511, nan, 0.0, nan, nan, 0.9172287943619432, nan, nan, nan, nan, nan, 0.6663463151953382, nan, nan, 0.5801556120483784, 0.7131018864039941, 0.0, nan, nan, 0.0, 0.0, 0.2853333333333333, nan, nan, nan, 0.8720642768850433, nan, nan, 0.9851946368534046, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.15554314437308014, nan, 0.6963951101989828, 0.8649359886201992, nan, nan, 0.8362875921131249, 0.0, 0.8801121909855848, nan, nan, 0.7934116588903255, 0.819047619047619, nan, nan, 0.879833895665715, nan, nan, nan, nan, nan, nan, 0.7999316589783018, 0.0, nan, 0.971650958932606, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3242142025611176, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9212400502723084, 0.7063106796116505, nan, nan, nan, 0.9855431993156544, nan, nan, nan, 0.3301013277428372, 0.823146944083225, 0.0, nan, 0.8938635247913599, 0.8403658896429625, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.958281597556522, 0.9885218754997498, 0.9959415691717893, 0.9631898896767074, 0.2479825517993457, 0.9937295413000042, 0.9919628488232157, nan, 0.8442393473805473, nan, 0.9990266117397577, 0.9782987371171433, 0.9396372286648825, 0.837318330989217, 0.6587375352478773, 0.9621030483926212, 0.9987433019209527, 0.9031811894882434, nan, 0.8609025146400275, 0.9399538106235565, nan, 0.9694355074583212, nan, nan, nan, nan, 0.9535790451518363, nan, nan, nan, nan, nan, 0.8976777640772708, nan, nan, 0.6311599061347637, 0.7147389599669831, nan, nan, nan, 0.0, nan, 0.2853333333333333, nan, nan, nan, 0.973458869364068, nan, nan, 0.9904806507773287, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46071133167907363, nan, 0.7364472753007785, 0.9454799898929036, nan, nan, 0.9259095920617421, nan, 0.8932803707381661, nan, nan, 0.818630009860217, 0.8218940052128584, nan, nan, 0.8958300301252576, nan, nan, nan, nan, nan, nan, 0.9768412267890674, 0.0, nan, 0.97548962732631, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3492163009404389, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9268703898840885, 0.7732506643046945, nan, nan, nan, 0.9875707183267616, nan, nan, nan, 0.3325120985481742, 0.8304906848596169, nan, nan, 0.9158953722334005, 0.8612469129580163, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0192 | 41.0 | 820 | 0.1705 | 0.6544 | 0.7673 | 0.9498 | [0.893787556557121, 0.9808370483731452, 0.9628753540170951, 0.9298644540381771, 0.23933031425826815, 0.9726657645466847, 0.9839997379307763, nan, 0.7479903282569341, nan, 0.8367418232631257, 0.8854816468579684, 0.9110350112553284, 0.7618459257428801, 0.7415435700099305, 0.9082342641231222, 0.9754757461577778, 0.6236369467607441, nan, 0.8123262128299912, 0.4574468085106383, nan, 0.9267796372311686, nan, 0.0, nan, nan, 0.9145654710567703, nan, nan, nan, nan, nan, 0.6768717465515377, nan, nan, 0.5791428359206715, 0.7123738933498044, 0.0, nan, nan, 0.0, nan, 0.30044444444444446, nan, nan, nan, 0.8730008551518349, nan, nan, 0.9861641076507323, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.22868944799098762, nan, 0.7065013823240881, 0.886899860370198, nan, nan, 0.8357044062438254, 0.0, 0.8784602357845372, nan, nan, 0.794371808540486, 0.8353552859618717, nan, nan, 0.8778594132666354, nan, nan, nan, nan, nan, nan, 0.8081837016574586, 0.0, nan, 0.9733141606463275, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.23201650943396226, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9187604690117253, 0.7008064516129032, nan, nan, nan, 0.9859589041095891, nan, nan, nan, 0.3421351753082102, 0.8271925272444214, 0.0, nan, 0.8972839506172839, 0.7069220894638096, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.959471243761274, 0.988578621283796, 0.9955185405512592, 0.9675009504264923, 0.253580516175936, 0.9931131233261064, 0.9923262432099523, nan, 0.7914752253760733, nan, 0.9992004310719438, 0.9787342139642909, 0.9426851025869759, 0.8259326234009778, 0.7528631515934403, 0.9644517297059255, 0.9988802302830766, 0.8964960811433841, nan, 0.8553909748535997, 0.9434180138568129, nan, 0.9657794676806084, nan, nan, nan, nan, 0.9434391804372572, nan, nan, nan, nan, nan, 0.9005343197698314, nan, nan, 0.6534528997653369, 0.7139909203466777, nan, nan, nan, 0.0, nan, 0.30044444444444446, nan, nan, nan, 0.9736009090540156, nan, nan, 0.9919874568574948, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5037220843672456, nan, 0.7475347959424392, 0.9629730412641645, nan, nan, 0.9326350606394708, nan, 0.8928831512744124, nan, nan, 0.82109506409141, 0.8375325803649001, nan, nan, 0.8903863432165319, nan, nan, nan, nan, nan, nan, 0.9766325891925725, 0.0, nan, 0.9777579245370586, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.24670846394984325, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9247629083245522, 0.7697077059344553, nan, nan, nan, 0.9871421224069947, nan, nan, nan, 0.34430268367795863, 0.8365258462345841, nan, nan, 0.9139839034205232, 0.721637014263394, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0205 | 42.0 | 840 | 0.1684 | 0.6439 | 0.7688 | 0.9508 | [0.8956103882639573, 0.9795012603989304, 0.9628613739438543, 0.9296053028448145, 0.23617883387130004, 0.9694526765175222, 0.9841928713316592, nan, 0.776366281257054, nan, 0.8305831695757716, 0.8746789933230611, 0.91282918579367, 0.7509578544061303, 0.7455650663689368, 0.9033252637829443, 0.9756014184945291, 0.6343047460449626, nan, 0.8163205899476034, 0.4818181818181818, nan, 0.9295464090718185, nan, 0.0, nan, nan, 0.9181872944956201, nan, nan, nan, nan, nan, 0.6859425616745577, nan, nan, 0.5973394146712276, 0.7148555106662207, 0.0, nan, nan, 0.0, 0.0, 0.25333333333333335, nan, nan, nan, 0.8811161798248841, nan, nan, 0.9876549456109284, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.23317906728373086, nan, 0.6996854649359855, 0.8843567354411661, nan, nan, 0.8339385611934439, 0.0, 0.8671885192433137, nan, nan, 0.8077591036414565, 0.8367935409457901, nan, nan, 0.8831636648394675, nan, nan, nan, nan, nan, nan, 0.8185328185328186, 0.005714285714285714, nan, 0.9733681492307958, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22833481218574386, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9272841575859179, 0.6924979389942292, nan, nan, nan, 0.98643211778805, nan, nan, nan, 0.35615001311303435, 0.8314752398236972, 0.0, nan, 0.8833349895657359, 0.7491615703294535, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.955989730309972, 0.9882226704565972, 0.9959861854716108, 0.9669362309960838, 0.2500181752090149, 0.9943247034816988, 0.9914248363545408, nan, 0.8292901825780612, nan, 0.9991656672055066, 0.9888227609232109, 0.9362176628010704, 0.8138771683075481, 0.757415838308732, 0.967681166511719, 0.9986794020186284, 0.8780544029506685, nan, 0.8694109541853255, 0.918013856812933, nan, 0.9710076045627376, nan, nan, nan, nan, 0.9572424801454907, nan, nan, nan, nan, nan, 0.9051171393341554, nan, nan, 0.6585652028159571, 0.7165703673132481, nan, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.9719775983117568, nan, nan, 0.9937284287474165, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5417700578990902, nan, 0.745175748997405, 0.9764038173725437, nan, nan, 0.9368246968026461, nan, 0.8801059251903344, nan, nan, 0.836291398410765, 0.8404286128004633, nan, nan, 0.894085936261297, nan, nan, nan, nan, nan, nan, 0.9730857500521594, 0.005714285714285714, nan, 0.9774620596834827, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.24200626959247648, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9325605900948367, 0.7440212577502214, nan, nan, nan, 0.987785016286645, nan, nan, nan, 0.35846898372195335, 0.8415114143269483, nan, nan, 0.8942655935613683, 0.7655864119752028, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0113 | 43.0 | 860 | 0.1749 | 0.6693 | 0.7649 | 0.9485 | [0.8862033037289665, 0.9802356771366559, 0.9624554304939251, 0.9315883044580754, 0.25438177726249744, 0.9719408448946482, 0.9846246071851744, nan, 0.7368087747295894, nan, 0.855335803681712, 0.8860253131352875, 0.9228027639713905, 0.763580812710052, 0.6100734132515565, 0.9168496286261503, 0.9756960655162468, 0.6049612403100775, nan, 0.8131383547874582, 0.4744186046511628, nan, 0.9291752902795805, nan, 0.0, nan, nan, 0.9203683640303358, nan, nan, nan, nan, nan, 0.6856276011055692, nan, nan, 0.586016057370021, 0.7150981754548495, 0.0, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.8726262073993074, nan, nan, 0.9821439386292521, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3536745406824147, nan, 0.6969967423802936, 0.8913773993208888, nan, nan, 0.8360720649376361, nan, 0.8672629929485506, nan, nan, 0.7963067222159667, 0.83001443001443, nan, nan, 0.8799440878028577, nan, nan, nan, nan, nan, nan, 0.808426483233018, 0.0, nan, 0.9715880527675212, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.31081472890692957, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9161764705882353, 0.6882690730106645, nan, nan, nan, 0.9749561196969049, nan, nan, nan, 0.3246389496717724, 0.8319916904700078, 0.0, nan, 0.8895995276985142, 0.790977147360126, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9610249770370616, 0.9878460847988362, 0.9960274968603344, 0.9709003399389512, 0.2711741185023628, 0.9932406580793266, 0.9922082580194534, nan, 0.7792587036019986, nan, 0.9990439936729764, 0.9806212803019306, 0.9431311329170383, 0.8369164824861027, 0.6205201720253942, 0.9603660028379899, 0.9983903310319225, 0.8994928538497003, nan, 0.8638649672752325, 0.9422632794457275, nan, 0.9684118163205616, nan, nan, nan, nan, 0.9558556083264644, nan, nan, nan, nan, nan, 0.8870324702013974, nan, nan, 0.6300703989272545, 0.7167767230705737, nan, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.9715717706261922, nan, nan, 0.9906333676097779, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.445822994210091, nan, 0.7369190846897853, 0.9847616085832572, nan, nan, 0.9312017640573319, nan, 0.8793776895067859, nan, nan, 0.8203700481410591, 0.8328989284679988, nan, nan, 0.8983140425981714, nan, nan, nan, nan, nan, nan, 0.9808053411224703, 0.0, nan, 0.9754490184248388, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.3360501567398119, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.919072708113804, 0.7431355181576617, nan, nan, nan, 0.9760843476770101, nan, nan, nan, 0.32635283765948087, 0.8407242193649961, nan, nan, 0.9095573440643863, 0.8094350083161131, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0423 | 44.0 | 880 | 0.1764 | 0.6578 | 0.7665 | 0.9496 | [0.8891339302458806, 0.9809659803364604, 0.9626009845292819, 0.9313350573269138, 0.24837406722804134, 0.9684709879228555, 0.9849221516497105, nan, 0.7592264039359279, nan, 0.8497642171862758, 0.890833167627759, 0.9280340839926963, 0.7730094959824689, 0.6330763753444166, 0.9188399017658754, 0.975609828608803, 0.6023664954071306, nan, 0.8114818193666102, 0.45278851463279957, nan, 0.9284113196973942, nan, 0.0, nan, nan, 0.9202097559135323, nan, nan, nan, nan, nan, 0.6886098768165214, nan, nan, 0.577328577090765, 0.7136608084810745, 0.0, nan, nan, 0.0, nan, 0.2408888888888889, nan, nan, nan, 0.8638955339316869, nan, nan, 0.9841649117122955, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3481967213114754, nan, 0.6982004218462505, 0.8907512615124034, nan, nan, 0.8356232084610062, 0.0, 0.8556781819369164, nan, nan, 0.7807678203717807, 0.8477697841726619, nan, nan, 0.8798687773380546, nan, nan, nan, nan, nan, nan, 0.8203739297571204, 0.0, nan, 0.9722448507958518, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.30771461716937354, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9221242653232578, 0.6900247320692497, nan, nan, nan, 0.9822789144765003, nan, nan, nan, 0.3276781810231745, 0.839345964183753, 0.0, nan, 0.899706457925636, 0.8274164163671279, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9601451921693613, 0.9887591760512157, 0.9958870381386741, 0.9717123548062053, 0.26375863322428206, 0.9937295413000042, 0.992363998470912, nan, 0.8082461521975004, nan, 0.9991830491387252, 0.9754681376106837, 0.9445683417583507, 0.8505123568414708, 0.6442760597993037, 0.9611244311787445, 0.9985455165089961, 0.8918856615952052, nan, 0.8579400620048226, 0.9468822170900693, nan, 0.9691430242761041, nan, nan, nan, nan, 0.9574125681987675, nan, nan, nan, nan, nan, 0.8949239621866009, nan, nan, 0.610375460945357, 0.7154096161782914, nan, nan, nan, 0.0, nan, 0.2408888888888889, nan, nan, nan, 0.9745951868836492, nan, nan, 0.9896457987599393, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4392059553349876, nan, 0.7340410474168436, 0.9812824350327509, nan, nan, 0.9320837927232635, nan, 0.8674611055941741, nan, nan, 0.802708659590511, 0.8531711555169418, nan, nan, 0.8930289096770784, nan, nan, nan, nan, nan, nan, 0.9795535155435009, 0.0, nan, 0.9762437926393466, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.33260188087774295, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9258166491043204, 0.7413640389725421, nan, nan, nan, 0.9835419166809531, nan, nan, nan, 0.3296964364276287, 0.8485961689845185, nan, nan, 0.9250503018108652, 0.8469331182904087, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0414 | 45.0 | 900 | 0.1792 | 0.6435 | 0.7677 | 0.9488 | [0.8882314590432141, 0.9807069181746052, 0.9625221599348378, 0.9285625675651727, 0.25148596023775366, 0.9685404420895769, 0.9851521518383162, nan, 0.7652008515248089, nan, 0.8434712441314554, 0.8904972523790377, 0.910053412462908, 0.7727913677886089, 0.6108581869986682, 0.90089575430739, 0.975676326563378, 0.6036306235201263, nan, 0.8005544737450081, 0.4661308840413318, nan, 0.9266196100356419, nan, 0.0, nan, nan, 0.9194284130144409, nan, nan, nan, nan, nan, 0.6874343991352435, nan, nan, 0.5658362989323843, 0.7120467354659392, 0.0, nan, nan, 0.0, 0.0, 0.296, nan, nan, nan, 0.8752894837615566, nan, nan, 0.9841428860554544, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.26455981941309253, nan, 0.6994670130335469, 0.8811332700221589, nan, nan, 0.8364921928704704, 0.0, 0.8464854977789391, nan, nan, 0.8048314606741573, 0.8328530259365994, nan, nan, 0.8792806880375293, nan, nan, nan, nan, nan, nan, 0.8090530697190427, 0.0, nan, 0.9719378322163161, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.29494299912306343, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9194461925739459, 0.7012244897959183, nan, nan, nan, 0.9838087895142636, nan, nan, nan, 0.35930811566349263, 0.8245841995841996, 0.0, nan, 0.896140972632408, 0.8298039988180833, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9588924670495668, 0.9883361620246895, 0.9958754709498314, 0.9669620678327693, 0.26761177753544163, 0.9946860519491562, 0.9935674474140006, nan, 0.8137248335632862, nan, 0.9993047226712555, 0.9644360574829438, 0.9499454851818813, 0.8442167302926796, 0.6214338600166984, 0.9645495914273132, 0.9985363879515212, 0.8815122176118027, nan, 0.8355494316224595, 0.9376443418013857, nan, 0.9695086282538754, nan, nan, nan, nan, 0.9546388245607149, nan, nan, nan, nan, nan, 0.9017879161528977, nan, nan, 0.6262990278243379, 0.7136813867106893, nan, nan, nan, 0.0, nan, 0.296, nan, nan, nan, 0.9739864453553021, nan, nan, 0.9901446737459403, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.48469809760132343, nan, 0.736824722811984, 0.9738381698380921, nan, nan, 0.9391400220507167, nan, 0.8578616352201258, nan, nan, 0.8309262803781683, 0.8369533738777875, nan, nan, 0.8915490724591724, nan, nan, nan, nan, nan, nan, 0.9732943876486543, 0.0, nan, 0.9759073188842995, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.31630094043887147, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9237091675447839, 0.7608503100088574, nan, nan, nan, 0.9843991085204868, nan, nan, nan, 0.3619005719313682, 0.8325898714248229, nan, nan, 0.91579476861167, 0.8492515498210775, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0226 | 46.0 | 920 | 0.1802 | 0.6550 | 0.7666 | 0.9485 | [0.8863486599760974, 0.9805363874921352, 0.9624226087150931, 0.927956817415082, 0.23766260582283708, 0.9721395726264539, 0.9842704636345309, nan, 0.7638622845044863, nan, 0.8356341548477099, 0.8814399373940264, 0.9174901543888473, 0.7672822257319888, 0.6056844421714941, 0.9140533192009033, 0.9756086685097655, 0.606536366549183, nan, 0.8069687762622019, 0.45701608430393786, nan, 0.9278030038421237, nan, 0.0, nan, nan, 0.9152972027972028, nan, nan, nan, nan, nan, 0.6817827402301718, nan, nan, 0.5748679245283019, 0.7084127065849766, 0.0, nan, nan, 0.0, nan, 0.30933333333333335, nan, nan, nan, 0.8895966029723992, nan, nan, 0.9856348329973357, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.36, nan, 0.6942655215602076, 0.8756982017474743, nan, nan, 0.8319443082655162, 0.0, 0.8388235294117647, nan, nan, 0.8142004906333631, 0.8077144502014968, nan, nan, 0.879245676182538, nan, nan, nan, nan, nan, nan, 0.8083188908145581, 0.0, nan, 0.9728299488720067, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20588235294117646, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9252885624344176, 0.695829926410466, nan, nan, nan, 0.984319437898976, nan, nan, nan, 0.34076990376202976, 0.8141777430284076, 0.0, nan, 0.8965483331694365, 0.8380174238322587, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9586058453127939, 0.9888623502040269, 0.9958853856831251, 0.9644817315109752, 0.2510359869138495, 0.9931131233261064, 0.9907829969182268, nan, 0.8119432574712335, nan, 0.999548069736316, 0.9809841776745536, 0.9409753196550699, 0.845958073806175, 0.6156839269679737, 0.9705191564319616, 0.9986185449687955, 0.8727524204702628, nan, 0.860006889424733, 0.9515011547344111, nan, 0.9711538461538461, nan, nan, nan, nan, 0.9589956954638824, nan, nan, nan, nan, nan, 0.898479243732018, nan, nan, 0.6383674153536708, 0.7098380107304993, nan, nan, nan, 0.0, nan, 0.30933333333333335, nan, nan, nan, 0.969238261434195, nan, nan, 0.9905519186324717, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.543424317617866, nan, 0.7322953526775183, 0.9720694279772202, nan, nan, 0.9355016538037486, nan, 0.8496524329692154, nan, nan, 0.8470216344759585, 0.8126267014190559, nan, nan, 0.8920247344220708, nan, nan, nan, nan, nan, nan, 0.9730857500521594, 0.0, nan, 0.976887733791247, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.21724137931034482, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9291886195995785, 0.7537643932683791, nan, nan, nan, 0.9846991256643237, nan, nan, nan, 0.3427188737351518, 0.8197323537129363, nan, nan, 0.9172032193158953, 0.8581220704601582, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0467 | 47.0 | 940 | 0.1790 | 0.6530 | 0.7599 | 0.9478 | [0.8848748111924889, 0.9802108794142012, 0.9625696637451421, 0.9235741202470675, 0.23897816234033786, 0.9707916986933128, 0.9842853193772226, nan, 0.7387254590507656, nan, 0.838199070092845, 0.8732594120680763, 0.9178172095284395, 0.7678105071363498, 0.6285590089111063, 0.9129358986217742, 0.9755773478117411, 0.6252708784797466, nan, 0.8075606921487604, 0.48849878934624696, nan, 0.9270108886547243, nan, 0.0, nan, nan, 0.9177673411314388, nan, nan, nan, nan, nan, 0.693007092873804, nan, nan, 0.582583746898263, 0.709444158517756, 0.0, nan, nan, 0.0, nan, 0.2577777777777778, nan, nan, nan, 0.8862294351394511, nan, nan, 0.9831678287203324, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3541666666666667, nan, 0.7054180121994976, 0.8842831266545911, nan, nan, 0.82088814162793, 0.0, 0.8246970193252539, nan, nan, 0.8120166401429488, 0.803122289679098, nan, nan, 0.8796233324614178, nan, nan, nan, nan, nan, nan, 0.8221399612198131, 0.0, nan, 0.9717580715986041, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2566578870354112, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9152151101783841, 0.6601529311809685, nan, nan, nan, 0.959677764922655, nan, nan, nan, 0.31057422969187676, 0.8235446985446986, 0.0, nan, 0.8961026147015293, 0.7779531442663379, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9607936876819052, 0.9888829850345892, 0.9957862383501884, 0.9625070589928801, 0.2529989094874591, 0.9933044254559368, 0.9914389945774006, nan, 0.7787228912434865, nan, 0.9996002155359719, 0.9831615619102918, 0.9356477351571018, 0.8466947960618847, 0.6378172308952567, 0.9658707246660468, 0.9986428877887287, 0.8646841862609498, nan, 0.8616947984843265, 0.9318706697459584, nan, 0.9649020181339573, nan, nan, nan, nan, 0.9536706310266777, nan, nan, nan, nan, nan, 0.8975339087546239, nan, nan, 0.629651357693597, 0.7111277342137846, nan, nan, nan, 0.0, nan, 0.2577777777777778, nan, nan, nan, 0.9684468974473439, nan, nan, 0.9948992577961943, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4921422663358147, nan, 0.7420618070299599, 0.9868413380240627, nan, nan, 0.9151047409040793, nan, 0.8334326381992717, nan, nan, 0.843454556000232, 0.8045178105994787, nan, nan, 0.8886422493525712, nan, nan, nan, nan, nan, nan, 0.9730857500521594, 0.0, nan, 0.9757042743769434, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2749216300940439, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.919072708113804, 0.6882196634189548, nan, nan, nan, 0.9598834219098235, nan, nan, nan, 0.31218653761548615, 0.8315402781422199, nan, nan, 0.913682092555332, 0.7949700115921576, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0289 | 48.0 | 960 | 0.1755 | 0.6434 | 0.7678 | 0.9485 | [0.8879402972208282, 0.9804285853204154, 0.9626280362423849, 0.924015665555307, 0.25945982535707035, 0.9715420113915104, 0.9842209072978304, nan, 0.7669830985208067, nan, 0.8452025586353945, 0.8809337506520605, 0.9231887569663194, 0.7665452337583485, 0.6498050863189159, 0.9163181860421894, 0.9756021428019335, 0.6084815756035579, nan, 0.8157018649600828, 0.4846335697399527, nan, 0.927919101438119, nan, 0.0, nan, nan, 0.9156484283758838, nan, nan, nan, nan, nan, 0.6875148181516431, nan, nan, 0.5725788245917677, 0.7125836335563561, 0.0, nan, nan, 0.0, 0.0, 0.27644444444444444, nan, nan, nan, 0.8769472971241328, nan, nan, 0.9832872442053396, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3542009884678748, nan, 0.7007348328703289, 0.887371003950368, nan, nan, 0.8121194549144679, 0.0, 0.8441116850846793, nan, nan, 0.7985780663942227, 0.8238685500144134, nan, nan, 0.8800229023526962, nan, nan, nan, nan, nan, nan, 0.8125862068965517, 0.0, nan, 0.9723319929746719, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.254416961130742, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9254918375889494, 0.6974317817014446, nan, nan, nan, 0.959340894500321, nan, nan, nan, 0.34219734079776065, 0.8429237947122862, 0.0, nan, 0.9033011977797254, 0.7319409470201945, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9575080508615251, 0.9882175117489567, 0.995614382973098, 0.9605286954981157, 0.2786623046165031, 0.993431960209157, 0.9914484333926405, nan, 0.8181988667568617, nan, 0.9990787575394136, 0.980548700827406, 0.9440975319655069, 0.8455562253030607, 0.66173065108146, 0.9649655037432109, 0.9985881164438791, 0.8831258644536653, nan, 0.8693420599379952, 0.9468822170900693, nan, 0.9695451886516525, nan, nan, nan, nan, 0.9539846397404195, nan, nan, nan, nan, nan, 0.8938964241676942, nan, nan, 0.637696949379819, 0.7142746595130004, nan, nan, nan, 0.0, nan, 0.27644444444444444, nan, nan, nan, 0.9720587638488698, nan, nan, 0.9955406684924812, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.533498759305211, nan, 0.737862703467799, 0.9910979805244028, nan, nan, 0.9264608599779492, nan, 0.854617676266137, nan, nan, 0.8273882025404559, 0.827686070083985, nan, nan, 0.8935574229691877, nan, nan, nan, nan, nan, nan, 0.9833089922804089, 0.0, nan, 0.9763482155288439, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.270846394984326, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9319283456269758, 0.7697077059344553, nan, nan, nan, 0.9606977541573805, nan, nan, nan, 0.3442146942366916, 0.853319338756232, nan, nan, 0.9331991951710261, 0.747139761100751, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0698 | 49.0 | 980 | 0.1808 | 0.6530 | 0.7598 | 0.9475 | [0.8836093435122588, 0.9804777228102666, 0.9630128010483755, 0.9264641736262568, 0.25807765408634264, 0.9646736330097487, 0.9843381208549301, nan, 0.7410276238891494, nan, 0.8519160922096213, 0.8865562913907284, 0.9246598722387478, 0.7613580550098232, 0.5897408005967181, 0.9200290377725219, 0.9756745907854244, 0.6028585193512125, nan, 0.8111531554643406, 0.47147147147147145, nan, 0.9273733235283819, nan, 0.0, nan, nan, 0.9204096231316592, nan, nan, nan, nan, nan, 0.6858995598049888, nan, nan, 0.569060773480663, 0.7124880986078588, 0.0, nan, nan, 0.0, nan, 0.2088888888888889, nan, nan, nan, 0.8774578136463683, nan, nan, 0.9864355765637467, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.36174365647365, nan, 0.7092357257920172, 0.893556627873971, nan, nan, 0.8244837758112095, 0.0, 0.8723861637678327, nan, nan, 0.8035749251727321, 0.827466820542412, nan, nan, 0.8793778380917584, nan, nan, nan, nan, nan, nan, 0.8146664344190907, 0.0, nan, 0.9714647296430491, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22696694940040948, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9109243697478991, 0.6935749588138386, nan, nan, nan, 0.9762108505904501, nan, nan, nan, 0.3577420482348829, 0.8325557283566615, 0.0, nan, 0.9054682955206516, 0.7089633362980532, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9602713499994466, 0.9886817954366072, 0.9957465794170137, 0.9642639638874843, 0.27640857869865504, 0.9948773540789865, 0.9939450000235971, nan, 0.7829959948026202, nan, 0.9988701743407902, 0.9716214254608797, 0.9397363465160076, 0.8305538811867926, 0.5978512578963122, 0.9611978274697852, 0.9984633594917219, 0.8653757491931766, nan, 0.8713399931105753, 0.9064665127020786, nan, 0.968229014331676, nan, nan, nan, nan, 0.9595713781057424, nan, nan, nan, nan, nan, 0.893403205918619, nan, nan, 0.6128897083473014, 0.7141972761040033, nan, nan, nan, 0.0, nan, 0.2088888888888889, nan, nan, nan, 0.9707195324865062, nan, nan, 0.9943494771993769, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4598842018196857, nan, 0.7478178815758434, 0.9789889016307411, nan, nan, 0.9244762954796031, nan, 0.8865938430983118, nan, nan, 0.833101328229221, 0.8305821025195482, nan, nan, 0.8904391945457428, nan, nan, nan, nan, nan, nan, 0.975798038806593, 0.0, nan, 0.9752575764607602, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2432601880877743, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9138040042149631, 0.745792736935341, nan, nan, nan, 0.9778844505400308, nan, nan, nan, 0.36022877254729435, 0.842823405930202, nan, nan, 0.9395372233400402, 0.7231490348268736, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0359 | 50.0 | 1000 | 0.1795 | 0.6513 | 0.7613 | 0.9477 | [0.887635762906059, 0.9803369062717342, 0.9621567068126871, 0.9282024113863516, 0.2625996486961221, 0.9710797849149833, 0.9844573173301614, nan, 0.7411390795830155, nan, 0.8340539128920872, 0.8857443036312299, 0.9277117275796272, 0.7666971637694419, 0.6721950615755925, 0.9064881142492129, 0.9756304219858367, 0.5871698113207547, nan, 0.8126492786853922, 0.46215139442231074, nan, 0.9262022345977374, nan, 0.0, nan, nan, 0.9216708542713568, nan, nan, nan, nan, nan, 0.6872916011324316, nan, nan, 0.5742199567500772, 0.7096815712925065, 0.0, nan, nan, 0.0, nan, 0.23022222222222222, nan, nan, nan, 0.8749748596712559, nan, nan, 0.9818095746502422, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3527180783817952, nan, 0.694221544806563, 0.8944710260499734, nan, nan, 0.825626959247649, 0.0, 0.8604954367666232, nan, nan, 0.80206381610224, 0.8233766233766234, nan, nan, 0.8777916927572532, nan, nan, nan, nan, nan, nan, 0.8151055293912437, 0.0, nan, 0.9701870390493145, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.26521106259097527, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.917016806722689, 0.6953376205787781, nan, nan, nan, 0.966294059702771, nan, nan, nan, 0.3131675713535283, 0.8313878080415046, 0.0, nan, 0.9081893165750197, 0.5986188394276629, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9585106736164138, 0.988919095988073, 0.9956309075285875, 0.9635516053903024, 0.28258814976372226, 0.9942184245206819, 0.9921233086822941, nan, 0.7809197219133859, nan, 0.9992525768715996, 0.9790245318623894, 0.9397115670532263, 0.8418726140245127, 0.6844468249342303, 0.9721338748348584, 0.9985576879189627, 0.896726602120793, nan, 0.879021701687909, 0.9376443418013857, nan, 0.966803158818368, nan, nan, nan, nan, 0.9598853868194842, nan, nan, nan, nan, nan, 0.8980271270036991, nan, nan, 0.6231143144485417, 0.7111277342137846, nan, nan, nan, 0.0, nan, 0.23022222222222222, nan, nan, nan, 0.971044194634958, nan, nan, 0.9924252451105161, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46153846153846156, nan, 0.7306440198159944, 0.9810686310715466, nan, nan, 0.9292171995589856, nan, 0.8738828202581926, nan, nan, 0.8317672988805753, 0.8262380538662033, nan, nan, 0.8890650599862586, nan, nan, nan, nan, nan, nan, 0.9749634884206134, 0.0, nan, 0.9737666496496032, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2855799373040752, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9199157007376185, 0.7661647475642162, nan, nan, nan, 0.966998114177953, nan, nan, nan, 0.3147382314122305, 0.8409866176856469, nan, nan, 0.9304828973843058, 0.6072778589788821, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0349 | 51.0 | 1020 | 0.1808 | 0.6503 | 0.7594 | 0.9480 | [0.8873347886810535, 0.9809139798701332, 0.962437723556464, 0.9305045733275169, 0.252868068833652, 0.9635274007867043, 0.9845916722884341, nan, 0.7470188469506341, nan, 0.8458447999293609, 0.8858177031918394, 0.9256909501932473, 0.7788987480247964, 0.632703037817731, 0.9061435397946819, 0.9757144853713005, 0.5942325874566656, nan, 0.8157980138897903, 0.4624505928853755, nan, 0.9289661443760493, nan, 0.0, nan, nan, 0.9204216205415127, nan, nan, nan, nan, nan, 0.6900562733817474, nan, nan, 0.5748902132998746, 0.7117671282236063, 0.0, nan, nan, 0.0, nan, 0.21244444444444444, nan, nan, nan, 0.877752133934132, nan, nan, 0.9876129997775396, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33505154639175255, nan, 0.6967312620835394, 0.8902073467071072, nan, nan, 0.8177479057081629, 0.0, 0.858735332464146, nan, nan, 0.8007681094415788, 0.8225806451612904, nan, nan, 0.8754835337166754, nan, nan, nan, nan, nan, nan, 0.8194444444444444, 0.0, nan, 0.9712778191947387, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.24456202233980012, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9218357082984073, 0.6786310517529215, nan, nan, nan, 0.9806951459635305, nan, nan, nan, 0.26807281638368635, 0.8316883116883117, 0.0, nan, 0.9084231145935358, 0.657987228355032, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9590164115843874, 0.9889294134033542, 0.9959630510939256, 0.9683867848271331, 0.2692111959287532, 0.9944734940271224, 0.9918637412631967, nan, 0.7921583861331761, nan, 0.9990439936729764, 0.9769197271011758, 0.9436515016354445, 0.8583484026522001, 0.6430788134658706, 0.9696139355091256, 0.9984329309668055, 0.8692946058091287, nan, 0.8659317946951429, 0.9457274826789839, nan, 0.9710807253582919, nan, nan, nan, nan, 0.9642684251154636, nan, nan, nan, nan, nan, 0.9021783806000822, nan, nan, 0.6143982567884679, 0.7133460586050351, nan, nan, nan, 0.0, nan, 0.21244444444444444, nan, nan, nan, 0.9723631346130434, nan, nan, 0.9943800205658667, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4838709677419355, nan, 0.7311158292050012, 0.9755097280802348, nan, nan, 0.9255788313120177, nan, 0.8720953326713009, nan, nan, 0.8284032248709472, 0.8271068635968722, nan, nan, 0.8851540616246498, nan, nan, nan, nan, nan, nan, 0.9724598372626747, 0.0, nan, 0.9750023205086555, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2608150470219436, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9270811380400421, 0.7201062887511072, nan, nan, nan, 0.9819561117778158, nan, nan, nan, 0.2695116586009679, 0.8401994227236945, nan, nan, 0.9330985915492958, 0.6699259109923895, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0192 | 52.0 | 1040 | 0.1796 | 0.6528 | 0.7603 | 0.9494 | [0.8899294776744382, 0.9816788892698679, 0.962967936666283, 0.9319291462641756, 0.25102599179206564, 0.9706554244891178, 0.9837809752946574, nan, 0.73683809620536, nan, 0.8323029030623326, 0.8862188956732979, 0.9282185482293094, 0.7780221122716999, 0.7094712015888779, 0.8905060154102645, 0.9755361770929929, 0.5946428571428571, nan, 0.805425810992386, 0.4556048834628191, nan, 0.9275550259357914, nan, 0.0, nan, nan, 0.9192478572237676, nan, nan, nan, nan, nan, 0.6836607615430119, nan, nan, 0.5575337186897881, 0.7126978000771903, 0.0, nan, nan, 0.0, nan, 0.24444444444444444, nan, nan, nan, 0.8799228154001654, nan, nan, 0.9886473918317833, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35398813447593935, nan, 0.7109121986326017, 0.8893073493098363, nan, nan, 0.8270365997638724, 0.0, 0.8478303425774878, nan, nan, 0.8011024988107563, 0.8120127057464626, nan, nan, 0.8737228189677757, nan, nan, nan, nan, nan, nan, 0.824780316344464, 0.0, nan, 0.9722387490685598, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2649448636099826, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9136735979836169, 0.634417808219178, nan, nan, nan, 0.9822736030828516, nan, nan, nan, 0.2751248576185052, 0.8166189111747851, 0.0, nan, 0.9100492610837438, 0.7561143984220907, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9584121819771366, 0.9895587757355028, 0.9953532949963646, 0.9715204240193998, 0.2668120683387859, 0.9934744717935637, 0.9918920577089164, nan, 0.7774637322009832, nan, 0.9991656672055066, 0.9796777471331107, 0.9404549509366636, 0.8530573973611948, 0.7203012019723058, 0.9670205998923521, 0.9987402590684611, 0.8443983402489627, nan, 0.8672407853944195, 0.9480369515011547, nan, 0.9675709271716876, nan, nan, nan, nan, 0.9555939343983462, nan, nan, nan, nan, nan, 0.9025277435265104, nan, nan, 0.6062688568555146, 0.714481015270326, nan, nan, nan, 0.0, nan, 0.24444444444444444, nan, nan, nan, 0.9715717706261922, nan, nan, 0.9956832042027672, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4441687344913151, nan, 0.7456947393253126, 0.9730023907170208, nan, nan, 0.9267916207276736, nan, 0.8601787487586892, nan, nan, 0.8302592657038455, 0.8143643208803939, nan, nan, 0.881295914592252, nan, nan, nan, nan, nan, nan, 0.9791362403505112, 0.0, nan, 0.9764294333317863, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.28620689655172415, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.916754478398314, 0.6563330380868024, nan, nan, nan, 0.9832418995371164, nan, nan, nan, 0.27628684557853056, 0.8226187352400944, nan, nan, 0.9292756539235413, 0.7728441106799052, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0237 | 53.0 | 1060 | 0.1803 | 0.6513 | 0.7608 | 0.9485 | [0.8891086673567483, 0.9816291392085925, 0.9622764118604098, 0.9342977997369637, 0.2515821708063967, 0.9685945273631841, 0.9836941743476263, nan, 0.7383250981882681, nan, 0.8508629111038749, 0.890434552199258, 0.9241884258249085, 0.767355717528976, 0.6700728121570022, 0.9093206515496128, 0.9756822006434004, 0.6005113454777884, nan, 0.8116438356164384, 0.43020238713025427, nan, 0.9286289616529505, nan, 0.0, nan, nan, 0.9179641847313855, nan, nan, nan, nan, nan, 0.6826221707416336, nan, nan, 0.5685159921176293, 0.7108365508365508, 0.0, nan, nan, 0.0, nan, 0.30666666666666664, nan, nan, nan, 0.8773947763654346, nan, nan, 0.9807927044558101, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35251798561151076, nan, 0.6892812331959133, 0.8903591682419659, nan, nan, 0.8367306353783435, 0.0, 0.8723321186881833, nan, nan, 0.7986381595538992, 0.825735718407386, nan, nan, 0.8775149467117235, nan, nan, nan, nan, nan, nan, 0.8084815321477428, 0.0, nan, 0.9721450238335981, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.17723718505647262, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9243820695433599, 0.6713810316139767, nan, nan, nan, 0.9727634790801251, nan, nan, nan, 0.26900175131348514, 0.8309529992209815, 0.0, nan, 0.9049671793866954, 0.6818271702590468, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9600500204729812, 0.9882071943336755, 0.9959217397052019, 0.9701584536284146, 0.2687749909123955, 0.9931768907027165, 0.9902261068190721, nan, 0.7806250251162044, nan, 0.9992178130051624, 0.975613296559733, 0.9445931212211319, 0.8602236956667336, 0.6828242410876038, 0.9683417331310857, 0.9985394308040129, 0.8662978331028124, nan, 0.8735446090251464, 0.9572748267898383, nan, 0.9694720678560983, nan, nan, nan, nan, 0.9557247713624053, nan, nan, nan, nan, nan, 0.8962186600904234, nan, nan, 0.6286456587328193, 0.7123400742880727, nan, nan, nan, 0.0, nan, 0.30666666666666664, nan, nan, nan, 0.9720384724645915, nan, nan, 0.9898596023253683, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.445822994210091, nan, 0.7257372021703232, 0.9703978697350775, nan, nan, 0.9277839029768468, nan, 0.8875206885137372, nan, nan, 0.8265471840380488, 0.8288444830582102, nan, nan, 0.8920775857512816, nan, nan, nan, nan, nan, nan, 0.9864385562278323, 0.0, nan, 0.9760871583051005, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.19184952978056427, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.930031612223393, 0.7147918511957484, nan, nan, nan, 0.9735556317503857, nan, nan, nan, 0.27030356357237134, 0.8396746260823931, nan, nan, 0.9292756539235413, 0.6951262537170505, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0614 | 54.0 | 1080 | 0.1883 | 0.6508 | 0.7568 | 0.9468 | [0.8824055702304293, 0.9803490653511401, 0.9614253451399699, 0.9310131110150943, 0.25537414965986394, 0.9694586800008299, 0.9841171877556937, nan, 0.7228353464717101, nan, 0.851965984177309, 0.8870763699314202, 0.9239811721093578, 0.7647271849348141, 0.5655964443339342, 0.9194539887338429, 0.9757041195940179, 0.6017357762777242, nan, 0.8085814921549792, 0.47769953051643194, nan, 0.9277383545282091, nan, 0.0, nan, nan, 0.9173396254984101, nan, nan, nan, nan, nan, 0.6932374833004644, nan, nan, 0.5626994984040128, 0.7063093675187273, 0.0, nan, nan, 0.0, nan, 0.288, nan, nan, nan, 0.8729689268631102, nan, nan, 0.9779581899760568, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3333333333333333, nan, 0.6860507082919787, 0.8926620587247432, nan, nan, 0.8399149453219927, 0.0, 0.860742965332637, nan, nan, 0.7879923903312444, 0.8076036866359447, nan, nan, 0.8743739565943238, nan, nan, nan, nan, nan, nan, 0.8253718285214349, 0.0, nan, 0.9695745406414547, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2066259808195292, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9068936527952921, 0.6741479634247715, nan, nan, nan, 0.9526992287917738, nan, nan, nan, 0.30284215128989944, 0.8266286010900596, 0.0, nan, 0.903740157480315, 0.781592187037585, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9626572822947446, 0.9872270398819688, 0.9961101196377817, 0.9702802558585027, 0.27291893856779353, 0.9931768907027165, 0.9933503546634826, nan, 0.7615367098442125, nan, 0.9995654516695347, 0.966976339091305, 0.9387947269303202, 0.8485700890764182, 0.5733392145433923, 0.9623966335567843, 0.9983629453594978, 0.8630705394190872, nan, 0.8698587667929728, 0.9399538106235565, nan, 0.9655235448961684, nan, nan, nan, nan, 0.9511978124059609, nan, nan, nan, nan, nan, 0.8957665433621044, nan, nan, 0.6205162587998659, 0.7077486586875774, nan, nan, nan, 0.0, nan, 0.288, nan, nan, nan, 0.9702528306481069, nan, nan, 0.9897170666150823, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4143920595533499, nan, 0.7174805378627035, 0.9814962389939552, nan, nan, 0.9145534729878722, nan, 0.8728235683548494, nan, nan, 0.8168319703033466, 0.8120474949319433, nan, nan, 0.885788277575181, nan, nan, nan, nan, nan, nan, 0.9841435426663885, 0.0, nan, 0.9731575161275352, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22288401253918494, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9093782929399368, 0.7183348095659876, nan, nan, nan, 0.953025887193554, nan, nan, nan, 0.3047074351077871, 0.8357386512726318, nan, nan, 0.9237424547283702, 0.7986492616299582, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0101 | 55.0 | 1100 | 0.1924 | 0.6495 | 0.7569 | 0.9457 | [0.8778990512426279, 0.979830599554748, 0.9609781444455778, 0.9302440270494653, 0.23685294521488398, 0.9692355500310752, 0.9843375748502994, nan, 0.7305343223060359, nan, 0.8449679619070013, 0.8868036740897377, 0.9237397778591481, 0.7621790550891598, 0.5129380429564706, 0.9181010420543357, 0.975677219029944, 0.600441153300772, nan, 0.8082563824008691, 0.4939540507859734, nan, 0.9270880597535862, nan, 0.0, nan, nan, 0.9205477738869434, nan, nan, nan, nan, nan, 0.6893716132712235, nan, nan, 0.565530039434006, 0.7080673393739704, 0.0, nan, nan, 0.0, nan, 0.26222222222222225, nan, nan, nan, 0.8742959549411162, nan, nan, 0.9841589339989131, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34055517941773866, nan, 0.6820094922539626, 0.8901497241922773, nan, nan, 0.8285742811501597, 0.0, 0.847174993460633, nan, nan, 0.7951212673489415, 0.8292050691244239, nan, nan, 0.8769351055512119, nan, nan, nan, nan, nan, nan, 0.8031896844248388, 0.0, nan, 0.9699598347155199, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.21484488257465933, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9220452640402347, 0.6652754590984975, nan, nan, nan, 0.9595717344753747, nan, nan, nan, 0.2800630141781901, 0.8348909657320872, 0.0, nan, 0.8970415360501567, 0.8085997143279319, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9585637927027655, 0.9876552126161354, 0.9962819750148721, 0.9677482458633379, 0.25081788440567065, 0.994452238234919, 0.9930199961300857, nan, 0.777075268241062, nan, 0.99937425040413, 0.9740165481201916, 0.937679651105164, 0.8330319469559976, 0.5218103624820807, 0.9656750012232715, 0.9982077598824242, 0.8785154449054864, nan, 0.8713744402342405, 0.9434180138568129, nan, 0.971117285756069, nan, nan, nan, nan, 0.9630516413497141, nan, nan, nan, nan, nan, 0.8941430332922318, nan, nan, 0.6129735165940329, 0.709528477094511, nan, nan, nan, 0.0, nan, 0.26222222222222225, nan, nan, nan, 0.9701310823424374, nan, nan, 0.9955915741032977, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4160463192721257, nan, 0.7186600613352205, 0.9880075414488134, nan, nan, 0.9149944873208379, nan, 0.8576630254882489, nan, nan, 0.8224000928020416, 0.8337677381986678, nan, nan, 0.8891707626446805, nan, nan, nan, nan, nan, nan, 0.9876903818068016, 0.0, nan, 0.9736680280317446, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2322884012539185, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9272918861959958, 0.7059344552701505, nan, nan, nan, 0.9603120178295903, nan, nan, nan, 0.28156621205455346, 0.843872999212805, nan, nan, 0.9212273641851106, 0.8274280530215211, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0207 | 56.0 | 1120 | 0.1868 | 0.6532 | 0.7624 | 0.9479 | [0.8853785124068325, 0.980024936891269, 0.9623335165397003, 0.9324313781119228, 0.25582026742686487, 0.9705668529173503, 0.9846333816854617, nan, 0.7214831910927375, nan, 0.8404468424207864, 0.8880344713291349, 0.9226547512081401, 0.7647309471633796, 0.6366003751181932, 0.9137648929550849, 0.9754816736655344, 0.5934065934065934, nan, 0.8103714671988669, 0.4681217690982194, nan, 0.9293543174313216, nan, 0.0, nan, nan, 0.9196622922354896, nan, nan, nan, nan, nan, 0.6896111268804995, nan, nan, 0.5628199162401116, 0.7074451652764906, 0.0, nan, nan, 0.0, nan, 0.24, nan, nan, nan, 0.8708821658944308, nan, nan, 0.9877166023009888, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34903381642512077, nan, 0.694848975188781, 0.8917892070639555, nan, nan, 0.8226470588235294, 0.0, 0.8662843737781832, nan, nan, 0.7948079559123843, 0.8435432844406098, nan, nan, 0.8721313189398295, nan, nan, nan, nan, nan, nan, 0.8129421915444348, 0.0, nan, 0.9712233541549894, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.23995306541507774, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9163338238385537, 0.6911037891268533, nan, nan, nan, 0.9752204057177095, nan, nan, nan, 0.3256443861948449, 0.8365758754863813, 0.0, nan, 0.9027736649886487, 0.7447448929241094, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9584531279395327, 0.9893627448451614, 0.9952954590521514, 0.9704574227386309, 0.27400945110868774, 0.9931981464949199, 0.9930860678367651, nan, 0.762112708129613, nan, 0.9990961394726321, 0.9722746407316011, 0.9414709089106948, 0.8414037907708793, 0.6469698640495282, 0.9700787786857171, 0.9987646018883942, 0.8713692946058091, nan, 0.8672063382707544, 0.941108545034642, nan, 0.9672053231939164, nan, nan, nan, nan, 0.9563135377006712, nan, nan, nan, nan, nan, 0.8987053020961776, nan, nan, 0.6081964465303386, 0.7088320264135369, nan, nan, nan, 0.0, nan, 0.24, nan, nan, nan, 0.9725457570715474, nan, nan, 0.9946854542307653, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.47808105872622003, nan, 0.7293701344656759, 0.9854418939143618, nan, nan, 0.9251378169790518, nan, 0.8801059251903344, nan, nan, 0.8239661272547996, 0.8494063133507095, nan, nan, 0.8817187252259394, nan, nan, nan, nan, nan, nan, 0.9828917170874192, 0.0, nan, 0.9750603332250429, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2564263322884012, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9186512118018967, 0.7431355181576617, nan, nan, nan, 0.976641522372707, nan, nan, nan, 0.32793664760228775, 0.8462345840986618, nan, nan, 0.9201207243460765, 0.7606975454866186, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0274 | 57.0 | 1140 | 0.1834 | 0.6533 | 0.7612 | 0.9486 | [0.8881959683070675, 0.9797675441737689, 0.9614714609322655, 0.9322679896517583, 0.24430613501128515, 0.9669500113751526, 0.984646657673861, nan, 0.736616567015131, nan, 0.8481593507930653, 0.8834504960252283, 0.9215160236086044, 0.7677165354330708, 0.6491160049627791, 0.9055209953343701, 0.9756109162256972, 0.5977122603512164, nan, 0.8059677471271768, 0.47941176470588237, nan, 0.9283107586497298, nan, 0.0, nan, nan, 0.9217867943548387, nan, nan, nan, nan, nan, 0.6861524469603124, nan, nan, 0.5680213400282441, 0.7095279765275132, 0.0, nan, nan, 0.0, nan, 0.25422222222222224, nan, nan, nan, 0.8760162601626016, nan, nan, 0.9861873704433995, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34690265486725663, nan, 0.6882352941176471, 0.8908746324701549, nan, nan, 0.8296237935788852, 0.0, 0.8570683019114097, nan, nan, 0.7982429143001035, 0.8336690647482015, nan, nan, 0.8719276226336158, nan, nan, nan, nan, nan, nan, 0.8310882665721375, 0.0, nan, 0.9701218243602487, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1896755162241888, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.915126050420168, 0.6895127993393889, nan, nan, nan, 0.9772260273972603, nan, nan, nan, 0.3120846449807625, 0.8356661482633488, 0.0, nan, 0.9007029007029007, 0.7747583349773131, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9608091807487578, 0.9880111634433342, 0.995948178993985, 0.9696232620113608, 0.25968738640494365, 0.993772052884411, 0.9921421863127741, nan, 0.7805848391893159, nan, 0.9991830491387252, 0.9759761939323559, 0.936267221726633, 0.8489049628290135, 0.6593519116558232, 0.9686597837255957, 0.9985911592963708, 0.855232826187183, nan, 0.8625215294522908, 0.941108545034642, nan, 0.9672053231939164, nan, nan, nan, nan, 0.9571116431814316, nan, nan, nan, nan, nan, 0.8992601726263871, nan, nan, 0.6067717063359035, 0.7111019397441188, nan, nan, nan, 0.0, nan, 0.25422222222222224, nan, nan, nan, 0.9707601152550627, nan, nan, 0.9929546634630069, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.48635235732009924, nan, 0.7231422505307855, 0.9776088942447861, nan, nan, 0.9287761852260198, nan, 0.8697782191327375, nan, nan, 0.8273882025404559, 0.8389805965826818, nan, nan, 0.8811902119338302, nan, nan, nan, nan, nan, nan, 0.9783016899645316, 0.0, nan, 0.9738362649092681, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20156739811912225, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9180189673340359, 0.7395925597874224, nan, nan, nan, 0.9783987656437511, nan, nan, nan, 0.31403431588209413, 0.845972185778011, nan, nan, 0.9152917505030181, 0.7917443677234011, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0311 | 58.0 | 1160 | 0.1862 | 0.6518 | 0.7609 | 0.9480 | [0.8874380375029165, 0.9793794920326367, 0.9609582808064935, 0.9319691480990493, 0.2307321108506825, 0.9703879059722569, 0.9838250444881521, nan, 0.7394069671872037, nan, 0.8430691182076772, 0.8837468819745307, 0.9227914245250769, 0.7624779621861512, 0.6389430969860298, 0.9037020109689214, 0.9757002676181981, 0.5988242770892914, nan, 0.8022559041240748, 0.44176060118089105, nan, 0.9276691045246177, nan, 0.0, nan, nan, 0.9227264718467618, nan, nan, nan, nan, nan, 0.6838909096585853, nan, nan, 0.568059351815697, 0.7084909788175945, 0.0, nan, nan, 0.0, nan, 0.2853333333333333, nan, nan, nan, 0.8758608058608058, nan, nan, 0.9816657620167661, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3446153846153846, nan, 0.6855642337732292, 0.8867415492585212, nan, nan, 0.8340105044098702, 0.0, 0.8403267973856209, nan, nan, 0.7939110725578532, 0.8338133640552995, nan, nan, 0.8769801850786846, nan, nan, nan, nan, nan, nan, 0.815994460792799, 0.0, nan, 0.9689949365332593, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2646887725421757, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9208315833683326, 0.6540404040404041, nan, nan, nan, 0.9563113033794491, nan, nan, nan, 0.29216630196936544, 0.8368831168831169, 0.0, nan, 0.8999901273570935, 0.7769560715870434, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9596936799353718, 0.9876552126161354, 0.9963645977923193, 0.9704648046919695, 0.24332969829153037, 0.9932831696637333, 0.9914814692459802, nan, 0.7833040869087645, nan, 0.9994437781370044, 0.9771374655247496, 0.936465457428883, 0.8399973210099793, 0.649884213676964, 0.9675099084992905, 0.9984451023767721, 0.8688335638543108, nan, 0.8624181880812952, 0.9503464203233256, nan, 0.9692161450716584, nan, nan, nan, nan, 0.9592704530884065, nan, nan, nan, nan, nan, 0.9002877106452939, nan, nan, 0.6096211867247737, 0.710044366487825, nan, nan, nan, 0.0, nan, 0.2853333333333333, nan, nan, nan, 0.9703745789537762, nan, nan, 0.9943087527107238, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46319272125723737, nan, 0.7205944798301486, 0.9774339637310735, nan, nan, 0.9278941565600882, nan, 0.8511751075802715, nan, nan, 0.8228061017342382, 0.838401390095569, nan, nan, 0.886528196184134, nan, nan, nan, nan, nan, nan, 0.9835176298769038, 0.0, nan, 0.9725251775189121, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2852664576802508, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9241306638566913, 0.6882196634189548, nan, nan, nan, 0.9569261100634322, nan, nan, nan, 0.29370875494940607, 0.8454473891367095, nan, nan, 0.917102615694165, 0.7942644019958671, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0248 | 59.0 | 1180 | 0.1950 | 0.6512 | 0.7588 | 0.9461 | [0.8795404754135904, 0.9795725267362585, 0.9616333513128792, 0.9302774726248273, 0.26025910601641455, 0.967562979983026, 0.9839629215637672, nan, 0.7263909964592817, nan, 0.8624414836154123, 0.8888594515829912, 0.9181268226844376, 0.766388411490302, 0.5293981051458294, 0.9146420472951086, 0.9756642167356453, 0.5976519117880374, nan, 0.8031804637906464, 0.4556048834628191, nan, 0.9280812324929972, nan, 0.0, nan, nan, 0.9202850451488471, nan, nan, nan, nan, nan, 0.6861574425069497, nan, nan, 0.5689332501559575, 0.7083354785563507, 0.0, nan, nan, 0.0, nan, 0.2551111111111111, nan, nan, nan, 0.8773222587699149, nan, nan, 0.9756954553211995, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3488210818307906, nan, 0.6899734365854757, 0.8890928344511119, nan, nan, 0.8305000988337616, 0.0, 0.8616938110749186, nan, nan, 0.7938935574229692, 0.8482142857142857, nan, nan, 0.8778141735558675, nan, nan, nan, nan, nan, nan, 0.813969571230982, 0.0, nan, 0.9696640447555034, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.27990775439607957, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9180155560227033, 0.6791666666666667, nan, nan, nan, 0.9492852863134469, nan, nan, nan, 0.2549242755843474, 0.8375779625779626, 0.0, nan, 0.9038405159777192, 0.797160743333169, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9605380520788376, 0.9880317982738964, 0.9957631039725031, 0.9689367403508642, 0.2789531079607416, 0.9935382391701739, 0.9929067303472068, nan, 0.7694667327501908, nan, 0.9991135214058507, 0.9740165481201916, 0.9439488551888195, 0.8362467349809122, 0.5387135903212086, 0.9670939961833929, 0.9985150879840798, 0.8683725218994929, nan, 0.854254219772649, 0.9480369515011547, nan, 0.9690699034805499, nan, nan, nan, nan, 0.9614161792989755, nan, nan, nan, nan, nan, 0.8927661323468968, nan, nan, 0.6114649681528662, 0.7097606273215022, nan, nan, nan, 0.0, nan, 0.2551111111111111, nan, nan, nan, 0.9687715595957956, nan, nan, 0.9895032630496533, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4160463192721257, nan, 0.7230478886529842, 0.9791443954207079, nan, nan, 0.9264608599779492, nan, 0.8756703078450844, nan, nan, 0.8219360825938171, 0.8528815522733855, nan, nan, 0.8922889910681253, nan, nan, nan, nan, nan, nan, 0.9822658042979345, 0.0, nan, 0.9733373555483362, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.30438871473354234, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9203371970495258, 0.7218777679362267, nan, nan, nan, 0.9506686096348362, nan, nan, nan, 0.25622525296964366, 0.8457097874573603, nan, nan, 0.9304828973843058, 0.8150798850864371, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0075 | 60.0 | 1200 | 0.1923 | 0.6393 | 0.7590 | 0.9470 | [0.8831804929332113, 0.9799284731211403, 0.961429562692912, 0.9284269988617323, 0.25270095807569476, 0.9638588972224512, 0.9840020968579893, nan, 0.7280044618524293, nan, 0.8513423464779138, 0.8888888888888888, 0.9184823169704119, 0.7638939782582143, 0.6057095060332409, 0.8993493197433681, 0.9756702355078388, 0.5977948226270374, nan, 0.801490780698313, 0.46231721034870643, nan, 0.9287322358238357, nan, 0.0, nan, nan, 0.9193641691189416, nan, nan, nan, nan, nan, 0.6882403108120914, nan, nan, 0.5555213306637917, 0.7065698692204716, 0.0, nan, nan, 0.0, 0.0, 0.2408888888888889, nan, nan, nan, 0.8764001615924198, nan, nan, 0.9750916855061542, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34826883910386963, nan, 0.6941799892222023, 0.8964120882035684, nan, nan, 0.8302092631161361, 0.0, 0.8758616204968136, nan, nan, 0.7932634521313766, 0.8419689119170984, nan, nan, 0.8724292723666354, nan, nan, nan, nan, nan, nan, 0.8223626470071967, 0.0, nan, 0.9688616316574374, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.21132075471698114, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8992427429533025, 0.6938435940099834, nan, nan, nan, 0.9480491669878796, nan, nan, nan, 0.3054195804195804, 0.8455158113011924, 0.0, nan, 0.9049306170652495, 0.7910749506903353, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9594192313225546, 0.988036956981537, 0.9959762707383172, 0.9693944214578619, 0.270374409305707, 0.9943034476894954, 0.9921846609813536, nan, 0.7693327796605629, nan, 0.9993221046044741, 0.9754681376106837, 0.9453612845673506, 0.8377201794923314, 0.6160147450337907, 0.9671184616137398, 0.9985242165415548, 0.8623789764868603, nan, 0.8445056837754048, 0.9491916859122402, nan, 0.9652676221117286, nan, nan, nan, nan, 0.9564967094503539, nan, nan, nan, nan, nan, 0.895561035758323, nan, nan, 0.604592691920885, 0.707955014444903, nan, nan, nan, 0.0, nan, 0.2408888888888889, nan, nan, nan, 0.9684468974473439, nan, nan, 0.9880371814581403, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.42431761786600497, nan, 0.7293229535267752, 0.9853058368481409, nan, nan, 0.9229327453142228, nan, 0.8916914928831513, nan, nan, 0.8230091062003364, 0.8470894874022589, nan, nan, 0.8833571164314783, nan, nan, nan, nan, nan, nan, 0.977467139578552, 0.0, nan, 0.9723801457279435, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22821316614420062, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9009483667017913, 0.7387068201948627, nan, nan, nan, 0.9487399279958855, nan, nan, nan, 0.30743510778706556, 0.8559433219627395, nan, nan, 0.9250503018108652, 0.8085781966634746, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0128 | 61.0 | 1220 | 0.1924 | 0.6385 | 0.7598 | 0.9471 | [0.8845901230759488, 0.9796972240419259, 0.9609740002008446, 0.9285337524736001, 0.24921672796621713, 0.9673895050488329, 0.9839452867347463, nan, 0.7344610592689196, nan, 0.8404127810097348, 0.8895391367959035, 0.9181892489662468, 0.766587749650944, 0.6284746391943942, 0.8928322268482053, 0.9757658485521663, 0.5918399495904222, nan, 0.801305270027737, 0.45955056179775283, nan, 0.9266879674168744, nan, 0.0, nan, nan, 0.9175580221997982, nan, nan, nan, nan, nan, 0.6861441788499795, nan, nan, 0.5483139271132618, 0.7068197616044075, 0.0, nan, nan, 0.0, 0.0, 0.25066666666666665, nan, nan, nan, 0.8742676138860407, nan, nan, 0.9771578957942001, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34369173399627095, nan, 0.6820603105814373, 0.8959259259259259, nan, nan, 0.8334828101644245, 0.0, 0.8773296967335541, nan, nan, 0.7908412729457126, 0.8311239193083574, nan, nan, 0.8711227154046998, nan, nan, nan, nan, nan, nan, 0.81991599579979, 0.0, nan, 0.9686748520162782, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20963503649635037, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.907465825446898, 0.6908646003262643, nan, nan, nan, 0.9520322069467643, nan, nan, nan, 0.3122484689413823, 0.8338094001558037, 0.0, nan, 0.9036085022244191, 0.7738324209695714, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9583612761860496, 0.9875056100945591, 0.9962125718818163, 0.9681247254836102, 0.2660123591421301, 0.9937507970922076, 0.9926660405585891, nan, 0.7770216870052108, nan, 0.9993916323373485, 0.970823051241109, 0.9464020220041629, 0.8457571495546179, 0.6400384379085997, 0.9706170181533493, 0.9983964167369058, 0.8660673121254034, nan, 0.8458835687220118, 0.9445727482678984, nan, 0.9649385785317344, nan, nan, nan, nan, 0.951760411351415, nan, nan, nan, nan, nan, 0.8931154952733251, nan, nan, 0.6077774052966812, 0.7081871646718944, nan, nan, nan, 0.0, nan, 0.25066666666666665, nan, nan, nan, 0.968893307901465, nan, nan, 0.9904093829221857, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4574028122415219, nan, 0.7128568058504364, 0.9873661295652005, nan, nan, 0.9221609702315325, nan, 0.8944058258854684, nan, nan, 0.8208630589872977, 0.8352157544164495, nan, nan, 0.8816658738967285, nan, nan, nan, nan, nan, nan, 0.977467139578552, 0.0, nan, 0.9721364923191164, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22507836990595612, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9093782929399368, 0.75022143489814, nan, nan, nan, 0.9527258700497171, nan, nan, nan, 0.31403431588209413, 0.8425610076095513, nan, nan, 0.9195171026156942, 0.7908371553853133, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.02 | 62.0 | 1240 | 0.1989 | 0.6256 | 0.7582 | 0.9453 | [0.8782824658862393, 0.979638738605013, 0.9610453381629419, 0.9285833365848122, 0.26120211181805875, 0.9680185591780935, 0.9841590832490731, nan, 0.7242763949585154, nan, 0.8509123736514185, 0.8935120357404814, 0.9182442821449164, 0.7711127220007249, 0.5263313104347421, 0.9092330329225676, 0.9757620097011441, 0.5987230646448524, nan, 0.8026486626850169, 0.46292134831460674, nan, 0.9278191265271731, nan, 0.0, nan, nan, 0.9162642724209044, nan, nan, nan, nan, nan, 0.6817649664599418, nan, nan, 0.5542459957691145, 0.7063790351644957, 0.0, 0.0, nan, 0.0, 0.0, 0.2737777777777778, nan, nan, nan, 0.8727491873333577, nan, nan, 0.9745871670215758, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.341644204851752, nan, 0.6791419649726207, 0.8905936939786987, nan, nan, 0.8302652414885194, 0.0, 0.8632911392405064, nan, nan, 0.7861765612304934, 0.824985607369027, nan, nan, 0.8704447109118398, nan, nan, nan, nan, nan, nan, 0.8264144333508495, 0.0, nan, 0.9689870051794303, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1971098265895954, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9110784107630859, 0.7075320512820513, nan, nan, nan, 0.9451773171149563, nan, nan, nan, 0.3041196536342167, 0.8412039439543332, 0.0, nan, 0.9037746919616664, 0.7902915495042179, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9603742682292531, 0.9873405314500612, 0.9961134245488796, 0.966257091288926, 0.2805525263540531, 0.9933681928325468, 0.9922082580194534, nan, 0.7659169758750486, nan, 0.9994090142705672, 0.9725649586296995, 0.9470958469620379, 0.8549326903757284, 0.5342081633296051, 0.9675588393599843, 0.9983568596545145, 0.8646841862609498, nan, 0.8518084739924217, 0.9515011547344111, nan, 0.9662181924539339, nan, nan, nan, nan, 0.95334353861653, nan, nan, nan, nan, nan, 0.8855939169749281, nan, nan, 0.6148172980221254, 0.7078002476269087, nan, nan, nan, 0.0, nan, 0.2737777777777778, nan, nan, nan, 0.9697252546568726, nan, nan, 0.9878335590148746, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.41935483870967744, nan, 0.7080443500825666, 0.9849171023732239, nan, nan, 0.9249173098125689, nan, 0.8804369414101291, nan, nan, 0.8137869033118729, 0.8300028960324356, nan, nan, 0.8824057925056815, nan, nan, nan, nan, nan, nan, 0.9843521802628834, 0.0, nan, 0.9724497609876085, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.21379310344827587, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9133825079030559, 0.7821080602302923, nan, nan, nan, 0.9458254757414709, nan, nan, nan, 0.3059392872855257, 0.8506953555497245, nan, nan, 0.9297786720321931, 0.8074189808981402, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0234 | 63.0 | 1260 | 0.1964 | 0.6494 | 0.7565 | 0.9456 | [0.8785245058923847, 0.9790431101476543, 0.9614314476887585, 0.9288212229326362, 0.2562762925770118, 0.9672782178832192, 0.9840668601929019, nan, 0.731640417625199, nan, 0.8431639909097574, 0.8868918296135361, 0.9237395059931092, 0.7630092901815532, 0.5302055156546343, 0.9033240236252918, 0.9756418070193694, 0.5989921976592978, nan, 0.8031608264783988, 0.47686832740213525, nan, 0.9272339085086276, nan, 0.0, nan, nan, 0.9215634705941288, nan, nan, nan, nan, nan, 0.6851942244406305, nan, nan, 0.5553194837897387, 0.7080276941291535, 0.0, nan, nan, 0.0, nan, 0.24622222222222223, nan, nan, nan, 0.8709337020595299, nan, nan, 0.9754384201798905, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34243964421855144, nan, 0.6867128463476071, 0.8920538397254671, nan, nan, 0.8227006647484869, 0.0, 0.8767221211333507, nan, nan, 0.7820437424738861, 0.8289170506912442, nan, nan, 0.8731452455590386, nan, nan, nan, nan, nan, nan, 0.8178819444444444, 0.0, nan, 0.9697736782486072, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.21812372930583793, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9106579777170486, 0.6808688387635756, nan, nan, nan, 0.9499057250599932, nan, nan, nan, 0.28228701514753524, 0.8349792099792099, 0.0, nan, 0.908284898039602, 0.8097887636023438, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9588714407445526, 0.9878564022141173, 0.9958209399167163, 0.9705054054353323, 0.2745910577971647, 0.9940271223908516, 0.9919109353393962, nan, 0.7753606686938235, nan, 0.9996002155359719, 0.9777181013209464, 0.9433789275448509, 0.8416047150224365, 0.5372800453693347, 0.9653814160591084, 0.9985485593614878, 0.8494698017519594, nan, 0.8542886668963141, 0.9284064665127021, nan, 0.96855805791167, nan, nan, nan, nan, 0.9627638000287841, nan, nan, nan, nan, nan, 0.8942868886148787, nan, nan, 0.5914347971840429, 0.7095800660338424, nan, nan, nan, 0.0, nan, 0.24622222222222223, nan, nan, nan, 0.9713485653991315, nan, nan, 0.9881898982905896, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.445822994210091, nan, 0.7203113941967445, 0.9751404303290637, nan, nan, 0.9142227122381478, nan, 0.8931479642502482, nan, nan, 0.8098718171799779, 0.8334781349551115, nan, nan, 0.8832514137730564, nan, nan, nan, nan, nan, nan, 0.9828917170874192, 0.0, nan, 0.9734417784378335, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.235423197492163, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9129610115911486, 0.7218777679362267, nan, nan, nan, 0.9500685753471627, nan, nan, nan, 0.2836779586449626, 0.8430858042508528, nan, nan, 0.9275653923541247, 0.8288896728995514, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0193 | 64.0 | 1280 | 0.1947 | 0.6508 | 0.7569 | 0.9460 | [0.8796726154893645, 0.9799506824646489, 0.961474026160518, 0.9272962575566862, 0.2552715438334802, 0.9674113335679828, 0.9846163932583744, nan, 0.7206713848264996, nan, 0.8416008194322505, 0.8889623702136102, 0.9227120940812079, 0.7703381116554769, 0.559565494809904, 0.9073494113612984, 0.9756452145165462, 0.5978784956605593, nan, 0.8068824332228505, 0.4808277541083384, nan, 0.9289177793394238, nan, 0.0, nan, nan, 0.9231088108936513, nan, nan, nan, nan, nan, 0.6790843875947279, nan, nan, 0.5547317228733158, 0.7067833698030634, 0.0, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.8670331061167664, nan, nan, 0.9743036602986033, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35010197144799454, nan, 0.6909434641699675, 0.8874625053563776, nan, nan, 0.8271360759493671, 0.0, 0.8807744785913846, nan, nan, 0.7761561259296539, 0.8182080092192452, nan, nan, 0.8725049639460759, nan, nan, nan, nan, nan, nan, 0.819184316471206, 0.0, nan, 0.9695064154433014, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.26658939437844104, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9040202062723637, 0.6762225969645869, nan, nan, nan, 0.939439396536945, nan, nan, nan, 0.267338003502627, 0.839968774395004, 0.0, nan, 0.9099143447868465, 0.830184966548603, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9608069674534931, 0.9881452898419888, 0.9957697137946989, 0.9653011283315678, 0.2737186477644493, 0.9931768907027165, 0.9922743297261328, nan, 0.7597685290611228, nan, 0.9997218890685022, 0.975613296559733, 0.9448904747745069, 0.8529904226106758, 0.5664235416436932, 0.9710818613299408, 0.9985698593289293, 0.8575380359612724, nan, 0.8553565277299345, 0.9122401847575058, nan, 0.966547236033928, nan, nan, nan, nan, 0.9614554303881933, nan, nan, nan, nan, nan, 0.8968351829017673, nan, nan, 0.5831377807576266, 0.7081871646718944, nan, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.9730327502942251, nan, nan, 0.9886175054214476, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.42597187758478083, nan, 0.7242274121255013, 0.9661023537872456, nan, nan, 0.9221609702315325, nan, 0.8974511751075803, nan, nan, 0.805057711269648, 0.822473211699971, nan, nan, 0.8825114951641034, nan, nan, nan, nan, nan, nan, 0.9764239515960776, 0.0, nan, 0.9731285097693414, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2884012539184953, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9051633298208641, 0.7103631532329495, nan, nan, nan, 0.939439396536945, nan, nan, nan, 0.26863176418829743, 0.847021779060614, nan, nan, 0.9297786720321931, 0.8505619676427599, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.008 | 65.0 | 1300 | 0.2011 | 0.6497 | 0.7567 | 0.9449 | [0.8758596087712686, 0.9805794919844963, 0.9616931912659519, 0.9252963852174437, 0.2522798421124268, 0.9668389497622494, 0.9840564057247322, nan, 0.7178978933191025, nan, 0.8376174178984926, 0.885824931561728, 0.9227193492155723, 0.7662872366190361, 0.5246265989027215, 0.9206072846298489, 0.9756260702054794, 0.5987673830594185, nan, 0.8104851839791599, 0.4756598240469208, nan, 0.9278859624544691, nan, 0.0, nan, nan, 0.9224339937095096, nan, nan, nan, nan, nan, 0.6781263095810892, nan, nan, 0.5604447228311933, 0.7053810504634398, 0.0, nan, nan, 0.0, nan, 0.2728888888888889, nan, nan, nan, 0.8669769631990727, nan, nan, 0.9728048329182545, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.333810888252149, nan, 0.6886635113815641, 0.8903471333048798, nan, nan, 0.8288046182940181, 0.0, 0.8726704027108041, nan, nan, 0.7767220902612827, 0.8149322571346209, nan, nan, 0.8719168060200669, nan, nan, nan, nan, nan, nan, 0.81280276816609, 0.0, nan, 0.9704787664424255, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2629608294930876, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9104289318755256, 0.6797385620915033, nan, nan, nan, 0.9282101834390537, nan, nan, nan, 0.27984584391696593, 0.8354134165366615, 0.0, nan, 0.8955827253007297, 0.8409895730867598, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9591381428239434, 0.9879750524898503, 0.9958457267499504, 0.9630348686565953, 0.2695019992729916, 0.9940483781830549, 0.9932890023644232, nan, 0.7568349563982694, nan, 0.9997218890685022, 0.9629118885179271, 0.944394885518882, 0.8476324425691514, 0.5317506576978214, 0.9702255712677986, 0.9986185449687955, 0.8734439834024896, nan, 0.8573889080261798, 0.9364896073903002, nan, 0.9685946183094472, nan, nan, nan, nan, 0.9631301435281495, nan, nan, nan, nan, nan, 0.8978832716810522, nan, nan, 0.5998994301039222, 0.7066910854312836, nan, nan, nan, 0.0, nan, 0.2728888888888889, nan, nan, nan, 0.9713688567834098, nan, nan, 0.9902363038454098, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3854425144747725, nan, 0.7208303845246521, 0.9731190110594958, nan, nan, 0.9180815876515986, nan, 0.8865938430983118, nan, nan, 0.8060727336001392, 0.8187083695337388, nan, nan, 0.8818244278843613, nan, nan, nan, nan, nan, nan, 0.9801794283329857, 0.0, nan, 0.9741495335777602, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.28620689655172415, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9125395152792413, 0.7369353410097431, nan, nan, nan, 0.9282101834390537, nan, nan, nan, 0.28112626484821823, 0.8430858042508528, nan, nan, 0.9137826961770624, 0.8618013204979588, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0183 | 66.0 | 1320 | 0.2009 | 0.6492 | 0.7550 | 0.9454 | [0.8771511273952037, 0.9802232430607422, 0.9619372300420755, 0.9254528128774917, 0.25216265921939923, 0.968766192045431, 0.9835606577606596, nan, 0.7175672413574051, nan, 0.8497879446143843, 0.8846993451081564, 0.922612699951527, 0.761744761439306, 0.5365190365190365, 0.9278644483197436, 0.9753998015106584, 0.5999680613222612, nan, 0.8124694774540127, 0.4632606199770379, nan, 0.9279969091356117, nan, 0.0, nan, nan, 0.9235670390710417, nan, nan, nan, nan, nan, 0.6749988319394478, nan, nan, 0.5606371234821006, 0.705870236869207, 0.0, nan, nan, 0.0, nan, 0.24266666666666667, nan, nan, nan, 0.8715380405035578, nan, nan, 0.9712683252996417, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33048433048433046, nan, 0.6945757371145622, 0.8859253471605639, nan, nan, 0.8324934489014312, 0.0, 0.870001956819516, nan, nan, 0.7808238318801699, 0.8274272543935465, nan, nan, 0.8723048812320543, nan, nan, nan, nan, nan, nan, 0.8179447052686489, 0.0, nan, 0.9707281270046754, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.25187536064627813, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9050726162913071, 0.6855500821018062, nan, nan, nan, 0.9207335361412229, nan, nan, nan, 0.269119579500657, 0.8220890856994009, 0.0, nan, 0.9058984642472855, 0.836038961038961, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.961445503137346, 0.9884909232539064, 0.9954689668847908, 0.9642565819341455, 0.26913849509269355, 0.9935382391701739, 0.9930624707986654, nan, 0.7565536549100506, nan, 0.9995654516695347, 0.9706778922920598, 0.9432798096937258, 0.8351081642220882, 0.5438806534444462, 0.9632773890492734, 0.9988558874631435, 0.8660673121254034, nan, 0.8596279710644161, 0.9318706697459584, nan, 0.9659622696694941, nan, nan, nan, nan, 0.9615339325666287, nan, nan, nan, nan, nan, 0.8906905055487053, nan, nan, 0.5958766342608113, 0.7071811803549319, nan, nan, nan, 0.0, nan, 0.24266666666666667, nan, nan, nan, 0.9692991355870297, nan, nan, 0.9908675334195335, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3837882547559967, nan, 0.728001887237556, 0.969698147680227, nan, nan, 0.9106945975744212, nan, 0.8830188679245283, nan, nan, 0.8103068267501885, 0.8317405154937735, nan, nan, 0.8830928597854236, nan, nan, nan, nan, nan, nan, 0.981431253911955, 0.0, nan, 0.9744279946164199, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.2736677115987461, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9062170706006323, 0.7395925597874224, nan, nan, nan, 0.9210097719869706, nan, nan, nan, 0.27030356357237134, 0.8281290999737602, nan, nan, 0.9316901408450704, 0.8565596492112293, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.025 | 67.0 | 1340 | 0.2064 | 0.6476 | 0.7549 | 0.9443 | [0.8736307213101239, 0.979720329793981, 0.96123141999933, 0.928384895311055, 0.2518891687657431, 0.9648323088981478, 0.9836541835278707, nan, 0.7107059765818681, nan, 0.8528146873887769, 0.884305702217529, 0.9263460227825918, 0.762460491125699, 0.47401770461251747, 0.9262131359417225, 0.9756301296971751, 0.597918637653737, nan, 0.8116769380473565, 0.4547486033519553, nan, 0.927712532865907, nan, 0.0, nan, nan, 0.9202382151490733, nan, nan, nan, nan, nan, 0.6813490309955179, nan, nan, 0.5585014860003128, 0.7058263175510414, 0.0, nan, nan, 0.0, nan, 0.2391111111111111, nan, nan, nan, 0.8692214001345234, nan, nan, 0.968972339915261, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33488992661774514, nan, 0.681828417023193, 0.889568268821778, nan, nan, 0.8317051062544063, 0.0, 0.8847054233320327, nan, nan, 0.7797094146300495, 0.8375035950532068, nan, nan, 0.8726683734782381, nan, nan, nan, nan, nan, nan, 0.8188771075960368, 0.0, nan, 0.9692602932931024, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20815264527320035, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9039108494533221, 0.6969943135662063, nan, nan, nan, 0.9215871111491988, nan, nan, nan, 0.2819144282089422, 0.8302132085283411, 0.0, nan, 0.9071134626690183, 0.8356299212598425, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.961573874262696, 0.9881556072572699, 0.9958440742944015, 0.967058033226172, 0.2689930934205743, 0.9942821918972921, 0.9923073655794724, nan, 0.7488245616385142, nan, 0.9995828336027534, 0.972492379155175, 0.9430567945286946, 0.8401312705110173, 0.4808204288032263, 0.964304937123844, 0.9986672306086618, 0.8741355463347165, nan, 0.8620048225973131, 0.9399538106235565, nan, 0.9674978063761334, nan, nan, nan, nan, 0.9562742866114534, nan, nan, nan, nan, nan, 0.8872379778051788, nan, nan, 0.598474689909487, 0.7071553858852662, nan, nan, nan, 0.0, nan, 0.2391111111111111, nan, nan, nan, 0.9702325392638286, nan, nan, 0.9872328728072408, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4152191894127378, nan, 0.7143194149563576, 0.9755874749752181, nan, nan, 0.9104740904079383, nan, 0.900695134061569, nan, nan, 0.8077257699669392, 0.8433246452360267, nan, nan, 0.8827229004809471, nan, nan, nan, nan, nan, nan, 0.9828917170874192, 0.0, nan, 0.9727746321993781, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22570532915360503, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9060063224446786, 0.7599645704162976, nan, nan, nan, 0.9218241042345277, nan, nan, nan, 0.2835019797624285, 0.8378378378378378, nan, nan, 0.9313883299798793, 0.8558036389294894, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0217 | 68.0 | 1360 | 0.2075 | 0.6323 | 0.7527 | 0.9437 | [0.8726938382373937, 0.9791816811060851, 0.9616538378074777, 0.9295367685118875, 0.259854806974693, 0.9681113204812292, 0.9838789682539683, nan, 0.7155007172144153, nan, 0.8416893493722013, 0.8857538400685608, 0.928789469828323, 0.7627097955447055, 0.4423309835098289, 0.9184568273558206, 0.9757329193654872, 0.5984365028717294, nan, 0.8101163802830523, 0.44275637547476937, nan, 0.9278686228509402, nan, 0.0, nan, nan, 0.9196462270085298, nan, nan, nan, nan, nan, 0.6794514979048095, nan, nan, 0.5548062015503876, 0.705010040677617, 0.0, nan, nan, 0.0, 0.0, 0.21244444444444444, nan, nan, nan, 0.8694988552531162, nan, nan, 0.9702288779883513, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3398858592263792, nan, 0.6788173349534224, 0.8871513553948633, nan, nan, 0.8252619094682744, 0.0, 0.8744960332943166, nan, nan, 0.7793854248292846, 0.8272884283246977, nan, nan, 0.8739951978285834, nan, nan, nan, nan, nan, nan, 0.8154379878577623, 0.0, nan, 0.9703529901292217, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.14443155452436196, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9074074074074074, 0.6938110749185668, nan, nan, nan, 0.9203668781073204, nan, nan, nan, 0.277510286264554, 0.8343302990897269, 0.0, nan, 0.9065300896286812, 0.836694540088539, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9605668249172781, 0.9882639401177217, 0.9959019102386146, 0.9677261000033219, 0.27844420210832427, 0.993772052884411, 0.992260171503273, nan, 0.7550265896882912, nan, 0.9997392710017208, 0.9751778197125853, 0.9424373079591635, 0.8369834572366218, 0.44876258290143195, 0.9686108528649019, 0.9983538168020228, 0.8646841862609498, nan, 0.8656217705821564, 0.9422632794457275, nan, 0.96881398069611, nan, nan, nan, nan, 0.9564051235755126, nan, nan, nan, nan, nan, 0.8930332922318126, nan, nan, 0.5998156218571907, 0.7063557573256294, nan, nan, nan, 0.0, nan, 0.21244444444444444, nan, nan, nan, 0.9709833204821233, nan, nan, 0.9887600411317335, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.44334160463192723, nan, 0.7116772823779193, 0.9681432097805593, nan, nan, 0.9206174200661521, nan, 0.8903012247600133, nan, nan, 0.8076387680528971, 0.8323197219808862, nan, nan, 0.8849426563078061, nan, nan, nan, nan, nan, nan, 0.9808053411224703, 0.0, nan, 0.9740683157748179, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1561128526645768, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9087460484720759, 0.7546501328609388, nan, nan, nan, 0.9203668781073204, nan, nan, nan, 0.278926528816542, 0.8417738126475991, nan, nan, 0.9259557344064386, 0.8573156594929691, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0314 | 69.0 | 1380 | 0.2053 | 0.6458 | 0.7522 | 0.9446 | [0.8761587446054744, 0.9794604991891629, 0.9610682556914282, 0.9291698292556625, 0.25075013638843424, 0.9667831806242114, 0.9846732350572237, nan, 0.7036999338388722, nan, 0.8350901646484885, 0.8888297872340426, 0.9282437905626312, 0.7704415740462601, 0.5243631389981194, 0.9115903586668205, 0.975746057804981, 0.601652357038717, nan, 0.8120283779425992, 0.4672083575159605, nan, 0.9287644313436502, nan, 0.0, nan, nan, 0.9201359174427385, nan, nan, nan, nan, nan, 0.6703698020561608, nan, nan, 0.5550353146575668, 0.7042862659286909, 0.0, nan, nan, 0.0, nan, 0.24711111111111111, nan, nan, nan, 0.8699120511702282, nan, nan, 0.9747560013186418, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.32493540051679587, nan, 0.6763285024154589, 0.8738129239557301, nan, nan, 0.8288369304556354, 0.0, 0.868361177923682, nan, nan, 0.7817896828728973, 0.8183914672816374, nan, nan, 0.8688000838003457, nan, nan, nan, nan, nan, nan, 0.8202676864244742, 0.0, nan, 0.9692713210557348, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.17293671624380286, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9002315302041676, 0.68561872909699, nan, nan, nan, 0.9215240870906909, nan, nan, nan, 0.26389983363978636, 0.8261322228006247, 0.0, nan, 0.9007106198183972, 0.8415846456692914, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9609220588072552, 0.9876964822772599, 0.9962340538039527, 0.9669325400194145, 0.2673209741912032, 0.9935594949623773, 0.9923734372861518, nan, 0.7408677481146103, nan, 0.9997392710017208, 0.9702424154449122, 0.9427346615125384, 0.8588842006563525, 0.5314670993556925, 0.9669227381709644, 0.9982960026046818, 0.8561549100968188, nan, 0.8674130210127454, 0.9295612009237876, nan, 0.9676440479672419, nan, nan, nan, nan, 0.9566013790216011, nan, nan, nan, nan, nan, 0.8978216193999178, nan, nan, 0.5861548776399598, 0.7056851011143211, nan, nan, nan, 0.0, nan, 0.24711111111111111, nan, nan, nan, 0.9714094395519662, nan, nan, 0.9934331762046813, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4160463192721257, nan, 0.7067704647322481, 0.9514470640828782, nan, nan, 0.9145534729878722, nan, 0.8843429328037073, nan, nan, 0.8100168203700482, 0.8221836084564147, nan, nan, 0.8766978489509011, nan, nan, nan, nan, nan, nan, 0.9845608178593782, 0.0, nan, 0.9727688309277394, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.18589341692789968, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9013698630136986, 0.7263064658990257, nan, nan, nan, 0.9215240870906909, nan, nan, nan, 0.2652001759788825, 0.8328522697454737, nan, nan, 0.9181086519114688, 0.8619021218688574, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0434 | 70.0 | 1400 | 0.2038 | 0.6468 | 0.7543 | 0.9448 | [0.8768763151269043, 0.9787297154926841, 0.960349995381386, 0.9286781060764622, 0.2332094687586017, 0.9671646114375569, 0.9842266125597353, nan, 0.7149410972308708, nan, 0.8369642727206578, 0.8884162956070978, 0.9272580763314043, 0.7670392418904143, 0.5316064269509276, 0.9131371141421393, 0.9757300751410195, 0.5994189799870885, nan, 0.8113122752953261, 0.4497528830313015, nan, 0.9276990376202975, nan, 0.0, nan, nan, 0.9237080087176532, nan, nan, nan, nan, nan, 0.6716740183225886, nan, nan, 0.5716577111235865, 0.7061216084024096, 0.0, nan, nan, 0.0, nan, 0.26666666666666666, nan, nan, nan, 0.8702162456841722, nan, nan, 0.9733136248327108, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3282828282828283, nan, 0.6739120631341601, 0.8820725240586335, nan, nan, 0.8322207831025207, 0.0, 0.8769230769230769, nan, nan, 0.7838344812139537, 0.8304987027961949, nan, nan, 0.8744516398579486, nan, nan, nan, nan, nan, nan, 0.8220944415403381, 0.0, nan, 0.9682380010868685, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.16681222707423582, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9089953762084909, 0.6846473029045643, nan, nan, nan, 0.9240989156987957, nan, nan, nan, 0.26230513224732876, 0.8286011982287054, 0.0, nan, 0.9067505158691166, 0.8237466758593519, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9596925732877395, 0.9872321985896093, 0.9964273911031793, 0.966246018358918, 0.24638313340603418, 0.993602006546784, 0.9924017537318716, nan, 0.7535932916292715, nan, 0.9996697432688464, 0.9702424154449122, 0.9409753196550699, 0.8456901748040988, 0.53893413569842, 0.9647208494397417, 0.9984785737541801, 0.8561549100968188, nan, 0.8706166035136066, 0.9457274826789839, nan, 0.9691795846738812, nan, nan, nan, nan, 0.9648833588465413, nan, nan, nan, nan, nan, 0.8949856144677353, nan, nan, 0.5973851827019778, 0.7075423029302518, nan, nan, nan, 0.0, nan, 0.26666666666666666, nan, nan, nan, 0.9717138103161398, nan, nan, 0.9921910793007606, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.43010752688172044, nan, 0.7050719509318235, 0.9602518999397461, nan, nan, 0.9209481808158765, nan, 0.8943396226415095, nan, nan, 0.8119308624789745, 0.8343469446857805, nan, nan, 0.8849426563078061, nan, nan, nan, nan, nan, nan, 0.9843521802628834, 0.0, nan, 0.9715969740567132, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.17962382445141065, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9114857744994731, 0.7307351638618246, nan, nan, nan, 0.9241385222012687, nan, nan, nan, 0.2635283765948086, 0.8346890579900289, nan, nan, 0.9283702213279678, 0.8430522655108109, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0056 | 71.0 | 1420 | 0.2062 | 0.6454 | 0.7531 | 0.9442 | [0.8747804088955184, 0.9786649897236112, 0.9608123141262459, 0.9294210268740558, 0.24731770655368004, 0.9656525600512217, 0.9838218010011285, nan, 0.7163804256830463, nan, 0.84963505806566, 0.8858195211786372, 0.9282232512795515, 0.7663414634146342, 0.48021878981881466, 0.910979433736051, 0.9757345791168696, 0.60105956012201, nan, 0.8101118885628608, 0.4411764705882353, nan, 0.9273643329484622, nan, 0.0, nan, nan, 0.9228568560530014, nan, nan, nan, nan, nan, 0.6715939393005226, nan, nan, 0.569192882109946, 0.706701681213151, 0.0, nan, nan, 0.0, nan, 0.25155555555555553, nan, nan, nan, 0.8698829346324438, nan, nan, 0.972074959315502, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33161290322580644, nan, 0.6751440403312927, 0.8810286511927558, nan, nan, 0.8313038313038313, 0.0, 0.8757142857142857, nan, nan, 0.7834131049346432, 0.8378456221198156, nan, nan, 0.8736315295589615, nan, nan, nan, nan, nan, nan, 0.8219345810739899, 0.0, nan, 0.9687926290721601, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1719247467438495, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9100462379150904, 0.6775271512113618, nan, nan, nan, 0.9207233459033254, nan, nan, nan, 0.23586642124638443, 0.8330294347486324, 0.0, nan, 0.9064896755162242, 0.8278849940921623, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9605081725927648, 0.9874849752639968, 0.9961894375041311, 0.9673311654997029, 0.26310432569974557, 0.9937933086766144, 0.9915664185831394, nan, 0.7565402596010877, nan, 0.999548069736316, 0.9775003628973726, 0.9437258400237882, 0.8417386645234747, 0.48683816695285054, 0.9666536184371483, 0.9984238024093306, 0.8630705394190872, nan, 0.8654495349638305, 0.9526558891454965, nan, 0.9690333430827728, nan, nan, nan, nan, 0.9622797032617655, nan, nan, nan, nan, nan, 0.8954171804356761, nan, nan, 0.6004860878310426, 0.7080323978539002, nan, nan, nan, 0.0, nan, 0.25155555555555553, nan, nan, nan, 0.9710239032506798, nan, nan, 0.9912747783060649, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4251447477253929, nan, 0.7076669025713612, 0.9568893467317149, nan, nan, 0.9257993384785006, nan, 0.8928169480304535, nan, nan, 0.8116988573748622, 0.8424558355053576, nan, nan, 0.8856825749167592, nan, nan, nan, nan, nan, nan, 0.9803880659294805, 0.0, nan, 0.9723221330115561, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.18620689655172415, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9125395152792413, 0.7183348095659876, nan, nan, nan, 0.9208811932110407, nan, nan, nan, 0.23677958644962604, 0.8391498294410916, nan, nan, 0.9274647887323944, 0.8475379265158006, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0193 | 72.0 | 1440 | 0.2084 | 0.6451 | 0.7504 | 0.9445 | [0.8753184610671267, 0.9790919554930299, 0.9601723619026691, 0.9288935578688229, 0.23673553435376485, 0.9658485186562539, 0.9839405684150486, nan, 0.7095617225854043, nan, 0.8464335129088286, 0.8871801670422909, 0.9232726655841795, 0.7622960873298172, 0.5045302665319761, 0.91042672642839, 0.9757447985275008, 0.5989356555394292, nan, 0.8081746702189876, 0.4707585408222351, nan, 0.9272440392706872, nan, 0.0, nan, nan, 0.9238872496704125, nan, nan, nan, nan, nan, 0.6763806074985171, nan, nan, 0.5639497422680413, 0.7047209637561779, 0.0, nan, nan, 0.0, nan, 0.25955555555555554, nan, nan, nan, 0.8731562882678705, nan, nan, 0.9721372443164777, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3246505717916137, nan, 0.6688750676041103, 0.8884552614773291, nan, nan, 0.8353818255087648, 0.0, 0.8572173346366895, nan, nan, 0.786522820648984, 0.8280529953917051, nan, nan, 0.8721112621562271, nan, nan, nan, nan, nan, nan, 0.8301253752428042, 0.0, nan, 0.9678918706425095, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1756324512939808, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.900105152471083, 0.662751677852349, nan, nan, nan, 0.9266892326149364, nan, nan, nan, 0.22345927938984833, 0.8253719655442443, 0.0, nan, 0.9022415325367829, 0.8388382968250061, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9611832276484844, 0.9873147379118583, 0.9963811223478088, 0.9693021470411286, 0.25074518356961106, 0.9936870297155975, 0.9932370888806037, nan, 0.7473242870346805, nan, 0.999548069736316, 0.9714036870373058, 0.9440231935771632, 0.8324961489518452, 0.5114132232706879, 0.9640847482507218, 0.9984877023116551, 0.8561549100968188, nan, 0.856837754047537, 0.9387990762124712, nan, 0.9668397192161451, nan, nan, nan, nan, 0.9627376326359723, nan, nan, nan, nan, nan, 0.8905260994656802, nan, nan, 0.5868253436138116, 0.7061751960379694, nan, nan, nan, 0.0, nan, 0.25955555555555554, nan, nan, nan, 0.9693803011241426, nan, nan, 0.9900021380356543, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4226633581472291, nan, 0.700212314225053, 0.973468872086921, nan, nan, 0.9142227122381478, nan, 0.8708374710360808, nan, nan, 0.8161069543529957, 0.8326093252244425, nan, nan, 0.8815601712383067, nan, nan, nan, nan, nan, nan, 0.9808053411224703, 0.0, nan, 0.9712721028449436, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.18934169278996865, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9020021074815595, 0.6997342781222321, nan, nan, nan, 0.9269672552717299, nan, nan, nan, 0.22428508578970524, 0.8297034898976646, nan, nan, 0.919215291750503, 0.8588780807418981, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0156 | 73.0 | 1460 | 0.2037 | 0.6456 | 0.7530 | 0.9450 | [0.8767944856582645, 0.9791674122091745, 0.9610201595673442, 0.9276834477335997, 0.24405412793876435, 0.9673834309484882, 0.98395486780308, nan, 0.7129569626043533, nan, 0.8404623770623566, 0.8857669276830954, 0.9223188405797101, 0.764861051469235, 0.5318429864814586, 0.9129500392845589, 0.9756940934417163, 0.5973671536362177, nan, 0.808197360722876, 0.465556831228473, nan, 0.9274535205349953, nan, 0.0, nan, nan, 0.9221880685026034, nan, nan, nan, nan, nan, 0.6790610946583306, nan, nan, 0.5667274601538339, 0.7041938058337409, 0.0, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.8730791017230983, nan, nan, 0.9730561355485179, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35969084423305586, nan, 0.6736120506922834, 0.8911913433996959, nan, nan, 0.8271273954920068, 0.0, 0.8559609120521172, nan, nan, 0.7856066945606694, 0.8262748487467588, nan, nan, 0.8718552225534808, nan, nan, nan, nan, nan, nan, 0.8245027283928886, 0.0, nan, 0.9698360314643887, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1205925925925926, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9004629629629629, 0.6797004991680532, nan, nan, nan, 0.9246099777130121, nan, nan, nan, 0.23750657548658602, 0.8261209593326382, 0.0, nan, 0.9067479834743262, 0.8348668995719136, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9600367407013932, 0.9880575918120992, 0.9960142772159429, 0.9667553731392864, 0.25961468556888406, 0.9935594949623773, 0.9926896375966888, nan, 0.751597390593814, nan, 0.9996697432688464, 0.9769923065757004, 0.9461790068391317, 0.8350411894715692, 0.5404306936152113, 0.9665557567157607, 0.9985516022139794, 0.8577685569386814, nan, 0.8565277299345505, 0.9364896073903002, nan, 0.9684483767183387, nan, nan, nan, nan, 0.9616909369234996, nan, nan, nan, nan, nan, 0.8906083025071928, nan, nan, 0.598977539389876, 0.7055561287659926, nan, nan, nan, 0.0, nan, 0.25333333333333335, nan, nan, nan, 0.9695426321983686, nan, nan, 0.9916412987039431, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.5004135649296939, nan, 0.7046945034206181, 0.9796886236855915, nan, nan, 0.918412348401323, nan, 0.8698444223766965, nan, nan, 0.8167739690273186, 0.8305821025195482, nan, nan, 0.8809788066169865, nan, nan, nan, nan, nan, nan, 0.9772585019820572, 0.0, nan, 0.9734649835243886, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12758620689655173, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.901791359325606, 0.7236492471213464, nan, nan, nan, 0.9246099777130121, nan, nan, nan, 0.2383633963924329, 0.8315402781422199, nan, nan, 0.927364185110664, 0.8551484300186483, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.033 | 74.0 | 1480 | 0.2052 | 0.6327 | 0.7524 | 0.9446 | [0.8757768989433468, 0.9793644706904926, 0.9613616277251857, 0.9265149476725248, 0.2567842605156038, 0.9635947833611471, 0.9833870718661232, nan, 0.7074936386768448, nan, 0.8482805423188705, 0.8776964436095096, 0.9148936170212766, 0.7656278699565298, 0.522055854120717, 0.909829650092081, 0.9756749444792946, 0.5948588550701782, nan, 0.8051909796092037, 0.4683691236215902, nan, 0.9273421891977147, nan, 0.0, nan, nan, 0.9206655473976997, nan, nan, nan, nan, nan, 0.678532918691691, nan, nan, 0.5654183266932271, 0.7065259364139529, 0.0, nan, nan, 0.0, 0.0, 0.23644444444444446, nan, nan, nan, 0.8732123916462455, nan, nan, 0.972796256860922, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35908267954133977, nan, 0.6848364717542121, 0.8902561926483848, nan, nan, 0.8244206773618539, 0.0, 0.8500651890482399, nan, nan, 0.7883659911795903, 0.8311239193083574, nan, nan, 0.8700653594771242, nan, nan, nan, nan, nan, nan, 0.8251636878428596, 0.0, nan, 0.9688970324728622, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10894020130254589, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.894127552094296, 0.6846921797004991, nan, nan, nan, 0.9247182827027722, nan, nan, nan, 0.23746933052926744, 0.8261662757362522, 0.0, nan, 0.9091534755677908, 0.8353676542966827, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9604141075440169, 0.9881452898419888, 0.9957994579945799, 0.9659138304586776, 0.2751726644856416, 0.9941334013518683, 0.9917315978498379, nan, 0.7448997361124134, nan, 0.9994437781370044, 0.9833793003338656, 0.9451134899395381, 0.8375192552407742, 0.5294821909607902, 0.9669472036013114, 0.9986002878538457, 0.8695251267865376, nan, 0.8474336892869445, 0.9318706697459584, nan, 0.9673150043872477, nan, nan, nan, nan, 0.9614161792989755, nan, nan, nan, nan, nan, 0.8923140156185779, nan, nan, 0.5947033188065706, 0.7079292199752373, nan, nan, nan, 0.0, nan, 0.23644444444444446, nan, nan, nan, 0.968893307901465, nan, nan, 0.9906435487319413, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4921422663358147, nan, 0.7172446331682001, 0.9786779140508076, nan, nan, 0.9178610804851157, nan, 0.86329030122476, nan, nan, 0.8190940200684415, 0.8352157544164495, nan, nan, 0.8794461180698695, nan, nan, nan, nan, nan, nan, 0.9728771124556645, 0.0, nan, 0.972438158444331, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.11536050156739812, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8952581664910432, 0.728963684676705, nan, nan, nan, 0.9249957140408024, nan, nan, nan, 0.23845138583369996, 0.8318026764628706, nan, nan, 0.930281690140845, 0.8554508341313442, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0308 | 75.0 | 1500 | 0.2072 | 0.6450 | 0.7508 | 0.9443 | [0.8751087091251171, 0.9784159063275878, 0.9609948979591837, 0.9257244837256735, 0.24861916126832595, 0.9665811895111258, 0.9836186768607377, nan, 0.7125535948294508, nan, 0.848741604546461, 0.8806116447755342, 0.9208295893338514, 0.7608988488856233, 0.505072351019138, 0.9107958477508651, 0.9756774836110657, 0.5937399678972712, nan, 0.8066748992590667, 0.46714031971580816, nan, 0.9272682618763596, nan, 0.0, nan, nan, 0.9225816150126526, nan, nan, nan, nan, nan, 0.6748705867766033, nan, nan, 0.5689364096792346, 0.7046496060964935, 0.0, nan, nan, 0.0, nan, 0.21955555555555556, nan, nan, nan, 0.8705100182149362, nan, nan, 0.9738811562634266, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.32471626733921816, nan, 0.6800863658854752, 0.8893438990232055, nan, nan, 0.8295137504982064, 0.0, 0.8479608482871126, nan, nan, 0.7849738761141068, 0.8361175115207373, nan, nan, 0.8693309619710206, nan, nan, nan, nan, nan, nan, 0.8304062444562711, 0.0, nan, 0.9687574060265548, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20686645330229853, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.887578947368421, 0.6894254787676936, nan, nan, nan, 0.9251627827278959, nan, nan, nan, 0.2462333566923616, 0.8168058455114823, 0.0, nan, 0.9034768048852556, 0.819891678975874, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9610061640273121, 0.988000846028053, 0.9959944477493555, 0.9649394126179729, 0.26506724827335515, 0.9934957275857671, 0.9921044310518143, nan, 0.7502176737706455, nan, 0.9994263962037858, 0.9780809986935695, 0.9395628902765388, 0.8322952247002879, 0.5113659635469998, 0.9659685863874345, 0.9985850735913875, 0.8526970954356846, nan, 0.8550809507406132, 0.9110854503464203, nan, 0.966254752851711, nan, nan, nan, nan, 0.9635619055095446, nan, nan, nan, nan, nan, 0.8948623099054666, nan, nan, 0.5931109621186724, 0.7059946347503095, nan, nan, nan, 0.0, nan, 0.21955555555555556, nan, nan, nan, 0.9697455460411509, nan, nan, 0.9923234338888832, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.42597187758478083, nan, 0.7133286152394432, 0.9750821201578262, nan, nan, 0.9178610804851157, nan, 0.8603111552466071, nan, nan, 0.8147729250043501, 0.8407182160440196, nan, nan, 0.87833624015644, nan, nan, nan, nan, nan, nan, 0.9766325891925725, 0.0, nan, 0.9722699215668075, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22288401253918494, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8885142255005268, 0.733392382639504, nan, nan, nan, 0.9256386079204526, nan, nan, nan, 0.2473383194016718, 0.8213067436368408, nan, nan, 0.9228370221327968, 0.8392722141021118, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0194 | 76.0 | 1520 | 0.2029 | 0.6472 | 0.7569 | 0.9444 | [0.8749263025271167, 0.979075199460269, 0.9620199807441446, 0.9276139124882495, 0.2681169181034483, 0.9662368790809158, 0.9839287974016113, nan, 0.7145394569906115, nan, 0.8538610038610038, 0.8841567291311755, 0.9200009664637093, 0.7668866226754649, 0.5014271974186343, 0.9096521538815941, 0.9757471951375446, 0.5966413181242078, nan, 0.8047797200541831, 0.4648127128263337, nan, 0.927344597194529, nan, 0.0, nan, nan, 0.919877752136219, nan, nan, nan, nan, nan, 0.6733985238703248, nan, nan, 0.5754850922858495, 0.7058232931726908, 0.0, nan, nan, 0.0, nan, 0.25155555555555553, nan, nan, nan, 0.868526619593037, nan, nan, 0.9723744200927852, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3446291560102302, nan, 0.6903097804031203, 0.8847227128223243, nan, nan, 0.8169629845526085, 0.0, 0.8580931263858093, nan, nan, 0.7818639094592706, 0.8444700460829493, nan, nan, 0.8718095634961975, nan, nan, nan, nan, nan, nan, 0.8151757744517926, 0.0, nan, 0.970371012482663, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.18318840579710144, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9036195286195287, 0.6995192307692307, nan, nan, nan, 0.9314569678276142, nan, nan, nan, 0.25140203294777425, 0.8253844149074798, 0.0, nan, 0.911778962675715, 0.8203424859757897, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9590717439660038, 0.9881968769183944, 0.9956309075285875, 0.9651867080548184, 0.2894220283533261, 0.9939633550142414, 0.990764119287747, nan, 0.7544238007849651, nan, 0.9994437781370044, 0.9793874292350123, 0.9435276043215383, 0.8562052106355904, 0.509192016257345, 0.9660909135391692, 0.9984664023442136, 0.8681420009220839, nan, 0.8595590768170858, 0.9457274826789839, nan, 0.9692161450716584, nan, nan, nan, nan, 0.9648179403645117, nan, nan, nan, nan, nan, 0.8887587340731607, nan, nan, 0.6114649681528662, 0.7072069748245976, nan, nan, nan, 0.0, nan, 0.25155555555555553, nan, nan, nan, 0.9709021549450103, nan, nan, 0.9901446737459403, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.445822994210091, nan, 0.7222929936305732, 0.9736438026006337, nan, nan, 0.927122381477398, nan, 0.8711022840119166, nan, nan, 0.8114088509947219, 0.8491167101071532, nan, nan, 0.8845726970033296, nan, nan, nan, nan, nan, nan, 0.9772585019820572, 0.0, nan, 0.9741089246762891, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1981191222570533, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.9049525816649104, 0.7732506643046945, nan, nan, nan, 0.9318961083490486, nan, nan, nan, 0.25244170699516055, 0.8310154815009184, nan, nan, 0.9461770623742455, 0.8402298271256489, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0312 | 77.0 | 1540 | 0.2064 | 0.6462 | 0.7539 | 0.9441 | [0.8738581847592783, 0.9790482518912288, 0.9612313090633597, 0.9272546318104703, 0.2534637326813366, 0.9660901370032856, 0.9841409609097637, nan, 0.7064123283661861, nan, 0.8489371124889282, 0.8836554208974778, 0.9220820189274448, 0.7697989828045532, 0.4864302247906666, 0.8978501954367785, 0.9757046137744225, 0.5930980392156863, nan, 0.8039931509062126, 0.4673913043478261, nan, 0.9275169197320896, nan, 0.0, nan, nan, 0.9223742575746184, nan, nan, nan, nan, nan, 0.6796582433835793, nan, nan, 0.5696253095789726, 0.7041967044284243, 0.0, nan, nan, 0.0, nan, 0.23466666666666666, nan, nan, nan, 0.8711335508448317, nan, nan, 0.9737750863320154, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33762886597938147, nan, 0.6837352117763931, 0.8928401165357112, nan, nan, 0.8262204178631548, 0.0, 0.8649423415206202, nan, nan, 0.788228730822873, 0.8396660909614277, nan, nan, 0.8717573986116186, nan, nan, nan, nan, nan, nan, 0.8258565877781703, 0.0, nan, 0.9690005375629325, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.22435338564370821, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8970526315789473, 0.6832504145936982, nan, nan, nan, 0.9313107940180829, nan, nan, nan, 0.23663921499912388, 0.8150470219435737, 0.0, nan, 0.9092154177264723, 0.8228982083087222, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9601164193309208, 0.9881040201808643, 0.9958820807720272, 0.9663050739856274, 0.271319520174482, 0.9937507970922076, 0.9916419291050588, nan, 0.7436003911430217, nan, 0.9995828336027534, 0.9790245318623894, 0.941594806224601, 0.8515169780992565, 0.4932654893744388, 0.9666046875764545, 0.9985059594266049, 0.8715998155832181, nan, 0.8572511195315191, 0.9434180138568129, nan, 0.9670225212050307, nan, nan, nan, nan, 0.9631039761353377, nan, nan, nan, nan, nan, 0.8876900945334978, nan, nan, 0.5975527991954408, 0.7055045398266612, nan, nan, nan, 0.0, nan, 0.23466666666666666, nan, nan, nan, 0.9697861288097074, nan, nan, 0.9904704696551654, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4334160463192721, nan, 0.714413776834159, 0.9828373729324185, nan, nan, 0.9199558985667035, nan, 0.8789142667990731, nan, nan, 0.819500029000638, 0.8447726614538082, nan, nan, 0.8827229004809471, nan, nan, nan, nan, nan, nan, 0.9755894012100981, 0.0, nan, 0.9725251775189121, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.24200626959247648, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8979978925184404, 0.7298494242692648, nan, nan, nan, 0.9315103720212583, nan, nan, nan, 0.23765948086229652, 0.8186827604303333, nan, nan, 0.935010060362173, 0.842598659341767, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0356 | 78.0 | 1560 | 0.2083 | 0.6455 | 0.7534 | 0.9440 | [0.8742726194957097, 0.9803628547301619, 0.9608517775659033, 0.9274507515885729, 0.25570187347271245, 0.9664861065167053, 0.9838662845153797, nan, 0.7075762202880846, nan, 0.8534409830956232, 0.8882407530160413, 0.9201362055641422, 0.7738359201773836, 0.48817943018791676, 0.9060344034587434, 0.9756974279232807, 0.5955639452571968, nan, 0.8033265533757628, 0.44408427876823336, nan, 0.9267822736030829, nan, 0.0, nan, nan, 0.9190803674514332, nan, nan, nan, nan, nan, 0.6852318548387096, nan, nan, 0.5611102349787099, 0.7038391224862889, 0.0, nan, nan, 0.0, nan, 0.2648888888888889, nan, nan, nan, 0.8656489654174504, nan, nan, 0.9715739497986738, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3414167198468411, nan, 0.6808346912909651, 0.8936809183637203, nan, nan, 0.8229280397022333, 0.0, 0.8654171547572563, nan, nan, 0.7774203474223055, 0.8477259643062752, nan, nan, 0.8701631457854005, nan, nan, nan, nan, nan, nan, 0.8247878359264498, 0.0, nan, 0.96807760345007, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1777064487890283, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8972631578947369, 0.6889991728701406, nan, nan, nan, 0.9348785399083158, nan, nan, nan, 0.22689443714410862, 0.8301148225469729, 0.0, nan, 0.9095242727891818, 0.8067425698654443, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9613525447362306, 0.9881917182107538, 0.9958556414832441, 0.9680841247402475, 0.2738640494365685, 0.9936445181311907, 0.9914673110231204, nan, 0.7448729454944878, nan, 0.9995306878030975, 0.9725649586296995, 0.9441223114282883, 0.8648449534525484, 0.49477780053245957, 0.9638890248079464, 0.9985698593289293, 0.8727524204702628, nan, 0.8435067171891147, 0.9491916859122402, nan, 0.9672053231939164, nan, nan, nan, nan, 0.958197589983122, nan, nan, nan, nan, nan, 0.8940402794903411, nan, nan, 0.5963794837412001, 0.7050918283120099, nan, nan, nan, 0.0, nan, 0.2648888888888889, nan, nan, nan, 0.9711456515563491, nan, nan, 0.9875688498386292, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4425144747725393, nan, 0.7096485020051899, 0.9774922739023111, nan, nan, 0.9141124586549063, nan, 0.8803707381661702, nan, nan, 0.8059857316860971, 0.8528815522733855, nan, nan, 0.8794989693990803, nan, nan, nan, nan, nan, nan, 0.9732943876486543, 0.0, nan, 0.9714809486239384, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.19090909090909092, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8982086406743941, 0.7378210806023029, nan, nan, nan, 0.9352391565232299, nan, nan, nan, 0.2278926528816542, 0.8346890579900289, nan, nan, 0.9405432595573441, 0.8249584194345043, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0293 | 79.0 | 1580 | 0.2077 | 0.6459 | 0.7532 | 0.9443 | [0.8740452477015409, 0.9800527564207426, 0.9619062371250906, 0.9276852718379743, 0.2641432916525853, 0.9665439808113808, 0.984111017016558, nan, 0.711767532434688, nan, 0.8489223501623856, 0.8874229264735132, 0.9213180738082537, 0.7702694520133212, 0.4891501605372958, 0.9137275802938314, 0.9755936616641253, 0.5962673472643165, nan, 0.7982633675001632, 0.4558740865654862, nan, 0.927966398319916, nan, 0.0, nan, nan, 0.9178290291396216, nan, nan, nan, nan, nan, 0.6862128372504702, nan, nan, 0.5566272557560672, 0.7046829544576887, 0.0, nan, nan, 0.0, nan, 0.24977777777777777, nan, nan, nan, 0.8691916439600363, nan, nan, 0.9727399276937098, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33672172808132145, nan, 0.6924641743140003, 0.890537509052691, nan, nan, 0.8269364968597348, 0.0, 0.8610659204789484, nan, nan, 0.783057677776845, 0.8457109959700633, nan, nan, 0.8695538606206248, nan, nan, nan, nan, nan, nan, 0.8224841660802252, 0.0, nan, 0.9695403893281547, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.17463556851311954, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8911349757843756, 0.6895127993393889, nan, nan, nan, 0.9341502077888694, nan, nan, nan, 0.2187061711079944, 0.8299426186750131, 0.0, nan, 0.9106934164394235, 0.8276065767451019, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9599127961665725, 0.9889964766026815, 0.9953780818295987, 0.9695752793146595, 0.2841148673209742, 0.9935807507545806, 0.9926660405585891, nan, 0.7539951508981555, nan, 0.9995654516695347, 0.9714762665118305, 0.9415452472990385, 0.8519858013528899, 0.49679421540982055, 0.9646963840093947, 0.9987189591010197, 0.8616874135546335, nan, 0.842369962108164, 0.9364896073903002, nan, 0.9692892658672126, nan, nan, nan, nan, 0.9564967094503539, nan, nan, nan, nan, nan, 0.8922318125770653, nan, nan, 0.5997318136104592, 0.706046223689641, nan, nan, nan, 0.0, nan, 0.24977777777777777, nan, nan, nan, 0.9709224463292886, nan, nan, 0.9889025768420195, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.43837882547559964, nan, 0.7227176220806794, 0.9799218643705417, nan, nan, 0.9145534729878722, nan, 0.8760013240648792, nan, nan, 0.81146685227075, 0.8508543295684912, nan, nan, 0.8797103747159241, nan, nan, nan, nan, nan, nan, 0.9753807636136032, 0.0, nan, 0.9731401123126189, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1877742946708464, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.891886195995785, 0.7395925597874224, nan, nan, nan, 0.9345105434596263, nan, nan, nan, 0.21953365596128466, 0.8349514563106796, nan, nan, 0.9407444668008048, 0.8473363237740034, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0312 | 80.0 | 1600 | 0.2061 | 0.6447 | 0.7519 | 0.9440 | [0.8735173563414165, 0.9801301798159879, 0.9610616590176273, 0.9290301362887312, 0.256683403446872, 0.9663924600057873, 0.9839146978805359, nan, 0.7062085945630378, nan, 0.8462659112648075, 0.8873024306016736, 0.9254773197130001, 0.7690532580312139, 0.49035299956572986, 0.9065540960224527, 0.9757062869274846, 0.5946759626049755, nan, 0.8014874696847211, 0.47380675203725264, nan, 0.9277989710565919, nan, 0.0, nan, nan, 0.9207306393093957, nan, nan, nan, nan, nan, 0.682684864856326, nan, nan, 0.5669055682444673, 0.7031459170013387, 0.0, nan, nan, 0.0, nan, 0.24533333333333332, nan, nan, nan, 0.8709783202769175, nan, nan, 0.9737107785807819, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3413858868404323, nan, 0.6774515135037158, 0.8936147775805403, nan, nan, 0.8276206655341262, 0.0, 0.8474675705625448, nan, nan, 0.7843235844576664, 0.8440287769784173, nan, nan, 0.8698515575998328, nan, nan, nan, nan, nan, nan, 0.8257188216616688, 0.0, nan, 0.9689308154288125, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.16695957820738136, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8905263157894737, 0.6889250814332247, nan, nan, nan, 0.9253609837610866, nan, nan, nan, 0.2138160778469361, 0.8099346405228758, 0.0, nan, 0.9087897096082636, 0.8221336090188549, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9591658090147516, 0.9880937027655832, 0.9958953004164188, 0.9696675537313929, 0.27502726281352236, 0.9938358202610211, 0.9933456352558627, nan, 0.7453819672350743, nan, 0.9996002155359719, 0.9697343591232399, 0.9428833382892259, 0.8481682405733039, 0.4980544747081712, 0.9641092136810686, 0.998454230934247, 0.8651452282157677, nan, 0.8538064071650017, 0.9399538106235565, nan, 0.9692161450716584, nan, nan, nan, nan, 0.9628815532964373, nan, nan, nan, nan, nan, 0.8881216605014386, nan, nan, 0.6032517599731814, 0.7045243499793644, nan, nan, nan, 0.0, nan, 0.24533333333333332, nan, nan, nan, 0.9700904995738809, nan, nan, 0.9917532910477392, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4441687344913151, nan, 0.7053550365652277, 0.9882407821337635, nan, nan, 0.9131201764057332, nan, 0.8607083747103608, nan, nan, 0.8142799141581115, 0.8494063133507095, nan, nan, 0.8795518207282913, nan, nan, nan, nan, nan, nan, 0.9766325891925725, 0.0, nan, 0.9724439597159698, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1786833855799373, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8914646996838778, 0.7493356953055802, nan, nan, nan, 0.9256386079204526, nan, nan, nan, 0.21460624725032995, 0.8129099973760168, nan, nan, 0.9382293762575453, 0.8416914470036793, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0316 | 81.0 | 1620 | 0.2077 | 0.6441 | 0.7501 | 0.9442 | [0.8742621056967808, 0.9789268701572179, 0.9607114681202037, 0.9267659530823941, 0.24950616443021592, 0.9678214233946949, 0.9843254757836092, nan, 0.701719989305766, nan, 0.843099025141098, 0.8855750872341827, 0.9243987927173596, 0.7727135344254368, 0.519132217287325, 0.9142731195768178, 0.9757408872209964, 0.5920230658337338, nan, 0.8016587601252138, 0.4707585408222351, nan, 0.9271560340897135, nan, 0.0, nan, nan, 0.9211370298915086, nan, nan, nan, nan, nan, 0.68297814686708, nan, nan, 0.5623811033608117, 0.7023659346600417, 0.0, nan, nan, 0.0, nan, 0.21688888888888888, nan, nan, nan, 0.8698882050908561, nan, nan, 0.9717978965889551, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3282345442957298, nan, 0.6766640954671265, 0.8954011672878138, nan, nan, 0.8280502392344498, 0.0, 0.8576264566108978, nan, nan, 0.7837785034469285, 0.8303571428571429, nan, nan, 0.8701604054548304, nan, nan, nan, nan, nan, nan, 0.8293333333333334, 0.0, nan, 0.9679142530091227, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1804138735062664, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8800842992623814, 0.6848281642917016, nan, nan, nan, 0.9239242242413852, nan, nan, nan, 0.22230014025245443, 0.8249412992434124, 0.0, nan, 0.9086543173576399, 0.8033013057403301, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9605911711651893, 0.9880421156891775, 0.9960737656157049, 0.9671982903396068, 0.2663031624863686, 0.9934744717935637, 0.9922412938727931, nan, 0.7383226394116781, nan, 0.9996697432688464, 0.9762665118304543, 0.9410744375061949, 0.8561382358850713, 0.5266151010570425, 0.9641092136810686, 0.9984451023767721, 0.8520055325034578, nan, 0.8557009989665862, 0.9387990762124712, nan, 0.9665106756361509, nan, nan, nan, nan, 0.9620049456372415, nan, nan, nan, nan, nan, 0.8805589806822852, nan, nan, 0.5946195105598391, 0.7037247214197276, nan, nan, nan, 0.0, nan, 0.21688888888888888, nan, nan, nan, 0.9694411752769774, nan, nan, 0.9896763421264292, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.42597187758478083, nan, 0.7036093418259023, 0.9899900872708897, nan, nan, 0.9158765159867696, nan, 0.8721615359152598, nan, nan, 0.8143959167101676, 0.8349261511728931, nan, nan, 0.8801860366788224, nan, nan, nan, nan, nan, nan, 0.9732943876486543, 0.0, nan, 0.9712721028449436, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.19404388714733542, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8800842992623814, 0.7236492471213464, nan, nan, nan, 0.9239242242413852, nan, nan, nan, 0.2231412230532336, 0.8297034898976646, nan, nan, 0.9326961770623743, 0.8216823748802984, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0277 | 82.0 | 1640 | 0.2077 | 0.6441 | 0.7519 | 0.9437 | [0.8733909386535597, 0.9791553760417199, 0.9612271813198993, 0.9262147740300916, 0.2538084874863983, 0.9670038685119675, 0.9841140529531568, nan, 0.7083884591899043, nan, 0.8400525854513584, 0.8860492688710315, 0.9269240106822044, 0.774405050995629, 0.4957355084824613, 0.9185185185185185, 0.9757158703300036, 0.5944913230377329, nan, 0.8052795636080265, 0.4485213581599124, nan, 0.9281564949608063, nan, 0.0, nan, nan, 0.9200054936136741, nan, nan, nan, nan, nan, 0.6815700464091874, nan, nan, 0.571631093959203, 0.7023659346600417, 0.0, nan, nan, 0.0, nan, 0.23377777777777778, nan, nan, nan, 0.8685352260969013, nan, nan, 0.9714145981175953, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.342156229825694, nan, 0.6801557124751041, 0.8929470367697534, nan, nan, 0.8224530168150346, 0.0, 0.8547798905965095, nan, nan, 0.7817619525242394, 0.8349654377880185, nan, nan, 0.8707554069585205, nan, nan, nan, nan, nan, nan, 0.8247568523430593, 0.0, nan, 0.9693627450980392, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.15804764671702498, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8891464699683878, 0.6965012205044752, nan, nan, nan, 0.9193023056484101, nan, nan, nan, 0.24436848102375316, 0.8140020898641588, 0.0, nan, 0.9113516676418958, 0.7878489002860243, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9588404546108473, 0.9879595763669285, 0.9960175821270408, 0.9661205251521605, 0.271319520174482, 0.9935807507545806, 0.9919817264536955, nan, 0.7459177795935863, nan, 0.9996349794024091, 0.976339091304979, 0.9460798889880068, 0.8543299176210568, 0.503599615620914, 0.9647208494397417, 0.998369031064481, 0.8607653296449977, nan, 0.8543231140199793, 0.9457274826789839, nan, 0.969691430242761, nan, nan, nan, nan, 0.9640852533657809, nan, nan, nan, nan, nan, 0.890341142622277, nan, nan, 0.60828025477707, 0.7037247214197276, nan, nan, nan, 0.0, nan, 0.23377777777777778, nan, nan, nan, 0.9708412807921757, nan, nan, 0.9908980767860234, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.43837882547559964, nan, 0.7089407879216797, 0.986044432350483, nan, nan, 0.9167585446527012, nan, 0.86898378020523, nan, nan, 0.8137289020358448, 0.8395598030697944, nan, nan, 0.8809259552877755, nan, nan, nan, nan, nan, nan, 0.9730857500521594, 0.0, nan, 0.9728268436441267, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.17053291536050158, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8891464699683878, 0.7581930912311781, nan, nan, nan, 0.9193811074918566, nan, nan, nan, 0.2453145622525297, 0.8176331671477303, nan, nan, 0.9401408450704225, 0.8052013507383701, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0049 | 83.0 | 1660 | 0.2097 | 0.6428 | 0.7506 | 0.9436 | [0.8737039435975594, 0.979669221873113, 0.9610905489900287, 0.9270900831040624, 0.2652291105121294, 0.9679242938789033, 0.9841324853173972, nan, 0.7086273211238188, nan, 0.8444369392519934, 0.886755098000396, 0.9261941672318573, 0.7733155921132212, 0.49561628105457534, 0.9192980820768567, 0.9757085501858737, 0.5984302418708954, nan, 0.805322947095099, 0.4464964693101575, nan, 0.9284863349684653, nan, 0.0, nan, nan, 0.9189368687501567, nan, nan, nan, nan, nan, 0.6811163054256977, nan, nan, 0.5629692700673427, 0.7007466529351184, 0.0, nan, nan, 0.0, nan, 0.2568888888888889, nan, nan, nan, 0.8654395544787185, nan, nan, 0.969962814901953, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3346055979643766, nan, 0.6772532188841202, 0.8917903239968391, nan, nan, 0.8282949216801357, 0.0, 0.8492829204693612, nan, nan, 0.7751026450297461, 0.8292050691244239, nan, nan, 0.8707806458355105, nan, nan, nan, nan, nan, nan, 0.8249075215782984, 0.0, nan, 0.9691663824547814, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1257327080890973, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8911349757843756, 0.6843826655764513, nan, nan, nan, 0.9160668380462725, nan, nan, nan, 0.239105655414292, 0.8217640918580376, 0.0, nan, 0.910049809551714, 0.7626708779548932, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.959846397308633, 0.9876087842473703, 0.9959300019829467, 0.9663936574256914, 0.2861504907306434, 0.9935594949623773, 0.9916891231812582, nan, 0.7453149906902603, nan, 0.9995828336027534, 0.9752503991871099, 0.9474923183665378, 0.8563391601366285, 0.5031427716252619, 0.9651122963252924, 0.9983051311621567, 0.8612263715998156, nan, 0.8547020323802963, 0.9491916859122402, nan, 0.96881398069611, nan, nan, nan, nan, 0.9594667085344951, nan, nan, nan, nan, nan, 0.8892725030826141, nan, nan, 0.6095373784780422, 0.7020480808914569, nan, nan, nan, 0.0, nan, 0.2568888888888889, nan, nan, nan, 0.9712268170934621, nan, nan, 0.9905824619989615, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.43507030603804797, nan, 0.7072894550601557, 0.987055141985267, nan, nan, 0.9153252480705623, nan, 0.8624958622972526, nan, nan, 0.8047967055275216, 0.8337677381986678, nan, nan, 0.8807674013001427, nan, nan, nan, nan, nan, nan, 0.9770498643855623, 0.0, nan, 0.9726354016800483, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.13448275862068965, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.891886195995785, 0.7413640389725421, nan, nan, nan, 0.9163809360534888, nan, nan, nan, 0.23994720633523978, 0.8262923117292049, nan, nan, 0.9374245472837022, 0.778892192933824, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0281 | 84.0 | 1680 | 0.2072 | 0.6443 | 0.7524 | 0.9438 | [0.8738443668514337, 0.9796780612062219, 0.960843882664013, 0.9272989787035267, 0.2616828741220962, 0.9674702534919813, 0.9840758213905898, nan, 0.7107714202213496, nan, 0.8542480650097306, 0.8860216473072862, 0.9237777616665458, 0.7737296840070086, 0.48627140729709606, 0.9218160957220127, 0.9755614738018523, 0.5937749401436552, nan, 0.805702183121538, 0.43927233814874267, nan, 0.9280787142406947, nan, 0.0, nan, nan, 0.919829793643621, nan, nan, nan, nan, nan, 0.6822607156404097, nan, nan, 0.5652580695462343, 0.7042347792508689, 0.0, nan, nan, 0.0, nan, 0.25422222222222224, nan, nan, nan, 0.8645684655549333, nan, nan, 0.9698895551276551, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34594264795607077, nan, 0.6792997653853096, 0.8903436123348017, nan, nan, 0.8266427002196924, 0.0, 0.8613532856213403, nan, nan, 0.7719667031026655, 0.8450460829493087, nan, nan, 0.869942045632538, nan, nan, nan, nan, nan, nan, 0.8226288931902165, 0.0, nan, 0.9684136288485773, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.11665682220909628, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8949473684210526, 0.6916802610114192, nan, nan, nan, 0.9211822660098522, nan, nan, nan, 0.24859796705222573, 0.8255541069100392, 0.0, nan, 0.9072588534533359, 0.8002070903801588, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9607549550147737, 0.988047274396818, 0.9956061206953533, 0.9671761444795908, 0.2817157397310069, 0.993772052884411, 0.9913115705716619, nan, 0.7492933974522122, nan, 0.9994959239366602, 0.9743794454928146, 0.9471949648131629, 0.8576786551470096, 0.4938168528174672, 0.9631550618975387, 0.9987067876910531, 0.8575380359612724, nan, 0.8517740268687565, 0.9480369515011547, nan, 0.9690333430827728, nan, nan, nan, nan, 0.9587863563213879, nan, nan, nan, nan, nan, 0.8871352240032881, nan, nan, 0.6076097888032183, 0.7056335121749897, nan, nan, nan, 0.0, nan, 0.25422222222222224, nan, nan, nan, 0.9710036118664015, nan, nan, 0.9897374288594089, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46898263027295284, nan, 0.7103562160887001, 0.9820793407063305, nan, nan, 0.9126791620727673, nan, 0.8764647467725919, nan, nan, 0.7987645728206021, 0.8496959165942659, nan, nan, 0.8806088473125099, nan, nan, nan, nan, nan, nan, 0.9753807636136032, 0.0, nan, 0.9718406274655405, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1238244514106583, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8958904109589041, 0.7511071744906997, nan, nan, nan, 0.9216955254585977, nan, nan, nan, 0.24962604487461504, 0.8307530831802676, nan, nan, 0.9329979879275654, 0.8179527241570486, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0377 | 85.0 | 1700 | 0.2058 | 0.6449 | 0.7527 | 0.9445 | [0.8757444488200903, 0.9792639096937132, 0.9614307350774912, 0.9266387726638773, 0.26414076720825186, 0.9675883765212352, 0.9838634949011826, nan, 0.7159368014631172, nan, 0.8466997452848246, 0.8856049610766592, 0.9199893865933376, 0.7692401363857769, 0.5241113211640376, 0.9181343999251514, 0.9755931538365505, 0.5942213049690427, nan, 0.8004889975550122, 0.43793659322944656, nan, 0.927004234174336, nan, 0.0, nan, nan, 0.9213997341959427, nan, nan, nan, nan, nan, 0.6817995264404104, nan, nan, 0.5603305785123966, 0.7040002059308073, 0.0, nan, nan, 0.0, nan, 0.24888888888888888, nan, nan, nan, 0.8607735998274498, nan, nan, 0.9704072868186085, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3375796178343949, nan, 0.6833626641397049, 0.8898902959540019, nan, nan, 0.8251197126895451, 0.0, 0.8547147694712165, nan, nan, 0.7681354411929588, 0.8378456221198156, nan, nan, 0.869235595524417, nan, nan, nan, nan, nan, nan, 0.8203866432337434, 0.0, nan, 0.9694943271470433, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.16399182958856143, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8911349757843756, 0.6938775510204082, nan, nan, nan, 0.9251499571550985, nan, nan, nan, 0.2343092566619916, 0.8288993218570684, 0.0, nan, 0.9101673353557099, 0.8079424517146235, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9597689319743701, 0.9881246550114265, 0.9954722717958887, 0.9686525351473253, 0.2848418756815703, 0.9937082855078009, 0.9921563445356338, nan, 0.7550935662331052, nan, 0.9995828336027534, 0.97430686601829, 0.9450887104767569, 0.8461589980577322, 0.5328218781014193, 0.960341537407643, 0.9986976591335782, 0.8628400184416782, nan, 0.8458491215983466, 0.941108545034642, nan, 0.9685214975138929, nan, nan, nan, nan, 0.9615077651738169, nan, nan, nan, nan, nan, 0.8876284422523634, nan, nan, 0.5966309084813946, 0.7054529508873297, nan, nan, nan, 0.0, nan, 0.24888888888888888, nan, nan, nan, 0.9717543930846962, nan, nan, 0.9892283727512446, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.43837882547559964, nan, 0.7145081387119604, 0.9806798965966297, nan, nan, 0.9119073869900772, nan, 0.8689175769612711, nan, nan, 0.794733484136651, 0.8424558355053576, nan, nan, 0.8786533481317055, nan, nan, nan, nan, nan, nan, 0.9739203004381389, 0.0, nan, 0.9730995034111477, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1761755485893417, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.891886195995785, 0.7528786536758193, nan, nan, nan, 0.9254671695525458, nan, nan, nan, 0.2351957765068192, 0.8339018630280767, nan, nan, 0.9357142857142857, 0.826470439997984, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0209 | 86.0 | 1720 | 0.2077 | 0.6334 | 0.7532 | 0.9445 | [0.8759483319956993, 0.9790471222396105, 0.9606095667944832, 0.9268888519640667, 0.25204081632653064, 0.9669768403639372, 0.9836291137818712, nan, 0.7131189187815969, nan, 0.8522520119158775, 0.8846356568717881, 0.9152729454109179, 0.7695902883156297, 0.521812574581183, 0.9174699774685837, 0.9756964885677757, 0.5942304644159138, nan, 0.8017114192977279, 0.4379679144385027, nan, 0.9270323212536729, nan, 0.0, nan, nan, 0.9227402139694724, nan, nan, nan, nan, nan, 0.684721189002675, nan, nan, 0.5642453655819875, 0.7048661174047374, 0.0, nan, nan, 0.0, nan, 0.2577777777777778, nan, nan, nan, 0.8658220975477332, nan, nan, 0.9706175994888281, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3426930440331844, nan, 0.6748552298226566, 0.891729797979798, nan, nan, 0.8258726486268987, 0.0, 0.8556821882123087, nan, nan, 0.776294803995635, 0.8448474381116868, nan, nan, 0.8689485528863767, nan, nan, nan, nan, nan, nan, 0.821917808219178, 0.0, nan, 0.9680192393296296, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1853686971728359, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8926315789473684, 0.6882690730106645, nan, nan, nan, 0.9245137520349584, nan, nan, nan, 0.24046972219787924, 0.8308133472367049, 0.0, nan, 0.9053138198605245, 0.8047975568909467, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9602104843796687, 0.9875675145862458, 0.996010972304845, 0.9676891902366285, 0.2693565976008724, 0.9939846108064447, 0.9918967771165363, nan, 0.7520260404806237, nan, 0.9995306878030975, 0.9745246044418638, 0.9451878283278818, 0.8491728618310896, 0.5304273854345531, 0.9663355678426383, 0.998530302246538, 0.8642231443061319, nan, 0.8423355149844988, 0.9457274826789839, nan, 0.9689236618894413, nan, nan, nan, nan, 0.9625806282791014, nan, nan, nan, nan, nan, 0.8890258939580764, nan, nan, 0.5943680858196446, 0.7061751960379694, nan, nan, nan, 0.0, nan, 0.2577777777777778, nan, nan, nan, 0.9707601152550627, nan, nan, 0.9897985155923886, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4441687344913151, nan, 0.703798065581505, 0.9883574024762386, nan, nan, 0.9051819184123484, nan, 0.8698444223766965, nan, nan, 0.8045937010614234, 0.8499855198378222, nan, nan, 0.8774906188890651, nan, nan, nan, nan, nan, nan, 0.9764239515960776, 0.0, nan, 0.9714055320926347, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1993730407523511, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8935721812434141, 0.7431355181576617, nan, nan, nan, 0.924909994856849, nan, nan, nan, 0.2414430268367796, 0.8362634479139334, nan, nan, 0.9272635814889336, 0.8234967995564739, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0119 | 87.0 | 1740 | 0.2100 | 0.6436 | 0.7509 | 0.9442 | [0.8746543248972808, 0.9789932944263435, 0.9606504500369719, 0.9272285761820312, 0.25499524262607043, 0.9677259082910672, 0.983442220932627, nan, 0.7102508067999899, nan, 0.8452652155381473, 0.8845824833848785, 0.9164099354962935, 0.7709608972415883, 0.5099578111428217, 0.9166242303597055, 0.9757316054133299, 0.5941843745007189, nan, 0.8021877848678214, 0.43670212765957445, nan, 0.9267490288034158, nan, 0.0, nan, nan, 0.9222450307915686, nan, nan, nan, nan, nan, 0.6828651774147683, nan, nan, 0.5653280318091451, 0.703568302353123, 0.0, nan, nan, 0.0, nan, 0.24888888888888888, nan, nan, nan, 0.8677933488237003, nan, nan, 0.9728962054687657, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3365570599613153, nan, 0.6700063469036177, 0.8921286885820425, nan, nan, 0.8267550467008135, 0.0, 0.8515212717440875, nan, nan, 0.7819441727905222, 0.8347725964306275, nan, nan, 0.8684554631518385, nan, nan, nan, nan, nan, nan, 0.8300212916962385, 0.0, nan, 0.9685742199920222, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1138353765323993, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8860092709650231, 0.6957585644371941, nan, nan, nan, 0.929508056222146, nan, nan, nan, 0.24377847879425166, 0.8183716075156576, 0.0, nan, 0.9068255687973997, 0.8111702127659575, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9597434790788265, 0.9873869598188262, 0.9961200343710754, 0.9668255016960038, 0.2727735368956743, 0.9936232623389873, 0.991726878442218, nan, 0.7488111663295514, nan, 0.9996523613356277, 0.9756858760342575, 0.9434780453959758, 0.8517179023508138, 0.5179350651396525, 0.9688310417380241, 0.998420759556839, 0.8573075149838635, nan, 0.8487771271098863, 0.9480369515011547, nan, 0.9681193331383445, nan, nan, nan, nan, 0.9640067511873455, nan, nan, nan, nan, nan, 0.8882860665844636, nan, nan, 0.5957928260140798, 0.70491126702435, nan, nan, nan, 0.0, nan, 0.24888888888888888, nan, nan, nan, 0.9700296254210462, nan, nan, 0.9903788395556958, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4317617866004963, nan, 0.6972870960132107, 0.9895624793484811, nan, nan, 0.9076074972436604, nan, 0.8652763985435287, nan, nan, 0.8115828548228061, 0.8398494063133507, nan, nan, 0.877543470218276, nan, nan, nan, nan, nan, nan, 0.9760066764030878, 0.0, nan, 0.9719624541699541, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12225705329153605, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8861959957850368, 0.7555358724534986, nan, nan, nan, 0.9296674095662609, nan, nan, nan, 0.24478662560492742, 0.8228811335607452, nan, nan, 0.9262575452716297, 0.8300992893503352, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.046 | 88.0 | 1760 | 0.2084 | 0.6323 | 0.7524 | 0.9440 | [0.8738586987730413, 0.9786507607934537, 0.9606974094172583, 0.9266005703812213, 0.2573499525809511, 0.9676183771920743, 0.9836130795802402, nan, 0.7010757347880394, nan, 0.8520960834580561, 0.8835270172674151, 0.9183939701406001, 0.7649463152757443, 0.5070914282167464, 0.9175750266709959, 0.9757033970635904, 0.594015748031496, nan, 0.802283341982356, 0.44299674267100975, nan, 0.9275610951614033, nan, 0.0, nan, nan, 0.922552392868314, nan, nan, nan, nan, nan, 0.6806774514920063, nan, nan, 0.5690012763241863, 0.7056672760511883, 0.0, 0.0, nan, 0.0, nan, 0.2248888888888889, nan, nan, nan, 0.8666171650035315, nan, nan, 0.9729405291852148, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34440894568690095, nan, 0.6745738252006855, 0.891617407167038, nan, nan, 0.8212246505217563, 0.0, 0.8585569158635061, nan, nan, 0.7814876957494408, 0.8406672418751797, nan, nan, 0.8707319619922732, nan, nan, nan, nan, nan, nan, 0.8207696362677912, 0.0, nan, 0.9679387106100029, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.18735431235431235, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8854495683301747, 0.682526661197703, nan, nan, nan, 0.9306489612336689, nan, nan, nan, 0.22970366473785728, 0.8249412992434124, 0.0, nan, 0.907628946852324, 0.8116413059536121, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.959913902814205, 0.9877583867689467, 0.9961068147266838, 0.9665708243058195, 0.2761904761904762, 0.9933894486247502, 0.9917599142955577, nan, 0.7368089694988815, nan, 0.9994959239366602, 0.9767019886776019, 0.9420160570918823, 0.8397963967584221, 0.5148001701350052, 0.967950286245535, 0.9983325168345815, 0.8695251267865376, nan, 0.852084050981743, 0.9422632794457275, nan, 0.9685946183094472, nan, nan, nan, nan, 0.9647263544896704, nan, nan, nan, nan, nan, 0.8845869297163995, nan, nan, 0.5978042239356353, 0.7069232356582749, nan, nan, nan, 0.0, nan, 0.2248888888888889, nan, nan, nan, 0.9709833204821233, nan, nan, 0.9902159416010833, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.445822994210091, nan, 0.7057324840764331, 0.9880075414488134, nan, nan, 0.9197353914002205, nan, 0.8728235683548494, nan, nan, 0.8104518299402587, 0.8465102809151462, nan, nan, 0.8814544685798847, nan, nan, nan, nan, nan, nan, 0.9745462132276236, 0.0, nan, 0.9711618786838075, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.20156739811912225, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8861959957850368, 0.7369353410097431, nan, nan, nan, 0.9311674952854448, nan, nan, nan, 0.23053233611966564, 0.8297034898976646, nan, nan, 0.9311871227364185, 0.830704097575727, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0159 | 89.0 | 1780 | 0.2122 | 0.6189 | 0.7507 | 0.9437 | [0.8739680165762338, 0.9787646168942677, 0.9607463438269007, 0.9264891999858592, 0.2596928073617971, 0.9670252378982209, 0.9835581925719182, nan, 0.7023105171886157, nan, 0.846479930525913, 0.8842942607323647, 0.9150013194827628, 0.7669150160518505, 0.5194579676889206, 0.9169220767158911, 0.9757007955981858, 0.593539281793947, nan, 0.8001571246194639, 0.4401939655172414, nan, 0.9273281266416838, nan, 0.0, nan, nan, 0.921505295575753, nan, nan, nan, nan, nan, 0.6809217815200532, nan, nan, 0.5613494508967369, 0.7017896227629715, 0.0, nan, nan, 0.0, 0.0, 0.2328888888888889, nan, nan, nan, 0.8659939491657458, nan, nan, 0.9718242167552792, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.33814303638644916, nan, 0.6707945403597577, 0.8918354818907305, nan, nan, 0.8274661893396977, 0.0, 0.8543784206411259, nan, nan, 0.7766311491456218, 0.8377911993097498, nan, nan, 0.8686837017296337, nan, nan, nan, nan, nan, nan, 0.8207596709259584, 0.0, nan, 0.9684610981666387, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.14210526315789473, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.886923562855338, 0.6857610474631751, nan, nan, nan, 0.9290427628759962, nan, nan, nan, 0.22671106826746124, 0.8239289446185998, 0.0, nan, 0.9064585787200629, 0.7879863885190117, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9596925732877395, 0.9879389415363663, 0.9961051622711349, 0.9673164015930256, 0.27902580879680117, 0.9936232623389873, 0.9912171824192627, nan, 0.737786827053166, nan, 0.9996175974691905, 0.9762665118304543, 0.9450887104767569, 0.8479673163217467, 0.5277965941492462, 0.966139844399863, 0.9982229741448824, 0.8725218994928539, nan, 0.8420254908715122, 0.9434180138568129, nan, 0.9680462123427903, nan, nan, nan, nan, 0.9630647250461201, nan, nan, nan, nan, nan, 0.8841348129880806, nan, nan, 0.5954575930271538, 0.703002476269088, nan, nan, nan, 0.0, nan, 0.2328888888888889, nan, nan, nan, 0.9699687512682115, nan, nan, 0.988169536046263, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.445822994210091, nan, 0.7002594951639538, 0.988318529028747, nan, nan, 0.91742006615215, nan, 0.8681231380337636, nan, nan, 0.8053477176497883, 0.843614248479583, nan, nan, 0.8786004968024946, nan, nan, nan, nan, nan, nan, 0.9783016899645316, 0.0, nan, 0.9717478071193205, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1523510971786834, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8876712328767123, 0.7422497785651019, nan, nan, nan, 0.9292816732384708, nan, nan, nan, 0.22762868455785307, 0.8276043033324587, nan, nan, 0.9290744466800804, 0.8053021521092687, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0119 | 90.0 | 1800 | 0.2117 | 0.6181 | 0.7497 | 0.9437 | [0.8739575098067849, 0.9784397974771497, 0.9607283158691582, 0.9261512343495791, 0.255613594735771, 0.9685958293603084, 0.9834329944698605, nan, 0.7031152369900919, nan, 0.8437738395634059, 0.8834658083163633, 0.9193820902727492, 0.7661344180681198, 0.5164295235438172, 0.914827594192159, 0.9757241559816667, 0.5916982323232324, nan, 0.8024743610613707, 0.42700156985871274, nan, 0.9268796860766589, nan, 0.0, nan, nan, 0.9231058563923821, nan, nan, nan, nan, nan, 0.6798546947800679, nan, nan, 0.5654242664551943, 0.701153568853641, 0.0, nan, nan, 0.0, 0.0, 0.23022222222222222, nan, nan, nan, 0.8646394848232698, nan, nan, 0.9705282112845138, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3397755610972569, nan, 0.6719414051903427, 0.8922019561194819, nan, nan, 0.8283210515833499, 0.0, 0.8555982544128183, nan, nan, 0.7736424137352497, 0.8317031070195627, nan, nan, 0.8655154369440083, nan, nan, nan, nan, nan, nan, 0.8246387028551286, 0.0, nan, 0.9683583090513852, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12635590735854588, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8828240252897788, 0.6841243862520459, nan, nan, nan, 0.9234956283216184, nan, nan, nan, 0.2278902620738014, 0.81478578892372, 0.0, nan, 0.9044385395138274, 0.8066105117974484, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.96033000232396, 0.9879492589516474, 0.9961547359376033, 0.9665191506324489, 0.27393675027262815, 0.9932194022871232, 0.9911652689354432, nan, 0.7386039408998969, nan, 0.9997218890685022, 0.9761213528814051, 0.9438497373376945, 0.8451543767999464, 0.5248822445218103, 0.9665068258550668, 0.9982290598498657, 0.8642231443061319, nan, 0.8490527040992077, 0.9422632794457275, nan, 0.9672053231939164, nan, nan, nan, nan, 0.9633002315814264, nan, nan, nan, nan, nan, 0.8846074804767776, nan, nan, 0.5975527991954408, 0.702383408997111, nan, nan, nan, 0.0, nan, 0.23022222222222222, nan, nan, nan, 0.9699078771153768, nan, nan, 0.9877113855489152, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4507857733664185, nan, 0.7012031139419674, 0.9840424498046609, nan, nan, 0.9170893054024256, nan, 0.8696458126448195, nan, nan, 0.8023606519343426, 0.8372429771213438, nan, nan, 0.8741609851487765, nan, nan, nan, nan, nan, nan, 0.9762153139995827, 0.0, nan, 0.9716839931312944, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.13510971786833856, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8828240252897788, 0.7404782993799823, nan, nan, nan, 0.9234956283216184, nan, nan, nan, 0.22877254729432467, 0.8184203621096825, nan, nan, 0.9245472837022133, 0.8253112242326496, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0133 | 91.0 | 1820 | 0.2147 | 0.6286 | 0.7480 | 0.9432 | [0.8729263150270993, 0.978042240212478, 0.9606871737342962, 0.9266024893450097, 0.24815875613747954, 0.9676203883897148, 0.9834365332322396, nan, 0.7050248257120726, nan, 0.8473836995122527, 0.8858010429731336, 0.9202058269314394, 0.7660041458358737, 0.4889030196814368, 0.9140078721926372, 0.9757232034783954, 0.5941568751984757, nan, 0.8012912482065997, 0.4252214695153726, nan, 0.9272510852821734, nan, 0.0, nan, nan, 0.9214326107445806, nan, nan, nan, nan, nan, 0.6813299636708261, nan, nan, 0.5575214217708664, 0.7019594716378711, 0.0, nan, nan, 0.0, nan, 0.2311111111111111, nan, nan, nan, 0.8643056108237167, nan, nan, 0.9700780009388077, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34927627438640657, nan, 0.6761896137999729, 0.8890040218479426, nan, nan, 0.8293781916491438, 0.0, 0.8515986195220421, nan, nan, 0.7736529961394282, 0.8435432844406098, nan, nan, 0.8678840519045625, nan, nan, nan, nan, nan, nan, 0.8223510806536637, 0.0, nan, 0.9681953261410021, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.11300264162019372, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.888139877817569, 0.6814814814814815, nan, nan, nan, 0.9203118975193865, nan, nan, nan, 0.219552827707146, 0.8179205851619644, 0.0, nan, 0.9061670109176748, 0.7801134961756724, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9603454953908126, 0.9878202912606334, 0.9962406636261485, 0.9677630097700153, 0.26455834242093784, 0.9934532160013604, 0.9916608067355386, nan, 0.7437075536147241, nan, 0.9995654516695347, 0.973943968645667, 0.9438745168004757, 0.8414707655213984, 0.4965894232738386, 0.9657973283750061, 0.99831121686714, 0.8626094974642693, nan, 0.8465036169479848, 0.9422632794457275, nan, 0.9683386955250073, nan, nan, nan, nan, 0.9593358715704361, nan, nan, nan, nan, nan, 0.8864570489108097, nan, nan, 0.5889205497820985, 0.7032088320264135, nan, nan, nan, 0.0, nan, 0.2311111111111111, nan, nan, nan, 0.9696035063512033, nan, nan, 0.9889025768420195, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.45905707196029777, nan, 0.7046473224817174, 0.9838675192909483, nan, nan, 0.9132304299889746, nan, 0.8658060244952003, nan, nan, 0.8020126442781741, 0.8494063133507095, nan, nan, 0.8766449976216902, nan, nan, nan, nan, nan, nan, 0.9764239515960776, 0.0, nan, 0.9714867498955772, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.1206896551724138, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8885142255005268, 0.733392382639504, nan, nan, nan, 0.9206668952511572, nan, nan, nan, 0.22032556093268807, 0.8215691419574914, nan, nan, 0.926861167002012, 0.7967844362683333, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0094 | 92.0 | 1840 | 0.2154 | 0.6403 | 0.7476 | 0.9429 | [0.8718435580202635, 0.9786971965960491, 0.9610000589845412, 0.9265608039063854, 0.25239812232124637, 0.9664503059368281, 0.9833384035756909, nan, 0.7071672787134354, nan, 0.8515512773047019, 0.8845191040843214, 0.9212885763309892, 0.7646412213740458, 0.4633185470111983, 0.9143340047695122, 0.9757033248081841, 0.5955378486055777, nan, 0.803858835819921, 0.43037974683544306, nan, 0.9276564468710625, nan, 0.0, nan, nan, 0.9216203320023104, nan, nan, nan, nan, nan, 0.6801327210969933, nan, nan, 0.5572330058311367, 0.7021857220091137, 0.0, nan, nan, 0.0, nan, 0.22577777777777777, nan, nan, nan, 0.8634515656903312, nan, nan, 0.9702490222683374, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35375, nan, 0.6828664084316728, 0.8888947562971959, nan, nan, 0.8277622097678142, 0.0, 0.8542032949143713, nan, nan, 0.7707528687377554, 0.844559585492228, nan, nan, 0.8706220595922635, nan, nan, nan, nan, nan, nan, 0.8188494492044064, 0.0, nan, 0.9684231825408296, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10390372761960669, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8940829648347021, 0.6851239669421487, nan, nan, nan, 0.9199434592649705, nan, nan, nan, 0.21028160364944293, 0.8196078431372549, 0.0, nan, 0.9048696507624201, 0.7757055456877837, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9603709482863562, 0.9878254499682739, 0.9961316015599181, 0.9672167452229534, 0.2697201017811705, 0.993772052884411, 0.9915711379907594, nan, 0.7480610290276345, nan, 0.9994785420034417, 0.9745246044418638, 0.9446426801466944, 0.8385908512490791, 0.4705808220041274, 0.96616430983021, 0.9983294739820898, 0.8614568925772246, nan, 0.8481915260075784, 0.9422632794457275, nan, 0.9690333430827728, nan, nan, nan, nan, 0.9602909814080675, nan, nan, nan, nan, nan, 0.8888409371146733, nan, nan, 0.5846463291987931, 0.7035441601320677, nan, nan, nan, 0.0, nan, 0.22577777777777777, nan, nan, nan, 0.9696440891197597, nan, nan, 0.9901141303794504, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4681555004135649, nan, 0.715310214673272, 0.9815351124414469, nan, nan, 0.9119073869900772, nan, 0.8684541542535584, nan, nan, 0.7986485702685459, 0.8496959165942659, nan, nan, 0.8802388880080334, nan, nan, nan, nan, nan, nan, 0.9770498643855623, 0.0, nan, 0.9717884160207918, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.11097178683385579, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8948366701791359, 0.7342781222320638, nan, nan, nan, 0.9204954568832505, nan, nan, nan, 0.21091069071711394, 0.8226187352400944, nan, nan, 0.9253521126760563, 0.7923995766342422, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0151 | 93.0 | 1860 | 0.2154 | 0.6289 | 0.7482 | 0.9430 | [0.8714739520342054, 0.9783945886848747, 0.9608049108231622, 0.92599749874584, 0.2569279761501457, 0.9673640860081538, 0.9834985043325204, nan, 0.702122524673873, nan, 0.8532719988128803, 0.8854489164086687, 0.9183117603104587, 0.7667012384845342, 0.4668891130596725, 0.9161873267972334, 0.9757247917589786, 0.5927680404231802, nan, 0.800952039385739, 0.4303062302006336, nan, 0.9268566221568009, nan, 0.0, nan, nan, 0.921710641930058, nan, nan, nan, nan, nan, 0.6819622761134887, nan, nan, 0.5561065344317364, 0.7039697250540624, 0.0, nan, nan, 0.0, nan, 0.21511111111111111, nan, nan, nan, 0.8647029003874425, nan, nan, 0.9702388682977512, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3478802992518703, nan, 0.6812169909267368, 0.8924339907885948, nan, nan, 0.8256313005289949, 0.0, 0.8575607853335067, nan, nan, 0.7751273867517778, 0.842832469775475, nan, nan, 0.8690762632074485, nan, nan, nan, nan, nan, nan, 0.821749076192152, 0.0, nan, 0.9677201137861653, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.13329418672930124, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8792413066385669, 0.6900826446280992, nan, nan, nan, 0.9242664382094667, nan, nan, nan, 0.203893712181005, 0.8247126436781609, 0.0, nan, 0.9062346588119784, 0.7944813993594482, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9608800061972267, 0.9879389415363663, 0.9960341066825302, 0.9674455857764523, 0.2756815703380589, 0.9935807507545806, 0.9915097856917, nan, 0.7404390982278006, nan, 0.9994959239366602, 0.975613296559733, 0.9440479730399445, 0.8416716897729556, 0.47392050914475653, 0.9625189607085188, 0.998378159621956, 0.8653757491931766, nan, 0.8462280399586635, 0.941108545034642, nan, 0.9668762796139222, nan, nan, nan, nan, 0.9607227433894624, nan, nan, nan, nan, nan, 0.8841964652692149, nan, nan, 0.5827187395239691, 0.705349773008667, nan, nan, nan, 0.0, nan, 0.21511111111111111, nan, nan, nan, 0.9691368045128038, nan, nan, 0.9887702222538969, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46153846153846156, nan, 0.7120075489502241, 0.9867441544053335, nan, nan, 0.9120176405733186, nan, 0.873286991062562, nan, nan, 0.8029116640566093, 0.8479582971329279, nan, nan, 0.8781248348395962, nan, nan, nan, nan, nan, nan, 0.9743375756311288, 0.0, nan, 0.9709762379913677, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.14231974921630094, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8792413066385669, 0.7395925597874224, nan, nan, nan, 0.9247814160809189, nan, nan, nan, 0.2045754509458865, 0.8283914982944109, nan, nan, 0.9285714285714286, 0.8126606521848697, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0294 | 94.0 | 1880 | 0.2178 | 0.6154 | 0.7464 | 0.9425 | [0.8710517543965334, 0.9778187501276839, 0.9602541846501775, 0.9267156949404236, 0.24364406779661016, 0.9680432441388451, 0.9833608077477849, nan, 0.7053855615719195, nan, 0.8480549738991948, 0.8827925270403146, 0.9196303623992665, 0.7683200777123429, 0.4567801607746982, 0.9127450073964497, 0.9756935357236326, 0.5940451416680006, nan, 0.8009568132261529, 0.4263322884012539, nan, 0.9261827222747487, nan, 0.0, nan, nan, 0.9213912999849481, nan, nan, nan, nan, nan, 0.6801568429837961, nan, nan, 0.55888, 0.7020805438253167, 0.0, nan, nan, 0.0, 0.0, 0.21066666666666667, nan, nan, nan, 0.8657853217996845, nan, nan, 0.9699774356516704, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3452157598499062, nan, 0.673253863935095, 0.8892991512818269, nan, nan, 0.8287115827410151, 0.0, 0.8388696730405273, nan, nan, 0.773076815525293, 0.8436510221710337, nan, nan, 0.8616681501649301, nan, nan, nan, nan, nan, nan, 0.8206521739130435, 0.0, nan, 0.9675278118278327, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12031615925058547, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8891464699683878, 0.6881188118811881, nan, nan, nan, 0.9223122080815872, nan, nan, nan, 0.21897746207138472, 0.8127615062761506, 0.0, nan, 0.9054187192118227, 0.777898371978293, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9601031395593329, 0.9876500539084948, 0.9963216339480468, 0.9674160579630976, 0.25917848055252635, 0.9935169833779705, 0.990976492630645, nan, 0.7451274563647811, nan, 0.9996349794024091, 0.977427783422848, 0.9444692239072257, 0.8475654678186324, 0.4636809023456576, 0.966115378969516, 0.9982838311947152, 0.8554633471645919, nan, 0.8477781605235962, 0.9422632794457275, nan, 0.9669859608072536, nan, nan, nan, nan, 0.9610890868888279, nan, nan, nan, nan, nan, 0.8876284422523634, nan, nan, 0.585484411666108, 0.7033120099050764, nan, nan, nan, 0.0, nan, 0.21066666666666667, nan, nan, nan, 0.9687512682115174, nan, nan, 0.9891061992852852, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.456575682382134, nan, 0.7008256664307619, 0.9877548640401174, nan, nan, 0.9126791620727673, nan, 0.8509764978483946, nan, nan, 0.8017516385360478, 0.8485375036200405, nan, nan, 0.8697743248242693, nan, nan, nan, nan, nan, nan, 0.9766325891925725, 0.0, nan, 0.9707383858541793, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12884012539184952, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8891464699683878, 0.7387068201948627, nan, nan, nan, 0.9225098577061547, nan, nan, nan, 0.21970963484381875, 0.8155339805825242, nan, nan, 0.9245472837022133, 0.794718008164911, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0313 | 95.0 | 1900 | 0.2158 | 0.6279 | 0.7475 | 0.9422 | [0.8694105024198824, 0.9781126292938322, 0.9606823684298609, 0.926408580241673, 0.2576579018704256, 0.9670893407525392, 0.9836110083840857, nan, 0.7012927582656446, nan, 0.8547794937348574, 0.883655235842859, 0.9228772440562834, 0.7705247760102395, 0.4320464459243391, 0.9132524552281918, 0.9757342904807643, 0.5949204307989069, nan, 0.80098216593435, 0.4369296833064949, nan, 0.9268028257676436, nan, 0.0, nan, nan, 0.9213477508216463, nan, nan, nan, nan, nan, 0.6796151665931851, nan, nan, 0.5609027286309665, 0.703483432455395, 0.0, nan, nan, 0.0, nan, 0.22311111111111112, nan, nan, nan, 0.8673899611092938, nan, nan, 0.9694886806401982, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3625694873378629, nan, 0.6769614512471656, 0.8907231735863004, nan, nan, 0.8250595710881652, 0.0, 0.8492865055059621, nan, nan, 0.7765675185030023, 0.8431203223949338, nan, nan, 0.8667887029288703, nan, nan, nan, nan, nan, nan, 0.8210526315789474, 0.0, nan, 0.9679524982366069, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.07280260432080497, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8839022334597556, 0.6948738812042311, nan, nan, nan, 0.9223650385604113, nan, nan, nan, 0.22005961774504645, 0.8258030817445808, 0.0, nan, 0.9051969741624915, 0.7891179952644041, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9603886546484733, 0.9878409260911957, 0.9959927952938066, 0.9666372618858676, 0.27640857869865504, 0.9937507970922076, 0.9916324902898188, nan, 0.7390191954777436, nan, 0.9995828336027534, 0.9762665118304543, 0.9426355436614134, 0.8466947960618847, 0.43844420989618615, 0.9668982727406175, 0.998411630999364, 0.8531581373905025, nan, 0.8540130899069928, 0.9399538106235565, nan, 0.9688871014916642, nan, nan, nan, nan, 0.9609713336211747, nan, nan, nan, nan, nan, 0.8869913686806412, nan, nan, 0.5978042239356353, 0.7048080891456872, nan, nan, nan, 0.0, nan, 0.22311111111111112, nan, nan, nan, 0.9684874802159004, nan, nan, 0.9879760947251606, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4855252274607113, nan, 0.704269874970512, 0.9836925887772358, nan, nan, 0.9162072767364939, nan, 0.8628930817610063, nan, nan, 0.8063627399802795, 0.8482479003764842, nan, nan, 0.8759050790127372, nan, nan, nan, nan, nan, nan, 0.9764239515960776, 0.0, nan, 0.9712430964867499, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.07711598746081505, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8840885142255005, 0.7564216120460585, nan, nan, nan, 0.9226812960740614, nan, nan, nan, 0.22085349758029035, 0.8297034898976646, nan, nan, 0.9269617706237424, 0.8062597651328058, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0272 | 96.0 | 1920 | 0.2180 | 0.6271 | 0.7463 | 0.9421 | [0.8690994996552112, 0.9785377877097258, 0.9603735840882739, 0.9257033935467444, 0.2458825941365407, 0.9664841409236491, 0.983561759214633, nan, 0.7015574021207178, nan, 0.8435588670998402, 0.8831006322444679, 0.9252186198328989, 0.7698296836982969, 0.43581500807553736, 0.9128774730996182, 0.9757070330042917, 0.5957717374026387, nan, 0.8010837660871528, 0.4396082698585419, nan, 0.9262329485834208, nan, 0.0, nan, nan, 0.9233815383844206, nan, nan, nan, nan, nan, 0.6815559454375899, nan, nan, 0.5648059796437659, 0.7027367987435957, 0.0, nan, nan, 0.0, nan, 0.23377777777777778, nan, nan, nan, 0.8634138410679769, nan, nan, 0.9697610193318112, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3445859872611465, nan, 0.6763081658518921, 0.892414398595259, nan, nan, 0.8257590791823775, 0.0, 0.854797733046707, nan, nan, 0.7736950933273723, 0.8346297896859695, nan, nan, 0.8700172404785539, nan, nan, nan, nan, nan, nan, 0.8213783403656821, 0.0, nan, 0.9677303713811586, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.07949545321208566, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8811380400421497, 0.6819296811120196, nan, nan, nan, 0.9157024085026142, nan, nan, nan, 0.21794422031222593, 0.817398119122257, 0.0, nan, 0.9064952904238619, 0.795357090048795, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.959586335115036, 0.9871496592673603, 0.996404256725494, 0.9661611258955232, 0.26157760814249365, 0.9941971687284785, 0.9920006040841753, nan, 0.7373715724753191, nan, 0.9996871252020649, 0.973218173900421, 0.9411983348201011, 0.8476324425691514, 0.4420832086201736, 0.9651612271859862, 0.9982412312598322, 0.863992623328723, nan, 0.8555287633482604, 0.9330254041570438, nan, 0.9681558935361216, nan, nan, nan, nan, 0.9640590859729691, nan, nan, nan, nan, nan, 0.8861487875051377, nan, nan, 0.5952899765336909, 0.7040600495253817, nan, nan, nan, 0.0, nan, 0.23377777777777778, nan, nan, nan, 0.9698470029625421, nan, nan, 0.9902973905783895, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.44747725392886684, nan, 0.7049304081151215, 0.9878326109351008, nan, nan, 0.9175303197353915, nan, 0.8687189672293942, nan, nan, 0.8029986659706514, 0.8389805965826818, nan, nan, 0.8801331853496115, nan, nan, nan, nan, nan, nan, 0.9747548508241185, 0.0, nan, 0.970947231633174, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.08495297805642633, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8811380400421497, 0.7387068201948627, nan, nan, nan, 0.9157809017658152, nan, nan, nan, 0.21865376154861416, 0.82104434531619, nan, nan, 0.929476861167002, 0.8133158610957109, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0358 | 97.0 | 1940 | 0.2204 | 0.6267 | 0.7467 | 0.9412 | [0.8666664668427115, 0.9788055427724088, 0.9603455526740029, 0.9252411547107029, 0.25039081084754977, 0.9667748671614944, 0.9835737985707801, nan, 0.6955406698564593, nan, 0.8428519088444347, 0.8851007598282128, 0.9211232208018173, 0.7690342982242764, 0.39240997918414267, 0.9127165891688065, 0.9757001341236019, 0.5930763178599527, nan, 0.8038979631678271, 0.4368932038834951, nan, 0.9270811454176293, nan, 0.0, nan, nan, 0.9229207908386999, nan, nan, nan, nan, nan, 0.6800421390273432, nan, nan, 0.5644108280254777, 0.7010325205345418, 0.0, nan, nan, 0.0, nan, 0.24266666666666667, nan, nan, nan, 0.8630572098899115, nan, nan, 0.9703220920035079, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.35709895513214507, nan, 0.6771162706658235, 0.8936211419617234, nan, nan, 0.8263930200277613, 0.0, 0.864488414475397, nan, nan, 0.7753534111862324, 0.829057365234938, nan, nan, 0.8695924764890283, nan, nan, nan, nan, nan, nan, 0.8195198878570177, 0.0, nan, 0.9672023083283701, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.09456264775413711, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8802950474183351, 0.6787181594083813, nan, nan, nan, 0.9149346475251767, nan, nan, nan, 0.2222124824684432, 0.8248956158663883, 0.0, nan, 0.9080954243253813, 0.7813994772917797, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9599393557097484, 0.9875056100945591, 0.9961910899596801, 0.96473271792449, 0.2678298800436205, 0.9939208434298347, 0.9918920577089164, nan, 0.7302184774891833, nan, 0.9996697432688464, 0.9722746407316011, 0.9445187828327882, 0.8469626950639609, 0.3979426266954426, 0.9652590889073739, 0.9983173025721233, 0.8688335638543108, nan, 0.8510850843954529, 0.9353348729792148, nan, 0.968229014331676, nan, nan, nan, nan, 0.9637712446520391, nan, nan, nan, nan, nan, 0.8888203863542951, nan, nan, 0.5941166610794503, 0.7022802311184482, nan, nan, nan, 0.0, nan, 0.24266666666666667, nan, nan, nan, 0.9703745789537762, nan, nan, 0.9912951405503915, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.48056244830438377, nan, 0.7072422741212551, 0.9865109137203832, nan, nan, 0.9189636163175303, nan, 0.8793114862628268, nan, nan, 0.8048547068035496, 0.8328989284679988, nan, nan, 0.8796575233867132, nan, nan, nan, nan, nan, nan, 0.975798038806593, 0.0, nan, 0.9703613031976609, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10031347962382445, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8802950474183351, 0.7316209034543845, nan, nan, nan, 0.9150522887022116, nan, nan, nan, 0.22305323361196655, 0.829441091577014, nan, nan, 0.9344064386317907, 0.7986492616299582, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0094 | 98.0 | 1960 | 0.2193 | 0.6398 | 0.7480 | 0.9418 | [0.8682387005890121, 0.9781146845714402, 0.9602914076473802, 0.9259642141486522, 0.24637088529952975, 0.966499266331866, 0.9833691414020849, nan, 0.7050110706741659, nan, 0.8476130116291066, 0.8831996845011174, 0.9232991092016797, 0.7646122773731563, 0.4101150638985077, 0.9127277772638979, 0.9757173276867074, 0.5931962025316456, nan, 0.8037802907915994, 0.4271488469601677, nan, 0.9273019871256647, nan, 0.0, nan, nan, 0.9230105536937928, nan, nan, nan, nan, nan, 0.67957635654035, nan, nan, 0.566890166028097, 0.7035969000231725, 0.0, nan, nan, 0.0, nan, 0.2391111111111111, nan, nan, nan, 0.8639279969637274, nan, nan, 0.9691012355520128, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.34181141439205953, nan, 0.6764108810393828, 0.8915785405614306, nan, nan, 0.8272277227722772, 0.0, 0.85546722272905, nan, nan, 0.7733079869167762, 0.8361175115207373, nan, nan, 0.867402447954807, nan, nan, nan, nan, nan, nan, 0.8206026629292221, 0.0, nan, 0.9673528391349601, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12101724641917568, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8902002107481559, 0.6872964169381107, nan, nan, nan, 0.9133638973392176, nan, nan, nan, 0.2217741935483871, 0.8224079394097675, 0.0, nan, 0.9046871935673662, 0.8014586310550436, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9596726536303576, 0.9874746578487157, 0.9962951946592636, 0.9653601839582772, 0.2628135223555071, 0.9940483781830549, 0.9914814692459802, nan, 0.7421536977750391, nan, 0.9996002155359719, 0.9752503991871099, 0.942585984735851, 0.8367155582345456, 0.416058854109233, 0.9656505357929246, 0.9983081740146483, 0.8642231443061319, nan, 0.8569410954185326, 0.941108545034642, nan, 0.9690699034805499, nan, nan, nan, nan, 0.9657730502021431, nan, nan, nan, nan, nan, 0.8874434854089601, nan, nan, 0.5952061682869595, 0.7048854725546843, nan, nan, nan, 0.0, nan, 0.2391111111111111, nan, nan, nan, 0.9699687512682115, nan, nan, 0.99020576047892, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4557485525227461, nan, 0.7074309978768577, 0.9864914769966374, nan, nan, 0.9211686879823594, nan, 0.8691161866931479, nan, nan, 0.8022156487442724, 0.8407182160440196, nan, nan, 0.8764335923048465, nan, nan, nan, nan, nan, nan, 0.9772585019820572, 0.0, nan, 0.970517937531907, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.12978056426332288, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8902002107481559, 0.7475642161204605, nan, nan, nan, 0.9136379221669809, nan, nan, nan, 0.22261328640563133, 0.8262923117292049, nan, nan, 0.928169014084507, 0.8197167481477748, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0289 | 99.0 | 1980 | 0.2196 | 0.6274 | 0.7468 | 0.9419 | [0.8687380531611786, 0.9788681985049749, 0.9604636787132215, 0.925547042468748, 0.24836467702371218, 0.966896865437102, 0.9835856103793985, nan, 0.7040263505951775, nan, 0.8438444607483492, 0.8841162131892747, 0.9217962388522684, 0.7696564305259957, 0.4197897091027692, 0.9134047822967061, 0.9757158703300036, 0.5944524151123864, nan, 0.804112343516897, 0.4375, nan, 0.927459145466634, nan, 0.0, nan, nan, 0.9220979125994894, nan, nan, nan, nan, nan, 0.6826550157508984, nan, nan, 0.5629281503903139, 0.7031306317903301, 0.0, nan, nan, 0.0, nan, 0.2328888888888889, nan, nan, nan, 0.8630636485071663, nan, nan, 0.969993821378032, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3376040999359385, nan, 0.6771774485001807, 0.8922410770633196, nan, nan, 0.8274693511412339, 0.0, 0.858379433778067, nan, nan, 0.7732788993904144, 0.8337654854508787, nan, nan, 0.8704419600877651, nan, nan, nan, nan, nan, nan, 0.8219803370786517, 0.0, nan, 0.9675971968452945, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.10560093348891482, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8876712328767123, 0.6876019575856444, nan, nan, nan, 0.9149063798791722, nan, nan, nan, 0.22521258876128694, 0.8215778474399164, 0.0, nan, 0.907843137254902, 0.7788575374901342, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9596294943726967, 0.9876294190779326, 0.9961993522374248, 0.9657588094385655, 0.2649945474372955, 0.9939846108064447, 0.9917693531107975, nan, 0.7415509088717132, nan, 0.9996002155359719, 0.9740165481201916, 0.9425364258102884, 0.8476994173196705, 0.4257943571889916, 0.9654058814894554, 0.998369031064481, 0.859612724757953, nan, 0.8540819841543231, 0.9376443418013857, nan, 0.9689967826849956, nan, nan, nan, nan, 0.9640460022765632, nan, nan, nan, nan, nan, 0.8862309905466502, nan, nan, 0.5922728796513577, 0.704472761040033, nan, nan, nan, 0.0, nan, 0.2328888888888889, nan, nan, nan, 0.9701716651109938, nan, nan, 0.9909795257633296, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.4358974358974359, nan, 0.7072422741212551, 0.9892709284922934, nan, nan, 0.9153252480705623, nan, 0.8731545845746441, nan, nan, 0.8019836436401601, 0.8381117868520127, nan, nan, 0.8806088473125099, nan, nan, nan, nan, nan, nan, 0.9768412267890674, 0.0, nan, 0.9708080011138441, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.11347962382445141, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8876712328767123, 0.7466784765279008, nan, nan, nan, 0.9151808674781416, nan, nan, nan, 0.2260448746150462, 0.8252427184466019, nan, nan, 0.93158953722334, 0.7957764225593468, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 0.0231 | 100.0 | 2000 | 0.2163 | 0.6274 | 0.7469 | 0.9418 | [0.8684705920049987, 0.9788583293272919, 0.9606747391887004, 0.9253111886254581, 0.24568229913304662, 0.9673287812952618, 0.9833414396887159, nan, 0.7001172811177401, nan, 0.8421861956856026, 0.8851052596845509, 0.9197815739241791, 0.7674147657591366, 0.4244409937888199, 0.9139732741480556, 0.975721673551077, 0.595, nan, 0.802955985057658, 0.43452699091394975, nan, 0.9271645968165122, nan, 0.0, nan, nan, 0.9220460101548298, nan, nan, nan, nan, nan, 0.6805356890237193, nan, nan, 0.5647439438175649, 0.7019540200293489, 0.0, nan, nan, 0.0, nan, 0.24444444444444444, nan, nan, nan, 0.8638079583837581, nan, nan, 0.9697141888726062, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.3530150753768844, nan, 0.6765223671737463, 0.8920254678485999, nan, nan, 0.8264184572729973, 0.0, 0.8616266197825097, nan, nan, 0.7762981574539364, 0.8315092165898618, nan, nan, 0.8692742104162309, nan, nan, nan, nan, nan, nan, 0.824181626187962, 0.0, nan, 0.9686018546354324, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.09251101321585903, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8853530031612223, 0.6819296811120196, nan, nan, nan, 0.9152956298200514, nan, nan, nan, 0.22230014025245443, 0.8228541612314114, 0.0, nan, 0.9069607843137255, 0.7778216258879243, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan] | [0.9598165178225602, 0.9873972772341073, 0.9962357062595016, 0.9644300578376044, 0.26165030897855324, 0.9937295413000042, 0.9923120849870924, nan, 0.7356703682370433, nan, 0.9995828336027534, 0.9734359123239947, 0.9432798096937258, 0.8381890027459648, 0.43059909576395344, 0.9671918579047806, 0.998369031064481, 0.8503918856615952, nan, 0.851498449879435, 0.9387990762124712, nan, 0.9689602222872185, nan, nan, nan, nan, 0.9622666195653596, nan, nan, nan, nan, nan, 0.8897451705713111, nan, nan, 0.5998156218571907, 0.7033120099050764, nan, nan, nan, 0.0, nan, 0.24444444444444444, nan, nan, nan, 0.9703948703380545, nan, nan, 0.990348296189206, nan, 0.0, nan, nan, nan, nan, nan, nan, 0.46484698097601324, nan, 0.707100731304553, 0.9857723182180411, nan, nan, 0.9201764057331864, nan, 0.8760013240648792, nan, nan, 0.8064207412563077, 0.8360845641471184, nan, nan, 0.8786004968024946, nan, nan, nan, nan, nan, nan, 0.9770498643855623, 0.0, nan, 0.9719508516266766, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0987460815047022, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.8853530031612223, 0.7387068201948627, nan, nan, nan, 0.9156094633979085, nan, nan, nan, 0.2231412230532336, 0.8276043033324587, nan, nan, 0.9306841046277666, 0.794718008164911, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
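The many `nan` entries in the per-category columns mark classes that do not occur in the validation split, so they are excluded from the summary statistics. As a sketch (assuming the usual 🤗 `evaluate`-style `mean_iou` reduction, and using a truncated, illustrative slice of the final row rather than the full per-class vector), the *Mean IoU* column is a NaN-ignoring average over the per-category IoU list:

```python
import numpy as np

# Illustrative per-category IoU values from the final row above (truncated);
# `nan` marks categories that are absent from the validation set.
per_category_iou = np.array([0.8685, 0.9789, 0.9607, np.nan, 0.0, np.nan])

# Mean IoU is the NaN-ignoring average over categories.
mean_iou = np.nanmean(per_category_iou)
print(round(float(mean_iou), 4))
```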
### Framework versions
- Transformers 4.28.0
- Pytorch 2.0.0+cu118
- Datasets 2.12.0
- Tokenizers 0.13.3
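A minimal sketch (assuming these packages are installed locally) for checking that an environment matches the versions listed above before attempting to reproduce the run:

```python
# Verify the local environment against the pinned framework versions.
import datasets
import tokenizers
import torch
import transformers

assert transformers.__version__.startswith("4.28"), transformers.__version__
assert torch.__version__.startswith("2.0.0"), torch.__version__  # trained with 2.0.0+cu118
assert datasets.__version__.startswith("2.12"), datasets.__version__
assert tokenizers.__version__.startswith("0.13"), tokenizers.__version__
```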
|
asenella/mmnist_MVTCAEconfig2_seed_3_ratio_05_c
|
asenella
| 2023-05-13T02:45:16Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-13T02:45:08Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`.
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
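Filling the placeholder with this entry's own repo id gives, for example (the same substitution applies to the sibling seed/ratio entries below):

```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(
...     hf_hub_path="asenella/mmnist_MVTCAEconfig2_seed_3_ratio_05_c"
... )
```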
|
asenella/mmnist_MVTCAEconfig2_seed_2_ratio_05_c
|
asenella
| 2023-05-13T02:36:19Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-13T02:36:10Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`.
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
asenella/mmnist_MVTCAEconfig2_seed_1_ratio_05_c
|
asenella
| 2023-05-13T02:35:32Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-13T02:35:20Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`.
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|
asenella/mmnist_MVTCAEconfig2_seed_3_ratio_02_c
|
asenella
| 2023-05-13T02:26:41Z | 0 | 0 | null |
[
"multivae",
"en",
"license:apache-2.0",
"region:us"
] | null | 2023-05-13T02:26:32Z |
---
language: en
tags:
- multivae
license: apache-2.0
---
### Downloading this model from the Hub
This model was trained with multivae. It can be downloaded or reloaded using the method `load_from_hf_hub`.
```python
>>> from multivae.models import AutoModel
>>> model = AutoModel.load_from_hf_hub(hf_hub_path="your_hf_username/repo_name")
```
|