Dataset columns (min/max summary from the dataset viewer):

| column | dtype | min | max |
|:--|:--|:--|:--|
| modelId | string | 5 chars | 139 chars |
| author | string | 2 chars | 42 chars |
| last_modified | timestamp[us, tz=UTC] | 2020-02-15 11:33:14 | 2025-09-14 06:27:15 |
| downloads | int64 | 0 | 223M |
| likes | int64 | 0 | 11.7k |
| library_name | string (558 classes) | — | — |
| tags | list | 1 item | 4.05k items |
| pipeline_tag | string (55 classes) | — | — |
| createdAt | timestamp[us, tz=UTC] | 2022-03-02 23:29:04 | 2025-09-14 06:24:19 |
| card | string | 11 chars | 1.01M chars |
VinayReddyPulyala/graphcorevqa
VinayReddyPulyala
2023-09-22T07:05:16Z
62
0
transformers
[ "transformers", "pytorch", "vilt", "visual-question-answering", "generated_from_trainer", "base_model:dandelin/vilt-b32-mlm", "base_model:finetune:dandelin/vilt-b32-mlm", "license:apache-2.0", "endpoints_compatible", "region:us" ]
visual-question-answering
2023-09-20T07:34:13Z
--- license: apache-2.0 base_model: dandelin/vilt-b32-mlm tags: - generated_from_trainer model-index: - name: graphcorevqa results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # graphcorevqa This model is a fine-tuned version of [dandelin/vilt-b32-mlm](https://huggingface.co/dandelin/vilt-b32-mlm) on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 20 ### Training results ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
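The card above lists `lr_scheduler_type: linear` with `learning_rate: 5e-05`. As a hedged illustration (not the card author's code), the `transformers` "linear" schedule optionally warms up to the base rate and then decays linearly to zero over training; a minimal standalone sketch:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 5e-05, warmup_steps: int = 0) -> float:
    """Learning rate at `step` under a linear warmup + linear-decay schedule.

    Mirrors the shape of the "linear" scheduler; warmup_steps=0 is an
    assumption, since the card does not list a warmup setting.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    # After warmup, decay linearly so the rate reaches 0 at the final step.
    return base_lr * max(0.0, (total_steps - step) / max(1.0, total_steps - warmup_steps))
```

For example, halfway through training the rate is half the base rate, and it hits zero exactly at the last step.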
Johnkimmyshinmmy/train_lora
Johnkimmyshinmmy
2023-09-22T07:00:10Z
0
0
diffusers
[ "diffusers", "tensorboard", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "lora", "base_model:runwayml/stable-diffusion-v1-5", "base_model:adapter:runwayml/stable-diffusion-v1-5", "license:creativeml-openrail-m", "region:us" ]
text-to-image
2023-09-22T06:50:49Z
--- license: creativeml-openrail-m base_model: runwayml/stable-diffusion-v1-5 instance_prompt: a photo of aszx interior tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers - lora inference: true --- # LoRA DreamBooth - Johnkimmyshinmmy/train_lora These are LoRA adaption weights for runwayml/stable-diffusion-v1-5. The weights were trained on a photo of aszx interior using [DreamBooth](https://dreambooth.github.io/). You can find some example images in the following. ![img_0](./image_0.png) ![img_1](./image_1.png) ![img_2](./image_2.png) ![img_3](./image_3.png) LoRA for the text encoder was enabled: False.
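For readers unfamiliar with what "LoRA adaption weights" are: LoRA leaves the base weight matrix W frozen and learns a low-rank update, so the effective weight is W + (alpha / r) · B · A. A minimal numpy sketch of the merge step (illustrative only; the rank and scaling below are assumptions, not this repo's actual settings):

```python
import numpy as np

def apply_lora(W: np.ndarray, A: np.ndarray, B: np.ndarray, alpha: float) -> np.ndarray:
    """Merge a LoRA update into a frozen base weight: W + (alpha / r) * B @ A."""
    r = A.shape[0]  # LoRA rank
    return W + (alpha / r) * (B @ A)

rng = np.random.default_rng(0)
d_out, d_in, rank = 8, 16, 4
W = rng.standard_normal((d_out, d_in))
A = rng.standard_normal((rank, d_in))   # down-projection
B = np.zeros((d_out, rank))             # up-projection, initialised to zero
# With B = 0 the merged weight equals the base weight, which is why
# LoRA training starts from the base model's behaviour.
merged = apply_lora(W, A, B, alpha=8.0)
```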
CyberHarem/mannen_ranko_akibameidosensou
CyberHarem
2023-09-22T06:59:20Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/mannen_ranko_akibameidosensou", "license:mit", "region:us" ]
text-to-image
2023-09-22T06:44:42Z
--- license: mit datasets: - CyberHarem/mannen_ranko_akibameidosensou pipeline_tag: text-to-image tags: - art --- # Lora of mannen_ranko_akibameidosensou This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion). And the auto-training framework is maintained by [DeepGHS Team](https://huggingface.co/deepghs). The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11). After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora. For example, if you want to use the model from step 8060, you need to download `8060/mannen_ranko_akibameidosensou.pt` as the embedding and `8060/mannen_ranko_akibameidosensou.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters. **The best step we recommend is 8060**, with the score of 0.944. The trigger words are: 1. `mannen_ranko_akibameidosensou` 2. `short_hair, purple_eyes, black_hair, maid_headdress, maid, bow, bangs, animal_ears, purple_hair, pink_bow, apron, lipstick, pig_ears, fake_animal_ears` For the following groups, it is not recommended to use this model and we express regret: 1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail. 2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits. 3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm. 4. 
Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters. 5. Individuals who finds the generated image content offensive to their values. These are available steps: | Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | pattern_5 | pattern_6 | pattern_7 | pattern_8 | pattern_9 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata | |:---------|:----------|:-------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------| | 9300 | 0.937 | [Download](9300/mannen_ranko_akibameidosensou.zip) | ![pattern_1-9300](9300/previews/pattern_1.png) | ![pattern_2-9300](9300/previews/pattern_2.png) | ![pattern_3-9300](9300/previews/pattern_3.png) | ![pattern_4-9300](9300/previews/pattern_4.png) | ![pattern_5-9300](9300/previews/pattern_5.png) | ![pattern_6-9300](9300/previews/pattern_6.png) | ![pattern_7-9300](9300/previews/pattern_7.png) | ![pattern_8-9300](9300/previews/pattern_8.png) | ![pattern_9-9300](9300/previews/pattern_9.png) | 
![bikini-9300](9300/previews/bikini.png) | [<NSFW, click to see>](9300/previews/bondage.png) | ![free-9300](9300/previews/free.png) | ![maid-9300](9300/previews/maid.png) | ![miko-9300](9300/previews/miko.png) | [<NSFW, click to see>](9300/previews/nude.png) | [<NSFW, click to see>](9300/previews/nude2.png) | ![suit-9300](9300/previews/suit.png) | ![yukata-9300](9300/previews/yukata.png) | | 8680 | 0.942 | [Download](8680/mannen_ranko_akibameidosensou.zip) | ![pattern_1-8680](8680/previews/pattern_1.png) | ![pattern_2-8680](8680/previews/pattern_2.png) | ![pattern_3-8680](8680/previews/pattern_3.png) | ![pattern_4-8680](8680/previews/pattern_4.png) | ![pattern_5-8680](8680/previews/pattern_5.png) | ![pattern_6-8680](8680/previews/pattern_6.png) | ![pattern_7-8680](8680/previews/pattern_7.png) | ![pattern_8-8680](8680/previews/pattern_8.png) | ![pattern_9-8680](8680/previews/pattern_9.png) | ![bikini-8680](8680/previews/bikini.png) | [<NSFW, click to see>](8680/previews/bondage.png) | ![free-8680](8680/previews/free.png) | ![maid-8680](8680/previews/maid.png) | ![miko-8680](8680/previews/miko.png) | [<NSFW, click to see>](8680/previews/nude.png) | [<NSFW, click to see>](8680/previews/nude2.png) | ![suit-8680](8680/previews/suit.png) | ![yukata-8680](8680/previews/yukata.png) | | **8060** | **0.944** | [**Download**](8060/mannen_ranko_akibameidosensou.zip) | ![pattern_1-8060](8060/previews/pattern_1.png) | ![pattern_2-8060](8060/previews/pattern_2.png) | ![pattern_3-8060](8060/previews/pattern_3.png) | ![pattern_4-8060](8060/previews/pattern_4.png) | ![pattern_5-8060](8060/previews/pattern_5.png) | ![pattern_6-8060](8060/previews/pattern_6.png) | ![pattern_7-8060](8060/previews/pattern_7.png) | ![pattern_8-8060](8060/previews/pattern_8.png) | ![pattern_9-8060](8060/previews/pattern_9.png) | ![bikini-8060](8060/previews/bikini.png) | [<NSFW, click to see>](8060/previews/bondage.png) | ![free-8060](8060/previews/free.png) | ![maid-8060](8060/previews/maid.png) | 
![miko-8060](8060/previews/miko.png) | [<NSFW, click to see>](8060/previews/nude.png) | [<NSFW, click to see>](8060/previews/nude2.png) | ![suit-8060](8060/previews/suit.png) | ![yukata-8060](8060/previews/yukata.png) | | 7440 | 0.889 | [Download](7440/mannen_ranko_akibameidosensou.zip) | ![pattern_1-7440](7440/previews/pattern_1.png) | ![pattern_2-7440](7440/previews/pattern_2.png) | ![pattern_3-7440](7440/previews/pattern_3.png) | ![pattern_4-7440](7440/previews/pattern_4.png) | ![pattern_5-7440](7440/previews/pattern_5.png) | ![pattern_6-7440](7440/previews/pattern_6.png) | ![pattern_7-7440](7440/previews/pattern_7.png) | ![pattern_8-7440](7440/previews/pattern_8.png) | ![pattern_9-7440](7440/previews/pattern_9.png) | ![bikini-7440](7440/previews/bikini.png) | [<NSFW, click to see>](7440/previews/bondage.png) | ![free-7440](7440/previews/free.png) | ![maid-7440](7440/previews/maid.png) | ![miko-7440](7440/previews/miko.png) | [<NSFW, click to see>](7440/previews/nude.png) | [<NSFW, click to see>](7440/previews/nude2.png) | ![suit-7440](7440/previews/suit.png) | ![yukata-7440](7440/previews/yukata.png) | | 6820 | 0.880 | [Download](6820/mannen_ranko_akibameidosensou.zip) | ![pattern_1-6820](6820/previews/pattern_1.png) | ![pattern_2-6820](6820/previews/pattern_2.png) | ![pattern_3-6820](6820/previews/pattern_3.png) | ![pattern_4-6820](6820/previews/pattern_4.png) | ![pattern_5-6820](6820/previews/pattern_5.png) | ![pattern_6-6820](6820/previews/pattern_6.png) | ![pattern_7-6820](6820/previews/pattern_7.png) | ![pattern_8-6820](6820/previews/pattern_8.png) | ![pattern_9-6820](6820/previews/pattern_9.png) | ![bikini-6820](6820/previews/bikini.png) | [<NSFW, click to see>](6820/previews/bondage.png) | ![free-6820](6820/previews/free.png) | ![maid-6820](6820/previews/maid.png) | ![miko-6820](6820/previews/miko.png) | [<NSFW, click to see>](6820/previews/nude.png) | [<NSFW, click to see>](6820/previews/nude2.png) | ![suit-6820](6820/previews/suit.png) | 
![yukata-6820](6820/previews/yukata.png) | | 6200 | 0.944 | [Download](6200/mannen_ranko_akibameidosensou.zip) | ![pattern_1-6200](6200/previews/pattern_1.png) | ![pattern_2-6200](6200/previews/pattern_2.png) | ![pattern_3-6200](6200/previews/pattern_3.png) | ![pattern_4-6200](6200/previews/pattern_4.png) | ![pattern_5-6200](6200/previews/pattern_5.png) | ![pattern_6-6200](6200/previews/pattern_6.png) | ![pattern_7-6200](6200/previews/pattern_7.png) | ![pattern_8-6200](6200/previews/pattern_8.png) | ![pattern_9-6200](6200/previews/pattern_9.png) | ![bikini-6200](6200/previews/bikini.png) | [<NSFW, click to see>](6200/previews/bondage.png) | ![free-6200](6200/previews/free.png) | ![maid-6200](6200/previews/maid.png) | ![miko-6200](6200/previews/miko.png) | [<NSFW, click to see>](6200/previews/nude.png) | [<NSFW, click to see>](6200/previews/nude2.png) | ![suit-6200](6200/previews/suit.png) | ![yukata-6200](6200/previews/yukata.png) | | 5580 | 0.937 | [Download](5580/mannen_ranko_akibameidosensou.zip) | ![pattern_1-5580](5580/previews/pattern_1.png) | ![pattern_2-5580](5580/previews/pattern_2.png) | ![pattern_3-5580](5580/previews/pattern_3.png) | ![pattern_4-5580](5580/previews/pattern_4.png) | ![pattern_5-5580](5580/previews/pattern_5.png) | ![pattern_6-5580](5580/previews/pattern_6.png) | ![pattern_7-5580](5580/previews/pattern_7.png) | ![pattern_8-5580](5580/previews/pattern_8.png) | ![pattern_9-5580](5580/previews/pattern_9.png) | ![bikini-5580](5580/previews/bikini.png) | [<NSFW, click to see>](5580/previews/bondage.png) | ![free-5580](5580/previews/free.png) | ![maid-5580](5580/previews/maid.png) | ![miko-5580](5580/previews/miko.png) | [<NSFW, click to see>](5580/previews/nude.png) | [<NSFW, click to see>](5580/previews/nude2.png) | ![suit-5580](5580/previews/suit.png) | ![yukata-5580](5580/previews/yukata.png) | | 4960 | 0.901 | [Download](4960/mannen_ranko_akibameidosensou.zip) | ![pattern_1-4960](4960/previews/pattern_1.png) | 
![pattern_2-4960](4960/previews/pattern_2.png) | ![pattern_3-4960](4960/previews/pattern_3.png) | ![pattern_4-4960](4960/previews/pattern_4.png) | ![pattern_5-4960](4960/previews/pattern_5.png) | ![pattern_6-4960](4960/previews/pattern_6.png) | ![pattern_7-4960](4960/previews/pattern_7.png) | ![pattern_8-4960](4960/previews/pattern_8.png) | ![pattern_9-4960](4960/previews/pattern_9.png) | ![bikini-4960](4960/previews/bikini.png) | [<NSFW, click to see>](4960/previews/bondage.png) | ![free-4960](4960/previews/free.png) | ![maid-4960](4960/previews/maid.png) | ![miko-4960](4960/previews/miko.png) | [<NSFW, click to see>](4960/previews/nude.png) | [<NSFW, click to see>](4960/previews/nude2.png) | ![suit-4960](4960/previews/suit.png) | ![yukata-4960](4960/previews/yukata.png) | | 4340 | 0.806 | [Download](4340/mannen_ranko_akibameidosensou.zip) | ![pattern_1-4340](4340/previews/pattern_1.png) | ![pattern_2-4340](4340/previews/pattern_2.png) | ![pattern_3-4340](4340/previews/pattern_3.png) | ![pattern_4-4340](4340/previews/pattern_4.png) | ![pattern_5-4340](4340/previews/pattern_5.png) | ![pattern_6-4340](4340/previews/pattern_6.png) | ![pattern_7-4340](4340/previews/pattern_7.png) | ![pattern_8-4340](4340/previews/pattern_8.png) | ![pattern_9-4340](4340/previews/pattern_9.png) | ![bikini-4340](4340/previews/bikini.png) | [<NSFW, click to see>](4340/previews/bondage.png) | ![free-4340](4340/previews/free.png) | ![maid-4340](4340/previews/maid.png) | ![miko-4340](4340/previews/miko.png) | [<NSFW, click to see>](4340/previews/nude.png) | [<NSFW, click to see>](4340/previews/nude2.png) | ![suit-4340](4340/previews/suit.png) | ![yukata-4340](4340/previews/yukata.png) | | 3720 | 0.828 | [Download](3720/mannen_ranko_akibameidosensou.zip) | ![pattern_1-3720](3720/previews/pattern_1.png) | ![pattern_2-3720](3720/previews/pattern_2.png) | ![pattern_3-3720](3720/previews/pattern_3.png) | ![pattern_4-3720](3720/previews/pattern_4.png) | 
![pattern_5-3720](3720/previews/pattern_5.png) | ![pattern_6-3720](3720/previews/pattern_6.png) | ![pattern_7-3720](3720/previews/pattern_7.png) | ![pattern_8-3720](3720/previews/pattern_8.png) | ![pattern_9-3720](3720/previews/pattern_9.png) | ![bikini-3720](3720/previews/bikini.png) | [<NSFW, click to see>](3720/previews/bondage.png) | ![free-3720](3720/previews/free.png) | ![maid-3720](3720/previews/maid.png) | ![miko-3720](3720/previews/miko.png) | [<NSFW, click to see>](3720/previews/nude.png) | [<NSFW, click to see>](3720/previews/nude2.png) | ![suit-3720](3720/previews/suit.png) | ![yukata-3720](3720/previews/yukata.png) | | 3100 | 0.914 | [Download](3100/mannen_ranko_akibameidosensou.zip) | ![pattern_1-3100](3100/previews/pattern_1.png) | ![pattern_2-3100](3100/previews/pattern_2.png) | ![pattern_3-3100](3100/previews/pattern_3.png) | ![pattern_4-3100](3100/previews/pattern_4.png) | ![pattern_5-3100](3100/previews/pattern_5.png) | ![pattern_6-3100](3100/previews/pattern_6.png) | ![pattern_7-3100](3100/previews/pattern_7.png) | ![pattern_8-3100](3100/previews/pattern_8.png) | ![pattern_9-3100](3100/previews/pattern_9.png) | ![bikini-3100](3100/previews/bikini.png) | [<NSFW, click to see>](3100/previews/bondage.png) | ![free-3100](3100/previews/free.png) | ![maid-3100](3100/previews/maid.png) | ![miko-3100](3100/previews/miko.png) | [<NSFW, click to see>](3100/previews/nude.png) | [<NSFW, click to see>](3100/previews/nude2.png) | ![suit-3100](3100/previews/suit.png) | ![yukata-3100](3100/previews/yukata.png) | | 2480 | 0.901 | [Download](2480/mannen_ranko_akibameidosensou.zip) | ![pattern_1-2480](2480/previews/pattern_1.png) | ![pattern_2-2480](2480/previews/pattern_2.png) | ![pattern_3-2480](2480/previews/pattern_3.png) | ![pattern_4-2480](2480/previews/pattern_4.png) | ![pattern_5-2480](2480/previews/pattern_5.png) | ![pattern_6-2480](2480/previews/pattern_6.png) | ![pattern_7-2480](2480/previews/pattern_7.png) | 
![pattern_8-2480](2480/previews/pattern_8.png) | ![pattern_9-2480](2480/previews/pattern_9.png) | ![bikini-2480](2480/previews/bikini.png) | [<NSFW, click to see>](2480/previews/bondage.png) | ![free-2480](2480/previews/free.png) | ![maid-2480](2480/previews/maid.png) | ![miko-2480](2480/previews/miko.png) | [<NSFW, click to see>](2480/previews/nude.png) | [<NSFW, click to see>](2480/previews/nude2.png) | ![suit-2480](2480/previews/suit.png) | ![yukata-2480](2480/previews/yukata.png) | | 1860 | 0.853 | [Download](1860/mannen_ranko_akibameidosensou.zip) | ![pattern_1-1860](1860/previews/pattern_1.png) | ![pattern_2-1860](1860/previews/pattern_2.png) | ![pattern_3-1860](1860/previews/pattern_3.png) | ![pattern_4-1860](1860/previews/pattern_4.png) | ![pattern_5-1860](1860/previews/pattern_5.png) | ![pattern_6-1860](1860/previews/pattern_6.png) | ![pattern_7-1860](1860/previews/pattern_7.png) | ![pattern_8-1860](1860/previews/pattern_8.png) | ![pattern_9-1860](1860/previews/pattern_9.png) | ![bikini-1860](1860/previews/bikini.png) | [<NSFW, click to see>](1860/previews/bondage.png) | ![free-1860](1860/previews/free.png) | ![maid-1860](1860/previews/maid.png) | ![miko-1860](1860/previews/miko.png) | [<NSFW, click to see>](1860/previews/nude.png) | [<NSFW, click to see>](1860/previews/nude2.png) | ![suit-1860](1860/previews/suit.png) | ![yukata-1860](1860/previews/yukata.png) | | 1240 | 0.828 | [Download](1240/mannen_ranko_akibameidosensou.zip) | ![pattern_1-1240](1240/previews/pattern_1.png) | ![pattern_2-1240](1240/previews/pattern_2.png) | ![pattern_3-1240](1240/previews/pattern_3.png) | ![pattern_4-1240](1240/previews/pattern_4.png) | ![pattern_5-1240](1240/previews/pattern_5.png) | ![pattern_6-1240](1240/previews/pattern_6.png) | ![pattern_7-1240](1240/previews/pattern_7.png) | ![pattern_8-1240](1240/previews/pattern_8.png) | ![pattern_9-1240](1240/previews/pattern_9.png) | ![bikini-1240](1240/previews/bikini.png) | [<NSFW, click to see>](1240/previews/bondage.png) 
| ![free-1240](1240/previews/free.png) | ![maid-1240](1240/previews/maid.png) | ![miko-1240](1240/previews/miko.png) | [<NSFW, click to see>](1240/previews/nude.png) | [<NSFW, click to see>](1240/previews/nude2.png) | ![suit-1240](1240/previews/suit.png) | ![yukata-1240](1240/previews/yukata.png) | | 620 | 0.782 | [Download](620/mannen_ranko_akibameidosensou.zip) | ![pattern_1-620](620/previews/pattern_1.png) | ![pattern_2-620](620/previews/pattern_2.png) | ![pattern_3-620](620/previews/pattern_3.png) | ![pattern_4-620](620/previews/pattern_4.png) | ![pattern_5-620](620/previews/pattern_5.png) | ![pattern_6-620](620/previews/pattern_6.png) | ![pattern_7-620](620/previews/pattern_7.png) | ![pattern_8-620](620/previews/pattern_8.png) | ![pattern_9-620](620/previews/pattern_9.png) | ![bikini-620](620/previews/bikini.png) | [<NSFW, click to see>](620/previews/bondage.png) | ![free-620](620/previews/free.png) | ![maid-620](620/previews/maid.png) | ![miko-620](620/previews/miko.png) | [<NSFW, click to see>](620/previews/nude.png) | [<NSFW, click to see>](620/previews/nude2.png) | ![suit-620](620/previews/suit.png) | ![yukata-620](620/previews/yukata.png) |
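A note on the step table above: the "best step we recommend" is simply the step whose score is highest. A trivial sketch over the scores transcribed from the table (note 8060 and 6200 tie at 0.944; `max` keeps the first one encountered, which matches the card's pick):

```python
# Per-step scores transcribed from the table above.
scores = {
    9300: 0.937, 8680: 0.942, 8060: 0.944, 7440: 0.889, 6820: 0.880,
    6200: 0.944, 5580: 0.937, 4960: 0.901, 4340: 0.806, 3720: 0.828,
    3100: 0.914, 2480: 0.901, 1860: 0.853, 1240: 0.828, 620: 0.782,
}
# max() over a dict iterates keys in insertion order, so ties resolve to
# the earlier (higher-step) entry: 8060 here, as the card recommends.
best_step = max(scores, key=scores.get)
```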
diana9m/gpt2-lora-kde4
diana9m
2023-09-22T06:54:38Z
5
0
peft
[ "peft", "pytorch", "base_model:openai-community/gpt2", "base_model:adapter:openai-community/gpt2", "region:us" ]
null
2023-08-07T07:27:01Z
--- library_name: peft base_model: gpt2 --- ## Training procedure ### Framework versions - PEFT 0.4.0
Vasanth/deci-finetuned-alpaca-cleaned
Vasanth
2023-09-22T06:40:06Z
15
0
transformers
[ "transformers", "pytorch", "llama", "text-generation", "generated_from_trainer", "custom_code", "base_model:Deci/DeciLM-6b-instruct", "base_model:finetune:Deci/DeciLM-6b-instruct", "license:other", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-22T05:21:10Z
--- license: other base_model: Deci/DeciLM-6b-instruct tags: - generated_from_trainer model-index: - name: deci-finetuned-alpaca-cleaned results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # deci-finetuned-alpaca-cleaned This model is a fine-tuned version of [Deci/DeciLM-6b-instruct](https://huggingface.co/Deci/DeciLM-6b-instruct) on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 1 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - training_steps: 1000 ### Training results ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
Johnkimmyshinmmy/lora_train
Johnkimmyshinmmy
2023-09-22T06:26:33Z
1
0
diffusers
[ "diffusers", "tensorboard", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "lora", "base_model:runwayml/stable-diffusion-v1-5", "base_model:adapter:runwayml/stable-diffusion-v1-5", "license:creativeml-openrail-m", "region:us" ]
text-to-image
2023-09-22T06:09:14Z
--- license: creativeml-openrail-m base_model: runwayml/stable-diffusion-v1-5 instance_prompt: a photo of aszx cup tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers - lora inference: true --- # LoRA DreamBooth - Johnkimmyshinmmy/lora_train These are LoRA adaption weights for runwayml/stable-diffusion-v1-5. The weights were trained on a photo of aszx cup using [DreamBooth](https://dreambooth.github.io/). You can find some example images in the following. ![img_0](./image_0.png) ![img_1](./image_1.png) ![img_2](./image_2.png) ![img_3](./image_3.png) LoRA for the text encoder was enabled: False.
lash/phi-1_5-finetuned-bon
lash
2023-09-22T06:08:43Z
18
0
transformers
[ "transformers", "pytorch", "mixformer-sequential", "text-generation", "generated_from_trainer", "custom_code", "base_model:microsoft/phi-1_5", "base_model:finetune:microsoft/phi-1_5", "license:other", "autotrain_compatible", "region:us" ]
text-generation
2023-09-21T06:44:12Z
--- license: other base_model: microsoft/phi-1_5 tags: - generated_from_trainer model-index: - name: phi-1_5-finetuned-bon results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # phi-1_5-finetuned-bon This model is a fine-tuned version of [microsoft/phi-1_5](https://huggingface.co/microsoft/phi-1_5) on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - training_steps: 33500 ### Training results ### Framework versions - Transformers 4.33.1 - Pytorch 2.1.0.dev20230629 - Datasets 2.14.5 - Tokenizers 0.13.3
fastbond/llama-2-7b-finetune-GEM_viggo
fastbond
2023-09-22T05:59:00Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-22T05:58:45Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: float16 ### Framework versions - PEFT 0.5.0
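The flag list above is the serialized form of a `bitsandbytes` quantization config. In current `transformers` the same settings would be expressed as a `BitsAndBytesConfig` object (a sketch of the equivalent, not code from this repo):

```python
import torch
from transformers import BitsAndBytesConfig

# Equivalent of the serialized config above: 4-bit NF4 quantization with
# double quantization and fp16 compute.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)
# Typically passed as quantization_config= to AutoModel*.from_pretrained(...).
```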
kla-20/Peft-Flan-t5-qa-model
kla-20
2023-09-22T05:50:38Z
0
0
null
[ "pytorch", "tensorboard", "generated_from_trainer", "license:apache-2.0", "region:us" ]
null
2023-09-22T04:35:09Z
--- license: apache-2.0 tags: - generated_from_trainer model-index: - name: Peft-Flan-t5-qa-model results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Peft-Flan-t5-qa-model This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - training_steps: 1 ### Framework versions - Transformers 4.27.2 - Pytorch 1.13.1+cu117 - Datasets 2.11.0 - Tokenizers 0.13.3
vasimakram01/ludwig_llm3
vasimakram01
2023-09-22T05:44:16Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-22T05:41:25Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: float16 ### Framework versions - PEFT 0.5.0
Duxiaoman-DI/XuanYuan-176B
Duxiaoman-DI
2023-09-22T05:24:14Z
0
0
null
[ "license:bigscience-bloom-rail-1.0", "region:us" ]
null
2023-09-22T05:24:14Z
--- license: bigscience-bloom-rail-1.0 ---
LizzyBennet/SG_ko_en_translate
LizzyBennet
2023-09-22T05:14:42Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-21T07:54:45Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: bfloat16 ### Framework versions - PEFT 0.5.0
anhtu77/wav2vec2-base-vi-vlsp2020-demo
anhtu77
2023-09-22T05:01:57Z
108
0
transformers
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "generated_from_trainer", "base_model:nguyenvulebinh/wav2vec2-base-vi-vlsp2020", "base_model:finetune:nguyenvulebinh/wav2vec2-base-vi-vlsp2020", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2023-09-22T04:59:51Z
--- license: cc-by-nc-4.0 base_model: nguyenvulebinh/wav2vec2-base-vi-vlsp2020 tags: - generated_from_trainer metrics: - wer model-index: - name: wav2vec2-base-vi-vlsp2020-demo results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # wav2vec2-base-vi-vlsp2020-demo This model is a fine-tuned version of [nguyenvulebinh/wav2vec2-base-vi-vlsp2020](https://huggingface.co/nguyenvulebinh/wav2vec2-base-vi-vlsp2020) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2509 - Wer: 0.1280 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:------:| | 0.8445 | 1.0 | 787 | 0.3264 | 0.1494 | | 0.5248 | 2.0 | 1574 | 0.2784 | 0.1365 | | 0.445 | 3.0 | 2361 | 0.2509 | 0.1280 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
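The card above reports Wer 0.1280. For reference, word error rate is the word-level edit distance (substitutions + deletions + insertions) between reference and hypothesis, divided by the number of reference words. A minimal self-contained sketch (not the evaluation code used for this model):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    r, h = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between r[:i] and h[:j]
    dp = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        dp[i][0] = i
    for j in range(len(h) + 1):
        dp[0][j] = j
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            cost = 0 if r[i - 1] == h[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution/match
    return dp[len(r)][len(h)] / max(1, len(r))
```

For example, one inserted word against a two-word reference gives a WER of 0.5.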
DesktopDreamers/MovieLLM
DesktopDreamers
2023-09-22T05:00:57Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-18T03:11:20Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: bfloat16 ### Framework versions - PEFT 0.5.0
deeprasanth/mldpo
deeprasanth
2023-09-22T04:55:41Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-22T04:45:33Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: fp4 - bnb_4bit_use_double_quant: False - bnb_4bit_compute_dtype: float32 ### Framework versions - PEFT 0.5.0
Megalino111/ppo-LunarLander-v2
Megalino111
2023-09-22T04:42:48Z
2
1
stable-baselines3
[ "stable-baselines3", "LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
reinforcement-learning
2023-09-22T04:42:24Z
--- library_name: stable-baselines3 tags: - LunarLander-v2 - deep-reinforcement-learning - reinforcement-learning - stable-baselines3 model-index: - name: PPO results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: LunarLander-v2 type: LunarLander-v2 metrics: - type: mean_reward value: 256.48 +/- 22.26 name: mean_reward verified: false --- # **PPO** Agent playing **LunarLander-v2** This is a trained model of a **PPO** agent playing **LunarLander-v2** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3). ## Usage (with Stable-baselines3) TODO: Add your code ```python from stable_baselines3 import ... from huggingface_sb3 import load_from_hub ... ```
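The `mean_reward: 256.48 +/- 22.26` entry in the model-index above is, in the usual Stable-Baselines3 `evaluate_policy` convention, the mean and standard deviation of per-episode returns over the evaluation episodes. A hedged sketch of the statistic itself (the episode returns below are made up for illustration):

```python
import statistics

def mean_reward(episode_returns):
    """Mean and (population) standard deviation of per-episode returns."""
    return statistics.mean(episode_returns), statistics.pstdev(episode_returns)

mu, sigma = mean_reward([250.0, 260.0, 270.0])
```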
Vasanth/deci-finetuned-alpaca
Vasanth
2023-09-22T04:39:18Z
0
0
null
[ "generated_from_trainer", "base_model:Deci/DeciLM-6b-instruct", "base_model:finetune:Deci/DeciLM-6b-instruct", "license:other", "region:us" ]
null
2023-09-22T04:12:53Z
--- license: other base_model: Deci/DeciLM-6b-instruct tags: - generated_from_trainer model-index: - name: deci-finetuned-alpaca results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # deci-finetuned-alpaca This model is a fine-tuned version of [Deci/DeciLM-6b-instruct](https://huggingface.co/Deci/DeciLM-6b-instruct) on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 1 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - training_steps: 1000 ### Training results ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
AmelieSchreiber/esm2_t12_35M_lora_binding_sites_v2_cp3
AmelieSchreiber
2023-09-22T04:36:48Z
40
6
peft
[ "peft", "ESM-2", "biology", "protein language model", "binding sites", "token-classification", "en", "dataset:AmelieSchreiber/binding_sites_random_split_by_family_550K", "license:mit", "region:us" ]
token-classification
2023-09-14T10:13:28Z
--- library_name: peft license: mit datasets: - AmelieSchreiber/binding_sites_random_split_by_family_550K language: - en metrics: - accuracy - precision - recall - f1 - roc_auc - matthews_correlation pipeline_tag: token-classification tags: - ESM-2 - biology - protein language model - binding sites --- # ESM-2 for Binding Site Prediction **This model may be overfit to some extent (see below).** Try running [this notebook](https://huggingface.co/AmelieSchreiber/esm2_t12_35M_lora_binding_sites_v2_cp3/blob/main/testing_esmb.ipynb) on the datasets linked to in the notebook. See if you can figure out why the metrics differ so much on the datasets. Is it due to something like sequence similarity in the train/test split? Is there something fundamentally flawed with the method? Splitting the sequences based on family in UniProt seemed to help, but perhaps a more rigorous approach is necessary? This model *seems* close to SOTA compared to [these SOTA structural models](https://www.biorxiv.org/content/10.1101/2023.08.11.553028v1). Note the especially high metrics below based on the performance on the train/test split. However, initial testing on a couple of these datasets doesn't appear nearly as promising. If you would like to check the data preprocessing step, please see [this notebook](https://huggingface.co/AmelieSchreiber/esm2_t12_35M_lora_binding_sites_v2_cp3/blob/main/data_preprocessing_notebook_v1.ipynb). One of the primary goals in training this model is to prove the viability of using simple, single-sequence-only (no MSA) protein language models for binary token classification tasks like predicting binding and active sites of protein sequences based on sequence alone. This project is also an attempt to make deep learning techniques like LoRA more accessible and to showcase the competitive or even superior performance of simple models and techniques. This, however, may not be as viable as other methods. 
The model seems to show good performance on the train/test split, but testing based on [this notebook](https://huggingface.co/AmelieSchreiber/esm2_t12_35M_lora_binding_sites_v2_cp3/blob/main/testing_esmb.ipynb) seems to indicate otherwise. The other potentially important finding is that Low Rank Adaptation (LoRA) helps dramatically reduce overfitting of the models. We initially finetuned without LoRA and found overfitting to be a serious issue. However, after using LoRA, we found the overfitting was reduced quite a lot without any other modification. Due to the simplicity of LoRA, this may prove an important regularization technique for learning on proteins in the future. Keep in mind, though, that this did not really solve the overfitting problem despite the improvements (the finetuned model without LoRA was *very* overfit). Since most proteins still do not have a predicted 3D fold or backbone structure, it is useful to have a model that can predict binding residues from sequence alone. We also hope that this project will be helpful in this regard. It has been shown that pLMs like ESM-2 contain structural information in the attention maps that recapitulate the contact maps of proteins, and that single-sequence masked language models like ESMFold can be used in atomically accurate predictions of folds, even outperforming AlphaFold2. In our approach, we show that scaling the model size and data in a 1-to-1 fashion provides what appears to be performance comparable to SOTA, although our comparison to the SOTA models is not fair or comprehensive. Using the notebook linked above should help further evaluate the model, but initial findings seem pretty poor. This model is a finetuned version of the 35M parameter `esm2_t12_35M_UR50D` ([see here](https://huggingface.co/facebook/esm2_t12_35M_UR50D) and [here](https://huggingface.co/docs/transformers/model_doc/esm) for more details). 
The model was finetuned with LoRA for the binary token classification task of predicting binding sites (and active sites) of protein sequences based on sequence alone. Note, this model has a high recall, meaning it is likely to detect binding sites, but it has a precision score that is somewhat lower than the SOTA structural models mentioned above, meaning the model may return some false positives as well. This may be undesirable for various reasons, one of which could be cost of lab experiments where a higher precision might be more desirable. ## Overfitting Issues ```python Train: ({'accuracy': 0.9908574638195745, 'precision': 0.7748830511095647, 'recall': 0.9862043939282111, 'f1': 0.8678649909611492, 'auc': 0.9886039823329382, 'mcc': 0.8699396085712834}, Test: {'accuracy': 0.9486280975482552, 'precision': 0.40980984516603186, 'recall': 0.827004864790918, 'f1': 0.5480444772577421, 'auc': 0.890196425388581, 'mcc': 0.560633448203768}) ``` Let's analyze the train and test metrics one by one: ### **1. Accuracy** - **Train**: 99.09% - **Test**: 94.86% The accuracy is notably high in both training and test datasets, indicating that the model makes correct predictions a significant majority of the time. The high accuracy on the test dataset signifies good generalization capabilities. ### **2. Precision** - **Train**: 77.49% - **Test**: 41.00% While the precision is quite good in the training dataset, it sees a decrease in the test dataset. This suggests that a substantial proportion of the instances that the model predicts as positive are actually negative, which could potentially lead to a higher false-positive rate. ### **3. Recall** - **Train**: 98.62% - **Test**: 82.70% The recall is impressive in both the training and test datasets, indicating that the model is able to identify a large proportion of actual positive instances correctly. 
A high recall in the test dataset suggests that the model maintains its sensitivity in identifying positive cases when generalized to unseen data. ### **4. F1-Score** - **Train**: 86.79% - **Test**: 54.80% The F1-score, which is the harmonic mean of precision and recall, is good in the training dataset but sees a decrease in the test dataset. The decrease in the F1-score from training to testing suggests a worsened balance between precision and recall in the unseen data, largely due to a decrease in precision. ### **5. AUC (Area Under the ROC Curve)** - **Train**: 98.86% - **Test**: 89.02% The AUC is quite high in both the training and test datasets, indicating that the model has a good capability to distinguish between the positive and negative classes. A high AUC in the test dataset further suggests that the model generalizes well to unseen data. ### **6. MCC (Matthews Correlation Coefficient)** - **Train**: 86.99% - **Test**: 56.06% The MCC, a balanced metric which takes into account true and false positives and negatives, is good in the training set but decreases in the test set. This suggests a diminished quality of binary classifications on the test dataset compared to the training dataset. ### **Overall Analysis** - **Generalization**: The metrics reveal that the model has a good generalization capability, as indicated by the high accuracy, recall, and AUC on the test dataset. - **Precision-Recall Trade-off**: The model maintains a high recall but experiences a dip in precision in the test dataset, leading to a lower F1-score. It indicates a tendency to predict more false positives, which might require tuning to balance precision and recall optimally. - **Improvement Suggestions**: - **Precision Improvement**: Focus on strategies to improve precision, such as feature engineering or experimenting with different classification thresholds. 
- **Hyperparameter Tuning**: Engaging in hyperparameter tuning might assist in enhancing the model's performance on unseen data. - **Complexity Reduction**: Consider reducing the model's complexity by training a LoRA for different weight matrices to prevent potential overfitting and improve generalization. - **Class Imbalance**: If the dataset has a class imbalance, techniques such as resampling or utilizing class weights might be beneficial. So, the model performs well on the training dataset and maintains a reasonably good performance on the test dataset, demonstrating a good generalization capability. However, the decrease in certain metrics like precision and F1-score in the test dataset compared to the training dataset indicates room for improvement to optimize the model further for unseen data. It would be advantageous to enhance precision without significantly compromising recall to achieve a more harmonious balance between the two. ## Running Inference You can download and run [this notebook](https://huggingface.co/AmelieSchreiber/esm2_t12_35M_lora_binding_sites_v2_cp3/blob/main/testing_and_inference.ipynb) to test out any of the ESMB models. Be sure to download the datasets linked to in the notebook. Note, if you would like to run the models on the train/test split to get the metrics, you may need to do so locally or in a Colab Pro instance as the datasets are quite large and will not run in a standard Colab (you can still run inference on your own protein sequences though). ## Training procedure This model was finetuned with LoRA on ~549K protein sequences from the UniProt database. The dataset can be found [here](https://huggingface.co/datasets/AmelieSchreiber/binding_sites_random_split_by_family_550K). 
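As a sketch of how the precision/recall trade-off discussed above fits together, the harmonic-mean relationship behind F1 can be checked directly against this card's reported test metrics with plain Python (no extra dependencies; the constants are copied from the metrics above):

```python
# Test-set precision and recall as reported in this card
precision = 0.40980984516603186
recall = 0.827004864790918

# F1 is the harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)

print(f"F1 = {f1:.6f}")  # ≈ 0.548044, matching the reported F1
```

This makes concrete why the test F1 drops so much relative to training: recall stays high, so the lower F1 is driven almost entirely by the drop in precision.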
The model obtains the following test metrics, also shown above: ```python Epoch: 3 Training Loss: 0.029100 Validation Loss: 0.291670 Accuracy: 0.948626 Precision: 0.409795 Recall: 0.826979 F1: 0.548025 Auc: 0.890183 Mcc: 0.560612 ``` ### Framework versions - PEFT 0.5.0 ## Using the model To use the model on one of your protein sequences try running the following: ```python !pip install transformers -q !pip install peft -q ``` ```python from transformers import AutoModelForTokenClassification, AutoTokenizer from peft import PeftModel import torch # Path to the saved LoRA model model_path = "AmelieSchreiber/esm2_t12_35M_lora_binding_sites_v2_cp3" # ESM2 base model base_model_path = "facebook/esm2_t12_35M_UR50D" # Load the model base_model = AutoModelForTokenClassification.from_pretrained(base_model_path) loaded_model = PeftModel.from_pretrained(base_model, model_path) # Ensure the model is in evaluation mode loaded_model.eval() # Load the tokenizer loaded_tokenizer = AutoTokenizer.from_pretrained(base_model_path) # Protein sequence for inference protein_sequence = "MAVPETRPNHTIYINNLNEKIKKDELKKSLHAIFSRFGQILDILVSRSLKMRGQAFVIFKEVSSATNALRSMQGFPFYDKPMRIQYAKTDSDIIAKMKGT" # Replace with your actual sequence # Tokenize the sequence inputs = loaded_tokenizer(protein_sequence, return_tensors="pt", truncation=True, max_length=1024, padding='max_length') # Run the model with torch.no_grad(): logits = loaded_model(**inputs).logits # Get predictions tokens = loaded_tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]) # Convert input ids back to tokens predictions = torch.argmax(logits, dim=2) # Define labels id2label = { 0: "No binding site", 1: "Binding site" } # Print the predicted labels for each token for token, prediction in zip(tokens, predictions[0].numpy()): if token not in ['<pad>', '<cls>', '<eos>']: print((token, id2label[prediction])) ```
oelu/test
oelu
2023-09-22T04:18:23Z
0
0
diffusers
[ "diffusers", "law", "en", "de", "dataset:allenai/dolma", "license:bigscience-bloom-rail-1.0", "region:us" ]
null
2023-09-22T04:10:32Z
--- license: bigscience-bloom-rail-1.0 datasets: - allenai/dolma language: - en - de metrics: - accuracy library_name: diffusers tags: - law ---
Johnkimmyshinmmy/dogresult
Johnkimmyshinmmy
2023-09-22T04:03:38Z
3
0
diffusers
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "lora", "base_model:runwayml/stable-diffusion-v1-5", "base_model:adapter:runwayml/stable-diffusion-v1-5", "license:creativeml-openrail-m", "region:us" ]
text-to-image
2023-09-22T03:54:58Z
--- license: creativeml-openrail-m base_model: runwayml/stable-diffusion-v1-5 instance_prompt: a photo of sks dog tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers - lora inference: true --- # LoRA DreamBooth - Johnkimmyshinmmy/dogresult These are LoRA adaptation weights for runwayml/stable-diffusion-v1-5. The weights were trained on a photo of sks dog using [DreamBooth](https://dreambooth.github.io/). You can find some example images in the following. ![img_0](./image_0.png) ![img_1](./image_1.png) ![img_2](./image_2.png) ![img_3](./image_3.png) LoRA for the text encoder was enabled: False.
yamiletzii/NLPmodelo
yamiletzii
2023-09-22T03:56:14Z
120
0
transformers
[ "transformers", "pytorch", "tensorboard", "roberta", "text-classification", "generated_from_trainer", "dataset:glue", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-21T22:35:04Z
--- license: apache-2.0 tags: - generated_from_trainer datasets: - glue metrics: - accuracy - f1 model-index: - name: NLPmodelo results: - task: name: Text Classification type: text-classification dataset: name: glue type: glue config: mrpc split: validation args: mrpc metrics: - name: Accuracy type: accuracy value: 0.8333333333333334 - name: F1 type: f1 value: 0.8763636363636363 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # NLPmodelo This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 0.6317 - Accuracy: 0.8333 - F1: 0.8764 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.5022 | 1.09 | 500 | 0.4258 | 0.8382 | 0.8809 | | 0.3347 | 2.18 | 1000 | 0.6317 | 0.8333 | 0.8764 | ### Framework versions - Transformers 4.28.1 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
yashika0998/vit-base-patch16-224-finetuned-flower
yashika0998
2023-09-22T03:43:20Z
170
0
transformers
[ "transformers", "pytorch", "vit", "image-classification", "generated_from_trainer", "dataset:imagefolder", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2023-09-21T17:37:33Z
--- license: apache-2.0 tags: - generated_from_trainer datasets: - imagefolder model-index: - name: vit-base-patch16-224-finetuned-flower results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 2.0.1+cu118 - Datasets 2.7.1 - Tokenizers 0.13.3
LTC-AI-Labs/L2-7B-Guanaco-Vicuna
LTC-AI-Labs
2023-09-22T03:36:28Z
80
0
transformers
[ "transformers", "pytorch", "llama", "text-generation", "en", "dataset:rishiraj/vicuna-unfiltered-guanaco", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-21T15:29:16Z
--- datasets: - rishiraj/vicuna-unfiltered-guanaco language: - en --- Llama 2 7b fine-tuned on the vicuna-unfiltered-guanaco dataset ![image/png](https://cdn-uploads.huggingface.co/production/uploads/638ee0b22cc490759feb38e9/DjIde-i2QNvv6R773R9zh.png)
jonas-luehrs/bert-base-cased-MLM-chemistry-tokenCLS-CATALYST
jonas-luehrs
2023-09-22T03:34:03Z
105
0
transformers
[ "transformers", "pytorch", "bert", "token-classification", "generated_from_trainer", "base_model:jonas-luehrs/bert-base-cased-MLM-chemistry", "base_model:finetune:jonas-luehrs/bert-base-cased-MLM-chemistry", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2023-09-22T03:21:48Z
--- license: apache-2.0 base_model: jonas-luehrs/bert-base-cased-MLM-chemistry tags: - generated_from_trainer metrics: - precision - recall - f1 - accuracy model-index: - name: bert-base-cased-MLM-chemistry-tokenCLS-CATALYST results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-cased-MLM-chemistry-tokenCLS-CATALYST This model is a fine-tuned version of [jonas-luehrs/bert-base-cased-MLM-chemistry](https://huggingface.co/jonas-luehrs/bert-base-cased-MLM-chemistry) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0705 - Precision: 0.6134 - Recall: 0.8232 - F1: 0.7030 - Accuracy: 0.9775 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | 0.0945 | 1.0 | 1114 | 0.0880 | 0.5498 | 0.6406 | 0.5917 | 0.9700 | | 0.0622 | 2.0 | 2228 | 0.0816 | 0.6515 | 0.8725 | 0.7460 | 0.9773 | | 0.0459 | 3.0 | 3342 | 0.0705 | 0.6134 | 0.8232 | 0.7030 | 0.9775 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
chgenly/a2c-PandaReachDense-v3
chgenly
2023-09-22T03:33:24Z
0
0
stable-baselines3
[ "stable-baselines3", "PandaReachDense-v3", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
reinforcement-learning
2023-09-22T03:27:09Z
--- library_name: stable-baselines3 tags: - PandaReachDense-v3 - deep-reinforcement-learning - reinforcement-learning - stable-baselines3 model-index: - name: A2C results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: PandaReachDense-v3 type: PandaReachDense-v3 metrics: - type: mean_reward value: -0.18 +/- 0.10 name: mean_reward verified: false --- # **A2C** Agent playing **PandaReachDense-v3** This is a trained model of a **A2C** agent playing **PandaReachDense-v3** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3). ## Usage (with Stable-baselines3) TODO: Add your code ```python from stable_baselines3 import ... from huggingface_sb3 import load_from_hub ... ```
jonas-luehrs/bert-base-cased-MLM-chemistry-tokenCLS-BATTERY
jonas-luehrs
2023-09-22T03:29:48Z
105
0
transformers
[ "transformers", "pytorch", "bert", "token-classification", "generated_from_trainer", "base_model:jonas-luehrs/bert-base-cased-MLM-chemistry", "base_model:finetune:jonas-luehrs/bert-base-cased-MLM-chemistry", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2023-09-22T03:23:41Z
--- license: apache-2.0 base_model: jonas-luehrs/bert-base-cased-MLM-chemistry tags: - generated_from_trainer metrics: - precision - recall - f1 - accuracy model-index: - name: bert-base-cased-MLM-chemistry-tokenCLS-BATTERY results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-cased-MLM-chemistry-tokenCLS-BATTERY This model is a fine-tuned version of [jonas-luehrs/bert-base-cased-MLM-chemistry](https://huggingface.co/jonas-luehrs/bert-base-cased-MLM-chemistry) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0704 - Precision: 0.7487 - Recall: 0.8681 - F1: 0.8040 - Accuracy: 0.9787 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | No log | 1.0 | 338 | 0.0852 | 0.6765 | 0.7699 | 0.7202 | 0.9719 | | 0.1476 | 2.0 | 676 | 0.0677 | 0.7154 | 0.8252 | 0.7664 | 0.9773 | | 0.0524 | 3.0 | 1014 | 0.0704 | 0.7487 | 0.8681 | 0.8040 | 0.9787 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
jonas-luehrs/chembert_cased-MLM-chemistry-textCLS-PETROCHEMICAL
jonas-luehrs
2023-09-22T03:19:43Z
103
0
transformers
[ "transformers", "pytorch", "bert", "text-classification", "generated_from_trainer", "base_model:jonas-luehrs/chembert_cased-MLM-chemistry", "base_model:finetune:jonas-luehrs/chembert_cased-MLM-chemistry", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-22T03:06:44Z
--- base_model: jonas-luehrs/chembert_cased-MLM-chemistry tags: - generated_from_trainer metrics: - f1 - precision - recall - accuracy model-index: - name: chembert_cased-MLM-chemistry-textCLS-PETROCHEMICAL results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chembert_cased-MLM-chemistry-textCLS-PETROCHEMICAL This model is a fine-tuned version of [jonas-luehrs/chembert_cased-MLM-chemistry](https://huggingface.co/jonas-luehrs/chembert_cased-MLM-chemistry) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.5883 - F1: 0.7662 - Precision: 0.7616 - Recall: 0.7838 - Accuracy: 0.7838 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | Precision | Recall | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------:|:---------:|:------:|:--------:| | 1.1285 | 1.0 | 125 | 0.7318 | 0.7165 | 0.7123 | 0.7297 | 0.7297 | | 0.6049 | 2.0 | 250 | 0.6196 | 0.7524 | 0.7483 | 0.7703 | 0.7703 | | 0.4449 | 3.0 | 375 | 0.5883 | 0.7662 | 0.7616 | 0.7838 | 0.7838 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
yejeekang/qlora-koalpaca-polyglot-5.8b-150step
yejeekang
2023-09-22T03:12:35Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-22T03:12:34Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: bfloat16 ### Framework versions - PEFT 0.6.0.dev0
jonas-luehrs/chembert_cased-MLM-chemistry-tokenCLS-BATTERY
jonas-luehrs
2023-09-22T02:52:07Z
106
0
transformers
[ "transformers", "pytorch", "bert", "token-classification", "generated_from_trainer", "base_model:jonas-luehrs/chembert_cased-MLM-chemistry", "base_model:finetune:jonas-luehrs/chembert_cased-MLM-chemistry", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2023-09-22T02:48:33Z
--- base_model: jonas-luehrs/chembert_cased-MLM-chemistry tags: - generated_from_trainer metrics: - precision - recall - f1 - accuracy model-index: - name: chembert_cased-MLM-chemistry-tokenCLS-BATTERY results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chembert_cased-MLM-chemistry-tokenCLS-BATTERY This model is a fine-tuned version of [jonas-luehrs/chembert_cased-MLM-chemistry](https://huggingface.co/jonas-luehrs/chembert_cased-MLM-chemistry) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0659 - Precision: 0.6898 - Recall: 0.8528 - F1: 0.7627 - Accuracy: 0.9771 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | No log | 1.0 | 338 | 0.0783 | 0.6913 | 0.7761 | 0.7312 | 0.9728 | | 0.1439 | 2.0 | 676 | 0.0633 | 0.6970 | 0.8466 | 0.7645 | 0.9762 | | 0.0491 | 3.0 | 1014 | 0.0659 | 0.6898 | 0.8528 | 0.7627 | 0.9771 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
hipnologo/GPT-Neox-20b-QLoRA-FineTune-english_quotes_dataset
hipnologo
2023-09-22T02:52:03Z
36
0
peft
[ "peft", "text-generation-inference", "text-generation", "en", "dataset:Abirate/english_quotes", "base_model:EleutherAI/gpt-neox-20b", "base_model:adapter:EleutherAI/gpt-neox-20b", "license:apache-2.0", "region:us" ]
text-generation
2023-06-19T19:25:15Z
--- language: - en license: apache-2.0 library_name: peft tags: - text-generation-inference datasets: - Abirate/english_quotes pipeline_tag: text-generation base_model: EleutherAI/gpt-neox-20b --- # hipnologo/GPT-Neox-20b-QLoRA-FineTune-english_quotes_dataset ## Training procedure The following `bitsandbytes` quantization config was used during training: - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: bfloat16 ## Model description This model is a fine-tuned version of the `EleutherAI/gpt-neox-20b` model using the QLoRA library and the PEFT library. #### How to use The code below performs the following steps: 1. Imports the necessary libraries: `torch` and classes from the `transformers` library. 2. Specifies the `model_id` as "hipnologo/GPT-Neox-20b-QLoRA-FineTune-english_quotes_dataset". 3. Defines a `BitsAndBytesConfig` object named `bnb_config` with the following configuration: - `load_in_4bit` set to `True` - `bnb_4bit_use_double_quant` set to `True` - `bnb_4bit_quant_type` set to "nf4" - `bnb_4bit_compute_dtype` set to `torch.bfloat16` 4. Initializes an `AutoTokenizer` object named `tokenizer` by loading the tokenizer for the specified `model_id`. 5. Initializes an `AutoModelForCausalLM` object named `model` by loading the pre-trained model for the specified `model_id` and providing the `quantization_config` as `bnb_config`. The model is loaded on device `cuda:0`. 6. Defines a variable `text` with the value "Twenty years from now". 7. Defines a variable `device` with the value "cuda:0", representing the device on which the model will be executed. 8. Encodes the `text` using the `tokenizer` and converts it to a PyTorch tensor, assigning it to the `inputs` variable. The tensor is moved to the specified `device`. 9. 
Generates text using the `model.generate` method by passing the `inputs` tensor and setting the `max_new_tokens` parameter to 20. The generated output is assigned to the `outputs` variable. 10. Decodes the `outputs` tensor using the `tokenizer` to obtain the generated text without special tokens, and assigns it to the `generated_text` variable. 11. Prints the `generated_text`. ```python import torch from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig # Load the base pre-trained model base_model_id = "EleutherAI/gpt-neox-20b" tokenizer = AutoTokenizer.from_pretrained(base_model_id) model = AutoModelForCausalLM.from_pretrained(base_model_id) # Fine-tuning model model_id = "hipnologo/GPT-Neox-20b-QLoRA-FineTune-english_quotes_dataset" bnb_config = BitsAndBytesConfig( load_in_4bit=True, bnb_4bit_use_double_quant=True, bnb_4bit_quant_type="nf4", bnb_4bit_compute_dtype=torch.bfloat16 ) # Load the fine-tuned model model = AutoModelForCausalLM.from_pretrained(model_id, quantization_config=bnb_config, device_map={"":0}) text = "Twenty years from now" device = "cuda:0" inputs = tokenizer(text, return_tensors="pt").to(device) outputs = model.generate(**inputs, max_new_tokens=20) generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True) print(generated_text) ``` ### Framework versions - PEFT 0.4.0.dev0 ## Training procedure - Trainable params: 8650752 - all params: 10597552128 - trainable%: 0.08162971878329976 ## License This model is licensed under Apache 2.0. Please see the [LICENSE](https://www.apache.org/licenses/LICENSE-2.0) for more information.
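As a quick sanity check, the trainable-parameter percentage listed in this card follows directly from the reported counts — a minimal pure-Python sketch:

```python
# Parameter counts as reported in this card (trainable LoRA params vs. all params)
trainable_params = 8_650_752
all_params = 10_597_552_128

# Percentage of parameters updated during QLoRA finetuning
trainable_pct = 100 * trainable_params / all_params

print(f"trainable%: {trainable_pct:.6f}")  # ≈ 0.081630
```

This is the usual appeal of LoRA-style finetuning: well under 0.1% of the 10.6B total parameters are actually trained.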
hipnologo/llama-2-7b-hf-finetune-oa-guanaco-small
hipnologo
2023-09-22T02:51:40Z
9
0
peft
[ "peft", "pytorch", "llama", "llama-2", "text-generation", "en", "dataset:mychen76/small_openassistant-guanaco", "base_model:meta-llama/Llama-2-7b-hf", "base_model:adapter:meta-llama/Llama-2-7b-hf", "8-bit", "region:us" ]
text-generation
2023-08-15T06:01:00Z
--- language: - en library_name: peft tags: - llama-2 datasets: - mychen76/small_openassistant-guanaco pipeline_tag: text-generation base_model: meta-llama/Llama-2-7b-hf --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: fp4 - bnb_4bit_use_double_quant: False - bnb_4bit_compute_dtype: float32 ### Framework versions - PEFT 0.4.0
hipnologo/falcon-7b-qlora-finetune-chatbot
hipnologo
2023-09-22T02:51:15Z
8
1
peft
[ "peft", "text-generation-inference", "text-generation", "en", "dataset:hipnologo/Ecommerce-FAQ-Chatbot-Dataset", "base_model:tiiuae/falcon-7b", "base_model:adapter:tiiuae/falcon-7b", "license:apache-2.0", "region:us" ]
text-generation
2023-06-19T21:30:40Z
---
language:
- en
license: apache-2.0
library_name: peft
tags:
- text-generation-inference
datasets:
- hipnologo/Ecommerce-FAQ-Chatbot-Dataset
pipeline_tag: text-generation
base_model: tiiuae/falcon-7b
---

# Falcon 7B LLM Fine Tune Model

## Model description

This model is a fine-tuned version of the `tiiuae/falcon-7b` model using QLoRA fine-tuning and the PEFT library.

## Intended uses & limitations

#### How to use

- The model and tokenizer are loaded using the `from_pretrained` methods.
- The padding token of the tokenizer is set to be the same as the end-of-sentence (EOS) token.
- The `generation_config` is used to set parameters for generating responses, such as the maximum number of new tokens to generate and the temperature for the softmax function.
- The prompt is defined, encoded using the tokenizer, and passed to the `model.generate` method to generate a response.
- The generated response is decoded using the tokenizer and printed.

```python
# Import necessary classes and functions
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftConfig, PeftModel

# Specify the model
PEFT_MODEL = "hipnologo/falcon-7b-qlora-finetune-chatbot"

# Load the PEFT config
config = PeftConfig.from_pretrained(PEFT_MODEL)

# Quantization settings, matching the bitsandbytes config used during training (see below)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

# Load the base model and tokenizer
model = AutoModelForCausalLM.from_pretrained(
    config.base_model_name_or_path,
    return_dict=True,
    quantization_config=bnb_config,
    device_map="auto",
    trust_remote_code=True,
)
tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path)

# Set the padding token to be the same as the EOS token
tokenizer.pad_token = tokenizer.eos_token

# Load the PEFT model
model = PeftModel.from_pretrained(model, PEFT_MODEL)

# Set the generation parameters
generation_config = model.generation_config
generation_config.max_new_tokens = 200
generation_config.temperature = 0.7
generation_config.top_p = 0.7
generation_config.num_return_sequences = 1
generation_config.pad_token_id = tokenizer.eos_token_id
generation_config.eos_token_id = tokenizer.eos_token_id # Define the prompt prompt = """ <human>: How can I create an account? <assistant>: """.strip() print(prompt) # Encode the prompt encoding = tokenizer(prompt, return_tensors="pt").to(model.device) # Generate a response with torch.inference_mode(): outputs = model.generate( input_ids=encoding.input_ids, attention_mask=encoding.attention_mask, generation_config=generation_config, ) # Print the generated response print(tokenizer.decode(outputs[0],skip_special_tokens=True)) ``` ## Training procedure The model was fine-tuned on the [Ecommerce-FAQ-Chatbot-Dataset](https://kaggle.com/datasets/saadmakhdoom/ecommerce-faq-chatbot-dataset) using the `bitsandbytes` quantization config: - load_in_8bit: `False` - load_in_4bit: `True` - llm_int8_threshold: `6.0` - llm_int8_skip_modules: `None` - llm_int8_enable_fp32_cpu_offload: `False` - llm_int8_has_fp16_weight: `False` - bnb_4bit_quant_type: `nf4` - bnb_4bit_use_double_quant: `True` - bnb_4bit_compute_dtype: `bfloat16` ### Framework versions - PEFT 0.4.0.dev0 ## Evaluation results The model was trained for 80 steps, with the training loss decreasing from 0.184 to nearly 0. The final training loss was `0.03094411873175886`. - Trainable params: 2359296 - All params: 3611104128 - Trainable%: 0.06533447711203746 ## License This model is licensed under Apache 2.0. Please see the [LICENSE](https://www.apache.org/licenses/LICENSE-2.0) for more information.
jonas-luehrs/chembert_cased-MLM-chemistry-textCLS-RHEOLOGY
jonas-luehrs
2023-09-22T02:47:55Z
105
0
transformers
[ "transformers", "pytorch", "bert", "text-classification", "generated_from_trainer", "base_model:jonas-luehrs/chembert_cased-MLM-chemistry", "base_model:finetune:jonas-luehrs/chembert_cased-MLM-chemistry", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-22T02:46:07Z
--- base_model: jonas-luehrs/chembert_cased-MLM-chemistry tags: - generated_from_trainer metrics: - f1 - precision - recall - accuracy model-index: - name: chembert_cased-MLM-chemistry-textCLS-RHEOLOGY results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chembert_cased-MLM-chemistry-textCLS-RHEOLOGY This model is a fine-tuned version of [jonas-luehrs/chembert_cased-MLM-chemistry](https://huggingface.co/jonas-luehrs/chembert_cased-MLM-chemistry) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.5925 - F1: 0.7527 - Precision: 0.7836 - Recall: 0.7716 - Accuracy: 0.7716 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | Precision | Recall | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------:|:---------:|:------:|:--------:| | 1.1553 | 1.0 | 46 | 0.8494 | 0.6731 | 0.6580 | 0.7099 | 0.7099 | | 0.7613 | 2.0 | 92 | 0.6545 | 0.7297 | 0.7155 | 0.7593 | 0.7593 | | 0.5792 | 3.0 | 138 | 0.5925 | 0.7527 | 0.7836 | 0.7716 | 0.7716 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
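As a usage sketch (not part of the original card), the fine-tuned checkpoint can be loaded with the `transformers` text-classification pipeline. The example sentence is illustrative, and the label set comes from the undocumented RHEOLOGY dataset, so inspect `model.config.id2label` for the actual classes:

```python
from transformers import pipeline

def classify(text: str):
    # Downloads the checkpoint from the Hub on first use.
    clf = pipeline(
        "text-classification",
        model="jonas-luehrs/chembert_cased-MLM-chemistry-textCLS-RHEOLOGY",
    )
    return clf(text)

if __name__ == "__main__":
    # Illustrative rheology-flavoured input sentence.
    print(classify("The storage modulus increases sharply above the gel point."))
```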
INo0121/whisper-base-ko-callvoice
INo0121
2023-09-22T02:35:51Z
81
2
transformers
[ "transformers", "pytorch", "whisper", "automatic-speech-recognition", "hf-asr-leaderboard", "generated_from_trainer", "ko", "dataset:INo0121/low_quality_call_voice", "base_model:openai/whisper-base", "base_model:finetune:openai/whisper-base", "license:apache-2.0", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2023-09-07T01:50:46Z
---
language:
- ko
license: apache-2.0
base_model: openai/whisper-base
tags:
- hf-asr-leaderboard
- generated_from_trainer
datasets:
- INo0121/low_quality_call_voice
model-index:
- name: Whisper Base for Korean Low Quality Call Voices
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# Whisper Base for Korean Low Quality Call Voices

This model is a fine-tuned version of [openai/whisper-base](https://huggingface.co/openai/whisper-base) on the Korean Low Quality Call Voices dataset. It achieves the following results on the evaluation set:
- Loss: 0.4941
- Cer: 30.7538

## Model description

프로젝트 용도로 파인튜닝된 모델입니다. OpenAI의 Whisper-Base 모델을 바탕으로 '한국어 저음질 음성 통화 데이터'에 대한 정확도를 증가시키고자 파인튜닝을 진행한 모델이며, 사용한 데이터는 AI-HUB의 ‘저음질 전화망 음성인식 데이터’ 중 일부로서 오디오 파일 기준 240,771.06초(파일 1개당 평균 길이는 약 5.296초) 텍스트 데이터 기준 총 1,696,414글자의 크기입니다.

This is a fine-tuned model for project use. It was fine-tuned on OpenAI's Whisper-Base model to increase accuracy on Korean low-quality voice call data. The data used is part of AI-HUB's 'low-quality telephone network voice recognition data': 240,771.06 seconds of audio (average length per file about 5.296 seconds) and a total of 1,696,414 characters of transcript text.

## Intended uses & limitations

파인튜닝에 사용된 Base model과 dataset 모두 학습 목적으로 사용하였으며, 따라서 본 모델 역시 학습 목적으로만 사용 가능합니다.

Both the base model and the dataset used for fine-tuning were used for learning purposes, so this model may likewise be used for learning purposes only.
## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - training_steps: 8000 ### Training results | Training Loss | Epoch | Step | Validation Loss | Cer | |:-------------:|:-----:|:----:|:---------------:|:-------:| | 0.6416 | 0.44 | 1000 | 0.6564 | 64.1489 | | 0.5914 | 0.88 | 2000 | 0.5688 | 37.4957 | | 0.435 | 1.32 | 3000 | 0.5349 | 32.6734 | | 0.4056 | 1.76 | 4000 | 0.5124 | 30.9065 | | 0.3368 | 2.2 | 5000 | 0.5057 | 32.6925 | | 0.3107 | 2.64 | 6000 | 0.4979 | 32.8315 | | 0.3016 | 3.08 | 7000 | 0.4947 | 29.3060 | | 0.2979 | 3.52 | 8000 | 0.4941 | 30.7538 | ### Framework versions - Transformers 4.34.0.dev0 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
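A minimal transcription sketch for this checkpoint (the audio path is a placeholder; any Korean call recording readable by the pipeline should work):

```python
from transformers import pipeline

def transcribe(path: str) -> str:
    # Downloads the fine-tuned Whisper checkpoint from the Hub on first use.
    asr = pipeline(
        "automatic-speech-recognition",
        model="INo0121/whisper-base-ko-callvoice",
    )
    return asr(path)["text"]

if __name__ == "__main__":
    # "call.wav" is a placeholder path for a Korean low-quality call recording.
    print(transcribe("call.wav"))
```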
CyberHarem/sakakibara_satomi_idolmastercinderellagirls
CyberHarem
2023-09-22T02:24:09Z
0
1
null
[ "art", "text-to-image", "dataset:CyberHarem/sakakibara_satomi_idolmastercinderellagirls", "license:mit", "region:us" ]
text-to-image
2023-09-22T02:12:34Z
--- license: mit datasets: - CyberHarem/sakakibara_satomi_idolmastercinderellagirls pipeline_tag: text-to-image tags: - art --- # Lora of sakakibara_satomi_idolmastercinderellagirls This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion). And the auto-training framework is maintained by [DeepGHS Team](https://huggingface.co/deepghs). The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11). After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora. For example, if you want to use the model from step 4420, you need to download `4420/sakakibara_satomi_idolmastercinderellagirls.pt` as the embedding and `4420/sakakibara_satomi_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters. **The best step we recommend is 4420**, with the score of 0.876. The trigger words are: 1. `sakakibara_satomi_idolmastercinderellagirls` 2. `long_hair, grey_hair, breasts, purple_eyes, large_breasts, drill_hair, blush, open_mouth, jewelry, cleavage, necklace, smile` For the following groups, it is not recommended to use this model and we express regret: 1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail. 2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits. 3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm. 4. 
Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters. 5. Individuals who finds the generated image content offensive to their values. These are available steps: | Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata | |:---------|:----------|:---------------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------| | 5100 | 0.866 | [Download](5100/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![pattern_2-5100](5100/previews/pattern_2.png) | ![pattern_3-5100](5100/previews/pattern_3.png) | ![pattern_4-5100](5100/previews/pattern_4.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) | | 4760 | 0.864 | [Download](4760/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | 
![pattern_2-4760](4760/previews/pattern_2.png) | ![pattern_3-4760](4760/previews/pattern_3.png) | ![pattern_4-4760](4760/previews/pattern_4.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) | | **4420** | **0.876** | [**Download**](4420/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![pattern_2-4420](4420/previews/pattern_2.png) | ![pattern_3-4420](4420/previews/pattern_3.png) | ![pattern_4-4420](4420/previews/pattern_4.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) | | 4080 | 0.866 | [Download](4080/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![pattern_2-4080](4080/previews/pattern_2.png) | ![pattern_3-4080](4080/previews/pattern_3.png) | ![pattern_4-4080](4080/previews/pattern_4.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) | | 3740 | 0.874 | [Download](3740/sakakibara_satomi_idolmastercinderellagirls.zip) | 
![pattern_1-3740](3740/previews/pattern_1.png) | ![pattern_2-3740](3740/previews/pattern_2.png) | ![pattern_3-3740](3740/previews/pattern_3.png) | ![pattern_4-3740](3740/previews/pattern_4.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) | | 3400 | 0.873 | [Download](3400/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![pattern_2-3400](3400/previews/pattern_2.png) | ![pattern_3-3400](3400/previews/pattern_3.png) | ![pattern_4-3400](3400/previews/pattern_4.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) | | 3060 | 0.867 | [Download](3060/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-3060](3060/previews/pattern_1.png) | ![pattern_2-3060](3060/previews/pattern_2.png) | ![pattern_3-3060](3060/previews/pattern_3.png) | ![pattern_4-3060](3060/previews/pattern_4.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) | | 2720 | 0.849 | 
[Download](2720/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![pattern_2-2720](2720/previews/pattern_2.png) | ![pattern_3-2720](2720/previews/pattern_3.png) | ![pattern_4-2720](2720/previews/pattern_4.png) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) | | 2380 | 0.852 | [Download](2380/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | ![pattern_2-2380](2380/previews/pattern_2.png) | ![pattern_3-2380](2380/previews/pattern_3.png) | ![pattern_4-2380](2380/previews/pattern_4.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) | | 2040 | 0.847 | [Download](2040/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![pattern_2-2040](2040/previews/pattern_2.png) | ![pattern_3-2040](2040/previews/pattern_3.png) | ![pattern_4-2040](2040/previews/pattern_4.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) 
| | 1700 | 0.845 | [Download](1700/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![pattern_2-1700](1700/previews/pattern_2.png) | ![pattern_3-1700](1700/previews/pattern_3.png) | ![pattern_4-1700](1700/previews/pattern_4.png) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) | | 1360 | 0.836 | [Download](1360/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![pattern_2-1360](1360/previews/pattern_2.png) | ![pattern_3-1360](1360/previews/pattern_3.png) | ![pattern_4-1360](1360/previews/pattern_4.png) | ![bikini-1360](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) | | 1020 | 0.803 | [Download](1020/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![pattern_2-1020](1020/previews/pattern_2.png) | ![pattern_3-1020](1020/previews/pattern_3.png) | ![pattern_4-1020](1020/previews/pattern_4.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | 
![yukata-1020](1020/previews/yukata.png) | | 680 | 0.808 | [Download](680/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-680](680/previews/pattern_1.png) | ![pattern_2-680](680/previews/pattern_2.png) | ![pattern_3-680](680/previews/pattern_3.png) | ![pattern_4-680](680/previews/pattern_4.png) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) | | 340 | 0.756 | [Download](340/sakakibara_satomi_idolmastercinderellagirls.zip) | ![pattern_1-340](340/previews/pattern_1.png) | ![pattern_2-340](340/previews/pattern_2.png) | ![pattern_3-340](340/previews/pattern_3.png) | ![pattern_4-340](340/previews/pattern_4.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
weishuai-4670/textual_inversion_style_2
weishuai-4670
2023-09-22T02:22:27Z
13
0
diffusers
[ "diffusers", "tensorboard", "safetensors", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "textual_inversion", "base_model:runwayml/stable-diffusion-v1-5", "base_model:adapter:runwayml/stable-diffusion-v1-5", "license:creativeml-openrail-m", "autotrain_compatible", "endpoints_compatible", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
2023-09-20T03:04:57Z
--- license: creativeml-openrail-m base_model: runwayml/stable-diffusion-v1-5 tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers - textual_inversion inference: true --- # Textual inversion text2image fine-tuning - weishuai-4670/textual_inversion_style_2 These are textual inversion adaption weights for runwayml/stable-diffusion-v1-5. You can find some example images in the following.
DevaMalla/llama_7b_qlora_pds-eval
DevaMalla
2023-09-22T02:20:09Z
0
0
null
[ "generated_from_trainer", "base_model:DevaMalla/llama7b", "base_model:finetune:DevaMalla/llama7b", "region:us" ]
null
2023-09-21T10:35:24Z
--- base_model: DevaMalla/llama7b tags: - generated_from_trainer model-index: - name: llama_7b_qlora_pds-eval results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # llama_7b_qlora_pds-eval This model is a fine-tuned version of [DevaMalla/llama7b](https://huggingface.co/DevaMalla/llama7b) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.9304 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 10 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 1.2419 | 0.03 | 50 | 1.0315 | | 0.9965 | 0.05 | 100 | 0.9918 | | 0.9747 | 0.08 | 150 | 0.9787 | | 0.9771 | 0.11 | 200 | 0.9705 | | 0.9653 | 0.14 | 250 | 0.9645 | | 0.9507 | 0.16 | 300 | 0.9599 | | 0.9596 | 0.19 | 350 | 0.9563 | | 0.9497 | 0.22 | 400 | 0.9542 | | 0.9481 | 0.24 | 450 | 0.9517 | | 0.9475 | 0.27 | 500 | 0.9495 | | 0.9376 | 0.3 | 550 | 0.9476 | | 0.943 | 0.33 | 600 | 0.9463 | | 0.952 | 0.35 | 650 | 0.9447 | | 0.9489 | 0.38 | 700 | 0.9433 | | 0.9347 | 0.41 | 750 | 0.9421 | | 0.9307 | 0.44 | 800 | 0.9414 | | 0.9366 | 0.46 | 850 | 0.9406 | | 0.9246 | 0.49 | 900 | 0.9393 | | 0.9267 | 0.52 | 950 | 0.9383 | | 0.9358 | 0.54 | 1000 | 0.9373 | | 0.9405 | 0.57 | 1050 | 0.9365 | | 0.9276 | 0.6 | 1100 | 0.9359 | | 0.9403 | 0.63 | 1150 | 0.9353 | | 0.9218 | 0.65 | 1200 | 0.9347 | | 0.9406 | 0.68 | 1250 | 0.9343 | | 0.9257 | 0.71 | 1300 | 0.9337 | | 0.92 | 0.73 | 1350 | 0.9334 | | 
0.9356 | 0.76 | 1400 | 0.9328 | | 0.9279 | 0.79 | 1450 | 0.9322 | | 0.9214 | 0.82 | 1500 | 0.9320 | | 0.9214 | 0.84 | 1550 | 0.9316 | | 0.9258 | 0.87 | 1600 | 0.9312 | | 0.9308 | 0.9 | 1650 | 0.9310 | | 0.9251 | 0.93 | 1700 | 0.9307 | | 0.9207 | 0.95 | 1750 | 0.9306 | | 0.926 | 0.98 | 1800 | 0.9304 | ### Framework versions - Transformers 4.32.0.dev0 - Pytorch 2.0.0 - Datasets 2.12.0 - Tokenizers 0.13.3
reeen115/lora_output_2-1
reeen115
2023-09-22T02:10:42Z
1
0
diffusers
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "lora", "base_model:stabilityai/stable-diffusion-2-1", "base_model:adapter:stabilityai/stable-diffusion-2-1", "license:creativeml-openrail-m", "region:us" ]
text-to-image
2023-09-22T01:49:29Z
--- license: creativeml-openrail-m base_model: stabilityai/stable-diffusion-2-1 instance_prompt: cardboards, grayscale tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers - lora inference: true --- # LoRA DreamBooth - reeen115/lora_output_2-1 These are LoRA adaption weights for stabilityai/stable-diffusion-2-1. The weights were trained on cardboards, grayscale using [DreamBooth](https://dreambooth.github.io/). You can find some example images in the following. LoRA for the text encoder was enabled: False.
chansuga/swin-tiny-patch4-window7-224-finetuned-eurosat
chansuga
2023-09-22T01:55:30Z
187
0
transformers
[ "transformers", "pytorch", "tensorboard", "swin", "image-classification", "generated_from_trainer", "dataset:imagefolder", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2023-09-14T05:57:22Z
--- license: apache-2.0 tags: - generated_from_trainer datasets: - imagefolder metrics: - accuracy model-index: - name: swin-tiny-patch4-window7-224-finetuned-eurosat results: - task: name: Image Classification type: image-classification dataset: name: imagefolder type: imagefolder args: default metrics: - name: Accuracy type: accuracy value: 0.9777777777777777 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-eurosat This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0628 - Accuracy: 0.9778 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.2799 | 1.0 | 190 | 0.1301 | 0.9574 | | 0.1848 | 2.0 | 380 | 0.0803 | 0.9711 | | 0.1504 | 3.0 | 570 | 0.0628 | 0.9778 | ### Framework versions - Transformers 4.17.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.14.0
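An inference sketch for this checkpoint (the image path is a placeholder for a EuroSAT-style satellite tile):

```python
from transformers import pipeline

def classify_image(path: str):
    # Downloads the fine-tuned Swin checkpoint from the Hub on first use.
    clf = pipeline(
        "image-classification",
        model="chansuga/swin-tiny-patch4-window7-224-finetuned-eurosat",
    )
    return clf(path)

if __name__ == "__main__":
    # "satellite.png" is a placeholder for a EuroSAT-style input image.
    print(classify_image("satellite.png"))
```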
ahirtonlopes/videomae-base-finetuned-ucf101-subset
ahirtonlopes
2023-09-22T01:46:48Z
59
0
transformers
[ "transformers", "pytorch", "videomae", "video-classification", "generated_from_trainer", "base_model:MCG-NJU/videomae-base", "base_model:finetune:MCG-NJU/videomae-base", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
video-classification
2023-08-31T16:22:57Z
--- license: cc-by-nc-4.0 base_model: MCG-NJU/videomae-base tags: - generated_from_trainer metrics: - accuracy model-index: - name: videomae-base-finetuned-ucf101-subset results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # videomae-base-finetuned-ucf101-subset This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.4043 - Accuracy: 0.8968 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - training_steps: 300 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.835 | 0.25 | 75 | 1.5384 | 0.4143 | | 0.5822 | 1.25 | 150 | 0.6968 | 0.8 | | 0.2967 | 2.25 | 225 | 0.3601 | 0.9143 | | 0.0952 | 3.25 | 300 | 0.2773 | 0.9 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
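A hedged inference sketch using the `transformers` VideoMAE classes; dummy random frames stand in for a real UCF101 clip:

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

MODEL_ID = "ahirtonlopes/videomae-base-finetuned-ucf101-subset"

def classify_clip(frames):
    # Downloads the fine-tuned checkpoint from the Hub on first use.
    processor = VideoMAEImageProcessor.from_pretrained(MODEL_ID)
    model = VideoMAEForVideoClassification.from_pretrained(MODEL_ID)
    inputs = processor(frames, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return model.config.id2label[int(logits.argmax(-1))]

if __name__ == "__main__":
    # 16 dummy RGB frames stand in for a real video clip.
    frames = [np.random.randint(0, 255, (224, 224, 3), dtype=np.uint8) for _ in range(16)]
    print(classify_clip(frames))
```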
CyberHarem/oohara_michiru_idolmastercinderellagirls
CyberHarem
2023-09-22T01:36:06Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/oohara_michiru_idolmastercinderellagirls", "license:mit", "region:us" ]
text-to-image
2023-09-22T01:26:43Z
--- license: mit datasets: - CyberHarem/oohara_michiru_idolmastercinderellagirls pipeline_tag: text-to-image tags: - art --- # Lora of oohara_michiru_idolmastercinderellagirls This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion). And the auto-training framework is maintained by [DeepGHS Team](https://huggingface.co/deepghs). The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11). After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora. For example, if you want to use the model from step 4420, you need to download `4420/oohara_michiru_idolmastercinderellagirls.pt` as the embedding and `4420/oohara_michiru_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters. **The best step we recommend is 4420**, with the score of 0.974. The trigger words are: 1. `oohara_michiru_idolmastercinderellagirls` 2. `brown_hair, drill_hair, smile, food, pink_eyes, fang, open_mouth, bow, hair_ornament` For the following groups, it is not recommended to use this model and we express regret: 1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail. 2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits. 3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm. 4. 
Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters. 5. Individuals who finds the generated image content offensive to their values. These are available steps: | Steps | Score | Download | pattern_1 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata | |:---------|:----------|:------------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------| | 5100 | 0.971 | [Download](5100/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) | | 4760 | 0.971 | [Download](4760/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) | 
![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) | | **4420** | **0.974** | [**Download**](4420/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) | | 4080 | 0.870 | [Download](4080/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) | | 3740 | 0.972 | [Download](3740/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-3740](3740/previews/pattern_1.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) | | 3400 | 0.967 | [Download](3400/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | 
![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) | | 3060 | 0.964 | [Download](3060/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-3060](3060/previews/pattern_1.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) | | 2720 | 0.927 | [Download](2720/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) | | 2380 | 0.957 | [Download](2380/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) | | 2040 | 0.912 | [Download](2040/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to 
see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) | | 1700 | 0.788 | [Download](1700/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) | | 1360 | 0.755 | [Download](1360/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![bikini-1360](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) | | 1020 | 0.706 | [Download](1020/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) | | 680 | 0.644 | [Download](680/oohara_michiru_idolmastercinderellagirls.zip) | 
![pattern_1-680](680/previews/pattern_1.png) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) | | 340 | 0.520 | [Download](340/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-340](340/previews/pattern_1.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
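For readers who prefer to pick a checkpoint programmatically rather than relying on the recommendation above, here is a minimal sketch. The step scores are copied from the table in this card; the selection rule itself is only an illustration and is not part of the training pipeline:

```python
# Scores per training step, copied from the table in this card.
scores = {
    5100: 0.971, 4760: 0.971, 4420: 0.974, 4080: 0.870,
    3740: 0.972, 3400: 0.967, 3060: 0.964, 2720: 0.927,
    2380: 0.957, 2040: 0.912, 1700: 0.788, 1360: 0.755,
    1020: 0.706, 680: 0.644, 340: 0.520,
}

def best_step(step_scores):
    """Return the step with the highest score (ties resolved toward the later step)."""
    return max(step_scores, key=lambda step: (step_scores[step], step))

step = best_step(scores)
print(step, scores[step])  # matches the recommended step 4420 with score 0.974
```

This reproduces the recommendation stated above (step 4420), so the same rule can be reused if more checkpoints are added later.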
Rosi-si/py_gec_mT5_v2
Rosi-si
2023-09-22T01:21:44Z
111
0
transformers
[ "transformers", "pytorch", "mt5", "text2text-generation", "generated_from_trainer", "base_model:Rosi-si/py_gec_mT5", "base_model:finetune:Rosi-si/py_gec_mT5", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text2text-generation
2023-09-22T00:12:46Z
--- license: apache-2.0 base_model: Rosi-si/py_gec_mT5 tags: - generated_from_trainer model-index: - name: py_gec_mT5_v2 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # py_gec_mT5_v2 This model is a fine-tuned version of [Rosi-si/py_gec_mT5](https://huggingface.co/Rosi-si/py_gec_mT5) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.3107 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:-----:|:---------------:| | 0.0 | 1.0 | 4187 | 0.3107 | | 0.0 | 2.0 | 8374 | 0.3107 | | 0.0 | 3.0 | 12561 | 0.3107 | | 0.0 | 4.0 | 16748 | 0.3107 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
4bit/Qwen-VL-Chat
4bit
2023-09-22T01:12:54Z
4
0
transformers
[ "transformers", "pytorch", "qwen", "text-generation", "custom_code", "zh", "en", "arxiv:2308.12966", "autotrain_compatible", "region:us" ]
text-generation
2023-09-22T01:07:31Z
--- language: - zh - en tags: - qwen pipeline_tag: text-generation inference: false --- # Qwen-VL-Chat <br> <p align="center"> <img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/logo_vl.jpg" width="400"/> <p> <br> <p align="center"> Qwen-VL <a href="https://modelscope.cn/models/qwen/Qwen-VL/summary">🤖 <a> | <a href="https://huggingface.co/Qwen/Qwen-VL">🤗</a>&nbsp | Qwen-VL-Chat <a href="https://modelscope.cn/models/qwen/Qwen-VL-Chat/summary">🤖 <a>| <a href="https://huggingface.co/Qwen/Qwen-VL-Chat">🤗</a>&nbsp | Qwen-VL-Chat-Int4 <a href="https://huggingface.co/Qwen/Qwen-VL-Chat-Int4">🤗</a> <br> <a href="assets/wechat.png">WeChat</a>&nbsp&nbsp | &nbsp&nbsp<a href="https://discord.gg/z3GAxXZ9Ce">Discord</a>&nbsp&nbsp | &nbsp&nbsp<a href="https://modelscope.cn/studios/qwen/Qwen-VL-Chat-Demo/summary">Demo</a>&nbsp | &nbsp<a href="https://arxiv.org/abs/2308.12966">Report</a> </p> <br> **Qwen-VL** 是阿里云研发的大规模视觉语言模型(Large Vision Language Model, LVLM)。Qwen-VL 可以以图像、文本、检测框作为输入,并以文本和检测框作为输出。Qwen-VL 系列模型性能强大,具备多语言对话、多图交错对话等能力,并支持中文开放域定位和细粒度图像识别与理解。 **Qwen-VL** (Qwen Large Vision Language Model) is the visual multimodal version of the large model series, Qwen (abbr. Tongyi Qianwen), proposed by Alibaba Cloud. Qwen-VL accepts image, text, and bounding box as inputs, outputs text and bounding box. The features of Qwen-VL include: 目前,我们提供了Qwen-VL和Qwen-VL-Chat两个模型,分别为预训练模型和Chat模型。如果想了解更多关于模型的信息,请点击[链接](https://github.com/QwenLM/Qwen-VL/blob/master/visual_memo.md)查看我们的技术备忘录。本仓库为Qwen-VL-Chat仓库。 We release Qwen-VL and Qwen-VL-Chat, which are pretrained model and Chat model respectively. For more details about Qwen-VL, please refer to our [technical memo](https://github.com/QwenLM/Qwen-VL/blob/master/visual_memo.md). This repo is the one for Qwen-VL-Chat. 
<br> ## 安装要求 (Requirements) * python 3.8及以上版本 * pytorch 1.12及以上版本,推荐2.0及以上版本 * 建议使用CUDA 11.4及以上(GPU用户需考虑此选项) * python 3.8 and above * pytorch 1.12 and above, 2.0 and above are recommended * CUDA 11.4 and above are recommended (this is for GPU users) <br> ## 快速开始 (Quickstart) 我们提供简单的示例来说明如何利用 🤗 Transformers 快速使用Qwen-VL-Chat。 在开始前,请确保你已经配置好环境并安装好相关的代码包。最重要的是,确保你满足上述要求,然后安装相关的依赖库。 Below, we provide simple examples to show how to use Qwen-VL-Chat with 🤗 Transformers. Before running the code, make sure you have set up the environment and installed the required packages. Make sure you meet the above requirements, and then install the dependent libraries. ```bash pip install -r requirements.txt ``` 接下来你可以开始使用Transformers来使用我们的模型。关于视觉模块的更多用法,请参考[教程](TUTORIAL.md)。 Now you can start with Transformers. For more usage of the vision encoder, please refer to the [tutorial](TUTORIAL_zh.md). #### 🤗 Transformers To use Qwen-VL-Chat for inference, all you need to do is input a few lines of code as demonstrated below. However, **please make sure that you are using the latest code.** ```python from transformers import AutoModelForCausalLM, AutoTokenizer from transformers.generation import GenerationConfig import torch torch.manual_seed(1234) # Note: The default behavior now has injection attack prevention off. 
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-VL-Chat", trust_remote_code=True) # use bf16 # model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-VL-Chat", device_map="auto", trust_remote_code=True, bf16=True).eval() # use fp16 # model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-VL-Chat", device_map="auto", trust_remote_code=True, fp16=True).eval() # use cpu only # model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-VL-Chat", device_map="cpu", trust_remote_code=True).eval() # use cuda device model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-VL-Chat", device_map="cuda", trust_remote_code=True).eval() # Specify hyperparameters for generation (No need to do this if you are using transformers>=4.32.0) # model.generation_config = GenerationConfig.from_pretrained("Qwen/Qwen-VL-Chat", trust_remote_code=True) # 1st dialogue turn query = tokenizer.from_list_format([ {'image': 'https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg'}, {'text': '这是什么'}, ]) response, history = model.chat(tokenizer, query=query, history=None) print(response) # 图中是一名年轻女子在沙滩上和她的狗玩耍,狗的品种可能是拉布拉多。她们坐在沙滩上,狗的前腿抬起来,似乎在和人类击掌。两人之间充满了信任和爱。 # 2nd dialogue turn response, history = model.chat(tokenizer, '输出"击掌"的检测框', history=history) print(response) # <ref>击掌</ref><box>(517,508),(589,611)</box> image = tokenizer.draw_bbox_on_latest_picture(response, history) if image: image.save('1.jpg') else: print("no box") ``` <p align="center"> <img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo_highfive.jpg" width="500"/> <p> <br> ## 量化 (Quantization) ### 用法 (Usage) 当前我们提供了基于[AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ)的量化方案,并提供了Qwen-VL-Chat的Int4量化版本Qwen-VL-Chat-Int4 [点击此处](https://huggingface.co/Qwen/Qwen-VL-Chat-Int4)。该模型在效果评测上几乎无损,并在显存占用和推理速度上具有明显优势。 下文说明如何使用该量化模型。开始之前,请确保你满足要求(如torch2.0及以上、transformers 4.32.0及以上,等)并安装所需的代码库: We provide a new solution based on [AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ), and release an Int4 quantized model 
for Qwen-VL-Chat, Qwen-VL-Chat-Int4 [Click here](https://huggingface.co/Qwen/Qwen-VL-Chat-Int4), which achieves nearly lossless model quality with improved performance on both memory costs and inference speed. Here we demonstrate how to use our provided quantized models for inference. Before you start, make sure you meet the requirements (e.g., torch 2.0 and above, transformers 4.32.0 and above, etc.) and install the required packages: ```bash pip install optimum git clone https://github.com/JustinLin610/AutoGPTQ.git && cd AutoGPTQ pip install -v . ``` 如遇到安装 `auto-gptq` 的问题,建议您前往官方[repo](https://github.com/PanQiWei/AutoGPTQ) 寻找合适的wheel。 随后你便可以按照上述用法,轻松调用量化模型: If you meet problems installing `auto-gptq`, we advise you to check out the official [repo](https://github.com/PanQiWei/AutoGPTQ) to find a suitable wheel. Then you can load the quantized model easily and run inference the same as usual: ```python model = AutoModelForCausalLM.from_pretrained( "Qwen/Qwen-VL-Chat-Int4", device_map="auto", trust_remote_code=True ).eval() # Either a local path or a URL between <img></img> tags. image_path = 'https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg' response, history = model.chat(tokenizer, query=f'<img>{image_path}</img>这是什么', history=None) print(response) ``` ### 效果评测 (Performance) 我们列出不同精度下模型在评测基准 **[TouchStone](https://github.com/OFA-Sys/TouchStone)** 上的表现,并发现量化模型并没有显著性能损失。结果如下所示: We illustrate the model performance of both BF16 and Int4 models on the benchmark **[TouchStone](https://github.com/OFA-Sys/TouchStone)**, and we find that the quantized model does not suffer from significant performance degradation. Results are shown below: | Quantization | ZH. 
| EN | | ------------ | :--------: | :-----------: | | BF16 | 401.2 | 645.2 | | Int4 | 386.6 | 651.4 | ### 推理速度 (Inference Speed) 我们测算了在输入一张图片(即258个token)的条件下BF16和Int4的模型生成1792 (2048-258) 和 7934 (8192-258) 个token的平均速度。 We measured the average inference speed (tokens/s) of generating 1792 (2048-258) and 7934 (8192-258) tokens with the context of an image (which takes 258 tokens) under BF16 precision and Int4 quantization, respectively. | Quantization | Speed (2048 tokens) | Speed (8192 tokens) | | ------------ | :-----------------: | :-----------------: | | BF16 | 28.87 | 24.32 | | Int4 | 37.79 | 34.34 | 推理速度测算是在单卡 A100-SXM4-80G GPU上运行,使用PyTorch 2.0.1及CUDA 11.4。 The profiling runs on a single A100-SXM4-80G GPU with PyTorch 2.0.1 and CUDA 11.4. ### GPU显存占用 (GPU Memory Usage) 我们还测算了在一张图片输入的条件下BF16和Int4模型生成1792 (2048-258) 和 7934 (8192-258) 个token所需显存。结果如下所示: We also profile the peak GPU memory usage for encoding 1792 (2048-258) tokens (including an image) as context (and generating single token) and generating 7934 (8192-258) tokens (with an image as context) under BF16 or Int4 quantization level, respectively. The results are shown below. | Quantization | Peak Usage for Encoding 2048 Tokens | Peak Usage for Generating 8192 Tokens | | ------------ | :---------------------------------: | :-----------------------------------: | | BF16 | 22.60GB | 28.01GB | | Int4 | 11.82GB | 17.23GB | 上述速度和显存测算使用[此脚本](https://qianwen-res.oss-cn-beijing.aliyuncs.com/profile_mm.py)完成。 The above speed and memory profiling are conducted using [this script](https://qianwen-res.oss-cn-beijing.aliyuncs.com/profile_mm.py). <br> ## 评测 我们从两个角度评测了两个模型的能力: 1. 在**英文标准 Benchmark** 上评测模型的基础任务能力。目前评测了四大类多模态任务: - Zero-shot Caption: 评测模型在未见过数据集上的零样本图片描述能力; - General VQA: 评测模型的通用问答能力,例如判断题、颜色、个数、类目等问答能力; - Text-based VQA:评测模型对于图片中文字相关的识别/问答能力,例如文档问答、图表问答、文字问答等; - Referring Expression Compression:评测模型给定物体描述画检测框的能力; 2. 
**试金石 (TouchStone)**:为了评测模型整体的图文对话能力和人类对齐水平。我们为此构建了一个基于 GPT4 打分来评测 LVLM 模型的 Benchmark:TouchStone。在 TouchStone-v0.1 中: - 评测基准总计涵盖 300+张图片、800+道题目、27个类别。包括基础属性问答、人物地标问答、影视作品问答、视觉推理、反事实推理、诗歌创作、故事写作,商品比较、图片解题等**尽可能广泛的类别**。 - 为了弥补目前 GPT4 无法直接读取图片的缺陷,我们给所有的带评测图片提供了**人工标注的充分详细描述**,并且将图片的详细描述、问题和模型的输出结果一起交给 GPT4 打分。 - 评测同时包含英文版本和中文版本。 评测结果如下: We evaluated the model's ability from two perspectives: 1. **Standard Benchmarks**: We evaluate the model's basic task capabilities on four major categories of multimodal tasks: - Zero-shot Caption: Evaluate model's zero-shot image captioning ability on unseen datasets; - General VQA: Evaluate the general question-answering ability of pictures, such as the judgment, color, number, category, etc; - Text-based VQA: Evaluate the model's ability to recognize text in pictures, such as document QA, chart QA, etc; - Referring Expression Comprehension: Evaluate the ability to localize a target object in an image described by a referring expression. 2. **TouchStone**: To evaluate the overall text-image dialogue capability and alignment level with humans, we have constructed a benchmark called TouchStone, which is based on scoring with GPT4 to evaluate the LVLM model. - The TouchStone benchmark covers a total of 300+ images, 800+ questions, and 27 categories. Such as attribute-based Q&A, celebrity recognition, writing poetry, summarizing multiple images, product comparison, math problem solving, etc; - In order to break the current limitation of GPT4 in terms of direct image input, TouchStone provides fine-grained image annotations by human labeling. These detailed annotations, along with the questions and the model's output, are then presented to GPT4 for scoring. - The benchmark includes both English and Chinese versions. The results of the evaluation are as follows: Qwen-VL outperforms current SOTA generalist models on multiple VL tasks and has a more comprehensive coverage in terms of capability range. 
<p align="center"> <img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/radar.png" width="600"/> <p> ### 零样本图像描述 & 通用视觉问答 (Zero-shot Captioning & General VQA) <table> <thead> <tr> <th rowspan="2">Model type</th> <th rowspan="2">Model</th> <th colspan="2">Zero-shot Captioning</th> <th colspan="5">General VQA</th> </tr> <tr> <th>NoCaps</th> <th>Flickr30K</th> <th>VQAv2<sup>dev</sup></th> <th>OK-VQA</th> <th>GQA</th> <th>SciQA-Img<br>(0-shot)</th> <th>VizWiz<br>(0-shot)</th> </tr> </thead> <tbody align="center"> <tr> <td rowspan="10">Generalist<br>Models</td> <td>Flamingo-9B</td> <td>-</td> <td>61.5</td> <td>51.8</td> <td>44.7</td> <td>-</td> <td>-</td> <td>28.8</td> </tr> <tr> <td>Flamingo-80B</td> <td>-</td> <td>67.2</td> <td>56.3</td> <td>50.6</td> <td>-</td> <td>-</td> <td>31.6</td> </tr> <tr> <td>Unified-IO-XL</td> <td>100.0</td> <td>-</td> <td>77.9</td> <td>54.0</td> <td>-</td> <td>-</td> <td>-</td> </tr> <tr> <td>Kosmos-1</td> <td>-</td> <td>67.1</td> <td>51.0</td> <td>-</td> <td>-</td> <td>-</td> <td>29.2</td> </tr> <tr> <td>Kosmos-2</td> <td>-</td> <td>66.7</td> <td>45.6</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> </tr> <tr> <td>BLIP-2 (Vicuna-13B)</td> <td>103.9</td> <td>71.6</td> <td>65.0</td> <td>45.9</td> <td>32.3</td> <td>61.0</td> <td>19.6</td> </tr> <tr> <td>InstructBLIP (Vicuna-13B)</td> <td><strong>121.9</strong></td> <td>82.8</td> <td>-</td> <td>-</td> <td>49.5</td> <td>63.1</td> <td>33.4</td> </tr> <tr> <td>Shikra (Vicuna-13B)</td> <td>-</td> <td>73.9</td> <td>77.36</td> <td>47.16</td> <td>-</td> <td>-</td> <td>-</td> </tr> <tr> <td><strong>Qwen-VL (Qwen-7B)</strong></td> <td>121.4</td> <td><b>85.8</b></td> <td><b>78.8</b></td> <td><b>58.6</b></td> <td><b>59.3</b></td> <td>67.1</td> <td>35.2</td> </tr> <!-- <tr> <td>Qwen-VL (4-shot)</td> <td>-</td> <td>-</td> <td>-</td> <td>63.6</td> <td>-</td> <td>-</td> <td>39.1</td> </tr> --> <tr> <td>Qwen-VL-Chat</td> <td>120.2</td> <td>81.0</td> <td>78.2</td> <td>56.6</td> 
<td>57.5</td> <td><b>68.2</b></td> <td><b>38.9</b></td> </tr> <!-- <tr> <td>Qwen-VL-Chat (4-shot)</td> <td>-</td> <td>-</td> <td>-</td> <td>60.6</td> <td>-</td> <td>-</td> <td>44.45</td> </tr> --> <tr> <td>Previous SOTA<br>(Per Task Fine-tuning)</td> <td>-</td> <td>127.0<br>(PALI-17B)</td> <td>84.5<br>(InstructBLIP<br>-FlanT5-XL)</td> <td>86.1<br>(PALI-X<br>-55B)</td> <td>66.1<br>(PALI-X<br>-55B)</td> <td>72.1<br>(CFR)</td> <td>92.53<br>(LLaVa+<br>GPT-4)</td> <td>70.9<br>(PALI-X<br>-55B)</td> </tr> </tbody> </table> - 在 Zero-shot Caption 中,Qwen-VL 在 Flickr30K 数据集上取得了 **SOTA** 的结果,并在 Nocaps 数据集上取得了和 InstructBlip 可竞争的结果。 - 在 General VQA 中,Qwen-VL 取得了 LVLM 模型同等量级和设定下 **SOTA** 的结果。 - For zero-shot image captioning, Qwen-VL achieves the **SOTA** on Flickr30K and competitive results on Nocaps with InstructBlip. - For general VQA, Qwen-VL achieves the **SOTA** under the same generalist LVLM scale settings. ### 文本导向的视觉问答 (Text-oriented VQA) <table> <thead> <tr> <th>Model type</th> <th>Model</th> <th>TextVQA</th> <th>DocVQA</th> <th>ChartQA</th> <th>AI2D</th> <th>OCR-VQA</th> </tr> </thead> <tbody align="center"> <tr> <td rowspan="5">Generalist Models</td> <td>BLIP-2 (Vicuna-13B)</td> <td>42.4</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> </tr> <tr> <td>InstructBLIP (Vicuna-13B)</td> <td>50.7</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> </tr> <tr> <td>mPLUG-DocOwl (LLaMA-7B)</td> <td>52.6</td> <td>62.2</td> <td>57.4</td> <td>-</td> <td>-</td> </tr> <tr> <td>Pic2Struct-Large (1.3B)</td> <td>-</td> <td><b>76.6</b></td> <td>58.6</td> <td>42.1</td> <td>71.3</td> </tr> <tr> <td>Qwen-VL (Qwen-7B)</td> <td><b>63.8</b></td> <td>65.1</td> <td><b>65.7</b></td> <td><b>62.3</b></td> <td><b>75.7</b></td> </tr> <tr> <td>Specialist SOTAs<br>(Specialist/Finetuned)</td> <td>PALI-X-55B (Single-task FT)<br>(Without OCR Pipeline)</td> <td>71.44</td> <td>80.0</td> <td>70.0</td> <td>81.2</td> <td>75.0</td> </tr> </tbody> </table> - 在文字相关的识别/问答评测上,取得了当前规模下通用 LVLM 达到的最好结果。 - 
分辨率对上述某几个评测非常重要,大部分 224 分辨率的开源 LVLM 模型无法完成以上评测,或只能通过切图的方式解决。Qwen-VL 将分辨率提升到 448,可以直接以端到端的方式进行以上评测。Qwen-VL 在很多任务上甚至超过了 1024 分辨率的 Pic2Struct-Large 模型。 - In text-related recognition/QA evaluation, Qwen-VL achieves the SOTA under the generalist LVLM scale settings. - Resolution is important for several above evaluations. While most open-source LVLM models with 224 resolution are incapable of these evaluations or can only solve these by cutting images, Qwen-VL scales the resolution to 448 so that it can be evaluated end-to-end. Qwen-VL even outperforms Pic2Struct-Large models of 1024 resolution on some tasks. ### 细粒度视觉定位 (Referring Expression Comprehension) <table> <thead> <tr> <th rowspan="2">Model type</th> <th rowspan="2">Model</th> <th colspan="3">RefCOCO</th> <th colspan="3">RefCOCO+</th> <th colspan="2">RefCOCOg</th> <th>GRIT</th> </tr> <tr> <th>val</th> <th>test-A</th> <th>test-B</th> <th>val</th> <th>test-A</th> <th>test-B</th> <th>val-u</th> <th>test-u</th> <th>refexp</th> </tr> </thead> <tbody align="center"> <tr> <td rowspan="8">Generalist Models</td> <td>GPV-2</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>51.50</td> </tr> <tr> <td>OFA-L*</td> <td>79.96</td> <td>83.67</td> <td>76.39</td> <td>68.29</td> <td>76.00</td> <td>61.75</td> <td>67.57</td> <td>67.58</td> <td>61.70</td> </tr> <tr> <td>Unified-IO</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td><b>78.61</b></td> </tr> <tr> <td>VisionLLM-H</td> <td></td> <td>86.70</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> </tr> <tr> <td>Shikra-7B</td> <td>87.01</td> <td>90.61</td> <td>80.24 </td> <td>81.60</td> <td>87.36</td> <td>72.12</td> <td>82.27</td> <td>82.19</td> <td>69.34</td> </tr> <tr> <td>Shikra-13B</td> <td>87.83 </td> <td>91.11</td> <td>81.81</td> <td>82.89</td> <td>87.79</td> <td>74.41</td> <td>82.64</td> <td>83.16</td> <td>69.03</td> </tr> <tr> <td>Qwen-VL-7B</td> 
<td><b>89.36</b></td> <td>92.26</td> <td><b>85.34</b></td> <td><b>83.12</b></td> <td>88.25</td> <td><b>77.21</b></td> <td>85.58</td> <td>85.48</td> <td>78.22</td> </tr> <tr> <td>Qwen-VL-7B-Chat</td> <td>88.55</td> <td><b>92.27</b></td> <td>84.51</td> <td>82.82</td> <td><b>88.59</b></td> <td>76.79</td> <td><b>85.96</b></td> <td><b>86.32</b></td> <td>-</td> <tr> <td rowspan="3">Specialist SOTAs<br>(Specialist/Finetuned)</td> <td>G-DINO-L</td> <td>90.56&nbsp;&nbsp;</td> <td>93.19</td> <td>88.24</td> <td>82.75</td> <td>88.95</td> <td>75.92</td> <td>86.13</td> <td>87.02</td> <td>-</td> </tr> <tr> <td>UNINEXT-H</td> <td>92.64 </td> <td>94.33</td> <td>91.46</td> <td>85.24</td> <td>89.63</td> <td>79.79</td> <td>88.73</td> <td>89.37</td> <td>-</td> </tr> <tr> <td>ONE-PEACE</td> <td>92.58 </td> <td>94.18</td> <td>89.26</td> <td>88.77</td> <td>92.21</td> <td>83.23</td> <td>89.22</td> <td>89.27</td> <td>-</td> </tr> </tbody> </table> - 在定位任务上,Qwen-VL 全面超过 Shikra-13B,取得了目前 Generalist LVLM 模型上在 Refcoco 上的 **SOTA**。 - Qwen-VL 并没有在任何中文定位数据上训练过,但通过中文 Caption 数据和 英文 Grounding 数据的训练,可以 Zero-shot 泛化出中文 Grounding 能力。 我们提供了以上**所有**评测脚本以供复现我们的实验结果。请阅读 [eval/EVALUATION.md](eval/EVALUATION.md) 了解更多信息。 - Qwen-VL achieves the **SOTA** in all above referring expression comprehension benchmarks. - Qwen-VL has not been trained on any Chinese grounding data, but it can still generalize to the Chinese Grounding tasks in a zero-shot way by training Chinese Caption data and English Grounding data. We provide all of the above evaluation scripts for reproducing our experimental results. Please read [eval/EVALUATION.md](eval/EVALUATION.md) for more information. 
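The claim above that Qwen-VL surpasses Shikra on referring-expression comprehension can be spot-checked directly from the table. A small sketch, with the RefCOCO numbers copied from the table above (for illustration only, not part of the evaluation scripts):

```python
# RefCOCO accuracy (val, test-A, test-B), copied from the table in this card.
refcoco = {
    "Shikra-13B": (87.83, 91.11, 81.81),
    "Qwen-VL-7B": (89.36, 92.26, 85.34),
}

# Margin of Qwen-VL-7B over Shikra-13B on each RefCOCO split.
margins = tuple(
    round(q - s, 2)
    for q, s in zip(refcoco["Qwen-VL-7B"], refcoco["Shikra-13B"])
)
print(margins)  # positive on every split
```

Every margin is positive, consistent with the SOTA claim for the generalist setting.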
### 闲聊能力测评 (Chat Evaluation) TouchStone 是一个基于 GPT4 打分来评测 LVLM 模型的图文对话能力和人类对齐水平的基准。它涵盖了 300+张图片、800+道题目、27个类别,包括基础属性、人物地标、视觉推理、诗歌创作、故事写作、商品比较、图片解题等**尽可能广泛的类别**。关于 TouchStone 的详细介绍,请参考[touchstone/README_CN.md](touchstone/README_CN.md)了解更多信息。 TouchStone is a benchmark based on scoring with GPT4 to evaluate the abilities of the LVLM model on text-image dialogue and alignment levels with humans. It covers a total of 300+ images, 800+ questions, and 27 categories, such as attribute-based Q&A, celebrity recognition, writing poetry, summarizing multiple images, product comparison, math problem solving, etc. Please read [touchstone/README.md](touchstone/README.md) for more information. #### 英语 (English) | Model | Score | |---------------|-------| | PandaGPT | 488.5 | | MiniGPT4 | 531.7 | | InstructBLIP | 552.4 | | LLaMA-AdapterV2 | 590.1 | | mPLUG-Owl | 605.4 | | LLaVA | 602.7 | | Qwen-VL-Chat | 645.2 | #### 中文 (Chinese) | Model | Score | |---------------|-------| | VisualGLM | 247.1 | | Qwen-VL-Chat | 401.2 | Qwen-VL-Chat 模型在中英文的对齐评测中均取得当前 LVLM 模型下的最好结果。 Qwen-VL-Chat has achieved the best results in both Chinese and English alignment evaluation. <br> ## 常见问题 (FAQ) 如遇到问题,敬请查阅 [FAQ](https://github.com/QwenLM/Qwen-VL/blob/master/FAQ_zh.md)以及issue区,如仍无法解决再提交issue。 If you meet problems, please refer to the [FAQ](https://github.com/QwenLM/Qwen-VL/blob/master/FAQ.md) and existing issues first to search for a solution before opening a new issue. <br> ## 使用协议 (License Agreement) 研究人员与开发者可使用Qwen-VL和Qwen-VL-Chat或进行二次开发。我们同样允许商业使用,具体细节请查看[LICENSE](https://github.com/QwenLM/Qwen-VL/blob/master/LICENSE)。如需商用,请填写[问卷](https://dashscope.console.aliyun.com/openModelApply/qianwen)申请。 Researchers and developers are free to use the codes and model weights of both Qwen-VL and Qwen-VL-Chat. We also allow their commercial use. Check our license at [LICENSE](LICENSE) for more details. 
<br> ## 引用 (Citation) 如果你觉得我们的论文和代码对你的研究有帮助,请考虑:star: 和引用 :pencil: :) If you find our paper and code useful in your research, please consider giving a star :star: and citation :pencil: :) ```BibTeX @article{Qwen-VL, title={Qwen-VL: A Frontier Large Vision-Language Model with Versatile Abilities}, author={Bai, Jinze and Bai, Shuai and Yang, Shusheng and Wang, Shijie and Tan, Sinan and Wang, Peng and Lin, Junyang and Zhou, Chang and Zhou, Jingren}, journal={arXiv preprint arXiv:2308.12966}, year={2023} } ``` <br> ## 联系我们 (Contact Us) 如果你想给我们的研发团队和产品团队留言,请通过邮件(qianwen_opensource@alibabacloud.com)联系我们。 If you would like to leave a message for our research team or product team, feel free to send an email to qianwen_opensource@alibabacloud.com.
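As a rough numerical summary of the quantization benefits reported in the speed and memory tables above (the figures are copied from this card; the arithmetic is our own illustration, not an official benchmark):

```python
# Figures from the profiling tables above (A100-SXM4-80G, PyTorch 2.0.1, CUDA 11.4).
speed_tokens_per_s = {"BF16": 28.87, "Int4": 37.79}   # generating 1792 tokens
peak_mem_gb_encode = {"BF16": 22.60, "Int4": 11.82}   # encoding 2048 tokens

# Relative speed and peak-memory ratios of the Int4 build versus BF16.
speedup = round(speed_tokens_per_s["Int4"] / speed_tokens_per_s["BF16"], 2)
mem_ratio = round(peak_mem_gb_encode["Int4"] / peak_mem_gb_encode["BF16"], 2)
print(f"Int4 vs BF16: {speedup}x generation speed, {mem_ratio}x peak encode memory")
```

In other words, on these measurements the Int4 model generates roughly 30% faster while using about half the peak memory for encoding.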
jtlowell/cozy_sticker
jtlowell
2023-09-22T00:56:40Z
1
3
diffusers
[ "diffusers", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "lora", "dataset:jtlowell/cozy_stickers", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0", "region:us" ]
text-to-image
2023-09-21T23:37:15Z
---
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: cozy_stick
tags:
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
- text-to-image
- diffusers
- lora
inference: true
datasets:
- jtlowell/cozy_stickers
---

# LoRA DreamBooth - jtlowell/cozy_sticker

These are LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0. The weights were trained on the concept prompt:

`cozy_stick`

Use this keyword to trigger your custom model in your prompts.

LoRA for the text encoder was enabled: False.

Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.

## Usage

Make sure to upgrade diffusers to >= 0.19.0:

```
pip install diffusers --upgrade
```

In addition, make sure to install transformers, safetensors, accelerate, and the invisible watermark:

```
pip install invisible_watermark transformers accelerate safetensors
```

To run inference with the base model and these LoRA weights, you can use:

```python
import torch
from diffusers import DiffusionPipeline, AutoencoderKL

vae = AutoencoderKL.from_pretrained('madebyollin/sdxl-vae-fp16-fix', torch_dtype=torch.float16)
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    vae=vae, torch_dtype=torch.float16, variant="fp16",
    use_safetensors=True
)

# This is where you load your trained weights
pipe.load_lora_weights('jtlowell/cozy_sticker')

pipe.to("cuda")

prompt = "A majestic cozy_stick jumping from a big stone at night"

image = pipe(prompt=prompt, num_inference_steps=50).images[0]
```
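As background for what `load_lora_weights` does, LoRA keeps the base weights frozen and adds a trained low-rank update on top of them. A minimal numpy sketch of that idea (illustrative only — the shapes, rank, and `scale` value here are made up, and this is not the diffusers internals):

```python
import numpy as np

rng = np.random.default_rng(0)
d_out, d_in, r = 64, 64, 4           # hypothetical layer size and LoRA rank

W = rng.normal(size=(d_out, d_in))   # frozen base weight
A = rng.normal(size=(r, d_in))       # trained LoRA factor
B = np.zeros((d_out, r))             # B starts at zero, so W is unchanged at init

scale = 0.8                          # LoRA strength knob
W_adapted = W + scale * (B @ A)      # low-rank update added to the frozen weight

print(np.allclose(W, W_adapted))     # True while B is still zero
print(A.size + B.size, W.size)       # far fewer trainable params than the full matrix
```

Only `A` and `B` (512 numbers here) are trained, versus 4096 for the full matrix — which is why LoRA checkpoints stay small.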
AshokKakunuri/ast-finetuned-audioset-10-10-0.4593-finetuned-gtzan-Ashok
AshokKakunuri
2023-09-22T00:52:13Z
159
0
transformers
[ "transformers", "pytorch", "audio-spectrogram-transformer", "audio-classification", "generated_from_trainer", "dataset:marsyas/gtzan", "base_model:MIT/ast-finetuned-audioset-10-10-0.4593", "base_model:finetune:MIT/ast-finetuned-audioset-10-10-0.4593", "license:bsd-3-clause", "model-index", "endpoints_compatible", "region:us" ]
audio-classification
2023-09-22T00:17:50Z
--- license: bsd-3-clause base_model: MIT/ast-finetuned-audioset-10-10-0.4593 tags: - generated_from_trainer datasets: - marsyas/gtzan metrics: - accuracy model-index: - name: ast-finetuned-audioset-10-10-0.4593-finetuned-gtzan results: - task: name: Audio Classification type: audio-classification dataset: name: GTZAN type: marsyas/gtzan config: all split: train args: all metrics: - name: Accuracy type: accuracy value: 0.91 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # ast-finetuned-audioset-10-10-0.4593-finetuned-gtzan This model is a fine-tuned version of [MIT/ast-finetuned-audioset-10-10-0.4593](https://huggingface.co/MIT/ast-finetuned-audioset-10-10-0.4593) on the GTZAN dataset. It achieves the following results on the evaluation set: - Loss: 0.3592 - Accuracy: 0.91 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 8 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.0215 | 1.0 | 112 | 0.6979 | 0.82 | | 0.5726 | 2.0 | 225 | 0.4903 | 0.84 | | 0.402 | 3.0 | 337 | 0.5950 | 0.82 | | 0.0031 | 4.0 | 450 | 0.7435 | 0.84 | | 0.0015 | 5.0 | 562 | 0.6883 | 0.84 | | 0.001 | 6.0 | 675 | 0.5155 | 0.88 | | 0.0002 | 7.0 | 787 | 0.4624 | 0.9 | | 0.0002 | 8.0 | 900 | 0.3535 | 0.9 | | 0.1006 | 9.0 | 1012 | 0.3671 | 0.9 | | 0.0001 | 9.96 | 1120 | 0.3592 | 0.91 | 
### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
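The derived quantities above follow arithmetically from the base hyperparameters; a quick sketch using the numbers from this card (assuming the standard Hugging Face `Trainer` behavior, where warmup steps come from `warmup_ratio` times the total optimizer steps):

```python
train_batch_size = 2
gradient_accumulation_steps = 4

# Gradients are accumulated over 4 forward passes before each optimizer
# step, giving the effective batch size reported above.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 8

# With lr_scheduler_warmup_ratio = 0.1 and 1120 total optimizer steps
# (the last step in the results table), the learning rate ramps up
# linearly for the first 10% of training before decaying.
total_steps = 1120
warmup_steps = int(0.1 * total_steps)
print(warmup_steps)  # 112
```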
CyberHarem/sawada_marina_idolmastercinderellagirls
CyberHarem
2023-09-22T00:48:12Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/sawada_marina_idolmastercinderellagirls", "license:mit", "region:us" ]
text-to-image
2023-09-22T00:36:38Z
---
license: mit
datasets:
- CyberHarem/sawada_marina_idolmastercinderellagirls
pipeline_tag: text-to-image
tags:
- art
---

# Lora of sawada_marina_idolmastercinderellagirls

This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion), and the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs).

The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).

After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded as a Lora.

For example, if you want to use the model from step 3400, you need to download `3400/sawada_marina_idolmastercinderellagirls.pt` as the embedding and `3400/sawada_marina_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images of the desired character.

**The best step we recommend is 3400**, with a score of 0.949.

The trigger words are:

1. `sawada_marina_idolmastercinderellagirls`
2. `brown_eyes, brown_hair, smile, long_hair, breasts, open_mouth`

This model is not recommended for the following groups, and we express our regret:

1. Individuals who cannot tolerate any deviation from the original character design, even in the slightest detail.
2. Individuals whose application scenarios demand high accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that character models must be trained purely through manual operations to avoid disrespecting the characters.
5. 
Individuals who find the generated image content offensive to their values.

These are the available steps:

| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:------|:------|:---------|:----------|:----------|:----------|:----------|:-------|:--------|:-----|:-----|:-----|:-----|:------|:-----|:-------|
| 5100 | 0.935 | [Download](5100/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![pattern_2-5100](5100/previews/pattern_2.png) | ![pattern_3-5100](5100/previews/pattern_3.png) | ![pattern_4-5100](5100/previews/pattern_4.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) |
| 4760 | 0.910 | [Download](4760/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | ![pattern_2-4760](4760/previews/pattern_2.png) | ![pattern_3-4760](4760/previews/pattern_3.png) | ![pattern_4-4760](4760/previews/pattern_4.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | 
![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) | | 4420 | 0.924 | [Download](4420/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![pattern_2-4420](4420/previews/pattern_2.png) | ![pattern_3-4420](4420/previews/pattern_3.png) | ![pattern_4-4420](4420/previews/pattern_4.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) | | 4080 | 0.912 | [Download](4080/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![pattern_2-4080](4080/previews/pattern_2.png) | ![pattern_3-4080](4080/previews/pattern_3.png) | ![pattern_4-4080](4080/previews/pattern_4.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) | | 3740 | 0.912 | [Download](3740/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-3740](3740/previews/pattern_1.png) | ![pattern_2-3740](3740/previews/pattern_2.png) | ![pattern_3-3740](3740/previews/pattern_3.png) | ![pattern_4-3740](3740/previews/pattern_4.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to 
see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) | | **3400** | **0.949** | [**Download**](3400/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![pattern_2-3400](3400/previews/pattern_2.png) | ![pattern_3-3400](3400/previews/pattern_3.png) | ![pattern_4-3400](3400/previews/pattern_4.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) | | 3060 | 0.945 | [Download](3060/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-3060](3060/previews/pattern_1.png) | ![pattern_2-3060](3060/previews/pattern_2.png) | ![pattern_3-3060](3060/previews/pattern_3.png) | ![pattern_4-3060](3060/previews/pattern_4.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) | | 2720 | 0.873 | [Download](2720/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![pattern_2-2720](2720/previews/pattern_2.png) | ![pattern_3-2720](2720/previews/pattern_3.png) | ![pattern_4-2720](2720/previews/pattern_4.png) | ![bikini-2720](2720/previews/bikini.png) 
| [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) | | 2380 | 0.935 | [Download](2380/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | ![pattern_2-2380](2380/previews/pattern_2.png) | ![pattern_3-2380](2380/previews/pattern_3.png) | ![pattern_4-2380](2380/previews/pattern_4.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) | | 2040 | 0.947 | [Download](2040/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![pattern_2-2040](2040/previews/pattern_2.png) | ![pattern_3-2040](2040/previews/pattern_3.png) | ![pattern_4-2040](2040/previews/pattern_4.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) | | 1700 | 0.931 | [Download](1700/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![pattern_2-1700](1700/previews/pattern_2.png) | ![pattern_3-1700](1700/previews/pattern_3.png) | ![pattern_4-1700](1700/previews/pattern_4.png) | 
![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) | | 1360 | 0.943 | [Download](1360/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![pattern_2-1360](1360/previews/pattern_2.png) | ![pattern_3-1360](1360/previews/pattern_3.png) | ![pattern_4-1360](1360/previews/pattern_4.png) | ![bikini-1360](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) | | 1020 | 0.939 | [Download](1020/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![pattern_2-1020](1020/previews/pattern_2.png) | ![pattern_3-1020](1020/previews/pattern_3.png) | ![pattern_4-1020](1020/previews/pattern_4.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) | | 680 | 0.946 | [Download](680/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-680](680/previews/pattern_1.png) | ![pattern_2-680](680/previews/pattern_2.png) | ![pattern_3-680](680/previews/pattern_3.png) | ![pattern_4-680](680/previews/pattern_4.png) | 
![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) | | 340 | 0.765 | [Download](340/sawada_marina_idolmastercinderellagirls.zip) | ![pattern_1-340](340/previews/pattern_1.png) | ![pattern_2-340](340/previews/pattern_2.png) | ![pattern_3-340](340/previews/pattern_3.png) | ![pattern_4-340](340/previews/pattern_4.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
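The recommended step is simply the highest-scoring row in the table; that choice can be reproduced programmatically (step → score values copied from the table above):

```python
# Step -> score, copied from the table above.
scores = {
    5100: 0.935, 4760: 0.910, 4420: 0.924, 4080: 0.912, 3740: 0.912,
    3400: 0.949, 3060: 0.945, 2720: 0.873, 2380: 0.935, 2040: 0.947,
    1700: 0.931, 1360: 0.943, 1020: 0.939, 680: 0.946, 340: 0.765,
}

best_step = max(scores, key=scores.get)
print(best_step, scores[best_step])  # 3400 0.949 — the recommended checkpoint
```

Note that score is not monotone in step count: 680 and 2040 score nearly as well as 3400, while later checkpoints like 4760 drop off.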
luffycodes/higgs-llama-vicuna-ep25-70b
luffycodes
2023-09-22T00:34:15Z
1,498
2
transformers
[ "transformers", "pytorch", "llama", "text-generation", "chatgpt", "tutorbot", "physics", "code", "math", "mathematics", "en", "arxiv:2309.12161", "license:llama2", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-20T04:27:23Z
---
license: llama2
language:
- en
library_name: transformers
tags:
- chatgpt
- tutorbot
- physics
- code
- math
- mathematics
- llama
---

# Higgs Model Card

## Github details

Please check out the repo: https://github.com/luffycodes/Tutorbot-Spock-Phys.

## Model details

**Model type:**
Higgs is an open-source educational tutoring chatbot trained by fine-tuning the LLaMA-2-70B-chat model on synthetic student-tutorbot conversations generated using [specialized prompts](https://github.com/luffycodes/Tutorbot-Spock-Phys/tree/main/prompts/conversation_gen).

Higgs performs a code soliloquy (an inner monologue) in which it prompts itself through a series of prompts to decide whether its next response to the student needs any math calculation. If Higgs determines that the response might require such a calculation, for instance to verify a student's calculation, it outputs Python code. It then uses the output of the Python code to frame a mathematically accurate reply to the student.

**Model date:**
Higgs was trained between May 2023 and Sept 2023.

**Organizations developing the model:**
The Higgs (Spock) team with members from Rice University and OpenStax.

**Where to send questions or comments about the model:**
Shashank Sonkar (ss164@rice.edu)

If you use this work, please cite:
Code Soliloquies for Accurate Calculations in Large Language Models
https://arxiv.org/abs/2309.12161

```
@misc{sonkar2023code,
      title={Code Soliloquies for Accurate Calculations in Large Language Models},
      author={Shashank Sonkar and MyCo Le and Xinghe Chen and Lucy Liu and Debshila Basu Mallick and Richard G. Baraniuk},
      year={2023},
      eprint={2309.12161},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
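The actual Higgs pipeline lives in the linked repo; as a rough illustration of the code-soliloquy idea (all names and the sample response below are hypothetical, not the repo's API), one can extract a Python snippet from the model's inner monologue, execute it, and splice the numeric result into the tutor's reply:

```python
import re

def run_soliloquy(model_response: str) -> str:
    """Toy sketch: find a fenced python block in the model's inner
    monologue, execute it, and return the computed result as text.
    (The real Tutorbot-Spock pipeline is more involved; see the repo.)"""
    match = re.search(r"```python\n(.*?)```", model_response, re.DOTALL)
    if match is None:
        return ""                      # no calculation needed for this turn
    namespace: dict = {}
    exec(match.group(1), namespace)    # never do this with untrusted input
    return str(namespace.get("answer", ""))

# Simulated inner monologue verifying a student's kinematics arithmetic.
response = "Let me check the student's work.\n```python\nanswer = 0.5 * 9.8 * 2**2\n```"
print(run_soliloquy(response))  # 19.6
```

Running the generated code (rather than letting the LLM do arithmetic in-text) is the core point of the code-soliloquy paper: the final reply is framed around an exact computed value.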
jonas-luehrs/bert-base-uncased-MLM-fos-chemistry
jonas-luehrs
2023-09-22T00:13:32Z
124
0
transformers
[ "transformers", "pytorch", "bert", "fill-mask", "generated_from_trainer", "base_model:google-bert/bert-base-uncased", "base_model:finetune:google-bert/bert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
fill-mask
2023-09-22T00:02:09Z
--- license: apache-2.0 base_model: bert-base-uncased tags: - generated_from_trainer model-index: - name: bert-base-uncased-MLM-fos-chemistry results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-uncased-MLM-fos-chemistry This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 5.3995 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 7.1312 | 1.0 | 211 | 6.1375 | | 5.9819 | 2.0 | 422 | 5.7641 | | 5.7234 | 3.0 | 633 | 5.6039 | | 5.5776 | 4.0 | 844 | 5.4360 | | 5.4826 | 5.0 | 1055 | 5.4316 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
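Since the masked-LM objective is a cross-entropy measured in nats, the validation loss above maps directly to a (pseudo-)perplexity via `exp(loss)` — a quick way to read the training-results column:

```python
import math

eval_loss = 5.3995                  # final evaluation loss from this card
perplexity = math.exp(eval_loss)    # cross-entropy in nats -> perplexity
print(f"pseudo-perplexity ≈ {perplexity:.1f}")  # roughly 221; lower is better
```

By the same mapping, the first-epoch validation loss of 6.1375 corresponds to a pseudo-perplexity over 460, so continued domain pre-training roughly halved it.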
dwang-LI/segformer-b0-finetuned-segments-sidewalk-2
dwang-LI
2023-09-22T00:08:12Z
187
0
transformers
[ "transformers", "pytorch", "segformer", "vision", "image-segmentation", "generated_from_trainer", "base_model:nvidia/mit-b2", "base_model:finetune:nvidia/mit-b2", "license:other", "endpoints_compatible", "region:us" ]
image-segmentation
2023-09-21T18:54:13Z
--- license: other base_model: nvidia/mit-b2 tags: - vision - image-segmentation - generated_from_trainer model-index: - name: segformer-b0-finetuned-segments-sidewalk-2 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-2 This model is a fine-tuned version of [nvidia/mit-b2](https://huggingface.co/nvidia/mit-b2) on the segments/sidewalk-semantic dataset. It achieves the following results on the evaluation set: - Loss: 0.7413 - Mean Iou: 0.4159 - Mean Accuracy: 0.4945 - Overall Accuracy: 0.8774 - Accuracy Unlabeled: nan - Accuracy Flat-road: 0.8388 - Accuracy Flat-sidewalk: 0.9573 - Accuracy Flat-crosswalk: 0.8007 - Accuracy Flat-cyclinglane: 0.8552 - Accuracy Flat-parkingdriveway: 0.5279 - Accuracy Flat-railtrack: nan - Accuracy Flat-curb: 0.7175 - Accuracy Human-person: 0.8243 - Accuracy Human-rider: 0.0845 - Accuracy Vehicle-car: 0.9557 - Accuracy Vehicle-truck: 0.1300 - Accuracy Vehicle-bus: 0.0 - Accuracy Vehicle-tramtrain: 0.6660 - Accuracy Vehicle-motorcycle: 0.0883 - Accuracy Vehicle-bicycle: 0.7186 - Accuracy Vehicle-caravan: 0.0 - Accuracy Vehicle-cartrailer: 0.0195 - Accuracy Construction-building: 0.9361 - Accuracy Construction-door: 0.1788 - Accuracy Construction-wall: 0.6178 - Accuracy Construction-fenceguardrail: 0.3916 - Accuracy Construction-bridge: 0.5516 - Accuracy Construction-tunnel: nan - Accuracy Construction-stairs: 0.2882 - Accuracy Object-pole: 0.6120 - Accuracy Object-trafficsign: 0.5812 - Accuracy Object-trafficlight: 0.0 - Accuracy Nature-vegetation: 0.9537 - Accuracy Nature-terrain: 0.8377 - Accuracy Sky: 0.9844 - Accuracy Void-ground: 0.0132 - Accuracy Void-dynamic: 0.2116 - Accuracy Void-static: 0.4815 - Accuracy Void-unclear: 0.0 - Iou Unlabeled: nan - Iou Flat-road: 0.7575 - Iou Flat-sidewalk: 0.8765 - Iou Flat-crosswalk: 0.6860 - Iou 
Flat-cyclinglane: 0.7491 - Iou Flat-parkingdriveway: 0.4030 - Iou Flat-railtrack: nan - Iou Flat-curb: 0.5916 - Iou Human-person: 0.6521 - Iou Human-rider: 0.0488 - Iou Vehicle-car: 0.8662 - Iou Vehicle-truck: 0.1122 - Iou Vehicle-bus: 0.0 - Iou Vehicle-tramtrain: 0.4442 - Iou Vehicle-motorcycle: 0.0883 - Iou Vehicle-bicycle: 0.5874 - Iou Vehicle-caravan: 0.0 - Iou Vehicle-cartrailer: 0.0193 - Iou Construction-building: 0.7672 - Iou Construction-door: 0.1600 - Iou Construction-wall: 0.5304 - Iou Construction-fenceguardrail: 0.3282 - Iou Construction-bridge: 0.3920 - Iou Construction-tunnel: nan - Iou Construction-stairs: 0.2410 - Iou Object-pole: 0.5084 - Iou Object-trafficsign: 0.3941 - Iou Object-trafficlight: 0.0 - Iou Nature-vegetation: 0.8872 - Iou Nature-terrain: 0.7611 - Iou Sky: 0.9529 - Iou Void-ground: 0.0098 - Iou Void-dynamic: 0.1155 - Iou Void-static: 0.3780 - Iou Void-unclear: 0.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Training results | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy Vehicle-tramtrain | Accuracy Vehicle-motorcycle | Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | 
Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear | 
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | 1.6058 | 0.1 | 20 | 2.0687 | 0.0800 | 0.1356 | 0.6081 | nan | 0.4527 | 0.9100 | 0.0 | 0.0 | 0.0027 | nan | 0.0016 | 0.0075 | 0.0 | 0.8602 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9256 | 0.0 | 0.0197 | 0.0 
| 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8878 | 0.0039 | 0.2678 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.3225 | 0.6221 | 0.0 | 0.0 | 0.0027 | 0.0 | 0.0015 | 0.0075 | 0.0 | 0.4904 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4424 | 0.0 | 0.0196 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6175 | 0.0039 | 0.2675 | 0.0 | 0.0 | 0.0009 | 0.0 | | 1.0866 | 0.2 | 40 | 1.2653 | 0.1408 | 0.1825 | 0.6924 | nan | 0.5482 | 0.9667 | 0.0 | 0.0 | 0.0041 | nan | 0.0008 | 0.0357 | 0.0 | 0.8373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9144 | 0.0 | 0.0653 | 0.0 | 0.0 | nan | 0.0 | 0.0022 | 0.0 | 0.0 | 0.9105 | 0.6945 | 0.8432 | 0.0 | 0.0 | 0.0178 | 0.0 | nan | 0.4138 | 0.6618 | 0.0 | 0.0 | 0.0041 | nan | 0.0008 | 0.0352 | 0.0 | 0.6599 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5239 | 0.0 | 0.0645 | 0.0 | 0.0 | nan | 0.0 | 0.0022 | 0.0 | 0.0 | 0.7587 | 0.5554 | 0.8085 | 0.0 | 0.0 | 0.0172 | 0.0 | | 0.8624 | 0.3 | 60 | 1.0931 | 0.1666 | 0.2119 | 0.7183 | nan | 0.5402 | 0.9777 | 0.0 | 0.2995 | 0.0167 | nan | 0.0017 | 0.1723 | 0.0 | 0.9303 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8316 | 0.0 | 0.0853 | 0.0016 | 0.0 | nan | 0.0 | 0.2050 | 0.0 | 0.0 | 0.9005 | 0.8119 | 0.9414 | 0.0 | 0.0 | 0.0648 | 0.0 | nan | 0.4592 | 0.6670 | 0.0 | 0.2883 | 0.0163 | nan | 0.0017 | 0.1657 | 0.0 | 0.6319 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5774 | 0.0 | 0.0838 | 0.0016 | 0.0 | nan | 0.0 | 0.1763 | 0.0 | 0.0 | 0.7505 | 0.5903 | 0.8594 | 0.0 | 0.0 | 0.0622 | 0.0 | | 1.1933 | 0.4 | 80 | 0.9780 | 0.1780 | 0.2241 | 0.7326 | nan | 0.5494 | 0.9759 | 0.0 | 0.4403 | 0.0635 | nan | 0.0928 | 0.1458 | 0.0 | 0.9218 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9418 | 0.0 | 0.1182 | 0.0043 | 0.0 | nan | 0.0 | 0.2582 | 0.0 | 0.0 | 0.8457 | 0.8430 | 0.9439 | 0.0 | 0.0 | 0.0278 | 0.0 | nan | 0.4827 | 0.6919 | 0.0 | 0.3890 | 0.0580 | nan | 0.0852 | 0.1389 | 0.0 | 0.6634 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5628 | 0.0 | 0.1040 | 0.0043 | 0.0 | nan | 0.0 | 0.2140 | 0.0 | 0.0 | 0.7901 | 0.6139 | 
0.8700 | 0.0 | 0.0 | 0.0276 | 0.0 | | 0.8678 | 0.5 | 100 | 0.8980 | 0.2055 | 0.2559 | 0.7586 | nan | 0.7035 | 0.9240 | 0.0 | 0.5471 | 0.2248 | nan | 0.1919 | 0.4983 | 0.0 | 0.9434 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0122 | 0.0 | 0.0 | 0.8942 | 0.0 | 0.1942 | 0.0415 | 0.0 | nan | 0.0 | 0.3669 | 0.0 | 0.0 | 0.9329 | 0.7145 | 0.9508 | 0.0 | 0.0 | 0.0486 | 0.0 | nan | 0.5343 | 0.7617 | 0.0 | 0.4561 | 0.1464 | nan | 0.1592 | 0.4375 | 0.0 | 0.6418 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0122 | 0.0 | 0.0 | 0.5930 | 0.0 | 0.1698 | 0.0408 | 0.0 | nan | 0.0 | 0.2674 | 0.0 | 0.0 | 0.8039 | 0.6337 | 0.8700 | 0.0 | 0.0 | 0.0478 | 0.0 | | 0.8691 | 0.6 | 120 | 0.8161 | 0.2164 | 0.2640 | 0.7752 | nan | 0.7915 | 0.9325 | 0.0 | 0.5391 | 0.2122 | nan | 0.1662 | 0.4892 | 0.0 | 0.8716 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1287 | 0.0 | 0.0 | 0.9406 | 0.0 | 0.0777 | 0.1205 | 0.0 | nan | 0.0 | 0.3821 | 0.0 | 0.0 | 0.9227 | 0.8422 | 0.9483 | 0.0 | 0.0 | 0.0846 | 0.0 | nan | 0.5700 | 0.7869 | 0.0 | 0.4869 | 0.1733 | nan | 0.1470 | 0.3997 | 0.0 | 0.7279 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1273 | 0.0 | 0.0 | 0.5747 | 0.0 | 0.0749 | 0.1087 | 0.0 | nan | 0.0 | 0.2806 | 0.0 | 0.0 | 0.8250 | 0.6766 | 0.8837 | 0.0 | 0.0 | 0.0806 | 0.0 | | 0.5554 | 0.7 | 140 | 0.8194 | 0.2308 | 0.2859 | 0.7661 | nan | 0.5964 | 0.9383 | 0.0 | 0.5685 | 0.2664 | nan | 0.4518 | 0.6607 | 0.0 | 0.9273 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1700 | 0.0 | 0.0 | 0.9297 | 0.0 | 0.2699 | 0.1437 | 0.0 | nan | 0.0 | 0.3847 | 0.0 | 0.0 | 0.9183 | 0.8642 | 0.9349 | 0.0 | 0.0 | 0.1238 | 0.0 | nan | 0.4999 | 0.7466 | 0.0 | 0.4657 | 0.1921 | nan | 0.2892 | 0.4987 | 0.0 | 0.7165 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1661 | 0.0 | 0.0 | 0.6027 | 0.0 | 0.2471 | 0.1317 | 0.0 | nan | 0.0 | 0.2981 | 0.0 | 0.0 | 0.8326 | 0.6980 | 0.8876 | 0.0 | 0.0 | 0.1145 | 0.0 | | 0.794 | 0.8 | 160 | 0.7478 | 0.2403 | 0.2971 | 0.7860 | nan | 0.6689 | 0.9516 | 0.0000 | 0.6487 | 0.2353 | nan | 0.2907 | 0.7860 | 0.0 | 0.9162 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1873 | 0.0 | 0.0 | 0.8967 | 0.0 | 0.4288 | 0.1785 | 0.0 | nan | 
0.0 | 0.3825 | 0.0 | 0.0 | 0.9359 | 0.8514 | 0.9644 | 0.0 | 0.0 | 0.1844 | 0.0 | nan | 0.5519 | 0.7755 | 0.0000 | 0.4870 | 0.1750 | nan | 0.2347 | 0.4389 | 0.0 | 0.7562 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1836 | 0.0 | 0.0 | 0.6576 | 0.0 | 0.3686 | 0.1488 | 0.0 | nan | 0.0 | 0.3083 | 0.0 | 0.0 | 0.8332 | 0.7090 | 0.9028 | 0.0 | 0.0 | 0.1575 | 0.0 | | 0.6339 | 0.9 | 180 | 0.7390 | 0.2411 | 0.2973 | 0.7876 | nan | 0.8110 | 0.9249 | 0.0209 | 0.5771 | 0.3073 | nan | 0.3368 | 0.7894 | 0.0 | 0.8493 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5413 | 0.0 | 0.0 | 0.9288 | 0.0 | 0.1732 | 0.1312 | 0.0 | nan | 0.0 | 0.3136 | 0.0 | 0.0 | 0.9477 | 0.7834 | 0.9502 | 0.0 | 0.0 | 0.1273 | 0.0 | nan | 0.5789 | 0.8126 | 0.0209 | 0.4832 | 0.2278 | nan | 0.2592 | 0.3822 | 0.0 | 0.7646 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4843 | 0.0 | 0.0 | 0.6012 | 0.0 | 0.1659 | 0.1198 | 0.0 | nan | 0.0 | 0.2735 | 0.0 | 0.0 | 0.8253 | 0.7060 | 0.8938 | 0.0 | 0.0 | 0.1164 | 0.0 | | 0.8587 | 1.0 | 200 | 0.7084 | 0.2675 | 0.3230 | 0.8018 | nan | 0.7580 | 0.9389 | 0.3724 | 0.5680 | 0.3650 | nan | 0.3542 | 0.7193 | 0.0 | 0.9144 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5573 | 0.0 | 0.0 | 0.9250 | 0.0 | 0.3356 | 0.2530 | 0.0 | nan | 0.0 | 0.2509 | 0.0 | 0.0 | 0.9308 | 0.8555 | 0.9539 | 0.0 | 0.0 | 0.2823 | 0.0 | nan | 0.6263 | 0.7931 | 0.3623 | 0.5008 | 0.2209 | nan | 0.2872 | 0.4916 | 0.0 | 0.7620 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4623 | 0.0 | 0.0 | 0.6392 | 0.0 | 0.3075 | 0.1780 | 0.0 | nan | 0.0 | 0.2203 | 0.0 | 0.0 | 0.8510 | 0.7357 | 0.9055 | 0.0 | 0.0 | 0.2151 | 0.0 | | 0.5614 | 1.1 | 220 | 0.7561 | 0.2601 | 0.3206 | 0.7938 | nan | 0.6713 | 0.9499 | 0.2579 | 0.6298 | 0.3682 | nan | 0.3742 | 0.7721 | 0.0 | 0.9318 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5424 | 0.0 | 0.0 | 0.8845 | 0.0 | 0.4127 | 0.1613 | 0.0 | nan | 0.0 | 0.2406 | 0.0 | 0.0 | 0.9196 | 0.8874 | 0.9724 | 0.0 | 0.0 | 0.2822 | 0.0 | nan | 0.5910 | 0.7841 | 0.2544 | 0.4716 | 0.2352 | nan | 0.2894 | 0.4556 | 0.0 | 0.7663 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4722 | 0.0 | 0.0 | 0.6528 | 0.0 | 0.3492 | 0.1422 | 0.0 | 
nan | 0.0 | 0.2133 | 0.0 | 0.0 | 0.8336 | 0.6845 | 0.9031 | 0.0 | 0.0 | 0.2248 | 0.0 | | 0.6716 | 1.2 | 240 | 0.7154 | 0.2718 | 0.3453 | 0.7967 | nan | 0.6683 | 0.9515 | 0.6023 | 0.6991 | 0.3307 | nan | 0.4583 | 0.7933 | 0.0 | 0.9318 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5865 | 0.0 | 0.0 | 0.8739 | 0.0 | 0.3943 | 0.2839 | 0.0 | nan | 0.0 | 0.4104 | 0.0 | 0.0 | 0.8423 | 0.9039 | 0.9740 | 0.0 | 0.0 | 0.3438 | 0.0 | nan | 0.6036 | 0.8231 | 0.5148 | 0.4430 | 0.2387 | nan | 0.3350 | 0.4530 | 0.0 | 0.7768 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4575 | 0.0 | 0.0 | 0.6788 | 0.0 | 0.3633 | 0.1930 | 0.0 | nan | 0.0 | 0.3109 | 0.0 | 0.0 | 0.7773 | 0.5568 | 0.9052 | 0.0 | 0.0 | 0.2673 | 0.0 | | 0.5977 | 1.3 | 260 | 0.6926 | 0.2792 | 0.3446 | 0.8034 | nan | 0.6946 | 0.9427 | 0.6251 | 0.5124 | 0.4221 | nan | 0.4001 | 0.8085 | 0.0 | 0.8952 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5728 | 0.0 | 0.0 | 0.8277 | 0.0 | 0.6152 | 0.2122 | 0.0 | nan | 0.0 | 0.3902 | 0.0 | 0.0 | 0.9603 | 0.8804 | 0.9529 | 0.0 | 0.0 | 0.3160 | 0.0 | nan | 0.6006 | 0.7857 | 0.5078 | 0.4729 | 0.2514 | nan | 0.3110 | 0.4450 | 0.0 | 0.7790 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4750 | 0.0 | 0.0 | 0.6803 | 0.0 | 0.4791 | 0.1668 | 0.0 | nan | 0.0 | 0.2995 | 0.0 | 0.0 | 0.8278 | 0.7035 | 0.9083 | 0.0 | 0.0 | 0.2401 | 0.0 | | 0.323 | 1.4 | 280 | 0.6871 | 0.2769 | 0.3368 | 0.8095 | nan | 0.9001 | 0.8911 | 0.5783 | 0.5414 | 0.3860 | nan | 0.2742 | 0.7728 | 0.0 | 0.9198 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5951 | 0.0 | 0.0 | 0.9161 | 0.0 | 0.3427 | 0.1982 | 0.0 | nan | 0.0 | 0.3783 | 0.0 | 0.0 | 0.9283 | 0.8808 | 0.9731 | 0.0 | 0.0 | 0.3014 | 0.0 | nan | 0.6163 | 0.8266 | 0.5474 | 0.4864 | 0.2627 | nan | 0.2177 | 0.4875 | 0.0 | 0.7824 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4786 | 0.0 | 0.0 | 0.6533 | 0.0 | 0.3035 | 0.1736 | 0.0 | nan | 0.0 | 0.3035 | 0.0 | 0.0 | 0.8537 | 0.7231 | 0.9064 | 0.0 | 0.0 | 0.2376 | 0.0 | | 0.4141 | 1.5 | 300 | 0.6476 | 0.2867 | 0.3494 | 0.8218 | nan | 0.7932 | 0.9491 | 0.6643 | 0.6611 | 0.2548 | nan | 0.4516 | 0.7783 | 0.0 | 0.9142 | 0.0 | 0.0 | 0.0 | 0.0 
| 0.6078 | 0.0 | 0.0 | 0.8901 | 0.0 | 0.3304 | 0.2616 | 0.0 | nan | 0.0 | 0.4842 | 0.0 | 0.0 | 0.9445 | 0.7866 | 0.9691 | 0.0 | 0.0 | 0.4399 | 0.0 | nan | 0.6531 | 0.8299 | 0.5304 | 0.5258 | 0.2096 | nan | 0.3576 | 0.4954 | 0.0 | 0.7931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4708 | 0.0 | 0.0 | 0.6738 | 0.0 | 0.2988 | 0.1819 | 0.0 | nan | 0.0 | 0.3425 | 0.0 | 0.0 | 0.8504 | 0.7206 | 0.9123 | 0.0 | 0.0 | 0.3267 | 0.0 | | 0.3646 | 1.6 | 320 | 0.6528 | 0.2763 | 0.3318 | 0.8139 | nan | 0.7482 | 0.9494 | 0.6282 | 0.6090 | 0.3414 | nan | 0.5601 | 0.7720 | 0.0 | 0.8964 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5823 | 0.0 | 0.0 | 0.9521 | 0.0 | 0.2277 | 0.0532 | 0.0 | nan | 0.0 | 0.3867 | 0.0 | 0.0 | 0.9466 | 0.8266 | 0.9693 | 0.0 | 0.0 | 0.1695 | 0.0 | nan | 0.6420 | 0.8276 | 0.5175 | 0.5250 | 0.2451 | nan | 0.3864 | 0.5556 | 0.0 | 0.7824 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5152 | 0.0 | 0.0 | 0.6083 | 0.0 | 0.2015 | 0.0523 | 0.0 | nan | 0.0 | 0.3181 | 0.0 | 0.0 | 0.8566 | 0.7311 | 0.9165 | 0.0 | 0.0 | 0.1596 | 0.0 | | 0.9788 | 1.7 | 340 | 0.7683 | 0.2612 | 0.3304 | 0.7955 | nan | 0.5996 | 0.9751 | 0.3832 | 0.6502 | 0.2049 | nan | 0.4099 | 0.8274 | 0.0 | 0.9395 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7249 | 0.0 | 0.0 | 0.9074 | 0.0 | 0.2946 | 0.1679 | 0.0 | nan | 0.0 | 0.4672 | 0.0 | 0.0 | 0.9432 | 0.8381 | 0.9654 | 0.0 | 0.0 | 0.2729 | 0.0 | nan | 0.5599 | 0.7533 | 0.3669 | 0.5210 | 0.1659 | nan | 0.3300 | 0.3950 | 0.0 | 0.7744 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3063 | 0.0 | 0.0 | 0.6650 | 0.0 | 0.2691 | 0.1526 | 0.0 | nan | 0.0 | 0.3416 | 0.0 | 0.0 | 0.8595 | 0.7511 | 0.9171 | 0.0 | 0.0 | 0.2307 | 0.0 | | 0.6605 | 1.8 | 360 | 0.6275 | 0.2884 | 0.3441 | 0.8263 | nan | 0.8420 | 0.9395 | 0.5045 | 0.6022 | 0.2955 | nan | 0.4330 | 0.7870 | 0.0 | 0.8991 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6139 | 0.0 | 0.0 | 0.9320 | 0.0 | 0.5533 | 0.1601 | 0.0 | nan | 0.0 | 0.3684 | 0.0 | 0.0 | 0.9458 | 0.8448 | 0.9551 | 0.0 | 0.0 | 0.3342 | 0.0 | nan | 0.6410 | 0.8340 | 0.4894 | 0.5314 | 0.2450 | nan | 0.3572 | 0.4974 | 0.0 | 0.7906 | 0.0 | 0.0 | 0.0 | 
0.0 | 0.4876 | 0.0 | 0.0 | 0.6881 | 0.0 | 0.4509 | 0.1472 | 0.0 | nan | 0.0 | 0.3078 | 0.0 | 0.0 | 0.8575 | 0.7292 | 0.9075 | 0.0 | 0.0 | 0.2676 | 0.0 | | 0.7524 | 1.9 | 380 | 0.6273 | 0.2919 | 0.3560 | 0.8269 | nan | 0.8081 | 0.9440 | 0.6296 | 0.6118 | 0.3642 | nan | 0.4947 | 0.8006 | 0.0 | 0.9350 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6406 | 0.0 | 0.0 | 0.8665 | 0.0 | 0.4322 | 0.1933 | 0.0 | nan | 0.0 | 0.3940 | 0.0 | 0.0 | 0.9558 | 0.8249 | 0.9678 | 0.0 | 0.0 | 0.5301 | 0.0 | nan | 0.6452 | 0.8301 | 0.5816 | 0.5276 | 0.2781 | nan | 0.3737 | 0.4358 | 0.0 | 0.7892 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4562 | 0.0 | 0.0 | 0.6994 | 0.0 | 0.3982 | 0.1730 | 0.0 | nan | 0.0 | 0.3147 | 0.0 | 0.0 | 0.8553 | 0.7263 | 0.9197 | 0.0 | 0.0 | 0.3361 | 0.0 | | 1.2079 | 2.0 | 400 | 0.6490 | 0.2897 | 0.3486 | 0.8215 | nan | 0.7698 | 0.9312 | 0.5400 | 0.7152 | 0.5081 | nan | 0.4067 | 0.7781 | 0.0 | 0.8939 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6318 | 0.0 | 0.0 | 0.9512 | 0.0 | 0.4015 | 0.1934 | 0.0 | nan | 0.0 | 0.3986 | 0.0 | 0.0 | 0.9288 | 0.8495 | 0.9746 | 0.0 | 0.0 | 0.2818 | 0.0 | nan | 0.6377 | 0.8292 | 0.5056 | 0.5954 | 0.2895 | nan | 0.3449 | 0.5046 | 0.0 | 0.7963 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5140 | 0.0 | 0.0 | 0.6506 | 0.0 | 0.3486 | 0.1627 | 0.0 | nan | 0.0 | 0.3316 | 0.0 | 0.0 | 0.8600 | 0.7487 | 0.9086 | 0.0 | 0.0 | 0.2415 | 0.0 | | 0.405 | 2.1 | 420 | 0.6384 | 0.2937 | 0.3622 | 0.8234 | nan | 0.7031 | 0.9520 | 0.6652 | 0.7688 | 0.3272 | nan | 0.4249 | 0.7941 | 0.0 | 0.9555 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6440 | 0.0 | 0.0 | 0.8782 | 0.0 | 0.5263 | 0.2536 | 0.0 | nan | 0.0 | 0.4750 | 0.0 | 0.0 | 0.9257 | 0.8877 | 0.9717 | 0.0 | 0.0204 | 0.4167 | 0.0 | nan | 0.6183 | 0.8256 | 0.5422 | 0.5549 | 0.2588 | nan | 0.3615 | 0.4897 | 0.0 | 0.7693 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4770 | 0.0 | 0.0 | 0.6953 | 0.0 | 0.4538 | 0.1952 | 0.0 | nan | 0.0 | 0.3494 | 0.0 | 0.0 | 0.8578 | 0.7154 | 0.9221 | 0.0 | 0.0178 | 0.2943 | 0.0 | | 0.3988 | 2.2 | 440 | 0.5934 | 0.2969 | 0.3613 | 0.8282 | nan | 0.8282 | 0.9065 | 0.6606 | 0.7416 
| 0.4514 | nan | 0.3391 | 0.8057 | 0.0 | 0.9266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6561 | 0.0 | 0.0 | 0.9058 | 0.0 | 0.5015 | 0.2832 | 0.0 | nan | 0.0 | 0.3546 | 0.0 | 0.0 | 0.9493 | 0.8867 | 0.9751 | 0.0 | 0.0271 | 0.3634 | 0.0 | nan | 0.6390 | 0.8430 | 0.6169 | 0.5474 | 0.2994 | nan | 0.2954 | 0.4906 | 0.0 | 0.8041 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4835 | 0.0 | 0.0 | 0.7002 | 0.0 | 0.4378 | 0.2141 | 0.0 | nan | 0.0 | 0.3096 | 0.0 | 0.0 | 0.8614 | 0.7253 | 0.9219 | 0.0 | 0.0236 | 0.2891 | 0.0 | | 0.3143 | 2.3 | 460 | 0.6416 | 0.2938 | 0.3609 | 0.8219 | nan | 0.6598 | 0.9464 | 0.7305 | 0.7885 | 0.4049 | nan | 0.5647 | 0.7899 | 0.0 | 0.9416 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6645 | 0.0 | 0.0 | 0.9118 | 0.0 | 0.3080 | 0.1446 | 0.0 | nan | 0.0 | 0.4547 | 0.0013 | 0.0 | 0.9516 | 0.8457 | 0.9714 | 0.0 | 0.0180 | 0.4518 | 0.0 | nan | 0.6057 | 0.8265 | 0.6098 | 0.5589 | 0.2849 | nan | 0.4108 | 0.5416 | 0.0 | 0.7952 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4605 | 0.0 | 0.0 | 0.6670 | 0.0 | 0.2780 | 0.1332 | 0.0 | nan | 0.0 | 0.3446 | 0.0013 | 0.0 | 0.8621 | 0.7526 | 0.9232 | 0.0 | 0.0166 | 0.3277 | 0.0 | | 0.2928 | 2.4 | 480 | 0.6131 | 0.2982 | 0.3799 | 0.8252 | nan | 0.8027 | 0.9202 | 0.7321 | 0.6574 | 0.5363 | nan | 0.4619 | 0.8093 | 0.0 | 0.9384 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6897 | 0.0 | 0.0 | 0.7726 | 0.0 | 0.6742 | 0.3453 | 0.0 | nan | 0.0 | 0.5261 | 0.0073 | 0.0 | 0.9268 | 0.9278 | 0.9714 | 0.0 | 0.0291 | 0.4292 | 0.0 | nan | 0.6613 | 0.8264 | 0.5482 | 0.6088 | 0.3040 | nan | 0.3857 | 0.4958 | 0.0 | 0.7986 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3815 | 0.0 | 0.0 | 0.6863 | 0.0 | 0.4553 | 0.2230 | 0.0 | nan | 0.0 | 0.3542 | 0.0072 | 0.0 | 0.8510 | 0.6960 | 0.9259 | 0.0 | 0.0247 | 0.3074 | 0.0 | | 0.4599 | 2.5 | 500 | 0.6091 | 0.3002 | 0.3624 | 0.8349 | nan | 0.7736 | 0.9424 | 0.7071 | 0.7431 | 0.3981 | nan | 0.5366 | 0.7966 | 0.0 | 0.9173 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6335 | 0.0 | 0.0 | 0.9459 | 0.0 | 0.4041 | 0.2116 | 0.0 | nan | 0.0 | 0.4241 | 0.0093 | 0.0 | 0.9330 | 0.9141 | 0.9820 | 0.0 | 0.0143 | 0.3110 | 0.0 | 
nan | 0.6768 | 0.8398 | 0.5669 | 0.6214 | 0.2799 | nan | 0.4125 | 0.5186 | 0.0 | 0.8087 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4910 | 0.0 | 0.0 | 0.6632 | 0.0 | 0.3655 | 0.1828 | 0.0 | nan | 0.0 | 0.3481 | 0.0093 | 0.0 | 0.8663 | 0.7557 | 0.9166 | 0.0 | 0.0129 | 0.2718 | 0.0 | | 0.4748 | 2.6 | 520 | 0.6341 | 0.2957 | 0.3561 | 0.8299 | nan | 0.8109 | 0.9485 | 0.4931 | 0.6152 | 0.3592 | nan | 0.5488 | 0.8282 | 0.0 | 0.9279 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6551 | 0.0 | 0.0 | 0.9281 | 0.0 | 0.4952 | 0.1922 | 0.0 | nan | 0.0 | 0.4332 | 0.0063 | 0.0 | 0.9320 | 0.8761 | 0.9674 | 0.0 | 0.0217 | 0.3563 | 0.0 | nan | 0.6506 | 0.8292 | 0.4840 | 0.5546 | 0.2594 | nan | 0.4223 | 0.4598 | 0.0 | 0.8118 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4897 | 0.0 | 0.0 | 0.6874 | 0.0 | 0.4248 | 0.1705 | 0.0 | nan | 0.0 | 0.3533 | 0.0063 | 0.0 | 0.8680 | 0.7591 | 0.9237 | 0.0 | 0.0187 | 0.2890 | 0.0 | | 0.3089 | 2.7 | 540 | 0.6322 | 0.3046 | 0.3687 | 0.8342 | nan | 0.7583 | 0.9454 | 0.4860 | 0.7492 | 0.4990 | nan | 0.4709 | 0.8254 | 0.0 | 0.9293 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6323 | 0.0 | 0.0 | 0.8981 | 0.0107 | 0.5664 | 0.2286 | 0.0 | nan | 0.0 | 0.4848 | 0.0162 | 0.0 | 0.9488 | 0.8615 | 0.9829 | 0.0 | 0.0939 | 0.4090 | 0.0 | nan | 0.6480 | 0.8330 | 0.4586 | 0.6005 | 0.3357 | nan | 0.3928 | 0.4786 | 0.0 | 0.8041 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5014 | 0.0 | 0.0 | 0.7103 | 0.0106 | 0.4791 | 0.2035 | 0.0 | nan | 0.0 | 0.3768 | 0.0159 | 0.0 | 0.8618 | 0.7505 | 0.9134 | 0.0 | 0.0524 | 0.3200 | 0.0 | | 1.2466 | 2.8 | 560 | 0.6182 | 0.3016 | 0.3657 | 0.8295 | nan | 0.8185 | 0.9364 | 0.4282 | 0.6726 | 0.4169 | nan | 0.5586 | 0.8235 | 0.0 | 0.9237 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6627 | 0.0 | 0.0 | 0.9047 | 0.0223 | 0.4594 | 0.3392 | 0.0 | nan | 0.0 | 0.4488 | 0.0188 | 0.0 | 0.9480 | 0.7482 | 0.9804 | 0.0000 | 0.0888 | 0.5031 | 0.0 | nan | 0.6578 | 0.8283 | 0.4222 | 0.6020 | 0.3158 | nan | 0.4153 | 0.5162 | 0.0 | 0.8096 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4946 | 0.0 | 0.0 | 0.7006 | 0.0211 | 0.4068 | 0.2277 | 0.0 | nan | 0.0 | 0.3612 | 0.0184 | 
0.0 | 0.8493 | 0.6842 | 0.9195 | 0.0000 | 0.0515 | 0.3479 | 0.0 | | 0.3471 | 2.9 | 580 | 0.6088 | 0.3130 | 0.3846 | 0.8405 | nan | 0.7740 | 0.9325 | 0.8216 | 0.7382 | 0.5086 | nan | 0.5655 | 0.8194 | 0.0 | 0.9507 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6233 | 0.0 | 0.0 | 0.9001 | 0.0216 | 0.5836 | 0.3403 | 0.0 | nan | 0.0 | 0.3942 | 0.0267 | 0.0 | 0.9301 | 0.8750 | 0.9710 | 0.0 | 0.0167 | 0.5157 | 0.0 | nan | 0.6756 | 0.8458 | 0.4939 | 0.6495 | 0.3248 | nan | 0.4361 | 0.5502 | 0.0 | 0.7944 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4773 | 0.0 | 0.0 | 0.7238 | 0.0215 | 0.4953 | 0.2605 | 0.0 | nan | 0.0 | 0.3396 | 0.0261 | 0.0 | 0.8604 | 0.7330 | 0.9247 | 0.0 | 0.0149 | 0.3694 | 0.0 | | 0.447 | 3.0 | 600 | 0.6063 | 0.3064 | 0.3674 | 0.8408 | nan | 0.7719 | 0.9575 | 0.7510 | 0.7675 | 0.3209 | nan | 0.5208 | 0.8243 | 0.0 | 0.9409 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6379 | 0.0 | 0.0 | 0.8865 | 0.0032 | 0.4287 | 0.1902 | 0.0 | nan | 0.0 | 0.4460 | 0.0301 | 0.0 | 0.9650 | 0.8454 | 0.9800 | 0.0000 | 0.0084 | 0.4821 | 0.0 | nan | 0.6733 | 0.8440 | 0.6036 | 0.6185 | 0.2724 | nan | 0.4230 | 0.4980 | 0.0 | 0.8106 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5051 | 0.0 | 0.0 | 0.7000 | 0.0032 | 0.3955 | 0.1754 | 0.0 | nan | 0.0 | 0.3558 | 0.0298 | 0.0 | 0.8493 | 0.7531 | 0.9206 | 0.0000 | 0.0075 | 0.3646 | 0.0 | | 0.3025 | 3.1 | 620 | 0.6267 | 0.3100 | 0.3841 | 0.8346 | nan | 0.7058 | 0.9394 | 0.7532 | 0.8210 | 0.4279 | nan | 0.5254 | 0.8072 | 0.0 | 0.9292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6649 | 0.0 | 0.0 | 0.8850 | 0.0367 | 0.6998 | 0.3119 | 0.0 | nan | 0.0 | 0.4649 | 0.0061 | 0.0 | 0.9281 | 0.8715 | 0.9689 | 0.0019 | 0.0069 | 0.5364 | 0.0 | nan | 0.6314 | 0.8461 | 0.5290 | 0.5602 | 0.3011 | nan | 0.4210 | 0.5059 | 0.0 | 0.8239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4858 | 0.0 | 0.0 | 0.7287 | 0.0366 | 0.5161 | 0.2431 | 0.0 | nan | 0.0 | 0.3731 | 0.0061 | 0.0 | 0.8683 | 0.7459 | 0.9288 | 0.0018 | 0.0063 | 0.3594 | 0.0 | | 0.5402 | 3.2 | 640 | 0.6114 | 0.3148 | 0.3742 | 0.8413 | nan | 0.8162 | 0.9555 | 0.7193 | 0.6148 | 0.4400 | nan | 0.5070 | 
0.8157 | 0.0 | 0.9328 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6549 | 0.0 | 0.0 | 0.9089 | 0.0828 | 0.4958 | 0.3236 | 0.0 | nan | 0.0 | 0.3724 | 0.0483 | 0.0 | 0.9490 | 0.8336 | 0.9762 | 0.0 | 0.0212 | 0.5066 | 0.0 | nan | 0.6801 | 0.8399 | 0.6177 | 0.5636 | 0.3160 | nan | 0.4129 | 0.5409 | 0.0 | 0.8207 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5088 | 0.0 | 0.0 | 0.7141 | 0.0751 | 0.4367 | 0.2617 | 0.0 | nan | 0.0 | 0.3152 | 0.0471 | 0.0 | 0.8678 | 0.7556 | 0.9279 | 0.0 | 0.0185 | 0.3540 | 0.0 | | 0.3071 | 3.3 | 660 | 0.6226 | 0.3122 | 0.3814 | 0.8409 | nan | 0.7812 | 0.9437 | 0.7351 | 0.7543 | 0.4146 | nan | 0.5299 | 0.8407 | 0.0 | 0.9495 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7120 | 0.0 | 0.0 | 0.9031 | 0.0202 | 0.4934 | 0.3704 | 0.0 | nan | 0.0 | 0.3671 | 0.0979 | 0.0 | 0.9470 | 0.8862 | 0.9775 | 0.0 | 0.0764 | 0.4034 | 0.0 | nan | 0.6701 | 0.8389 | 0.5992 | 0.6383 | 0.3214 | nan | 0.4274 | 0.5242 | 0.0 | 0.8016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4534 | 0.0 | 0.0 | 0.7129 | 0.0199 | 0.4292 | 0.2599 | 0.0 | nan | 0.0 | 0.2979 | 0.0930 | 0.0 | 0.8654 | 0.7525 | 0.9216 | 0.0 | 0.0545 | 0.3106 | 0.0 | | 0.2812 | 3.4 | 680 | 0.5891 | 0.3154 | 0.3853 | 0.8385 | nan | 0.7691 | 0.9259 | 0.7320 | 0.7075 | 0.6098 | nan | 0.6156 | 0.8274 | 0.0 | 0.9228 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6670 | 0.0 | 0.0 | 0.8950 | 0.0003 | 0.7067 | 0.2373 | 0.0 | nan | 0.0 | 0.3845 | 0.0771 | 0.0 | 0.9436 | 0.8482 | 0.9784 | 0.0002 | 0.0573 | 0.4248 | 0.0 | nan | 0.6809 | 0.8459 | 0.6360 | 0.6308 | 0.2685 | nan | 0.4481 | 0.5109 | 0.0 | 0.8121 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4987 | 0.0 | 0.0 | 0.7307 | 0.0003 | 0.5476 | 0.1996 | 0.0 | nan | 0.0 | 0.3162 | 0.0746 | 0.0 | 0.8696 | 0.7530 | 0.9245 | 0.0002 | 0.0378 | 0.3082 | 0.0 | | 0.4997 | 3.5 | 700 | 0.5982 | 0.3166 | 0.3801 | 0.8448 | nan | 0.8607 | 0.9301 | 0.7868 | 0.6661 | 0.4482 | nan | 0.5705 | 0.7812 | 0.0 | 0.9462 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6700 | 0.0 | 0.0 | 0.9333 | 0.0002 | 0.4569 | 0.2193 | 0.0 | nan | 0.0 | 0.4467 | 0.1375 | 0.0 | 0.9381 | 0.8742 | 0.9743 | 0.0 | 0.0891 | 0.4326 
| 0.0 | nan | 0.7075 | 0.8577 | 0.6178 | 0.5973 | 0.2946 | nan | 0.4482 | 0.5701 | 0.0 | 0.8044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4950 | 0.0 | 0.0 | 0.6931 | 0.0002 | 0.4124 | 0.1985 | 0.0 | nan | 0.0 | 0.3601 | 0.1275 | 0.0 | 0.8717 | 0.7650 | 0.9302 | 0.0 | 0.0483 | 0.3308 | 0.0 | | 0.3472 | 3.6 | 720 | 0.6052 | 0.3213 | 0.3866 | 0.8432 | nan | 0.7653 | 0.9485 | 0.7447 | 0.7379 | 0.4858 | nan | 0.6064 | 0.8020 | 0.0 | 0.9290 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6426 | 0.0 | 0.0 | 0.9297 | 0.0 | 0.6149 | 0.3437 | 0.0 | nan | 0.0 | 0.5040 | 0.1435 | 0.0 | 0.9358 | 0.8249 | 0.9786 | 0.0 | 0.1058 | 0.3279 | 0.0 | nan | 0.6746 | 0.8433 | 0.6430 | 0.6389 | 0.3082 | nan | 0.4471 | 0.5350 | 0.0 | 0.8149 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5024 | 0.0 | 0.0 | 0.7226 | 0.0 | 0.4948 | 0.2599 | 0.0 | nan | 0.0 | 0.3883 | 0.1319 | 0.0 | 0.8688 | 0.7569 | 0.9257 | 0.0 | 0.0589 | 0.2650 | 0.0 | | 0.4252 | 3.7 | 740 | 0.6622 | 0.3123 | 0.3891 | 0.8234 | nan | 0.6217 | 0.9484 | 0.7138 | 0.8309 | 0.4263 | nan | 0.5506 | 0.8600 | 0.0 | 0.9525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6234 | 0.0 | 0.0 | 0.8508 | 0.0084 | 0.5880 | 0.3145 | 0.0 | nan | 0.0 | 0.4943 | 0.2299 | 0.0 | 0.9380 | 0.9003 | 0.9850 | 0.0 | 0.2103 | 0.4036 | 0.0 | nan | 0.5607 | 0.8413 | 0.6569 | 0.4842 | 0.3210 | nan | 0.4427 | 0.4777 | 0.0 | 0.7951 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4647 | 0.0 | 0.0 | 0.7209 | 0.0084 | 0.5126 | 0.2370 | 0.0 | nan | 0.0 | 0.3795 | 0.2061 | 0.0 | 0.8637 | 0.7505 | 0.9223 | 0.0 | 0.0647 | 0.2846 | 0.0 | | 0.3308 | 3.8 | 760 | 0.6377 | 0.3127 | 0.3808 | 0.8289 | nan | 0.6214 | 0.9625 | 0.5946 | 0.8099 | 0.4454 | nan | 0.5188 | 0.8494 | 0.0 | 0.9136 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6696 | 0.0 | 0.0 | 0.9263 | 0.0024 | 0.6152 | 0.2950 | 0.0 | nan | 0.0027 | 0.4453 | 0.1733 | 0.0 | 0.9520 | 0.8743 | 0.9699 | 0.0004 | 0.1977 | 0.3462 | 0.0 | nan | 0.5763 | 0.8251 | 0.5772 | 0.5411 | 0.3457 | nan | 0.4107 | 0.4836 | 0.0 | 0.8239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5117 | 0.0 | 0.0 | 0.7262 | 0.0024 | 0.5110 | 0.2294 | 0.0 | nan | 0.0026 | 
0.3668 | 0.1574 | 0.0 | 0.8705 | 0.7696 | 0.9339 | 0.0004 | 0.0602 | 0.2801 | 0.0 | | 0.4693 | 3.9 | 780 | 0.5859 | 0.3216 | 0.3778 | 0.8472 | nan | 0.7901 | 0.9565 | 0.7815 | 0.7387 | 0.4418 | nan | 0.5939 | 0.8066 | 0.0 | 0.9374 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5633 | 0.0 | 0.0 | 0.9404 | 0.0047 | 0.4300 | 0.1846 | 0.0 | nan | 0.0062 | 0.5028 | 0.1681 | 0.0 | 0.9364 | 0.8907 | 0.9751 | 0.0 | 0.0659 | 0.3760 | 0.0 | nan | 0.6844 | 0.8378 | 0.6638 | 0.6608 | 0.3533 | nan | 0.4678 | 0.5651 | 0.0 | 0.8096 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5098 | 0.0 | 0.0 | 0.7056 | 0.0047 | 0.4085 | 0.1746 | 0.0 | nan | 0.0060 | 0.3747 | 0.1550 | 0.0 | 0.8729 | 0.7620 | 0.9327 | 0.0 | 0.0350 | 0.3067 | 0.0 | | 0.4175 | 4.0 | 800 | 0.5300 | 0.3304 | 0.3990 | 0.8474 | nan | 0.8728 | 0.9146 | 0.8169 | 0.6208 | 0.5492 | nan | 0.5932 | 0.8393 | 0.0 | 0.9432 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6227 | 0.0 | 0.0 | 0.8618 | 0.0183 | 0.7515 | 0.3741 | 0.0 | nan | 0.0 | 0.4608 | 0.2043 | 0.0 | 0.9611 | 0.8383 | 0.9787 | 0.0 | 0.1044 | 0.4406 | 0.0 | nan | 0.7017 | 0.8579 | 0.6370 | 0.5716 | 0.3295 | nan | 0.4682 | 0.5922 | 0.0 | 0.8053 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5317 | 0.0 | 0.0 | 0.7427 | 0.0183 | 0.5577 | 0.2733 | 0.0 | nan | 0.0 | 0.3651 | 0.1856 | 0.0 | 0.8612 | 0.7375 | 0.9269 | 0.0 | 0.0746 | 0.3346 | 0.0 | | 0.2417 | 4.1 | 820 | 0.6240 | 0.3189 | 0.3775 | 0.8361 | nan | 0.8131 | 0.9538 | 0.7682 | 0.4531 | 0.4900 | nan | 0.5241 | 0.8388 | 0.0 | 0.9496 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5933 | 0.0 | 0.0 | 0.9136 | 0.0182 | 0.4895 | 0.2692 | 0.0 | nan | 0.0003 | 0.4741 | 0.2422 | 0.0 | 0.9537 | 0.8414 | 0.9750 | 0.0 | 0.0438 | 0.4760 | 0.0 | nan | 0.6420 | 0.8400 | 0.6500 | 0.4374 | 0.3497 | nan | 0.4397 | 0.5665 | 0.0 | 0.8008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5243 | 0.0 | 0.0 | 0.7188 | 0.0182 | 0.4395 | 0.2415 | 0.0 | nan | 0.0003 | 0.3859 | 0.2129 | 0.0 | 0.8699 | 0.7533 | 0.9329 | 0.0 | 0.0325 | 0.3490 | 0.0 | | 0.2375 | 4.2 | 840 | 0.5756 | 0.3343 | 0.4028 | 0.8498 | nan | 0.8028 | 0.9399 | 0.7560 | 0.8128 | 
0.5208 | nan | 0.5758 | 0.8137 | 0.0 | 0.9266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6480 | 0.0 | 0.0 | 0.9155 | 0.0155 | 0.5142 | 0.4670 | 0.0 | nan | 0.0 | 0.4630 | 0.3347 | 0.0 | 0.9205 | 0.8892 | 0.9875 | 0.0 | 0.1196 | 0.4656 | 0.0 | nan | 0.7055 | 0.8568 | 0.6719 | 0.6537 | 0.3342 | nan | 0.4664 | 0.5856 | 0.0 | 0.8130 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5293 | 0.0 | 0.0 | 0.7177 | 0.0154 | 0.4507 | 0.2814 | 0.0 | nan | 0.0 | 0.3815 | 0.2734 | 0.0 | 0.8704 | 0.7453 | 0.9204 | 0.0 | 0.0711 | 0.3540 | 0.0 | | 0.4241 | 4.3 | 860 | 0.5682 | 0.3289 | 0.3927 | 0.8526 | nan | 0.8019 | 0.9477 | 0.7645 | 0.7864 | 0.4614 | nan | 0.6374 | 0.8213 | 0.0 | 0.9414 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6449 | 0.0 | 0.0 | 0.9389 | 0.0015 | 0.5075 | 0.1875 | 0.0 | nan | 0.0060 | 0.4489 | 0.3299 | 0.0 | 0.9467 | 0.8621 | 0.9705 | 0.0 | 0.1149 | 0.4442 | 0.0 | nan | 0.7105 | 0.8518 | 0.6212 | 0.6864 | 0.3408 | nan | 0.4810 | 0.5663 | 0.0 | 0.8198 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4875 | 0.0 | 0.0 | 0.7131 | 0.0015 | 0.4607 | 0.1773 | 0.0 | nan | 0.0057 | 0.3741 | 0.2755 | 0.0 | 0.8731 | 0.7626 | 0.9348 | 0.0 | 0.0422 | 0.3375 | 0.0 | | 0.5282 | 4.4 | 880 | 0.6106 | 0.3241 | 0.3981 | 0.8456 | nan | 0.7704 | 0.9356 | 0.8287 | 0.8018 | 0.5745 | nan | 0.5025 | 0.7925 | 0.0 | 0.9564 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6370 | 0.0 | 0.0 | 0.9097 | 0.0360 | 0.5623 | 0.4363 | 0.0 | nan | 0.0 | 0.4484 | 0.2753 | 0.0 | 0.9503 | 0.7920 | 0.9724 | 0.0009 | 0.1229 | 0.4328 | 0.0 | nan | 0.7081 | 0.8635 | 0.4271 | 0.6627 | 0.3582 | nan | 0.4362 | 0.6102 | 0.0 | 0.8004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4666 | 0.0 | 0.0 | 0.7321 | 0.0353 | 0.4640 | 0.2690 | 0.0 | nan | 0.0 | 0.3772 | 0.2455 | 0.0 | 0.8642 | 0.7079 | 0.9363 | 0.0009 | 0.0663 | 0.3381 | 0.0 | | 0.3367 | 4.5 | 900 | 0.5852 | 0.3273 | 0.3859 | 0.8544 | nan | 0.8327 | 0.9558 | 0.6822 | 0.7709 | 0.3897 | nan | 0.6299 | 0.7857 | 0.0 | 0.9039 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6906 | 0.0 | 0.0 | 0.9428 | 0.0104 | 0.4495 | 0.3420 | 0.0 | nan | 0.0 | 0.4741 | 0.2323 | 0.0 | 0.9426 | 0.8893 | 
0.9785 | 0.0 | 0.0100 | 0.4372 | 0.0 | nan | 0.7217 | 0.8512 | 0.6538 | 0.6746 | 0.3280 | nan | 0.4868 | 0.5682 | 0.0 | 0.8238 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4554 | 0.0 | 0.0 | 0.7065 | 0.0103 | 0.4132 | 0.2616 | 0.0 | nan | 0.0 | 0.3935 | 0.2107 | 0.0 | 0.8720 | 0.7465 | 0.9320 | 0.0 | 0.0089 | 0.3559 | 0.0 | | 0.1462 | 4.6 | 920 | 0.5898 | 0.3302 | 0.3945 | 0.8517 | nan | 0.8338 | 0.9321 | 0.7807 | 0.7720 | 0.5273 | nan | 0.5959 | 0.8227 | 0.0 | 0.9378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7047 | 0.0 | 0.0 | 0.9415 | 0.0315 | 0.4600 | 0.3444 | 0.0 | nan | 0.0 | 0.4654 | 0.2423 | 0.0 | 0.9457 | 0.8191 | 0.9787 | 0.0002 | 0.0175 | 0.4721 | 0.0 | nan | 0.7254 | 0.8518 | 0.6417 | 0.6844 | 0.3375 | nan | 0.4761 | 0.5773 | 0.0 | 0.8160 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4945 | 0.0 | 0.0 | 0.7011 | 0.0308 | 0.4212 | 0.2757 | 0.0 | nan | 0.0 | 0.3946 | 0.2157 | 0.0 | 0.8680 | 0.7422 | 0.9321 | 0.0002 | 0.0144 | 0.3644 | 0.0 | | 0.4018 | 4.7 | 940 | 0.6261 | 0.3313 | 0.4006 | 0.8471 | nan | 0.7361 | 0.9560 | 0.8252 | 0.7443 | 0.4880 | nan | 0.5874 | 0.7623 | 0.0 | 0.9292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6837 | 0.0 | 0.0 | 0.9315 | 0.0560 | 0.5691 | 0.3433 | 0.0 | nan | 0.0 | 0.4978 | 0.3762 | 0.0 | 0.9313 | 0.9096 | 0.9814 | 0.0004 | 0.0463 | 0.4639 | 0.0 | nan | 0.6786 | 0.8459 | 0.4753 | 0.6694 | 0.3442 | nan | 0.4817 | 0.6111 | 0.0 | 0.8128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4944 | 0.0 | 0.0 | 0.7258 | 0.0547 | 0.4986 | 0.2692 | 0.0 | nan | 0.0 | 0.3990 | 0.3001 | 0.0 | 0.8752 | 0.7656 | 0.9259 | 0.0004 | 0.0314 | 0.3436 | 0.0 | | 0.4323 | 4.8 | 960 | 0.6071 | 0.3369 | 0.4009 | 0.8527 | nan | 0.8909 | 0.9288 | 0.7706 | 0.7429 | 0.3946 | nan | 0.5634 | 0.8032 | 0.0 | 0.9419 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6484 | 0.0 | 0.0 | 0.8991 | 0.1415 | 0.6081 | 0.2444 | 0.0 | nan | 0.0739 | 0.5198 | 0.4198 | 0.0 | 0.9632 | 0.7166 | 0.9803 | 0.0110 | 0.0255 | 0.5409 | 0.0 | nan | 0.7203 | 0.8627 | 0.6071 | 0.6653 | 0.3017 | nan | 0.4226 | 0.5991 | 0.0 | 0.8170 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4908 | 0.0 | 0.0 | 0.7391 | 
0.1318 | 0.5113 | 0.2209 | 0.0 | nan | 0.0611 | 0.4137 | 0.3238 | 0.0 | 0.8530 | 0.6852 | 0.9328 | 0.0093 | 0.0178 | 0.3944 | 0.0 | | 0.2333 | 4.9 | 980 | 0.6312 | 0.3291 | 0.4049 | 0.8368 | nan | 0.6661 | 0.9599 | 0.7668 | 0.8162 | 0.4021 | nan | 0.5460 | 0.8279 | 0.0 | 0.9377 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6167 | 0.0 | 0.0 | 0.8637 | 0.0490 | 0.7147 | 0.3548 | 0.0 | nan | 0.0487 | 0.4632 | 0.4447 | 0.0 | 0.9449 | 0.8856 | 0.9798 | 0.0024 | 0.2556 | 0.4098 | 0.0 | nan | 0.6061 | 0.8459 | 0.6123 | 0.5262 | 0.3227 | nan | 0.4682 | 0.5298 | 0.0 | 0.8213 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4777 | 0.0 | 0.0 | 0.7401 | 0.0488 | 0.5668 | 0.2560 | 0.0 | nan | 0.0474 | 0.3885 | 0.3212 | 0.0 | 0.8738 | 0.7688 | 0.9317 | 0.0021 | 0.0662 | 0.3102 | 0.0 | | 0.322 | 5.0 | 1000 | 0.5919 | 0.3324 | 0.3983 | 0.8527 | nan | 0.8795 | 0.9454 | 0.7501 | 0.6501 | 0.4403 | nan | 0.6006 | 0.8655 | 0.0 | 0.9171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6457 | 0.0 | 0.0 | 0.9125 | 0.0151 | 0.4907 | 0.2739 | 0.0 | nan | 0.0 | 0.5307 | 0.4038 | 0.0 | 0.9539 | 0.8585 | 0.9824 | 0.0017 | 0.2231 | 0.4063 | 0.0 | nan | 0.7358 | 0.8591 | 0.6885 | 0.6124 | 0.2908 | nan | 0.4820 | 0.5026 | 0.0 | 0.8247 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5430 | 0.0 | 0.0 | 0.7227 | 0.0151 | 0.4502 | 0.2349 | 0.0 | nan | 0.0 | 0.4150 | 0.3229 | 0.0 | 0.8710 | 0.7707 | 0.9332 | 0.0017 | 0.0580 | 0.3015 | 0.0 | | 0.3299 | 5.1 | 1020 | 0.5630 | 0.3381 | 0.4067 | 0.8554 | nan | 0.8362 | 0.9417 | 0.7732 | 0.7712 | 0.4461 | nan | 0.6305 | 0.8050 | 0.0042 | 0.9430 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.6503 | 0.0 | 0.0 | 0.9244 | 0.0045 | 0.5766 | 0.3341 | 0.0 | nan | 0.0 | 0.5153 | 0.4275 | 0.0 | 0.9401 | 0.8734 | 0.9810 | 0.0002 | 0.2593 | 0.3754 | 0.0 | nan | 0.7239 | 0.8683 | 0.6261 | 0.6428 | 0.3037 | nan | 0.5023 | 0.6126 | 0.0030 | 0.8145 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.5157 | 0.0 | 0.0 | 0.7251 | 0.0044 | 0.4858 | 0.2659 | 0.0 | nan | 0.0 | 0.4109 | 0.3378 | 0.0 | 0.8759 | 0.7719 | 0.9339 | 0.0002 | 0.0783 | 0.3135 | 0.0 | | 0.2486 | 5.2 | 1040 | 0.5781 | 
0.3354 | 0.4008 | 0.8553 | nan | 0.8114 | 0.9444 | 0.7999 | 0.8031 | 0.4525 | nan | 0.5980 | 0.7806 | 0.0264 | 0.9194 | 0.0031 | 0.0 | 0.0 | 0.0 | 0.6433 | 0.0 | 0.0 | 0.9335 | 0.0814 | 0.5906 | 0.2020 | 0.0 | nan | 0.0002 | 0.4619 | 0.4333 | 0.0 | 0.9473 | 0.8681 | 0.9758 | 0.0013 | 0.0796 | 0.4692 | 0.0 | nan | 0.7074 | 0.8746 | 0.5776 | 0.6471 | 0.3072 | nan | 0.4838 | 0.5905 | 0.0113 | 0.8288 | 0.0031 | 0.0 | 0.0 | 0.0 | 0.5141 | 0.0 | 0.0 | 0.7319 | 0.0772 | 0.4996 | 0.1780 | 0.0 | nan | 0.0002 | 0.3924 | 0.3248 | 0.0 | 0.8743 | 0.7747 | 0.9360 | 0.0012 | 0.0425 | 0.3553 | 0.0 | | 0.2187 | 5.3 | 1060 | 0.5951 | 0.3310 | 0.3996 | 0.8521 | nan | 0.8741 | 0.9406 | 0.7747 | 0.6985 | 0.4974 | nan | 0.5988 | 0.7802 | 0.0157 | 0.9384 | 0.0049 | 0.0 | 0.0 | 0.0 | 0.6773 | 0.0 | 0.0 | 0.9064 | 0.1097 | 0.4512 | 0.2733 | 0.0 | nan | 0.0080 | 0.3901 | 0.4226 | 0.0 | 0.9569 | 0.8016 | 0.9778 | 0.0022 | 0.2137 | 0.4736 | 0.0 | nan | 0.7341 | 0.8672 | 0.6043 | 0.6460 | 0.3411 | nan | 0.4676 | 0.5519 | 0.0038 | 0.8196 | 0.0048 | 0.0 | 0.0 | 0.0 | 0.5047 | 0.0 | 0.0 | 0.7232 | 0.1031 | 0.4142 | 0.2260 | 0.0 | nan | 0.0077 | 0.3316 | 0.3239 | 0.0 | 0.8641 | 0.7379 | 0.9360 | 0.0020 | 0.0542 | 0.3228 | 0.0 | | 1.1803 | 5.4 | 1080 | 0.6495 | 0.3336 | 0.3988 | 0.8391 | nan | 0.6558 | 0.9554 | 0.7805 | 0.8250 | 0.5203 | nan | 0.5736 | 0.8174 | 0.0056 | 0.9396 | 0.0066 | 0.0 | 0.0 | 0.0 | 0.6255 | 0.0 | 0.0 | 0.9273 | 0.0083 | 0.5809 | 0.3201 | 0.0 | nan | 0.0 | 0.5183 | 0.4578 | 0.0 | 0.9433 | 0.8644 | 0.9790 | 0.0010 | 0.0665 | 0.3882 | 0.0 | nan | 0.6035 | 0.8349 | 0.6789 | 0.5849 | 0.3641 | nan | 0.4660 | 0.5953 | 0.0038 | 0.8272 | 0.0064 | 0.0 | 0.0 | 0.0 | 0.5366 | 0.0 | 0.0 | 0.7289 | 0.0081 | 0.4550 | 0.2526 | 0.0 | nan | 0.0 | 0.4133 | 0.3445 | 0.0 | 0.8748 | 0.7729 | 0.9337 | 0.0010 | 0.0502 | 0.3385 | 0.0 | | 0.2229 | 5.5 | 1100 | 0.5565 | 0.3363 | 0.4115 | 0.8594 | nan | 0.8699 | 0.9352 | 0.8052 | 0.7833 | 0.4652 | nan | 0.6304 | 0.7524 | 0.0693 | 0.9367 | 0.0196 | 0.0 
| 0.0 | 0.0 | 0.7243 | 0.0 | 0.0 | 0.9247 | 0.0419 | 0.5269 | 0.3136 | 0.0 | nan | 0.0 | 0.5243 | 0.4829 | 0.0 | 0.9406 | 0.9026 | 0.9807 | 0.0009 | 0.1594 | 0.3776 | 0.0 | nan | 0.7612 | 0.8653 | 0.5482 | 0.6938 | 0.3423 | nan | 0.4929 | 0.5383 | 0.0143 | 0.8332 | 0.0188 | 0.0 | 0.0 | 0.0 | 0.5054 | 0.0 | 0.0 | 0.7323 | 0.0410 | 0.4596 | 0.2366 | 0.0 | nan | 0.0 | 0.4012 | 0.3332 | 0.0 | 0.8753 | 0.7496 | 0.9344 | 0.0009 | 0.0710 | 0.3135 | 0.0 | | 0.3822 | 5.6 | 1120 | 0.5669 | 0.3459 | 0.4115 | 0.8612 | nan | 0.8962 | 0.9368 | 0.7621 | 0.7845 | 0.5097 | nan | 0.4860 | 0.8278 | 0.0 | 0.9454 | 0.0269 | 0.0 | 0.0 | 0.0 | 0.6764 | 0.0 | 0.0 | 0.9072 | 0.1391 | 0.5740 | 0.2863 | 0.0 | nan | 0.0011 | 0.5018 | 0.4754 | 0.0 | 0.9440 | 0.8610 | 0.9885 | 0.0011 | 0.1953 | 0.4424 | 0.0 | nan | 0.7435 | 0.8646 | 0.7032 | 0.6904 | 0.3584 | nan | 0.4298 | 0.5766 | 0.0 | 0.8297 | 0.0261 | 0.0 | 0.0 | 0.0 | 0.5415 | 0.0 | 0.0 | 0.7357 | 0.1221 | 0.4772 | 0.2503 | 0.0 | nan | 0.0010 | 0.4117 | 0.3376 | 0.0 | 0.8797 | 0.7681 | 0.9209 | 0.0010 | 0.0649 | 0.3353 | 0.0 | | 0.2649 | 5.7 | 1140 | 0.5898 | 0.3381 | 0.4023 | 0.8563 | nan | 0.7976 | 0.9442 | 0.7650 | 0.7640 | 0.5549 | nan | 0.6973 | 0.8434 | 0.0 | 0.9444 | 0.0094 | 0.0 | 0.0 | 0.0 | 0.6017 | 0.0 | 0.0 | 0.9358 | 0.0022 | 0.5838 | 0.3365 | 0.0 | nan | 0.0000 | 0.4293 | 0.3976 | 0.0 | 0.9468 | 0.8714 | 0.9810 | 0.0012 | 0.0478 | 0.4181 | 0.0 | nan | 0.7151 | 0.8569 | 0.7113 | 0.6623 | 0.3494 | nan | 0.5183 | 0.5293 | 0.0 | 0.8284 | 0.0093 | 0.0 | 0.0 | 0.0 | 0.5048 | 0.0 | 0.0 | 0.7321 | 0.0022 | 0.4675 | 0.2713 | 0.0 | nan | 0.0000 | 0.3827 | 0.3276 | 0.0 | 0.8788 | 0.7549 | 0.9334 | 0.0012 | 0.0408 | 0.3430 | 0.0 | | 0.36 | 5.8 | 1160 | 0.5610 | 0.3374 | 0.4024 | 0.8560 | nan | 0.8640 | 0.9259 | 0.7671 | 0.7668 | 0.6144 | nan | 0.5556 | 0.8438 | 0.0 | 0.9443 | 0.0083 | 0.0 | 0.0 | 0.0 | 0.6159 | 0.0 | 0.0 | 0.9392 | 0.0182 | 0.5002 | 0.2419 | 0.0 | nan | 0.0076 | 0.5526 | 0.4234 | 0.0 | 0.9436 | 0.8715 | 0.9810 | 0.0 | 
0.1199 | 0.3711 | 0.0 | nan | 0.7443 | 0.8662 | 0.6964 | 0.6839 | 0.2997 | nan | 0.4887 | 0.5272 | 0.0 | 0.8304 | 0.0082 | 0.0 | 0.0 | 0.0 | 0.5343 | 0.0 | 0.0 | 0.7226 | 0.0181 | 0.4603 | 0.2237 | 0.0 | nan | 0.0071 | 0.4219 | 0.3267 | 0.0 | 0.8794 | 0.7650 | 0.9339 | 0.0 | 0.0503 | 0.3097 | 0.0 | | 0.1851 | 5.9 | 1180 | 0.5901 | 0.3316 | 0.4029 | 0.8500 | nan | 0.7504 | 0.9589 | 0.7043 | 0.8477 | 0.4014 | nan | 0.5987 | 0.8495 | 0.0 | 0.9416 | 0.0057 | 0.0 | 0.0 | 0.0 | 0.7332 | 0.0 | 0.0 | 0.9259 | 0.0010 | 0.5587 | 0.3454 | 0.0 | nan | 0.0001 | 0.5006 | 0.4310 | 0.0 | 0.9551 | 0.8353 | 0.9730 | 0.0 | 0.2293 | 0.3470 | 0.0 | nan | 0.6760 | 0.8659 | 0.6777 | 0.5612 | 0.3372 | nan | 0.5049 | 0.4991 | 0.0 | 0.8348 | 0.0057 | 0.0 | 0.0 | 0.0 | 0.4929 | 0.0 | 0.0 | 0.7364 | 0.0010 | 0.4741 | 0.2656 | 0.0 | nan | 0.0001 | 0.4060 | 0.3345 | 0.0 | 0.8720 | 0.7557 | 0.9380 | 0.0 | 0.0757 | 0.2966 | 0.0 | | 0.2929 | 6.0 | 1200 | 0.5772 | 0.3361 | 0.4214 | 0.8502 | nan | 0.7678 | 0.9437 | 0.8101 | 0.7995 | 0.5434 | nan | 0.6081 | 0.8514 | 0.0 | 0.9448 | 0.0044 | 0.0 | 0.0 | 0.0 | 0.7424 | 0.0 | 0.0 | 0.8567 | 0.1035 | 0.7333 | 0.3929 | 0.0 | nan | 0.1165 | 0.4942 | 0.4598 | 0.0 | 0.9381 | 0.8854 | 0.9776 | 0.0 | 0.1032 | 0.4068 | 0.0 | nan | 0.6929 | 0.8728 | 0.5952 | 0.6048 | 0.3191 | nan | 0.4872 | 0.4926 | 0.0 | 0.8332 | 0.0043 | 0.0 | 0.0 | 0.0 | 0.4247 | 0.0 | 0.0 | 0.7432 | 0.1007 | 0.5006 | 0.3004 | 0.0 | nan | 0.0834 | 0.4060 | 0.3476 | 0.0 | 0.8751 | 0.7607 | 0.9382 | 0.0 | 0.0532 | 0.3191 | 0.0 | | 0.1837 | 6.1 | 1220 | 0.5619 | 0.3425 | 0.4019 | 0.8599 | nan | 0.8292 | 0.9577 | 0.7582 | 0.7946 | 0.4884 | nan | 0.5736 | 0.8172 | 0.0 | 0.9262 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.6724 | 0.0 | 0.0 | 0.9509 | 0.0249 | 0.4688 | 0.4031 | 0.0 | nan | 0.0112 | 0.4901 | 0.4293 | 0.0 | 0.9458 | 0.8549 | 0.9790 | 0.0001 | 0.0968 | 0.3822 | 0.0 | nan | 0.7390 | 0.8602 | 0.6756 | 0.6938 | 0.3610 | nan | 0.4749 | 0.5803 | 0.0 | 0.8379 | 0.0071 | 0.0 | 0.0 | 0.0 | 0.5735 | 0.0 | 0.0 
| 0.7104 | 0.0247 | 0.4235 | 0.2840 | 0.0 | nan | 0.0101 | 0.4083 | 0.3263 | 0.0 | 0.8765 | 0.7763 | 0.9354 | 0.0001 | 0.0555 | 0.3266 | 0.0 | | 0.2823 | 6.2 | 1240 | 0.5561 | 0.3477 | 0.4153 | 0.8616 | nan | 0.8684 | 0.9325 | 0.7535 | 0.7614 | 0.5858 | nan | 0.6483 | 0.8377 | 0.0 | 0.9540 | 0.0172 | 0.0 | 0.0 | 0.0 | 0.6799 | 0.0 | 0.0 | 0.9022 | 0.0421 | 0.6256 | 0.2876 | 0.0 | nan | 0.0021 | 0.5645 | 0.4458 | 0.0 | 0.9561 | 0.8393 | 0.9661 | 0.0 | 0.1596 | 0.4593 | 0.0 | nan | 0.7564 | 0.8604 | 0.6944 | 0.6911 | 0.3387 | nan | 0.5181 | 0.5511 | 0.0 | 0.8288 | 0.0169 | 0.0 | 0.0 | 0.0 | 0.5345 | 0.0 | 0.0 | 0.7512 | 0.0414 | 0.5198 | 0.2474 | 0.0 | nan | 0.0020 | 0.4409 | 0.3486 | 0.0 | 0.8731 | 0.7634 | 0.9364 | 0.0 | 0.0737 | 0.3371 | 0.0 | | 0.1737 | 6.3 | 1260 | 0.5944 | 0.3440 | 0.4109 | 0.8589 | nan | 0.8164 | 0.9394 | 0.7514 | 0.8265 | 0.5133 | nan | 0.5731 | 0.8081 | 0.0049 | 0.9528 | 0.0190 | 0.0 | 0.0 | 0.0 | 0.6786 | 0.0 | 0.0 | 0.9077 | 0.0189 | 0.6365 | 0.4641 | 0.0 | nan | 0.0003 | 0.5127 | 0.4451 | 0.0 | 0.9562 | 0.8655 | 0.9817 | 0.0 | 0.0706 | 0.4066 | 0.0 | nan | 0.7103 | 0.8631 | 0.6924 | 0.6440 | 0.3634 | nan | 0.4701 | 0.5896 | 0.0033 | 0.8266 | 0.0181 | 0.0 | 0.0 | 0.0 | 0.4933 | 0.0 | 0.0 | 0.7567 | 0.0189 | 0.5532 | 0.2877 | 0.0 | nan | 0.0003 | 0.4161 | 0.3459 | 0.0 | 0.8718 | 0.7740 | 0.9335 | 0.0 | 0.0414 | 0.3359 | 0.0 | | 0.2331 | 6.4 | 1280 | 0.5757 | 0.3439 | 0.4080 | 0.8642 | nan | 0.8474 | 0.9577 | 0.7499 | 0.8125 | 0.3469 | nan | 0.6753 | 0.8720 | 0.0 | 0.9422 | 0.0256 | 0.0 | 0.0 | 0.0 | 0.6022 | 0.0 | 0.0 | 0.9147 | 0.0303 | 0.6387 | 0.2659 | 0.0 | nan | 0.0033 | 0.5037 | 0.5161 | 0.0 | 0.9533 | 0.8503 | 0.9761 | 0.0 | 0.1375 | 0.4339 | 0.0 | nan | 0.7464 | 0.8625 | 0.7014 | 0.6901 | 0.2919 | nan | 0.5113 | 0.4932 | 0.0 | 0.8415 | 0.0242 | 0.0 | 0.0 | 0.0 | 0.5310 | 0.0 | 0.0 | 0.7484 | 0.0301 | 0.5440 | 0.2362 | 0.0 | nan | 0.0031 | 0.4189 | 0.3563 | 0.0 | 0.8716 | 0.7666 | 0.9388 | 0.0 | 0.0665 | 0.3304 | 0.0 | | 0.2482 | 6.5 
| 1300 | 0.5652 | 0.3439 | 0.4139 | 0.8565 | nan | 0.8961 | 0.9318 | 0.7939 | 0.6718 | 0.5130 | nan | 0.6420 | 0.8713 | 0.0 | 0.9445 | 0.0252 | 0.0 | 0.0 | 0.0 | 0.6115 | 0.0 | 0.0 | 0.9118 | 0.1801 | 0.5194 | 0.3921 | 0.0 | nan | 0.0787 | 0.5022 | 0.4451 | 0.0 | 0.9485 | 0.8205 | 0.9784 | 0.0000 | 0.1026 | 0.4647 | 0.0 | nan | 0.7433 | 0.8670 | 0.6723 | 0.6234 | 0.3394 | nan | 0.4819 | 0.4912 | 0.0 | 0.8309 | 0.0242 | 0.0 | 0.0 | 0.0 | 0.5364 | 0.0 | 0.0 | 0.7400 | 0.1646 | 0.4697 | 0.2621 | 0.0 | nan | 0.0725 | 0.4230 | 0.3409 | 0.0 | 0.8691 | 0.7309 | 0.9383 | 0.0000 | 0.0509 | 0.3325 | 0.0 | | 0.2564 | 6.6 | 1320 | 0.5605 | 0.3430 | 0.4096 | 0.8605 | nan | 0.8574 | 0.9411 | 0.7728 | 0.7655 | 0.5374 | nan | 0.6314 | 0.8056 | 0.0 | 0.9448 | 0.0233 | 0.0 | 0.0 | 0.0 | 0.6626 | 0.0 | 0.0 | 0.9296 | 0.0727 | 0.5102 | 0.3465 | 0.0 | nan | 0.0388 | 0.5049 | 0.4320 | 0.0 | 0.9485 | 0.8772 | 0.9762 | 0.0105 | 0.1357 | 0.3834 | 0.0 | nan | 0.7479 | 0.8716 | 0.6437 | 0.6747 | 0.3443 | nan | 0.5198 | 0.5438 | 0.0 | 0.8378 | 0.0228 | 0.0 | 0.0 | 0.0 | 0.5055 | 0.0 | 0.0 | 0.7366 | 0.0677 | 0.4568 | 0.2458 | 0.0 | nan | 0.0316 | 0.4134 | 0.3461 | 0.0 | 0.8717 | 0.7620 | 0.9386 | 0.0074 | 0.0703 | 0.3169 | 0.0 | | 0.2327 | 6.7 | 1340 | 0.5858 | 0.3514 | 0.4217 | 0.8616 | nan | 0.8699 | 0.9510 | 0.7048 | 0.7627 | 0.4538 | nan | 0.6080 | 0.7268 | 0.1376 | 0.9419 | 0.0359 | 0.0 | 0.0 | 0.0 | 0.7361 | 0.0 | 0.0 | 0.8837 | 0.2082 | 0.6157 | 0.3706 | 0.0 | nan | 0.0909 | 0.4611 | 0.4802 | 0.0 | 0.9503 | 0.8700 | 0.9807 | 0.0210 | 0.1612 | 0.4738 | 0.0 | nan | 0.7492 | 0.8674 | 0.6835 | 0.6819 | 0.3450 | nan | 0.5073 | 0.5297 | 0.0222 | 0.8360 | 0.0342 | 0.0 | 0.0 | 0.0 | 0.5117 | 0.0 | 0.0 | 0.7413 | 0.1671 | 0.4875 | 0.2640 | 0.0 | nan | 0.0776 | 0.4006 | 0.3457 | 0.0 | 0.8775 | 0.7710 | 0.9368 | 0.0109 | 0.0734 | 0.3249 | 0.0 | | 0.242 | 6.8 | 1360 | 0.5879 | 0.3377 | 0.4038 | 0.8588 | nan | 0.8580 | 0.9459 | 0.6603 | 0.7829 | 0.5026 | nan | 0.6233 | 0.6993 | 0.1826 | 0.9596 | 
0.0324 | 0.0 | 0.0 | 0.0 | 0.6813 | 0.0 | 0.0 | 0.8888 | 0.0606 | 0.6100 | 0.2078 | 0.0 | nan | 0.0106 | 0.5106 | 0.4489 | 0.0 | 0.9608 | 0.8217 | 0.9807 | 0.0013 | 0.0424 | 0.4508 | 0.0 | nan | 0.7297 | 0.8711 | 0.6473 | 0.6682 | 0.3661 | nan | 0.5064 | 0.4987 | 0.0245 | 0.8177 | 0.0323 | 0.0 | 0.0 | 0.0 | 0.4811 | 0.0 | 0.0 | 0.7394 | 0.0582 | 0.4697 | 0.1953 | 0.0 | nan | 0.0093 | 0.4198 | 0.3574 | 0.0 | 0.8634 | 0.7439 | 0.9376 | 0.0010 | 0.0280 | 0.3386 | 0.0 | | 0.2912 | 6.9 | 1380 | 0.6065 | 0.3460 | 0.4119 | 0.8580 | nan | 0.8233 | 0.9491 | 0.7514 | 0.8086 | 0.4890 | nan | 0.6596 | 0.7872 | 0.0 | 0.9363 | 0.0347 | 0.0 | 0.0 | 0.0 | 0.6847 | 0.0 | 0.0 | 0.9520 | 0.0737 | 0.5223 | 0.3596 | 0.0 | nan | 0.0359 | 0.5441 | 0.4222 | 0.0 | 0.9219 | 0.8838 | 0.9741 | 0.0026 | 0.2035 | 0.3599 | 0.0 | nan | 0.7319 | 0.8655 | 0.6413 | 0.6811 | 0.3626 | nan | 0.5255 | 0.5864 | 0.0 | 0.8358 | 0.0330 | 0.0 | 0.0 | 0.0 | 0.5516 | 0.0 | 0.0 | 0.7090 | 0.0674 | 0.4307 | 0.2864 | 0.0 | nan | 0.0250 | 0.4413 | 0.3338 | 0.0 | 0.8745 | 0.7683 | 0.9393 | 0.0022 | 0.0863 | 0.2947 | 0.0 | | 0.343 | 7.0 | 1400 | 0.5567 | 0.3432 | 0.4134 | 0.8604 | nan | 0.8250 | 0.9528 | 0.7195 | 0.8020 | 0.5162 | nan | 0.6391 | 0.8356 | 0.0060 | 0.9500 | 0.0391 | 0.0 | 0.0 | 0.0 | 0.7280 | 0.0 | 0.0 | 0.8962 | 0.0651 | 0.6283 | 0.3323 | 0.0 | nan | 0.0065 | 0.5047 | 0.4910 | 0.0 | 0.9584 | 0.8309 | 0.9817 | 0.0008 | 0.1522 | 0.3661 | 0.0 | nan | 0.7363 | 0.8592 | 0.6737 | 0.6849 | 0.3638 | nan | 0.5259 | 0.5049 | 0.0037 | 0.8376 | 0.0377 | 0.0 | 0.0 | 0.0 | 0.4747 | 0.0 | 0.0 | 0.7459 | 0.0630 | 0.4846 | 0.2764 | 0.0 | nan | 0.0052 | 0.4170 | 0.3515 | 0.0 | 0.8684 | 0.7582 | 0.9370 | 0.0007 | 0.0725 | 0.3003 | 0.0 | | 0.2914 | 7.1 | 1420 | 0.5773 | 0.3455 | 0.4101 | 0.8628 | nan | 0.8788 | 0.9472 | 0.7366 | 0.7511 | 0.4820 | nan | 0.6145 | 0.8190 | 0.0 | 0.9357 | 0.0365 | 0.0 | 0.0 | 0.0 | 0.6832 | 0.0 | 0.0 | 0.9259 | 0.0426 | 0.5416 | 0.2700 | 0.0 | nan | 0.0390 | 0.5164 | 0.4809 | 0.0 | 0.9462 | 
0.8881 | 0.9822 | 0.0007 | 0.1845 | 0.4210 | 0.0 | nan | 0.7424 | 0.8681 | 0.6758 | 0.6744 | 0.3689 | nan | 0.5043 | 0.5500 | 0.0 | 0.8381 | 0.0355 | 0.0 | 0.0 | 0.0 | 0.5276 | 0.0 | 0.0 | 0.7309 | 0.0420 | 0.4570 | 0.2420 | 0.0 | nan | 0.0278 | 0.4148 | 0.3608 | 0.0 | 0.8780 | 0.7737 | 0.9370 | 0.0007 | 0.0782 | 0.3276 | 0.0 | | 0.2474 | 7.2 | 1440 | 0.5986 | 0.3456 | 0.4083 | 0.8604 | nan | 0.8241 | 0.9533 | 0.7590 | 0.8042 | 0.4869 | nan | 0.6086 | 0.7678 | 0.0 | 0.9578 | 0.0176 | 0.0 | 0.0 | 0.0 | 0.6546 | 0.0 | 0.0 | 0.9157 | 0.0721 | 0.5568 | 0.2897 | 0.0 | nan | 0.0205 | 0.4932 | 0.4753 | 0.0 | 0.9612 | 0.8364 | 0.9759 | 0.0003 | 0.2280 | 0.4053 | 0.0 | nan | 0.7370 | 0.8694 | 0.6417 | 0.6868 | 0.3282 | nan | 0.5111 | 0.6177 | 0.0 | 0.8225 | 0.0173 | 0.0 | 0.0 | 0.0 | 0.5195 | 0.0 | 0.0 | 0.7384 | 0.0702 | 0.4737 | 0.2533 | 0.0 | nan | 0.0171 | 0.4074 | 0.3697 | 0.0 | 0.8687 | 0.7579 | 0.9399 | 0.0003 | 0.0826 | 0.3284 | 0.0 | | 0.2505 | 7.3 | 1460 | 0.5985 | 0.3471 | 0.4082 | 0.8566 | nan | 0.7938 | 0.9652 | 0.7313 | 0.7649 | 0.4475 | nan | 0.6260 | 0.8675 | 0.0020 | 0.9491 | 0.0379 | 0.0 | 0.0 | 0.0 | 0.6807 | 0.0 | 0.0 | 0.9081 | 0.1519 | 0.6030 | 0.2748 | 0.0 | nan | 0.0039 | 0.5783 | 0.46 | 0.0 | 0.9587 | 0.8108 | 0.9788 | 0.0001 | 0.0826 | 0.3844 | 0.0 | nan | 0.7066 | 0.8387 | 0.6786 | 0.6773 | 0.3727 | nan | 0.5193 | 0.5468 | 0.0016 | 0.8359 | 0.0354 | 0.0 | 0.0 | 0.0 | 0.5663 | 0.0 | 0.0 | 0.7500 | 0.1320 | 0.4856 | 0.2477 | 0.0 | nan | 0.0034 | 0.4339 | 0.3538 | 0.0 | 0.8699 | 0.7449 | 0.9397 | 0.0001 | 0.0558 | 0.3117 | 0.0 | | 0.2611 | 7.4 | 1480 | 0.5702 | 0.3476 | 0.4143 | 0.8581 | nan | 0.7707 | 0.9551 | 0.7422 | 0.8445 | 0.4893 | nan | 0.6655 | 0.8344 | 0.0 | 0.9515 | 0.0443 | 0.0 | 0.0 | 0.0 | 0.6860 | 0.0 | 0.0 | 0.9232 | 0.1644 | 0.6029 | 0.3108 | 0.0 | nan | 0.0147 | 0.5112 | 0.4610 | 0.0 | 0.9478 | 0.8592 | 0.9807 | 0.0065 | 0.0864 | 0.4052 | 0.0 | nan | 0.7017 | 0.8700 | 0.6427 | 0.6132 | 0.3791 | nan | 0.5387 | 0.5832 | 0.0 | 0.8422 | 
0.0404 | 0.0 | 0.0 | 0.0 | 0.5270 | 0.0 | 0.0 | 0.7452 | 0.1416 | 0.4843 | 0.2549 | 0.0 | nan | 0.0129 | 0.4183 | 0.3607 | 0.0 | 0.8781 | 0.7718 | 0.9388 | 0.0049 | 0.0451 | 0.3281 | 0.0 | | 0.1479 | 7.5 | 1500 | 0.5701 | 0.3439 | 0.4035 | 0.8595 | nan | 0.8494 | 0.9474 | 0.7318 | 0.7205 | 0.5444 | nan | 0.6775 | 0.8485 | 0.0 | 0.9484 | 0.0252 | 0.0 | 0.0 | 0.0 | 0.6319 | 0.0 | 0.0 | 0.9298 | 0.0287 | 0.5318 | 0.2629 | 0.0 | nan | 0.0034 | 0.4961 | 0.4789 | 0.0 | 0.9603 | 0.8198 | 0.9658 | 0.0029 | 0.0514 | 0.4563 | 0.0 | nan | 0.7411 | 0.8677 | 0.6846 | 0.6476 | 0.3372 | nan | 0.5484 | 0.5975 | 0.0 | 0.8345 | 0.0246 | 0.0 | 0.0 | 0.0 | 0.5414 | 0.0 | 0.0 | 0.7303 | 0.0280 | 0.4573 | 0.2322 | 0.0 | nan | 0.0030 | 0.4152 | 0.3618 | 0.0 | 0.8665 | 0.7385 | 0.9396 | 0.0027 | 0.0370 | 0.3666 | 0.0 | | 0.2073 | 7.6 | 1520 | 0.6279 | 0.3474 | 0.4175 | 0.8507 | nan | 0.7556 | 0.9561 | 0.7696 | 0.7821 | 0.4914 | nan | 0.6223 | 0.8651 | 0.0 | 0.9547 | 0.0416 | 0.0 | 0.0 | 0.0 | 0.6631 | 0.0 | 0.0 | 0.9289 | 0.1014 | 0.5414 | 0.3211 | 0.0 | nan | 0.1147 | 0.5668 | 0.5320 | 0.0 | 0.9357 | 0.8535 | 0.9849 | 0.0320 | 0.1488 | 0.3977 | 0.0 | nan | 0.6791 | 0.8419 | 0.6776 | 0.6390 | 0.3672 | nan | 0.5206 | 0.5841 | 0.0 | 0.8318 | 0.0399 | 0.0 | 0.0 | 0.0 | 0.5261 | 0.0 | 0.0 | 0.7370 | 0.0923 | 0.4608 | 0.2570 | 0.0 | nan | 0.0822 | 0.4361 | 0.3484 | 0.0 | 0.8778 | 0.7638 | 0.9356 | 0.0195 | 0.0696 | 0.3278 | 0.0 | | 0.3208 | 7.7 | 1540 | 0.5902 | 0.3443 | 0.4097 | 0.8625 | nan | 0.8514 | 0.9559 | 0.8170 | 0.7319 | 0.4577 | nan | 0.6391 | 0.8315 | 0.0 | 0.9341 | 0.0208 | 0.0 | 0.0 | 0.0 | 0.6960 | 0.0 | 0.0 | 0.9322 | 0.0069 | 0.5405 | 0.3043 | 0.0 | nan | 0.0195 | 0.4643 | 0.5060 | 0.0 | 0.9447 | 0.8907 | 0.9827 | 0.0075 | 0.1400 | 0.4345 | 0.0 | nan | 0.7546 | 0.8646 | 0.6123 | 0.6707 | 0.3476 | nan | 0.5256 | 0.6086 | 0.0 | 0.8453 | 0.0198 | 0.0 | 0.0 | 0.0 | 0.5410 | 0.0 | 0.0 | 0.7307 | 0.0068 | 0.4733 | 0.2653 | 0.0 | nan | 0.0173 | 0.4133 | 0.3411 | 0.0 | 0.8744 | 0.7639 
| 0.9379 | 0.0062 | 0.0566 | 0.3404 | 0.0 | | 0.3173 | 7.8 | 1560 | 0.5700 | 0.3494 | 0.4171 | 0.8633 | nan | 0.8505 | 0.9393 | 0.7827 | 0.8097 | 0.4909 | nan | 0.6613 | 0.8247 | 0.0331 | 0.9473 | 0.0464 | 0.0 | 0.0 | 0.0 | 0.6595 | 0.0 | 0.0 | 0.9329 | 0.0327 | 0.5948 | 0.2566 | 0.0 | nan | 0.0584 | 0.5498 | 0.5224 | 0.0 | 0.9481 | 0.8445 | 0.9784 | 0.0012 | 0.1162 | 0.4649 | 0.0 | nan | 0.7463 | 0.8649 | 0.6431 | 0.6921 | 0.3346 | nan | 0.5205 | 0.5660 | 0.0149 | 0.8430 | 0.0429 | 0.0 | 0.0 | 0.0 | 0.5391 | 0.0 | 0.0 | 0.7422 | 0.0322 | 0.5000 | 0.2370 | 0.0 | nan | 0.0506 | 0.4451 | 0.3526 | 0.0 | 0.8779 | 0.7619 | 0.9406 | 0.0011 | 0.0622 | 0.3703 | 0.0 | | 0.1863 | 7.9 | 1580 | 0.6211 | 0.3514 | 0.4297 | 0.8592 | nan | 0.8506 | 0.9447 | 0.8257 | 0.7064 | 0.4736 | nan | 0.6324 | 0.8422 | 0.0 | 0.9651 | 0.0536 | 0.0 | 0.0 | 0.0 | 0.6058 | 0.0 | 0.0 | 0.9151 | 0.1508 | 0.5574 | 0.4391 | 0.0 | nan | 0.1478 | 0.5624 | 0.5315 | 0.0 | 0.9451 | 0.8766 | 0.9861 | 0.0010 | 0.3727 | 0.3650 | 0.0 | nan | 0.7333 | 0.8714 | 0.5576 | 0.6480 | 0.3506 | nan | 0.5172 | 0.6009 | 0.0 | 0.8275 | 0.0493 | 0.0 | 0.0 | 0.0 | 0.5273 | 0.0 | 0.0 | 0.7444 | 0.1344 | 0.4871 | 0.3098 | 0.0 | nan | 0.1057 | 0.4511 | 0.3565 | 0.0 | 0.8806 | 0.7541 | 0.9369 | 0.0008 | 0.0878 | 0.3123 | 0.0 | | 0.2516 | 8.0 | 1600 | 0.5767 | 0.3501 | 0.4204 | 0.8653 | nan | 0.8572 | 0.9601 | 0.7830 | 0.7746 | 0.4109 | nan | 0.6479 | 0.7926 | 0.0650 | 0.9358 | 0.0504 | 0.0 | 0.0 | 0.0 | 0.7195 | 0.0 | 0.0 | 0.9099 | 0.0133 | 0.5805 | 0.3600 | 0.0 | nan | 0.0194 | 0.5341 | 0.5156 | 0.0 | 0.9602 | 0.8242 | 0.9783 | 0.0 | 0.3244 | 0.4364 | 0.0 | nan | 0.7522 | 0.8696 | 0.6293 | 0.6857 | 0.3462 | nan | 0.5311 | 0.5778 | 0.0151 | 0.8472 | 0.0474 | 0.0 | 0.0 | 0.0 | 0.5229 | 0.0 | 0.0 | 0.7538 | 0.0132 | 0.5060 | 0.2932 | 0.0 | nan | 0.0177 | 0.4476 | 0.3751 | 0.0 | 0.8715 | 0.7523 | 0.9414 | 0.0 | 0.0762 | 0.3314 | 0.0 | | 0.2206 | 8.1 | 1620 | 0.6162 | 0.3546 | 0.4307 | 0.8563 | nan | 0.8299 | 0.9257 | 0.7671 | 
0.8026 | 0.4752 | nan | 0.6825 | 0.7886 | 0.0158 | 0.9459 | 0.0448 | 0.0 | 0.0 | 0.0 | 0.6966 | 0.0 | 0.0 | 0.9133 | 0.1368 | 0.6901 | 0.3385 | 0.0 | nan | 0.2352 | 0.5165 | 0.5090 | 0.0 | 0.9412 | 0.8623 | 0.9821 | 0.0369 | 0.2019 | 0.4425 | 0.0 | nan | 0.7445 | 0.8644 | 0.6314 | 0.6645 | 0.3391 | nan | 0.5353 | 0.5417 | 0.0075 | 0.8451 | 0.0427 | 0.0 | 0.0 | 0.0 | 0.4609 | 0.0 | 0.0 | 0.7530 | 0.1226 | 0.5129 | 0.2932 | 0.0 | nan | 0.1710 | 0.4440 | 0.3757 | 0.0 | 0.8807 | 0.7562 | 0.9398 | 0.0078 | 0.0675 | 0.3445 | 0.0 | | 0.189 | 8.2 | 1640 | 0.5740 | 0.3467 | 0.4147 | 0.8648 | nan | 0.8577 | 0.9535 | 0.7585 | 0.7673 | 0.4784 | nan | 0.6893 | 0.8107 | 0.0637 | 0.9507 | 0.0421 | 0.0 | 0.0 | 0.0 | 0.6907 | 0.0 | 0.0 | 0.9336 | 0.0316 | 0.5791 | 0.2895 | 0.0 | nan | 0.0854 | 0.5458 | 0.5167 | 0.0 | 0.9507 | 0.8159 | 0.9754 | 0.0 | 0.0640 | 0.4187 | 0.0 | nan | 0.7502 | 0.8706 | 0.6532 | 0.6880 | 0.3630 | nan | 0.5298 | 0.4944 | 0.0264 | 0.8465 | 0.0401 | 0.0 | 0.0 | 0.0 | 0.5009 | 0.0 | 0.0 | 0.7409 | 0.0303 | 0.4910 | 0.2531 | 0.0 | nan | 0.0716 | 0.4439 | 0.3748 | 0.0 | 0.8713 | 0.7383 | 0.9429 | 0.0 | 0.0365 | 0.3374 | 0.0 | | 0.2028 | 8.3 | 1660 | 0.6399 | 0.3438 | 0.4078 | 0.8578 | nan | 0.8066 | 0.9687 | 0.7532 | 0.8068 | 0.3267 | nan | 0.6167 | 0.7656 | 0.0031 | 0.9349 | 0.0451 | 0.0 | 0.0 | 0.0 | 0.7131 | 0.0 | 0.0 | 0.9415 | 0.0212 | 0.5982 | 0.3048 | 0.0 | nan | 0.0864 | 0.5129 | 0.4953 | 0.0 | 0.9371 | 0.8768 | 0.9825 | 0.0 | 0.2933 | 0.2605 | 0.0 | nan | 0.7242 | 0.8444 | 0.7154 | 0.6779 | 0.2892 | nan | 0.5380 | 0.5381 | 0.0014 | 0.8407 | 0.0419 | 0.0 | 0.0 | 0.0 | 0.5174 | 0.0 | 0.0 | 0.7326 | 0.0193 | 0.4840 | 0.2500 | 0.0 | nan | 0.0719 | 0.4295 | 0.3630 | 0.0 | 0.8772 | 0.7656 | 0.9371 | 0.0 | 0.1074 | 0.2369 | 0.0 | | 0.2783 | 8.4 | 1680 | 0.5473 | 0.3515 | 0.4219 | 0.8664 | nan | 0.8826 | 0.9387 | 0.7571 | 0.8127 | 0.5269 | nan | 0.6741 | 0.8037 | 0.0 | 0.9550 | 0.0350 | 0.0 | 0.0 | 0.0 | 0.6732 | 0.0 | 0.0 | 0.9218 | 0.0816 | 0.5645 | 0.3347 | 
0.0 | nan | 0.0116 | 0.5560 | 0.5146 | 0.0 | 0.9369 | 0.8790 | 0.9825 | 0.0009 | 0.2686 | 0.3906 | 0.0 | nan | 0.7608 | 0.8771 | 0.7013 | 0.6885 | 0.3555 | nan | 0.5491 | 0.5221 | 0.0 | 0.8338 | 0.0347 | 0.0 | 0.0 | 0.0 | 0.5370 | 0.0 | 0.0 | 0.7435 | 0.0750 | 0.5028 | 0.2650 | 0.0 | nan | 0.0099 | 0.4431 | 0.3785 | 0.0 | 0.8797 | 0.7654 | 0.9401 | 0.0007 | 0.0687 | 0.3162 | 0.0 | | 0.199 | 8.5 | 1700 | 0.5928 | 0.3595 | 0.4303 | 0.8670 | nan | 0.8255 | 0.9598 | 0.7983 | 0.8581 | 0.4141 | nan | 0.6636 | 0.8888 | 0.0 | 0.9441 | 0.0413 | 0.0 | 0.0132 | 0.0 | 0.6113 | 0.0 | 0.0 | 0.8697 | 0.2663 | 0.7008 | 0.3577 | 0.0 | nan | 0.2193 | 0.5255 | 0.4762 | 0.0 | 0.9526 | 0.8591 | 0.9803 | 0.0 | 0.0804 | 0.4627 | 0.0 | nan | 0.7395 | 0.8762 | 0.6573 | 0.7119 | 0.3497 | nan | 0.5446 | 0.5005 | 0.0 | 0.8375 | 0.0407 | 0.0 | 0.0131 | 0.0 | 0.5413 | 0.0 | 0.0 | 0.7524 | 0.2323 | 0.4987 | 0.2915 | 0.0 | nan | 0.1353 | 0.4328 | 0.3726 | 0.0 | 0.8722 | 0.7516 | 0.9419 | 0.0 | 0.0540 | 0.3573 | 0.0 | | 0.1591 | 8.6 | 1720 | 0.5927 | 0.3568 | 0.4274 | 0.8632 | nan | 0.8250 | 0.9475 | 0.7706 | 0.8347 | 0.5623 | nan | 0.6601 | 0.8675 | 0.0 | 0.9547 | 0.0375 | 0.0 | 0.0 | 0.0 | 0.6326 | 0.0 | 0.0 | 0.9278 | 0.1591 | 0.5080 | 0.3138 | 0.0 | nan | 0.2364 | 0.5457 | 0.5151 | 0.0 | 0.9625 | 0.7945 | 0.9821 | 0.0 | 0.2648 | 0.3730 | 0.0 | nan | 0.7411 | 0.8788 | 0.6746 | 0.7028 | 0.3475 | nan | 0.5438 | 0.5751 | 0.0 | 0.8345 | 0.0370 | 0.0 | 0.0 | 0.0 | 0.5548 | 0.0 | 0.0 | 0.7334 | 0.1370 | 0.4510 | 0.2575 | 0.0 | nan | 0.1640 | 0.4431 | 0.3603 | 0.0 | 0.8660 | 0.7312 | 0.9409 | 0.0 | 0.1150 | 0.3292 | 0.0 | | 0.1553 | 8.7 | 1740 | 0.5695 | 0.3598 | 0.4404 | 0.8679 | nan | 0.8717 | 0.9483 | 0.7673 | 0.7990 | 0.5361 | nan | 0.6620 | 0.7190 | 0.1766 | 0.9412 | 0.0958 | 0.0 | 0.0079 | 0.0 | 0.7929 | 0.0 | 0.0 | 0.8975 | 0.1616 | 0.6552 | 0.3511 | 0.0 | nan | 0.0630 | 0.5369 | 0.5200 | 0.0 | 0.9408 | 0.8779 | 0.9758 | 0.0004 | 0.3845 | 0.4103 | 0.0 | nan | 0.7609 | 0.8754 | 0.6801 | 0.7116 | 
0.3771 | nan | 0.5357 | 0.5635 | 0.0429 | 0.8473 | 0.0878 | 0.0 | 0.0078 | 0.0 | 0.4463 | 0.0 | 0.0 | 0.7517 | 0.1464 | 0.5156 | 0.2927 | 0.0 | nan | 0.0538 | 0.4402 | 0.3753 | 0.0 | 0.8760 | 0.7600 | 0.9433 | 0.0003 | 0.0908 | 0.3304 | 0.0 | | 0.1498 | 8.8 | 1760 | 0.5841 | 0.3633 | 0.4338 | 0.8642 | nan | 0.8380 | 0.9265 | 0.7928 | 0.9074 | 0.5189 | nan | 0.6488 | 0.8137 | 0.0 | 0.9494 | 0.0536 | 0.0 | 0.0085 | 0.0 | 0.7206 | 0.0 | 0.0 | 0.9184 | 0.2066 | 0.6024 | 0.4015 | 0.0 | nan | 0.0487 | 0.5791 | 0.5140 | 0.0 | 0.9414 | 0.8646 | 0.9797 | 0.0054 | 0.1717 | 0.4690 | 0.0 | nan | 0.7476 | 0.8657 | 0.6433 | 0.6603 | 0.3790 | nan | 0.5424 | 0.6239 | 0.0 | 0.8431 | 0.0509 | 0.0 | 0.0084 | 0.0 | 0.5871 | 0.0 | 0.0 | 0.7513 | 0.1678 | 0.5091 | 0.3164 | 0.0 | nan | 0.0414 | 0.4595 | 0.3703 | 0.0 | 0.8827 | 0.7695 | 0.9422 | 0.0033 | 0.0879 | 0.3730 | 0.0 | | 0.1828 | 8.9 | 1780 | 0.6037 | 0.3609 | 0.4253 | 0.8654 | nan | 0.8656 | 0.9520 | 0.7476 | 0.7937 | 0.4616 | nan | 0.6561 | 0.8421 | 0.0 | 0.9473 | 0.0502 | 0.0 | 0.0 | 0.0 | 0.7023 | 0.0 | 0.0 | 0.8957 | 0.1736 | 0.6450 | 0.3825 | 0.0 | nan | 0.0666 | 0.5406 | 0.5340 | 0.0 | 0.9570 | 0.7497 | 0.9826 | 0.0161 | 0.1262 | 0.5220 | 0.0 | nan | 0.7611 | 0.8645 | 0.7076 | 0.7248 | 0.3637 | nan | 0.5355 | 0.5979 | 0.0 | 0.8396 | 0.0462 | 0.0 | 0.0 | 0.0 | 0.5541 | 0.0 | 0.0 | 0.7555 | 0.1530 | 0.4978 | 0.3036 | 0.0 | nan | 0.0557 | 0.4552 | 0.3779 | 0.0 | 0.8626 | 0.6957 | 0.9399 | 0.0079 | 0.0662 | 0.3827 | 0.0 | | 0.2588 | 9.0 | 1800 | 0.5867 | 0.3597 | 0.4377 | 0.8588 | nan | 0.8202 | 0.9224 | 0.7975 | 0.8144 | 0.6519 | nan | 0.6872 | 0.7382 | 0.0319 | 0.9508 | 0.0736 | 0.0 | 0.0036 | 0.0 | 0.7372 | 0.0 | 0.0 | 0.9216 | 0.2294 | 0.5725 | 0.3596 | 0.0 | nan | 0.0690 | 0.5450 | 0.5389 | 0.0 | 0.9345 | 0.8713 | 0.9842 | 0.0037 | 0.2376 | 0.5116 | 0.0 | nan | 0.7439 | 0.8635 | 0.6600 | 0.7027 | 0.3202 | nan | 0.5441 | 0.6061 | 0.0136 | 0.8412 | 0.0651 | 0.0 | 0.0036 | 0.0 | 0.4753 | 0.0 | 0.0 | 0.7408 | 0.1843 | 0.4959 
| 0.3006 | 0.0 | nan | 0.0559 | 0.4579 | 0.3852 | 0.0 | 0.8822 | 0.7645 | 0.9387 | 0.0026 | 0.0771 | 0.3845 | 0.0 | | 0.2303 | 9.1 | 1820 | 0.5980 | 0.3532 | 0.4292 | 0.8599 | nan | 0.8094 | 0.9427 | 0.8150 | 0.7852 | 0.4762 | nan | 0.7417 | 0.7367 | 0.0162 | 0.9494 | 0.0567 | 0.0 | 0.0859 | 0.0 | 0.7208 | 0.0 | 0.0 | 0.9223 | 0.0887 | 0.6457 | 0.3506 | 0.0 | nan | 0.0485 | 0.5267 | 0.5245 | 0.0 | 0.9494 | 0.8379 | 0.9815 | 0.0079 | 0.2876 | 0.4272 | 0.0 | nan | 0.7148 | 0.8715 | 0.6250 | 0.6618 | 0.3683 | nan | 0.5072 | 0.6077 | 0.0080 | 0.8445 | 0.0527 | 0.0 | 0.0836 | 0.0 | 0.4327 | 0.0 | 0.0 | 0.7509 | 0.0843 | 0.5323 | 0.2819 | 0.0 | nan | 0.0407 | 0.4408 | 0.3615 | 0.0 | 0.8794 | 0.7633 | 0.9411 | 0.0036 | 0.0975 | 0.3483 | 0.0 | | 0.1926 | 9.2 | 1840 | 0.6364 | 0.3523 | 0.4171 | 0.8567 | nan | 0.8013 | 0.9537 | 0.7411 | 0.7849 | 0.4775 | nan | 0.6554 | 0.8531 | 0.0 | 0.9392 | 0.0431 | 0.0 | 0.0162 | 0.0 | 0.6634 | 0.0 | 0.0 | 0.9319 | 0.1200 | 0.4821 | 0.3603 | 0.0 | nan | 0.0067 | 0.4962 | 0.5323 | 0.0 | 0.9510 | 0.8484 | 0.9816 | 0.0133 | 0.2473 | 0.4469 | 0.0 | nan | 0.7045 | 0.8678 | 0.6994 | 0.6142 | 0.3742 | nan | 0.5336 | 0.6132 | 0.0 | 0.8493 | 0.0400 | 0.0 | 0.0161 | 0.0 | 0.5747 | 0.0 | 0.0 | 0.7271 | 0.1111 | 0.4372 | 0.3011 | 0.0 | nan | 0.0058 | 0.4273 | 0.3716 | 0.0 | 0.8753 | 0.7585 | 0.9391 | 0.0073 | 0.0719 | 0.3542 | 0.0 | | 0.1283 | 9.3 | 1860 | 0.6048 | 0.3556 | 0.4276 | 0.8652 | nan | 0.8396 | 0.9455 | 0.7744 | 0.8039 | 0.5487 | nan | 0.6756 | 0.7530 | 0.0430 | 0.9380 | 0.0847 | 0.0 | 0.0 | 0.0 | 0.7757 | 0.0 | 0.0 | 0.9322 | 0.1295 | 0.5577 | 0.2965 | 0.0 | nan | 0.1061 | 0.5510 | 0.5570 | 0.0 | 0.9408 | 0.8590 | 0.9812 | 0.0007 | 0.0445 | 0.5457 | 0.0 | nan | 0.7397 | 0.8676 | 0.7034 | 0.6899 | 0.3817 | nan | 0.5452 | 0.5603 | 0.0131 | 0.8537 | 0.0754 | 0.0 | 0.0 | 0.0 | 0.4386 | 0.0 | 0.0 | 0.7376 | 0.1184 | 0.4767 | 0.2647 | 0.0 | nan | 0.0761 | 0.4534 | 0.3578 | 0.0 | 0.8771 | 0.7525 | 0.9408 | 0.0006 | 0.0278 | 0.4266 | 0.0 | | 
0.242 | 9.4 | 1880 | 0.6183 | 0.3537 | 0.4259 | 0.8652 | nan | 0.8536 | 0.9574 | 0.7653 | 0.7996 | 0.4903 | nan | 0.6016 | 0.8441 | 0.0 | 0.9293 | 0.1477 | 0.0 | 0.0374 | 0.0 | 0.7279 | 0.0 | 0.0 | 0.9335 | 0.0248 | 0.5401 | 0.3316 | 0.0 | nan | 0.0044 | 0.5457 | 0.5598 | 0.0 | 0.9513 | 0.8410 | 0.9830 | 0.0017 | 0.4150 | 0.3431 | 0.0 | nan | 0.7464 | 0.8714 | 0.7064 | 0.6863 | 0.3932 | nan | 0.5200 | 0.5922 | 0.0 | 0.8531 | 0.1256 | 0.0 | 0.0368 | 0.0 | 0.5249 | 0.0 | 0.0 | 0.7372 | 0.0233 | 0.4739 | 0.2831 | 0.0 | nan | 0.0039 | 0.4503 | 0.3367 | 0.0 | 0.8766 | 0.7550 | 0.9400 | 0.0011 | 0.0830 | 0.2982 | 0.0 | | 0.1308 | 9.5 | 1900 | 0.6292 | 0.3489 | 0.4163 | 0.8593 | nan | 0.7832 | 0.9470 | 0.7857 | 0.8231 | 0.5413 | nan | 0.6989 | 0.8729 | 0.0 | 0.9641 | 0.0627 | 0.0 | 0.0080 | 0.0 | 0.5577 | 0.0 | 0.0 | 0.9156 | 0.0280 | 0.5252 | 0.3589 | 0.0 | nan | 0.0166 | 0.5069 | 0.5345 | 0.0 | 0.9496 | 0.8725 | 0.9817 | 0.0001 | 0.0829 | 0.5056 | 0.0 | nan | 0.7085 | 0.8673 | 0.6933 | 0.6324 | 0.3683 | nan | 0.5443 | 0.5951 | 0.0 | 0.8239 | 0.0577 | 0.0 | 0.0080 | 0.0 | 0.5004 | 0.0 | 0.0 | 0.7442 | 0.0277 | 0.4713 | 0.2897 | 0.0 | nan | 0.0152 | 0.4401 | 0.3716 | 0.0 | 0.8759 | 0.7625 | 0.9409 | 0.0001 | 0.0449 | 0.3823 | 0.0 | | 0.2915 | 9.6 | 1920 | 0.6007 | 0.3634 | 0.4343 | 0.8576 | nan | 0.8409 | 0.9099 | 0.7642 | 0.8886 | 0.6304 | nan | 0.6444 | 0.8280 | 0.0 | 0.9398 | 0.1026 | 0.0 | 0.1016 | 0.0 | 0.6896 | 0.0 | 0.0 | 0.9341 | 0.1520 | 0.5981 | 0.3620 | 0.0 | nan | 0.0959 | 0.5529 | 0.5185 | 0.0 | 0.9471 | 0.7731 | 0.9769 | 0.0006 | 0.2137 | 0.4315 | 0.0 | nan | 0.7414 | 0.8598 | 0.7123 | 0.6627 | 0.3403 | nan | 0.5468 | 0.6359 | 0.0 | 0.8490 | 0.0906 | 0.0 | 0.0997 | 0.0 | 0.5503 | 0.0 | 0.0 | 0.7453 | 0.1337 | 0.5070 | 0.2869 | 0.0 | nan | 0.0682 | 0.4524 | 0.3722 | 0.0 | 0.8685 | 0.7152 | 0.9420 | 0.0005 | 0.0906 | 0.3574 | 0.0 | | 0.1619 | 9.7 | 1940 | 0.5878 | 0.3652 | 0.4320 | 0.8669 | nan | 0.8511 | 0.9478 | 0.7936 | 0.7795 | 0.5206 | nan | 0.6435 | 
0.8479 | 0.0 | 0.9404 | 0.0867 | 0.0 | 0.0883 | 0.0 | 0.6922 | 0.0 | 0.0 | 0.9222 | 0.1750 | 0.6193 | 0.3488 | 0.0 | nan | 0.1099 | 0.5323 | 0.4732 | 0.0 | 0.9485 | 0.8684 | 0.9788 | 0.0002 | 0.1708 | 0.4845 | 0.0 | nan | 0.7496 | 0.8625 | 0.7050 | 0.6968 | 0.3725 | nan | 0.5267 | 0.5723 | 0.0 | 0.8505 | 0.0783 | 0.0 | 0.0870 | 0.0 | 0.5447 | 0.0 | 0.0 | 0.7519 | 0.1509 | 0.5221 | 0.3030 | 0.0 | nan | 0.0797 | 0.4446 | 0.3724 | 0.0 | 0.8815 | 0.7715 | 0.9416 | 0.0002 | 0.0589 | 0.3618 | 0.0 | | 0.2238 | 9.8 | 1960 | 0.5750 | 0.3620 | 0.4280 | 0.8640 | nan | 0.8034 | 0.9497 | 0.8037 | 0.8240 | 0.5548 | nan | 0.6514 | 0.7874 | 0.0 | 0.9436 | 0.0789 | 0.0 | 0.0050 | 0.0 | 0.6596 | 0.0 | 0.0 | 0.9175 | 0.1363 | 0.5863 | 0.2951 | 0.0 | nan | 0.1792 | 0.5565 | 0.5238 | 0.0 | 0.9600 | 0.8624 | 0.9827 | 0.0002 | 0.2023 | 0.4319 | 0.0 | nan | 0.7282 | 0.8672 | 0.7068 | 0.6609 | 0.3731 | nan | 0.5422 | 0.6318 | 0.0 | 0.8465 | 0.0714 | 0.0 | 0.0050 | 0.0 | 0.5472 | 0.0 | 0.0 | 0.7508 | 0.1235 | 0.4912 | 0.2610 | 0.0 | nan | 0.1251 | 0.4525 | 0.3775 | 0.0 | 0.8787 | 0.7697 | 0.9406 | 0.0002 | 0.0868 | 0.3471 | 0.0 | | 0.181 | 9.9 | 1980 | 0.6077 | 0.3659 | 0.4361 | 0.8660 | nan | 0.8058 | 0.9632 | 0.8071 | 0.8176 | 0.4577 | nan | 0.6573 | 0.8114 | 0.0016 | 0.9514 | 0.0731 | 0.0 | 0.0823 | 0.0 | 0.6843 | 0.0 | 0.0 | 0.9176 | 0.2187 | 0.5992 | 0.4228 | 0.0 | nan | 0.0728 | 0.5537 | 0.5348 | 0.0 | 0.9523 | 0.8465 | 0.9807 | 0.0004 | 0.3214 | 0.4197 | 0.0 | nan | 0.7249 | 0.8696 | 0.6807 | 0.6553 | 0.3866 | nan | 0.5360 | 0.6245 | 0.0011 | 0.8409 | 0.0679 | 0.0 | 0.0811 | 0.0 | 0.5402 | 0.0 | 0.0 | 0.7561 | 0.1858 | 0.5035 | 0.3159 | 0.0 | nan | 0.0618 | 0.4582 | 0.3643 | 0.0 | 0.8823 | 0.7655 | 0.9433 | 0.0003 | 0.1145 | 0.3475 | 0.0 | | 0.1723 | 10.0 | 2000 | 0.5698 | 0.3647 | 0.4371 | 0.8676 | nan | 0.8435 | 0.9415 | 0.7988 | 0.8204 | 0.5753 | nan | 0.6608 | 0.8658 | 0.0004 | 0.9529 | 0.0692 | 0.0 | 0.0399 | 0.0 | 0.6560 | 0.0 | 0.0 | 0.9084 | 0.2922 | 0.6382 | 0.2656 | 0.0 | 
nan | 0.1818 | 0.5671 | 0.4989 | 0.0 | 0.9476 | 0.8513 | 0.9835 | 0.0000 | 0.1109 | 0.5181 | 0.0 | nan | 0.7495 | 0.8765 | 0.6768 | 0.6880 | 0.3644 | nan | 0.5321 | 0.5254 | 0.0002 | 0.8442 | 0.0641 | 0.0 | 0.0396 | 0.0 | 0.5579 | 0.0 | 0.0 | 0.7536 | 0.2312 | 0.5268 | 0.2439 | 0.0 | nan | 0.1472 | 0.4636 | 0.3572 | 0.0 | 0.8827 | 0.7725 | 0.9408 | 0.0000 | 0.0551 | 0.3764 | 0.0 | | 0.1796 | 10.1 | 2020 | 0.6002 | 0.3561 | 0.4173 | 0.8698 | nan | 0.8854 | 0.9622 | 0.7805 | 0.7688 | 0.4451 | nan | 0.6674 | 0.7923 | 0.0049 | 0.9383 | 0.0769 | 0.0 | 0.0 | 0.0 | 0.6885 | 0.0 | 0.0 | 0.9317 | 0.1050 | 0.4506 | 0.3630 | 0.0 | nan | 0.0482 | 0.5843 | 0.5063 | 0.0 | 0.9593 | 0.8249 | 0.9777 | 0.0114 | 0.1671 | 0.4151 | 0.0 | nan | 0.7746 | 0.8784 | 0.6935 | 0.6923 | 0.3830 | nan | 0.5434 | 0.5779 | 0.0020 | 0.8516 | 0.0677 | 0.0 | 0.0 | 0.0 | 0.5666 | 0.0 | 0.0 | 0.7279 | 0.1004 | 0.4194 | 0.2902 | 0.0 | nan | 0.0433 | 0.4666 | 0.3417 | 0.0 | 0.8740 | 0.7591 | 0.9439 | 0.0088 | 0.0505 | 0.3390 | 0.0 | | 0.1796 | 10.2 | 2040 | 0.6191 | 0.3570 | 0.4340 | 0.8510 | nan | 0.7408 | 0.9279 | 0.7743 | 0.8695 | 0.5925 | nan | 0.7097 | 0.7978 | 0.0060 | 0.9488 | 0.1261 | 0.0 | 0.0286 | 0.0 | 0.6876 | 0.0 | 0.0 | 0.9216 | 0.1616 | 0.5833 | 0.3228 | 0.0 | nan | 0.1482 | 0.5801 | 0.5649 | 0.0 | 0.9540 | 0.8119 | 0.9782 | 0.0149 | 0.1834 | 0.4523 | 0.0 | nan | 0.6754 | 0.8756 | 0.7088 | 0.5711 | 0.3412 | nan | 0.5429 | 0.5913 | 0.0021 | 0.8415 | 0.1061 | 0.0 | 0.0284 | 0.0 | 0.5505 | 0.0 | 0.0 | 0.7490 | 0.1510 | 0.4931 | 0.2742 | 0.0 | nan | 0.1126 | 0.4679 | 0.3353 | 0.0 | 0.8774 | 0.7492 | 0.9425 | 0.0090 | 0.0610 | 0.3659 | 0.0 | | 0.2422 | 10.3 | 2060 | 0.5916 | 0.3663 | 0.4314 | 0.8704 | nan | 0.8751 | 0.9613 | 0.7615 | 0.7939 | 0.4967 | nan | 0.6193 | 0.8196 | 0.0002 | 0.9310 | 0.1808 | 0.0 | 0.0156 | 0.0 | 0.6595 | 0.0 | 0.0 | 0.9255 | 0.2948 | 0.5006 | 0.2877 | 0.0 | nan | 0.1222 | 0.5373 | 0.5146 | 0.0 | 0.9422 | 0.8742 | 0.9779 | 0.0004 | 0.2006 | 0.5132 | 0.0 | nan | 0.7712 
| 0.8737 | 0.7154 | 0.7130 | 0.3837 | nan | 0.5275 | 0.5635 | 0.0001 | 0.8538 | 0.1490 | 0.0 | 0.0155 | 0.0 | 0.5508 | 0.0 | 0.0 | 0.7359 | 0.2405 | 0.4500 | 0.2541 | 0.0 | nan | 0.0939 | 0.4606 | 0.3456 | 0.0 | 0.8812 | 0.7705 | 0.9437 | 0.0003 | 0.0603 | 0.3672 | 0.0 | | 0.2026 | 10.4 | 2080 | 0.5646 | 0.3630 | 0.4326 | 0.8686 | nan | 0.8538 | 0.9433 | 0.7673 | 0.8307 | 0.5518 | nan | 0.6772 | 0.8276 | 0.0011 | 0.9554 | 0.1327 | 0.0 | 0.0814 | 0.0 | 0.6519 | 0.0 | 0.0 | 0.9182 | 0.0747 | 0.5815 | 0.3975 | 0.0 | nan | 0.0813 | 0.5442 | 0.5356 | 0.0 | 0.9545 | 0.8541 | 0.9815 | 0.0002 | 0.2292 | 0.4163 | 0.0 | nan | 0.7587 | 0.8724 | 0.6988 | 0.7286 | 0.3550 | nan | 0.5457 | 0.5709 | 0.0006 | 0.8423 | 0.1143 | 0.0 | 0.0805 | 0.0 | 0.5323 | 0.0 | 0.0 | 0.7452 | 0.0723 | 0.5020 | 0.2997 | 0.0 | nan | 0.0661 | 0.4535 | 0.3444 | 0.0 | 0.8810 | 0.7710 | 0.9428 | 0.0001 | 0.0852 | 0.3510 | 0.0 | | 0.0985 | 10.5 | 2100 | 0.6142 | 0.3533 | 0.4256 | 0.8627 | nan | 0.8371 | 0.9483 | 0.8197 | 0.7611 | 0.5121 | nan | 0.6438 | 0.8376 | 0.0129 | 0.9141 | 0.1704 | 0.0 | 0.0046 | 0.0 | 0.7401 | 0.0 | 0.0 | 0.9358 | 0.0800 | 0.6162 | 0.3199 | 0.0 | nan | 0.0784 | 0.4951 | 0.4927 | 0.0 | 0.9401 | 0.8604 | 0.9809 | 0.0009 | 0.1092 | 0.5067 | 0.0 | nan | 0.7385 | 0.8643 | 0.6029 | 0.6977 | 0.3564 | nan | 0.5397 | 0.4815 | 0.0045 | 0.8455 | 0.1454 | 0.0 | 0.0046 | 0.0 | 0.5542 | 0.0 | 0.0 | 0.7474 | 0.0779 | 0.5220 | 0.2741 | 0.0 | nan | 0.0691 | 0.4378 | 0.3395 | 0.0 | 0.8814 | 0.7696 | 0.9414 | 0.0006 | 0.0474 | 0.3636 | 0.0 | | 0.129 | 10.6 | 2120 | 0.5970 | 0.3600 | 0.4274 | 0.8712 | nan | 0.8555 | 0.9481 | 0.8121 | 0.8462 | 0.4839 | nan | 0.6820 | 0.7683 | 0.0 | 0.9493 | 0.0611 | 0.0 | 0.0 | 0.0 | 0.7116 | 0.0 | 0.0 | 0.9261 | 0.0578 | 0.5941 | 0.3243 | 0.0 | nan | 0.2018 | 0.5643 | 0.5253 | 0.0 | 0.9469 | 0.8967 | 0.9830 | 0.0139 | 0.0910 | 0.4330 | 0.0 | nan | 0.7628 | 0.8796 | 0.6506 | 0.7280 | 0.3696 | nan | 0.5534 | 0.5205 | 0.0 | 0.8490 | 0.0568 | 0.0 | 0.0 | 0.0 | 0.5488 | 
0.0 | 0.0 | 0.7399 | 0.0564 | 0.5002 | 0.2847 | 0.0 | nan | 0.1699 | 0.4503 | 0.3859 | 0.0 | 0.8830 | 0.7681 | 0.9423 | 0.0101 | 0.0500 | 0.3604 | 0.0 | | 0.2195 | 10.7 | 2140 | 0.5875 | 0.3673 | 0.4346 | 0.8706 | nan | 0.8730 | 0.9515 | 0.7622 | 0.8346 | 0.5210 | nan | 0.6573 | 0.8155 | 0.0 | 0.9442 | 0.0891 | 0.0 | 0.2017 | 0.0 | 0.7272 | 0.0 | 0.0 | 0.9272 | 0.1671 | 0.5758 | 0.2867 | 0.0 | nan | 0.1521 | 0.5740 | 0.4965 | 0.0 | 0.9551 | 0.7943 | 0.9859 | 0.0004 | 0.2049 | 0.4082 | 0.0 | nan | 0.7686 | 0.8777 | 0.7037 | 0.7166 | 0.3780 | nan | 0.5573 | 0.5395 | 0.0 | 0.8537 | 0.0818 | 0.0 | 0.1935 | 0.0 | 0.5376 | 0.0 | 0.0 | 0.7443 | 0.1575 | 0.5101 | 0.2481 | 0.0 | nan | 0.1143 | 0.4611 | 0.3603 | 0.0 | 0.8727 | 0.7406 | 0.9378 | 0.0004 | 0.0644 | 0.3343 | 0.0 | | 0.1582 | 10.8 | 2160 | 0.6342 | 0.3674 | 0.4391 | 0.8630 | nan | 0.8124 | 0.9575 | 0.7501 | 0.7968 | 0.5084 | nan | 0.6739 | 0.8201 | 0.0 | 0.9606 | 0.0620 | 0.0 | 0.1560 | 0.0 | 0.6487 | 0.0 | 0.0 | 0.8970 | 0.2414 | 0.6243 | 0.3606 | 0.0 | nan | 0.1296 | 0.5946 | 0.5022 | 0.0 | 0.9500 | 0.8442 | 0.9796 | 0.0014 | 0.3767 | 0.4047 | 0.0 | nan | 0.7290 | 0.8592 | 0.6867 | 0.6946 | 0.3653 | nan | 0.5461 | 0.5826 | 0.0 | 0.8365 | 0.0584 | 0.0 | 0.1430 | 0.0 | 0.5369 | 0.0 | 0.0 | 0.7539 | 0.2097 | 0.5298 | 0.2998 | 0.0 | nan | 0.1009 | 0.4661 | 0.3656 | 0.0 | 0.8808 | 0.7576 | 0.9442 | 0.0009 | 0.0815 | 0.3260 | 0.0 | | 0.141 | 10.9 | 2180 | 0.5708 | 0.3747 | 0.4457 | 0.8710 | nan | 0.8407 | 0.9478 | 0.7710 | 0.8542 | 0.5662 | nan | 0.6839 | 0.8254 | 0.0 | 0.9435 | 0.0333 | 0.0 | 0.0817 | 0.0 | 0.7243 | 0.0 | 0.0008 | 0.9160 | 0.2876 | 0.6263 | 0.4032 | 0.0 | nan | 0.1865 | 0.6024 | 0.5075 | 0.0 | 0.9494 | 0.8427 | 0.9741 | 0.0032 | 0.2235 | 0.4666 | 0.0 | nan | 0.7579 | 0.8806 | 0.6630 | 0.7200 | 0.3798 | nan | 0.5666 | 0.6190 | 0.0 | 0.8524 | 0.0318 | 0.0 | 0.0809 | 0.0 | 0.5689 | 0.0 | 0.0007 | 0.7592 | 0.2481 | 0.5451 | 0.3295 | 0.0 | nan | 0.1405 | 0.4706 | 0.3839 | 0.0 | 0.8806 | 0.7427 | 0.9450 | 
0.0021 | 0.0652 | 0.3562 | 0.0 | | 0.3199 | 11.0 | 2200 | 0.5767 | 0.3700 | 0.4486 | 0.8663 | nan | 0.8247 | 0.9515 | 0.8180 | 0.8596 | 0.5109 | nan | 0.5983 | 0.8435 | 0.0015 | 0.9562 | 0.0444 | 0.0 | 0.1568 | 0.0 | 0.7398 | 0.0 | 0.0 | 0.9145 | 0.3573 | 0.6270 | 0.3562 | 0.0 | nan | 0.1856 | 0.5679 | 0.5520 | 0.0 | 0.9489 | 0.7975 | 0.9869 | 0.0110 | 0.3207 | 0.4260 | 0.0 | nan | 0.7408 | 0.8811 | 0.6150 | 0.7041 | 0.3718 | nan | 0.5332 | 0.5834 | 0.0008 | 0.8431 | 0.0424 | 0.0 | 0.1501 | 0.0 | 0.5590 | 0.0 | 0.0 | 0.7555 | 0.2712 | 0.5466 | 0.2982 | 0.0 | nan | 0.1552 | 0.4603 | 0.3707 | 0.0 | 0.8720 | 0.7199 | 0.9373 | 0.0073 | 0.0800 | 0.3409 | 0.0 | | 0.3278 | 11.1 | 2220 | 0.6226 | 0.3555 | 0.4225 | 0.8637 | nan | 0.8071 | 0.9370 | 0.7867 | 0.8534 | 0.6282 | nan | 0.6651 | 0.8242 | 0.0 | 0.9478 | 0.0697 | 0.0 | 0.0220 | 0.0 | 0.7002 | 0.0 | 0.0007 | 0.9362 | 0.0432 | 0.5970 | 0.2850 | 0.0 | nan | 0.0440 | 0.5153 | 0.5356 | 0.0 | 0.9503 | 0.8485 | 0.9854 | 0.0018 | 0.0713 | 0.4643 | 0.0 | nan | 0.7277 | 0.8729 | 0.6498 | 0.7103 | 0.3589 | nan | 0.5541 | 0.5788 | 0.0 | 0.8502 | 0.0648 | 0.0 | 0.0218 | 0.0 | 0.5862 | 0.0 | 0.0006 | 0.7406 | 0.0431 | 0.5097 | 0.2525 | 0.0 | nan | 0.0372 | 0.4472 | 0.3750 | 0.0 | 0.8800 | 0.7613 | 0.9395 | 0.0014 | 0.0410 | 0.3711 | 0.0 | | 0.1683 | 11.2 | 2240 | 0.6238 | 0.3707 | 0.4469 | 0.8674 | nan | 0.8060 | 0.9628 | 0.7842 | 0.8534 | 0.4724 | nan | 0.6818 | 0.8545 | 0.0 | 0.9610 | 0.0733 | 0.0 | 0.2164 | 0.0 | 0.6961 | 0.0 | 0.0 | 0.9128 | 0.3061 | 0.5964 | 0.3623 | 0.0 | nan | 0.2198 | 0.5741 | 0.5310 | 0.0 | 0.9357 | 0.8772 | 0.9830 | 0.0016 | 0.1934 | 0.4458 | 0.0 | nan | 0.7305 | 0.8713 | 0.6423 | 0.7167 | 0.3767 | nan | 0.5533 | 0.5352 | 0.0 | 0.8412 | 0.0703 | 0.0 | 0.2041 | 0.0 | 0.5618 | 0.0 | 0.0 | 0.7557 | 0.2336 | 0.5064 | 0.2999 | 0.0 | nan | 0.1291 | 0.4647 | 0.3650 | 0.0 | 0.8804 | 0.7569 | 0.9428 | 0.0015 | 0.0728 | 0.3509 | 0.0 | | 0.1605 | 11.3 | 2260 | 0.5795 | 0.3728 | 0.4441 | 0.8687 | nan | 0.8542 | 
0.9531 | 0.7614 | 0.8369 | 0.5294 | nan | 0.6406 | 0.7638 | 0.0 | 0.9529 | 0.1167 | 0.0 | 0.3061 | 0.0 | 0.7072 | 0.0 | 0.0 | 0.9357 | 0.1757 | 0.5333 | 0.3332 | 0.0 | nan | 0.2005 | 0.5990 | 0.5629 | 0.0 | 0.9396 | 0.8592 | 0.9558 | 0.0038 | 0.2350 | 0.4551 | 0.0 | nan | 0.7537 | 0.8742 | 0.6982 | 0.7383 | 0.3793 | nan | 0.5505 | 0.6206 | 0.0 | 0.8509 | 0.1002 | 0.0 | 0.2371 | 0.0 | 0.5679 | 0.0 | 0.0 | 0.7413 | 0.1622 | 0.4769 | 0.2774 | 0.0 | nan | 0.1264 | 0.4381 | 0.3432 | 0.0 | 0.8818 | 0.7618 | 0.9269 | 0.0031 | 0.0698 | 0.3487 | 0.0 | | 0.1396 | 11.4 | 2280 | 0.5767 | 0.3647 | 0.4384 | 0.8682 | nan | 0.8746 | 0.9356 | 0.8180 | 0.8394 | 0.4808 | nan | 0.6852 | 0.7584 | 0.0 | 0.9572 | 0.1009 | 0.0 | 0.3371 | 0.0 | 0.6918 | 0.0 | 0.0 | 0.9334 | 0.0883 | 0.5746 | 0.3300 | 0.0 | nan | 0.1219 | 0.5086 | 0.5141 | 0.0 | 0.9436 | 0.8834 | 0.9812 | 0.0021 | 0.2657 | 0.4033 | 0.0 | nan | 0.7602 | 0.8805 | 0.5760 | 0.7281 | 0.3592 | nan | 0.5368 | 0.6123 | 0.0 | 0.8479 | 0.0859 | 0.0 | 0.2574 | 0.0 | 0.5466 | 0.0 | 0.0 | 0.7417 | 0.0864 | 0.4825 | 0.2789 | 0.0 | nan | 0.0901 | 0.4440 | 0.3503 | 0.0 | 0.8823 | 0.7613 | 0.9418 | 0.0021 | 0.0847 | 0.3331 | 0.0 | | 0.1348 | 11.5 | 2300 | 0.6258 | 0.3692 | 0.4448 | 0.8655 | nan | 0.8082 | 0.9612 | 0.7932 | 0.8424 | 0.4898 | nan | 0.6678 | 0.8474 | 0.0 | 0.9463 | 0.1012 | 0.0 | 0.5633 | 0.0 | 0.6651 | 0.0 | 0.0 | 0.9174 | 0.1172 | 0.5338 | 0.3249 | 0.0 | nan | 0.1375 | 0.5718 | 0.5233 | 0.0 | 0.9462 | 0.8342 | 0.9834 | 0.0045 | 0.1324 | 0.5217 | 0.0 | nan | 0.7343 | 0.8648 | 0.6512 | 0.7241 | 0.3822 | nan | 0.5542 | 0.5387 | 0.0 | 0.8505 | 0.0884 | 0.0 | 0.3417 | 0.0 | 0.5591 | 0.0 | 0.0 | 0.7463 | 0.1116 | 0.4585 | 0.2749 | 0.0 | nan | 0.1039 | 0.4578 | 0.3698 | 0.0 | 0.8810 | 0.7603 | 0.9383 | 0.0037 | 0.0536 | 0.3669 | 0.0 | | 0.1144 | 11.6 | 2320 | 0.6251 | 0.3706 | 0.4457 | 0.8656 | nan | 0.8156 | 0.9497 | 0.8051 | 0.8338 | 0.5297 | nan | 0.7026 | 0.8641 | 0.0015 | 0.9475 | 0.0807 | 0.0 | 0.5249 | 0.0 | 0.7219 | 0.0 | 
0.0 | 0.9244 | 0.1660 | 0.5631 | 0.2868 | 0.0 | nan | 0.0826 | 0.5438 | 0.5127 | 0.0 | 0.9554 | 0.8453 | 0.9789 | 0.0003 | 0.1870 | 0.4395 | 0.0 | nan | 0.7347 | 0.8709 | 0.6465 | 0.7180 | 0.3731 | nan | 0.5482 | 0.5144 | 0.0008 | 0.8519 | 0.0725 | 0.0 | 0.4376 | 0.0 | 0.5665 | 0.0 | 0.0 | 0.7457 | 0.1464 | 0.4915 | 0.2523 | 0.0 | nan | 0.0738 | 0.4565 | 0.3591 | 0.0 | 0.8778 | 0.7599 | 0.9440 | 0.0003 | 0.0663 | 0.3493 | 0.0 | | 0.1364 | 11.7 | 2340 | 0.6131 | 0.3674 | 0.4372 | 0.8665 | nan | 0.8147 | 0.9550 | 0.7895 | 0.8548 | 0.4751 | nan | 0.6712 | 0.7707 | 0.0073 | 0.9493 | 0.0768 | 0.0 | 0.0791 | 0.0 | 0.7509 | 0.0 | 0.0 | 0.9269 | 0.2658 | 0.5650 | 0.3656 | 0.0 | nan | 0.1236 | 0.5760 | 0.5266 | 0.0 | 0.9430 | 0.8617 | 0.9851 | 0.0006 | 0.2049 | 0.4518 | 0.0 | nan | 0.7291 | 0.8676 | 0.6741 | 0.6949 | 0.3800 | nan | 0.5491 | 0.5884 | 0.0023 | 0.8533 | 0.0695 | 0.0 | 0.0772 | 0.0 | 0.5511 | 0.0 | 0.0 | 0.7475 | 0.2158 | 0.4930 | 0.2880 | 0.0 | nan | 0.1135 | 0.4663 | 0.3732 | 0.0 | 0.8833 | 0.7754 | 0.9408 | 0.0006 | 0.0709 | 0.3511 | 0.0 | | 0.2197 | 11.8 | 2360 | 0.5734 | 0.3734 | 0.4465 | 0.8701 | nan | 0.8327 | 0.9541 | 0.7742 | 0.8469 | 0.5193 | nan | 0.6696 | 0.7860 | 0.0435 | 0.9492 | 0.0867 | 0.0 | 0.1161 | 0.0 | 0.7444 | 0.0 | 0.0 | 0.9244 | 0.2077 | 0.6531 | 0.3496 | 0.0 | nan | 0.2229 | 0.5427 | 0.5329 | 0.0 | 0.9429 | 0.8597 | 0.9803 | 0.0501 | 0.2688 | 0.4307 | 0.0 | nan | 0.7428 | 0.8713 | 0.6954 | 0.7325 | 0.3784 | nan | 0.5642 | 0.5273 | 0.0127 | 0.8512 | 0.0784 | 0.0 | 0.1132 | 0.0 | 0.5606 | 0.0 | 0.0 | 0.7609 | 0.1806 | 0.5258 | 0.2898 | 0.0 | nan | 0.1667 | 0.4618 | 0.3730 | 0.0 | 0.8832 | 0.7786 | 0.9455 | 0.0381 | 0.0867 | 0.3316 | 0.0 | | 0.2554 | 11.9 | 2380 | 0.5918 | 0.3705 | 0.4438 | 0.8689 | nan | 0.8391 | 0.9453 | 0.7625 | 0.8541 | 0.5608 | nan | 0.7264 | 0.8564 | 0.0 | 0.9460 | 0.1626 | 0.0 | 0.1483 | 0.0 | 0.7141 | 0.0 | 0.0 | 0.9265 | 0.1750 | 0.5530 | 0.3455 | 0.0 | nan | 0.1034 | 0.5706 | 0.5443 | 0.0 | 0.9554 | 0.8439 | 
0.9802 | 0.0052 | 0.3029 | 0.3817 | 0.0 | nan | 0.7510 | 0.8732 | 0.6893 | 0.7540 | 0.3691 | nan | 0.5639 | 0.5705 | 0.0 | 0.8499 | 0.1376 | 0.0 | 0.1311 | 0.0 | 0.5748 | 0.0 | 0.0 | 0.7445 | 0.1558 | 0.4815 | 0.2829 | 0.0 | nan | 0.0955 | 0.4686 | 0.3581 | 0.0 | 0.8806 | 0.7759 | 0.9447 | 0.0045 | 0.0779 | 0.3222 | 0.0 | | 0.2711 | 12.0 | 2400 | 0.6319 | 0.3632 | 0.4330 | 0.8671 | nan | 0.8422 | 0.9554 | 0.7821 | 0.8127 | 0.5184 | nan | 0.6174 | 0.6287 | 0.2510 | 0.9395 | 0.1899 | 0.0 | 0.0304 | 0.0 | 0.7609 | 0.0 | 0.0 | 0.9433 | 0.1254 | 0.5463 | 0.3575 | 0.0 | nan | 0.0987 | 0.5525 | 0.4929 | 0.0 | 0.9425 | 0.8727 | 0.9788 | 0.0021 | 0.1966 | 0.4170 | 0.0 | nan | 0.7427 | 0.8662 | 0.6861 | 0.7240 | 0.3806 | nan | 0.5351 | 0.5373 | 0.0305 | 0.8533 | 0.1544 | 0.0 | 0.0300 | 0.0 | 0.5271 | 0.0 | 0.0 | 0.7409 | 0.1153 | 0.4812 | 0.2847 | 0.0 | nan | 0.0876 | 0.4611 | 0.3645 | 0.0 | 0.8860 | 0.7777 | 0.9444 | 0.0015 | 0.0678 | 0.3417 | 0.0 | | 0.1332 | 12.1 | 2420 | 0.6043 | 0.3666 | 0.4293 | 0.8684 | nan | 0.8262 | 0.9548 | 0.7555 | 0.8478 | 0.5225 | nan | 0.6493 | 0.8324 | 0.0038 | 0.9513 | 0.1307 | 0.0 | 0.0527 | 0.0 | 0.6916 | 0.0 | 0.0 | 0.9388 | 0.1511 | 0.5920 | 0.3121 | 0.0 | nan | 0.0948 | 0.5519 | 0.4701 | 0.0 | 0.9443 | 0.8559 | 0.9821 | 0.0002 | 0.1775 | 0.4472 | 0.0 | nan | 0.7376 | 0.8679 | 0.7048 | 0.7237 | 0.3671 | nan | 0.5495 | 0.5856 | 0.0021 | 0.8509 | 0.1173 | 0.0 | 0.0517 | 0.0 | 0.5722 | 0.0 | 0.0 | 0.7460 | 0.1402 | 0.5015 | 0.2701 | 0.0 | nan | 0.0822 | 0.4603 | 0.3715 | 0.0 | 0.8849 | 0.7750 | 0.9458 | 0.0002 | 0.0680 | 0.3565 | 0.0 | | 0.2308 | 12.2 | 2440 | 0.5906 | 0.3718 | 0.4328 | 0.8706 | nan | 0.8531 | 0.9513 | 0.7697 | 0.7995 | 0.5138 | nan | 0.6808 | 0.8252 | 0.0 | 0.9505 | 0.1204 | 0.0 | 0.1194 | 0.0 | 0.6586 | 0.0 | 0.0 | 0.9241 | 0.0845 | 0.5784 | 0.3090 | 0.0 | nan | 0.1074 | 0.5745 | 0.5363 | 0.0 | 0.9534 | 0.8744 | 0.9838 | 0.0069 | 0.1622 | 0.5118 | 0.0 | nan | 0.7502 | 0.8699 | 0.7013 | 0.7250 | 0.3777 | nan | 0.5536 | 
0.6461 | 0.0 | 0.8491 | 0.1068 | 0.0 | 0.1119 | 0.0 | 0.5816 | 0.0 | 0.0 | 0.7484 | 0.0827 | 0.5018 | 0.2731 | 0.0 | nan | 0.0941 | 0.4679 | 0.3875 | 0.0 | 0.8832 | 0.7777 | 0.9435 | 0.0049 | 0.0723 | 0.3883 | 0.0 | | 0.1515 | 12.3 | 2460 | 0.5838 | 0.3799 | 0.4509 | 0.8692 | nan | 0.8421 | 0.9520 | 0.7778 | 0.8411 | 0.5345 | nan | 0.6488 | 0.8483 | 0.0140 | 0.9347 | 0.1763 | 0.0 | 0.4235 | 0.0 | 0.6691 | 0.0 | 0.0 | 0.9141 | 0.1371 | 0.6244 | 0.3622 | 0.0 | nan | 0.1605 | 0.5416 | 0.5749 | 0.0 | 0.9604 | 0.8121 | 0.9781 | 0.0010 | 0.2560 | 0.4436 | 0.0 | nan | 0.7502 | 0.8705 | 0.6726 | 0.7230 | 0.3918 | nan | 0.5458 | 0.5629 | 0.0056 | 0.8510 | 0.1486 | 0.0 | 0.3683 | 0.0 | 0.5684 | 0.0 | 0.0 | 0.7557 | 0.1281 | 0.5094 | 0.3033 | 0.0 | nan | 0.1315 | 0.4601 | 0.3689 | 0.0 | 0.8734 | 0.7513 | 0.9462 | 0.0008 | 0.1108 | 0.3598 | 0.0 | | 0.1225 | 12.4 | 2480 | 0.6349 | 0.3697 | 0.4419 | 0.8653 | nan | 0.8298 | 0.9563 | 0.7270 | 0.8458 | 0.4908 | nan | 0.6935 | 0.8170 | 0.0016 | 0.9535 | 0.1277 | 0.0 | 0.1943 | 0.0 | 0.6818 | 0.0 | 0.0 | 0.9243 | 0.0993 | 0.5907 | 0.3374 | 0.0 | nan | 0.1578 | 0.5889 | 0.5923 | 0.0 | 0.9541 | 0.7663 | 0.9832 | 0.0017 | 0.4623 | 0.3629 | 0.0 | nan | 0.7383 | 0.8689 | 0.6986 | 0.7201 | 0.3735 | nan | 0.5547 | 0.6206 | 0.0008 | 0.8499 | 0.1108 | 0.0 | 0.1761 | 0.0 | 0.5647 | 0.0 | 0.0 | 0.7506 | 0.0919 | 0.5152 | 0.2830 | 0.0 | nan | 0.1368 | 0.4754 | 0.3761 | 0.0 | 0.8704 | 0.7117 | 0.9433 | 0.0012 | 0.0886 | 0.3105 | 0.0 | | 0.17 | 12.5 | 2500 | 0.6301 | 0.3746 | 0.4515 | 0.8591 | nan | 0.7800 | 0.9414 | 0.7639 | 0.8348 | 0.5670 | nan | 0.6941 | 0.8344 | 0.0025 | 0.9472 | 0.1556 | 0.0 | 0.2321 | 0.0 | 0.7384 | 0.0 | 0.0 | 0.9207 | 0.1489 | 0.5901 | 0.3355 | 0.0 | nan | 0.2611 | 0.5878 | 0.5471 | 0.0 | 0.9567 | 0.8401 | 0.9839 | 0.0020 | 0.3890 | 0.3939 | 0.0 | nan | 0.6970 | 0.8698 | 0.7009 | 0.6399 | 0.3747 | nan | 0.5503 | 0.6201 | 0.0013 | 0.8517 | 0.1344 | 0.0 | 0.2098 | 0.0 | 0.5732 | 0.0 | 0.0 | 0.7501 | 0.1287 | 0.5132 | 0.2866 
| 0.0 | nan | 0.2142 | 0.4713 | 0.3932 | 0.0 | 0.8804 | 0.7604 | 0.9405 | 0.0012 | 0.0953 | 0.3289 | 0.0 | | 0.165 | 12.6 | 2520 | 0.6584 | 0.3661 | 0.4352 | 0.8570 | nan | 0.7368 | 0.9524 | 0.7641 | 0.8569 | 0.5238 | nan | 0.6428 | 0.8209 | 0.0036 | 0.9346 | 0.1399 | 0.0 | 0.1906 | 0.0 | 0.7413 | 0.0 | 0.0 | 0.9404 | 0.2085 | 0.5960 | 0.3233 | 0.0 | nan | 0.1134 | 0.5718 | 0.5039 | 0.0 | 0.9471 | 0.8686 | 0.9802 | 0.0023 | 0.1083 | 0.4567 | 0.0 | nan | 0.6616 | 0.8708 | 0.6893 | 0.5929 | 0.3828 | nan | 0.5459 | 0.5759 | 0.0018 | 0.8532 | 0.1247 | 0.0 | 0.1748 | 0.0 | 0.5620 | 0.0 | 0.0 | 0.7446 | 0.1639 | 0.4943 | 0.2780 | 0.0 | nan | 0.0852 | 0.4645 | 0.3842 | 0.0 | 0.8847 | 0.7794 | 0.9446 | 0.0020 | 0.0760 | 0.3766 | 0.0 | | 0.1429 | 12.7 | 2540 | 0.6257 | 0.3674 | 0.4410 | 0.8642 | nan | 0.8160 | 0.9534 | 0.7486 | 0.8338 | 0.4912 | nan | 0.6777 | 0.8433 | 0.0 | 0.9496 | 0.1179 | 0.0 | 0.3490 | 0.0 | 0.7247 | 0.0 | 0.0 | 0.9433 | 0.1158 | 0.5400 | 0.3428 | 0.0 | nan | 0.0911 | 0.6048 | 0.5402 | 0.0 | 0.9376 | 0.8745 | 0.9812 | 0.0009 | 0.2446 | 0.3913 | 0.0 | nan | 0.7176 | 0.8700 | 0.6719 | 0.6720 | 0.3812 | nan | 0.5478 | 0.5687 | 0.0 | 0.8470 | 0.1068 | 0.0 | 0.2404 | 0.0 | 0.5742 | 0.0 | 0.0 | 0.7373 | 0.1023 | 0.4654 | 0.2893 | 0.0 | nan | 0.0700 | 0.4721 | 0.3740 | 0.0 | 0.8863 | 0.7742 | 0.9449 | 0.0008 | 0.1089 | 0.3334 | 0.0 | | 0.2148 | 12.8 | 2560 | 0.6506 | 0.3615 | 0.4361 | 0.8657 | nan | 0.8294 | 0.9461 | 0.7752 | 0.8529 | 0.5457 | nan | 0.6227 | 0.8170 | 0.0 | 0.9457 | 0.1039 | 0.0 | 0.1968 | 0.0 | 0.7317 | 0.0 | 0.0 | 0.9237 | 0.0543 | 0.5955 | 0.3732 | 0.0 | nan | 0.0714 | 0.5452 | 0.5722 | 0.0 | 0.9555 | 0.8497 | 0.9830 | 0.0013 | 0.3009 | 0.3629 | 0.0 | nan | 0.7423 | 0.8720 | 0.6351 | 0.7116 | 0.3801 | nan | 0.5362 | 0.5780 | 0.0 | 0.8514 | 0.0939 | 0.0 | 0.1506 | 0.0 | 0.5538 | 0.0 | 0.0 | 0.7452 | 0.0523 | 0.4998 | 0.3005 | 0.0 | nan | 0.0557 | 0.4536 | 0.3680 | 0.0 | 0.8828 | 0.7690 | 0.9439 | 0.0008 | 0.0857 | 0.3067 | 0.0 | | 0.1008 | 
12.9 | 2580 | 0.6366 | 0.3646 | 0.4308 | 0.8690 | nan | 0.8348 | 0.9596 | 0.7654 | 0.8486 | 0.4781 | nan | 0.6440 | 0.8230 | 0.0 | 0.9401 | 0.0809 | 0.0 | 0.0403 | 0.0 | 0.7241 | 0.0 | 0.0 | 0.9404 | 0.0910 | 0.5811 | 0.3870 | 0.0 | nan | 0.0529 | 0.6005 | 0.5440 | 0.0 | 0.9417 | 0.8558 | 0.9761 | 0.0091 | 0.2742 | 0.3930 | 0.0 | nan | 0.7476 | 0.8739 | 0.6846 | 0.7247 | 0.3771 | nan | 0.5427 | 0.6177 | 0.0 | 0.8530 | 0.0728 | 0.0 | 0.0367 | 0.0 | 0.5974 | 0.0 | 0.0 | 0.7460 | 0.0835 | 0.4990 | 0.3228 | 0.0 | nan | 0.0427 | 0.4805 | 0.3583 | 0.0 | 0.8849 | 0.7654 | 0.9476 | 0.0044 | 0.0760 | 0.3287 | 0.0 | | 0.1522 | 13.0 | 2600 | 0.6428 | 0.3646 | 0.4295 | 0.8690 | nan | 0.8303 | 0.9632 | 0.7664 | 0.8164 | 0.4636 | nan | 0.6786 | 0.7941 | 0.0266 | 0.9519 | 0.0963 | 0.0 | 0.0264 | 0.0 | 0.7352 | 0.0 | 0.0 | 0.9292 | 0.1689 | 0.5832 | 0.3683 | 0.0 | nan | 0.0134 | 0.5600 | 0.5372 | 0.0 | 0.9532 | 0.8346 | 0.9827 | 0.0022 | 0.2319 | 0.4291 | 0.0 | nan | 0.7495 | 0.8672 | 0.6985 | 0.7028 | 0.3705 | nan | 0.5573 | 0.5974 | 0.0090 | 0.8505 | 0.0855 | 0.0 | 0.0226 | 0.0 | 0.5779 | 0.0 | 0.0 | 0.7493 | 0.1519 | 0.4990 | 0.3153 | 0.0 | nan | 0.0110 | 0.4666 | 0.3700 | 0.0 | 0.8830 | 0.7672 | 0.9455 | 0.0016 | 0.0710 | 0.3461 | 0.0 | | 0.2057 | 13.1 | 2620 | 0.6362 | 0.3648 | 0.4290 | 0.8681 | nan | 0.8362 | 0.9518 | 0.7971 | 0.8013 | 0.4958 | nan | 0.6809 | 0.8544 | 0.0 | 0.9491 | 0.1131 | 0.0 | 0.0089 | 0.0 | 0.7202 | 0.0 | 0.0 | 0.9329 | 0.0788 | 0.5645 | 0.3541 | 0.0 | nan | 0.0622 | 0.5870 | 0.5498 | 0.0 | 0.9617 | 0.8374 | 0.9785 | 0.0082 | 0.1591 | 0.4444 | 0.0 | nan | 0.7457 | 0.8653 | 0.7014 | 0.7078 | 0.3798 | nan | 0.5451 | 0.6337 | 0.0 | 0.8506 | 0.1009 | 0.0 | 0.0084 | 0.0 | 0.5899 | 0.0 | 0.0 | 0.7449 | 0.0734 | 0.4999 | 0.3024 | 0.0 | nan | 0.0552 | 0.4695 | 0.3761 | 0.0 | 0.8769 | 0.7513 | 0.9487 | 0.0065 | 0.0730 | 0.3664 | 0.0 | | 0.1275 | 13.2 | 2640 | 0.6206 | 0.3704 | 0.4324 | 0.8716 | nan | 0.8514 | 0.9568 | 0.7906 | 0.8056 | 0.5298 | nan | 0.6604 | 
0.8550 | 0.0 | 0.9368 | 0.1211 | 0.0 | 0.0309 | 0.0 | 0.7165 | 0.0 | 0.0 | 0.9430 | 0.1613 | 0.5821 | 0.3947 | 0.0 | nan | 0.0998 | 0.5349 | 0.5008 | 0.0 | 0.9382 | 0.8708 | 0.9835 | 0.0024 | 0.0890 | 0.4824 | 0.0 | nan | 0.7574 | 0.8693 | 0.7222 | 0.7114 | 0.3926 | nan | 0.5542 | 0.5887 | 0.0 | 0.8555 | 0.1079 | 0.0 | 0.0280 | 0.0 | 0.6065 | 0.0 | 0.0 | 0.7420 | 0.1495 | 0.5062 | 0.3216 | 0.0 | nan | 0.0856 | 0.4645 | 0.3651 | 0.0 | 0.8853 | 0.7743 | 0.9452 | 0.0023 | 0.0430 | 0.3754 | 0.0 | | 0.1347 | 13.3 | 2660 | 0.6095 | 0.3719 | 0.4455 | 0.8700 | nan | 0.8279 | 0.9563 | 0.7853 | 0.8425 | 0.5450 | nan | 0.6901 | 0.8485 | 0.0 | 0.9497 | 0.1046 | 0.0 | 0.0767 | 0.0 | 0.7409 | 0.0 | 0.0 | 0.9110 | 0.2988 | 0.5294 | 0.3394 | 0.0 | nan | 0.1154 | 0.6044 | 0.5436 | 0.0 | 0.9577 | 0.8662 | 0.9823 | 0.0019 | 0.3407 | 0.3988 | 0.0 | nan | 0.7461 | 0.8734 | 0.6989 | 0.7428 | 0.3914 | nan | 0.5611 | 0.5997 | 0.0 | 0.8506 | 0.0949 | 0.0 | 0.0700 | 0.0 | 0.5830 | 0.0 | 0.0 | 0.7483 | 0.2296 | 0.4694 | 0.2874 | 0.0 | nan | 0.0914 | 0.4729 | 0.3638 | 0.0 | 0.8769 | 0.7703 | 0.9457 | 0.0016 | 0.0945 | 0.3362 | 0.0 | | 0.2485 | 13.4 | 2680 | 0.6268 | 0.3714 | 0.4387 | 0.8683 | nan | 0.8264 | 0.9537 | 0.8050 | 0.8298 | 0.5365 | nan | 0.6674 | 0.8409 | 0.0 | 0.9517 | 0.1271 | 0.0 | 0.0620 | 0.0 | 0.7334 | 0.0 | 0.0001 | 0.9130 | 0.1673 | 0.6948 | 0.2833 | 0.0 | nan | 0.1799 | 0.5789 | 0.4927 | 0.0 | 0.9572 | 0.7666 | 0.9814 | 0.0028 | 0.2146 | 0.4709 | 0.0 | nan | 0.7393 | 0.8683 | 0.6553 | 0.7410 | 0.3897 | nan | 0.5474 | 0.6279 | 0.0 | 0.8533 | 0.1132 | 0.0 | 0.0553 | 0.0 | 0.6019 | 0.0 | 0.0001 | 0.7628 | 0.1513 | 0.5262 | 0.2585 | 0.0 | nan | 0.1395 | 0.4651 | 0.3625 | 0.0 | 0.8740 | 0.7215 | 0.9462 | 0.0020 | 0.1102 | 0.3729 | 0.0 | | 0.1525 | 13.5 | 2700 | 0.6110 | 0.3650 | 0.4255 | 0.8704 | nan | 0.8474 | 0.9557 | 0.7782 | 0.8435 | 0.5040 | nan | 0.6619 | 0.7982 | 0.0 | 0.9442 | 0.0899 | 0.0 | 0.0353 | 0.0 | 0.6722 | 0.0 | 0.0 | 0.9479 | 0.0813 | 0.5063 | 0.3830 | 0.0 | 
nan | 0.0530 | 0.5744 | 0.5097 | 0.0 | 0.9437 | 0.8528 | 0.9840 | 0.0021 | 0.1918 | 0.4547 | 0.0 | nan | 0.7505 | 0.8771 | 0.6676 | 0.7346 | 0.3815 | nan | 0.5483 | 0.6471 | 0.0 | 0.8551 | 0.0823 | 0.0 | 0.0322 | 0.0 | 0.5850 | 0.0 | 0.0 | 0.7303 | 0.0762 | 0.4660 | 0.3171 | 0.0 | nan | 0.0483 | 0.4664 | 0.3777 | 0.0 | 0.8836 | 0.7640 | 0.9446 | 0.0016 | 0.0762 | 0.3674 | 0.0 | | 0.143 | 13.6 | 2720 | 0.6194 | 0.3740 | 0.4436 | 0.8696 | nan | 0.8290 | 0.9541 | 0.7935 | 0.8487 | 0.5119 | nan | 0.6949 | 0.8598 | 0.0002 | 0.9392 | 0.1333 | 0.0 | 0.1574 | 0.0 | 0.6618 | 0.0 | 0.0 | 0.9352 | 0.2334 | 0.5895 | 0.3310 | 0.0 | nan | 0.1549 | 0.5873 | 0.5214 | 0.0 | 0.9487 | 0.8358 | 0.9781 | 0.0015 | 0.2644 | 0.4311 | 0.0 | nan | 0.7467 | 0.8738 | 0.6418 | 0.7283 | 0.3822 | nan | 0.5565 | 0.6359 | 0.0002 | 0.8549 | 0.1172 | 0.0 | 0.1287 | 0.0 | 0.5698 | 0.0 | 0.0 | 0.7493 | 0.1940 | 0.5161 | 0.2898 | 0.0 | nan | 0.1212 | 0.4736 | 0.3769 | 0.0 | 0.8815 | 0.7545 | 0.9493 | 0.0012 | 0.0793 | 0.3459 | 0.0 | | 0.1094 | 13.7 | 2740 | 0.6446 | 0.3740 | 0.4465 | 0.8647 | nan | 0.8092 | 0.9513 | 0.7723 | 0.8255 | 0.5528 | nan | 0.6324 | 0.8453 | 0.0056 | 0.9468 | 0.1589 | 0.0 | 0.2978 | 0.0000 | 0.6940 | 0.0 | 0.0 | 0.9222 | 0.1008 | 0.6086 | 0.4271 | 0.0 | nan | 0.1750 | 0.5541 | 0.5967 | 0.0 | 0.9486 | 0.8623 | 0.9790 | 0.0019 | 0.1964 | 0.4233 | 0.0 | nan | 0.7344 | 0.8628 | 0.6750 | 0.6872 | 0.3726 | nan | 0.5472 | 0.6229 | 0.0033 | 0.8487 | 0.1334 | 0.0 | 0.2540 | 0.0000 | 0.5686 | 0.0 | 0.0 | 0.7526 | 0.0916 | 0.5157 | 0.3333 | 0.0 | nan | 0.1280 | 0.4655 | 0.3615 | 0.0 | 0.8826 | 0.7633 | 0.9485 | 0.0014 | 0.0656 | 0.3464 | 0.0 | | 0.1719 | 13.8 | 2760 | 0.6517 | 0.3692 | 0.4442 | 0.8633 | nan | 0.8127 | 0.9523 | 0.7780 | 0.8328 | 0.4678 | nan | 0.6884 | 0.8158 | 0.0169 | 0.9453 | 0.1204 | 0.0 | 0.3134 | 0.0 | 0.7445 | 0.0 | 0.0 | 0.9160 | 0.1944 | 0.5866 | 0.3583 | 0.0 | nan | 0.0522 | 0.5863 | 0.5834 | 0.0 | 0.9586 | 0.7890 | 0.9860 | 0.0052 | 0.2654 | 0.4462 | 0.0 | nan | 
0.7298 | 0.8677 | 0.6782 | 0.6843 | 0.3674 | nan | 0.5435 | 0.6222 | 0.0066 | 0.8529 | 0.1059 | 0.0 | 0.2405 | 0.0 | 0.5573 | 0.0 | 0.0 | 0.7502 | 0.1709 | 0.5163 | 0.2980 | 0.0 | nan | 0.0417 | 0.4815 | 0.3531 | 0.0 | 0.8734 | 0.7261 | 0.9391 | 0.0036 | 0.0683 | 0.3350 | 0.0 | | 0.197 | 13.9 | 2780 | 0.6215 | 0.3729 | 0.4418 | 0.8687 | nan | 0.8668 | 0.9514 | 0.7759 | 0.7972 | 0.5255 | nan | 0.6496 | 0.8175 | 0.0228 | 0.9515 | 0.0731 | 0.0 | 0.2487 | 0.0 | 0.7049 | 0.0 | 0.0 | 0.9290 | 0.1966 | 0.5696 | 0.3480 | 0.0 | nan | 0.1606 | 0.5942 | 0.5603 | 0.0 | 0.9526 | 0.7950 | 0.9848 | 0.0269 | 0.2232 | 0.4112 | 0.0 | nan | 0.7569 | 0.8791 | 0.6367 | 0.7200 | 0.3761 | nan | 0.5464 | 0.6425 | 0.0090 | 0.8494 | 0.0679 | 0.0 | 0.1974 | 0.0 | 0.5789 | 0.0 | 0.0 | 0.7471 | 0.1676 | 0.5024 | 0.2995 | 0.0 | nan | 0.1214 | 0.4876 | 0.3790 | 0.0 | 0.8760 | 0.7302 | 0.9424 | 0.0164 | 0.0679 | 0.3361 | 0.0 | | 0.1238 | 14.0 | 2800 | 0.6008 | 0.3822 | 0.4546 | 0.8742 | nan | 0.8710 | 0.9515 | 0.7717 | 0.8274 | 0.4743 | nan | 0.7189 | 0.8324 | 0.0504 | 0.9358 | 0.1246 | 0.0 | 0.4430 | 0.0 | 0.7318 | 0.0 | 0.0173 | 0.9097 | 0.0782 | 0.6669 | 0.3868 | 0.0 | nan | 0.1546 | 0.5889 | 0.5536 | 0.0 | 0.9543 | 0.8475 | 0.9784 | 0.0074 | 0.1882 | 0.4838 | 0.0 | nan | 0.7677 | 0.8817 | 0.6589 | 0.7277 | 0.3922 | nan | 0.5545 | 0.6251 | 0.0184 | 0.8534 | 0.1129 | 0.0 | 0.3401 | 0.0 | 0.5678 | 0.0 | 0.0133 | 0.7551 | 0.0757 | 0.5311 | 0.3201 | 0.0 | nan | 0.1346 | 0.4819 | 0.3939 | 0.0 | 0.8806 | 0.7508 | 0.9467 | 0.0057 | 0.0696 | 0.3693 | 0.0 | | 0.1317 | 14.1 | 2820 | 0.6037 | 0.3756 | 0.4368 | 0.8766 | nan | 0.8728 | 0.9625 | 0.7553 | 0.8525 | 0.4601 | nan | 0.6745 | 0.8292 | 0.0124 | 0.9444 | 0.1426 | 0.0 | 0.0085 | 0.0 | 0.6589 | 0.0 | 0.0052 | 0.9372 | 0.1747 | 0.5779 | 0.3701 | 0.0 | nan | 0.1536 | 0.6059 | 0.5533 | 0.0 | 0.9529 | 0.8101 | 0.9833 | 0.0028 | 0.2206 | 0.4557 | 0.0 | nan | 0.7769 | 0.8838 | 0.7156 | 0.7342 | 0.3727 | nan | 0.5626 | 0.6729 | 0.0066 | 0.8449 | 0.1261 | 
0.0 | 0.0054 | 0.0 | 0.5665 | 0.0 | 0.0038 | 0.7449 | 0.1511 | 0.5088 | 0.3111 | 0.0 | nan | 0.1232 | 0.4809 | 0.3857 | 0.0 | 0.8812 | 0.7537 | 0.9452 | 0.0027 | 0.0850 | 0.3737 | 0.0 | | 0.1561 | 14.2 | 2840 | 0.6731 | 0.3774 | 0.4507 | 0.8659 | nan | 0.8091 | 0.9530 | 0.7726 | 0.8470 | 0.5168 | nan | 0.7039 | 0.8604 | 0.0233 | 0.9451 | 0.1814 | 0.0 | 0.4042 | 0.0002 | 0.7136 | 0.0 | 0.0019 | 0.9294 | 0.1412 | 0.5499 | 0.3167 | 0.0 | nan | 0.2017 | 0.5679 | 0.5189 | 0.0 | 0.9553 | 0.8126 | 0.9828 | 0.0065 | 0.2318 | 0.4767 | 0.0 | nan | 0.7279 | 0.8725 | 0.6738 | 0.6924 | 0.3847 | nan | 0.5622 | 0.6163 | 0.0140 | 0.8548 | 0.1535 | 0.0 | 0.2975 | 0.0002 | 0.5667 | 0.0 | 0.0014 | 0.7453 | 0.1297 | 0.4810 | 0.2752 | 0.0 | nan | 0.1615 | 0.4737 | 0.3750 | 0.0 | 0.8758 | 0.7340 | 0.9463 | 0.0055 | 0.0817 | 0.3751 | 0.0 | | 0.1034 | 14.3 | 2860 | 0.6550 | 0.3748 | 0.4388 | 0.8693 | nan | 0.8425 | 0.9480 | 0.7923 | 0.8447 | 0.5247 | nan | 0.6776 | 0.7910 | 0.0 | 0.9425 | 0.1608 | 0.0 | 0.0004 | 0.0 | 0.6871 | 0.0 | 0.0 | 0.9286 | 0.1559 | 0.4997 | 0.3733 | 0.0035 | nan | 0.2962 | 0.5880 | 0.5191 | 0.0 | 0.9588 | 0.8267 | 0.9847 | 0.0279 | 0.1601 | 0.5066 | 0.0 | nan | 0.7517 | 0.8791 | 0.6509 | 0.7343 | 0.3776 | nan | 0.5648 | 0.6827 | 0.0 | 0.8562 | 0.1422 | 0.0 | 0.0004 | 0.0 | 0.5781 | 0.0 | 0.0 | 0.7368 | 0.1417 | 0.4471 | 0.3179 | 0.0035 | nan | 0.2035 | 0.4832 | 0.3897 | 0.0 | 0.8790 | 0.7543 | 0.9447 | 0.0138 | 0.0668 | 0.3940 | 0.0 | | 0.1665 | 14.4 | 2880 | 0.6118 | 0.3831 | 0.4646 | 0.8676 | nan | 0.7911 | 0.9537 | 0.7726 | 0.8726 | 0.4892 | nan | 0.7120 | 0.8720 | 0.0038 | 0.9533 | 0.1108 | 0.0 | 0.1787 | 0.0 | 0.6998 | 0.0 | 0.0 | 0.9148 | 0.3593 | 0.6052 | 0.4110 | 0.0 | nan | 0.3748 | 0.6291 | 0.5697 | 0.0 | 0.9530 | 0.8561 | 0.9818 | 0.0107 | 0.3523 | 0.4381 | 0.0 | nan | 0.7136 | 0.8780 | 0.6737 | 0.6563 | 0.3897 | nan | 0.5721 | 0.6119 | 0.0021 | 0.8477 | 0.0971 | 0.0 | 0.1439 | 0.0 | 0.5845 | 0.0 | 0.0 | 0.7619 | 0.2726 | 0.5318 | 0.3321 | 0.0 | nan | 
0.2603 | 0.4942 | 0.3754 | 0.0 | 0.8866 | 0.7676 | 0.9465 | 0.0086 | 0.0947 | 0.3578 | 0.0 | | 0.1398 | 14.5 | 2900 | 0.6139 | 0.3737 | 0.4408 | 0.8712 | nan | 0.8352 | 0.9550 | 0.7588 | 0.8556 | 0.5251 | nan | 0.6427 | 0.8668 | 0.0036 | 0.9531 | 0.1265 | 0.0 | 0.0435 | 0.0 | 0.6477 | 0.0 | 0.0 | 0.9243 | 0.2127 | 0.6372 | 0.3441 | 0.0 | nan | 0.1601 | 0.5896 | 0.5623 | 0.0 | 0.9568 | 0.8104 | 0.9820 | 0.0034 | 0.2632 | 0.4466 | 0.0 | nan | 0.7429 | 0.8702 | 0.6934 | 0.7224 | 0.3869 | nan | 0.5542 | 0.6216 | 0.0023 | 0.8513 | 0.1085 | 0.0 | 0.0360 | 0.0 | 0.5713 | 0.0 | 0.0 | 0.7667 | 0.1898 | 0.5516 | 0.2950 | 0.0 | nan | 0.1310 | 0.4800 | 0.3634 | 0.0 | 0.8795 | 0.7426 | 0.9460 | 0.0033 | 0.0936 | 0.3543 | 0.0 | | 0.1276 | 14.6 | 2920 | 0.6433 | 0.3719 | 0.4329 | 0.8700 | nan | 0.8228 | 0.9536 | 0.7414 | 0.8581 | 0.5513 | nan | 0.6904 | 0.8408 | 0.0 | 0.9392 | 0.1324 | 0.0 | 0.0 | 0.0033 | 0.6332 | 0.0 | 0.0 | 0.9406 | 0.1394 | 0.6059 | 0.3639 | 0.0 | nan | 0.1756 | 0.5418 | 0.5289 | 0.0 | 0.9547 | 0.8334 | 0.9822 | 0.0075 | 0.2013 | 0.4118 | 0.0 | nan | 0.7384 | 0.8704 | 0.6956 | 0.7200 | 0.3889 | nan | 0.5758 | 0.6502 | 0.0 | 0.8530 | 0.1164 | 0.0 | 0.0 | 0.0033 | 0.5765 | 0.0 | 0.0 | 0.7459 | 0.1295 | 0.5186 | 0.3150 | 0.0 | nan | 0.1326 | 0.4657 | 0.3834 | 0.0 | 0.8835 | 0.7562 | 0.9467 | 0.0067 | 0.0838 | 0.3460 | 0.0 | | 0.1534 | 14.7 | 2940 | 0.6143 | 0.3755 | 0.4412 | 0.8690 | nan | 0.8303 | 0.9475 | 0.7472 | 0.8284 | 0.5513 | nan | 0.6966 | 0.8160 | 0.0036 | 0.9502 | 0.1188 | 0.0 | 0.0228 | 0.0112 | 0.6890 | 0.0 | 0.0 | 0.9327 | 0.1902 | 0.6148 | 0.3769 | 0.0 | nan | 0.2047 | 0.6157 | 0.5613 | 0.0 | 0.9489 | 0.8397 | 0.9842 | 0.0051 | 0.1582 | 0.4742 | 0.0 | nan | 0.7388 | 0.8680 | 0.7020 | 0.7018 | 0.3790 | nan | 0.5612 | 0.6487 | 0.0019 | 0.8491 | 0.1049 | 0.0 | 0.0214 | 0.0111 | 0.5808 | 0.0 | 0.0 | 0.7596 | 0.1761 | 0.5272 | 0.3231 | 0.0 | nan | 0.1657 | 0.4738 | 0.3796 | 0.0 | 0.8843 | 0.7596 | 0.9453 | 0.0042 | 0.0823 | 0.3670 | 0.0 | | 0.7322 | 
14.8 | 2960 | 0.5924 | 0.3867 | 0.4634 | 0.8740 | nan | 0.8622 | 0.9550 | 0.7538 | 0.8256 | 0.4927 | nan | 0.6743 | 0.8511 | 0.0127 | 0.9476 | 0.1325 | 0.0 | 0.2133 | 0.0395 | 0.7274 | 0.0 | 0.0 | 0.9184 | 0.3518 | 0.6067 | 0.4630 | 0.0 | nan | 0.2801 | 0.5880 | 0.5689 | 0.0 | 0.9469 | 0.8717 | 0.9805 | 0.0367 | 0.2787 | 0.4502 | 0.0 | nan | 0.7656 | 0.8766 | 0.6694 | 0.7074 | 0.3811 | nan | 0.5560 | 0.6148 | 0.0075 | 0.8546 | 0.1119 | 0.0 | 0.1506 | 0.0395 | 0.5836 | 0.0 | 0.0 | 0.7703 | 0.2928 | 0.5369 | 0.3488 | 0.0 | nan | 0.2015 | 0.4753 | 0.3556 | 0.0 | 0.8884 | 0.7731 | 0.9472 | 0.0223 | 0.0949 | 0.3494 | 0.0 | | 0.1363 | 14.9 | 2980 | 0.5998 | 0.3843 | 0.4588 | 0.8758 | nan | 0.8613 | 0.9508 | 0.7900 | 0.8233 | 0.5551 | nan | 0.6846 | 0.7923 | 0.1220 | 0.9444 | 0.1305 | 0.0 | 0.2587 | 0.0345 | 0.7305 | 0.0 | 0.0 | 0.9263 | 0.2417 | 0.6455 | 0.3663 | 0.0 | nan | 0.1970 | 0.5860 | 0.5722 | 0.0 | 0.9550 | 0.8504 | 0.9841 | 0.0068 | 0.1978 | 0.4733 | 0.0 | nan | 0.7683 | 0.8822 | 0.6306 | 0.7341 | 0.3911 | nan | 0.5612 | 0.6044 | 0.0279 | 0.8514 | 0.1080 | 0.0 | 0.1795 | 0.0345 | 0.5812 | 0.0 | 0.0 | 0.7676 | 0.2177 | 0.5378 | 0.3020 | 0.0 | nan | 0.1535 | 0.4746 | 0.3613 | 0.0 | 0.8867 | 0.7781 | 0.9441 | 0.0047 | 0.1336 | 0.3809 | 0.0 | | 0.1537 | 15.0 | 3000 | 0.6302 | 0.3781 | 0.4487 | 0.8742 | nan | 0.8608 | 0.9579 | 0.7474 | 0.8513 | 0.5297 | nan | 0.6613 | 0.8043 | 0.0522 | 0.9385 | 0.1654 | 0.0 | 0.2167 | 0.0953 | 0.7527 | 0.0 | 0.0 | 0.9337 | 0.1895 | 0.5803 | 0.3136 | 0.0 | nan | 0.1148 | 0.6138 | 0.5339 | 0.0 | 0.9526 | 0.8402 | 0.9592 | 0.0053 | 0.2302 | 0.4568 | 0.0 | nan | 0.7672 | 0.8787 | 0.6736 | 0.7405 | 0.3974 | nan | 0.5523 | 0.5813 | 0.0141 | 0.8544 | 0.1415 | 0.0 | 0.1406 | 0.0953 | 0.5765 | 0.0 | 0.0 | 0.7532 | 0.1724 | 0.5052 | 0.2751 | 0.0 | nan | 0.0875 | 0.4545 | 0.3693 | 0.0 | 0.8846 | 0.7687 | 0.9310 | 0.0042 | 0.1148 | 0.3662 | 0.0 | | 0.1485 | 15.1 | 3020 | 0.6229 | 0.3789 | 0.4409 | 0.8729 | nan | 0.8526 | 0.9580 | 0.7556 | 
0.8393 | 0.4950 | nan | 0.6441 | 0.7653 | 0.0076 | 0.9464 | 0.0919 | 0.0 | 0.0298 | 0.0986 | 0.7278 | 0.0 | 0.0 | 0.9373 | 0.2535 | 0.5704 | 0.3935 | 0.0 | nan | 0.1312 | 0.5455 | 0.5439 | 0.0 | 0.9529 | 0.8331 | 0.9853 | 0.0344 | 0.2334 | 0.4811 | 0.0 | nan | 0.7615 | 0.8757 | 0.6904 | 0.7305 | 0.3855 | nan | 0.5535 | 0.6358 | 0.0036 | 0.8538 | 0.0840 | 0.0 | 0.0231 | 0.0986 | 0.5857 | 0.0 | 0.0 | 0.7518 | 0.2238 | 0.5053 | 0.3084 | 0.0 | nan | 0.1041 | 0.4677 | 0.3890 | 0.0 | 0.8814 | 0.7567 | 0.9427 | 0.0189 | 0.1170 | 0.3772 | 0.0 | | 0.1718 | 15.2 | 3040 | 0.6701 | 0.3798 | 0.4523 | 0.8674 | nan | 0.7972 | 0.9520 | 0.8221 | 0.8217 | 0.5098 | nan | 0.7271 | 0.8380 | 0.0024 | 0.9484 | 0.0892 | 0.0 | 0.2547 | 0.0806 | 0.7163 | 0.0 | 0.0 | 0.9314 | 0.2443 | 0.5898 | 0.4167 | 0.0 | nan | 0.1191 | 0.6087 | 0.5311 | 0.0 | 0.9565 | 0.8495 | 0.9722 | 0.0061 | 0.2044 | 0.4828 | 0.0 | nan | 0.7285 | 0.8674 | 0.6197 | 0.6923 | 0.3759 | nan | 0.5581 | 0.6158 | 0.0015 | 0.8562 | 0.0833 | 0.0 | 0.1950 | 0.0806 | 0.5738 | 0.0 | 0.0 | 0.7618 | 0.2240 | 0.5185 | 0.3276 | 0.0 | nan | 0.1040 | 0.4752 | 0.3956 | 0.0 | 0.8844 | 0.7688 | 0.9478 | 0.0047 | 0.1121 | 0.3815 | 0.0 | | 0.2132 | 15.3 | 3060 | 0.6335 | 0.3901 | 0.4622 | 0.8752 | nan | 0.8468 | 0.9570 | 0.7840 | 0.8462 | 0.4725 | nan | 0.6814 | 0.8495 | 0.0 | 0.9521 | 0.0949 | 0.0 | 0.4795 | 0.0581 | 0.7033 | 0.0 | 0.0 | 0.9150 | 0.2418 | 0.6258 | 0.4263 | 0.0077 | nan | 0.1338 | 0.5820 | 0.5426 | 0.0 | 0.9475 | 0.8725 | 0.9865 | 0.0032 | 0.2268 | 0.5548 | 0.0 | nan | 0.7638 | 0.8767 | 0.6849 | 0.7075 | 0.3723 | nan | 0.5592 | 0.6402 | 0.0 | 0.8557 | 0.0874 | 0.0 | 0.3410 | 0.0580 | 0.5791 | 0.0 | 0.0 | 0.7701 | 0.2224 | 0.5375 | 0.3389 | 0.0077 | nan | 0.1213 | 0.4724 | 0.3778 | 0.0 | 0.8877 | 0.7703 | 0.9448 | 0.0028 | 0.1137 | 0.3887 | 0.0 | | 0.1008 | 15.4 | 3080 | 0.6443 | 0.3736 | 0.4415 | 0.8748 | nan | 0.8665 | 0.9538 | 0.7823 | 0.8147 | 0.5270 | nan | 0.6705 | 0.8323 | 0.0113 | 0.9477 | 0.1022 | 0.0 | 0.0254 | 
0.0193 | 0.7429 | 0.0 | 0.0 | 0.9372 | 0.1383 | 0.5936 | 0.3546 | 0.0074 | nan | 0.1973 | 0.5904 | 0.5592 | 0.0 | 0.9524 | 0.8656 | 0.9749 | 0.0057 | 0.2072 | 0.4480 | 0.0 | nan | 0.7751 | 0.8771 | 0.7096 | 0.7160 | 0.3638 | nan | 0.5610 | 0.6100 | 0.0053 | 0.8599 | 0.0934 | 0.0 | 0.0221 | 0.0193 | 0.5593 | 0.0 | 0.0 | 0.7542 | 0.1324 | 0.5138 | 0.3004 | 0.0074 | nan | 0.1418 | 0.4798 | 0.3722 | 0.0 | 0.8868 | 0.7715 | 0.9507 | 0.0051 | 0.0993 | 0.3687 | 0.0 | | 0.0901 | 15.5 | 3100 | 0.6716 | 0.3743 | 0.4341 | 0.8706 | nan | 0.8558 | 0.9587 | 0.6992 | 0.7659 | 0.5143 | nan | 0.6714 | 0.8210 | 0.0 | 0.9460 | 0.1066 | 0.0 | 0.0 | 0.0136 | 0.6662 | 0.0 | 0.0001 | 0.9354 | 0.2046 | 0.5539 | 0.3793 | 0.0 | nan | 0.2436 | 0.5818 | 0.5368 | 0.0 | 0.9485 | 0.8743 | 0.9826 | 0.0010 | 0.1007 | 0.5287 | 0.0 | nan | 0.7455 | 0.8663 | 0.6775 | 0.6980 | 0.3842 | nan | 0.5610 | 0.6267 | 0.0 | 0.8585 | 0.0963 | 0.0 | 0.0 | 0.0136 | 0.5838 | 0.0 | 0.0001 | 0.7477 | 0.1866 | 0.4929 | 0.3108 | 0.0 | nan | 0.1832 | 0.4803 | 0.3828 | 0.0 | 0.8878 | 0.7795 | 0.9468 | 0.0009 | 0.0737 | 0.3941 | 0.0 | | 0.1319 | 15.6 | 3120 | 0.6274 | 0.3881 | 0.4643 | 0.8720 | nan | 0.8344 | 0.9556 | 0.7932 | 0.8385 | 0.4978 | nan | 0.7199 | 0.8512 | 0.0053 | 0.9580 | 0.0994 | 0.0 | 0.3726 | 0.0014 | 0.6745 | 0.0 | 0.0032 | 0.9289 | 0.2962 | 0.6204 | 0.3558 | 0.0025 | nan | 0.4086 | 0.5790 | 0.5311 | 0.0 | 0.9423 | 0.8488 | 0.9777 | 0.0062 | 0.2870 | 0.4688 | 0.0 | nan | 0.7538 | 0.8741 | 0.6520 | 0.7409 | 0.3659 | nan | 0.5670 | 0.6344 | 0.0029 | 0.8528 | 0.0888 | 0.0 | 0.2982 | 0.0014 | 0.5932 | 0.0 | 0.0024 | 0.7569 | 0.2374 | 0.5227 | 0.3058 | 0.0025 | nan | 0.2161 | 0.4790 | 0.3802 | 0.0 | 0.8840 | 0.7661 | 0.9486 | 0.0043 | 0.1112 | 0.3771 | 0.0 | | 0.1196 | 15.7 | 3140 | 0.6027 | 0.3772 | 0.4491 | 0.8713 | nan | 0.8343 | 0.9517 | 0.7782 | 0.8481 | 0.4852 | nan | 0.6646 | 0.8781 | 0.0082 | 0.9449 | 0.1244 | 0.0 | 0.1531 | 0.0005 | 0.6859 | 0.0 | 0.0001 | 0.9241 | 0.1114 | 0.6287 | 0.4011 | 0.0004 
| nan | 0.2774 | 0.5891 | 0.5923 | 0.0 | 0.9515 | 0.8736 | 0.9861 | 0.0088 | 0.1966 | 0.4716 | 0.0 | nan | 0.7500 | 0.8720 | 0.6315 | 0.7355 | 0.3701 | nan | 0.5496 | 0.6068 | 0.0071 | 0.8624 | 0.1100 | 0.0 | 0.1400 | 0.0005 | 0.6012 | 0.0 | 0.0001 | 0.7640 | 0.1051 | 0.5415 | 0.3113 | 0.0004 | nan | 0.1998 | 0.4782 | 0.3660 | 0.0 | 0.8829 | 0.7651 | 0.9441 | 0.0053 | 0.0993 | 0.3703 | 0.0 | | 0.2085 | 15.8 | 3160 | 0.6267 | 0.3826 | 0.4602 | 0.8690 | nan | 0.8379 | 0.9526 | 0.7944 | 0.8073 | 0.5268 | nan | 0.6582 | 0.8739 | 0.0182 | 0.9506 | 0.1455 | 0.0 | 0.4964 | 0.0267 | 0.7657 | 0.0 | 0.0 | 0.9212 | 0.0797 | 0.6069 | 0.3732 | 0.0039 | nan | 0.2219 | 0.5556 | 0.5704 | 0.0 | 0.9611 | 0.8149 | 0.9818 | 0.0101 | 0.3218 | 0.4499 | 0.0 | nan | 0.7522 | 0.8662 | 0.6466 | 0.7271 | 0.3761 | nan | 0.5491 | 0.5897 | 0.0104 | 0.8602 | 0.1250 | 0.0 | 0.3505 | 0.0267 | 0.6110 | 0.0 | 0.0 | 0.7596 | 0.0743 | 0.5171 | 0.3070 | 0.0039 | nan | 0.1908 | 0.4781 | 0.3434 | 0.0 | 0.8789 | 0.7513 | 0.9492 | 0.0062 | 0.1302 | 0.3627 | 0.0 | | 0.1353 | 15.9 | 3180 | 0.6430 | 0.3835 | 0.4563 | 0.8631 | nan | 0.7710 | 0.9483 | 0.7555 | 0.8692 | 0.4833 | nan | 0.7177 | 0.8501 | 0.0091 | 0.9469 | 0.1603 | 0.0 | 0.3588 | 0.0500 | 0.7433 | 0.0 | 0.0 | 0.9276 | 0.0763 | 0.5958 | 0.3994 | 0.0 | nan | 0.2782 | 0.5942 | 0.5438 | 0.0 | 0.9520 | 0.8643 | 0.9798 | 0.0108 | 0.2056 | 0.5113 | 0.0 | nan | 0.6972 | 0.8681 | 0.6930 | 0.6404 | 0.3562 | nan | 0.5509 | 0.6160 | 0.0048 | 0.8632 | 0.1420 | 0.0 | 0.2866 | 0.0500 | 0.6075 | 0.0 | 0.0 | 0.7604 | 0.0732 | 0.5174 | 0.3142 | 0.0 | nan | 0.2496 | 0.4831 | 0.3790 | 0.0 | 0.8853 | 0.7660 | 0.9496 | 0.0087 | 0.1189 | 0.3913 | 0.0 | | 0.1617 | 16.0 | 3200 | 0.6151 | 0.3882 | 0.4699 | 0.8694 | nan | 0.8348 | 0.9375 | 0.8239 | 0.8528 | 0.6061 | nan | 0.6822 | 0.8461 | 0.0035 | 0.9569 | 0.1134 | 0.0 | 0.2992 | 0.0593 | 0.7013 | 0.0 | 0.0 | 0.9299 | 0.2929 | 0.6016 | 0.3728 | 0.0 | nan | 0.3863 | 0.5991 | 0.5887 | 0.0 | 0.9438 | 0.8419 | 0.9836 | 0.0086 | 
0.3252 | 0.4467 | 0.0 | nan | 0.7545 | 0.8715 | 0.7219 | 0.7414 | 0.3574 | nan | 0.5604 | 0.5901 | 0.0020 | 0.8559 | 0.1012 | 0.0 | 0.2251 | 0.0593 | 0.5656 | 0.0 | 0.0 | 0.7552 | 0.2467 | 0.5171 | 0.3168 | 0.0 | nan | 0.2600 | 0.4849 | 0.3788 | 0.0 | 0.8828 | 0.7523 | 0.9466 | 0.0066 | 0.1108 | 0.3571 | 0.0 | | 0.1087 | 16.1 | 3220 | 0.6156 | 0.3946 | 0.4679 | 0.8739 | nan | 0.8424 | 0.9612 | 0.7933 | 0.8578 | 0.4260 | nan | 0.7008 | 0.8356 | 0.0025 | 0.9456 | 0.1843 | 0.0 | 0.4258 | 0.1187 | 0.7147 | 0.0 | 0.0011 | 0.9300 | 0.2992 | 0.5966 | 0.3569 | 0.0 | nan | 0.4344 | 0.5571 | 0.5288 | 0.0 | 0.9551 | 0.8390 | 0.9786 | 0.0000 | 0.2392 | 0.4465 | 0.0 | nan | 0.7647 | 0.8734 | 0.6996 | 0.7312 | 0.3438 | nan | 0.5670 | 0.6188 | 0.0015 | 0.8613 | 0.1567 | 0.0 | 0.3108 | 0.1187 | 0.6023 | 0.0 | 0.0008 | 0.7553 | 0.2500 | 0.5166 | 0.2998 | 0.0 | nan | 0.2646 | 0.4777 | 0.3785 | 0.0 | 0.8838 | 0.7618 | 0.9511 | 0.0000 | 0.0869 | 0.3505 | 0.0 | | 0.1534 | 16.2 | 3240 | 0.6361 | 0.3907 | 0.4569 | 0.8695 | nan | 0.8248 | 0.9618 | 0.7594 | 0.8464 | 0.4330 | nan | 0.6412 | 0.7748 | 0.0027 | 0.9560 | 0.1846 | 0.0 | 0.2197 | 0.0492 | 0.6962 | 0.0 | 0.0 | 0.9159 | 0.2669 | 0.5864 | 0.3982 | 0.0130 | nan | 0.4334 | 0.5939 | 0.5933 | 0.0 | 0.9521 | 0.8627 | 0.9699 | 0.0202 | 0.1637 | 0.5010 | 0.0 | nan | 0.7452 | 0.8576 | 0.7096 | 0.7390 | 0.3564 | nan | 0.5582 | 0.6255 | 0.0012 | 0.8545 | 0.1652 | 0.0 | 0.2012 | 0.0492 | 0.5941 | 0.0 | 0.0 | 0.7608 | 0.2184 | 0.5128 | 0.3134 | 0.0129 | nan | 0.3033 | 0.4586 | 0.3854 | 0.0 | 0.8842 | 0.7705 | 0.9393 | 0.0105 | 0.0960 | 0.3782 | 0.0 | | 0.1424 | 16.3 | 3260 | 0.6442 | 0.3731 | 0.4348 | 0.8686 | nan | 0.8203 | 0.9599 | 0.7388 | 0.8626 | 0.4214 | nan | 0.6717 | 0.8149 | 0.0025 | 0.9475 | 0.1189 | 0.0 | 0.0003 | 0.0073 | 0.6840 | 0.0 | 0.0 | 0.9272 | 0.1288 | 0.6072 | 0.3303 | 0.0014 | nan | 0.3258 | 0.5757 | 0.5233 | 0.0 | 0.9592 | 0.8548 | 0.9727 | 0.0326 | 0.2164 | 0.4083 | 0.0 | nan | 0.7306 | 0.8685 | 0.7158 | 0.7007 | 0.3514 
| nan | 0.5668 | 0.6272 | 0.0014 | 0.8631 | 0.1051 | 0.0 | 0.0003 | 0.0073 | 0.5916 | 0.0 | 0.0 | 0.7522 | 0.1159 | 0.5177 | 0.2870 | 0.0014 | nan | 0.2639 | 0.4648 | 0.3685 | 0.0 | 0.8824 | 0.7719 | 0.9418 | 0.0204 | 0.0798 | 0.3408 | 0.0 | | 0.0966 | 16.4 | 3280 | 0.6606 | 0.3902 | 0.4710 | 0.8664 | nan | 0.8024 | 0.9532 | 0.8037 | 0.8563 | 0.4764 | nan | 0.6834 | 0.8516 | 0.0098 | 0.9516 | 0.1549 | 0.0 | 0.7565 | 0.1112 | 0.6759 | 0.0 | 0.0 | 0.9240 | 0.1866 | 0.5807 | 0.3483 | 0.0 | nan | 0.2897 | 0.5930 | 0.5241 | 0.0 | 0.9495 | 0.8388 | 0.9864 | 0.0284 | 0.2430 | 0.4930 | 0.0 | nan | 0.7277 | 0.8624 | 0.6978 | 0.7045 | 0.3633 | nan | 0.5624 | 0.6023 | 0.0055 | 0.8623 | 0.1326 | 0.0 | 0.3993 | 0.1112 | 0.6022 | 0.0 | 0.0 | 0.7582 | 0.1648 | 0.5129 | 0.3044 | 0.0 | nan | 0.2335 | 0.4796 | 0.3474 | 0.0 | 0.8846 | 0.7624 | 0.9436 | 0.0170 | 0.0817 | 0.3635 | 0.0 | | 0.1438 | 16.5 | 3300 | 0.6547 | 0.3949 | 0.4663 | 0.8686 | nan | 0.8167 | 0.9584 | 0.7571 | 0.8461 | 0.4767 | nan | 0.6833 | 0.7965 | 0.0007 | 0.9491 | 0.1242 | 0.0 | 0.6012 | 0.1260 | 0.7100 | 0.0 | 0.0 | 0.9347 | 0.2388 | 0.5549 | 0.4005 | 0.0 | nan | 0.2431 | 0.5893 | 0.5809 | 0.0 | 0.9484 | 0.8448 | 0.9847 | 0.0081 | 0.2865 | 0.4624 | 0.0 | nan | 0.7351 | 0.8619 | 0.7252 | 0.7187 | 0.3620 | nan | 0.5654 | 0.6422 | 0.0003 | 0.8603 | 0.1085 | 0.0 | 0.4644 | 0.1260 | 0.6126 | 0.0 | 0.0 | 0.7527 | 0.1927 | 0.4990 | 0.3261 | 0.0 | nan | 0.1837 | 0.4828 | 0.3534 | 0.0 | 0.8866 | 0.7707 | 0.9463 | 0.0061 | 0.0907 | 0.3638 | 0.0 | | 0.1047 | 16.6 | 3320 | 0.6620 | 0.3902 | 0.4587 | 0.8690 | nan | 0.8165 | 0.9626 | 0.7851 | 0.8390 | 0.4763 | nan | 0.6498 | 0.7912 | 0.0249 | 0.9441 | 0.1145 | 0.0 | 0.4841 | 0.1315 | 0.7786 | 0.0 | 0.0 | 0.9267 | 0.1411 | 0.6071 | 0.3302 | 0.0021 | nan | 0.2145 | 0.5729 | 0.5669 | 0.0 | 0.9516 | 0.8311 | 0.9806 | 0.0048 | 0.2571 | 0.4927 | 0.0 | nan | 0.7392 | 0.8618 | 0.7085 | 0.7183 | 0.3847 | nan | 0.5524 | 0.6180 | 0.0084 | 0.8613 | 0.1008 | 0.0 | 0.4199 | 0.1315 | 
0.6120 | 0.0 | 0.0 | 0.7561 | 0.1279 | 0.5092 | 0.2864 | 0.0021 | nan | 0.1764 | 0.4765 | 0.3684 | 0.0 | 0.8822 | 0.7540 | 0.9490 | 0.0033 | 0.0984 | 0.3790 | 0.0 | | 0.1603 | 16.7 | 3340 | 0.6201 | 0.3909 | 0.4604 | 0.8724 | nan | 0.8357 | 0.9505 | 0.7955 | 0.8510 | 0.5418 | nan | 0.6921 | 0.7958 | 0.0189 | 0.9505 | 0.1030 | 0.0 | 0.4512 | 0.1370 | 0.7554 | 0.0 | 0.0 | 0.9323 | 0.1324 | 0.5851 | 0.2937 | 0.0 | nan | 0.2254 | 0.5794 | 0.5298 | 0.0 | 0.9565 | 0.8437 | 0.9786 | 0.0100 | 0.2826 | 0.5059 | 0.0 | nan | 0.7510 | 0.8757 | 0.7049 | 0.7267 | 0.4078 | nan | 0.5574 | 0.6164 | 0.0068 | 0.8607 | 0.0917 | 0.0 | 0.4070 | 0.1369 | 0.5831 | 0.0 | 0.0 | 0.7535 | 0.1208 | 0.5183 | 0.2661 | 0.0 | nan | 0.1880 | 0.4874 | 0.3682 | 0.0 | 0.8805 | 0.7568 | 0.9503 | 0.0065 | 0.1034 | 0.3822 | 0.0 | | 0.1489 | 16.8 | 3360 | 0.6502 | 0.3922 | 0.4645 | 0.8689 | nan | 0.8335 | 0.9446 | 0.7831 | 0.8308 | 0.5645 | nan | 0.6905 | 0.7748 | 0.0126 | 0.9532 | 0.1575 | 0.0 | 0.5452 | 0.0976 | 0.7534 | 0.0 | 0.0 | 0.9381 | 0.1427 | 0.5909 | 0.4154 | 0.0 | nan | 0.2283 | 0.5970 | 0.5611 | 0.0 | 0.9464 | 0.8215 | 0.9784 | 0.0080 | 0.1876 | 0.5064 | 0.0 | nan | 0.7445 | 0.8666 | 0.7103 | 0.7164 | 0.3841 | nan | 0.5533 | 0.6011 | 0.0027 | 0.8621 | 0.1280 | 0.0 | 0.4344 | 0.0975 | 0.6057 | 0.0 | 0.0 | 0.7533 | 0.1269 | 0.5119 | 0.3197 | 0.0 | nan | 0.1745 | 0.4893 | 0.3693 | 0.0 | 0.8832 | 0.7496 | 0.9511 | 0.0055 | 0.1152 | 0.3944 | 0.0 | | 0.1685 | 16.9 | 3380 | 0.6708 | 0.3851 | 0.4575 | 0.8690 | nan | 0.8334 | 0.9582 | 0.7801 | 0.8063 | 0.5133 | nan | 0.6511 | 0.8619 | 0.0251 | 0.9376 | 0.1521 | 0.0 | 0.6353 | 0.0269 | 0.7188 | 0.0 | 0.0 | 0.9257 | 0.1267 | 0.5672 | 0.3617 | 0.0372 | nan | 0.2287 | 0.5864 | 0.5349 | 0.0 | 0.9582 | 0.8384 | 0.9849 | 0.0080 | 0.1262 | 0.4551 | 0.0 | nan | 0.7410 | 0.8653 | 0.6794 | 0.7120 | 0.3925 | nan | 0.5553 | 0.5044 | 0.0111 | 0.8650 | 0.1305 | 0.0 | 0.4608 | 0.0269 | 0.6250 | 0.0 | 0.0 | 0.7559 | 0.1139 | 0.5049 | 0.3013 | 0.0369 | nan | 0.1731 | 
0.4805 | 0.3695 | 0.0 | 0.8804 | 0.7613 | 0.9454 | 0.0058 | 0.0652 | 0.3586 | 0.0 | | 0.1464 | 17.0 | 3400 | 0.6396 | 0.3889 | 0.4638 | 0.8711 | nan | 0.8434 | 0.9486 | 0.8471 | 0.8117 | 0.5337 | nan | 0.6774 | 0.8109 | 0.0009 | 0.9510 | 0.1685 | 0.0 | 0.6241 | 0.0945 | 0.7062 | 0.0 | 0.0 | 0.9337 | 0.0732 | 0.5948 | 0.3886 | 0.0499 | nan | 0.2157 | 0.5693 | 0.5502 | 0.0 | 0.9559 | 0.8320 | 0.9821 | 0.0024 | 0.2056 | 0.4701 | 0.0 | nan | 0.7574 | 0.8809 | 0.6622 | 0.7190 | 0.3777 | nan | 0.5570 | 0.6090 | 0.0005 | 0.8596 | 0.1422 | 0.0 | 0.4428 | 0.0944 | 0.5854 | 0.0 | 0.0 | 0.7552 | 0.0679 | 0.5222 | 0.3214 | 0.0494 | nan | 0.1783 | 0.4779 | 0.3635 | 0.0 | 0.8814 | 0.7449 | 0.9474 | 0.0013 | 0.0831 | 0.3613 | 0.0 | | 0.1365 | 17.1 | 3420 | 0.6572 | 0.3892 | 0.4633 | 0.8705 | nan | 0.8382 | 0.9572 | 0.8112 | 0.7848 | 0.5338 | nan | 0.6472 | 0.8088 | 0.0071 | 0.9497 | 0.1396 | 0.0 | 0.5627 | 0.0748 | 0.7273 | 0.0 | 0.0 | 0.9303 | 0.1357 | 0.6070 | 0.4103 | 0.0277 | nan | 0.2033 | 0.5932 | 0.5764 | 0.0 | 0.9469 | 0.8637 | 0.9832 | 0.0022 | 0.2535 | 0.4493 | 0.0 | nan | 0.7516 | 0.8673 | 0.6850 | 0.7140 | 0.3789 | nan | 0.5561 | 0.6163 | 0.0036 | 0.8644 | 0.1202 | 0.0 | 0.4187 | 0.0748 | 0.5798 | 0.0 | 0.0 | 0.7616 | 0.1225 | 0.5294 | 0.3379 | 0.0275 | nan | 0.1374 | 0.4880 | 0.3775 | 0.0 | 0.8883 | 0.7683 | 0.9473 | 0.0013 | 0.0840 | 0.3516 | 0.0 | | 0.1378 | 17.2 | 3440 | 0.6191 | 0.3847 | 0.4545 | 0.8740 | nan | 0.8495 | 0.9487 | 0.7936 | 0.8689 | 0.4940 | nan | 0.6837 | 0.8027 | 0.0011 | 0.9480 | 0.1273 | 0.0 | 0.3769 | 0.0300 | 0.7042 | 0.0 | 0.0 | 0.9262 | 0.1578 | 0.6273 | 0.3385 | 0.0095 | nan | 0.2106 | 0.5912 | 0.5896 | 0.0 | 0.9543 | 0.8506 | 0.9832 | 0.0015 | 0.1770 | 0.4974 | 0.0 | nan | 0.7596 | 0.8815 | 0.6459 | 0.7035 | 0.3778 | nan | 0.5592 | 0.6375 | 0.0007 | 0.8580 | 0.1145 | 0.0 | 0.3289 | 0.0300 | 0.5752 | 0.0 | 0.0 | 0.7606 | 0.1426 | 0.5268 | 0.2930 | 0.0094 | nan | 0.1559 | 0.4862 | 0.3845 | 0.0 | 0.8857 | 0.7672 | 0.9472 | 0.0014 | 0.0916 | 
0.3863 | 0.0 | | 0.1417 | 17.3 | 3460 | 0.6328 | 0.3898 | 0.4660 | 0.8735 | nan | 0.8465 | 0.9501 | 0.7913 | 0.8431 | 0.5206 | nan | 0.6615 | 0.8699 | 0.0089 | 0.9539 | 0.2104 | 0.0 | 0.4512 | 0.0147 | 0.6838 | 0.0 | 0.0 | 0.9255 | 0.2034 | 0.6098 | 0.3673 | 0.0256 | nan | 0.2154 | 0.5991 | 0.6197 | 0.0 | 0.9511 | 0.8755 | 0.9824 | 0.0223 | 0.2381 | 0.4713 | 0.0 | nan | 0.7532 | 0.8780 | 0.6962 | 0.7382 | 0.3778 | nan | 0.5484 | 0.6016 | 0.0075 | 0.8590 | 0.1806 | 0.0 | 0.3458 | 0.0147 | 0.5818 | 0.0 | 0.0 | 0.7599 | 0.1754 | 0.5210 | 0.3029 | 0.0253 | nan | 0.1574 | 0.4859 | 0.3672 | 0.0 | 0.8871 | 0.7619 | 0.9488 | 0.0140 | 0.1032 | 0.3791 | 0.0 | | 0.1277 | 17.4 | 3480 | 0.6331 | 0.3872 | 0.4582 | 0.8722 | nan | 0.8338 | 0.9512 | 0.7777 | 0.8407 | 0.5404 | nan | 0.7127 | 0.8462 | 0.0131 | 0.9527 | 0.1605 | 0.0 | 0.3180 | 0.0268 | 0.6960 | 0.0 | 0.0 | 0.9278 | 0.2041 | 0.6040 | 0.3440 | 0.0067 | nan | 0.2529 | 0.5973 | 0.5703 | 0.0 | 0.9532 | 0.8474 | 0.9828 | 0.0080 | 0.2203 | 0.4740 | 0.0 | nan | 0.7460 | 0.8775 | 0.7051 | 0.7192 | 0.3927 | nan | 0.5591 | 0.6078 | 0.0072 | 0.8635 | 0.1412 | 0.0 | 0.2497 | 0.0268 | 0.5889 | 0.0 | 0.0 | 0.7585 | 0.1791 | 0.5135 | 0.2937 | 0.0066 | nan | 0.2063 | 0.4870 | 0.3853 | 0.0 | 0.8854 | 0.7689 | 0.9492 | 0.0046 | 0.0924 | 0.3759 | 0.0 | | 0.162 | 17.5 | 3500 | 0.6330 | 0.3854 | 0.4526 | 0.8730 | nan | 0.8373 | 0.9588 | 0.7677 | 0.8370 | 0.4894 | nan | 0.6701 | 0.8162 | 0.0102 | 0.9564 | 0.1115 | 0.0 | 0.3171 | 0.0528 | 0.6832 | 0.0 | 0.0 | 0.9220 | 0.2651 | 0.6014 | 0.3459 | 0.0011 | nan | 0.1705 | 0.5879 | 0.5539 | 0.0 | 0.9609 | 0.8585 | 0.9792 | 0.0009 | 0.2789 | 0.4478 | 0.0 | nan | 0.7443 | 0.8716 | 0.7125 | 0.7195 | 0.3858 | nan | 0.5647 | 0.6139 | 0.0049 | 0.8575 | 0.0998 | 0.0 | 0.2619 | 0.0528 | 0.5751 | 0.0 | 0.0 | 0.7591 | 0.2179 | 0.5136 | 0.2962 | 0.0010 | nan | 0.1336 | 0.4891 | 0.3825 | 0.0 | 0.8828 | 0.7760 | 0.9499 | 0.0009 | 0.1019 | 0.3647 | 0.0 | | 0.0967 | 17.6 | 3520 | 0.6176 | 0.3858 | 0.4542 | 
0.8752 | nan | 0.8580 | 0.9520 | 0.7998 | 0.8507 | 0.5326 | nan | 0.6841 | 0.8336 | 0.0020 | 0.9521 | 0.1145 | 0.0 | 0.2032 | 0.0751 | 0.6629 | 0.0 | 0.0 | 0.9342 | 0.2073 | 0.5489 | 0.3947 | 0.0218 | nan | 0.2637 | 0.6003 | 0.5635 | 0.0 | 0.9521 | 0.8432 | 0.9831 | 0.0086 | 0.2128 | 0.4805 | 0.0 | nan | 0.7633 | 0.8815 | 0.6620 | 0.7389 | 0.3922 | nan | 0.5692 | 0.6264 | 0.0013 | 0.8571 | 0.1023 | 0.0 | 0.1693 | 0.0751 | 0.5715 | 0.0 | 0.0 | 0.7558 | 0.1879 | 0.4861 | 0.3087 | 0.0216 | nan | 0.2168 | 0.4879 | 0.3836 | 0.0 | 0.8844 | 0.7608 | 0.9469 | 0.0077 | 0.1060 | 0.3815 | 0.0 | | 0.1003 | 17.7 | 3540 | 0.6420 | 0.3874 | 0.4692 | 0.8713 | nan | 0.8484 | 0.9538 | 0.7812 | 0.8201 | 0.5088 | nan | 0.6856 | 0.8652 | 0.0007 | 0.9537 | 0.1706 | 0.0 | 0.6290 | 0.0624 | 0.7385 | 0.0 | 0.0 | 0.9386 | 0.2066 | 0.5970 | 0.3658 | 0.0267 | nan | 0.2712 | 0.5962 | 0.5393 | 0.0 | 0.9480 | 0.8226 | 0.9837 | 0.0082 | 0.2721 | 0.4197 | 0.0 | nan | 0.7533 | 0.8758 | 0.6771 | 0.6990 | 0.3944 | nan | 0.5631 | 0.5618 | 0.0005 | 0.8596 | 0.1445 | 0.0 | 0.3542 | 0.0624 | 0.5718 | 0.0 | 0.0 | 0.7510 | 0.1778 | 0.4933 | 0.3078 | 0.0262 | nan | 0.2112 | 0.4891 | 0.3718 | 0.0 | 0.8834 | 0.7588 | 0.9469 | 0.0075 | 0.1096 | 0.3450 | 0.0 | | 0.1321 | 17.8 | 3560 | 0.6472 | 0.3854 | 0.4690 | 0.8700 | nan | 0.8190 | 0.9575 | 0.7796 | 0.8176 | 0.5442 | nan | 0.7002 | 0.8532 | 0.0009 | 0.9549 | 0.2109 | 0.0357 | 0.8843 | 0.1077 | 0.7413 | 0.0 | 0.0 | 0.9250 | 0.1177 | 0.6343 | 0.3388 | 0.0348 | nan | 0.1354 | 0.5508 | 0.4907 | 0.0 | 0.9538 | 0.8184 | 0.9834 | 0.0006 | 0.1281 | 0.4906 | 0.0 | nan | 0.7362 | 0.8674 | 0.6764 | 0.7011 | 0.3939 | nan | 0.5691 | 0.5895 | 0.0007 | 0.8628 | 0.1768 | 0.0357 | 0.3133 | 0.1077 | 0.5636 | 0.0 | 0.0 | 0.7612 | 0.1111 | 0.5214 | 0.2874 | 0.0343 | nan | 0.1176 | 0.4788 | 0.3656 | 0.0 | 0.8833 | 0.7533 | 0.9473 | 0.0006 | 0.0951 | 0.3820 | 0.0 | | 0.1012 | 17.9 | 3580 | 0.6327 | 0.3945 | 0.4744 | 0.8750 | nan | 0.8444 | 0.9511 | 0.7980 | 0.8497 | 0.5546 | nan 
| 0.6874 | 0.8387 | 0.0047 | 0.9528 | 0.1516 | 0.0241 | 0.6486 | 0.1604 | 0.7541 | 0.0 | 0.0 | 0.9332 | 0.1317 | 0.6395 | 0.3957 | 0.0456 | nan | 0.1993 | 0.5825 | 0.5292 | 0.0 | 0.9510 | 0.8507 | 0.9821 | 0.0028 | 0.2793 | 0.4377 | 0.0 | nan | 0.7602 | 0.8801 | 0.6560 | 0.7283 | 0.3917 | nan | 0.5599 | 0.6319 | 0.0034 | 0.8636 | 0.1319 | 0.0241 | 0.3515 | 0.1603 | 0.5733 | 0.0 | 0.0 | 0.7636 | 0.1224 | 0.5281 | 0.3166 | 0.0446 | nan | 0.1600 | 0.4877 | 0.3838 | 0.0 | 0.8873 | 0.7671 | 0.9494 | 0.0024 | 0.1333 | 0.3614 | 0.0 | | 0.1115 | 18.0 | 3600 | 0.6471 | 0.3958 | 0.4775 | 0.8741 | nan | 0.8373 | 0.9557 | 0.8146 | 0.8568 | 0.5123 | nan | 0.6811 | 0.8384 | 0.0147 | 0.9499 | 0.1619 | 0.0286 | 0.6451 | 0.2463 | 0.7429 | 0.0 | 0.0 | 0.9359 | 0.1783 | 0.5939 | 0.3325 | 0.0481 | nan | 0.2130 | 0.5999 | 0.5620 | 0.0 | 0.9495 | 0.8572 | 0.9796 | 0.0047 | 0.2841 | 0.4547 | 0.0 | nan | 0.7578 | 0.8810 | 0.6269 | 0.7312 | 0.3883 | nan | 0.5636 | 0.6215 | 0.0093 | 0.8634 | 0.1373 | 0.0285 | 0.3512 | 0.2463 | 0.5963 | 0.0 | 0.0 | 0.7620 | 0.1596 | 0.5112 | 0.2902 | 0.0468 | nan | 0.1611 | 0.4861 | 0.3712 | 0.0 | 0.8848 | 0.7659 | 0.9499 | 0.0038 | 0.1134 | 0.3581 | 0.0 | | 0.1348 | 18.1 | 3620 | 0.6481 | 0.3917 | 0.4693 | 0.8706 | nan | 0.8306 | 0.9477 | 0.8035 | 0.8420 | 0.5514 | nan | 0.7045 | 0.8461 | 0.0400 | 0.9511 | 0.1532 | 0.0096 | 0.6127 | 0.1254 | 0.7203 | 0.0001 | 0.0 | 0.9398 | 0.1342 | 0.5573 | 0.3896 | 0.0337 | nan | 0.2287 | 0.5552 | 0.5724 | 0.0 | 0.9505 | 0.8600 | 0.9802 | 0.0028 | 0.2236 | 0.4512 | 0.0 | nan | 0.7478 | 0.8727 | 0.6513 | 0.7269 | 0.3812 | nan | 0.5678 | 0.6342 | 0.0202 | 0.8619 | 0.1277 | 0.0091 | 0.4013 | 0.1253 | 0.5868 | 0.0001 | 0.0 | 0.7536 | 0.1229 | 0.4895 | 0.2892 | 0.0330 | nan | 0.1947 | 0.4772 | 0.3720 | 0.0 | 0.8859 | 0.7637 | 0.9502 | 0.0024 | 0.1190 | 0.3666 | 0.0 | | 0.1008 | 18.2 | 3640 | 0.6400 | 0.3961 | 0.4778 | 0.8700 | nan | 0.8315 | 0.9466 | 0.7874 | 0.8600 | 0.5177 | nan | 0.6957 | 0.8418 | 0.0366 | 0.9475 | 0.2405 | 
0.0 | 0.6111 | 0.1058 | 0.7397 | 0.0 | 0.0009 | 0.9279 | 0.2627 | 0.5572 | 0.3947 | 0.0730 | nan | 0.2279 | 0.6045 | 0.5431 | 0.0 | 0.9459 | 0.8792 | 0.9865 | 0.0192 | 0.2656 | 0.4386 | 0.0 | nan | 0.7440 | 0.8732 | 0.6501 | 0.7160 | 0.3869 | nan | 0.5630 | 0.6315 | 0.0223 | 0.8652 | 0.2004 | 0.0 | 0.4063 | 0.1058 | 0.5690 | 0.0 | 0.0008 | 0.7539 | 0.2115 | 0.4873 | 0.2980 | 0.0709 | nan | 0.1864 | 0.4805 | 0.3760 | 0.0 | 0.8870 | 0.7709 | 0.9467 | 0.0142 | 0.1040 | 0.3519 | 0.0 | | 0.1345 | 18.3 | 3660 | 0.6472 | 0.3959 | 0.4781 | 0.8706 | nan | 0.8146 | 0.9530 | 0.8235 | 0.8520 | 0.5497 | nan | 0.6968 | 0.8381 | 0.0207 | 0.9516 | 0.1768 | 0.0 | 0.6848 | 0.0982 | 0.7017 | 0.0 | 0.0016 | 0.9232 | 0.2083 | 0.5915 | 0.4181 | 0.0692 | nan | 0.2702 | 0.6141 | 0.5410 | 0.0 | 0.9518 | 0.8357 | 0.9695 | 0.0106 | 0.2471 | 0.4844 | 0.0 | nan | 0.7450 | 0.8758 | 0.6078 | 0.7318 | 0.3818 | nan | 0.5719 | 0.6405 | 0.0114 | 0.8615 | 0.1511 | 0.0 | 0.4094 | 0.0981 | 0.5907 | 0.0 | 0.0014 | 0.7618 | 0.1787 | 0.5066 | 0.3096 | 0.0666 | nan | 0.2119 | 0.4778 | 0.3787 | 0.0 | 0.8857 | 0.7576 | 0.9448 | 0.0082 | 0.1287 | 0.3737 | 0.0 | | 0.1169 | 18.4 | 3680 | 0.6531 | 0.3883 | 0.4689 | 0.8691 | nan | 0.8289 | 0.9521 | 0.7879 | 0.8127 | 0.5339 | nan | 0.6864 | 0.8635 | 0.0215 | 0.9581 | 0.1386 | 0.0544 | 0.8072 | 0.0718 | 0.6534 | 0.0 | 0.0 | 0.9327 | 0.1959 | 0.5625 | 0.3530 | 0.0056 | nan | 0.2673 | 0.5698 | 0.5440 | 0.0 | 0.9512 | 0.8463 | 0.9864 | 0.0051 | 0.1302 | 0.4835 | 0.0 | nan | 0.7451 | 0.8685 | 0.6699 | 0.7229 | 0.3542 | nan | 0.5663 | 0.5963 | 0.0124 | 0.8566 | 0.1181 | 0.0511 | 0.4035 | 0.0717 | 0.5656 | 0.0 | 0.0 | 0.7549 | 0.1712 | 0.4952 | 0.2960 | 0.0056 | nan | 0.2144 | 0.4820 | 0.3652 | 0.0 | 0.8849 | 0.7623 | 0.9443 | 0.0045 | 0.0663 | 0.3781 | 0.0 | | 0.1532 | 18.5 | 3700 | 0.6728 | 0.3907 | 0.4725 | 0.8686 | nan | 0.8117 | 0.9581 | 0.7649 | 0.8659 | 0.5009 | nan | 0.6302 | 0.7146 | 0.1802 | 0.9449 | 0.1444 | 0.0 | 0.7378 | 0.1248 | 0.7747 | 0.0 | 0.0 | 0.9387 
| 0.1801 | 0.6004 | 0.3819 | 0.0629 | nan | 0.1859 | 0.5827 | 0.5751 | 0.0 | 0.9523 | 0.8323 | 0.9805 | 0.0229 | 0.2482 | 0.4215 | 0.0 | nan | 0.7378 | 0.8700 | 0.6787 | 0.6915 | 0.3825 | nan | 0.5584 | 0.6061 | 0.0625 | 0.8632 | 0.1243 | 0.0 | 0.3961 | 0.1248 | 0.5644 | 0.0 | 0.0 | 0.7546 | 0.1585 | 0.5011 | 0.3042 | 0.0613 | nan | 0.1455 | 0.4883 | 0.3701 | 0.0 | 0.8856 | 0.7606 | 0.9497 | 0.0151 | 0.1017 | 0.3440 | 0.0 | | 0.0933 | 18.6 | 3720 | 0.6657 | 0.3892 | 0.4705 | 0.8671 | nan | 0.7966 | 0.9503 | 0.7910 | 0.8426 | 0.5041 | nan | 0.7030 | 0.8504 | 0.0036 | 0.9521 | 0.1420 | 0.0 | 0.7654 | 0.0719 | 0.6865 | 0.0 | 0.0 | 0.9344 | 0.1655 | 0.6533 | 0.3030 | 0.0829 | nan | 0.2544 | 0.5910 | 0.5748 | 0.0 | 0.9539 | 0.8573 | 0.9809 | 0.0126 | 0.1971 | 0.4359 | 0.0 | nan | 0.7158 | 0.8649 | 0.6562 | 0.6928 | 0.3857 | nan | 0.5605 | 0.5842 | 0.0018 | 0.8581 | 0.1180 | 0.0 | 0.4258 | 0.0718 | 0.5858 | 0.0 | 0.0 | 0.7634 | 0.1494 | 0.5398 | 0.2672 | 0.0798 | nan | 0.1992 | 0.4898 | 0.3707 | 0.0 | 0.8857 | 0.7744 | 0.9496 | 0.0094 | 0.0952 | 0.3577 | 0.0 | | 0.1438 | 18.7 | 3740 | 0.6791 | 0.3893 | 0.4663 | 0.8655 | nan | 0.7821 | 0.9529 | 0.7963 | 0.8464 | 0.5156 | nan | 0.6654 | 0.8413 | 0.0087 | 0.9405 | 0.1420 | 0.0 | 0.5114 | 0.1117 | 0.7339 | 0.0 | 0.0 | 0.9284 | 0.2376 | 0.6176 | 0.3744 | 0.1001 | nan | 0.1974 | 0.6001 | 0.5361 | 0.0 | 0.9529 | 0.8762 | 0.9833 | 0.0107 | 0.2175 | 0.4416 | 0.0 | nan | 0.7087 | 0.8600 | 0.6665 | 0.7063 | 0.3783 | nan | 0.5571 | 0.5747 | 0.0045 | 0.8647 | 0.1161 | 0.0 | 0.3470 | 0.1117 | 0.5896 | 0.0 | 0.0 | 0.7619 | 0.2037 | 0.5308 | 0.3091 | 0.0951 | nan | 0.1643 | 0.4884 | 0.3719 | 0.0 | 0.8887 | 0.7752 | 0.9486 | 0.0075 | 0.0816 | 0.3465 | 0.0 | | 0.0849 | 18.8 | 3760 | 0.6566 | 0.3937 | 0.4671 | 0.8737 | nan | 0.8372 | 0.9555 | 0.7537 | 0.8370 | 0.5635 | nan | 0.6648 | 0.8707 | 0.0169 | 0.9553 | 0.1224 | 0.0 | 0.4110 | 0.0835 | 0.7136 | 0.0 | 0.0012 | 0.9242 | 0.2331 | 0.6116 | 0.4140 | 0.0660 | nan | 0.3150 | 0.5872 | 
0.5567 | 0.0 | 0.9522 | 0.8598 | 0.9851 | 0.0051 | 0.1902 | 0.4594 | 0.0 | nan | 0.7514 | 0.8717 | 0.7081 | 0.7288 | 0.3997 | nan | 0.5627 | 0.5637 | 0.0104 | 0.8600 | 0.1056 | 0.0 | 0.3094 | 0.0835 | 0.5931 | 0.0 | 0.0010 | 0.7622 | 0.2070 | 0.5229 | 0.3329 | 0.0644 | nan | 0.2304 | 0.4850 | 0.3810 | 0.0 | 0.8919 | 0.7864 | 0.9474 | 0.0038 | 0.0767 | 0.3569 | 0.0 | | 0.1116 | 18.9 | 3780 | 0.6471 | 0.3966 | 0.4666 | 0.8774 | nan | 0.8720 | 0.9482 | 0.8046 | 0.8431 | 0.5345 | nan | 0.6738 | 0.8331 | 0.0113 | 0.9485 | 0.0919 | 0.0 | 0.4484 | 0.1166 | 0.7373 | 0.0 | 0.0026 | 0.9275 | 0.1592 | 0.6165 | 0.4077 | 0.0179 | nan | 0.3014 | 0.5796 | 0.5465 | 0.0 | 0.9580 | 0.8396 | 0.9802 | 0.0419 | 0.1764 | 0.5117 | 0.0 | nan | 0.7768 | 0.8863 | 0.6934 | 0.7475 | 0.3869 | nan | 0.5524 | 0.6067 | 0.0064 | 0.8653 | 0.0813 | 0.0 | 0.3712 | 0.1166 | 0.6076 | 0.0 | 0.0022 | 0.7640 | 0.1476 | 0.5426 | 0.3372 | 0.0176 | nan | 0.2389 | 0.4833 | 0.3878 | 0.0 | 0.8879 | 0.7702 | 0.9496 | 0.0196 | 0.0705 | 0.3732 | 0.0 | | 0.1391 | 19.0 | 3800 | 0.6216 | 0.3908 | 0.4653 | 0.8761 | nan | 0.8496 | 0.9555 | 0.7994 | 0.8339 | 0.5083 | nan | 0.7163 | 0.8324 | 0.0129 | 0.9491 | 0.1400 | 0.0 | 0.5319 | 0.0279 | 0.7549 | 0.0 | 0.0020 | 0.9199 | 0.1396 | 0.6369 | 0.3589 | 0.0829 | nan | 0.2064 | 0.6205 | 0.5965 | 0.0 | 0.9594 | 0.8644 | 0.9757 | 0.0076 | 0.1460 | 0.4598 | 0.0 | nan | 0.7640 | 0.8820 | 0.6720 | 0.7404 | 0.3806 | nan | 0.5694 | 0.5559 | 0.0067 | 0.8639 | 0.1234 | 0.0 | 0.4172 | 0.0279 | 0.5877 | 0.0 | 0.0018 | 0.7649 | 0.1263 | 0.5285 | 0.3136 | 0.0805 | nan | 0.1776 | 0.4837 | 0.3792 | 0.0 | 0.8865 | 0.7708 | 0.9521 | 0.0050 | 0.0771 | 0.3654 | 0.0 | | 0.1196 | 19.1 | 3820 | 0.6366 | 0.3926 | 0.4722 | 0.8727 | nan | 0.8250 | 0.9484 | 0.8132 | 0.8418 | 0.5690 | nan | 0.7304 | 0.8154 | 0.0102 | 0.9577 | 0.1662 | 0.0002 | 0.6124 | 0.0631 | 0.7290 | 0.0 | 0.0075 | 0.9306 | 0.1782 | 0.6425 | 0.3602 | 0.0330 | nan | 0.2764 | 0.5907 | 0.5957 | 0.0 | 0.9500 | 0.8442 | 0.9833 | 0.0043 
| 0.1566 | 0.4757 | 0.0 | nan | 0.7522 | 0.8838 | 0.6799 | 0.7295 | 0.3469 | nan | 0.5803 | 0.5606 | 0.0042 | 0.8591 | 0.1443 | 0.0002 | 0.4289 | 0.0631 | 0.6002 | 0.0 | 0.0067 | 0.7621 | 0.1557 | 0.5209 | 0.3123 | 0.0322 | nan | 0.2066 | 0.4849 | 0.3736 | 0.0 | 0.8885 | 0.7692 | 0.9480 | 0.0035 | 0.1009 | 0.3666 | 0.0 | | 0.0959 | 19.2 | 3840 | 0.6288 | 0.3935 | 0.4702 | 0.8750 | nan | 0.8575 | 0.9453 | 0.7918 | 0.8497 | 0.5645 | nan | 0.6963 | 0.8299 | 0.0091 | 0.9490 | 0.2160 | 0.0 | 0.4691 | 0.0974 | 0.7566 | 0.0 | 0.0057 | 0.9396 | 0.1275 | 0.5798 | 0.4401 | 0.0737 | nan | 0.2358 | 0.5932 | 0.5844 | 0.0 | 0.9462 | 0.8775 | 0.9824 | 0.0096 | 0.1730 | 0.4441 | 0.0 | nan | 0.7664 | 0.8847 | 0.6672 | 0.7389 | 0.3696 | nan | 0.5766 | 0.5705 | 0.0046 | 0.8629 | 0.1796 | 0.0 | 0.3385 | 0.0973 | 0.5875 | 0.0 | 0.0046 | 0.7543 | 0.1155 | 0.4952 | 0.3465 | 0.0703 | nan | 0.1918 | 0.4893 | 0.3725 | 0.0 | 0.8922 | 0.7813 | 0.9511 | 0.0061 | 0.1146 | 0.3613 | 0.0 | | 0.0953 | 19.3 | 3860 | 0.6623 | 0.3801 | 0.4474 | 0.8713 | nan | 0.8262 | 0.9593 | 0.7969 | 0.8096 | 0.5081 | nan | 0.6969 | 0.8623 | 0.0131 | 0.9501 | 0.1314 | 0.0012 | 0.2272 | 0.0296 | 0.6862 | 0.0 | 0.0 | 0.9403 | 0.1217 | 0.5800 | 0.3706 | 0.0808 | nan | 0.2683 | 0.5718 | 0.5212 | 0.0 | 0.9522 | 0.8629 | 0.9806 | 0.0062 | 0.1071 | 0.4561 | 0.0 | nan | 0.7435 | 0.8717 | 0.7078 | 0.7134 | 0.3755 | nan | 0.5783 | 0.5435 | 0.0070 | 0.8618 | 0.1100 | 0.0012 | 0.1734 | 0.0295 | 0.5762 | 0.0 | 0.0 | 0.7504 | 0.1110 | 0.4996 | 0.3201 | 0.0776 | nan | 0.2004 | 0.4833 | 0.3712 | 0.0 | 0.8898 | 0.7813 | 0.9505 | 0.0037 | 0.0657 | 0.3646 | 0.0 | | 0.1373 | 19.4 | 3880 | 0.6685 | 0.3895 | 0.4625 | 0.8714 | nan | 0.8442 | 0.9510 | 0.7916 | 0.8176 | 0.5431 | nan | 0.6855 | 0.8213 | 0.0075 | 0.9520 | 0.1370 | 0.0018 | 0.5377 | 0.0521 | 0.6675 | 0.0 | 0.0017 | 0.9306 | 0.1636 | 0.5739 | 0.3619 | 0.0569 | nan | 0.2615 | 0.5481 | 0.5454 | 0.0 | 0.9551 | 0.8495 | 0.9859 | 0.0112 | 0.2837 | 0.4612 | 0.0 | nan | 0.7560 | 
0.8749 | 0.7109 | 0.7168 | 0.3679 | nan | 0.5734 | 0.5998 | 0.0039 | 0.8604 | 0.1156 | 0.0018 | 0.3666 | 0.0521 | 0.5734 | 0.0 | 0.0015 | 0.7527 | 0.1418 | 0.4988 | 0.3208 | 0.0558 | nan | 0.1980 | 0.4756 | 0.3725 | 0.0 | 0.8879 | 0.7700 | 0.9445 | 0.0067 | 0.1031 | 0.3614 | 0.0 | | 0.0634 | 19.5 | 3900 | 0.6595 | 0.4015 | 0.4852 | 0.8721 | nan | 0.8256 | 0.9453 | 0.8299 | 0.8586 | 0.5898 | nan | 0.6855 | 0.8259 | 0.0306 | 0.9562 | 0.1417 | 0.0069 | 0.6597 | 0.0814 | 0.7130 | 0.0 | 0.0123 | 0.9206 | 0.2307 | 0.6144 | 0.3796 | 0.1257 | nan | 0.3290 | 0.6291 | 0.5818 | 0.0 | 0.9508 | 0.8416 | 0.9838 | 0.0177 | 0.2719 | 0.4868 | 0.0 | nan | 0.7497 | 0.8837 | 0.6989 | 0.7101 | 0.3746 | nan | 0.5735 | 0.6215 | 0.0142 | 0.8605 | 0.1227 | 0.0069 | 0.4337 | 0.0814 | 0.5765 | 0.0 | 0.0110 | 0.7668 | 0.1922 | 0.5229 | 0.3180 | 0.1211 | nan | 0.2457 | 0.4939 | 0.3811 | 0.0 | 0.8884 | 0.7650 | 0.9487 | 0.0094 | 0.1052 | 0.3722 | 0.0 | | 0.1144 | 19.6 | 3920 | 0.6743 | 0.3943 | 0.4708 | 0.8711 | nan | 0.8363 | 0.9524 | 0.8027 | 0.7968 | 0.5293 | nan | 0.6985 | 0.8225 | 0.1192 | 0.9560 | 0.1437 | 0.0100 | 0.7422 | 0.0896 | 0.6518 | 0.0 | 0.0 | 0.9286 | 0.1067 | 0.6105 | 0.3336 | 0.0772 | nan | 0.2570 | 0.5919 | 0.5545 | 0.0 | 0.9578 | 0.8493 | 0.9787 | 0.0030 | 0.1745 | 0.4916 | 0.0 | nan | 0.7460 | 0.8646 | 0.7031 | 0.7061 | 0.3837 | nan | 0.5698 | 0.6244 | 0.0406 | 0.8595 | 0.1206 | 0.0100 | 0.4231 | 0.0895 | 0.5612 | 0.0 | 0.0 | 0.7606 | 0.0981 | 0.5160 | 0.2920 | 0.0753 | nan | 0.2151 | 0.4904 | 0.3711 | 0.0 | 0.8875 | 0.7760 | 0.9499 | 0.0023 | 0.0951 | 0.3875 | 0.0 | | 0.1188 | 19.7 | 3940 | 0.6705 | 0.4008 | 0.4736 | 0.8727 | nan | 0.8352 | 0.9605 | 0.7775 | 0.7891 | 0.5159 | nan | 0.6744 | 0.8470 | 0.0115 | 0.9497 | 0.1475 | 0.0024 | 0.6870 | 0.1569 | 0.6805 | 0.0 | 0.0003 | 0.9275 | 0.2195 | 0.6293 | 0.3641 | 0.0765 | nan | 0.2214 | 0.6162 | 0.5561 | 0.0 | 0.9539 | 0.8622 | 0.9766 | 0.0012 | 0.2277 | 0.4884 | 0.0 | nan | 0.7518 | 0.8605 | 0.7179 | 0.7051 | 0.3932 | nan 
| 0.5742 | 0.6278 | 0.0058 | 0.8634 | 0.1261 | 0.0024 | 0.4491 | 0.1568 | 0.5786 | 0.0 | 0.0002 | 0.7646 | 0.1904 | 0.5319 | 0.3115 | 0.0745 | nan | 0.1672 | 0.5011 | 0.3704 | 0.0 | 0.8884 | 0.7778 | 0.9518 | 0.0011 | 0.1013 | 0.3798 | 0.0 | | 0.0921 | 19.8 | 3960 | 0.5977 | 0.3969 | 0.4735 | 0.8792 | nan | 0.8769 | 0.9573 | 0.7823 | 0.8502 | 0.4808 | nan | 0.6991 | 0.8520 | 0.0049 | 0.9545 | 0.1090 | 0.0058 | 0.6390 | 0.0450 | 0.6867 | 0.0 | 0.0 | 0.9265 | 0.1901 | 0.6006 | 0.3956 | 0.1275 | nan | 0.2660 | 0.6230 | 0.6010 | 0.0 | 0.9521 | 0.8688 | 0.9846 | 0.0038 | 0.2451 | 0.4230 | 0.0 | nan | 0.7787 | 0.8830 | 0.6752 | 0.7446 | 0.3796 | nan | 0.5851 | 0.6326 | 0.0033 | 0.8611 | 0.0962 | 0.0058 | 0.4192 | 0.0450 | 0.5756 | 0.0 | 0.0 | 0.7626 | 0.1657 | 0.5145 | 0.3129 | 0.1224 | nan | 0.1892 | 0.4940 | 0.3688 | 0.0 | 0.8890 | 0.7783 | 0.9494 | 0.0036 | 0.1144 | 0.3510 | 0.0 | | 0.0998 | 19.9 | 3980 | 0.6695 | 0.3898 | 0.4654 | 0.8695 | nan | 0.8016 | 0.9558 | 0.7933 | 0.8599 | 0.5248 | nan | 0.7058 | 0.8529 | 0.0 | 0.9482 | 0.1023 | 0.0142 | 0.5224 | 0.0380 | 0.7253 | 0.0 | 0.0163 | 0.9390 | 0.1337 | 0.5630 | 0.3497 | 0.1847 | nan | 0.2445 | 0.5742 | 0.5538 | 0.0 | 0.9543 | 0.8615 | 0.9816 | 0.0138 | 0.2513 | 0.4269 | 0.0 | nan | 0.7260 | 0.8725 | 0.7126 | 0.7037 | 0.3809 | nan | 0.5887 | 0.6069 | 0.0 | 0.8603 | 0.0903 | 0.0136 | 0.3584 | 0.0380 | 0.5701 | 0.0 | 0.0148 | 0.7466 | 0.1170 | 0.4905 | 0.2939 | 0.1709 | nan | 0.1710 | 0.4849 | 0.3728 | 0.0 | 0.8872 | 0.7744 | 0.9510 | 0.0121 | 0.1114 | 0.3524 | 0.0 | | 0.1001 | 20.0 | 4000 | 0.6360 | 0.3940 | 0.4690 | 0.8747 | nan | 0.8478 | 0.9596 | 0.8029 | 0.8542 | 0.4806 | nan | 0.7011 | 0.8270 | 0.0153 | 0.9524 | 0.1601 | 0.0 | 0.5663 | 0.0927 | 0.7436 | 0.0 | 0.0156 | 0.9355 | 0.1451 | 0.5621 | 0.3605 | 0.1619 | nan | 0.1624 | 0.6078 | 0.5374 | 0.0 | 0.9556 | 0.8211 | 0.9820 | 0.0169 | 0.3125 | 0.4290 | 0.0 | nan | 0.7591 | 0.8734 | 0.6983 | 0.7558 | 0.3819 | nan | 0.5784 | 0.6133 | 0.0099 | 0.8632 | 0.1415 | 
0.0 | 0.3731 | 0.0927 | 0.5487 | 0.0 | 0.0145 | 0.7561 | 0.1293 | 0.4910 | 0.3003 | 0.1506 | nan | 0.1298 | 0.4896 | 0.3837 | 0.0 | 0.8829 | 0.7556 | 0.9518 | 0.0153 | 0.1128 | 0.3558 | 0.0 | | 0.1253 | 20.1 | 4020 | 0.6375 | 0.3945 | 0.4689 | 0.8761 | nan | 0.8562 | 0.9533 | 0.8124 | 0.8497 | 0.5275 | nan | 0.7055 | 0.8616 | 0.0107 | 0.9555 | 0.2256 | 0.0008 | 0.5598 | 0.1015 | 0.7113 | 0.0 | 0.0024 | 0.9286 | 0.1968 | 0.5871 | 0.3271 | 0.1187 | nan | 0.1832 | 0.5976 | 0.5238 | 0.0 | 0.9577 | 0.8331 | 0.9858 | 0.0137 | 0.1747 | 0.4445 | 0.0 | nan | 0.7673 | 0.8788 | 0.6697 | 0.7518 | 0.3957 | nan | 0.5791 | 0.5823 | 0.0078 | 0.8610 | 0.1961 | 0.0008 | 0.3487 | 0.1015 | 0.5737 | 0.0 | 0.0022 | 0.7595 | 0.1723 | 0.5084 | 0.2912 | 0.1149 | nan | 0.1545 | 0.4909 | 0.3803 | 0.0 | 0.8825 | 0.7555 | 0.9488 | 0.0121 | 0.0726 | 0.3626 | 0.0 | | 0.0963 | 20.2 | 4040 | 0.6343 | 0.3986 | 0.4709 | 0.8758 | nan | 0.8581 | 0.9509 | 0.7889 | 0.8497 | 0.5508 | nan | 0.6903 | 0.8440 | 0.0020 | 0.9561 | 0.1398 | 0.0 | 0.6282 | 0.0328 | 0.6605 | 0.0 | 0.0034 | 0.9246 | 0.2302 | 0.5814 | 0.3759 | 0.1717 | nan | 0.2484 | 0.5943 | 0.5692 | 0.0 | 0.9611 | 0.8160 | 0.9809 | 0.0170 | 0.1377 | 0.5036 | 0.0 | nan | 0.7668 | 0.8783 | 0.6873 | 0.7581 | 0.3905 | nan | 0.5727 | 0.6263 | 0.0016 | 0.8572 | 0.1219 | 0.0 | 0.4296 | 0.0328 | 0.5626 | 0.0 | 0.0032 | 0.7619 | 0.2007 | 0.5207 | 0.2996 | 0.1609 | nan | 0.2098 | 0.4903 | 0.3758 | 0.0 | 0.8799 | 0.7447 | 0.9505 | 0.0132 | 0.0692 | 0.3891 | 0.0 | | 0.1137 | 20.3 | 4060 | 0.6375 | 0.3946 | 0.4692 | 0.8769 | nan | 0.8613 | 0.9550 | 0.8020 | 0.8515 | 0.5362 | nan | 0.7077 | 0.8397 | 0.0242 | 0.9510 | 0.1386 | 0.0 | 0.6277 | 0.0927 | 0.7196 | 0.0 | 0.0015 | 0.9251 | 0.1790 | 0.6060 | 0.3227 | 0.1148 | nan | 0.1834 | 0.5894 | 0.5724 | 0.0 | 0.9528 | 0.8344 | 0.9837 | 0.0036 | 0.1638 | 0.4759 | 0.0 | nan | 0.7764 | 0.8810 | 0.6866 | 0.7539 | 0.3881 | nan | 0.5824 | 0.5827 | 0.0154 | 0.8616 | 0.1163 | 0.0 | 0.4547 | 0.0926 | 0.5692 | 0.0 | 0.0014 
| 0.7587 | 0.1571 | 0.5074 | 0.2865 | 0.1074 | nan | 0.1429 | 0.4889 | 0.3755 | 0.0 | 0.8853 | 0.7612 | 0.9461 | 0.0029 | 0.0755 | 0.3681 | 0.0 | | 0.1062 | 20.4 | 4080 | 0.6567 | 0.3995 | 0.4775 | 0.8730 | nan | 0.8379 | 0.9542 | 0.8219 | 0.8067 | 0.5245 | nan | 0.7027 | 0.8269 | 0.0109 | 0.9520 | 0.1389 | 0.0 | 0.5794 | 0.1255 | 0.7120 | 0.0 | 0.0142 | 0.9348 | 0.1926 | 0.6014 | 0.4170 | 0.1640 | nan | 0.2494 | 0.6041 | 0.5862 | 0.0 | 0.9498 | 0.8683 | 0.9818 | 0.0056 | 0.2779 | 0.4382 | 0.0 | nan | 0.7553 | 0.8689 | 0.6614 | 0.7353 | 0.3795 | nan | 0.5757 | 0.6132 | 0.0066 | 0.8633 | 0.1199 | 0.0 | 0.4289 | 0.1255 | 0.5749 | 0.0 | 0.0132 | 0.7611 | 0.1679 | 0.5161 | 0.3362 | 0.1469 | nan | 0.1785 | 0.4951 | 0.3797 | 0.0 | 0.8900 | 0.7775 | 0.9491 | 0.0047 | 0.1043 | 0.3541 | 0.0 | | 0.1041 | 20.5 | 4100 | 0.6890 | 0.4013 | 0.4795 | 0.8670 | nan | 0.7937 | 0.9591 | 0.7886 | 0.8263 | 0.4823 | nan | 0.7058 | 0.8277 | 0.0570 | 0.9562 | 0.2314 | 0.0089 | 0.5893 | 0.0884 | 0.7255 | 0.0 | 0.0212 | 0.9247 | 0.2537 | 0.6046 | 0.3496 | 0.2725 | nan | 0.2060 | 0.5952 | 0.5589 | 0.0 | 0.9555 | 0.8368 | 0.9832 | 0.0022 | 0.2807 | 0.4606 | 0.0 | nan | 0.7252 | 0.8511 | 0.7002 | 0.7143 | 0.3809 | nan | 0.5730 | 0.5817 | 0.0289 | 0.8606 | 0.1952 | 0.0089 | 0.4421 | 0.0883 | 0.5674 | 0.0 | 0.0208 | 0.7613 | 0.2071 | 0.5142 | 0.3038 | 0.2314 | nan | 0.1489 | 0.4961 | 0.3774 | 0.0 | 0.8847 | 0.7648 | 0.9499 | 0.0020 | 0.1005 | 0.3606 | 0.0 | | 0.1587 | 20.6 | 4120 | 0.6767 | 0.4083 | 0.4812 | 0.8711 | nan | 0.8364 | 0.9537 | 0.7705 | 0.7938 | 0.5484 | nan | 0.6784 | 0.8085 | 0.0284 | 0.9558 | 0.1908 | 0.0 | 0.5209 | 0.2507 | 0.6850 | 0.0 | 0.0004 | 0.9251 | 0.2019 | 0.6493 | 0.3894 | 0.2335 | nan | 0.2594 | 0.5984 | 0.5619 | 0.0 | 0.9466 | 0.8498 | 0.9837 | 0.0110 | 0.2769 | 0.4914 | 0.0 | nan | 0.7442 | 0.8592 | 0.7000 | 0.7254 | 0.3821 | nan | 0.5722 | 0.6206 | 0.0132 | 0.8605 | 0.1656 | 0.0 | 0.4081 | 0.2504 | 0.5632 | 0.0 | 0.0004 | 0.7681 | 0.1775 | 0.5537 | 0.3305 | 0.2045 | 
nan | 0.1971 | 0.4928 | 0.3788 | 0.0 | 0.8873 | 0.7669 | 0.9484 | 0.0090 | 0.1096 | 0.3747 | 0.0 | | 0.1263 | 20.7 | 4140 | 0.6792 | 0.4052 | 0.4798 | 0.8693 | nan | 0.8127 | 0.9591 | 0.8023 | 0.8156 | 0.5044 | nan | 0.6911 | 0.8283 | 0.0084 | 0.9539 | 0.1576 | 0.0 | 0.4543 | 0.2301 | 0.6969 | 0.0 | 0.0087 | 0.9293 | 0.1909 | 0.6220 | 0.4202 | 0.2310 | nan | 0.3082 | 0.6124 | 0.5514 | 0.0 | 0.9555 | 0.8294 | 0.9521 | 0.0289 | 0.3463 | 0.4527 | 0.0 | nan | 0.7352 | 0.8585 | 0.6929 | 0.7207 | 0.3930 | nan | 0.5796 | 0.6131 | 0.0048 | 0.8608 | 0.1425 | 0.0 | 0.3678 | 0.2301 | 0.5863 | 0.0 | 0.0082 | 0.7663 | 0.1664 | 0.5473 | 0.3381 | 0.2010 | nan | 0.2321 | 0.4778 | 0.3794 | 0.0 | 0.8840 | 0.7550 | 0.9309 | 0.0198 | 0.1169 | 0.3594 | 0.0 | | 0.0687 | 20.8 | 4160 | 0.6763 | 0.4026 | 0.4814 | 0.8705 | nan | 0.8374 | 0.9388 | 0.8149 | 0.8092 | 0.6507 | nan | 0.7003 | 0.7479 | 0.0486 | 0.9459 | 0.1209 | 0.0 | 0.4576 | 0.1684 | 0.7734 | 0.0 | 0.0052 | 0.9254 | 0.1351 | 0.6627 | 0.3857 | 0.2935 | nan | 0.3222 | 0.5700 | 0.5956 | 0.0 | 0.9561 | 0.8280 | 0.9860 | 0.0265 | 0.2518 | 0.4466 | 0.0 | nan | 0.7526 | 0.8674 | 0.7003 | 0.7256 | 0.3858 | nan | 0.5779 | 0.5933 | 0.0159 | 0.8653 | 0.1066 | 0.0 | 0.3861 | 0.1684 | 0.5383 | 0.0 | 0.0047 | 0.7688 | 0.1184 | 0.5338 | 0.3312 | 0.2460 | nan | 0.2383 | 0.4819 | 0.3783 | 0.0 | 0.8830 | 0.7539 | 0.9475 | 0.0183 | 0.1309 | 0.3649 | 0.0 | | 0.1022 | 20.9 | 4180 | 0.6515 | 0.4094 | 0.4840 | 0.8714 | nan | 0.8243 | 0.9531 | 0.8246 | 0.8207 | 0.5230 | nan | 0.7051 | 0.8164 | 0.0200 | 0.9530 | 0.1783 | 0.0 | 0.6442 | 0.2868 | 0.7147 | 0.0 | 0.0004 | 0.9344 | 0.1501 | 0.5645 | 0.3791 | 0.2135 | nan | 0.2752 | 0.6152 | 0.5677 | 0.0 | 0.9529 | 0.8576 | 0.9788 | 0.0186 | 0.2173 | 0.5001 | 0.0 | nan | 0.7452 | 0.8705 | 0.6665 | 0.7204 | 0.3726 | nan | 0.5795 | 0.6388 | 0.0120 | 0.8618 | 0.1561 | 0.0 | 0.4798 | 0.2868 | 0.5968 | 0.0 | 0.0004 | 0.7536 | 0.1331 | 0.4925 | 0.3280 | 0.1859 | nan | 0.2251 | 0.4905 | 0.3779 | 0.0 | 0.8884 | 
0.7699 | 0.9526 | 0.0146 | 0.1169 | 0.3848 | 0.0 | | 0.1278 | 21.0 | 4200 | 0.6634 | 0.4075 | 0.4829 | 0.8711 | nan | 0.8245 | 0.9530 | 0.7866 | 0.8225 | 0.5349 | nan | 0.6884 | 0.8640 | 0.0566 | 0.9489 | 0.2054 | 0.0 | 0.5859 | 0.2362 | 0.7052 | 0.0 | 0.0008 | 0.9347 | 0.1644 | 0.6074 | 0.3896 | 0.2307 | nan | 0.3313 | 0.5614 | 0.5768 | 0.0 | 0.9521 | 0.8605 | 0.9856 | 0.0116 | 0.1722 | 0.4631 | 0.0 | nan | 0.7408 | 0.8674 | 0.7110 | 0.7096 | 0.3785 | nan | 0.5794 | 0.5541 | 0.0353 | 0.8636 | 0.1842 | 0.0 | 0.4354 | 0.2361 | 0.5915 | 0.0 | 0.0008 | 0.7613 | 0.1453 | 0.5112 | 0.3317 | 0.2033 | nan | 0.2664 | 0.4866 | 0.3724 | 0.0 | 0.8884 | 0.7736 | 0.9489 | 0.0089 | 0.0912 | 0.3618 | 0.0 | | 0.0846 | 21.1 | 4220 | 0.6718 | 0.3986 | 0.4677 | 0.8711 | nan | 0.8204 | 0.9642 | 0.7579 | 0.8309 | 0.4685 | nan | 0.6837 | 0.8341 | 0.0109 | 0.9496 | 0.1461 | 0.0 | 0.4288 | 0.1567 | 0.6752 | 0.0 | 0.0 | 0.9378 | 0.1636 | 0.5715 | 0.3840 | 0.3030 | nan | 0.2801 | 0.6112 | 0.5545 | 0.0 | 0.9499 | 0.8698 | 0.9818 | 0.0113 | 0.1875 | 0.4330 | 0.0 | nan | 0.7372 | 0.8662 | 0.7208 | 0.7000 | 0.3749 | nan | 0.5771 | 0.5614 | 0.0071 | 0.8616 | 0.1265 | 0.0 | 0.3334 | 0.1567 | 0.5866 | 0.0 | 0.0 | 0.7540 | 0.1424 | 0.5010 | 0.3260 | 0.2542 | nan | 0.2147 | 0.4983 | 0.3801 | 0.0 | 0.8891 | 0.7807 | 0.9504 | 0.0093 | 0.0879 | 0.3571 | 0.0 | | 0.1115 | 21.2 | 4240 | 0.6552 | 0.4076 | 0.4832 | 0.8737 | nan | 0.8317 | 0.9473 | 0.8047 | 0.8574 | 0.5807 | nan | 0.6939 | 0.8171 | 0.0246 | 0.9492 | 0.1527 | 0.0037 | 0.5044 | 0.2308 | 0.6829 | 0.0 | 0.0009 | 0.9300 | 0.1914 | 0.6670 | 0.4029 | 0.3364 | nan | 0.3103 | 0.5589 | 0.5355 | 0.0 | 0.9510 | 0.8408 | 0.9833 | 0.0029 | 0.1955 | 0.4733 | 0.0 | nan | 0.7517 | 0.8763 | 0.6704 | 0.7468 | 0.3714 | nan | 0.5771 | 0.5795 | 0.0150 | 0.8643 | 0.1306 | 0.0037 | 0.3847 | 0.2307 | 0.5843 | 0.0 | 0.0008 | 0.7695 | 0.1615 | 0.5434 | 0.3354 | 0.2658 | nan | 0.2342 | 0.4869 | 0.3854 | 0.0 | 0.8869 | 0.7624 | 0.9506 | 0.0023 | 0.1051 | 0.3660 | 0.0 | | 
0.1084 | 21.3 | 4260 | 0.6892 | 0.3892 | 0.4577 | 0.8640 | nan | 0.7884 | 0.9506 | 0.7385 | 0.8094 | 0.5658 | nan | 0.7285 | 0.8160 | 0.0189 | 0.9520 | 0.1778 | 0.0 | 0.2717 | 0.1533 | 0.6612 | 0.0 | 0.0003 | 0.9476 | 0.1546 | 0.5538 | 0.4004 | 0.1485 | nan | 0.2246 | 0.6010 | 0.5424 | 0.0 | 0.9481 | 0.8537 | 0.9823 | 0.0129 | 0.1972 | 0.4453 | 0.0 | nan | 0.7146 | 0.8556 | 0.7159 | 0.7010 | 0.3691 | nan | 0.5797 | 0.6040 | 0.0121 | 0.8610 | 0.1574 | 0.0 | 0.2095 | 0.1532 | 0.5598 | 0.0 | 0.0002 | 0.7475 | 0.1321 | 0.4993 | 0.3277 | 0.1327 | nan | 0.1811 | 0.4931 | 0.3776 | 0.0 | 0.8874 | 0.7649 | 0.9522 | 0.0083 | 0.0941 | 0.3645 | 0.0 | | 0.1338 | 21.4 | 4280 | 0.6709 | 0.3973 | 0.4641 | 0.8726 | nan | 0.8313 | 0.9602 | 0.7449 | 0.8214 | 0.5155 | nan | 0.7123 | 0.8303 | 0.0138 | 0.9432 | 0.1363 | 0.0 | 0.4912 | 0.1223 | 0.6385 | 0.0 | 0.0003 | 0.9390 | 0.1575 | 0.6410 | 0.3586 | 0.2166 | nan | 0.2556 | 0.5782 | 0.5075 | 0.0 | 0.9518 | 0.8385 | 0.9821 | 0.0096 | 0.2061 | 0.4482 | 0.0 | nan | 0.7484 | 0.8677 | 0.7121 | 0.7044 | 0.3990 | nan | 0.5868 | 0.5942 | 0.0097 | 0.8629 | 0.1184 | 0.0 | 0.3635 | 0.1222 | 0.5545 | 0.0 | 0.0002 | 0.7618 | 0.1388 | 0.5326 | 0.3045 | 0.1839 | nan | 0.2106 | 0.4879 | 0.3757 | 0.0 | 0.8841 | 0.7657 | 0.9515 | 0.0074 | 0.1076 | 0.3582 | 0.0 | | 0.0738 | 21.5 | 4300 | 0.6573 | 0.4033 | 0.4748 | 0.8713 | nan | 0.8115 | 0.9567 | 0.8002 | 0.8162 | 0.5514 | nan | 0.7062 | 0.8208 | 0.0280 | 0.9496 | 0.1044 | 0.0029 | 0.4920 | 0.1151 | 0.6800 | 0.0 | 0.0 | 0.9278 | 0.1859 | 0.6386 | 0.3948 | 0.3072 | nan | 0.3047 | 0.6096 | 0.5394 | 0.0 | 0.9550 | 0.8316 | 0.9853 | 0.0043 | 0.1767 | 0.4969 | 0.0 | nan | 0.7366 | 0.8644 | 0.6744 | 0.7142 | 0.3915 | nan | 0.5877 | 0.6245 | 0.0180 | 0.8641 | 0.0928 | 0.0028 | 0.3760 | 0.1150 | 0.5753 | 0.0 | 0.0 | 0.7681 | 0.1669 | 0.5448 | 0.3265 | 0.2540 | nan | 0.2388 | 0.4946 | 0.3862 | 0.0 | 0.8849 | 0.7629 | 0.9491 | 0.0037 | 0.1074 | 0.3806 | 0.0 | | 0.1029 | 21.6 | 4320 | 0.6469 | 0.4044 | 0.4782 | 
0.8737 | nan | 0.8398 | 0.9561 | 0.7955 | 0.8276 | 0.5186 | nan | 0.7033 | 0.7922 | 0.0450 | 0.9536 | 0.1347 | 0.0001 | 0.6867 | 0.1459 | 0.7371 | 0.0 | 0.0003 | 0.9364 | 0.1667 | 0.6232 | 0.3445 | 0.2556 | nan | 0.2941 | 0.5700 | 0.5599 | 0.0 | 0.9584 | 0.8165 | 0.9835 | 0.0045 | 0.1937 | 0.4606 | 0.0 | nan | 0.7543 | 0.8712 | 0.6931 | 0.7240 | 0.3945 | nan | 0.5843 | 0.6372 | 0.0275 | 0.8614 | 0.1184 | 0.0001 | 0.4375 | 0.1459 | 0.5797 | 0.0 | 0.0003 | 0.7593 | 0.1475 | 0.5194 | 0.2930 | 0.2124 | nan | 0.2283 | 0.4873 | 0.3849 | 0.0 | 0.8807 | 0.7483 | 0.9503 | 0.0040 | 0.1232 | 0.3739 | 0.0 | | 0.1186 | 21.7 | 4340 | 0.6585 | 0.4079 | 0.4883 | 0.8723 | nan | 0.8288 | 0.9558 | 0.7707 | 0.8673 | 0.5224 | nan | 0.6848 | 0.8085 | 0.0581 | 0.9490 | 0.1224 | 0.0 | 0.6958 | 0.0732 | 0.7552 | 0.0 | 0.0 | 0.9204 | 0.2677 | 0.5686 | 0.3612 | 0.3455 | nan | 0.3524 | 0.6461 | 0.5403 | 0.0 | 0.9572 | 0.8196 | 0.9823 | 0.0281 | 0.2567 | 0.4876 | 0.0 | nan | 0.7477 | 0.8743 | 0.7007 | 0.7153 | 0.4094 | nan | 0.5850 | 0.6034 | 0.0355 | 0.8668 | 0.1055 | 0.0 | 0.4602 | 0.0732 | 0.6005 | 0.0 | 0.0 | 0.7550 | 0.2271 | 0.5002 | 0.3071 | 0.2702 | nan | 0.2727 | 0.4949 | 0.3814 | 0.0 | 0.8794 | 0.7426 | 0.9508 | 0.0204 | 0.0945 | 0.3773 | 0.0 | | 0.0677 | 21.8 | 4360 | 0.6361 | 0.3993 | 0.4769 | 0.8750 | nan | 0.8544 | 0.9537 | 0.7839 | 0.8400 | 0.5455 | nan | 0.6787 | 0.8215 | 0.0144 | 0.9538 | 0.1153 | 0.0 | 0.7256 | 0.0217 | 0.7075 | 0.0 | 0.0 | 0.9299 | 0.1329 | 0.5876 | 0.4313 | 0.2619 | nan | 0.2513 | 0.6280 | 0.5770 | 0.0 | 0.9517 | 0.8389 | 0.9813 | 0.0087 | 0.2031 | 0.4615 | 0.0 | nan | 0.7686 | 0.8791 | 0.6233 | 0.7480 | 0.3951 | nan | 0.5785 | 0.6149 | 0.0087 | 0.8604 | 0.0980 | 0.0 | 0.4816 | 0.0217 | 0.5856 | 0.0 | 0.0 | 0.7597 | 0.1237 | 0.5133 | 0.3455 | 0.2226 | nan | 0.2174 | 0.4940 | 0.3838 | 0.0 | 0.8837 | 0.7542 | 0.9505 | 0.0060 | 0.0886 | 0.3699 | 0.0 | | 0.0966 | 21.9 | 4380 | 0.6521 | 0.4025 | 0.4823 | 0.8755 | nan | 0.8575 | 0.9535 | 0.7738 | 0.8313 | 0.5383 
| nan | 0.6988 | 0.8177 | 0.0173 | 0.9548 | 0.1243 | 0.0 | 0.6781 | 0.1098 | 0.7482 | 0.0 | 0.0 | 0.9375 | 0.1988 | 0.5860 | 0.3846 | 0.2233 | nan | 0.2717 | 0.6065 | 0.6044 | 0.0 | 0.9486 | 0.8565 | 0.9802 | 0.0117 | 0.2642 | 0.4555 | 0.0 | nan | 0.7634 | 0.8766 | 0.6806 | 0.7478 | 0.3922 | nan | 0.5740 | 0.6078 | 0.0107 | 0.8622 | 0.1051 | 0.0 | 0.4432 | 0.1098 | 0.5781 | 0.0 | 0.0 | 0.7588 | 0.1735 | 0.5266 | 0.3227 | 0.1965 | nan | 0.2137 | 0.4956 | 0.3696 | 0.0 | 0.8882 | 0.7733 | 0.9508 | 0.0081 | 0.0910 | 0.3591 | 0.0 | | 0.1168 | 22.0 | 4400 | 0.6743 | 0.4063 | 0.4888 | 0.8718 | nan | 0.8217 | 0.9470 | 0.8187 | 0.8473 | 0.5740 | nan | 0.6884 | 0.8092 | 0.0404 | 0.9540 | 0.1233 | 0.0 | 0.7544 | 0.1954 | 0.7602 | 0.0 | 0.0011 | 0.9278 | 0.1784 | 0.6134 | 0.3906 | 0.2784 | nan | 0.1975 | 0.6000 | 0.5872 | 0.0 | 0.9548 | 0.8470 | 0.9854 | 0.0077 | 0.2577 | 0.4814 | 0.0 | nan | 0.7449 | 0.8696 | 0.6629 | 0.7342 | 0.3708 | nan | 0.5749 | 0.6200 | 0.0206 | 0.8656 | 0.1045 | 0.0 | 0.4958 | 0.1954 | 0.5853 | 0.0 | 0.0010 | 0.7691 | 0.1619 | 0.5505 | 0.3202 | 0.2488 | nan | 0.1564 | 0.4904 | 0.3779 | 0.0 | 0.8859 | 0.7710 | 0.9489 | 0.0067 | 0.0918 | 0.3755 | 0.0 | | 0.1196 | 22.1 | 4420 | 0.6573 | 0.4029 | 0.4845 | 0.8736 | nan | 0.8246 | 0.9584 | 0.7864 | 0.8363 | 0.5195 | nan | 0.6933 | 0.7868 | 0.1454 | 0.9523 | 0.1428 | 0.0 | 0.7212 | 0.0582 | 0.7861 | 0.0 | 0.0001 | 0.9330 | 0.2043 | 0.6240 | 0.3923 | 0.2184 | nan | 0.2291 | 0.6001 | 0.5717 | 0.0 | 0.9499 | 0.8588 | 0.9833 | 0.0066 | 0.2396 | 0.4826 | 0.0 | nan | 0.7455 | 0.8696 | 0.6918 | 0.7157 | 0.3893 | nan | 0.5780 | 0.6101 | 0.0592 | 0.8655 | 0.1215 | 0.0 | 0.4956 | 0.0582 | 0.5283 | 0.0 | 0.0001 | 0.7669 | 0.1772 | 0.5426 | 0.3298 | 0.1941 | nan | 0.1912 | 0.4937 | 0.3772 | 0.0 | 0.8886 | 0.7717 | 0.9503 | 0.0059 | 0.0940 | 0.3812 | 0.0 | | 0.0916 | 22.2 | 4440 | 0.6671 | 0.4016 | 0.4835 | 0.8750 | nan | 0.8546 | 0.9526 | 0.7848 | 0.8181 | 0.5306 | nan | 0.7050 | 0.7740 | 0.0639 | 0.9540 | 0.1298 | 0.0 | 
0.7229 | 0.0259 | 0.7771 | 0.0 | 0.0 | 0.9282 | 0.1856 | 0.6414 | 0.4127 | 0.1601 | nan | 0.3675 | 0.6012 | 0.6054 | 0.0 | 0.9586 | 0.8275 | 0.9830 | 0.0156 | 0.2546 | 0.4364 | 0.0 | nan | 0.7593 | 0.8716 | 0.7117 | 0.7337 | 0.3855 | nan | 0.5776 | 0.6147 | 0.0252 | 0.8617 | 0.1121 | 0.0 | 0.4832 | 0.0259 | 0.5267 | 0.0 | 0.0 | 0.7674 | 0.1577 | 0.5443 | 0.3348 | 0.1446 | nan | 0.2485 | 0.4981 | 0.3778 | 0.0 | 0.8851 | 0.7590 | 0.9504 | 0.0125 | 0.1175 | 0.3647 | 0.0 | | 0.0867 | 22.3 | 4460 | 0.6636 | 0.4042 | 0.4808 | 0.8755 | nan | 0.8381 | 0.9610 | 0.7599 | 0.8121 | 0.5300 | nan | 0.6977 | 0.8275 | 0.0271 | 0.9569 | 0.1220 | 0.0 | 0.7163 | 0.0654 | 0.7150 | 0.0 | 0.0 | 0.9224 | 0.2333 | 0.6599 | 0.3537 | 0.1412 | nan | 0.3349 | 0.5894 | 0.5763 | 0.0 | 0.9542 | 0.8557 | 0.9837 | 0.0079 | 0.2481 | 0.4951 | 0.0 | nan | 0.7549 | 0.8665 | 0.7145 | 0.7176 | 0.4058 | nan | 0.5809 | 0.6203 | 0.0147 | 0.8614 | 0.1082 | 0.0 | 0.4864 | 0.0654 | 0.5410 | 0.0 | 0.0 | 0.7728 | 0.2017 | 0.5630 | 0.3100 | 0.1315 | nan | 0.2410 | 0.4916 | 0.3759 | 0.0 | 0.8892 | 0.7703 | 0.9499 | 0.0067 | 0.1155 | 0.3761 | 0.0 | | 0.1021 | 22.4 | 4480 | 0.6624 | 0.4058 | 0.4860 | 0.8766 | nan | 0.8609 | 0.9531 | 0.7598 | 0.8181 | 0.5222 | nan | 0.6821 | 0.8110 | 0.1090 | 0.9507 | 0.1304 | 0.0 | 0.7379 | 0.0732 | 0.7369 | 0.0 | 0.0 | 0.9206 | 0.1910 | 0.6836 | 0.4226 | 0.2346 | nan | 0.2167 | 0.6100 | 0.5774 | 0.0 | 0.9527 | 0.8661 | 0.9817 | 0.0058 | 0.2534 | 0.4897 | 0.0 | nan | 0.7584 | 0.8716 | 0.7144 | 0.7334 | 0.3854 | nan | 0.5682 | 0.5957 | 0.0472 | 0.8653 | 0.1130 | 0.0 | 0.4996 | 0.0732 | 0.5125 | 0.0 | 0.0 | 0.7731 | 0.1765 | 0.5758 | 0.3386 | 0.1997 | nan | 0.1902 | 0.4950 | 0.3736 | 0.0 | 0.8902 | 0.7776 | 0.9509 | 0.0048 | 0.1255 | 0.3748 | 0.0 | | 0.0872 | 22.5 | 4500 | 0.6689 | 0.4063 | 0.4868 | 0.8745 | nan | 0.8574 | 0.9529 | 0.8159 | 0.8035 | 0.5175 | nan | 0.6869 | 0.7413 | 0.2261 | 0.9517 | 0.1434 | 0.0002 | 0.7643 | 0.0877 | 0.6973 | 0.0 | 0.0 | 0.9308 | 0.1619 | 0.5909 | 
0.4018 | 0.3157 | nan | 0.2351 | 0.6278 | 0.5717 | 0.0 | 0.9580 | 0.8582 | 0.9820 | 0.0176 | 0.2352 | 0.4442 | 0.0 | nan | 0.7623 | 0.8723 | 0.6897 | 0.7278 | 0.3847 | nan | 0.5636 | 0.6053 | 0.0667 | 0.8630 | 0.1199 | 0.0002 | 0.5100 | 0.0877 | 0.5358 | 0.0 | 0.0 | 0.7563 | 0.1431 | 0.5301 | 0.3409 | 0.2571 | nan | 0.1989 | 0.4989 | 0.3886 | 0.0 | 0.8874 | 0.7713 | 0.9510 | 0.0122 | 0.1084 | 0.3682 | 0.0 | | 0.0983 | 22.6 | 4520 | 0.7023 | 0.3976 | 0.4705 | 0.8686 | nan | 0.8058 | 0.9578 | 0.8044 | 0.8129 | 0.5313 | nan | 0.6656 | 0.8009 | 0.0983 | 0.9533 | 0.1359 | 0.0 | 0.5326 | 0.0680 | 0.6913 | 0.0 | 0.0016 | 0.9358 | 0.1274 | 0.5976 | 0.3870 | 0.3395 | nan | 0.1922 | 0.6224 | 0.5431 | 0.0 | 0.9529 | 0.8436 | 0.9845 | 0.0118 | 0.2043 | 0.4549 | 0.0 | nan | 0.7301 | 0.8665 | 0.6552 | 0.7167 | 0.3797 | nan | 0.5711 | 0.6217 | 0.0392 | 0.8612 | 0.1163 | 0.0 | 0.4311 | 0.0680 | 0.5479 | 0.0 | 0.0015 | 0.7544 | 0.1128 | 0.5237 | 0.3256 | 0.2716 | nan | 0.1567 | 0.4959 | 0.3898 | 0.0 | 0.8861 | 0.7586 | 0.9494 | 0.0068 | 0.1104 | 0.3756 | 0.0 | | 0.1209 | 22.7 | 4540 | 0.6888 | 0.4058 | 0.4847 | 0.8721 | nan | 0.8228 | 0.9569 | 0.7871 | 0.8190 | 0.5117 | nan | 0.7215 | 0.8375 | 0.0268 | 0.9549 | 0.1433 | 0.0 | 0.5613 | 0.1067 | 0.6863 | 0.0 | 0.0124 | 0.9308 | 0.1523 | 0.6273 | 0.4237 | 0.4561 | nan | 0.2320 | 0.6087 | 0.5332 | 0.0 | 0.9446 | 0.8729 | 0.9841 | 0.0056 | 0.3310 | 0.4608 | 0.0 | nan | 0.7441 | 0.8699 | 0.6695 | 0.7164 | 0.3835 | nan | 0.5778 | 0.6299 | 0.0147 | 0.8619 | 0.1226 | 0.0 | 0.4366 | 0.1066 | 0.5622 | 0.0 | 0.0117 | 0.7632 | 0.1349 | 0.5324 | 0.3502 | 0.3325 | nan | 0.1801 | 0.4954 | 0.3844 | 0.0 | 0.8894 | 0.7696 | 0.9505 | 0.0040 | 0.1223 | 0.3692 | 0.0 | | 0.0824 | 22.8 | 4560 | 0.6736 | 0.3993 | 0.4740 | 0.8745 | nan | 0.8310 | 0.9552 | 0.7977 | 0.8584 | 0.5197 | nan | 0.6920 | 0.8539 | 0.0106 | 0.9566 | 0.1252 | 0.0011 | 0.6397 | 0.0409 | 0.6607 | 0.0 | 0.0032 | 0.9340 | 0.1506 | 0.5987 | 0.3985 | 0.3083 | nan | 0.2090 | 0.5958 | 0.5371 
| 0.0 | 0.9536 | 0.8469 | 0.9855 | 0.0051 | 0.2170 | 0.4835 | 0.0 | nan | 0.7479 | 0.8744 | 0.6568 | 0.7333 | 0.3959 | nan | 0.5785 | 0.6096 | 0.0067 | 0.8609 | 0.1105 | 0.0011 | 0.4306 | 0.0409 | 0.5658 | 0.0 | 0.0031 | 0.7627 | 0.1352 | 0.5206 | 0.3350 | 0.2573 | nan | 0.1663 | 0.4924 | 0.3826 | 0.0 | 0.8878 | 0.7731 | 0.9471 | 0.0040 | 0.1089 | 0.3872 | 0.0 | | 0.0955 | 22.9 | 4580 | 0.6769 | 0.3973 | 0.4726 | 0.8750 | nan | 0.8395 | 0.9518 | 0.8094 | 0.8573 | 0.5389 | nan | 0.6844 | 0.8181 | 0.0240 | 0.9536 | 0.1233 | 0.0 | 0.6615 | 0.0566 | 0.6939 | 0.0 | 0.0015 | 0.9378 | 0.1305 | 0.6115 | 0.3674 | 0.2542 | nan | 0.1922 | 0.5850 | 0.5552 | 0.0 | 0.9538 | 0.8550 | 0.9818 | 0.0104 | 0.2114 | 0.4616 | 0.0 | nan | 0.7574 | 0.8781 | 0.6485 | 0.7476 | 0.4006 | nan | 0.5743 | 0.6027 | 0.0124 | 0.8644 | 0.1103 | 0.0 | 0.4160 | 0.0566 | 0.5664 | 0.0 | 0.0014 | 0.7584 | 0.1177 | 0.5146 | 0.3165 | 0.2226 | nan | 0.1647 | 0.4900 | 0.3889 | 0.0 | 0.8875 | 0.7725 | 0.9506 | 0.0059 | 0.1090 | 0.3789 | 0.0 | | 0.1196 | 23.0 | 4600 | 0.6728 | 0.4035 | 0.4759 | 0.8745 | nan | 0.8255 | 0.9565 | 0.7957 | 0.8567 | 0.5352 | nan | 0.7149 | 0.7660 | 0.0164 | 0.9594 | 0.1322 | 0.0020 | 0.7373 | 0.0622 | 0.6802 | 0.0 | 0.0034 | 0.9197 | 0.1516 | 0.6637 | 0.3872 | 0.2374 | nan | 0.2423 | 0.5818 | 0.5662 | 0.0 | 0.9616 | 0.8030 | 0.9793 | 0.0105 | 0.1726 | 0.5070 | 0.0 | nan | 0.7482 | 0.8715 | 0.6640 | 0.7457 | 0.4033 | nan | 0.5877 | 0.6187 | 0.0062 | 0.8592 | 0.1182 | 0.0020 | 0.4994 | 0.0622 | 0.5617 | 0.0 | 0.0033 | 0.7726 | 0.1381 | 0.5452 | 0.3192 | 0.2098 | nan | 0.1879 | 0.4902 | 0.3920 | 0.0 | 0.8796 | 0.7461 | 0.9529 | 0.0078 | 0.1243 | 0.3965 | 0.0 | | 0.1144 | 23.1 | 4620 | 0.6609 | 0.4076 | 0.4822 | 0.8764 | nan | 0.8449 | 0.9573 | 0.7680 | 0.8435 | 0.5289 | nan | 0.6911 | 0.8111 | 0.0271 | 0.9566 | 0.1334 | 0.0065 | 0.7120 | 0.1387 | 0.6763 | 0.0 | 0.0017 | 0.9177 | 0.1278 | 0.6233 | 0.4501 | 0.2858 | nan | 0.1839 | 0.6163 | 0.5656 | 0.0 | 0.9523 | 0.8649 | 0.9804 | 
0.0073 | 0.2385 | 0.5180 | 0.0 | nan | 0.7512 | 0.8727 | 0.6915 | 0.7513 | 0.4048 | nan | 0.5798 | 0.6187 | 0.0118 | 0.8619 | 0.1208 | 0.0065 | 0.4702 | 0.1386 | 0.5600 | 0.0 | 0.0016 | 0.7692 | 0.1193 | 0.5338 | 0.3470 | 0.2441 | nan | 0.1581 | 0.4946 | 0.3897 | 0.0 | 0.8891 | 0.7725 | 0.9528 | 0.0054 | 0.1337 | 0.3932 | 0.0 | | 0.1147 | 23.2 | 4640 | 0.6673 | 0.4042 | 0.4842 | 0.8765 | nan | 0.8462 | 0.9556 | 0.7864 | 0.8425 | 0.5362 | nan | 0.7036 | 0.8250 | 0.0328 | 0.9598 | 0.1363 | 0.0 | 0.7146 | 0.0945 | 0.6819 | 0.0 | 0.0 | 0.9310 | 0.1955 | 0.6343 | 0.4073 | 0.2949 | nan | 0.2135 | 0.6039 | 0.5779 | 0.0 | 0.9519 | 0.8486 | 0.9804 | 0.0079 | 0.2781 | 0.4553 | 0.0 | nan | 0.7555 | 0.8744 | 0.6860 | 0.7501 | 0.3970 | nan | 0.5739 | 0.6089 | 0.0172 | 0.8584 | 0.1163 | 0.0 | 0.4415 | 0.0944 | 0.5553 | 0.0 | 0.0 | 0.7657 | 0.1683 | 0.5395 | 0.3418 | 0.2349 | nan | 0.1732 | 0.4906 | 0.3635 | 0.0 | 0.8903 | 0.7753 | 0.9524 | 0.0064 | 0.1280 | 0.3766 | 0.0 | | 0.0797 | 23.3 | 4660 | 0.6934 | 0.4041 | 0.4799 | 0.8736 | nan | 0.8226 | 0.9545 | 0.7549 | 0.8589 | 0.5554 | nan | 0.6764 | 0.8197 | 0.0164 | 0.9529 | 0.1220 | 0.0 | 0.6507 | 0.0652 | 0.6900 | 0.0 | 0.0 | 0.9273 | 0.1986 | 0.6386 | 0.3753 | 0.2942 | nan | 0.2419 | 0.6171 | 0.5872 | 0.0 | 0.9531 | 0.8602 | 0.9845 | 0.0076 | 0.2604 | 0.4724 | 0.0 | nan | 0.7388 | 0.8706 | 0.6904 | 0.7298 | 0.3955 | nan | 0.5787 | 0.6149 | 0.0105 | 0.8611 | 0.1066 | 0.0 | 0.4802 | 0.0651 | 0.5646 | 0.0 | 0.0 | 0.7635 | 0.1704 | 0.5359 | 0.3274 | 0.2392 | nan | 0.1948 | 0.4917 | 0.3767 | 0.0 | 0.8902 | 0.7752 | 0.9505 | 0.0069 | 0.1253 | 0.3780 | 0.0 | | 0.0965 | 23.4 | 4680 | 0.7067 | 0.4014 | 0.4765 | 0.8717 | nan | 0.8330 | 0.9523 | 0.7460 | 0.8488 | 0.5634 | nan | 0.6929 | 0.8100 | 0.0555 | 0.9566 | 0.1322 | 0.0 | 0.6135 | 0.1201 | 0.7288 | 0.0 | 0.0 | 0.9314 | 0.1291 | 0.5895 | 0.3716 | 0.3378 | nan | 0.2263 | 0.5871 | 0.5389 | 0.0 | 0.9494 | 0.8527 | 0.9822 | 0.0012 | 0.2427 | 0.4541 | 0.0 | nan | 0.7438 | 0.8718 | 0.6884 
| 0.7370 | 0.3878 | nan | 0.5796 | 0.6060 | 0.0254 | 0.8590 | 0.1164 | 0.0 | 0.4471 | 0.1201 | 0.5629 | 0.0 | 0.0 | 0.7508 | 0.1147 | 0.4962 | 0.3251 | 0.2780 | nan | 0.1828 | 0.4891 | 0.3770 | 0.0 | 0.8855 | 0.7589 | 0.9526 | 0.0011 | 0.1225 | 0.3662 | 0.0 | | 0.1366 | 23.5 | 4700 | 0.6898 | 0.4134 | 0.4982 | 0.8730 | nan | 0.8253 | 0.9597 | 0.7906 | 0.8212 | 0.5107 | nan | 0.6995 | 0.8227 | 0.0240 | 0.9501 | 0.1438 | 0.0149 | 0.8054 | 0.2313 | 0.7367 | 0.0 | 0.0085 | 0.8985 | 0.3260 | 0.6925 | 0.3509 | 0.2820 | nan | 0.3289 | 0.6285 | 0.5403 | 0.0 | 0.9552 | 0.8376 | 0.9854 | 0.0036 | 0.2508 | 0.5173 | 0.0 | nan | 0.7431 | 0.8684 | 0.6978 | 0.7212 | 0.3905 | nan | 0.5702 | 0.6110 | 0.0107 | 0.8661 | 0.1260 | 0.0148 | 0.4738 | 0.2312 | 0.5671 | 0.0 | 0.0082 | 0.7703 | 0.2687 | 0.5575 | 0.3061 | 0.2209 | nan | 0.2486 | 0.4899 | 0.3764 | 0.0 | 0.8827 | 0.7573 | 0.9497 | 0.0033 | 0.1107 | 0.3857 | 0.0 | | 0.0908 | 23.6 | 4720 | 0.6770 | 0.4098 | 0.4881 | 0.8756 | nan | 0.8416 | 0.9511 | 0.7898 | 0.8537 | 0.5607 | nan | 0.6894 | 0.8402 | 0.0040 | 0.9572 | 0.1218 | 0.0037 | 0.6462 | 0.1959 | 0.6786 | 0.0 | 0.0095 | 0.9199 | 0.2052 | 0.6373 | 0.3794 | 0.3195 | nan | 0.3122 | 0.6081 | 0.5116 | 0.0 | 0.9548 | 0.8582 | 0.9822 | 0.0023 | 0.3116 | 0.4751 | 0.0 | nan | 0.7554 | 0.8787 | 0.6755 | 0.7475 | 0.3776 | nan | 0.5807 | 0.6166 | 0.0025 | 0.8624 | 0.1074 | 0.0037 | 0.4293 | 0.1958 | 0.5660 | 0.0 | 0.0091 | 0.7657 | 0.1809 | 0.5559 | 0.3133 | 0.2630 | nan | 0.2546 | 0.4961 | 0.3786 | 0.0 | 0.8875 | 0.7730 | 0.9517 | 0.0019 | 0.1079 | 0.3751 | 0.0 | | 0.1014 | 23.7 | 4740 | 0.6535 | 0.4063 | 0.4823 | 0.8735 | nan | 0.8430 | 0.9501 | 0.8058 | 0.8021 | 0.5354 | nan | 0.7165 | 0.8378 | 0.0109 | 0.9548 | 0.1042 | 0.0133 | 0.6325 | 0.0786 | 0.6767 | 0.0 | 0.0003 | 0.9215 | 0.1791 | 0.6628 | 0.3357 | 0.3620 | nan | 0.3204 | 0.6167 | 0.5248 | 0.0 | 0.9541 | 0.8546 | 0.9855 | 0.0132 | 0.2390 | 0.5036 | 0.0 | nan | 0.7544 | 0.8748 | 0.6597 | 0.7311 | 0.3679 | nan | 0.5745 | 
0.6423 | 0.0075 | 0.8625 | 0.0925 | 0.0133 | 0.4476 | 0.0785 | 0.5741 | 0.0 | 0.0003 | 0.7645 | 0.1578 | 0.5596 | 0.2964 | 0.2935 | nan | 0.2556 | 0.4971 | 0.3898 | 0.0 | 0.8873 | 0.7724 | 0.9498 | 0.0085 | 0.1011 | 0.3875 | 0.0 | | 0.102 | 23.8 | 4760 | 0.6747 | 0.4017 | 0.4770 | 0.8742 | nan | 0.8188 | 0.9561 | 0.8030 | 0.8568 | 0.5305 | nan | 0.6926 | 0.8115 | 0.0258 | 0.9537 | 0.0933 | 0.0 | 0.6117 | 0.0918 | 0.7096 | 0.0 | 0.0003 | 0.9281 | 0.1678 | 0.6539 | 0.3376 | 0.3058 | nan | 0.3111 | 0.6180 | 0.5367 | 0.0 | 0.9519 | 0.8692 | 0.9838 | 0.0037 | 0.1622 | 0.4779 | 0.0 | nan | 0.7398 | 0.8785 | 0.6542 | 0.7370 | 0.3722 | nan | 0.5798 | 0.5885 | 0.0108 | 0.8663 | 0.0825 | 0.0 | 0.4224 | 0.0918 | 0.5818 | 0.0 | 0.0003 | 0.7673 | 0.1502 | 0.5593 | 0.2970 | 0.2544 | nan | 0.2420 | 0.4978 | 0.3933 | 0.0 | 0.8884 | 0.7793 | 0.9499 | 0.0026 | 0.0884 | 0.3787 | 0.0 | | 0.1469 | 23.9 | 4780 | 0.6532 | 0.4060 | 0.4836 | 0.8787 | nan | 0.8692 | 0.9580 | 0.7645 | 0.8406 | 0.5217 | nan | 0.6817 | 0.7427 | 0.0926 | 0.9555 | 0.1089 | 0.0 | 0.5908 | 0.1198 | 0.7532 | 0.0 | 0.0020 | 0.9314 | 0.2335 | 0.6322 | 0.3620 | 0.3100 | nan | 0.3448 | 0.5898 | 0.5816 | 0.0 | 0.9518 | 0.8482 | 0.9816 | 0.0056 | 0.2243 | 0.4756 | 0.0 | nan | 0.7724 | 0.8805 | 0.6834 | 0.7630 | 0.3912 | nan | 0.5737 | 0.5794 | 0.0273 | 0.8663 | 0.0947 | 0.0 | 0.4012 | 0.1198 | 0.5659 | 0.0 | 0.0020 | 0.7664 | 0.2007 | 0.5522 | 0.3059 | 0.2548 | nan | 0.2398 | 0.4910 | 0.3870 | 0.0 | 0.8851 | 0.7671 | 0.9514 | 0.0042 | 0.0953 | 0.3706 | 0.0 | | 0.0832 | 24.0 | 4800 | 0.6606 | 0.4047 | 0.4815 | 0.8758 | nan | 0.8290 | 0.9556 | 0.7845 | 0.8629 | 0.5526 | nan | 0.6877 | 0.8289 | 0.0126 | 0.9567 | 0.1224 | 0.0 | 0.6626 | 0.1311 | 0.6637 | 0.0 | 0.0090 | 0.9326 | 0.0793 | 0.6616 | 0.3595 | 0.3732 | nan | 0.2827 | 0.6086 | 0.5777 | 0.0 | 0.9524 | 0.8613 | 0.9847 | 0.0140 | 0.2216 | 0.4388 | 0.0 | nan | 0.7495 | 0.8800 | 0.6644 | 0.7370 | 0.3917 | nan | 0.5861 | 0.6419 | 0.0082 | 0.8640 | 0.1073 | 0.0 | 0.4011 | 
0.1310 | 0.5753 | 0.0 | 0.0089 | 0.7656 | 0.0747 | 0.5575 | 0.3065 | 0.2953 | nan | 0.2261 | 0.4960 | 0.3872 | 0.0 | 0.8866 | 0.7757 | 0.9502 | 0.0096 | 0.1024 | 0.3698 | 0.0 | | 0.1054 | 24.1 | 4820 | 0.6621 | 0.4131 | 0.4917 | 0.8779 | nan | 0.8668 | 0.9545 | 0.7446 | 0.8404 | 0.5394 | nan | 0.6960 | 0.8436 | 0.0180 | 0.9535 | 0.1369 | 0.0 | 0.6041 | 0.2740 | 0.6958 | 0.0 | 0.0260 | 0.9239 | 0.2214 | 0.6427 | 0.3805 | 0.2423 | nan | 0.4048 | 0.5851 | 0.5905 | 0.0 | 0.9549 | 0.8283 | 0.9830 | 0.0152 | 0.2682 | 0.5001 | 0.0 | nan | 0.7693 | 0.8823 | 0.6863 | 0.7551 | 0.3808 | nan | 0.5851 | 0.6125 | 0.0100 | 0.8680 | 0.1182 | 0.0 | 0.3845 | 0.2736 | 0.5852 | 0.0 | 0.0255 | 0.7721 | 0.1944 | 0.5505 | 0.3180 | 0.2073 | nan | 0.2891 | 0.4955 | 0.3669 | 0.0 | 0.8834 | 0.7560 | 0.9512 | 0.0100 | 0.1112 | 0.3785 | 0.0 | | 0.0628 | 24.2 | 4840 | 0.6659 | 0.4126 | 0.4973 | 0.8764 | nan | 0.8482 | 0.9532 | 0.8024 | 0.8445 | 0.5336 | nan | 0.6979 | 0.8672 | 0.0248 | 0.9586 | 0.1514 | 0.0 | 0.6141 | 0.2115 | 0.6955 | 0.0 | 0.0052 | 0.9133 | 0.2168 | 0.6544 | 0.4175 | 0.3947 | nan | 0.3573 | 0.6188 | 0.5859 | 0.0 | 0.9544 | 0.8420 | 0.9826 | 0.0076 | 0.2786 | 0.4811 | 0.0 | nan | 0.7621 | 0.8762 | 0.6918 | 0.7523 | 0.3880 | nan | 0.5765 | 0.5741 | 0.0149 | 0.8643 | 0.1291 | 0.0 | 0.4119 | 0.2112 | 0.5678 | 0.0 | 0.0052 | 0.7720 | 0.1904 | 0.5424 | 0.3399 | 0.3102 | nan | 0.2791 | 0.4959 | 0.3576 | 0.0 | 0.8856 | 0.7586 | 0.9511 | 0.0060 | 0.1135 | 0.3753 | 0.0 | | 0.0991 | 24.3 | 4860 | 0.6955 | 0.4082 | 0.4875 | 0.8761 | nan | 0.8536 | 0.9518 | 0.8091 | 0.8414 | 0.5575 | nan | 0.6870 | 0.8289 | 0.0324 | 0.9480 | 0.1085 | 0.0 | 0.5813 | 0.1625 | 0.7587 | 0.0 | 0.0232 | 0.9303 | 0.2009 | 0.6298 | 0.3871 | 0.4062 | nan | 0.2810 | 0.5868 | 0.5593 | 0.0 | 0.9548 | 0.8246 | 0.9849 | 0.0032 | 0.2298 | 0.4767 | 0.0 | nan | 0.7655 | 0.8783 | 0.6721 | 0.7540 | 0.3832 | nan | 0.5760 | 0.5723 | 0.0179 | 0.8676 | 0.0964 | 0.0 | 0.4293 | 0.1624 | 0.5609 | 0.0 | 0.0230 | 0.7676 | 0.1764 | 
0.5444 | 0.3243 | 0.3140 | nan | 0.2434 | 0.4931 | 0.3775 | 0.0 | 0.8835 | 0.7535 | 0.9503 | 0.0026 | 0.0972 | 0.3749 | 0.0 | | 0.1244 | 24.4 | 4880 | 0.6913 | 0.4070 | 0.4834 | 0.8750 | nan | 0.8484 | 0.9514 | 0.7952 | 0.8303 | 0.5339 | nan | 0.7220 | 0.8061 | 0.0202 | 0.9484 | 0.1076 | 0.0185 | 0.5105 | 0.1573 | 0.7423 | 0.0 | 0.0147 | 0.9346 | 0.2283 | 0.6113 | 0.3916 | 0.3409 | nan | 0.3059 | 0.6206 | 0.5449 | 0.0 | 0.9550 | 0.8421 | 0.9806 | 0.0050 | 0.2446 | 0.4554 | 0.0 | nan | 0.7589 | 0.8740 | 0.6844 | 0.7453 | 0.3829 | nan | 0.5773 | 0.6150 | 0.0107 | 0.8623 | 0.0949 | 0.0184 | 0.3722 | 0.1573 | 0.5755 | 0.0 | 0.0146 | 0.7645 | 0.1975 | 0.5299 | 0.3193 | 0.2635 | nan | 0.2408 | 0.4983 | 0.3800 | 0.0 | 0.8852 | 0.7619 | 0.9523 | 0.0040 | 0.1092 | 0.3730 | 0.0 | | 0.1156 | 24.5 | 4900 | 0.6872 | 0.4104 | 0.4857 | 0.8740 | nan | 0.8472 | 0.9539 | 0.8035 | 0.7991 | 0.5354 | nan | 0.6961 | 0.8381 | 0.0211 | 0.9500 | 0.1042 | 0.0164 | 0.5515 | 0.2262 | 0.7248 | 0.0 | 0.0216 | 0.9260 | 0.1495 | 0.6373 | 0.4013 | 0.3315 | nan | 0.3628 | 0.5909 | 0.5302 | 0.0 | 0.9539 | 0.8443 | 0.9846 | 0.0044 | 0.2430 | 0.4948 | 0.0 | nan | 0.7509 | 0.8737 | 0.6822 | 0.7162 | 0.3894 | nan | 0.5797 | 0.6099 | 0.0130 | 0.8638 | 0.0916 | 0.0164 | 0.4414 | 0.2259 | 0.5915 | 0.0 | 0.0215 | 0.7678 | 0.1346 | 0.5359 | 0.3177 | 0.2581 | nan | 0.2672 | 0.4943 | 0.3782 | 0.0 | 0.8868 | 0.7687 | 0.9510 | 0.0034 | 0.1181 | 0.3842 | 0.0 | | 0.1161 | 24.6 | 4920 | 0.6739 | 0.4109 | 0.4828 | 0.8753 | nan | 0.8585 | 0.9563 | 0.7776 | 0.8028 | 0.5214 | nan | 0.6980 | 0.8283 | 0.0100 | 0.9572 | 0.1064 | 0.0 | 0.5024 | 0.2197 | 0.7194 | 0.0 | 0.0005 | 0.9343 | 0.1388 | 0.6306 | 0.4056 | 0.2956 | nan | 0.4097 | 0.6059 | 0.5564 | 0.0 | 0.9504 | 0.8251 | 0.9853 | 0.0201 | 0.2503 | 0.4829 | 0.0 | nan | 0.7581 | 0.8764 | 0.6901 | 0.7238 | 0.4003 | nan | 0.5772 | 0.6326 | 0.0064 | 0.8622 | 0.0935 | 0.0 | 0.4033 | 0.2191 | 0.5943 | 0.0 | 0.0005 | 0.7664 | 0.1255 | 0.5445 | 0.3369 | 0.2515 | nan | 0.3306 
| 0.5028 | 0.3723 | 0.0 | 0.8853 | 0.7556 | 0.9504 | 0.0122 | 0.1017 | 0.3758 | 0.0 | | 0.0637 | 24.7 | 4940 | 0.6632 | 0.4097 | 0.4835 | 0.8764 | nan | 0.8464 | 0.9557 | 0.8120 | 0.8365 | 0.5221 | nan | 0.6857 | 0.8429 | 0.0299 | 0.9485 | 0.1338 | 0.0 | 0.4987 | 0.1103 | 0.7168 | 0.0 | 0.0032 | 0.9244 | 0.1512 | 0.6448 | 0.3985 | 0.3803 | nan | 0.3515 | 0.6227 | 0.5915 | 0.0 | 0.9588 | 0.8445 | 0.9812 | 0.0259 | 0.1926 | 0.4600 | 0.0 | nan | 0.7592 | 0.8776 | 0.6662 | 0.7446 | 0.4036 | nan | 0.5652 | 0.6249 | 0.0187 | 0.8637 | 0.1150 | 0.0 | 0.4005 | 0.1101 | 0.5944 | 0.0 | 0.0032 | 0.7679 | 0.1344 | 0.5437 | 0.3184 | 0.3012 | nan | 0.2867 | 0.4977 | 0.3855 | 0.0 | 0.8851 | 0.7606 | 0.9509 | 0.0149 | 0.1346 | 0.3814 | 0.0 | | 0.0985 | 24.8 | 4960 | 0.6682 | 0.4116 | 0.4878 | 0.8794 | nan | 0.8668 | 0.9547 | 0.7974 | 0.8458 | 0.5332 | nan | 0.6825 | 0.8161 | 0.0215 | 0.9582 | 0.1394 | 0.0 | 0.6203 | 0.1289 | 0.6870 | 0.0 | 0.0065 | 0.9294 | 0.1776 | 0.6455 | 0.3973 | 0.4017 | nan | 0.2997 | 0.6134 | 0.5755 | 0.0 | 0.9516 | 0.8513 | 0.9842 | 0.0146 | 0.2445 | 0.4660 | 0.0 | nan | 0.7747 | 0.8812 | 0.6855 | 0.7584 | 0.3930 | nan | 0.5663 | 0.6446 | 0.0122 | 0.8604 | 0.1209 | 0.0 | 0.4040 | 0.1288 | 0.5813 | 0.0 | 0.0065 | 0.7709 | 0.1549 | 0.5461 | 0.3154 | 0.3114 | nan | 0.2470 | 0.4980 | 0.3755 | 0.0 | 0.8899 | 0.7729 | 0.9505 | 0.0092 | 0.1344 | 0.3784 | 0.0 | | 0.0761 | 24.9 | 4980 | 0.6739 | 0.4110 | 0.4890 | 0.8768 | nan | 0.8640 | 0.9506 | 0.7996 | 0.8113 | 0.5416 | nan | 0.6980 | 0.8474 | 0.0488 | 0.9506 | 0.1134 | 0.0036 | 0.6406 | 0.1765 | 0.7130 | 0.0 | 0.0 | 0.9279 | 0.2156 | 0.6395 | 0.3693 | 0.2851 | nan | 0.3728 | 0.5819 | 0.5480 | 0.0 | 0.9541 | 0.8624 | 0.9829 | 0.0202 | 0.2360 | 0.4945 | 0.0 | nan | 0.7687 | 0.8807 | 0.6984 | 0.7350 | 0.3854 | nan | 0.5674 | 0.6164 | 0.0271 | 0.8653 | 0.0989 | 0.0036 | 0.4341 | 0.1763 | 0.5854 | 0.0 | 0.0 | 0.7705 | 0.1856 | 0.5469 | 0.3020 | 0.2374 | nan | 0.2850 | 0.4921 | 0.3710 | 0.0 | 0.8867 | 0.7714 | 0.9507 | 
0.0110 | 0.1154 | 0.3827 | 0.0 | | 0.0735 | 25.0 | 5000 | 0.6830 | 0.4082 | 0.4872 | 0.8746 | nan | 0.8526 | 0.9559 | 0.8056 | 0.7894 | 0.5199 | nan | 0.6850 | 0.8030 | 0.1254 | 0.9522 | 0.1047 | 0.0 | 0.6198 | 0.1374 | 0.7549 | 0.0 | 0.0015 | 0.9279 | 0.1950 | 0.6612 | 0.3706 | 0.2416 | nan | 0.3857 | 0.6293 | 0.5656 | 0.0 | 0.9538 | 0.8347 | 0.9810 | 0.0132 | 0.2424 | 0.4825 | 0.0 | nan | 0.7567 | 0.8744 | 0.6918 | 0.7206 | 0.3850 | nan | 0.5699 | 0.6128 | 0.0516 | 0.8634 | 0.0923 | 0.0 | 0.4538 | 0.1373 | 0.5654 | 0.0 | 0.0014 | 0.7704 | 0.1717 | 0.5527 | 0.3110 | 0.2065 | nan | 0.2869 | 0.4934 | 0.3816 | 0.0 | 0.8848 | 0.7593 | 0.9523 | 0.0082 | 0.1241 | 0.3836 | 0.0 | | 0.0938 | 25.1 | 5020 | 0.6645 | 0.4116 | 0.4882 | 0.8781 | nan | 0.8499 | 0.9551 | 0.8119 | 0.8610 | 0.5020 | nan | 0.6796 | 0.8278 | 0.0481 | 0.9580 | 0.1147 | 0.0 | 0.6345 | 0.1092 | 0.7140 | 0.0 | 0.0107 | 0.9178 | 0.1656 | 0.6692 | 0.4067 | 0.3954 | nan | 0.2975 | 0.6107 | 0.5512 | 0.0 | 0.9543 | 0.8624 | 0.9816 | 0.0058 | 0.2392 | 0.4898 | 0.0 | nan | 0.7644 | 0.8812 | 0.6835 | 0.7389 | 0.3873 | nan | 0.5716 | 0.6184 | 0.0260 | 0.8597 | 0.1006 | 0.0 | 0.4904 | 0.1090 | 0.5648 | 0.0 | 0.0104 | 0.7717 | 0.1496 | 0.5551 | 0.3267 | 0.3090 | nan | 0.2399 | 0.5007 | 0.3807 | 0.0 | 0.8883 | 0.7700 | 0.9509 | 0.0040 | 0.1360 | 0.3809 | 0.0 | | 0.093 | 25.2 | 5040 | 0.6889 | 0.4094 | 0.4870 | 0.8747 | nan | 0.8459 | 0.9616 | 0.7975 | 0.7967 | 0.5299 | nan | 0.6816 | 0.8436 | 0.0369 | 0.9504 | 0.1119 | 0.0 | 0.5495 | 0.1542 | 0.7090 | 0.0 | 0.0110 | 0.9289 | 0.2162 | 0.6300 | 0.3669 | 0.4266 | nan | 0.3106 | 0.6245 | 0.5737 | 0.0 | 0.9493 | 0.8377 | 0.9829 | 0.0034 | 0.2728 | 0.4795 | 0.0 | nan | 0.7568 | 0.8718 | 0.6990 | 0.7256 | 0.3905 | nan | 0.5750 | 0.5860 | 0.0194 | 0.8629 | 0.0962 | 0.0 | 0.4309 | 0.1541 | 0.5898 | 0.0 | 0.0107 | 0.7672 | 0.1901 | 0.5447 | 0.3203 | 0.3227 | nan | 0.2274 | 0.4974 | 0.3704 | 0.0 | 0.8879 | 0.7670 | 0.9509 | 0.0023 | 0.1111 | 0.3720 | 0.0 | | 0.075 | 25.3 | 
5060 | 0.6667 | 0.4123 | 0.4879 | 0.8759 | nan | 0.8341 | 0.9488 | 0.8000 | 0.8593 | 0.5572 | nan | 0.7305 | 0.8057 | 0.0129 | 0.9501 | 0.1211 | 0.0 | 0.5537 | 0.1523 | 0.6850 | 0.0 | 0.0085 | 0.9328 | 0.1743 | 0.6494 | 0.3818 | 0.4368 | nan | 0.3142 | 0.5996 | 0.5596 | 0.0 | 0.9507 | 0.8710 | 0.9842 | 0.0056 | 0.2599 | 0.4734 | 0.0 | nan | 0.7514 | 0.8764 | 0.7078 | 0.7405 | 0.3926 | nan | 0.5763 | 0.6279 | 0.0074 | 0.8605 | 0.1038 | 0.0 | 0.4455 | 0.1521 | 0.5823 | 0.0 | 0.0081 | 0.7689 | 0.1574 | 0.5531 | 0.3275 | 0.3353 | nan | 0.2247 | 0.4981 | 0.3831 | 0.0 | 0.8894 | 0.7808 | 0.9505 | 0.0039 | 0.1115 | 0.3755 | 0.0 | | 0.0883 | 25.4 | 5080 | 0.6826 | 0.4071 | 0.4811 | 0.8782 | nan | 0.8498 | 0.9660 | 0.7923 | 0.8490 | 0.4725 | nan | 0.6617 | 0.8420 | 0.0195 | 0.9434 | 0.1262 | 0.0 | 0.6498 | 0.1398 | 0.6956 | 0.0 | 0.0 | 0.9290 | 0.1854 | 0.6294 | 0.3749 | 0.3810 | nan | 0.1998 | 0.6090 | 0.5451 | 0.0 | 0.9577 | 0.8428 | 0.9847 | 0.0020 | 0.2837 | 0.4641 | 0.0 | nan | 0.7571 | 0.8752 | 0.7074 | 0.7545 | 0.3956 | nan | 0.5682 | 0.5973 | 0.0116 | 0.8641 | 0.1088 | 0.0 | 0.4206 | 0.1397 | 0.5925 | 0.0 | 0.0 | 0.7698 | 0.1680 | 0.5493 | 0.3142 | 0.3030 | nan | 0.1681 | 0.4992 | 0.3759 | 0.0 | 0.8855 | 0.7756 | 0.9511 | 0.0017 | 0.1063 | 0.3670 | 0.0 | | 0.0816 | 25.5 | 5100 | 0.6807 | 0.4114 | 0.4955 | 0.8749 | nan | 0.8345 | 0.9478 | 0.8324 | 0.8536 | 0.5572 | nan | 0.7033 | 0.7725 | 0.1826 | 0.9557 | 0.1266 | 0.0 | 0.7428 | 0.0977 | 0.7280 | 0.0 | 0.0050 | 0.9258 | 0.2657 | 0.6480 | 0.3696 | 0.3062 | nan | 0.3317 | 0.6083 | 0.5698 | 0.0 | 0.9562 | 0.8370 | 0.9822 | 0.0077 | 0.2130 | 0.4945 | 0.0 | nan | 0.7530 | 0.8759 | 0.6533 | 0.7513 | 0.4000 | nan | 0.5683 | 0.6028 | 0.0512 | 0.8618 | 0.1103 | 0.0 | 0.5336 | 0.0977 | 0.5827 | 0.0 | 0.0049 | 0.7748 | 0.2246 | 0.5527 | 0.3123 | 0.2552 | nan | 0.2516 | 0.4962 | 0.3681 | 0.0 | 0.8845 | 0.7606 | 0.9528 | 0.0048 | 0.0989 | 0.3816 | 0.0 | | 0.0916 | 25.6 | 5120 | 0.7149 | 0.4123 | 0.4943 | 0.8733 | nan | 0.8150 | 
0.9578 | 0.8043 | 0.8584 | 0.5208 | nan | 0.6757 | 0.7541 | 0.2774 | 0.9568 | 0.1717 | 0.0 | 0.6621 | 0.0629 | 0.7186 | 0.0 | 0.0065 | 0.9310 | 0.2619 | 0.6287 | 0.3896 | 0.3775 | nan | 0.2891 | 0.6290 | 0.5645 | 0.0 | 0.9525 | 0.8389 | 0.9825 | 0.0055 | 0.2428 | 0.4810 | 0.0 | nan | 0.7377 | 0.8732 | 0.6804 | 0.7161 | 0.3912 | nan | 0.5710 | 0.6170 | 0.0662 | 0.8598 | 0.1508 | 0.0 | 0.4993 | 0.0629 | 0.5755 | 0.0 | 0.0064 | 0.7720 | 0.2194 | 0.5453 | 0.3182 | 0.3081 | nan | 0.2384 | 0.4975 | 0.3752 | 0.0 | 0.8870 | 0.7674 | 0.9532 | 0.0039 | 0.1123 | 0.3868 | 0.0 | | 0.0756 | 25.7 | 5140 | 0.6823 | 0.4106 | 0.4847 | 0.8764 | nan | 0.8456 | 0.9563 | 0.8080 | 0.8325 | 0.5236 | nan | 0.7090 | 0.8032 | 0.0222 | 0.9552 | 0.1291 | 0.0 | 0.6489 | 0.1083 | 0.6883 | 0.0 | 0.0041 | 0.9370 | 0.2403 | 0.6077 | 0.3606 | 0.3683 | nan | 0.3181 | 0.6060 | 0.5516 | 0.0 | 0.9540 | 0.8402 | 0.9832 | 0.0093 | 0.2087 | 0.4923 | 0.0 | nan | 0.7600 | 0.8755 | 0.7023 | 0.7397 | 0.4002 | nan | 0.5797 | 0.6311 | 0.0085 | 0.8632 | 0.1114 | 0.0 | 0.4545 | 0.1082 | 0.5811 | 0.0 | 0.0039 | 0.7603 | 0.2042 | 0.5239 | 0.3056 | 0.2967 | nan | 0.2467 | 0.4979 | 0.3758 | 0.0 | 0.8863 | 0.7610 | 0.9513 | 0.0073 | 0.1088 | 0.3932 | 0.0 | | 0.0803 | 25.8 | 5160 | 0.6809 | 0.4101 | 0.4872 | 0.8765 | nan | 0.8516 | 0.9534 | 0.7921 | 0.8191 | 0.5660 | nan | 0.6904 | 0.8421 | 0.0473 | 0.9555 | 0.1229 | 0.0 | 0.6922 | 0.0765 | 0.7293 | 0.0 | 0.0033 | 0.9232 | 0.1971 | 0.6275 | 0.3768 | 0.3648 | nan | 0.2874 | 0.5885 | 0.5604 | 0.0 | 0.9569 | 0.8549 | 0.9839 | 0.0082 | 0.2141 | 0.5045 | 0.0 | nan | 0.7598 | 0.8736 | 0.7219 | 0.7322 | 0.4040 | nan | 0.5802 | 0.6153 | 0.0208 | 0.8647 | 0.1069 | 0.0 | 0.4975 | 0.0765 | 0.5739 | 0.0 | 0.0033 | 0.7707 | 0.1722 | 0.5427 | 0.3144 | 0.2992 | nan | 0.2225 | 0.4932 | 0.3685 | 0.0 | 0.8873 | 0.7701 | 0.9497 | 0.0060 | 0.1062 | 0.3894 | 0.0 | | 0.0942 | 25.9 | 5180 | 0.6411 | 0.4150 | 0.4918 | 0.8804 | nan | 0.8641 | 0.9544 | 0.8129 | 0.8409 | 0.5218 | nan | 0.7097 | 
0.8362 | 0.0164 | 0.9534 | 0.1471 | 0.0 | 0.6927 | 0.0914 | 0.7230 | 0.0 | 0.0008 | 0.9245 | 0.2612 | 0.6589 | 0.3817 | 0.3129 | nan | 0.2753 | 0.6200 | 0.5529 | 0.0 | 0.9531 | 0.8723 | 0.9840 | 0.0115 | 0.2889 | 0.4774 | 0.0 | nan | 0.7693 | 0.8800 | 0.7346 | 0.7522 | 0.3931 | nan | 0.5823 | 0.6294 | 0.0100 | 0.8652 | 0.1280 | 0.0 | 0.4820 | 0.0913 | 0.5905 | 0.0 | 0.0008 | 0.7745 | 0.2181 | 0.5536 | 0.3162 | 0.2624 | nan | 0.2308 | 0.5001 | 0.3704 | 0.0 | 0.8909 | 0.7826 | 0.9502 | 0.0081 | 0.1282 | 0.3862 | 0.0 | | 0.0807 | 26.0 | 5200 | 0.6570 | 0.4128 | 0.4904 | 0.8781 | nan | 0.8506 | 0.9551 | 0.8030 | 0.8504 | 0.5486 | nan | 0.6728 | 0.8390 | 0.0246 | 0.9582 | 0.1389 | 0.0 | 0.7115 | 0.0633 | 0.7197 | 0.0 | 0.0044 | 0.9195 | 0.2165 | 0.6409 | 0.3865 | 0.4242 | nan | 0.2336 | 0.6239 | 0.5434 | 0.0 | 0.9546 | 0.8558 | 0.9826 | 0.0088 | 0.2644 | 0.4987 | 0.0 | nan | 0.7612 | 0.8776 | 0.7084 | 0.7527 | 0.3977 | nan | 0.5819 | 0.6278 | 0.0145 | 0.8635 | 0.1234 | 0.0 | 0.4967 | 0.0632 | 0.5861 | 0.0 | 0.0043 | 0.7734 | 0.1857 | 0.5453 | 0.3198 | 0.3231 | nan | 0.1996 | 0.5033 | 0.3803 | 0.0 | 0.8897 | 0.7765 | 0.9515 | 0.0058 | 0.1138 | 0.3839 | 0.0 | | 0.0989 | 26.1 | 5220 | 0.6766 | 0.4118 | 0.4929 | 0.8761 | nan | 0.8360 | 0.9501 | 0.7735 | 0.8566 | 0.5484 | nan | 0.7156 | 0.7947 | 0.1476 | 0.9545 | 0.1832 | 0.0 | 0.7047 | 0.0546 | 0.7455 | 0.0 | 0.0060 | 0.9283 | 0.1483 | 0.6515 | 0.4213 | 0.4199 | nan | 0.2458 | 0.6013 | 0.5454 | 0.0 | 0.9561 | 0.8676 | 0.9824 | 0.0061 | 0.2510 | 0.4768 | 0.0 | nan | 0.7478 | 0.8754 | 0.7067 | 0.7406 | 0.3945 | nan | 0.5775 | 0.6051 | 0.0539 | 0.8656 | 0.1609 | 0.0 | 0.4929 | 0.0546 | 0.5695 | 0.0 | 0.0059 | 0.7716 | 0.1330 | 0.5479 | 0.3357 | 0.3199 | nan | 0.2093 | 0.5010 | 0.3795 | 0.0 | 0.8896 | 0.7777 | 0.9525 | 0.0044 | 0.1170 | 0.3885 | 0.0 | | 0.0975 | 26.2 | 5240 | 0.6686 | 0.4147 | 0.4966 | 0.8766 | nan | 0.8481 | 0.9563 | 0.7830 | 0.8141 | 0.5606 | nan | 0.7064 | 0.8431 | 0.0839 | 0.9516 | 0.2203 | 0.0 | 0.6956 | 
0.0612 | 0.7290 | 0.0 | 0.0090 | 0.9187 | 0.2126 | 0.6371 | 0.4440 | 0.4329 | nan | 0.2754 | 0.6159 | 0.5759 | 0.0 | 0.9525 | 0.8437 | 0.9847 | 0.0022 | 0.2114 | 0.5207 | 0.0 | nan | 0.7606 | 0.8729 | 0.7049 | 0.7360 | 0.3947 | nan | 0.5843 | 0.5929 | 0.0388 | 0.8677 | 0.1858 | 0.0 | 0.4800 | 0.0611 | 0.5925 | 0.0 | 0.0081 | 0.7734 | 0.1887 | 0.5505 | 0.3515 | 0.3299 | nan | 0.2270 | 0.5001 | 0.3717 | 0.0 | 0.8893 | 0.7697 | 0.9519 | 0.0017 | 0.0951 | 0.3903 | 0.0 | | 0.1222 | 26.3 | 5260 | 0.7007 | 0.4112 | 0.4917 | 0.8733 | nan | 0.8112 | 0.9544 | 0.7829 | 0.8609 | 0.5339 | nan | 0.6915 | 0.8542 | 0.0131 | 0.9557 | 0.1905 | 0.0 | 0.7522 | 0.0119 | 0.7064 | 0.0 | 0.0188 | 0.9257 | 0.1751 | 0.6510 | 0.4046 | 0.4786 | nan | 0.2755 | 0.6000 | 0.5597 | 0.0 | 0.9580 | 0.8450 | 0.9834 | 0.0081 | 0.2413 | 0.4900 | 0.0 | nan | 0.7343 | 0.8749 | 0.6988 | 0.6905 | 0.4023 | nan | 0.5853 | 0.6266 | 0.0117 | 0.8660 | 0.1613 | 0.0 | 0.4892 | 0.0119 | 0.6045 | 0.0 | 0.0182 | 0.7709 | 0.1540 | 0.5544 | 0.3348 | 0.3448 | nan | 0.2370 | 0.4994 | 0.3791 | 0.0 | 0.8864 | 0.7669 | 0.9530 | 0.0057 | 0.1116 | 0.3860 | 0.0 | | 0.1183 | 26.4 | 5280 | 0.6994 | 0.4117 | 0.4921 | 0.8735 | nan | 0.8099 | 0.9557 | 0.8170 | 0.8511 | 0.5359 | nan | 0.7151 | 0.8481 | 0.0280 | 0.9527 | 0.1402 | 0.0 | 0.7074 | 0.0186 | 0.7115 | 0.0 | 0.0120 | 0.9337 | 0.1862 | 0.6441 | 0.4187 | 0.4533 | nan | 0.3415 | 0.6097 | 0.5326 | 0.0 | 0.9519 | 0.8405 | 0.9825 | 0.0120 | 0.2630 | 0.4755 | 0.0 | nan | 0.7376 | 0.8751 | 0.6819 | 0.6959 | 0.4059 | nan | 0.5829 | 0.6393 | 0.0226 | 0.8642 | 0.1216 | 0.0 | 0.4882 | 0.0186 | 0.5983 | 0.0 | 0.0119 | 0.7705 | 0.1645 | 0.5578 | 0.3381 | 0.3383 | nan | 0.2654 | 0.5020 | 0.3845 | 0.0 | 0.8874 | 0.7637 | 0.9527 | 0.0081 | 0.1143 | 0.3839 | 0.0 | | 0.0751 | 26.5 | 5300 | 0.6749 | 0.4114 | 0.4891 | 0.8763 | nan | 0.8349 | 0.9526 | 0.7899 | 0.8599 | 0.5438 | nan | 0.7032 | 0.8391 | 0.0881 | 0.9516 | 0.1227 | 0.0 | 0.6688 | 0.0251 | 0.7336 | 0.0 | 0.0 | 0.9283 | 0.1750 | 
0.6450 | 0.3704 | 0.4431 | nan | 0.2957 | 0.6250 | 0.5594 | 0.0 | 0.9550 | 0.8531 | 0.9841 | 0.0046 | 0.1953 | 0.5032 | 0.0 | nan | 0.7504 | 0.8752 | 0.7089 | 0.7251 | 0.3993 | nan | 0.5903 | 0.6139 | 0.0498 | 0.8656 | 0.1065 | 0.0 | 0.4799 | 0.0251 | 0.5932 | 0.0 | 0.0 | 0.7705 | 0.1554 | 0.5509 | 0.3146 | 0.3413 | nan | 0.2379 | 0.5074 | 0.3865 | 0.0 | 0.8889 | 0.7779 | 0.9511 | 0.0034 | 0.1035 | 0.3907 | 0.0 | | 0.0861 | 26.6 | 5320 | 0.6729 | 0.4105 | 0.4900 | 0.8744 | nan | 0.8149 | 0.9547 | 0.8105 | 0.8521 | 0.5684 | nan | 0.6942 | 0.8356 | 0.0508 | 0.9531 | 0.1030 | 0.0 | 0.7563 | 0.0308 | 0.7031 | 0.0 | 0.0034 | 0.9281 | 0.2793 | 0.6487 | 0.3257 | 0.4017 | nan | 0.3096 | 0.6137 | 0.5375 | 0.0 | 0.9537 | 0.8597 | 0.9814 | 0.0027 | 0.2138 | 0.4945 | 0.0 | nan | 0.7427 | 0.8728 | 0.6707 | 0.7218 | 0.4035 | nan | 0.5825 | 0.6148 | 0.0303 | 0.8661 | 0.0904 | 0.0 | 0.5141 | 0.0308 | 0.5987 | 0.0 | 0.0034 | 0.7702 | 0.2363 | 0.5536 | 0.2826 | 0.3157 | nan | 0.2324 | 0.5074 | 0.3798 | 0.0 | 0.8882 | 0.7793 | 0.9523 | 0.0021 | 0.1075 | 0.3858 | 0.0 | | 0.1072 | 26.7 | 5340 | 0.6385 | 0.4165 | 0.5051 | 0.8771 | nan | 0.8402 | 0.9509 | 0.8062 | 0.8605 | 0.5389 | nan | 0.7122 | 0.8531 | 0.0539 | 0.9585 | 0.1279 | 0.0 | 0.8263 | 0.0386 | 0.7015 | 0.0 | 0.0135 | 0.9078 | 0.4718 | 0.6603 | 0.3479 | 0.3799 | nan | 0.3560 | 0.6282 | 0.5538 | 0.0 | 0.9560 | 0.8533 | 0.9839 | 0.0029 | 0.2515 | 0.5266 | 0.0 | nan | 0.7579 | 0.8816 | 0.6728 | 0.7196 | 0.3906 | nan | 0.5818 | 0.6118 | 0.0338 | 0.8640 | 0.1143 | 0.0 | 0.5220 | 0.0386 | 0.5987 | 0.0 | 0.0133 | 0.7761 | 0.3066 | 0.5636 | 0.2985 | 0.3097 | nan | 0.2655 | 0.5059 | 0.3742 | 0.0 | 0.8888 | 0.7756 | 0.9515 | 0.0024 | 0.1193 | 0.3895 | 0.0 | | 0.075 | 26.8 | 5360 | 0.6882 | 0.4178 | 0.4994 | 0.8770 | nan | 0.8276 | 0.9542 | 0.8144 | 0.8555 | 0.5746 | nan | 0.6765 | 0.8361 | 0.0220 | 0.9521 | 0.1819 | 0.0 | 0.7861 | 0.0414 | 0.7041 | 0.0 | 0.0025 | 0.9256 | 0.3586 | 0.6814 | 0.3652 | 0.3536 | nan | 0.3433 | 0.6089 | 
0.5657 | 0.0 | 0.9546 | 0.8467 | 0.9802 | 0.0030 | 0.2599 | 0.5062 | 0.0 | nan | 0.7502 | 0.8759 | 0.6701 | 0.7418 | 0.3962 | nan | 0.5801 | 0.6557 | 0.0151 | 0.8644 | 0.1547 | 0.0 | 0.5061 | 0.0414 | 0.6178 | 0.0 | 0.0024 | 0.7767 | 0.2725 | 0.5582 | 0.3139 | 0.2876 | nan | 0.2507 | 0.5072 | 0.3787 | 0.0 | 0.8898 | 0.7779 | 0.9536 | 0.0026 | 0.1324 | 0.3968 | 0.0 | | 0.1095 | 26.9 | 5380 | 0.6906 | 0.4114 | 0.4891 | 0.8781 | nan | 0.8369 | 0.9576 | 0.7971 | 0.8501 | 0.5551 | nan | 0.7060 | 0.8135 | 0.0291 | 0.9532 | 0.1185 | 0.0 | 0.6903 | 0.0319 | 0.7166 | 0.0 | 0.0 | 0.9269 | 0.2260 | 0.6456 | 0.3905 | 0.3490 | nan | 0.3174 | 0.6271 | 0.5923 | 0.0 | 0.9549 | 0.8555 | 0.9855 | 0.0044 | 0.2436 | 0.4768 | 0.0 | nan | 0.7561 | 0.8745 | 0.6692 | 0.7556 | 0.4017 | nan | 0.5824 | 0.6476 | 0.0187 | 0.8661 | 0.1019 | 0.0 | 0.4578 | 0.0319 | 0.6070 | 0.0 | 0.0 | 0.7750 | 0.1968 | 0.5555 | 0.3165 | 0.2879 | nan | 0.2456 | 0.5073 | 0.3771 | 0.0 | 0.8899 | 0.7780 | 0.9496 | 0.0039 | 0.1203 | 0.3899 | 0.0 | | 0.1506 | 27.0 | 5400 | 0.6823 | 0.4105 | 0.4968 | 0.8779 | nan | 0.8466 | 0.9546 | 0.7995 | 0.8469 | 0.5295 | nan | 0.7037 | 0.8394 | 0.0635 | 0.9543 | 0.1163 | 0.0 | 0.7633 | 0.0475 | 0.7615 | 0.0 | 0.0 | 0.9358 | 0.1898 | 0.6437 | 0.3914 | 0.4259 | nan | 0.3250 | 0.6204 | 0.5437 | 0.0 | 0.9495 | 0.8699 | 0.9813 | 0.0103 | 0.3204 | 0.4652 | 0.0 | nan | 0.7623 | 0.8762 | 0.6641 | 0.7556 | 0.3897 | nan | 0.5861 | 0.6245 | 0.0368 | 0.8685 | 0.0992 | 0.0 | 0.4243 | 0.0475 | 0.6058 | 0.0 | 0.0 | 0.7695 | 0.1659 | 0.5516 | 0.3248 | 0.3197 | nan | 0.2507 | 0.5065 | 0.3748 | 0.0 | 0.8914 | 0.7819 | 0.9530 | 0.0080 | 0.1189 | 0.3790 | 0.0 | | 0.0938 | 27.1 | 5420 | 0.6557 | 0.4105 | 0.4863 | 0.8819 | nan | 0.8763 | 0.9536 | 0.7881 | 0.8415 | 0.5490 | nan | 0.7220 | 0.8324 | 0.0224 | 0.9549 | 0.1229 | 0.0 | 0.7538 | 0.0387 | 0.7186 | 0.0 | 0.0004 | 0.9286 | 0.1472 | 0.6398 | 0.3440 | 0.3325 | nan | 0.2764 | 0.6080 | 0.5649 | 0.0 | 0.9557 | 0.8681 | 0.9825 | 0.0012 | 0.2378 | 
0.5020 | 0.0 | nan | 0.7828 | 0.8807 | 0.6887 | 0.7740 | 0.3939 | nan | 0.5933 | 0.6221 | 0.0134 | 0.8674 | 0.1067 | 0.0 | 0.4857 | 0.0387 | 0.6042 | 0.0 | 0.0004 | 0.7721 | 0.1336 | 0.5439 | 0.2990 | 0.2742 | nan | 0.2357 | 0.5013 | 0.3789 | 0.0 | 0.8894 | 0.7802 | 0.9524 | 0.0011 | 0.1293 | 0.3938 | 0.0 | | 0.0922 | 27.2 | 5440 | 0.6789 | 0.4124 | 0.4937 | 0.8778 | nan | 0.8441 | 0.9528 | 0.7871 | 0.8561 | 0.5597 | nan | 0.7062 | 0.8316 | 0.0127 | 0.9555 | 0.1165 | 0.0 | 0.7839 | 0.0331 | 0.7121 | 0.0 | 0.0036 | 0.9277 | 0.2226 | 0.6475 | 0.3821 | 0.3915 | nan | 0.3323 | 0.6105 | 0.6060 | 0.0 | 0.9571 | 0.8456 | 0.9830 | 0.0124 | 0.2411 | 0.4829 | 0.0 | nan | 0.7593 | 0.8755 | 0.7016 | 0.7540 | 0.3872 | nan | 0.5912 | 0.6258 | 0.0077 | 0.8663 | 0.1008 | 0.0 | 0.4567 | 0.0331 | 0.6051 | 0.0 | 0.0034 | 0.7729 | 0.1901 | 0.5497 | 0.3172 | 0.3099 | nan | 0.2680 | 0.5033 | 0.3734 | 0.0 | 0.8883 | 0.7725 | 0.9521 | 0.0100 | 0.1303 | 0.3909 | 0.0 | | 0.096 | 27.3 | 5460 | 0.6704 | 0.4060 | 0.4903 | 0.8754 | nan | 0.8254 | 0.9533 | 0.7920 | 0.8735 | 0.5246 | nan | 0.6949 | 0.8410 | 0.0581 | 0.9554 | 0.1050 | 0.0 | 0.7937 | 0.0133 | 0.7730 | 0.0 | 0.0007 | 0.9311 | 0.2743 | 0.6469 | 0.3820 | 0.3244 | nan | 0.2645 | 0.6060 | 0.5833 | 0.0 | 0.9549 | 0.8581 | 0.9833 | 0.0101 | 0.2081 | 0.4576 | 0.0 | nan | 0.7435 | 0.8795 | 0.6874 | 0.7072 | 0.3941 | nan | 0.5839 | 0.5840 | 0.0354 | 0.8667 | 0.0918 | 0.0 | 0.4936 | 0.0133 | 0.5650 | 0.0 | 0.0007 | 0.7718 | 0.2335 | 0.5515 | 0.3144 | 0.2713 | nan | 0.2367 | 0.5008 | 0.3740 | 0.0 | 0.8891 | 0.7737 | 0.9509 | 0.0084 | 0.0967 | 0.3735 | 0.0 | | 0.0978 | 27.4 | 5480 | 0.6725 | 0.4047 | 0.4838 | 0.8759 | nan | 0.8290 | 0.9558 | 0.8021 | 0.8683 | 0.4995 | nan | 0.7111 | 0.8203 | 0.0517 | 0.9528 | 0.1147 | 0.0 | 0.6795 | 0.0295 | 0.7583 | 0.0 | 0.0 | 0.9286 | 0.2051 | 0.6574 | 0.3569 | 0.3430 | nan | 0.2604 | 0.6237 | 0.5778 | 0.0 | 0.9572 | 0.8412 | 0.9829 | 0.0104 | 0.1870 | 0.4760 | 0.0 | nan | 0.7490 | 0.8809 | 0.6835 | 0.7109 | 
0.3761 | nan | 0.5841 | 0.5973 | 0.0304 | 0.8676 | 0.0987 | 0.0 | 0.4418 | 0.0295 | 0.5886 | 0.0 | 0.0 | 0.7723 | 0.1829 | 0.5593 | 0.3080 | 0.2819 | nan | 0.2282 | 0.5039 | 0.3866 | 0.0 | 0.8879 | 0.7694 | 0.9508 | 0.0081 | 0.0898 | 0.3821 | 0.0 | | 0.1081 | 27.5 | 5500 | 0.6632 | 0.4076 | 0.4835 | 0.8783 | nan | 0.8643 | 0.9516 | 0.7971 | 0.8583 | 0.5222 | nan | 0.7090 | 0.8066 | 0.0313 | 0.9567 | 0.1047 | 0.0 | 0.6207 | 0.0330 | 0.7483 | 0.0 | 0.0 | 0.9300 | 0.2751 | 0.6632 | 0.3831 | 0.2637 | nan | 0.3216 | 0.6242 | 0.5814 | 0.0 | 0.9588 | 0.7810 | 0.9817 | 0.0221 | 0.1954 | 0.4874 | 0.0 | nan | 0.7753 | 0.8867 | 0.6966 | 0.7449 | 0.3831 | nan | 0.5873 | 0.6360 | 0.0203 | 0.8656 | 0.0909 | 0.0 | 0.4102 | 0.0330 | 0.5889 | 0.0 | 0.0 | 0.7705 | 0.2247 | 0.5607 | 0.3196 | 0.2249 | nan | 0.2714 | 0.5076 | 0.3911 | 0.0 | 0.8806 | 0.7286 | 0.9519 | 0.0116 | 0.0938 | 0.3870 | 0.0 | | 0.0712 | 27.6 | 5520 | 0.6932 | 0.4087 | 0.4852 | 0.8768 | nan | 0.8332 | 0.9596 | 0.8028 | 0.8385 | 0.5178 | nan | 0.7153 | 0.8222 | 0.0493 | 0.9471 | 0.1109 | 0.0 | 0.6470 | 0.0600 | 0.7521 | 0.0 | 0.0019 | 0.9282 | 0.2295 | 0.6720 | 0.3390 | 0.2774 | nan | 0.3487 | 0.6283 | 0.5638 | 0.0 | 0.9566 | 0.8371 | 0.9832 | 0.0143 | 0.2063 | 0.4831 | 0.0 | nan | 0.7533 | 0.8757 | 0.6709 | 0.7481 | 0.3994 | nan | 0.5842 | 0.6202 | 0.0310 | 0.8688 | 0.0967 | 0.0 | 0.4354 | 0.0600 | 0.5986 | 0.0 | 0.0018 | 0.7740 | 0.1988 | 0.5605 | 0.2965 | 0.2358 | nan | 0.2775 | 0.5082 | 0.3916 | 0.0 | 0.8852 | 0.7556 | 0.9511 | 0.0095 | 0.1038 | 0.3855 | 0.0 | | 0.1027 | 27.7 | 5540 | 0.6726 | 0.4118 | 0.4918 | 0.8777 | nan | 0.8437 | 0.9552 | 0.7899 | 0.8477 | 0.5416 | nan | 0.7039 | 0.8241 | 0.0552 | 0.9555 | 0.1165 | 0.0 | 0.7872 | 0.0404 | 0.7305 | 0.0 | 0.0 | 0.9259 | 0.2684 | 0.6667 | 0.3302 | 0.3230 | nan | 0.3216 | 0.6274 | 0.5729 | 0.0 | 0.9556 | 0.8468 | 0.9843 | 0.0049 | 0.2345 | 0.4851 | 0.0 | nan | 0.7561 | 0.8770 | 0.6958 | 0.7531 | 0.3973 | nan | 0.5829 | 0.6304 | 0.0335 | 0.8641 | 0.1020 | 0.0 
| 0.4693 | 0.0404 | 0.5951 | 0.0 | 0.0 | 0.7749 | 0.2252 | 0.5508 | 0.2920 | 0.2629 | nan | 0.2534 | 0.5062 | 0.3897 | 0.0 | 0.8872 | 0.7672 | 0.9510 | 0.0033 | 0.1268 | 0.3899 | 0.0 | | 0.0892 | 27.8 | 5560 | 0.6352 | 0.4118 | 0.4884 | 0.8813 | nan | 0.8804 | 0.9525 | 0.7938 | 0.8414 | 0.5485 | nan | 0.7087 | 0.8153 | 0.0402 | 0.9562 | 0.1179 | 0.0 | 0.8002 | 0.0301 | 0.6878 | 0.0 | 0.0 | 0.9299 | 0.1701 | 0.6682 | 0.3440 | 0.4775 | nan | 0.2656 | 0.5936 | 0.5302 | 0.0 | 0.9541 | 0.8372 | 0.9861 | 0.0032 | 0.1952 | 0.5014 | 0.0 | nan | 0.7858 | 0.8827 | 0.7003 | 0.7763 | 0.3848 | nan | 0.5858 | 0.6419 | 0.0244 | 0.8634 | 0.1014 | 0.0 | 0.4713 | 0.0301 | 0.5872 | 0.0 | 0.0 | 0.7710 | 0.1535 | 0.5495 | 0.3039 | 0.3359 | nan | 0.2132 | 0.4990 | 0.3878 | 0.0 | 0.8881 | 0.7675 | 0.9497 | 0.0020 | 0.1255 | 0.3961 | 0.0 | | 0.0857 | 27.9 | 5580 | 0.6976 | 0.4086 | 0.4967 | 0.8745 | nan | 0.8227 | 0.9517 | 0.8055 | 0.8587 | 0.5235 | nan | 0.7266 | 0.8383 | 0.0748 | 0.9593 | 0.1181 | 0.0 | 0.7492 | 0.0193 | 0.7157 | 0.0 | 0.0019 | 0.9173 | 0.2253 | 0.6414 | 0.4203 | 0.4937 | nan | 0.2772 | 0.6241 | 0.5605 | 0.0 | 0.9543 | 0.8592 | 0.9834 | 0.0037 | 0.2642 | 0.5051 | 0.0 | nan | 0.7448 | 0.8754 | 0.6542 | 0.7406 | 0.3817 | nan | 0.5784 | 0.6356 | 0.0402 | 0.8633 | 0.1012 | 0.0 | 0.4460 | 0.0193 | 0.5878 | 0.0 | 0.0018 | 0.7743 | 0.1996 | 0.5449 | 0.3332 | 0.3409 | nan | 0.2216 | 0.5022 | 0.3775 | 0.0 | 0.8887 | 0.7681 | 0.9517 | 0.0029 | 0.1119 | 0.3869 | 0.0 | | 0.0972 | 28.0 | 5600 | 0.6871 | 0.4086 | 0.4893 | 0.8766 | nan | 0.8299 | 0.9635 | 0.8034 | 0.8568 | 0.4856 | nan | 0.6668 | 0.8288 | 0.0648 | 0.9496 | 0.1159 | 0.0 | 0.6802 | 0.0382 | 0.7206 | 0.0 | 0.0052 | 0.9283 | 0.2539 | 0.6530 | 0.3671 | 0.4558 | nan | 0.2910 | 0.6120 | 0.5659 | 0.0 | 0.9574 | 0.8490 | 0.9824 | 0.0049 | 0.2732 | 0.4544 | 0.0 | nan | 0.7460 | 0.8773 | 0.6448 | 0.7394 | 0.3866 | nan | 0.5750 | 0.6389 | 0.0358 | 0.8679 | 0.1019 | 0.0 | 0.4226 | 0.0382 | 0.5962 | 0.0 | 0.0051 | 0.7747 | 0.2163 | 
0.5552 | 0.3197 | 0.3337 | nan | 0.2179 | 0.5055 | 0.3791 | 0.0 | 0.8878 | 0.7706 | 0.9520 | 0.0039 | 0.1103 | 0.3722 | 0.0 | | 0.0743 | 28.1 | 5620 | 0.6541 | 0.4056 | 0.4848 | 0.8790 | nan | 0.8631 | 0.9537 | 0.8229 | 0.8409 | 0.5408 | nan | 0.6942 | 0.7871 | 0.0371 | 0.9509 | 0.1217 | 0.0 | 0.6734 | 0.0553 | 0.7003 | 0.0 | 0.0110 | 0.9429 | 0.1248 | 0.6635 | 0.3811 | 0.5042 | nan | 0.2089 | 0.6030 | 0.5817 | 0.0 | 0.9534 | 0.8205 | 0.9791 | 0.0077 | 0.2077 | 0.4827 | 0.0 | nan | 0.7753 | 0.8886 | 0.5865 | 0.7629 | 0.3910 | nan | 0.5861 | 0.6408 | 0.0199 | 0.8648 | 0.1062 | 0.0 | 0.4367 | 0.0553 | 0.5790 | 0.0 | 0.0104 | 0.7665 | 0.1147 | 0.5501 | 0.3330 | 0.3413 | nan | 0.1779 | 0.5043 | 0.3899 | 0.0 | 0.8862 | 0.7527 | 0.9532 | 0.0058 | 0.1139 | 0.3876 | 0.0 | | 0.0942 | 28.2 | 5640 | 0.6635 | 0.4081 | 0.4960 | 0.8790 | nan | 0.8525 | 0.9555 | 0.8179 | 0.8496 | 0.5351 | nan | 0.7077 | 0.8176 | 0.0422 | 0.9503 | 0.1236 | 0.0101 | 0.7732 | 0.0344 | 0.7263 | 0.0024 | 0.0073 | 0.9277 | 0.1708 | 0.6411 | 0.4058 | 0.5404 | nan | 0.2374 | 0.6133 | 0.5887 | 0.0 | 0.9534 | 0.8582 | 0.9830 | 0.0130 | 0.2590 | 0.4756 | 0.0 | nan | 0.7672 | 0.8834 | 0.6328 | 0.7648 | 0.4010 | nan | 0.5878 | 0.6325 | 0.0224 | 0.8678 | 0.1067 | 0.0091 | 0.4179 | 0.0344 | 0.5821 | 0.0024 | 0.0070 | 0.7730 | 0.1557 | 0.5549 | 0.3368 | 0.3462 | nan | 0.2009 | 0.5048 | 0.3854 | 0.0 | 0.8885 | 0.7682 | 0.9517 | 0.0087 | 0.0918 | 0.3746 | 0.0 | | 0.0886 | 28.3 | 5660 | 0.6853 | 0.4070 | 0.4902 | 0.8768 | nan | 0.8376 | 0.9575 | 0.8021 | 0.8487 | 0.5298 | nan | 0.7247 | 0.8240 | 0.0681 | 0.9529 | 0.1197 | 0.0025 | 0.7321 | 0.0702 | 0.7406 | 0.0 | 0.0094 | 0.9398 | 0.1876 | 0.6085 | 0.3805 | 0.3841 | nan | 0.2739 | 0.6085 | 0.5558 | 0.0 | 0.9502 | 0.8390 | 0.9827 | 0.0155 | 0.2440 | 0.4976 | 0.0 | nan | 0.7588 | 0.8773 | 0.6697 | 0.7653 | 0.3903 | nan | 0.5894 | 0.6166 | 0.0336 | 0.8655 | 0.1039 | 0.0024 | 0.4033 | 0.0702 | 0.5867 | 0.0 | 0.0092 | 0.7629 | 0.1709 | 0.5351 | 0.3236 | 0.2880 | nan | 
0.2085 | 0.5044 | 0.3839 | 0.0 | 0.8877 | 0.7654 | 0.9520 | 0.0106 | 0.0980 | 0.3892 | 0.0 | | 0.0686 | 28.4 | 5680 | 0.6927 | 0.4090 | 0.4976 | 0.8763 | nan | 0.8348 | 0.9564 | 0.8030 | 0.8511 | 0.5345 | nan | 0.7257 | 0.8044 | 0.0892 | 0.9515 | 0.1484 | 0.0079 | 0.7587 | 0.0563 | 0.7866 | 0.0 | 0.0058 | 0.9275 | 0.2435 | 0.6232 | 0.3731 | 0.4607 | nan | 0.2507 | 0.6339 | 0.5748 | 0.0 | 0.9553 | 0.8259 | 0.9828 | 0.0151 | 0.2420 | 0.4988 | 0.0 | nan | 0.7546 | 0.8767 | 0.6618 | 0.7606 | 0.3896 | nan | 0.5871 | 0.6214 | 0.0452 | 0.8665 | 0.1262 | 0.0077 | 0.4076 | 0.0563 | 0.5419 | 0.0 | 0.0057 | 0.7684 | 0.2128 | 0.5385 | 0.3169 | 0.3450 | nan | 0.2047 | 0.5044 | 0.3781 | 0.0 | 0.8854 | 0.7555 | 0.9524 | 0.0115 | 0.1106 | 0.3952 | 0.0 | | 0.0875 | 28.5 | 5700 | 0.6821 | 0.4057 | 0.4935 | 0.8757 | nan | 0.8305 | 0.9575 | 0.8115 | 0.8588 | 0.5157 | nan | 0.7006 | 0.7422 | 0.2954 | 0.9579 | 0.1075 | 0.0196 | 0.8595 | 0.0428 | 0.7431 | 0.0 | 0.0050 | 0.9338 | 0.1709 | 0.6354 | 0.3660 | 0.3553 | nan | 0.2370 | 0.6056 | 0.5617 | 0.0 | 0.9548 | 0.8313 | 0.9851 | 0.0051 | 0.2237 | 0.4795 | 0.0 | nan | 0.7482 | 0.8784 | 0.6547 | 0.7435 | 0.3981 | nan | 0.5876 | 0.5876 | 0.0877 | 0.8614 | 0.0961 | 0.0196 | 0.4713 | 0.0428 | 0.5644 | 0.0 | 0.0050 | 0.7676 | 0.1565 | 0.5372 | 0.3097 | 0.2924 | nan | 0.2009 | 0.5005 | 0.3819 | 0.0 | 0.8853 | 0.7565 | 0.9506 | 0.0041 | 0.1105 | 0.3831 | 0.0 | | 0.1119 | 28.6 | 5720 | 0.6901 | 0.4077 | 0.4900 | 0.8763 | nan | 0.8302 | 0.9586 | 0.7965 | 0.8520 | 0.5337 | nan | 0.7242 | 0.8196 | 0.0748 | 0.9513 | 0.1156 | 0.0012 | 0.7790 | 0.0846 | 0.7200 | 0.0 | 0.0159 | 0.9362 | 0.2396 | 0.6415 | 0.3514 | 0.3662 | nan | 0.2889 | 0.6017 | 0.5220 | 0.0 | 0.9530 | 0.8405 | 0.9830 | 0.0093 | 0.2357 | 0.4550 | 0.0 | nan | 0.7512 | 0.8756 | 0.6802 | 0.7477 | 0.3918 | nan | 0.5922 | 0.5992 | 0.0375 | 0.8659 | 0.1026 | 0.0012 | 0.4171 | 0.0845 | 0.5942 | 0.0 | 0.0158 | 0.7687 | 0.2068 | 0.5397 | 0.3038 | 0.2886 | nan | 0.2047 | 0.5016 | 0.3771 | 0.0 | 
0.8878 | 0.7668 | 0.9526 | 0.0076 | 0.1130 | 0.3719 | 0.0 | | 0.065 | 28.7 | 5740 | 0.6797 | 0.4143 | 0.4985 | 0.8774 | nan | 0.8452 | 0.9559 | 0.8006 | 0.8493 | 0.5016 | nan | 0.7138 | 0.8127 | 0.0544 | 0.9576 | 0.1546 | 0.0006 | 0.7711 | 0.1104 | 0.7272 | 0.0 | 0.0075 | 0.9237 | 0.2652 | 0.6554 | 0.4013 | 0.4589 | nan | 0.3079 | 0.6201 | 0.5540 | 0.0 | 0.9557 | 0.8299 | 0.9835 | 0.0105 | 0.2206 | 0.5035 | 0.0 | nan | 0.7559 | 0.8769 | 0.6712 | 0.7529 | 0.4004 | nan | 0.5876 | 0.6260 | 0.0293 | 0.8636 | 0.1338 | 0.0005 | 0.4466 | 0.1104 | 0.5750 | 0.0 | 0.0075 | 0.7745 | 0.2198 | 0.5464 | 0.3310 | 0.3431 | nan | 0.2321 | 0.5006 | 0.3856 | 0.0 | 0.8848 | 0.7497 | 0.9528 | 0.0091 | 0.1076 | 0.3819 | 0.0 | | 0.0934 | 28.8 | 5760 | 0.6465 | 0.4116 | 0.4904 | 0.8790 | nan | 0.8431 | 0.9547 | 0.7972 | 0.8962 | 0.5132 | nan | 0.7213 | 0.7866 | 0.0628 | 0.9542 | 0.2099 | 0.0010 | 0.6565 | 0.0928 | 0.6967 | 0.0 | 0.0017 | 0.9361 | 0.1630 | 0.6224 | 0.3858 | 0.4652 | nan | 0.2589 | 0.6364 | 0.5783 | 0.0 | 0.9554 | 0.8465 | 0.9811 | 0.0046 | 0.2253 | 0.4466 | 0.0 | nan | 0.7549 | 0.8802 | 0.6827 | 0.7726 | 0.4038 | nan | 0.5928 | 0.6084 | 0.0322 | 0.8658 | 0.1788 | 0.0010 | 0.4057 | 0.0927 | 0.5834 | 0.0 | 0.0017 | 0.7648 | 0.1460 | 0.5400 | 0.3277 | 0.3493 | nan | 0.2143 | 0.5050 | 0.3887 | 0.0 | 0.8878 | 0.7628 | 0.9532 | 0.0042 | 0.1013 | 0.3690 | 0.0 | | 0.0913 | 28.9 | 5780 | 0.7101 | 0.4135 | 0.4998 | 0.8756 | nan | 0.8296 | 0.9634 | 0.8012 | 0.8352 | 0.4835 | nan | 0.7196 | 0.8435 | 0.0852 | 0.9564 | 0.2459 | 0.0 | 0.6448 | 0.0940 | 0.7343 | 0.0 | 0.0033 | 0.9281 | 0.2470 | 0.6256 | 0.4120 | 0.5362 | nan | 0.2655 | 0.5973 | 0.6031 | 0.0 | 0.9490 | 0.8486 | 0.9847 | 0.0051 | 0.2759 | 0.4760 | 0.0 | nan | 0.7505 | 0.8698 | 0.6877 | 0.7422 | 0.3776 | nan | 0.5863 | 0.5962 | 0.0473 | 0.8671 | 0.2066 | 0.0 | 0.4120 | 0.0940 | 0.5942 | 0.0 | 0.0032 | 0.7700 | 0.2057 | 0.5449 | 0.3383 | 0.3747 | nan | 0.2026 | 0.5004 | 0.3717 | 0.0 | 0.8905 | 0.7700 | 0.9527 | 0.0047 | 
0.1002 | 0.3704 | 0.0 | | 0.072 | 29.0 | 5800 | 0.6968 | 0.4099 | 0.4834 | 0.8762 | nan | 0.8334 | 0.9610 | 0.8071 | 0.8424 | 0.4897 | nan | 0.7077 | 0.8352 | 0.0391 | 0.9520 | 0.1520 | 0.0 | 0.5859 | 0.0863 | 0.7320 | 0.0 | 0.0046 | 0.9333 | 0.1649 | 0.6309 | 0.3745 | 0.4073 | nan | 0.3005 | 0.6135 | 0.5685 | 0.0 | 0.9567 | 0.8291 | 0.9849 | 0.0106 | 0.1745 | 0.4902 | 0.0 | nan | 0.7528 | 0.8734 | 0.6771 | 0.7447 | 0.3839 | nan | 0.5856 | 0.6162 | 0.0255 | 0.8688 | 0.1347 | 0.0 | 0.4398 | 0.0863 | 0.5976 | 0.0 | 0.0046 | 0.7651 | 0.1485 | 0.5473 | 0.3246 | 0.3305 | nan | 0.2484 | 0.4965 | 0.3926 | 0.0 | 0.8861 | 0.7581 | 0.9514 | 0.0094 | 0.0807 | 0.3850 | 0.0 | | 0.0781 | 29.1 | 5820 | 0.6973 | 0.4131 | 0.4886 | 0.8773 | nan | 0.8343 | 0.9599 | 0.7973 | 0.8468 | 0.5317 | nan | 0.7142 | 0.8298 | 0.0164 | 0.9593 | 0.1801 | 0.0 | 0.6026 | 0.0882 | 0.7129 | 0.0 | 0.0032 | 0.9345 | 0.2456 | 0.6484 | 0.3544 | 0.4435 | nan | 0.3119 | 0.6138 | 0.5755 | 0.0 | 0.9525 | 0.8419 | 0.9814 | 0.0059 | 0.1920 | 0.4572 | 0.0 | nan | 0.7539 | 0.8760 | 0.6968 | 0.7523 | 0.3918 | nan | 0.5919 | 0.6012 | 0.0116 | 0.8643 | 0.1596 | 0.0 | 0.4408 | 0.0882 | 0.5925 | 0.0 | 0.0032 | 0.7668 | 0.2068 | 0.5522 | 0.3123 | 0.3525 | nan | 0.2465 | 0.4988 | 0.3882 | 0.0 | 0.8884 | 0.7642 | 0.9537 | 0.0050 | 0.0883 | 0.3717 | 0.0 | | 0.0882 | 29.2 | 5840 | 0.6949 | 0.4107 | 0.4873 | 0.8777 | nan | 0.8417 | 0.9601 | 0.7950 | 0.8391 | 0.5112 | nan | 0.7123 | 0.8305 | 0.0195 | 0.9536 | 0.1604 | 0.0 | 0.6562 | 0.0472 | 0.7099 | 0.0 | 0.0 | 0.9325 | 0.1996 | 0.6680 | 0.3698 | 0.5032 | nan | 0.2998 | 0.5804 | 0.5492 | 0.0 | 0.9570 | 0.8352 | 0.9825 | 0.0130 | 0.1928 | 0.4749 | 0.0 | nan | 0.7550 | 0.8757 | 0.6943 | 0.7509 | 0.4055 | nan | 0.5837 | 0.6050 | 0.0133 | 0.8664 | 0.1407 | 0.0 | 0.4336 | 0.0472 | 0.5829 | 0.0 | 0.0 | 0.7708 | 0.1744 | 0.5551 | 0.3162 | 0.3544 | nan | 0.2495 | 0.4980 | 0.3864 | 0.0 | 0.8861 | 0.7591 | 0.9531 | 0.0100 | 0.1005 | 0.3753 | 0.0 | | 0.0707 | 29.3 | 5860 | 0.6947 | 
0.4112 | 0.4895 | 0.8774 | nan | 0.8247 | 0.9609 | 0.7985 | 0.8580 | 0.5161 | nan | 0.7280 | 0.7997 | 0.0293 | 0.9579 | 0.1443 | 0.0 | 0.6550 | 0.0522 | 0.6908 | 0.0 | 0.0013 | 0.9334 | 0.2002 | 0.6496 | 0.3956 | 0.4688 | nan | 0.2862 | 0.6281 | 0.5941 | 0.0 | 0.9545 | 0.8436 | 0.9820 | 0.0123 | 0.2338 | 0.4644 | 0.0 | nan | 0.7498 | 0.8752 | 0.6773 | 0.7486 | 0.4050 | nan | 0.5895 | 0.6463 | 0.0182 | 0.8622 | 0.1285 | 0.0 | 0.4120 | 0.0521 | 0.5773 | 0.0 | 0.0013 | 0.7698 | 0.1752 | 0.5495 | 0.3277 | 0.3557 | nan | 0.2325 | 0.5066 | 0.3845 | 0.0 | 0.8877 | 0.7646 | 0.9535 | 0.0102 | 0.1159 | 0.3829 | 0.0 | | 0.0808 | 29.4 | 5880 | 0.7035 | 0.4153 | 0.5005 | 0.8777 | nan | 0.8409 | 0.9609 | 0.8013 | 0.8432 | 0.5292 | nan | 0.6840 | 0.8192 | 0.0704 | 0.9569 | 0.1940 | 0.0002 | 0.7892 | 0.0759 | 0.6909 | 0.0 | 0.0115 | 0.9330 | 0.2708 | 0.6312 | 0.3761 | 0.4916 | nan | 0.3595 | 0.6176 | 0.5648 | 0.0 | 0.9512 | 0.8414 | 0.9804 | 0.0128 | 0.2294 | 0.4890 | 0.0 | nan | 0.7548 | 0.8765 | 0.6734 | 0.7502 | 0.4074 | nan | 0.5854 | 0.6300 | 0.0378 | 0.8660 | 0.1712 | 0.0002 | 0.3926 | 0.0759 | 0.5810 | 0.0 | 0.0114 | 0.7675 | 0.2253 | 0.5374 | 0.3222 | 0.3649 | nan | 0.2490 | 0.5043 | 0.3850 | 0.0 | 0.8883 | 0.7650 | 0.9545 | 0.0100 | 0.1134 | 0.3900 | 0.0 | | 0.0959 | 29.5 | 5900 | 0.6633 | 0.4123 | 0.4992 | 0.8769 | nan | 0.8516 | 0.9532 | 0.8257 | 0.8463 | 0.5411 | nan | 0.7325 | 0.7945 | 0.0996 | 0.9575 | 0.1970 | 0.0 | 0.8361 | 0.0722 | 0.7040 | 0.0 | 0.0015 | 0.9325 | 0.1979 | 0.6232 | 0.3559 | 0.4814 | nan | 0.3650 | 0.6066 | 0.5344 | 0.0 | 0.9574 | 0.7861 | 0.9828 | 0.0202 | 0.2246 | 0.4946 | 0.0 | nan | 0.7671 | 0.8815 | 0.6558 | 0.7607 | 0.4057 | nan | 0.5869 | 0.6192 | 0.0441 | 0.8637 | 0.1692 | 0.0 | 0.4204 | 0.0721 | 0.5866 | 0.0 | 0.0014 | 0.7655 | 0.1765 | 0.5306 | 0.3051 | 0.3484 | nan | 0.2809 | 0.5017 | 0.3784 | 0.0 | 0.8789 | 0.7286 | 0.9534 | 0.0144 | 0.1115 | 0.3860 | 0.0 | | 0.0771 | 29.6 | 5920 | 0.6884 | 0.4142 | 0.5007 | 0.8764 | nan | 0.8283 | 
0.9584 | 0.8123 | 0.8427 | 0.5157 | nan | 0.7270 | 0.8310 | 0.0579 | 0.9523 | 0.2168 | 0.0 | 0.8208 | 0.0862 | 0.7416 | 0.0 | 0.0 | 0.9275 | 0.1550 | 0.6739 | 0.3409 | 0.5263 | nan | 0.3536 | 0.6307 | 0.5499 | 0.0 | 0.9567 | 0.8381 | 0.9828 | 0.0116 | 0.2159 | 0.4682 | 0.0 | nan | 0.7503 | 0.8739 | 0.6630 | 0.7414 | 0.4013 | nan | 0.5855 | 0.6113 | 0.0326 | 0.8692 | 0.1840 | 0.0 | 0.4372 | 0.0862 | 0.5978 | 0.0 | 0.0 | 0.7733 | 0.1416 | 0.5535 | 0.2984 | 0.3676 | nan | 0.2926 | 0.5067 | 0.3830 | 0.0 | 0.8852 | 0.7577 | 0.9536 | 0.0100 | 0.1185 | 0.3791 | 0.0 | | 0.0705 | 29.7 | 5940 | 0.6747 | 0.4181 | 0.5036 | 0.8780 | nan | 0.8426 | 0.9584 | 0.7923 | 0.8350 | 0.5360 | nan | 0.7188 | 0.8237 | 0.0655 | 0.9586 | 0.2272 | 0.0001 | 0.8067 | 0.0805 | 0.7526 | 0.0 | 0.0 | 0.9323 | 0.2434 | 0.6474 | 0.3673 | 0.5485 | nan | 0.3330 | 0.6053 | 0.5587 | 0.0 | 0.9538 | 0.8479 | 0.9846 | 0.0074 | 0.2233 | 0.4627 | 0.0 | nan | 0.7568 | 0.8740 | 0.6958 | 0.7447 | 0.4053 | nan | 0.5883 | 0.6244 | 0.0361 | 0.8644 | 0.1895 | 0.0001 | 0.4437 | 0.0805 | 0.5893 | 0.0 | 0.0 | 0.7708 | 0.2032 | 0.5473 | 0.3117 | 0.3816 | nan | 0.2713 | 0.5021 | 0.3805 | 0.0 | 0.8884 | 0.7714 | 0.9529 | 0.0068 | 0.1211 | 0.3776 | 0.0 | | 0.0472 | 29.8 | 5960 | 0.7109 | 0.4086 | 0.4956 | 0.8758 | nan | 0.8353 | 0.9621 | 0.8017 | 0.8276 | 0.5187 | nan | 0.7011 | 0.8241 | 0.1174 | 0.9530 | 0.1418 | 0.0 | 0.7473 | 0.0818 | 0.7646 | 0.0 | 0.0 | 0.9265 | 0.1480 | 0.6075 | 0.4195 | 0.5112 | nan | 0.2423 | 0.6126 | 0.5446 | 0.0 | 0.9582 | 0.8396 | 0.9838 | 0.0135 | 0.3227 | 0.4524 | 0.0 | nan | 0.7546 | 0.8730 | 0.6628 | 0.7390 | 0.4098 | nan | 0.5845 | 0.6204 | 0.0569 | 0.8708 | 0.1223 | 0.0 | 0.4144 | 0.0818 | 0.5821 | 0.0 | 0.0 | 0.7687 | 0.1324 | 0.5337 | 0.3295 | 0.3688 | nan | 0.2083 | 0.5037 | 0.3762 | 0.0 | 0.8858 | 0.7640 | 0.9523 | 0.0120 | 0.1100 | 0.3565 | 0.0 | | 0.0856 | 29.9 | 5980 | 0.7068 | 0.4075 | 0.4915 | 0.8769 | nan | 0.8328 | 0.9616 | 0.7846 | 0.8604 | 0.5267 | nan | 0.7001 | 0.8317 | 
0.0490 | 0.9578 | 0.1086 | 0.0 | 0.7661 | 0.0397 | 0.6910 | 0.0 | 0.0025 | 0.9349 | 0.2551 | 0.6181 | 0.3754 | 0.4470 | nan | 0.2880 | 0.6232 | 0.5678 | 0.0 | 0.9521 | 0.8409 | 0.9837 | 0.0102 | 0.2774 | 0.4412 | 0.0 | nan | 0.7531 | 0.8765 | 0.6674 | 0.7471 | 0.4108 | nan | 0.5886 | 0.6380 | 0.0292 | 0.8636 | 0.0955 | 0.0 | 0.4072 | 0.0397 | 0.5824 | 0.0 | 0.0025 | 0.7642 | 0.2054 | 0.5298 | 0.3210 | 0.3295 | nan | 0.2042 | 0.5070 | 0.3838 | 0.0 | 0.8882 | 0.7660 | 0.9520 | 0.0091 | 0.1154 | 0.3630 | 0.0 | | 0.0632 | 30.0 | 6000 | 0.6955 | 0.4046 | 0.4834 | 0.8758 | nan | 0.8442 | 0.9537 | 0.7839 | 0.8517 | 0.5396 | nan | 0.7162 | 0.8110 | 0.0400 | 0.9540 | 0.1004 | 0.0 | 0.7228 | 0.0127 | 0.6716 | 0.0 | 0.0072 | 0.9378 | 0.1141 | 0.6040 | 0.3381 | 0.4786 | nan | 0.3070 | 0.6319 | 0.5860 | 0.0 | 0.9547 | 0.8347 | 0.9842 | 0.0080 | 0.2084 | 0.4721 | 0.0 | nan | 0.7577 | 0.8775 | 0.6634 | 0.7512 | 0.3930 | nan | 0.5910 | 0.6374 | 0.0219 | 0.8630 | 0.0879 | 0.0 | 0.4588 | 0.0127 | 0.5791 | 0.0 | 0.0070 | 0.7596 | 0.1049 | 0.5244 | 0.2932 | 0.3452 | nan | 0.2441 | 0.5018 | 0.3876 | 0.0 | 0.8857 | 0.7614 | 0.9505 | 0.0066 | 0.1052 | 0.3755 | 0.0 | | 0.0753 | 30.1 | 6020 | 0.6949 | 0.4082 | 0.4973 | 0.8758 | nan | 0.8345 | 0.9571 | 0.8362 | 0.8264 | 0.5438 | nan | 0.7323 | 0.8108 | 0.1971 | 0.9566 | 0.1114 | 0.0 | 0.7618 | 0.0325 | 0.7024 | 0.0 | 0.0191 | 0.9319 | 0.1148 | 0.6197 | 0.3594 | 0.5067 | nan | 0.3401 | 0.6437 | 0.5648 | 0.0 | 0.9520 | 0.8443 | 0.9797 | 0.0129 | 0.2513 | 0.4698 | 0.0 | nan | 0.7561 | 0.8772 | 0.6487 | 0.7454 | 0.4034 | nan | 0.5893 | 0.6161 | 0.0728 | 0.8618 | 0.0976 | 0.0 | 0.4558 | 0.0325 | 0.5780 | 0.0 | 0.0188 | 0.7642 | 0.1055 | 0.5314 | 0.3079 | 0.3534 | nan | 0.2505 | 0.5066 | 0.3820 | 0.0 | 0.8864 | 0.7637 | 0.9524 | 0.0108 | 0.1209 | 0.3744 | 0.0 | | 0.1257 | 30.2 | 6040 | 0.6917 | 0.4070 | 0.4841 | 0.8779 | nan | 0.8493 | 0.9572 | 0.8337 | 0.8431 | 0.5213 | nan | 0.6946 | 0.7666 | 0.1476 | 0.9528 | 0.1400 | 0.0 | 0.6802 | 0.0394 | 
0.7341 | 0.0 | 0.0169 | 0.9393 | 0.0977 | 0.6252 | 0.3417 | 0.4498 | nan | 0.2825 | 0.6021 | 0.5489 | 0.0 | 0.9565 | 0.8478 | 0.9835 | 0.0138 | 0.1719 | 0.4552 | 0.0 | nan | 0.7609 | 0.8821 | 0.6538 | 0.7509 | 0.3998 | nan | 0.5869 | 0.6337 | 0.0614 | 0.8635 | 0.1233 | 0.0 | 0.4218 | 0.0394 | 0.5784 | 0.0 | 0.0168 | 0.7609 | 0.0906 | 0.5322 | 0.3005 | 0.3418 | nan | 0.2392 | 0.4983 | 0.3973 | 0.0 | 0.8847 | 0.7660 | 0.9524 | 0.0114 | 0.1011 | 0.3764 | 0.0 | | 0.1099 | 30.3 | 6060 | 0.6502 | 0.4117 | 0.4958 | 0.8784 | nan | 0.8603 | 0.9469 | 0.8447 | 0.8613 | 0.5677 | nan | 0.6992 | 0.8168 | 0.0617 | 0.9599 | 0.1942 | 0.0 | 0.7344 | 0.0195 | 0.6950 | 0.0 | 0.0363 | 0.9261 | 0.1359 | 0.6417 | 0.3702 | 0.4821 | nan | 0.2626 | 0.6125 | 0.5872 | 0.0 | 0.9514 | 0.8377 | 0.9843 | 0.0076 | 0.2688 | 0.4986 | 0.0 | nan | 0.7764 | 0.8886 | 0.6787 | 0.7535 | 0.3592 | nan | 0.5988 | 0.6382 | 0.0345 | 0.8602 | 0.1687 | 0.0 | 0.4401 | 0.0195 | 0.5770 | 0.0 | 0.0358 | 0.7705 | 0.1235 | 0.5355 | 0.3169 | 0.3607 | nan | 0.2241 | 0.5019 | 0.3836 | 0.0 | 0.8868 | 0.7598 | 0.9519 | 0.0056 | 0.1383 | 0.3852 | 0.0 | | 0.0772 | 30.4 | 6080 | 0.6619 | 0.4126 | 0.4875 | 0.8790 | nan | 0.8764 | 0.9468 | 0.8257 | 0.8391 | 0.5504 | nan | 0.7166 | 0.8301 | 0.0395 | 0.9459 | 0.1677 | 0.0 | 0.5851 | 0.0885 | 0.7132 | 0.0 | 0.0414 | 0.9361 | 0.1862 | 0.6286 | 0.3409 | 0.3462 | nan | 0.2937 | 0.6241 | 0.5592 | 0.0 | 0.9572 | 0.8257 | 0.9805 | 0.0118 | 0.2612 | 0.4819 | 0.0 | nan | 0.7804 | 0.8881 | 0.7290 | 0.7596 | 0.3569 | nan | 0.5928 | 0.6354 | 0.0243 | 0.8688 | 0.1465 | 0.0 | 0.4213 | 0.0884 | 0.5870 | 0.0 | 0.0406 | 0.7637 | 0.1650 | 0.5315 | 0.3011 | 0.2810 | nan | 0.2343 | 0.5056 | 0.3912 | 0.0 | 0.8844 | 0.7534 | 0.9542 | 0.0080 | 0.1283 | 0.3817 | 0.0 | | 0.0667 | 30.5 | 6100 | 0.6734 | 0.4156 | 0.4975 | 0.8768 | nan | 0.8659 | 0.9488 | 0.8169 | 0.8425 | 0.5445 | nan | 0.7401 | 0.8129 | 0.1878 | 0.9561 | 0.1335 | 0.0 | 0.6130 | 0.0796 | 0.7363 | 0.0 | 0.0373 | 0.9295 | 0.2599 | 0.5996 | 
0.3926 | 0.4041 | nan | 0.3321 | 0.6299 | 0.5870 | 0.0 | 0.9605 | 0.7830 | 0.9819 | 0.0121 | 0.2828 | 0.4500 | 0.0 | nan | 0.7741 | 0.8840 | 0.7218 | 0.7529 | 0.3690 | nan | 0.5993 | 0.6478 | 0.0927 | 0.8646 | 0.1188 | 0.0 | 0.4250 | 0.0795 | 0.5757 | 0.0 | 0.0369 | 0.7654 | 0.2164 | 0.5279 | 0.3266 | 0.3253 | nan | 0.2681 | 0.5043 | 0.3908 | 0.0 | 0.8761 | 0.7178 | 0.9529 | 0.0091 | 0.1090 | 0.3682 | 0.0 | | 0.0914 | 30.6 | 6120 | 0.6724 | 0.4115 | 0.4907 | 0.8772 | nan | 0.8487 | 0.9539 | 0.8272 | 0.8489 | 0.5201 | nan | 0.7346 | 0.8388 | 0.0661 | 0.9553 | 0.1135 | 0.0 | 0.6458 | 0.0374 | 0.7067 | 0.0 | 0.0146 | 0.9313 | 0.2638 | 0.6162 | 0.3748 | 0.4614 | nan | 0.2940 | 0.6111 | 0.5433 | 0.0 | 0.9517 | 0.8375 | 0.9826 | 0.0122 | 0.2292 | 0.4805 | 0.0 | nan | 0.7623 | 0.8822 | 0.6902 | 0.7479 | 0.3890 | nan | 0.5967 | 0.6272 | 0.0419 | 0.8660 | 0.1011 | 0.0 | 0.4417 | 0.0374 | 0.5728 | 0.0 | 0.0145 | 0.7672 | 0.2226 | 0.5318 | 0.3195 | 0.3483 | nan | 0.2485 | 0.5019 | 0.3951 | 0.0 | 0.8840 | 0.7441 | 0.9528 | 0.0088 | 0.0989 | 0.3731 | 0.0 | | 0.0661 | 30.7 | 6140 | 0.7170 | 0.4061 | 0.4897 | 0.8731 | nan | 0.8210 | 0.9529 | 0.8176 | 0.8697 | 0.5278 | nan | 0.7301 | 0.8420 | 0.0351 | 0.9562 | 0.1284 | 0.0 | 0.6482 | 0.0361 | 0.7150 | 0.0 | 0.0179 | 0.9353 | 0.2255 | 0.6233 | 0.3457 | 0.5291 | nan | 0.2833 | 0.6036 | 0.5790 | 0.0 | 0.9537 | 0.8027 | 0.9850 | 0.0134 | 0.2334 | 0.4607 | 0.0 | nan | 0.7420 | 0.8802 | 0.6737 | 0.7257 | 0.3774 | nan | 0.5939 | 0.6169 | 0.0226 | 0.8641 | 0.1118 | 0.0 | 0.4206 | 0.0361 | 0.5798 | 0.0 | 0.0178 | 0.7671 | 0.1958 | 0.5363 | 0.2960 | 0.3592 | nan | 0.2366 | 0.4952 | 0.3927 | 0.0 | 0.8805 | 0.7368 | 0.9513 | 0.0107 | 0.1042 | 0.3705 | 0.0 | | 0.0741 | 30.8 | 6160 | 0.7035 | 0.4059 | 0.4835 | 0.8739 | nan | 0.8332 | 0.9534 | 0.8044 | 0.8594 | 0.5470 | nan | 0.6966 | 0.8466 | 0.0264 | 0.9513 | 0.1178 | 0.0 | 0.6525 | 0.0279 | 0.6862 | 0.0 | 0.0111 | 0.9339 | 0.2435 | 0.6108 | 0.3731 | 0.4133 | nan | 0.2855 | 0.6059 | 0.5583 | 
0.0 | 0.9556 | 0.8048 | 0.9822 | 0.0117 | 0.2116 | 0.4694 | 0.0 | nan | 0.7500 | 0.8788 | 0.6764 | 0.7474 | 0.3757 | nan | 0.5922 | 0.6360 | 0.0169 | 0.8647 | 0.1031 | 0.0 | 0.4324 | 0.0278 | 0.5708 | 0.0 | 0.0111 | 0.7611 | 0.2063 | 0.5166 | 0.3174 | 0.3225 | nan | 0.2321 | 0.4967 | 0.3917 | 0.0 | 0.8810 | 0.7370 | 0.9529 | 0.0099 | 0.1061 | 0.3742 | 0.0 | | 0.058 | 30.9 | 6180 | 0.6949 | 0.4057 | 0.4874 | 0.8758 | nan | 0.8500 | 0.9529 | 0.8062 | 0.8529 | 0.5494 | nan | 0.7168 | 0.8530 | 0.0539 | 0.9533 | 0.1434 | 0.0 | 0.6919 | 0.0333 | 0.7170 | 0.0 | 0.0029 | 0.9345 | 0.1742 | 0.5842 | 0.3866 | 0.4719 | nan | 0.2287 | 0.6061 | 0.5550 | 0.0 | 0.9526 | 0.8203 | 0.9840 | 0.0032 | 0.2581 | 0.4611 | 0.0 | nan | 0.7621 | 0.8822 | 0.6865 | 0.7517 | 0.3810 | nan | 0.5951 | 0.6290 | 0.0340 | 0.8645 | 0.1196 | 0.0 | 0.4348 | 0.0333 | 0.5701 | 0.0 | 0.0029 | 0.7594 | 0.1569 | 0.5025 | 0.3272 | 0.3516 | nan | 0.1977 | 0.4968 | 0.3827 | 0.0 | 0.8834 | 0.7426 | 0.9522 | 0.0028 | 0.1132 | 0.3673 | 0.0 | | 0.0769 | 31.0 | 6200 | 0.7021 | 0.4053 | 0.4841 | 0.8771 | nan | 0.8492 | 0.9555 | 0.8176 | 0.8517 | 0.5460 | nan | 0.6977 | 0.8309 | 0.0346 | 0.9528 | 0.1484 | 0.0 | 0.6787 | 0.0428 | 0.7329 | 0.0 | 0.0019 | 0.9380 | 0.1775 | 0.5846 | 0.3760 | 0.3971 | nan | 0.2414 | 0.5944 | 0.5549 | 0.0 | 0.9527 | 0.8520 | 0.9845 | 0.0045 | 0.2571 | 0.4371 | 0.0 | nan | 0.7645 | 0.8833 | 0.6655 | 0.7506 | 0.3847 | nan | 0.5952 | 0.6331 | 0.0236 | 0.8665 | 0.1226 | 0.0 | 0.4309 | 0.0428 | 0.5769 | 0.0 | 0.0018 | 0.7567 | 0.1589 | 0.5088 | 0.3230 | 0.3165 | nan | 0.2053 | 0.4955 | 0.3887 | 0.0 | 0.8882 | 0.7661 | 0.9518 | 0.0039 | 0.1047 | 0.3583 | 0.0 | | 0.1001 | 31.1 | 6220 | 0.6853 | 0.4093 | 0.4854 | 0.8775 | nan | 0.8416 | 0.9577 | 0.8128 | 0.8535 | 0.5338 | nan | 0.7194 | 0.8196 | 0.0204 | 0.9562 | 0.1343 | 0.0 | 0.6562 | 0.0496 | 0.7100 | 0.0 | 0.0097 | 0.9281 | 0.1991 | 0.6091 | 0.3812 | 0.4154 | nan | 0.2431 | 0.6048 | 0.5839 | 0.0 | 0.9558 | 0.8400 | 0.9842 | 0.0073 | 0.2275 | 
0.4778 | 0.0 | nan | 0.7583 | 0.8790 | 0.6821 | 0.7485 | 0.3949 | nan | 0.5976 | 0.6516 | 0.0143 | 0.8642 | 0.1142 | 0.0 | 0.4418 | 0.0496 | 0.5859 | 0.0 | 0.0096 | 0.7646 | 0.1751 | 0.5216 | 0.3246 | 0.3335 | nan | 0.1983 | 0.4977 | 0.3874 | 0.0 | 0.8879 | 0.7664 | 0.9515 | 0.0064 | 0.1118 | 0.3783 | 0.0 | | 0.0825 | 31.2 | 6240 | 0.7171 | 0.4077 | 0.4846 | 0.8742 | nan | 0.8166 | 0.9577 | 0.8072 | 0.8654 | 0.5347 | nan | 0.7027 | 0.8295 | 0.0439 | 0.9538 | 0.1191 | 0.0 | 0.6460 | 0.0475 | 0.7103 | 0.0 | 0.0139 | 0.9326 | 0.1952 | 0.6114 | 0.3621 | 0.3947 | nan | 0.2462 | 0.6264 | 0.5722 | 0.0 | 0.9531 | 0.8397 | 0.9826 | 0.0105 | 0.2558 | 0.4754 | 0.0 | nan | 0.7417 | 0.8761 | 0.7062 | 0.7212 | 0.3914 | nan | 0.5990 | 0.6442 | 0.0283 | 0.8653 | 0.1020 | 0.0 | 0.4481 | 0.0475 | 0.5948 | 0.0 | 0.0138 | 0.7624 | 0.1714 | 0.5208 | 0.3110 | 0.3095 | nan | 0.2061 | 0.5013 | 0.3833 | 0.0 | 0.8875 | 0.7596 | 0.9522 | 0.0089 | 0.1172 | 0.3768 | 0.0 | | 0.0719 | 31.3 | 6260 | 0.7227 | 0.4029 | 0.4776 | 0.8724 | nan | 0.8252 | 0.9572 | 0.8001 | 0.8361 | 0.5346 | nan | 0.7171 | 0.8383 | 0.0841 | 0.9517 | 0.1249 | 0.0004 | 0.6682 | 0.0438 | 0.7044 | 0.0 | 0.0090 | 0.9348 | 0.1377 | 0.5864 | 0.3301 | 0.3852 | nan | 0.2310 | 0.5876 | 0.5513 | 0.0 | 0.9574 | 0.8105 | 0.9836 | 0.0052 | 0.1920 | 0.4961 | 0.0 | nan | 0.7465 | 0.8720 | 0.6935 | 0.7138 | 0.3899 | nan | 0.5954 | 0.6378 | 0.0479 | 0.8665 | 0.1055 | 0.0004 | 0.4430 | 0.0438 | 0.5924 | 0.0 | 0.0089 | 0.7560 | 0.1232 | 0.5036 | 0.2936 | 0.3103 | nan | 0.1915 | 0.4936 | 0.3802 | 0.0 | 0.8833 | 0.7456 | 0.9516 | 0.0045 | 0.1095 | 0.3888 | 0.0 | | 0.0706 | 31.4 | 6280 | 0.7254 | 0.4055 | 0.4860 | 0.8726 | nan | 0.8178 | 0.9526 | 0.8173 | 0.8460 | 0.5517 | nan | 0.7147 | 0.8187 | 0.0855 | 0.9556 | 0.1163 | 0.0048 | 0.6842 | 0.0582 | 0.7201 | 0.0 | 0.0240 | 0.9390 | 0.1200 | 0.5932 | 0.3625 | 0.4347 | nan | 0.2568 | 0.5897 | 0.5849 | 0.0 | 0.9546 | 0.8502 | 0.9816 | 0.0098 | 0.2429 | 0.4652 | 0.0 | nan | 0.7442 | 0.8756 | 
0.6588 | 0.7151 | 0.3918 | nan | 0.5943 | 0.6401 | 0.0474 | 0.8644 | 0.0997 | 0.0046 | 0.4438 | 0.0581 | 0.5896 | 0.0 | 0.0238 | 0.7558 | 0.1079 | 0.5100 | 0.3098 | 0.3399 | nan | 0.2168 | 0.4966 | 0.3814 | 0.0 | 0.8874 | 0.7607 | 0.9527 | 0.0083 | 0.1194 | 0.3773 | 0.0 | | 0.0874 | 31.5 | 6300 | 0.7268 | 0.4020 | 0.4789 | 0.8717 | nan | 0.8095 | 0.9606 | 0.8034 | 0.8561 | 0.5209 | nan | 0.7001 | 0.8063 | 0.0546 | 0.9517 | 0.1006 | 0.0039 | 0.6399 | 0.0730 | 0.7356 | 0.0 | 0.0078 | 0.9383 | 0.1055 | 0.5675 | 0.3776 | 0.4526 | nan | 0.2018 | 0.5976 | 0.5706 | 0.0 | 0.9536 | 0.8297 | 0.9801 | 0.0118 | 0.2338 | 0.4798 | 0.0 | nan | 0.7348 | 0.8737 | 0.6748 | 0.7096 | 0.3915 | nan | 0.5888 | 0.6465 | 0.0313 | 0.8675 | 0.0893 | 0.0037 | 0.4096 | 0.0730 | 0.5995 | 0.0 | 0.0078 | 0.7556 | 0.0964 | 0.5046 | 0.3131 | 0.3490 | nan | 0.1735 | 0.4953 | 0.3811 | 0.0 | 0.8872 | 0.7585 | 0.9528 | 0.0095 | 0.1093 | 0.3771 | 0.0 | | 0.1078 | 31.6 | 6320 | 0.7038 | 0.4033 | 0.4803 | 0.8737 | nan | 0.8193 | 0.9549 | 0.8142 | 0.8598 | 0.5280 | nan | 0.7177 | 0.8195 | 0.0581 | 0.9601 | 0.1067 | 0.0096 | 0.6746 | 0.0319 | 0.7294 | 0.0 | 0.0 | 0.9362 | 0.1320 | 0.6057 | 0.3791 | 0.3374 | nan | 0.2643 | 0.5980 | 0.5489 | 0.0 | 0.9565 | 0.8236 | 0.9834 | 0.0105 | 0.2318 | 0.4768 | 0.0 | nan | 0.7444 | 0.8757 | 0.6466 | 0.7298 | 0.4080 | nan | 0.5887 | 0.6389 | 0.0335 | 0.8631 | 0.0937 | 0.0093 | 0.4737 | 0.0319 | 0.5699 | 0.0 | 0.0 | 0.7627 | 0.1200 | 0.5175 | 0.3115 | 0.2830 | nan | 0.2302 | 0.4973 | 0.3836 | 0.0 | 0.8837 | 0.7454 | 0.9522 | 0.0086 | 0.1224 | 0.3814 | 0.0 | | 0.0711 | 31.7 | 6340 | 0.7174 | 0.4066 | 0.4866 | 0.8736 | nan | 0.8243 | 0.9516 | 0.8095 | 0.8656 | 0.5432 | nan | 0.7135 | 0.8475 | 0.0550 | 0.9579 | 0.1080 | 0.0546 | 0.7416 | 0.0321 | 0.7189 | 0.0 | 0.0 | 0.9370 | 0.1802 | 0.6009 | 0.3674 | 0.3297 | nan | 0.2785 | 0.6281 | 0.5601 | 0.0 | 0.9530 | 0.8254 | 0.9845 | 0.0107 | 0.2219 | 0.4711 | 0.0 | nan | 0.7459 | 0.8751 | 0.6467 | 0.7309 | 0.3992 | nan | 0.5923 | 
0.6307 | 0.0327 | 0.8658 | 0.0929 | 0.0520 | 0.4931 | 0.0321 | 0.5788 | 0.0 | 0.0 | 0.7610 | 0.1603 | 0.5105 | 0.3171 | 0.2766 | nan | 0.2344 | 0.5053 | 0.3758 | 0.0 | 0.8858 | 0.7519 | 0.9518 | 0.0092 | 0.1202 | 0.3818 | 0.0 | | 0.0753 | 31.8 | 6360 | 0.7175 | 0.4085 | 0.4870 | 0.8750 | nan | 0.8346 | 0.9524 | 0.7899 | 0.8642 | 0.5514 | nan | 0.7120 | 0.8405 | 0.0501 | 0.9590 | 0.1016 | 0.0091 | 0.6982 | 0.0294 | 0.6857 | 0.0 | 0.0012 | 0.9302 | 0.1419 | 0.6116 | 0.3700 | 0.5232 | nan | 0.2718 | 0.6032 | 0.5396 | 0.0 | 0.9547 | 0.8306 | 0.9835 | 0.0090 | 0.2432 | 0.4908 | 0.0 | nan | 0.7515 | 0.8752 | 0.6752 | 0.7407 | 0.3913 | nan | 0.5938 | 0.6172 | 0.0311 | 0.8641 | 0.0885 | 0.0088 | 0.5006 | 0.0294 | 0.5643 | 0.0 | 0.0012 | 0.7671 | 0.1301 | 0.5224 | 0.3169 | 0.3806 | nan | 0.2300 | 0.5022 | 0.3785 | 0.0 | 0.8872 | 0.7572 | 0.9525 | 0.0075 | 0.1284 | 0.3792 | 0.0 | | 0.0705 | 31.9 | 6380 | 0.7062 | 0.4112 | 0.4900 | 0.8763 | nan | 0.8427 | 0.9512 | 0.7884 | 0.8599 | 0.5317 | nan | 0.7283 | 0.8216 | 0.0484 | 0.9527 | 0.1100 | 0.0072 | 0.6203 | 0.0666 | 0.7012 | 0.0 | 0.0029 | 0.9403 | 0.1995 | 0.6033 | 0.3720 | 0.5112 | nan | 0.3261 | 0.6278 | 0.5613 | 0.0 | 0.9518 | 0.8619 | 0.9833 | 0.0094 | 0.2489 | 0.4499 | 0.0 | nan | 0.7558 | 0.8745 | 0.6857 | 0.7456 | 0.3975 | nan | 0.5863 | 0.6238 | 0.0294 | 0.8666 | 0.0955 | 0.0069 | 0.4483 | 0.0666 | 0.5581 | 0.0 | 0.0029 | 0.7610 | 0.1724 | 0.5200 | 0.3221 | 0.3774 | nan | 0.2558 | 0.5087 | 0.3873 | 0.0 | 0.8909 | 0.7734 | 0.9525 | 0.0079 | 0.1195 | 0.3665 | 0.0 | | 0.0822 | 32.0 | 6400 | 0.6943 | 0.4123 | 0.4897 | 0.8763 | nan | 0.8249 | 0.9565 | 0.8046 | 0.8503 | 0.5427 | nan | 0.7243 | 0.8296 | 0.0390 | 0.9608 | 0.0953 | 0.0 | 0.6325 | 0.0404 | 0.6859 | 0.0 | 0.0005 | 0.9273 | 0.2207 | 0.6202 | 0.4016 | 0.4459 | nan | 0.3713 | 0.6071 | 0.5696 | 0.0 | 0.9564 | 0.8529 | 0.9832 | 0.0115 | 0.2157 | 0.4988 | 0.0 | nan | 0.7500 | 0.8751 | 0.6614 | 0.7480 | 0.4049 | nan | 0.5941 | 0.6398 | 0.0240 | 0.8622 | 0.0844 | 0.0 
| 0.4865 | 0.0404 | 0.5522 | 0.0 | 0.0005 | 0.7696 | 0.1895 | 0.5334 | 0.3299 | 0.3518 | nan | 0.2886 | 0.5030 | 0.3887 | 0.0 | 0.8885 | 0.7667 | 0.9526 | 0.0091 | 0.1140 | 0.3856 | 0.0 | | 0.0668 | 32.1 | 6420 | 0.7170 | 0.4117 | 0.4908 | 0.8756 | nan | 0.8303 | 0.9569 | 0.8044 | 0.8574 | 0.5256 | nan | 0.7134 | 0.8253 | 0.0717 | 0.9551 | 0.1055 | 0.0004 | 0.6658 | 0.0383 | 0.6925 | 0.0 | 0.0016 | 0.9343 | 0.2526 | 0.6060 | 0.3833 | 0.4895 | nan | 0.3324 | 0.6097 | 0.5615 | 0.0 | 0.9505 | 0.8486 | 0.9849 | 0.0083 | 0.2182 | 0.4814 | 0.0 | nan | 0.7487 | 0.8756 | 0.6546 | 0.7416 | 0.4008 | nan | 0.5996 | 0.6438 | 0.0427 | 0.8637 | 0.0916 | 0.0004 | 0.4797 | 0.0383 | 0.5574 | 0.0 | 0.0016 | 0.7635 | 0.2075 | 0.5271 | 0.3201 | 0.3666 | nan | 0.2627 | 0.4996 | 0.3834 | 0.0 | 0.8883 | 0.7686 | 0.9515 | 0.0066 | 0.1079 | 0.3810 | 0.0 | | 0.109 | 32.2 | 6440 | 0.6935 | 0.4073 | 0.4896 | 0.8756 | nan | 0.8346 | 0.9530 | 0.7909 | 0.8605 | 0.5453 | nan | 0.7193 | 0.8166 | 0.1420 | 0.9557 | 0.1090 | 0.0 | 0.6780 | 0.0411 | 0.7133 | 0.0 | 0.0003 | 0.9332 | 0.1200 | 0.6218 | 0.3600 | 0.4940 | nan | 0.2634 | 0.6039 | 0.5729 | 0.0 | 0.9543 | 0.8559 | 0.9829 | 0.0058 | 0.2842 | 0.4566 | 0.0 | nan | 0.7505 | 0.8765 | 0.6700 | 0.7381 | 0.3989 | nan | 0.6023 | 0.6275 | 0.0790 | 0.8640 | 0.0946 | 0.0 | 0.4644 | 0.0411 | 0.5404 | 0.0 | 0.0003 | 0.7660 | 0.1098 | 0.5411 | 0.3056 | 0.3728 | nan | 0.2263 | 0.4973 | 0.3782 | 0.0 | 0.8881 | 0.7704 | 0.9521 | 0.0049 | 0.1110 | 0.3630 | 0.0 | | 0.1092 | 32.3 | 6460 | 0.6913 | 0.4103 | 0.4887 | 0.8780 | nan | 0.8523 | 0.9547 | 0.7885 | 0.8480 | 0.5476 | nan | 0.7248 | 0.8243 | 0.0524 | 0.9527 | 0.1199 | 0.0 | 0.6414 | 0.0525 | 0.6923 | 0.0 | 0.0012 | 0.9293 | 0.1572 | 0.6457 | 0.3509 | 0.5211 | nan | 0.2821 | 0.6106 | 0.5562 | 0.0 | 0.9529 | 0.8452 | 0.9834 | 0.0110 | 0.2626 | 0.4776 | 0.0 | nan | 0.7633 | 0.8792 | 0.6827 | 0.7520 | 0.3946 | nan | 0.5972 | 0.6374 | 0.0305 | 0.8661 | 0.1029 | 0.0 | 0.4491 | 0.0525 | 0.5630 | 0.0 | 0.0012 | 
0.7682 | 0.1397 | 0.5439 | 0.3047 | 0.3765 | nan | 0.2310 | 0.5023 | 0.3822 | 0.0 | 0.8886 | 0.7707 | 0.9526 | 0.0090 | 0.1141 | 0.3733 | 0.0 | | 0.0882 | 32.4 | 6480 | 0.6950 | 0.4128 | 0.4911 | 0.8775 | nan | 0.8384 | 0.9547 | 0.8099 | 0.8561 | 0.5512 | nan | 0.7151 | 0.8269 | 0.0289 | 0.9541 | 0.0924 | 0.0 | 0.6422 | 0.0898 | 0.6895 | 0.0 | 0.0008 | 0.9313 | 0.2086 | 0.6159 | 0.4136 | 0.4909 | nan | 0.2832 | 0.6301 | 0.5775 | 0.0 | 0.9506 | 0.8589 | 0.9833 | 0.0161 | 0.2169 | 0.4886 | 0.0 | nan | 0.7574 | 0.8779 | 0.6794 | 0.7450 | 0.4003 | nan | 0.5922 | 0.6370 | 0.0167 | 0.8665 | 0.0826 | 0.0 | 0.4593 | 0.0898 | 0.5692 | 0.0 | 0.0008 | 0.7670 | 0.1782 | 0.5370 | 0.3388 | 0.3688 | nan | 0.2250 | 0.5074 | 0.3948 | 0.0 | 0.8908 | 0.7754 | 0.9528 | 0.0123 | 0.1047 | 0.3829 | 0.0 | | 0.1078 | 32.5 | 6500 | 0.7008 | 0.4077 | 0.4851 | 0.8768 | nan | 0.8260 | 0.9568 | 0.8089 | 0.8641 | 0.5325 | nan | 0.7276 | 0.8352 | 0.0344 | 0.9569 | 0.0890 | 0.0 | 0.6498 | 0.0799 | 0.7141 | 0.0 | 0.0003 | 0.9332 | 0.1337 | 0.6248 | 0.4004 | 0.4473 | nan | 0.2084 | 0.6083 | 0.5986 | 0.0 | 0.9546 | 0.8545 | 0.9826 | 0.0117 | 0.2128 | 0.4765 | 0.0 | nan | 0.7537 | 0.8767 | 0.6901 | 0.7304 | 0.4085 | nan | 0.5935 | 0.6295 | 0.0208 | 0.8652 | 0.0800 | 0.0 | 0.4612 | 0.0799 | 0.5707 | 0.0 | 0.0003 | 0.7653 | 0.1212 | 0.5288 | 0.3344 | 0.3514 | nan | 0.1775 | 0.5027 | 0.3925 | 0.0 | 0.8896 | 0.7701 | 0.9528 | 0.0095 | 0.1122 | 0.3797 | 0.0 | | 0.0839 | 32.6 | 6520 | 0.6931 | 0.4070 | 0.4811 | 0.8762 | nan | 0.8256 | 0.9578 | 0.7687 | 0.8671 | 0.5476 | nan | 0.7239 | 0.8142 | 0.0359 | 0.9583 | 0.0924 | 0.0 | 0.6538 | 0.0468 | 0.6923 | 0.0 | 0.0052 | 0.9296 | 0.1578 | 0.6324 | 0.3887 | 0.4013 | nan | 0.2183 | 0.6161 | 0.5845 | 0.0 | 0.9555 | 0.8465 | 0.9810 | 0.0069 | 0.2266 | 0.4620 | 0.0 | nan | 0.7481 | 0.8773 | 0.7198 | 0.7139 | 0.4052 | nan | 0.5951 | 0.6356 | 0.0227 | 0.8628 | 0.0832 | 0.0 | 0.4647 | 0.0468 | 0.5635 | 0.0 | 0.0052 | 0.7671 | 0.1405 | 0.5332 | 0.3266 | 0.3282 | nan | 
0.1798 | 0.5013 | 0.3971 | 0.0 | 0.8898 | 0.7736 | 0.9531 | 0.0060 | 0.1144 | 0.3709 | 0.0 | | 0.0993 | 32.7 | 6540 | 0.7215 | 0.4052 | 0.4845 | 0.8741 | nan | 0.8252 | 0.9563 | 0.7900 | 0.8293 | 0.5540 | nan | 0.7042 | 0.8283 | 0.1096 | 0.9620 | 0.0997 | 0.0 | 0.6786 | 0.0450 | 0.6849 | 0.0 | 0.0052 | 0.9311 | 0.1770 | 0.6198 | 0.3774 | 0.3792 | nan | 0.1926 | 0.6116 | 0.5805 | 0.0 | 0.9537 | 0.8576 | 0.9825 | 0.0017 | 0.2979 | 0.4689 | 0.0 | nan | 0.7450 | 0.8710 | 0.6980 | 0.7201 | 0.4009 | nan | 0.5858 | 0.6205 | 0.0619 | 0.8594 | 0.0887 | 0.0 | 0.4599 | 0.0450 | 0.5572 | 0.0 | 0.0052 | 0.7674 | 0.1559 | 0.5378 | 0.3225 | 0.3122 | nan | 0.1615 | 0.5010 | 0.3915 | 0.0 | 0.8896 | 0.7746 | 0.9524 | 0.0015 | 0.1138 | 0.3657 | 0.0 | | 0.0767 | 32.8 | 6560 | 0.7154 | 0.4119 | 0.4878 | 0.8752 | nan | 0.8259 | 0.9534 | 0.8095 | 0.8543 | 0.5497 | nan | 0.6975 | 0.7829 | 0.1343 | 0.9474 | 0.1097 | 0.0 | 0.6268 | 0.0537 | 0.7137 | 0.0 | 0.0033 | 0.9335 | 0.1615 | 0.6321 | 0.4182 | 0.4522 | nan | 0.2498 | 0.6249 | 0.5742 | 0.0 | 0.9536 | 0.8550 | 0.9824 | 0.0084 | 0.2087 | 0.4922 | 0.0 | nan | 0.7454 | 0.8740 | 0.6896 | 0.7329 | 0.3946 | nan | 0.5824 | 0.6412 | 0.0629 | 0.8653 | 0.0960 | 0.0 | 0.4605 | 0.0537 | 0.5902 | 0.0 | 0.0033 | 0.7651 | 0.1420 | 0.5385 | 0.3385 | 0.3546 | nan | 0.2151 | 0.5038 | 0.3986 | 0.0 | 0.8898 | 0.7714 | 0.9525 | 0.0068 | 0.1229 | 0.3897 | 0.0 | | 0.07 | 32.9 | 6580 | 0.7151 | 0.4086 | 0.4868 | 0.8742 | nan | 0.8154 | 0.9566 | 0.8130 | 0.8613 | 0.5386 | nan | 0.7003 | 0.8201 | 0.0959 | 0.9565 | 0.1221 | 0.0079 | 0.6187 | 0.0545 | 0.7487 | 0.0 | 0.0009 | 0.9347 | 0.2350 | 0.6063 | 0.3738 | 0.4407 | nan | 0.2614 | 0.6001 | 0.5478 | 0.0 | 0.9557 | 0.8371 | 0.9858 | 0.0061 | 0.2000 | 0.4813 | 0.0 | nan | 0.7412 | 0.8754 | 0.6716 | 0.7269 | 0.3948 | nan | 0.5877 | 0.6248 | 0.0466 | 0.8617 | 0.1046 | 0.0078 | 0.4182 | 0.0545 | 0.6030 | 0.0 | 0.0009 | 0.7639 | 0.1969 | 0.5336 | 0.3179 | 0.3477 | nan | 0.2122 | 0.4997 | 0.3894 | 0.0 | 0.8867 | 0.7627 
| 0.9507 | 0.0050 | 0.1037 | 0.3853 | 0.0 | | 0.0869 | 33.0 | 6600 | 0.7291 | 0.4098 | 0.4886 | 0.8733 | nan | 0.8000 | 0.9569 | 0.8014 | 0.8654 | 0.5538 | nan | 0.7208 | 0.8246 | 0.0684 | 0.9506 | 0.1201 | 0.0254 | 0.6151 | 0.0426 | 0.7512 | 0.0 | 0.0015 | 0.9316 | 0.2126 | 0.6254 | 0.3746 | 0.4522 | nan | 0.2562 | 0.6426 | 0.5521 | 0.0 | 0.9547 | 0.8444 | 0.9837 | 0.0108 | 0.2265 | 0.4712 | 0.0 | nan | 0.7315 | 0.8745 | 0.6940 | 0.7067 | 0.4021 | nan | 0.5896 | 0.6237 | 0.0401 | 0.8644 | 0.1045 | 0.0251 | 0.4398 | 0.0425 | 0.6079 | 0.0 | 0.0015 | 0.7688 | 0.1816 | 0.5406 | 0.3177 | 0.3504 | nan | 0.1928 | 0.5069 | 0.3942 | 0.0 | 0.8878 | 0.7669 | 0.9524 | 0.0084 | 0.1167 | 0.3818 | 0.0 | | 0.0907 | 33.1 | 6620 | 0.7083 | 0.4140 | 0.4938 | 0.8760 | nan | 0.8313 | 0.9492 | 0.7967 | 0.8513 | 0.5814 | nan | 0.7147 | 0.8239 | 0.0723 | 0.9481 | 0.1296 | 0.0329 | 0.6339 | 0.0548 | 0.7398 | 0.0 | 0.0030 | 0.9302 | 0.2007 | 0.6677 | 0.3820 | 0.4066 | nan | 0.3267 | 0.6150 | 0.5558 | 0.0 | 0.9557 | 0.8548 | 0.9816 | 0.0128 | 0.2743 | 0.4734 | 0.0 | nan | 0.7510 | 0.8790 | 0.6874 | 0.7512 | 0.3778 | nan | 0.5872 | 0.6345 | 0.0395 | 0.8648 | 0.1112 | 0.0328 | 0.4481 | 0.0548 | 0.5977 | 0.0 | 0.0030 | 0.7735 | 0.1747 | 0.5569 | 0.3216 | 0.3227 | nan | 0.2508 | 0.5051 | 0.3969 | 0.0 | 0.8878 | 0.7636 | 0.9535 | 0.0094 | 0.1310 | 0.3795 | 0.0 | | 0.0662 | 33.2 | 6640 | 0.7039 | 0.4131 | 0.4916 | 0.8772 | nan | 0.8298 | 0.9586 | 0.8088 | 0.8580 | 0.5375 | nan | 0.7101 | 0.7987 | 0.1955 | 0.9538 | 0.1375 | 0.0 | 0.6408 | 0.0635 | 0.7458 | 0.0 | 0.0037 | 0.9299 | 0.1396 | 0.6343 | 0.3837 | 0.4410 | nan | 0.2617 | 0.6124 | 0.5711 | 0.0 | 0.9535 | 0.8593 | 0.9840 | 0.0078 | 0.2362 | 0.4741 | 0.0 | nan | 0.7508 | 0.8788 | 0.6696 | 0.7440 | 0.3979 | nan | 0.5928 | 0.6328 | 0.0921 | 0.8642 | 0.1173 | 0.0 | 0.4648 | 0.0635 | 0.5916 | 0.0 | 0.0037 | 0.7705 | 0.1264 | 0.5459 | 0.3223 | 0.3503 | nan | 0.2286 | 0.5051 | 0.3944 | 0.0 | 0.8893 | 0.7750 | 0.9522 | 0.0059 | 0.1141 | 0.3754 | 
0.0 | | 0.0726 | 33.3 | 6660 | 0.7065 | 0.4084 | 0.4842 | 0.8759 | nan | 0.8363 | 0.9559 | 0.8071 | 0.8274 | 0.5425 | nan | 0.7215 | 0.8516 | 0.0306 | 0.9505 | 0.1148 | 0.0 | 0.6430 | 0.0477 | 0.6927 | 0.0 | 0.0025 | 0.9344 | 0.1697 | 0.6262 | 0.3146 | 0.4589 | nan | 0.2954 | 0.6111 | 0.5492 | 0.0 | 0.9557 | 0.8535 | 0.9834 | 0.0044 | 0.2195 | 0.4939 | 0.0 | nan | 0.7537 | 0.8756 | 0.6706 | 0.7371 | 0.4011 | nan | 0.5870 | 0.6198 | 0.0206 | 0.8659 | 0.0991 | 0.0 | 0.4766 | 0.0477 | 0.5819 | 0.0 | 0.0025 | 0.7675 | 0.1532 | 0.5420 | 0.2811 | 0.3503 | nan | 0.2542 | 0.5009 | 0.3853 | 0.0 | 0.8869 | 0.7699 | 0.9523 | 0.0035 | 0.1047 | 0.3782 | 0.0 | | 0.0765 | 33.4 | 6680 | 0.7081 | 0.4149 | 0.4950 | 0.8765 | nan | 0.8328 | 0.9578 | 0.8029 | 0.8399 | 0.5425 | nan | 0.7116 | 0.8201 | 0.0353 | 0.9547 | 0.1215 | 0.0 | 0.6405 | 0.0628 | 0.7235 | 0.0 | 0.0085 | 0.9322 | 0.1502 | 0.6530 | 0.4228 | 0.6183 | nan | 0.3093 | 0.6097 | 0.5750 | 0.0 | 0.9516 | 0.8377 | 0.9840 | 0.0098 | 0.2520 | 0.4802 | 0.0 | nan | 0.7528 | 0.8742 | 0.6729 | 0.7404 | 0.4038 | nan | 0.5890 | 0.6490 | 0.0240 | 0.8631 | 0.1061 | 0.0 | 0.4733 | 0.0628 | 0.5778 | 0.0 | 0.0085 | 0.7719 | 0.1361 | 0.5468 | 0.3464 | 0.4091 | nan | 0.2539 | 0.5036 | 0.3912 | 0.0 | 0.8882 | 0.7625 | 0.9521 | 0.0076 | 0.1290 | 0.3809 | 0.0 | | 0.0897 | 33.5 | 6700 | 0.7159 | 0.4094 | 0.4878 | 0.8751 | nan | 0.8256 | 0.9557 | 0.8043 | 0.8559 | 0.5403 | nan | 0.7293 | 0.8425 | 0.0251 | 0.9597 | 0.0810 | 0.0 | 0.6896 | 0.0414 | 0.7049 | 0.0 | 0.0086 | 0.9321 | 0.1699 | 0.6306 | 0.3758 | 0.5154 | nan | 0.2817 | 0.6137 | 0.5449 | 0.0 | 0.9536 | 0.8209 | 0.9839 | 0.0041 | 0.2244 | 0.4935 | 0.0 | nan | 0.7496 | 0.8754 | 0.6659 | 0.7466 | 0.3971 | nan | 0.5921 | 0.6412 | 0.0174 | 0.8602 | 0.0730 | 0.0 | 0.4721 | 0.0414 | 0.5749 | 0.0 | 0.0086 | 0.7695 | 0.1523 | 0.5378 | 0.3204 | 0.3785 | nan | 0.2446 | 0.5059 | 0.3870 | 0.0 | 0.8839 | 0.7483 | 0.9526 | 0.0031 | 0.1198 | 0.3810 | 0.0 | | 0.0776 | 33.6 | 6720 | 0.7226 | 0.4115 | 
0.4950 | 0.8749 | nan | 0.8214 | 0.9543 | 0.8080 | 0.8569 | 0.5551 | nan | 0.7005 | 0.8607 | 0.0413 | 0.9565 | 0.0964 | 0.0 | 0.6873 | 0.0378 | 0.7132 | 0.0 | 0.0026 | 0.9314 | 0.2169 | 0.6416 | 0.3864 | 0.5312 | nan | 0.3206 | 0.6181 | 0.5986 | 0.0 | 0.9525 | 0.8474 | 0.9843 | 0.0084 | 0.2475 | 0.4630 | 0.0 | nan | 0.7465 | 0.8753 | 0.6633 | 0.7468 | 0.3910 | nan | 0.5916 | 0.6261 | 0.0278 | 0.8626 | 0.0849 | 0.0 | 0.4609 | 0.0378 | 0.5827 | 0.0 | 0.0026 | 0.7684 | 0.1841 | 0.5377 | 0.3260 | 0.3794 | nan | 0.2578 | 0.5090 | 0.3896 | 0.0 | 0.8880 | 0.7629 | 0.9518 | 0.0058 | 0.1302 | 0.3758 | 0.0 | | 0.1109 | 33.7 | 6740 | 0.7219 | 0.4140 | 0.4975 | 0.8759 | nan | 0.8275 | 0.9576 | 0.8050 | 0.8541 | 0.5187 | nan | 0.7415 | 0.8474 | 0.0535 | 0.9539 | 0.1190 | 0.0 | 0.7124 | 0.0666 | 0.7268 | 0.0 | 0.0017 | 0.9352 | 0.1979 | 0.6278 | 0.3975 | 0.5485 | nan | 0.3189 | 0.6067 | 0.5829 | 0.0 | 0.9535 | 0.8316 | 0.9827 | 0.0100 | 0.2786 | 0.4640 | 0.0 | nan | 0.7494 | 0.8761 | 0.6653 | 0.7520 | 0.3944 | nan | 0.5889 | 0.6479 | 0.0342 | 0.8640 | 0.1020 | 0.0 | 0.4684 | 0.0666 | 0.5976 | 0.0 | 0.0017 | 0.7665 | 0.1717 | 0.5348 | 0.3320 | 0.3754 | nan | 0.2489 | 0.5057 | 0.3869 | 0.0 | 0.8872 | 0.7600 | 0.9523 | 0.0073 | 0.1342 | 0.3753 | 0.0 | | 0.0714 | 33.8 | 6760 | 0.7040 | 0.4191 | 0.5104 | 0.8769 | nan | 0.8311 | 0.9574 | 0.8046 | 0.8543 | 0.5370 | nan | 0.7318 | 0.8065 | 0.3629 | 0.9544 | 0.1247 | 0.0 | 0.7249 | 0.1480 | 0.7440 | 0.0 | 0.0020 | 0.9248 | 0.2265 | 0.6234 | 0.4050 | 0.5126 | nan | 0.2951 | 0.6125 | 0.5795 | 0.0 | 0.9579 | 0.8385 | 0.9825 | 0.0048 | 0.3099 | 0.4775 | 0.0 | nan | 0.7546 | 0.8781 | 0.6648 | 0.7504 | 0.4058 | nan | 0.5879 | 0.6292 | 0.1124 | 0.8647 | 0.1069 | 0.0 | 0.4684 | 0.1479 | 0.5998 | 0.0 | 0.0020 | 0.7721 | 0.1987 | 0.5353 | 0.3365 | 0.3703 | nan | 0.2205 | 0.5051 | 0.3877 | 0.0 | 0.8865 | 0.7618 | 0.9521 | 0.0035 | 0.1365 | 0.3713 | 0.0 | | 0.0615 | 33.9 | 6780 | 0.7146 | 0.4125 | 0.4896 | 0.8775 | nan | 0.8301 | 0.9632 | 0.7929 | 
0.8508 | 0.5108 | nan | 0.7121 | 0.8431 | 0.0684 | 0.9536 | 0.1265 | 0.0 | 0.7271 | 0.0748 | 0.6713 | 0.0 | 0.0 | 0.9263 | 0.2143 | 0.6247 | 0.3687 | 0.4975 | nan | 0.2420 | 0.6167 | 0.5793 | 0.0 | 0.9579 | 0.8454 | 0.9836 | 0.0041 | 0.1795 | 0.5033 | 0.0 | nan | 0.7504 | 0.8762 | 0.6669 | 0.7404 | 0.4170 | nan | 0.5865 | 0.6454 | 0.0385 | 0.8639 | 0.1077 | 0.0 | 0.4739 | 0.0748 | 0.5780 | 0.0 | 0.0 | 0.7720 | 0.1918 | 0.5342 | 0.3126 | 0.3662 | nan | 0.1871 | 0.5051 | 0.3988 | 0.0 | 0.8858 | 0.7658 | 0.9516 | 0.0035 | 0.1193 | 0.3869 | 0.0 | | 0.0893 | 34.0 | 6800 | 0.7000 | 0.4168 | 0.4974 | 0.8783 | nan | 0.8374 | 0.9617 | 0.7923 | 0.8609 | 0.5329 | nan | 0.7031 | 0.7904 | 0.1178 | 0.9539 | 0.1435 | 0.0022 | 0.8039 | 0.0996 | 0.7168 | 0.0 | 0.0020 | 0.9357 | 0.2202 | 0.6263 | 0.3937 | 0.5509 | nan | 0.2026 | 0.6028 | 0.5671 | 0.0 | 0.9493 | 0.8424 | 0.9830 | 0.0082 | 0.2195 | 0.4952 | 0.0 | nan | 0.7542 | 0.8793 | 0.6748 | 0.7430 | 0.4192 | nan | 0.5891 | 0.6673 | 0.0697 | 0.8621 | 0.1203 | 0.0022 | 0.4841 | 0.0995 | 0.5736 | 0.0 | 0.0020 | 0.7679 | 0.1916 | 0.5293 | 0.3269 | 0.3909 | nan | 0.1635 | 0.5054 | 0.3861 | 0.0 | 0.8889 | 0.7676 | 0.9519 | 0.0068 | 0.1331 | 0.3868 | 0.0 | | 0.0643 | 34.1 | 6820 | 0.7305 | 0.4052 | 0.4840 | 0.8741 | nan | 0.8217 | 0.9635 | 0.8015 | 0.8551 | 0.4787 | nan | 0.7021 | 0.8348 | 0.0715 | 0.9554 | 0.1245 | 0.0 | 0.6547 | 0.0495 | 0.7125 | 0.0 | 0.0079 | 0.9311 | 0.2483 | 0.5820 | 0.3625 | 0.4930 | nan | 0.1410 | 0.6551 | 0.5725 | 0.0 | 0.9555 | 0.8247 | 0.9831 | 0.0033 | 0.2233 | 0.4796 | 0.0 | nan | 0.7460 | 0.8727 | 0.6596 | 0.7339 | 0.3828 | nan | 0.5858 | 0.6495 | 0.0449 | 0.8623 | 0.1080 | 0.0 | 0.4414 | 0.0495 | 0.5549 | 0.0 | 0.0079 | 0.7635 | 0.2121 | 0.5171 | 0.3097 | 0.3602 | nan | 0.1169 | 0.5045 | 0.3949 | 0.0 | 0.8853 | 0.7568 | 0.9522 | 0.0027 | 0.1130 | 0.3797 | 0.0 | | 0.0835 | 34.2 | 6840 | 0.7173 | 0.4094 | 0.4836 | 0.8773 | nan | 0.8376 | 0.9589 | 0.7892 | 0.8537 | 0.5312 | nan | 0.7159 | 0.8236 | 0.0417 | 
0.9520 | 0.1248 | 0.0 | 0.6412 | 0.0239 | 0.6903 | 0.0 | 0.0034 | 0.9344 | 0.1832 | 0.6327 | 0.3625 | 0.5151 | nan | 0.2228 | 0.6314 | 0.5617 | 0.0 | 0.9543 | 0.8310 | 0.9842 | 0.0093 | 0.1743 | 0.4925 | 0.0 | nan | 0.7552 | 0.8780 | 0.6802 | 0.7442 | 0.4123 | nan | 0.5862 | 0.6551 | 0.0259 | 0.8620 | 0.1086 | 0.0 | 0.4597 | 0.0239 | 0.5638 | 0.0 | 0.0034 | 0.7681 | 0.1653 | 0.5335 | 0.3098 | 0.3808 | nan | 0.1794 | 0.5121 | 0.3907 | 0.0 | 0.8856 | 0.7535 | 0.9520 | 0.0069 | 0.1149 | 0.3910 | 0.0 | | 0.0742 | 34.3 | 6860 | 0.7048 | 0.4105 | 0.4888 | 0.8782 | nan | 0.8362 | 0.9591 | 0.8005 | 0.8552 | 0.5264 | nan | 0.7207 | 0.8449 | 0.0987 | 0.9547 | 0.1075 | 0.0 | 0.6571 | 0.0278 | 0.6951 | 0.0 | 0.0081 | 0.9278 | 0.1670 | 0.6597 | 0.3732 | 0.5320 | nan | 0.2193 | 0.6198 | 0.5725 | 0.0 | 0.9523 | 0.8574 | 0.9834 | 0.0013 | 0.1958 | 0.4896 | 0.0 | nan | 0.7562 | 0.8790 | 0.6689 | 0.7413 | 0.4090 | nan | 0.5854 | 0.6317 | 0.0561 | 0.8609 | 0.0946 | 0.0 | 0.4557 | 0.0278 | 0.5638 | 0.0 | 0.0081 | 0.7737 | 0.1521 | 0.5446 | 0.3168 | 0.3893 | nan | 0.1820 | 0.5130 | 0.3914 | 0.0 | 0.8888 | 0.7689 | 0.9520 | 0.0010 | 0.1353 | 0.3876 | 0.0 | | 0.1168 | 34.4 | 6880 | 0.6841 | 0.4150 | 0.4977 | 0.8794 | nan | 0.8529 | 0.9527 | 0.8020 | 0.8523 | 0.5574 | nan | 0.7265 | 0.8365 | 0.0637 | 0.9550 | 0.1016 | 0.0 | 0.6778 | 0.0272 | 0.7031 | 0.0 | 0.0212 | 0.9231 | 0.2925 | 0.6511 | 0.3985 | 0.5207 | nan | 0.2553 | 0.6245 | 0.5977 | 0.0 | 0.9545 | 0.8385 | 0.9839 | 0.0018 | 0.2349 | 0.5202 | 0.0 | nan | 0.7699 | 0.8812 | 0.6835 | 0.7501 | 0.3954 | nan | 0.5866 | 0.6470 | 0.0383 | 0.8615 | 0.0893 | 0.0 | 0.4509 | 0.0272 | 0.5621 | 0.0 | 0.0212 | 0.7771 | 0.2466 | 0.5537 | 0.3301 | 0.3799 | nan | 0.2045 | 0.5060 | 0.3862 | 0.0 | 0.8881 | 0.7639 | 0.9515 | 0.0015 | 0.1364 | 0.3917 | 0.0 | | 0.0595 | 34.5 | 6900 | 0.7130 | 0.4101 | 0.4904 | 0.8775 | nan | 0.8442 | 0.9532 | 0.8013 | 0.8490 | 0.5569 | nan | 0.7149 | 0.8149 | 0.1256 | 0.9551 | 0.1126 | 0.0 | 0.6683 | 0.0307 | 0.7014 | 
0.0 | 0.0049 | 0.9346 | 0.2040 | 0.6351 | 0.3813 | 0.4835 | nan | 0.2342 | 0.6111 | 0.5948 | 0.0 | 0.9543 | 0.8439 | 0.9837 | 0.0032 | 0.2197 | 0.4776 | 0.0 | nan | 0.7597 | 0.8772 | 0.6766 | 0.7504 | 0.3995 | nan | 0.5850 | 0.6392 | 0.0631 | 0.8600 | 0.0989 | 0.0 | 0.4336 | 0.0307 | 0.5614 | 0.0 | 0.0049 | 0.7709 | 0.1801 | 0.5459 | 0.3220 | 0.3608 | nan | 0.1852 | 0.5028 | 0.3971 | 0.0 | 0.8874 | 0.7644 | 0.9520 | 0.0026 | 0.1309 | 0.3810 | 0.0 | | 0.0885 | 34.6 | 6920 | 0.7065 | 0.4110 | 0.4956 | 0.8773 | nan | 0.8436 | 0.9571 | 0.7963 | 0.8438 | 0.5365 | nan | 0.7003 | 0.8339 | 0.1525 | 0.9518 | 0.1117 | 0.0 | 0.6927 | 0.0622 | 0.7234 | 0.0 | 0.0065 | 0.9294 | 0.2014 | 0.6350 | 0.3897 | 0.5032 | nan | 0.2568 | 0.6164 | 0.5865 | 0.0 | 0.9536 | 0.8522 | 0.9817 | 0.0117 | 0.2515 | 0.4782 | 0.0 | nan | 0.7598 | 0.8766 | 0.6939 | 0.7435 | 0.4051 | nan | 0.5845 | 0.6021 | 0.0802 | 0.8648 | 0.0967 | 0.0 | 0.4335 | 0.0622 | 0.5517 | 0.0 | 0.0065 | 0.7722 | 0.1793 | 0.5491 | 0.3233 | 0.3673 | nan | 0.1997 | 0.5005 | 0.3978 | 0.0 | 0.8879 | 0.7661 | 0.9532 | 0.0080 | 0.1146 | 0.3710 | 0.0 | | 0.0715 | 34.7 | 6940 | 0.7091 | 0.4125 | 0.4943 | 0.8776 | nan | 0.8471 | 0.9560 | 0.7982 | 0.8358 | 0.5498 | nan | 0.7191 | 0.8264 | 0.0810 | 0.9588 | 0.1080 | 0.0 | 0.6942 | 0.0434 | 0.6989 | 0.0 | 0.0072 | 0.9267 | 0.2175 | 0.6463 | 0.3871 | 0.4835 | nan | 0.3013 | 0.6419 | 0.5906 | 0.0 | 0.9537 | 0.8299 | 0.9838 | 0.0065 | 0.2428 | 0.4815 | 0.0 | nan | 0.7602 | 0.8772 | 0.7021 | 0.7450 | 0.4044 | nan | 0.5904 | 0.6210 | 0.0448 | 0.8598 | 0.0944 | 0.0 | 0.4384 | 0.0434 | 0.5587 | 0.0 | 0.0072 | 0.7725 | 0.1907 | 0.5470 | 0.3270 | 0.3721 | nan | 0.2330 | 0.5032 | 0.4042 | 0.0 | 0.8870 | 0.7600 | 0.9526 | 0.0048 | 0.1173 | 0.3813 | 0.0 | | 0.0908 | 34.8 | 6960 | 0.7034 | 0.4133 | 0.4963 | 0.8781 | nan | 0.8463 | 0.9567 | 0.7999 | 0.8399 | 0.5551 | nan | 0.7234 | 0.8263 | 0.1250 | 0.9508 | 0.1135 | 0.0 | 0.6795 | 0.0507 | 0.7067 | 0.0 | 0.0101 | 0.9316 | 0.2536 | 0.6314 | 0.3883 | 
0.4968 | nan | 0.2776 | 0.6215 | 0.5755 | 0.0 | 0.9553 | 0.8408 | 0.9832 | 0.0066 | 0.2792 | 0.4559 | 0.0 | nan | 0.7646 | 0.8796 | 0.6948 | 0.7483 | 0.4008 | nan | 0.5927 | 0.6159 | 0.0623 | 0.8653 | 0.0994 | 0.0 | 0.4262 | 0.0507 | 0.5722 | 0.0 | 0.0101 | 0.7701 | 0.2134 | 0.5379 | 0.3254 | 0.3697 | nan | 0.2032 | 0.5077 | 0.4076 | 0.0 | 0.8870 | 0.7637 | 0.9529 | 0.0047 | 0.1300 | 0.3709 | 0.0 | | 0.0926 | 34.9 | 6980 | 0.7226 | 0.4129 | 0.5021 | 0.8743 | nan | 0.8084 | 0.9567 | 0.7951 | 0.8701 | 0.5444 | nan | 0.7083 | 0.8245 | 0.1514 | 0.9543 | 0.1348 | 0.0 | 0.6805 | 0.0463 | 0.7060 | 0.0 | 0.0069 | 0.9234 | 0.2647 | 0.6287 | 0.3968 | 0.5927 | nan | 0.2616 | 0.6449 | 0.5906 | 0.0 | 0.9519 | 0.8647 | 0.9833 | 0.0077 | 0.2975 | 0.4714 | 0.0 | nan | 0.7343 | 0.8782 | 0.6921 | 0.6995 | 0.4072 | nan | 0.5899 | 0.6290 | 0.0722 | 0.8616 | 0.1157 | 0.0 | 0.4273 | 0.0463 | 0.5762 | 0.0 | 0.0069 | 0.7722 | 0.2214 | 0.5357 | 0.3314 | 0.3707 | nan | 0.1966 | 0.5107 | 0.3979 | 0.0 | 0.8896 | 0.7743 | 0.9526 | 0.0057 | 0.1434 | 0.3734 | 0.0 | | 0.072 | 35.0 | 7000 | 0.7112 | 0.4118 | 0.4961 | 0.8770 | nan | 0.8373 | 0.9527 | 0.7939 | 0.8673 | 0.5509 | nan | 0.7249 | 0.8349 | 0.1411 | 0.9566 | 0.1426 | 0.0 | 0.6715 | 0.0284 | 0.7095 | 0.0 | 0.0032 | 0.9373 | 0.2113 | 0.6021 | 0.3573 | 0.5783 | nan | 0.2289 | 0.6315 | 0.5659 | 0.0 | 0.9492 | 0.8623 | 0.9810 | 0.0086 | 0.2546 | 0.4921 | 0.0 | nan | 0.7609 | 0.8859 | 0.6978 | 0.7222 | 0.3807 | nan | 0.5923 | 0.6311 | 0.0732 | 0.8622 | 0.1218 | 0.0 | 0.4430 | 0.0284 | 0.5737 | 0.0 | 0.0032 | 0.7643 | 0.1859 | 0.5229 | 0.3097 | 0.3805 | nan | 0.1885 | 0.5066 | 0.4033 | 0.0 | 0.8896 | 0.7731 | 0.9540 | 0.0062 | 0.1348 | 0.3822 | 0.0 | | 0.0878 | 35.1 | 7020 | 0.7032 | 0.4121 | 0.4950 | 0.8796 | nan | 0.8632 | 0.9551 | 0.8148 | 0.8522 | 0.5301 | nan | 0.7068 | 0.8316 | 0.0810 | 0.9546 | 0.1448 | 0.0 | 0.6698 | 0.0374 | 0.7077 | 0.0 | 0.0 | 0.9315 | 0.1499 | 0.6055 | 0.3846 | 0.6106 | nan | 0.2612 | 0.6163 | 0.5929 | 0.0 | 0.9509 | 
0.8546 | 0.9852 | 0.0106 | 0.2482 | 0.4902 | 0.0 | nan | 0.7732 | 0.8875 | 0.6897 | 0.7523 | 0.3764 | nan | 0.5895 | 0.6392 | 0.0493 | 0.8625 | 0.1203 | 0.0 | 0.4445 | 0.0374 | 0.5728 | 0.0 | 0.0 | 0.7643 | 0.1360 | 0.5188 | 0.3272 | 0.3897 | nan | 0.2108 | 0.5050 | 0.3999 | 0.0 | 0.8897 | 0.7721 | 0.9513 | 0.0080 | 0.1363 | 0.3830 | 0.0 | | 0.0617 | 35.2 | 7040 | 0.7085 | 0.4076 | 0.4862 | 0.8779 | nan | 0.8469 | 0.9541 | 0.8175 | 0.8613 | 0.5417 | nan | 0.7101 | 0.8428 | 0.0342 | 0.9524 | 0.1063 | 0.0 | 0.6526 | 0.0122 | 0.7009 | 0.0 | 0.0007 | 0.9350 | 0.1259 | 0.5991 | 0.3807 | 0.5674 | nan | 0.2156 | 0.6201 | 0.5787 | 0.0 | 0.9548 | 0.8446 | 0.9818 | 0.0100 | 0.2173 | 0.4933 | 0.0 | nan | 0.7655 | 0.8837 | 0.6960 | 0.7419 | 0.3874 | nan | 0.5899 | 0.6424 | 0.0234 | 0.8619 | 0.0929 | 0.0 | 0.4448 | 0.0122 | 0.5773 | 0.0 | 0.0007 | 0.7608 | 0.1156 | 0.5165 | 0.3232 | 0.3777 | nan | 0.1875 | 0.5068 | 0.4089 | 0.0 | 0.8878 | 0.7641 | 0.9526 | 0.0079 | 0.1262 | 0.3872 | 0.0 | | 0.0636 | 35.3 | 7060 | 0.7015 | 0.4130 | 0.4922 | 0.8782 | nan | 0.8488 | 0.9566 | 0.8009 | 0.8527 | 0.5343 | nan | 0.7245 | 0.8251 | 0.0319 | 0.9549 | 0.1212 | 0.0 | 0.6995 | 0.0283 | 0.6915 | 0.0 | 0.0028 | 0.9273 | 0.2412 | 0.6259 | 0.3709 | 0.5330 | nan | 0.2623 | 0.6272 | 0.5727 | 0.0 | 0.9555 | 0.8255 | 0.9827 | 0.0109 | 0.2392 | 0.5022 | 0.0 | nan | 0.7659 | 0.8814 | 0.6998 | 0.7451 | 0.3928 | nan | 0.5907 | 0.6495 | 0.0203 | 0.8597 | 0.1045 | 0.0 | 0.4582 | 0.0283 | 0.5761 | 0.0 | 0.0028 | 0.7697 | 0.2071 | 0.5300 | 0.3199 | 0.3712 | nan | 0.2169 | 0.5065 | 0.4021 | 0.0 | 0.8852 | 0.7539 | 0.9524 | 0.0086 | 0.1298 | 0.3871 | 0.0 | | 0.0551 | 35.4 | 7080 | 0.6996 | 0.4136 | 0.4917 | 0.8791 | nan | 0.8576 | 0.9591 | 0.8023 | 0.8473 | 0.5101 | nan | 0.7156 | 0.8227 | 0.0597 | 0.9540 | 0.1384 | 0.0 | 0.6836 | 0.0255 | 0.7013 | 0.0 | 0.0024 | 0.9314 | 0.2454 | 0.6274 | 0.3617 | 0.5197 | nan | 0.2522 | 0.6283 | 0.5832 | 0.0 | 0.9528 | 0.8283 | 0.9866 | 0.0045 | 0.2473 | 0.4846 | 0.0 | nan 
| 0.7706 | 0.8814 | 0.7086 | 0.7500 | 0.3846 | nan | 0.5886 | 0.6455 | 0.0359 | 0.8613 | 0.1172 | 0.0 | 0.4454 | 0.0255 | 0.5828 | 0.0 | 0.0024 | 0.7678 | 0.2096 | 0.5277 | 0.3131 | 0.3811 | nan | 0.2117 | 0.5064 | 0.4019 | 0.0 | 0.8859 | 0.7581 | 0.9508 | 0.0038 | 0.1357 | 0.3816 | 0.0 | | 0.0555 | 35.5 | 7100 | 0.7008 | 0.4108 | 0.4897 | 0.8788 | nan | 0.8481 | 0.9579 | 0.8094 | 0.8626 | 0.5052 | nan | 0.7268 | 0.8382 | 0.0548 | 0.9542 | 0.1216 | 0.0 | 0.6768 | 0.0288 | 0.7086 | 0.0 | 0.0040 | 0.9293 | 0.1859 | 0.6175 | 0.3534 | 0.5334 | nan | 0.2739 | 0.6032 | 0.5648 | 0.0 | 0.9600 | 0.8306 | 0.9847 | 0.0082 | 0.2428 | 0.4855 | 0.0 | nan | 0.7684 | 0.8847 | 0.6922 | 0.7430 | 0.3884 | nan | 0.5942 | 0.6428 | 0.0347 | 0.8619 | 0.1038 | 0.0 | 0.4355 | 0.0288 | 0.5760 | 0.0 | 0.0040 | 0.7669 | 0.1662 | 0.5237 | 0.3013 | 0.3811 | nan | 0.2261 | 0.5022 | 0.4037 | 0.0 | 0.8831 | 0.7549 | 0.9516 | 0.0065 | 0.1401 | 0.3809 | 0.0 | | 0.0685 | 35.6 | 7120 | 0.6991 | 0.4137 | 0.4954 | 0.8780 | nan | 0.8426 | 0.9550 | 0.8109 | 0.8636 | 0.5350 | nan | 0.7355 | 0.8334 | 0.0566 | 0.9551 | 0.1427 | 0.0012 | 0.7120 | 0.0213 | 0.6881 | 0.0 | 0.0188 | 0.9347 | 0.2164 | 0.6212 | 0.3516 | 0.5650 | nan | 0.2964 | 0.6247 | 0.5762 | 0.0 | 0.9528 | 0.8363 | 0.9829 | 0.0074 | 0.2304 | 0.4844 | 0.0 | nan | 0.7640 | 0.8856 | 0.6964 | 0.7336 | 0.3824 | nan | 0.5942 | 0.6517 | 0.0340 | 0.8610 | 0.1194 | 0.0012 | 0.4459 | 0.0213 | 0.5836 | 0.0 | 0.0187 | 0.7668 | 0.1879 | 0.5232 | 0.3031 | 0.3881 | nan | 0.2392 | 0.5056 | 0.4001 | 0.0 | 0.8858 | 0.7562 | 0.9529 | 0.0062 | 0.1420 | 0.3893 | 0.0 | | 0.0688 | 35.7 | 7140 | 0.7115 | 0.4137 | 0.4982 | 0.8749 | nan | 0.8133 | 0.9552 | 0.7908 | 0.8666 | 0.5457 | nan | 0.7425 | 0.8438 | 0.0803 | 0.9552 | 0.1694 | 0.0015 | 0.7415 | 0.0262 | 0.6994 | 0.0 | 0.0347 | 0.9313 | 0.1861 | 0.6127 | 0.3975 | 0.5372 | nan | 0.2925 | 0.6242 | 0.5672 | 0.0 | 0.9508 | 0.8592 | 0.9825 | 0.0076 | 0.2294 | 0.4985 | 0.0 | nan | 0.7406 | 0.8803 | 0.7048 | 0.7144 | 
0.3771 | nan | 0.5912 | 0.6374 | 0.0471 | 0.8628 | 0.1437 | 0.0015 | 0.4355 | 0.0262 | 0.5822 | 0.0 | 0.0345 | 0.7694 | 0.1668 | 0.5288 | 0.3207 | 0.3802 | nan | 0.2327 | 0.5057 | 0.3945 | 0.0 | 0.8890 | 0.7673 | 0.9535 | 0.0067 | 0.1502 | 0.3926 | 0.0 | | 0.0911 | 35.8 | 7160 | 0.7127 | 0.4171 | 0.5055 | 0.8749 | nan | 0.8198 | 0.9536 | 0.7962 | 0.8677 | 0.5529 | nan | 0.7179 | 0.8123 | 0.2586 | 0.9532 | 0.1314 | 0.0 | 0.7348 | 0.0632 | 0.7108 | 0.0 | 0.0530 | 0.9332 | 0.2061 | 0.6238 | 0.4007 | 0.5218 | nan | 0.3449 | 0.6127 | 0.5598 | 0.0 | 0.9534 | 0.8369 | 0.9825 | 0.0082 | 0.2718 | 0.4932 | 0.0 | nan | 0.7420 | 0.8851 | 0.6962 | 0.7035 | 0.3731 | nan | 0.5890 | 0.6483 | 0.1169 | 0.8648 | 0.1145 | 0.0 | 0.4173 | 0.0632 | 0.5896 | 0.0 | 0.0524 | 0.7712 | 0.1831 | 0.5377 | 0.3297 | 0.3778 | nan | 0.2610 | 0.5062 | 0.3907 | 0.0 | 0.8873 | 0.7594 | 0.9537 | 0.0071 | 0.1381 | 0.3888 | 0.0 | | 0.0745 | 35.9 | 7180 | 0.6970 | 0.4162 | 0.5017 | 0.8765 | nan | 0.8326 | 0.9566 | 0.7924 | 0.8683 | 0.5136 | nan | 0.7218 | 0.7968 | 0.3063 | 0.9562 | 0.1121 | 0.0 | 0.6826 | 0.0618 | 0.7158 | 0.0 | 0.0413 | 0.9271 | 0.2218 | 0.6374 | 0.3751 | 0.4786 | nan | 0.3344 | 0.6221 | 0.5697 | 0.0 | 0.9569 | 0.8356 | 0.9849 | 0.0138 | 0.2779 | 0.4618 | 0.0 | nan | 0.7538 | 0.8827 | 0.7012 | 0.7188 | 0.3717 | nan | 0.5949 | 0.6410 | 0.1112 | 0.8610 | 0.0984 | 0.0 | 0.4344 | 0.0618 | 0.5806 | 0.0 | 0.0409 | 0.7695 | 0.1922 | 0.5333 | 0.3247 | 0.3578 | nan | 0.2487 | 0.5049 | 0.3946 | 0.0 | 0.8862 | 0.7612 | 0.9519 | 0.0110 | 0.1510 | 0.3789 | 0.0 | | 0.098 | 36.0 | 7200 | 0.7067 | 0.4167 | 0.5020 | 0.8761 | nan | 0.8272 | 0.9580 | 0.7997 | 0.8615 | 0.5146 | nan | 0.7204 | 0.7786 | 0.3260 | 0.9567 | 0.1210 | 0.0 | 0.7143 | 0.0673 | 0.7219 | 0.0 | 0.0364 | 0.9272 | 0.1989 | 0.6440 | 0.3668 | 0.4954 | nan | 0.3236 | 0.6110 | 0.5818 | 0.0 | 0.9562 | 0.8395 | 0.9854 | 0.0191 | 0.2414 | 0.4706 | 0.0 | nan | 0.7499 | 0.8770 | 0.6894 | 0.7338 | 0.3880 | nan | 0.5896 | 0.6261 | 0.1040 | 0.8598 | 
0.1053 | 0.0 | 0.4644 | 0.0673 | 0.5807 | 0.0 | 0.0362 | 0.7711 | 0.1764 | 0.5369 | 0.3151 | 0.3816 | nan | 0.2475 | 0.5046 | 0.3927 | 0.0 | 0.8861 | 0.7613 | 0.9518 | 0.0145 | 0.1437 | 0.3786 | 0.0 | | 0.0913 | 36.1 | 7220 | 0.6997 | 0.4150 | 0.4988 | 0.8783 | nan | 0.8518 | 0.9561 | 0.8063 | 0.8402 | 0.5311 | nan | 0.7216 | 0.7947 | 0.2903 | 0.9534 | 0.1148 | 0.0 | 0.6703 | 0.0730 | 0.7298 | 0.0 | 0.0158 | 0.9356 | 0.1644 | 0.6500 | 0.3514 | 0.4716 | nan | 0.3353 | 0.6106 | 0.5744 | 0.0 | 0.9499 | 0.8522 | 0.9845 | 0.0146 | 0.2490 | 0.4688 | 0.0 | nan | 0.7663 | 0.8818 | 0.6878 | 0.7574 | 0.3797 | nan | 0.5869 | 0.6227 | 0.1008 | 0.8619 | 0.1000 | 0.0 | 0.4523 | 0.0730 | 0.5852 | 0.0 | 0.0157 | 0.7675 | 0.1473 | 0.5364 | 0.3086 | 0.3620 | nan | 0.2524 | 0.5042 | 0.3920 | 0.0 | 0.8884 | 0.7666 | 0.9527 | 0.0105 | 0.1416 | 0.3776 | 0.0 | | 0.0637 | 36.2 | 7240 | 0.6977 | 0.4142 | 0.4944 | 0.8788 | nan | 0.8588 | 0.9543 | 0.8064 | 0.8406 | 0.5492 | nan | 0.7120 | 0.8186 | 0.1689 | 0.9555 | 0.1137 | 0.0 | 0.6676 | 0.0581 | 0.7305 | 0.0 | 0.0312 | 0.9309 | 0.1872 | 0.6187 | 0.3752 | 0.4368 | nan | 0.2872 | 0.6168 | 0.5789 | 0.0 | 0.9575 | 0.8339 | 0.9822 | 0.0152 | 0.2486 | 0.4868 | 0.0 | nan | 0.7745 | 0.8842 | 0.6891 | 0.7590 | 0.3856 | nan | 0.5883 | 0.6318 | 0.0757 | 0.8617 | 0.1009 | 0.0 | 0.4530 | 0.0581 | 0.5816 | 0.0 | 0.0312 | 0.7674 | 0.1672 | 0.5317 | 0.3163 | 0.3459 | nan | 0.2318 | 0.5034 | 0.3969 | 0.0 | 0.8852 | 0.7579 | 0.9540 | 0.0101 | 0.1300 | 0.3808 | 0.0 | | 0.0577 | 36.3 | 7260 | 0.6996 | 0.4130 | 0.4876 | 0.8784 | nan | 0.8582 | 0.9553 | 0.7909 | 0.8403 | 0.5440 | nan | 0.7311 | 0.8018 | 0.1001 | 0.9540 | 0.1164 | 0.0 | 0.6541 | 0.0438 | 0.6995 | 0.0 | 0.0241 | 0.9389 | 0.1828 | 0.6238 | 0.3535 | 0.4863 | nan | 0.2532 | 0.6019 | 0.5866 | 0.0 | 0.9520 | 0.8268 | 0.9827 | 0.0112 | 0.1905 | 0.5003 | 0.0 | nan | 0.7702 | 0.8815 | 0.7048 | 0.7527 | 0.3851 | nan | 0.5930 | 0.6534 | 0.0464 | 0.8612 | 0.1019 | 0.0 | 0.4455 | 0.0438 | 0.5897 | 0.0 | 
0.0240 | 0.7643 | 0.1630 | 0.5292 | 0.3078 | 0.3715 | nan | 0.2076 | 0.5011 | 0.3996 | 0.0 | 0.8853 | 0.7536 | 0.9532 | 0.0087 | 0.1253 | 0.3925 | 0.0 | | 0.0702 | 36.4 | 7280 | 0.6983 | 0.4149 | 0.4925 | 0.8794 | nan | 0.8620 | 0.9546 | 0.8033 | 0.8408 | 0.5509 | nan | 0.7143 | 0.8175 | 0.0806 | 0.9553 | 0.1160 | 0.0 | 0.6629 | 0.0396 | 0.6983 | 0.0 | 0.0254 | 0.9273 | 0.2064 | 0.6429 | 0.3873 | 0.4986 | nan | 0.2752 | 0.6216 | 0.5645 | 0.0 | 0.9558 | 0.8278 | 0.9822 | 0.0105 | 0.2456 | 0.4927 | 0.0 | nan | 0.7758 | 0.8830 | 0.6970 | 0.7579 | 0.3846 | nan | 0.5898 | 0.6602 | 0.0442 | 0.8605 | 0.1009 | 0.0 | 0.4327 | 0.0396 | 0.5747 | 0.0 | 0.0254 | 0.7718 | 0.1811 | 0.5403 | 0.3225 | 0.3797 | nan | 0.2263 | 0.5026 | 0.3968 | 0.0 | 0.8848 | 0.7530 | 0.9532 | 0.0078 | 0.1448 | 0.3873 | 0.0 | | 0.0964 | 36.5 | 7300 | 0.7178 | 0.4104 | 0.4857 | 0.8780 | nan | 0.8624 | 0.9553 | 0.8127 | 0.8304 | 0.5364 | nan | 0.7039 | 0.8188 | 0.0519 | 0.9557 | 0.1239 | 0.0 | 0.6651 | 0.0256 | 0.6564 | 0.0 | 0.0216 | 0.9335 | 0.2001 | 0.6143 | 0.3561 | 0.4989 | nan | 0.2867 | 0.6079 | 0.5412 | 0.0 | 0.9570 | 0.8218 | 0.9826 | 0.0098 | 0.2261 | 0.4856 | 0.0 | nan | 0.7748 | 0.8830 | 0.6690 | 0.7528 | 0.3793 | nan | 0.5862 | 0.6547 | 0.0308 | 0.8591 | 0.1060 | 0.0 | 0.4377 | 0.0256 | 0.5692 | 0.0 | 0.0215 | 0.7674 | 0.1777 | 0.5325 | 0.3070 | 0.3708 | nan | 0.2353 | 0.5018 | 0.3911 | 0.0 | 0.8832 | 0.7496 | 0.9528 | 0.0068 | 0.1247 | 0.3816 | 0.0 | | 0.0779 | 36.6 | 7320 | 0.7006 | 0.4113 | 0.4883 | 0.8802 | nan | 0.8665 | 0.9604 | 0.7948 | 0.8530 | 0.4945 | nan | 0.6943 | 0.8122 | 0.0683 | 0.9578 | 0.1127 | 0.0 | 0.6658 | 0.0479 | 0.6622 | 0.0 | 0.0275 | 0.9319 | 0.1801 | 0.6153 | 0.3630 | 0.5298 | nan | 0.2535 | 0.6365 | 0.5624 | 0.0 | 0.9528 | 0.8414 | 0.9845 | 0.0084 | 0.2596 | 0.4869 | 0.0 | nan | 0.7764 | 0.8868 | 0.6850 | 0.7540 | 0.3797 | nan | 0.5877 | 0.6487 | 0.0401 | 0.8588 | 0.0975 | 0.0 | 0.4239 | 0.0478 | 0.5621 | 0.0 | 0.0273 | 0.7669 | 0.1615 | 0.5356 | 0.3141 | 0.3862 
| nan | 0.2107 | 0.5080 | 0.3909 | 0.0 | 0.8876 | 0.7632 | 0.9520 | 0.0057 | 0.1236 | 0.3791 | 0.0 | | 0.0819 | 36.7 | 7340 | 0.6940 | 0.4133 | 0.4937 | 0.8800 | nan | 0.8635 | 0.9556 | 0.8034 | 0.8453 | 0.5412 | nan | 0.7088 | 0.8302 | 0.0977 | 0.9553 | 0.1181 | 0.0 | 0.6735 | 0.0520 | 0.6906 | 0.0024 | 0.0285 | 0.9300 | 0.1750 | 0.6211 | 0.3797 | 0.4705 | nan | 0.2997 | 0.6162 | 0.5961 | 0.0 | 0.9550 | 0.8468 | 0.9837 | 0.0117 | 0.2575 | 0.4882 | 0.0 | nan | 0.7777 | 0.8874 | 0.6822 | 0.7598 | 0.3861 | nan | 0.5925 | 0.6507 | 0.0582 | 0.8616 | 0.1026 | 0.0 | 0.4242 | 0.0520 | 0.5613 | 0.0024 | 0.0280 | 0.7668 | 0.1571 | 0.5304 | 0.3237 | 0.3636 | nan | 0.2547 | 0.5030 | 0.3807 | 0.0 | 0.8867 | 0.7599 | 0.9527 | 0.0080 | 0.1316 | 0.3807 | 0.0 | | 0.0808 | 36.8 | 7360 | 0.7044 | 0.4140 | 0.4945 | 0.8787 | nan | 0.8521 | 0.9593 | 0.8008 | 0.8295 | 0.5274 | nan | 0.7146 | 0.8379 | 0.0985 | 0.9559 | 0.1124 | 0.0 | 0.7056 | 0.0324 | 0.7043 | 0.0 | 0.0238 | 0.9325 | 0.2648 | 0.6339 | 0.3744 | 0.4287 | nan | 0.3486 | 0.6128 | 0.5930 | 0.0 | 0.9544 | 0.8436 | 0.9831 | 0.0106 | 0.2225 | 0.4661 | 0.0 | nan | 0.7659 | 0.8791 | 0.6990 | 0.7507 | 0.3942 | nan | 0.5905 | 0.6398 | 0.0591 | 0.8624 | 0.0988 | 0.0 | 0.4251 | 0.0324 | 0.5589 | 0.0 | 0.0237 | 0.7672 | 0.2195 | 0.5346 | 0.3235 | 0.3453 | nan | 0.2728 | 0.5047 | 0.3876 | 0.0 | 0.8877 | 0.7607 | 0.9530 | 0.0081 | 0.1244 | 0.3778 | 0.0 | | 0.0728 | 36.9 | 7380 | 0.7084 | 0.4142 | 0.4939 | 0.8775 | nan | 0.8526 | 0.9574 | 0.7959 | 0.8252 | 0.5435 | nan | 0.7087 | 0.8486 | 0.0754 | 0.9540 | 0.1073 | 0.0 | 0.7192 | 0.0265 | 0.6834 | 0.0 | 0.0597 | 0.9352 | 0.2742 | 0.6030 | 0.3712 | 0.4912 | nan | 0.3306 | 0.6073 | 0.5723 | 0.0 | 0.9549 | 0.8389 | 0.9843 | 0.0061 | 0.1954 | 0.4826 | 0.0 | nan | 0.7638 | 0.8777 | 0.7027 | 0.7416 | 0.3910 | nan | 0.5880 | 0.6377 | 0.0475 | 0.8630 | 0.0950 | 0.0 | 0.4225 | 0.0265 | 0.5725 | 0.0 | 0.0590 | 0.7648 | 0.2240 | 0.5250 | 0.3208 | 0.3676 | nan | 0.2636 | 0.5029 | 0.3931 | 0.0 | 
0.8882 | 0.7638 | 0.9525 | 0.0050 | 0.1131 | 0.3814 | 0.0 | | 0.1764 | 37.0 | 7400 | 0.7037 | 0.4157 | 0.4940 | 0.8778 | nan | 0.8468 | 0.9595 | 0.7919 | 0.8447 | 0.5265 | nan | 0.6947 | 0.8410 | 0.0717 | 0.9525 | 0.1123 | 0.0 | 0.6746 | 0.0423 | 0.6907 | 0.0 | 0.0719 | 0.9325 | 0.2086 | 0.6238 | 0.3980 | 0.5253 | nan | 0.3076 | 0.6047 | 0.5729 | 0.0 | 0.9522 | 0.8394 | 0.9850 | 0.0059 | 0.2324 | 0.4974 | 0.0 | nan | 0.7613 | 0.8786 | 0.7001 | 0.7416 | 0.3913 | nan | 0.5885 | 0.6472 | 0.0451 | 0.8637 | 0.0987 | 0.0 | 0.4451 | 0.0423 | 0.5728 | 0.0 | 0.0705 | 0.7681 | 0.1829 | 0.5288 | 0.3336 | 0.3785 | nan | 0.2514 | 0.5018 | 0.3933 | 0.0 | 0.8883 | 0.7619 | 0.9517 | 0.0048 | 0.1275 | 0.3840 | 0.0 | | 0.0895 | 37.1 | 7420 | 0.6999 | 0.4126 | 0.4867 | 0.8771 | nan | 0.8503 | 0.9592 | 0.7848 | 0.8346 | 0.5228 | nan | 0.7162 | 0.8293 | 0.0593 | 0.9567 | 0.1175 | 0.0 | 0.6605 | 0.0387 | 0.6881 | 0.0 | 0.0442 | 0.9354 | 0.1439 | 0.6318 | 0.3631 | 0.4673 | nan | 0.3297 | 0.6239 | 0.5750 | 0.0 | 0.9561 | 0.8125 | 0.9823 | 0.0063 | 0.2213 | 0.4635 | 0.0 | nan | 0.7630 | 0.8783 | 0.7080 | 0.7374 | 0.3937 | nan | 0.5928 | 0.6596 | 0.0380 | 0.8603 | 0.1037 | 0.0 | 0.4432 | 0.0387 | 0.5748 | 0.0 | 0.0438 | 0.7654 | 0.1303 | 0.5292 | 0.3127 | 0.3603 | nan | 0.2676 | 0.5057 | 0.3996 | 0.0 | 0.8828 | 0.7418 | 0.9535 | 0.0050 | 0.1345 | 0.3791 | 0.0 | | 0.0634 | 37.2 | 7440 | 0.7077 | 0.4125 | 0.4917 | 0.8772 | nan | 0.8502 | 0.9576 | 0.7968 | 0.8249 | 0.5279 | nan | 0.7244 | 0.8363 | 0.0593 | 0.9579 | 0.1177 | 0.0 | 0.7277 | 0.0377 | 0.6814 | 0.0 | 0.0241 | 0.9348 | 0.1902 | 0.6227 | 0.3439 | 0.4986 | nan | 0.3346 | 0.6298 | 0.5920 | 0.0 | 0.9556 | 0.8270 | 0.9823 | 0.0088 | 0.2083 | 0.4821 | 0.0 | nan | 0.7617 | 0.8781 | 0.7008 | 0.7425 | 0.3955 | nan | 0.5903 | 0.6593 | 0.0376 | 0.8607 | 0.1038 | 0.0 | 0.4193 | 0.0377 | 0.5790 | 0.0 | 0.0239 | 0.7656 | 0.1680 | 0.5331 | 0.3008 | 0.3671 | nan | 0.2706 | 0.5073 | 0.3960 | 0.0 | 0.8836 | 0.7501 | 0.9531 | 0.0067 | 0.1227 | 0.3856 
| 0.0 | | 0.0869 | 37.3 | 7460 | 0.7203 | 0.4136 | 0.4989 | 0.8760 | nan | 0.8378 | 0.9566 | 0.8226 | 0.8214 | 0.5284 | nan | 0.7195 | 0.8496 | 0.0744 | 0.9566 | 0.1154 | 0.0 | 0.7636 | 0.0293 | 0.6973 | 0.0 | 0.0175 | 0.9256 | 0.2485 | 0.6449 | 0.3697 | 0.5119 | nan | 0.3441 | 0.6346 | 0.5931 | 0.0 | 0.9548 | 0.8344 | 0.9848 | 0.0120 | 0.2231 | 0.4933 | 0.0 | nan | 0.7570 | 0.8759 | 0.6561 | 0.7358 | 0.3974 | nan | 0.5874 | 0.6473 | 0.0484 | 0.8628 | 0.1012 | 0.0 | 0.4442 | 0.0293 | 0.5714 | 0.0 | 0.0173 | 0.7723 | 0.2101 | 0.5453 | 0.3139 | 0.3701 | nan | 0.2793 | 0.5088 | 0.3939 | 0.0 | 0.8855 | 0.7551 | 0.9520 | 0.0085 | 0.1230 | 0.3860 | 0.0 | | 0.0597 | 37.4 | 7480 | 0.7213 | 0.4157 | 0.4955 | 0.8759 | nan | 0.8351 | 0.9555 | 0.8265 | 0.8139 | 0.5288 | nan | 0.7252 | 0.8324 | 0.0834 | 0.9548 | 0.1119 | 0.0 | 0.6856 | 0.0357 | 0.6952 | 0.0 | 0.0188 | 0.9232 | 0.2232 | 0.6509 | 0.4116 | 0.4663 | nan | 0.3377 | 0.6356 | 0.5721 | 0.0 | 0.9563 | 0.8457 | 0.9834 | 0.0146 | 0.2253 | 0.5080 | 0.0 | nan | 0.7572 | 0.8750 | 0.6576 | 0.7290 | 0.3910 | nan | 0.5886 | 0.6493 | 0.0514 | 0.8631 | 0.0988 | 0.0 | 0.4646 | 0.0357 | 0.5736 | 0.0 | 0.0186 | 0.7752 | 0.1952 | 0.5484 | 0.3319 | 0.3610 | nan | 0.2802 | 0.5078 | 0.4058 | 0.0 | 0.8867 | 0.7619 | 0.9530 | 0.0098 | 0.1402 | 0.3920 | 0.0 | | 0.1193 | 37.5 | 7500 | 0.7207 | 0.4120 | 0.4897 | 0.8770 | nan | 0.8432 | 0.9606 | 0.8109 | 0.8180 | 0.5229 | nan | 0.7120 | 0.8275 | 0.1034 | 0.9543 | 0.1219 | 0.0 | 0.6833 | 0.0345 | 0.6969 | 0.0 | 0.0102 | 0.9327 | 0.1481 | 0.6324 | 0.3758 | 0.4912 | nan | 0.2921 | 0.6100 | 0.5838 | 0.0 | 0.9552 | 0.8455 | 0.9850 | 0.0088 | 0.2433 | 0.4661 | 0.0 | nan | 0.7599 | 0.8764 | 0.6587 | 0.7355 | 0.4006 | nan | 0.5885 | 0.6496 | 0.0616 | 0.8628 | 0.1051 | 0.0 | 0.4668 | 0.0345 | 0.5765 | 0.0 | 0.0101 | 0.7681 | 0.1350 | 0.5379 | 0.3189 | 0.3620 | nan | 0.2479 | 0.5048 | 0.3974 | 0.0 | 0.8879 | 0.7671 | 0.9516 | 0.0066 | 0.1352 | 0.3757 | 0.0 | | 0.0614 | 37.6 | 7520 | 0.6856 | 0.4164 | 
0.4986 | 0.8806 | nan | 0.8557 | 0.9552 | 0.8226 | 0.8593 | 0.5294 | nan | 0.7159 | 0.8328 | 0.1112 | 0.9576 | 0.1192 | 0.0 | 0.6645 | 0.0129 | 0.6900 | 0.0 | 0.0179 | 0.9247 | 0.1651 | 0.6610 | 0.4152 | 0.5309 | nan | 0.3175 | 0.6542 | 0.5826 | 0.0 | 0.9500 | 0.8539 | 0.9860 | 0.0117 | 0.2712 | 0.4871 | 0.0 | nan | 0.7709 | 0.8876 | 0.6646 | 0.7462 | 0.3925 | nan | 0.5944 | 0.6430 | 0.0640 | 0.8613 | 0.1034 | 0.0 | 0.4655 | 0.0129 | 0.5776 | 0.0 | 0.0176 | 0.7756 | 0.1487 | 0.5568 | 0.3333 | 0.3772 | nan | 0.2724 | 0.5143 | 0.3912 | 0.0 | 0.8902 | 0.7694 | 0.9512 | 0.0086 | 0.1547 | 0.3807 | 0.0 | | 0.0606 | 37.7 | 7540 | 0.6941 | 0.4182 | 0.5017 | 0.8805 | nan | 0.8650 | 0.9560 | 0.7933 | 0.8414 | 0.5326 | nan | 0.7177 | 0.8293 | 0.2166 | 0.9546 | 0.1315 | 0.0 | 0.6449 | 0.0166 | 0.7135 | 0.0 | 0.0131 | 0.9264 | 0.1896 | 0.6603 | 0.3691 | 0.5088 | nan | 0.3542 | 0.6202 | 0.5767 | 0.0 | 0.9502 | 0.8751 | 0.9825 | 0.0109 | 0.3413 | 0.4621 | 0.0 | nan | 0.7733 | 0.8840 | 0.6919 | 0.7561 | 0.3889 | nan | 0.5943 | 0.6378 | 0.1103 | 0.8633 | 0.1141 | 0.0 | 0.4581 | 0.0166 | 0.5770 | 0.0 | 0.0129 | 0.7770 | 0.1652 | 0.5553 | 0.3172 | 0.3632 | nan | 0.2839 | 0.5131 | 0.3938 | 0.0 | 0.8873 | 0.7691 | 0.9542 | 0.0079 | 0.1494 | 0.3677 | 0.0 | | 0.0705 | 37.8 | 7560 | 0.7229 | 0.4164 | 0.4947 | 0.8785 | nan | 0.8464 | 0.9580 | 0.8100 | 0.8481 | 0.5249 | nan | 0.7054 | 0.8062 | 0.1369 | 0.9560 | 0.1133 | 0.0 | 0.6437 | 0.0216 | 0.7081 | 0.0 | 0.0221 | 0.9333 | 0.1667 | 0.6620 | 0.3972 | 0.4877 | nan | 0.3476 | 0.6257 | 0.5574 | 0.0 | 0.9519 | 0.8326 | 0.9843 | 0.0128 | 0.2983 | 0.4736 | 0.0 | nan | 0.7628 | 0.8790 | 0.6833 | 0.7575 | 0.3933 | nan | 0.5903 | 0.6564 | 0.0770 | 0.8623 | 0.0997 | 0.0 | 0.4658 | 0.0216 | 0.5761 | 0.0 | 0.0218 | 0.7726 | 0.1472 | 0.5519 | 0.3323 | 0.3607 | nan | 0.2752 | 0.5137 | 0.3994 | 0.0 | 0.8860 | 0.7577 | 0.9535 | 0.0095 | 0.1437 | 0.3755 | 0.0 | | 0.0537 | 37.9 | 7580 | 0.7356 | 0.4172 | 0.5007 | 0.8764 | nan | 0.8459 | 0.9600 | 0.8097 | 
0.8236 | 0.5127 | nan | 0.7044 | 0.8155 | 0.2175 | 0.9523 | 0.1212 | 0.0 | 0.6535 | 0.0820 | 0.7624 | 0.0 | 0.0245 | 0.9178 | 0.2273 | 0.6465 | 0.3677 | 0.4937 | nan | 0.3210 | 0.6487 | 0.5791 | 0.0 | 0.9585 | 0.8132 | 0.9828 | 0.0122 | 0.2730 | 0.4941 | 0.0 | nan | 0.7605 | 0.8744 | 0.6860 | 0.7425 | 0.3914 | nan | 0.5853 | 0.6235 | 0.1081 | 0.8668 | 0.1056 | 0.0 | 0.4666 | 0.0820 | 0.5846 | 0.0 | 0.0237 | 0.7741 | 0.1965 | 0.5475 | 0.3153 | 0.3550 | nan | 0.2637 | 0.5131 | 0.3937 | 0.0 | 0.8822 | 0.7447 | 0.9532 | 0.0093 | 0.1260 | 0.3765 | 0.0 | | 0.0794 | 38.0 | 7600 | 0.7181 | 0.4144 | 0.4909 | 0.8764 | nan | 0.8466 | 0.9593 | 0.7987 | 0.8244 | 0.5173 | nan | 0.6945 | 0.8151 | 0.1500 | 0.9550 | 0.1158 | 0.0 | 0.6362 | 0.0640 | 0.7349 | 0.0 | 0.0074 | 0.9308 | 0.2132 | 0.6209 | 0.3703 | 0.4600 | nan | 0.3048 | 0.6224 | 0.5582 | 0.0 | 0.9569 | 0.8241 | 0.9826 | 0.0143 | 0.2317 | 0.5001 | 0.0 | nan | 0.7596 | 0.8736 | 0.6989 | 0.7417 | 0.3967 | nan | 0.5822 | 0.6178 | 0.0823 | 0.8657 | 0.1009 | 0.0 | 0.4681 | 0.0640 | 0.5913 | 0.0 | 0.0073 | 0.7699 | 0.1893 | 0.5379 | 0.3136 | 0.3458 | nan | 0.2566 | 0.5099 | 0.3935 | 0.0 | 0.8832 | 0.7484 | 0.9532 | 0.0109 | 0.1155 | 0.3826 | 0.0 | | 0.0976 | 38.1 | 7620 | 0.7275 | 0.4167 | 0.4948 | 0.8778 | nan | 0.8452 | 0.9596 | 0.7945 | 0.8375 | 0.5127 | nan | 0.7134 | 0.8046 | 0.2441 | 0.9569 | 0.1222 | 0.0 | 0.6225 | 0.0448 | 0.7151 | 0.0 | 0.0046 | 0.9291 | 0.1728 | 0.6484 | 0.3735 | 0.5232 | nan | 0.3094 | 0.6309 | 0.5690 | 0.0 | 0.9544 | 0.8394 | 0.9851 | 0.0144 | 0.2225 | 0.4840 | 0.0 | nan | 0.7599 | 0.8755 | 0.7011 | 0.7472 | 0.3964 | nan | 0.5875 | 0.6215 | 0.1186 | 0.8636 | 0.1064 | 0.0 | 0.4711 | 0.0448 | 0.5874 | 0.0 | 0.0046 | 0.7716 | 0.1564 | 0.5438 | 0.3163 | 0.3747 | nan | 0.2580 | 0.5110 | 0.3998 | 0.0 | 0.8870 | 0.7616 | 0.9522 | 0.0109 | 0.1265 | 0.3795 | 0.0 | | 0.1365 | 38.2 | 7640 | 0.7119 | 0.4121 | 0.4928 | 0.8766 | nan | 0.8474 | 0.9524 | 0.8074 | 0.8526 | 0.5495 | nan | 0.6898 | 0.8332 | 0.0948 | 
0.9565 | 0.1132 | 0.0 | 0.6853 | 0.0279 | 0.7159 | 0.0 | 0.0181 | 0.9358 | 0.1758 | 0.6504 | 0.3769 | 0.5242 | nan | 0.2627 | 0.6129 | 0.5666 | 0.0 | 0.9530 | 0.8232 | 0.9847 | 0.0102 | 0.2972 | 0.4530 | 0.0 | nan | 0.7633 | 0.8785 | 0.6964 | 0.7575 | 0.3707 | nan | 0.5854 | 0.6259 | 0.0489 | 0.8638 | 0.0986 | 0.0 | 0.4741 | 0.0279 | 0.5917 | 0.0 | 0.0180 | 0.7694 | 0.1586 | 0.5437 | 0.3192 | 0.3748 | nan | 0.2147 | 0.5088 | 0.3942 | 0.0 | 0.8850 | 0.7495 | 0.9525 | 0.0080 | 0.1414 | 0.3649 | 0.0 | | 0.083 | 38.3 | 7660 | 0.6949 | 0.4127 | 0.4875 | 0.8774 | nan | 0.8514 | 0.9562 | 0.7999 | 0.8520 | 0.5273 | nan | 0.6936 | 0.8133 | 0.0601 | 0.9558 | 0.1203 | 0.0 | 0.6775 | 0.0249 | 0.6821 | 0.0 | 0.0110 | 0.9317 | 0.2019 | 0.6379 | 0.3723 | 0.4744 | nan | 0.2766 | 0.6250 | 0.5635 | 0.0 | 0.9551 | 0.8210 | 0.9834 | 0.0084 | 0.2563 | 0.4670 | 0.0 | nan | 0.7648 | 0.8777 | 0.7051 | 0.7562 | 0.3775 | nan | 0.5838 | 0.6511 | 0.0339 | 0.8629 | 0.1044 | 0.0 | 0.4645 | 0.0249 | 0.5882 | 0.0 | 0.0109 | 0.7678 | 0.1786 | 0.5393 | 0.3193 | 0.3608 | nan | 0.2288 | 0.5063 | 0.3969 | 0.0 | 0.8832 | 0.7467 | 0.9530 | 0.0070 | 0.1366 | 0.3761 | 0.0 | | 0.0565 | 38.4 | 7680 | 0.7109 | 0.4145 | 0.4937 | 0.8779 | nan | 0.8473 | 0.9554 | 0.7978 | 0.8475 | 0.5370 | nan | 0.7277 | 0.8299 | 0.0863 | 0.9572 | 0.1351 | 0.0 | 0.7158 | 0.0126 | 0.6839 | 0.0 | 0.0220 | 0.9303 | 0.1862 | 0.6312 | 0.3700 | 0.4975 | nan | 0.3249 | 0.6206 | 0.5891 | 0.0 | 0.9540 | 0.8441 | 0.9846 | 0.0100 | 0.2242 | 0.4755 | 0.0 | nan | 0.7633 | 0.8766 | 0.7151 | 0.7567 | 0.3933 | nan | 0.5880 | 0.6438 | 0.0540 | 0.8617 | 0.1155 | 0.0 | 0.4699 | 0.0126 | 0.5720 | 0.0 | 0.0219 | 0.7696 | 0.1657 | 0.5374 | 0.3103 | 0.3687 | nan | 0.2571 | 0.5025 | 0.3951 | 0.0 | 0.8852 | 0.7603 | 0.9523 | 0.0079 | 0.1289 | 0.3780 | 0.0 | | 0.0839 | 38.5 | 7700 | 0.7159 | 0.4131 | 0.4910 | 0.8775 | nan | 0.8444 | 0.9584 | 0.8049 | 0.8463 | 0.5358 | nan | 0.7017 | 0.8419 | 0.0834 | 0.9532 | 0.1406 | 0.0 | 0.7470 | 0.0308 | 0.6798 | 
0.0014 | 0.0332 | 0.9384 | 0.1592 | 0.6145 | 0.3729 | 0.5277 | nan | 0.2538 | 0.6062 | 0.5783 | 0.0 | 0.9537 | 0.8386 | 0.9830 | 0.0098 | 0.2066 | 0.4680 | 0.0 | nan | 0.7614 | 0.8756 | 0.7064 | 0.7559 | 0.3953 | nan | 0.5875 | 0.6396 | 0.0518 | 0.8645 | 0.1198 | 0.0 | 0.4784 | 0.0308 | 0.5764 | 0.0014 | 0.0330 | 0.7649 | 0.1447 | 0.5283 | 0.3164 | 0.3755 | nan | 0.2125 | 0.5022 | 0.3936 | 0.0 | 0.8854 | 0.7581 | 0.9531 | 0.0079 | 0.1206 | 0.3774 | 0.0 | | 0.0716 | 38.6 | 7720 | 0.7162 | 0.4187 | 0.5017 | 0.8788 | nan | 0.8526 | 0.9551 | 0.8050 | 0.8509 | 0.5184 | nan | 0.7209 | 0.8335 | 0.1556 | 0.9543 | 0.1366 | 0.0 | 0.7470 | 0.0510 | 0.7137 | 0.0 | 0.0399 | 0.9287 | 0.1901 | 0.6347 | 0.3853 | 0.5274 | nan | 0.3012 | 0.6383 | 0.5748 | 0.0 | 0.9536 | 0.8538 | 0.9817 | 0.0128 | 0.2525 | 0.4866 | 0.0 | nan | 0.7630 | 0.8781 | 0.7102 | 0.7565 | 0.3939 | nan | 0.5860 | 0.6298 | 0.0866 | 0.8660 | 0.1174 | 0.0 | 0.4762 | 0.0510 | 0.5783 | 0.0 | 0.0391 | 0.7713 | 0.1708 | 0.5383 | 0.3233 | 0.3755 | nan | 0.2561 | 0.5111 | 0.3920 | 0.0 | 0.8877 | 0.7657 | 0.9540 | 0.0100 | 0.1305 | 0.3816 | 0.0 | | 0.0676 | 38.7 | 7740 | 0.7302 | 0.4197 | 0.5022 | 0.8779 | nan | 0.8427 | 0.9615 | 0.8072 | 0.8324 | 0.5191 | nan | 0.7091 | 0.8190 | 0.1884 | 0.9549 | 0.1308 | 0.0 | 0.7349 | 0.0611 | 0.7321 | 0.0 | 0.0403 | 0.9298 | 0.2212 | 0.6389 | 0.3802 | 0.5425 | nan | 0.3131 | 0.6210 | 0.5711 | 0.0 | 0.9553 | 0.8361 | 0.9837 | 0.0132 | 0.2581 | 0.4737 | 0.0 | nan | 0.7592 | 0.8737 | 0.7083 | 0.7446 | 0.4115 | nan | 0.5851 | 0.6418 | 0.1032 | 0.8648 | 0.1133 | 0.0 | 0.4610 | 0.0611 | 0.5807 | 0.0 | 0.0396 | 0.7703 | 0.1939 | 0.5360 | 0.3207 | 0.3842 | nan | 0.2520 | 0.5080 | 0.3975 | 0.0 | 0.8857 | 0.7559 | 0.9533 | 0.0114 | 0.1337 | 0.3807 | 0.0 | | 0.0372 | 38.8 | 7760 | 0.7275 | 0.4183 | 0.4979 | 0.8781 | nan | 0.8438 | 0.9619 | 0.8070 | 0.8256 | 0.5068 | nan | 0.7279 | 0.8360 | 0.1696 | 0.9537 | 0.1189 | 0.0 | 0.7087 | 0.0612 | 0.7082 | 0.0 | 0.0407 | 0.9307 | 0.2306 | 0.6416 | 
0.3363 | 0.4796 | nan | 0.3186 | 0.6241 | 0.5594 | 0.0 | 0.9526 | 0.8501 | 0.9834 | 0.0108 | 0.2527 | 0.4908 | 0.0 | nan | 0.7588 | 0.8734 | 0.7065 | 0.7448 | 0.4069 | nan | 0.5842 | 0.6397 | 0.0891 | 0.8659 | 0.1040 | 0.0 | 0.4616 | 0.0612 | 0.5898 | 0.0 | 0.0400 | 0.7725 | 0.2028 | 0.5434 | 0.2970 | 0.3622 | nan | 0.2571 | 0.5070 | 0.3964 | 0.0 | 0.8866 | 0.7624 | 0.9531 | 0.0095 | 0.1274 | 0.3815 | 0.0 | | 0.1237 | 38.9 | 7780 | 0.7390 | 0.4173 | 0.4961 | 0.8774 | nan | 0.8399 | 0.9606 | 0.8099 | 0.8417 | 0.5393 | nan | 0.7047 | 0.8089 | 0.1372 | 0.9554 | 0.1194 | 0.0 | 0.7419 | 0.0579 | 0.7088 | 0.0 | 0.0401 | 0.9338 | 0.2093 | 0.6252 | 0.3535 | 0.5579 | nan | 0.2733 | 0.6027 | 0.5453 | 0.0 | 0.9556 | 0.8218 | 0.9823 | 0.0119 | 0.2484 | 0.4868 | 0.0 | nan | 0.7578 | 0.8762 | 0.6966 | 0.7555 | 0.4077 | nan | 0.5903 | 0.6490 | 0.0734 | 0.8654 | 0.1039 | 0.0 | 0.4925 | 0.0579 | 0.5851 | 0.0 | 0.0395 | 0.7697 | 0.1866 | 0.5410 | 0.3043 | 0.3889 | nan | 0.2289 | 0.5039 | 0.3952 | 0.0 | 0.8828 | 0.7474 | 0.9537 | 0.0093 | 0.1161 | 0.3756 | 0.0 | | 0.0878 | 39.0 | 7800 | 0.7379 | 0.4135 | 0.4932 | 0.8779 | nan | 0.8465 | 0.9608 | 0.8103 | 0.8408 | 0.5221 | nan | 0.7165 | 0.8213 | 0.1609 | 0.9555 | 0.1149 | 0.0 | 0.6741 | 0.0498 | 0.7193 | 0.0 | 0.0385 | 0.9362 | 0.1633 | 0.6127 | 0.3813 | 0.5541 | nan | 0.2256 | 0.6193 | 0.5516 | 0.0 | 0.9528 | 0.8320 | 0.9839 | 0.0110 | 0.2697 | 0.4585 | 0.0 | nan | 0.7617 | 0.8781 | 0.6954 | 0.7586 | 0.4065 | nan | 0.5940 | 0.6398 | 0.0858 | 0.8650 | 0.0996 | 0.0 | 0.4466 | 0.0498 | 0.5844 | 0.0 | 0.0380 | 0.7653 | 0.1480 | 0.5314 | 0.3164 | 0.3876 | nan | 0.1939 | 0.5069 | 0.3963 | 0.0 | 0.8841 | 0.7475 | 0.9528 | 0.0084 | 0.1212 | 0.3684 | 0.0 | | 0.062 | 39.1 | 7820 | 0.7108 | 0.4157 | 0.4923 | 0.8784 | nan | 0.8438 | 0.9597 | 0.7949 | 0.8504 | 0.5365 | nan | 0.7155 | 0.8291 | 0.0994 | 0.9575 | 0.1482 | 0.0 | 0.6562 | 0.0225 | 0.6960 | 0.0 | 0.0527 | 0.9367 | 0.1711 | 0.6192 | 0.3725 | 0.5414 | nan | 0.2670 | 0.6328 | 0.5576 | 
0.0 | 0.9509 | 0.8439 | 0.9822 | 0.0065 | 0.2352 | 0.4735 | 0.0 | nan | 0.7621 | 0.8780 | 0.7092 | 0.7552 | 0.4032 | nan | 0.5936 | 0.6374 | 0.0592 | 0.8633 | 0.1272 | 0.0 | 0.4597 | 0.0225 | 0.5862 | 0.0 | 0.0517 | 0.7645 | 0.1543 | 0.5282 | 0.3163 | 0.3868 | nan | 0.2224 | 0.5101 | 0.4008 | 0.0 | 0.8880 | 0.7637 | 0.9543 | 0.0051 | 0.1229 | 0.3766 | 0.0 | | 0.1004 | 39.2 | 7840 | 0.7033 | 0.4185 | 0.4942 | 0.8791 | nan | 0.8559 | 0.9565 | 0.7930 | 0.8491 | 0.5355 | nan | 0.7101 | 0.8301 | 0.1027 | 0.9562 | 0.1725 | 0.0 | 0.6637 | 0.0242 | 0.7070 | 0.0 | 0.0722 | 0.9286 | 0.1791 | 0.6424 | 0.3857 | 0.5214 | nan | 0.2739 | 0.6100 | 0.5467 | 0.0 | 0.9586 | 0.8115 | 0.9831 | 0.0036 | 0.2308 | 0.5100 | 0.0 | nan | 0.7675 | 0.8797 | 0.7109 | 0.7577 | 0.3964 | nan | 0.5891 | 0.6448 | 0.0609 | 0.8643 | 0.1490 | 0.0 | 0.4624 | 0.0242 | 0.5880 | 0.0 | 0.0703 | 0.7721 | 0.1616 | 0.5424 | 0.3222 | 0.3832 | nan | 0.2333 | 0.5032 | 0.3985 | 0.0 | 0.8826 | 0.7440 | 0.9536 | 0.0029 | 0.1356 | 0.3931 | 0.0 | | 0.0602 | 39.3 | 7860 | 0.7216 | 0.4169 | 0.4948 | 0.8780 | nan | 0.8340 | 0.9595 | 0.8077 | 0.8507 | 0.5321 | nan | 0.7220 | 0.8371 | 0.1258 | 0.9539 | 0.1473 | 0.0 | 0.6460 | 0.0272 | 0.7104 | 0.0 | 0.0401 | 0.9283 | 0.2397 | 0.6438 | 0.3579 | 0.4705 | nan | 0.2827 | 0.6228 | 0.5608 | 0.0 | 0.9569 | 0.8349 | 0.9816 | 0.0061 | 0.2532 | 0.4996 | 0.0 | nan | 0.7584 | 0.8766 | 0.6849 | 0.7489 | 0.3992 | nan | 0.5886 | 0.6289 | 0.0706 | 0.8663 | 0.1302 | 0.0 | 0.4607 | 0.0272 | 0.5851 | 0.0 | 0.0398 | 0.7749 | 0.2118 | 0.5500 | 0.3089 | 0.3623 | nan | 0.2385 | 0.5067 | 0.3993 | 0.0 | 0.8862 | 0.7622 | 0.9543 | 0.0048 | 0.1298 | 0.3845 | 0.0 | | 0.0722 | 39.4 | 7880 | 0.7279 | 0.4158 | 0.4944 | 0.8793 | nan | 0.8447 | 0.9610 | 0.8107 | 0.8430 | 0.5297 | nan | 0.7024 | 0.8319 | 0.1052 | 0.9593 | 0.1157 | 0.0 | 0.6465 | 0.0196 | 0.7043 | 0.0 | 0.0278 | 0.9274 | 0.2287 | 0.6582 | 0.3815 | 0.4803 | nan | 0.2805 | 0.6389 | 0.5961 | 0.0 | 0.9555 | 0.8343 | 0.9837 | 0.0073 | 0.2698 | 
0.4762 | 0.0 | nan | 0.7635 | 0.8777 | 0.6774 | 0.7572 | 0.4057 | nan | 0.5842 | 0.6384 | 0.0617 | 0.8630 | 0.1021 | 0.0 | 0.4500 | 0.0196 | 0.5784 | 0.0 | 0.0277 | 0.7755 | 0.2023 | 0.5536 | 0.3216 | 0.3683 | nan | 0.2345 | 0.5113 | 0.4037 | 0.0 | 0.8876 | 0.7632 | 0.9530 | 0.0058 | 0.1393 | 0.3805 | 0.0 | | 0.0719 | 39.5 | 7900 | 0.7208 | 0.4149 | 0.4961 | 0.8786 | nan | 0.8398 | 0.9592 | 0.8124 | 0.8508 | 0.5282 | nan | 0.7058 | 0.8316 | 0.1139 | 0.9600 | 0.1281 | 0.0 | 0.6649 | 0.0164 | 0.7065 | 0.0 | 0.0230 | 0.9300 | 0.1827 | 0.6428 | 0.4054 | 0.5390 | nan | 0.2653 | 0.6270 | 0.5696 | 0.0 | 0.9535 | 0.8449 | 0.9823 | 0.0082 | 0.2985 | 0.4852 | 0.0 | nan | 0.7593 | 0.8784 | 0.6730 | 0.7511 | 0.4080 | nan | 0.5888 | 0.6418 | 0.0683 | 0.8619 | 0.1105 | 0.0 | 0.4541 | 0.0164 | 0.5749 | 0.0 | 0.0230 | 0.7736 | 0.1657 | 0.5479 | 0.3315 | 0.3771 | nan | 0.2269 | 0.5095 | 0.4005 | 0.0 | 0.8888 | 0.7629 | 0.9537 | 0.0065 | 0.1402 | 0.3813 | 0.0 | | 0.0846 | 39.6 | 7920 | 0.6984 | 0.4183 | 0.4961 | 0.8808 | nan | 0.8589 | 0.9569 | 0.8112 | 0.8473 | 0.5346 | nan | 0.7155 | 0.8169 | 0.0974 | 0.9561 | 0.1416 | 0.0 | 0.6767 | 0.0101 | 0.7005 | 0.0 | 0.0254 | 0.9290 | 0.1983 | 0.6502 | 0.3968 | 0.5309 | nan | 0.2897 | 0.6273 | 0.5516 | 0.0 | 0.9538 | 0.8456 | 0.9819 | 0.0081 | 0.2572 | 0.5060 | 0.0 | nan | 0.7752 | 0.8828 | 0.6751 | 0.7567 | 0.4072 | nan | 0.5912 | 0.6550 | 0.0613 | 0.8624 | 0.1209 | 0.0 | 0.4808 | 0.0101 | 0.5745 | 0.0 | 0.0253 | 0.7752 | 0.1782 | 0.5485 | 0.3296 | 0.3791 | nan | 0.2458 | 0.5081 | 0.4058 | 0.0 | 0.8873 | 0.7594 | 0.9539 | 0.0062 | 0.1392 | 0.3899 | 0.0 | | 0.0809 | 39.7 | 7940 | 0.7169 | 0.4148 | 0.4923 | 0.8793 | nan | 0.8536 | 0.9586 | 0.8105 | 0.8475 | 0.5322 | nan | 0.6982 | 0.8256 | 0.0994 | 0.9558 | 0.1317 | 0.0 | 0.6672 | 0.0141 | 0.7055 | 0.0 | 0.0191 | 0.9298 | 0.1881 | 0.6427 | 0.3488 | 0.5112 | nan | 0.2933 | 0.6179 | 0.5871 | 0.0 | 0.9564 | 0.8229 | 0.9850 | 0.0119 | 0.2337 | 0.5052 | 0.0 | nan | 0.7701 | 0.8802 | 0.6757 | 
0.7586 | 0.4029 | nan | 0.5886 | 0.6480 | 0.0595 | 0.8637 | 0.1131 | 0.0 | 0.4732 | 0.0141 | 0.5779 | 0.0 | 0.0190 | 0.7734 | 0.1697 | 0.5488 | 0.3030 | 0.3724 | nan | 0.2432 | 0.5065 | 0.4009 | 0.0 | 0.8848 | 0.7553 | 0.9522 | 0.0083 | 0.1235 | 0.3883 | 0.0 | | 0.0945 | 39.8 | 7960 | 0.7017 | 0.4152 | 0.4958 | 0.8797 | nan | 0.8576 | 0.9532 | 0.8158 | 0.8498 | 0.5588 | nan | 0.7138 | 0.8261 | 0.1482 | 0.9577 | 0.1459 | 0.0 | 0.6522 | 0.0102 | 0.7086 | 0.0 | 0.0185 | 0.9304 | 0.1772 | 0.6265 | 0.3742 | 0.5060 | nan | 0.2929 | 0.6277 | 0.5883 | 0.0 | 0.9563 | 0.8431 | 0.9832 | 0.0089 | 0.2584 | 0.4771 | 0.0 | nan | 0.7751 | 0.8855 | 0.6741 | 0.7566 | 0.3922 | nan | 0.5920 | 0.6459 | 0.0836 | 0.8614 | 0.1255 | 0.0 | 0.4616 | 0.0102 | 0.5758 | 0.0 | 0.0184 | 0.7704 | 0.1604 | 0.5422 | 0.3159 | 0.3706 | nan | 0.2470 | 0.5085 | 0.3991 | 0.0 | 0.8875 | 0.7653 | 0.9526 | 0.0060 | 0.1240 | 0.3785 | 0.0 | | 0.0709 | 39.9 | 7980 | 0.6824 | 0.4177 | 0.4990 | 0.8802 | nan | 0.8637 | 0.9545 | 0.8064 | 0.8450 | 0.5502 | nan | 0.7172 | 0.8241 | 0.1955 | 0.9567 | 0.1354 | 0.0 | 0.6952 | 0.0083 | 0.7013 | 0.0 | 0.0200 | 0.9256 | 0.1924 | 0.6498 | 0.3677 | 0.5193 | nan | 0.3254 | 0.6198 | 0.5777 | 0.0 | 0.9556 | 0.8344 | 0.9831 | 0.0078 | 0.2394 | 0.4968 | 0.0 | nan | 0.7781 | 0.8862 | 0.7001 | 0.7581 | 0.3762 | nan | 0.5914 | 0.6247 | 0.1057 | 0.8610 | 0.1174 | 0.0 | 0.4881 | 0.0083 | 0.5716 | 0.0 | 0.0199 | 0.7752 | 0.1735 | 0.5509 | 0.3146 | 0.3802 | nan | 0.2705 | 0.5070 | 0.3950 | 0.0 | 0.8869 | 0.7625 | 0.9528 | 0.0055 | 0.1265 | 0.3797 | 0.0 | | 0.0547 | 40.0 | 8000 | 0.7111 | 0.4160 | 0.4967 | 0.8793 | nan | 0.8592 | 0.9570 | 0.7988 | 0.8468 | 0.5238 | nan | 0.7063 | 0.8328 | 0.1482 | 0.9541 | 0.1329 | 0.0 | 0.6587 | 0.0168 | 0.7095 | 0.0 | 0.0289 | 0.9318 | 0.2070 | 0.6181 | 0.3898 | 0.5362 | nan | 0.3203 | 0.6264 | 0.5822 | 0.0 | 0.9551 | 0.8464 | 0.9843 | 0.0083 | 0.2529 | 0.4631 | 0.0 | nan | 0.7693 | 0.8781 | 0.7153 | 0.7588 | 0.3972 | nan | 0.5880 | 0.6141 | 0.0841 | 
0.8626 | 0.1162 | 0.0 | 0.4551 | 0.0168 | 0.5758 | 0.0 | 0.0288 | 0.7695 | 0.1828 | 0.5380 | 0.3252 | 0.3743 | nan | 0.2568 | 0.5091 | 0.3966 | 0.0 | 0.8883 | 0.7678 | 0.9528 | 0.0061 | 0.1174 | 0.3673 | 0.0 | | 0.0858 | 40.1 | 8020 | 0.7221 | 0.4135 | 0.4925 | 0.8782 | nan | 0.8552 | 0.9559 | 0.7976 | 0.8474 | 0.5309 | nan | 0.7098 | 0.8381 | 0.1045 | 0.9552 | 0.1035 | 0.0 | 0.6336 | 0.0115 | 0.7002 | 0.0 | 0.0388 | 0.9335 | 0.2127 | 0.5985 | 0.3998 | 0.5091 | nan | 0.3033 | 0.6293 | 0.5819 | 0.0 | 0.9555 | 0.8360 | 0.9838 | 0.0103 | 0.2591 | 0.4654 | 0.0 | nan | 0.7694 | 0.8781 | 0.7211 | 0.7589 | 0.3944 | nan | 0.5895 | 0.6134 | 0.0608 | 0.8625 | 0.0920 | 0.0 | 0.4409 | 0.0115 | 0.5837 | 0.0 | 0.0385 | 0.7643 | 0.1876 | 0.5244 | 0.3306 | 0.3740 | nan | 0.2456 | 0.5079 | 0.3983 | 0.0 | 0.8871 | 0.7570 | 0.9532 | 0.0073 | 0.1129 | 0.3659 | 0.0 | | 0.0695 | 40.2 | 8040 | 0.7099 | 0.4159 | 0.4970 | 0.8800 | nan | 0.8637 | 0.9564 | 0.7999 | 0.8444 | 0.5332 | nan | 0.7073 | 0.8490 | 0.1056 | 0.9560 | 0.1068 | 0.0 | 0.6443 | 0.0205 | 0.7023 | 0.0 | 0.0346 | 0.9285 | 0.2501 | 0.6405 | 0.3662 | 0.5070 | nan | 0.3411 | 0.6337 | 0.5808 | 0.0 | 0.9531 | 0.8505 | 0.9833 | 0.0080 | 0.2765 | 0.4612 | 0.0 | nan | 0.7759 | 0.8817 | 0.7130 | 0.7633 | 0.3899 | nan | 0.5911 | 0.6022 | 0.0613 | 0.8624 | 0.0944 | 0.0 | 0.4481 | 0.0205 | 0.5844 | 0.0 | 0.0342 | 0.7705 | 0.2157 | 0.5417 | 0.3178 | 0.3717 | nan | 0.2610 | 0.5101 | 0.3980 | 0.0 | 0.8885 | 0.7637 | 0.9531 | 0.0057 | 0.1230 | 0.3653 | 0.0 | | 0.0796 | 40.3 | 8060 | 0.7107 | 0.4157 | 0.4954 | 0.8796 | nan | 0.8609 | 0.9568 | 0.7905 | 0.8493 | 0.5125 | nan | 0.7184 | 0.8283 | 0.1212 | 0.9543 | 0.1012 | 0.0 | 0.6666 | 0.0362 | 0.7047 | 0.0 | 0.0299 | 0.9276 | 0.1844 | 0.6379 | 0.3452 | 0.5341 | nan | 0.3662 | 0.6248 | 0.5844 | 0.0 | 0.9572 | 0.8399 | 0.9828 | 0.0140 | 0.2255 | 0.4967 | 0.0 | nan | 0.7721 | 0.8809 | 0.7172 | 0.7599 | 0.3866 | nan | 0.5904 | 0.6015 | 0.0668 | 0.8629 | 0.0898 | 0.0 | 0.4694 | 0.0362 | 0.5856 | 
0.0 | 0.0297 | 0.7715 | 0.1673 | 0.5417 | 0.3024 | 0.3677 | nan | 0.2845 | 0.5088 | 0.3984 | 0.0 | 0.8865 | 0.7622 | 0.9529 | 0.0097 | 0.1214 | 0.3776 | 0.0 | | 0.0724 | 40.4 | 8080 | 0.7106 | 0.4169 | 0.5013 | 0.8799 | nan | 0.8624 | 0.9575 | 0.7998 | 0.8468 | 0.5267 | nan | 0.7205 | 0.8300 | 0.1562 | 0.9553 | 0.1078 | 0.0 | 0.6866 | 0.0474 | 0.7193 | 0.0 | 0.0365 | 0.9303 | 0.1972 | 0.6227 | 0.3779 | 0.5839 | nan | 0.3352 | 0.6225 | 0.5880 | 0.0 | 0.9522 | 0.8457 | 0.9839 | 0.0096 | 0.2625 | 0.4761 | 0.0 | nan | 0.7742 | 0.8807 | 0.7120 | 0.7606 | 0.3870 | nan | 0.5935 | 0.6092 | 0.0852 | 0.8635 | 0.0951 | 0.0 | 0.4542 | 0.0474 | 0.5808 | 0.0 | 0.0362 | 0.7702 | 0.1771 | 0.5395 | 0.3223 | 0.3795 | nan | 0.2676 | 0.5092 | 0.3938 | 0.0 | 0.8893 | 0.7672 | 0.9525 | 0.0073 | 0.1162 | 0.3706 | 0.0 | | 0.0861 | 40.5 | 8100 | 0.7126 | 0.4154 | 0.4968 | 0.8788 | nan | 0.8509 | 0.9591 | 0.8039 | 0.8473 | 0.5250 | nan | 0.7123 | 0.8259 | 0.1234 | 0.9525 | 0.1220 | 0.0 | 0.6810 | 0.0406 | 0.7185 | 0.0 | 0.0481 | 0.9290 | 0.2089 | 0.6287 | 0.3642 | 0.5502 | nan | 0.2832 | 0.6238 | 0.5661 | 0.0 | 0.9556 | 0.8417 | 0.9821 | 0.0075 | 0.2778 | 0.4699 | 0.0 | nan | 0.7683 | 0.8775 | 0.7027 | 0.7596 | 0.3931 | nan | 0.5943 | 0.6091 | 0.0731 | 0.8649 | 0.1067 | 0.0 | 0.4585 | 0.0406 | 0.5824 | 0.0 | 0.0475 | 0.7699 | 0.1858 | 0.5358 | 0.3135 | 0.3763 | nan | 0.2335 | 0.5090 | 0.3943 | 0.0 | 0.8875 | 0.7648 | 0.9533 | 0.0059 | 0.1189 | 0.3664 | 0.0 | | 0.0864 | 40.6 | 8120 | 0.7126 | 0.4158 | 0.4996 | 0.8781 | nan | 0.8529 | 0.9577 | 0.7999 | 0.8500 | 0.5235 | nan | 0.7092 | 0.8165 | 0.1371 | 0.9546 | 0.1320 | 0.0 | 0.7268 | 0.0451 | 0.7072 | 0.0 | 0.0315 | 0.9346 | 0.1732 | 0.6225 | 0.3816 | 0.6257 | nan | 0.2730 | 0.6131 | 0.5802 | 0.0 | 0.9530 | 0.8303 | 0.9835 | 0.0093 | 0.2946 | 0.4686 | 0.0 | nan | 0.7672 | 0.8786 | 0.7101 | 0.7576 | 0.3883 | nan | 0.5942 | 0.6285 | 0.0779 | 0.8631 | 0.1131 | 0.0 | 0.4647 | 0.0451 | 0.5853 | 0.0 | 0.0312 | 0.7671 | 0.1563 | 0.5319 | 0.3204 | 
0.4060 | nan | 0.2306 | 0.5065 | 0.3911 | 0.0 | 0.8866 | 0.7560 | 0.9531 | 0.0071 | 0.1212 | 0.3685 | 0.0 | | 0.0489 | 40.7 | 8140 | 0.7001 | 0.4168 | 0.4990 | 0.8786 | nan | 0.8581 | 0.9576 | 0.7935 | 0.8392 | 0.5069 | nan | 0.7259 | 0.7895 | 0.2352 | 0.9557 | 0.1464 | 0.0 | 0.7005 | 0.0433 | 0.6987 | 0.0 | 0.0159 | 0.9282 | 0.1917 | 0.6173 | 0.3791 | 0.5618 | nan | 0.2642 | 0.6416 | 0.5948 | 0.0 | 0.9565 | 0.8391 | 0.9837 | 0.0108 | 0.2517 | 0.4799 | 0.0 | nan | 0.7651 | 0.8791 | 0.7217 | 0.7586 | 0.3793 | nan | 0.5944 | 0.6412 | 0.0998 | 0.8614 | 0.1255 | 0.0 | 0.4618 | 0.0433 | 0.5817 | 0.0 | 0.0158 | 0.7689 | 0.1718 | 0.5362 | 0.3195 | 0.3800 | nan | 0.2271 | 0.5068 | 0.3928 | 0.0 | 0.8861 | 0.7587 | 0.9531 | 0.0083 | 0.1205 | 0.3787 | 0.0 | | 0.0885 | 40.8 | 8160 | 0.7055 | 0.4151 | 0.4967 | 0.8789 | nan | 0.8546 | 0.9578 | 0.8063 | 0.8368 | 0.5227 | nan | 0.7276 | 0.8306 | 0.1052 | 0.9568 | 0.1536 | 0.0 | 0.7151 | 0.0256 | 0.6851 | 0.0 | 0.0230 | 0.9323 | 0.1821 | 0.6141 | 0.3939 | 0.5730 | nan | 0.2549 | 0.6408 | 0.5897 | 0.0 | 0.9506 | 0.8533 | 0.9838 | 0.0067 | 0.2379 | 0.4792 | 0.0 | nan | 0.7664 | 0.8800 | 0.7130 | 0.7574 | 0.3786 | nan | 0.5950 | 0.6417 | 0.0597 | 0.8608 | 0.1306 | 0.0 | 0.4675 | 0.0256 | 0.5791 | 0.0 | 0.0230 | 0.7671 | 0.1637 | 0.5323 | 0.3301 | 0.3915 | nan | 0.2119 | 0.5075 | 0.3876 | 0.0 | 0.8901 | 0.7687 | 0.9532 | 0.0054 | 0.1181 | 0.3782 | 0.0 | | 0.0666 | 40.9 | 8180 | 0.6983 | 0.4148 | 0.4943 | 0.8804 | nan | 0.8615 | 0.9583 | 0.8135 | 0.8462 | 0.5189 | nan | 0.7181 | 0.8330 | 0.0987 | 0.9506 | 0.1547 | 0.0 | 0.6725 | 0.0495 | 0.7100 | 0.0 | 0.0287 | 0.9267 | 0.1776 | 0.6343 | 0.3780 | 0.5432 | nan | 0.2443 | 0.6137 | 0.5769 | 0.0 | 0.9584 | 0.8384 | 0.9828 | 0.0066 | 0.2333 | 0.4880 | 0.0 | nan | 0.7760 | 0.8835 | 0.6975 | 0.7577 | 0.3847 | nan | 0.5952 | 0.6365 | 0.0593 | 0.8659 | 0.1333 | 0.0 | 0.4577 | 0.0495 | 0.5792 | 0.0 | 0.0286 | 0.7722 | 0.1602 | 0.5420 | 0.3217 | 0.3705 | nan | 0.2078 | 0.5065 | 0.3912 | 0.0 | 
0.8859 | 0.7595 | 0.9537 | 0.0053 | 0.1169 | 0.3758 | 0.0 | | 0.0927 | 41.0 | 8200 | 0.7110 | 0.4152 | 0.4938 | 0.8796 | nan | 0.8595 | 0.9572 | 0.8074 | 0.8472 | 0.5385 | nan | 0.7135 | 0.8222 | 0.0817 | 0.9568 | 0.1376 | 0.0 | 0.6614 | 0.0422 | 0.7135 | 0.0 | 0.0282 | 0.9262 | 0.1968 | 0.6167 | 0.3808 | 0.5397 | nan | 0.2615 | 0.6287 | 0.5838 | 0.0 | 0.9574 | 0.8236 | 0.9837 | 0.0121 | 0.2220 | 0.5007 | 0.0 | nan | 0.7749 | 0.8832 | 0.7024 | 0.7589 | 0.3841 | nan | 0.5957 | 0.6435 | 0.0512 | 0.8635 | 0.1196 | 0.0 | 0.4658 | 0.0422 | 0.5789 | 0.0 | 0.0281 | 0.7691 | 0.1761 | 0.5324 | 0.3221 | 0.3748 | nan | 0.2213 | 0.5070 | 0.3918 | 0.0 | 0.8855 | 0.7553 | 0.9531 | 0.0094 | 0.1149 | 0.3822 | 0.0 | | 0.0784 | 41.1 | 8220 | 0.7319 | 0.4160 | 0.4940 | 0.8790 | nan | 0.8561 | 0.9583 | 0.8089 | 0.8384 | 0.5278 | nan | 0.7092 | 0.8246 | 0.0814 | 0.9545 | 0.1391 | 0.0 | 0.6316 | 0.0590 | 0.7229 | 0.0 | 0.0238 | 0.9329 | 0.2112 | 0.6189 | 0.3859 | 0.5190 | nan | 0.2873 | 0.6168 | 0.5891 | 0.0 | 0.9543 | 0.8355 | 0.9833 | 0.0165 | 0.2376 | 0.4825 | 0.0 | nan | 0.7691 | 0.8808 | 0.6975 | 0.7617 | 0.3839 | nan | 0.5908 | 0.6494 | 0.0511 | 0.8644 | 0.1202 | 0.0 | 0.4527 | 0.0590 | 0.5800 | 0.0 | 0.0237 | 0.7681 | 0.1869 | 0.5344 | 0.3263 | 0.3720 | nan | 0.2333 | 0.5073 | 0.3929 | 0.0 | 0.8865 | 0.7570 | 0.9537 | 0.0123 | 0.1167 | 0.3791 | 0.0 | | 0.0667 | 41.2 | 8240 | 0.7222 | 0.4126 | 0.4903 | 0.8783 | nan | 0.8543 | 0.9550 | 0.8106 | 0.8437 | 0.5405 | nan | 0.7182 | 0.8463 | 0.0717 | 0.9550 | 0.1303 | 0.0 | 0.6477 | 0.0313 | 0.7111 | 0.0 | 0.0306 | 0.9294 | 0.1755 | 0.6283 | 0.3626 | 0.4902 | nan | 0.2610 | 0.6342 | 0.5772 | 0.0 | 0.9563 | 0.8257 | 0.9828 | 0.0121 | 0.2199 | 0.4890 | 0.0 | nan | 0.7670 | 0.8804 | 0.6984 | 0.7617 | 0.3820 | nan | 0.5916 | 0.6406 | 0.0455 | 0.8624 | 0.1137 | 0.0 | 0.4502 | 0.0313 | 0.5807 | 0.0 | 0.0305 | 0.7701 | 0.1591 | 0.5380 | 0.3146 | 0.3684 | nan | 0.2147 | 0.5078 | 0.3928 | 0.0 | 0.8845 | 0.7513 | 0.9536 | 0.0095 | 0.1195 | 0.3833 
| 0.0 | | 0.0719 | 41.3 | 8260 | 0.7150 | 0.4129 | 0.4889 | 0.8786 | nan | 0.8482 | 0.9565 | 0.8046 | 0.8534 | 0.5374 | nan | 0.7099 | 0.8436 | 0.0664 | 0.9558 | 0.1160 | 0.0 | 0.6357 | 0.0288 | 0.7091 | 0.0 | 0.0459 | 0.9284 | 0.1771 | 0.6290 | 0.3729 | 0.4730 | nan | 0.2386 | 0.6250 | 0.5816 | 0.0 | 0.9566 | 0.8394 | 0.9835 | 0.0035 | 0.2339 | 0.4895 | 0.0 | nan | 0.7638 | 0.8790 | 0.7074 | 0.7540 | 0.3897 | nan | 0.5946 | 0.6470 | 0.0429 | 0.8617 | 0.1015 | 0.0 | 0.4443 | 0.0288 | 0.5846 | 0.0 | 0.0458 | 0.7694 | 0.1605 | 0.5365 | 0.3197 | 0.3608 | nan | 0.2000 | 0.5086 | 0.3954 | 0.0 | 0.8862 | 0.7583 | 0.9526 | 0.0030 | 0.1288 | 0.3859 | 0.0 | | 0.058 | 41.4 | 8280 | 0.7249 | 0.4149 | 0.4906 | 0.8788 | nan | 0.8595 | 0.9577 | 0.8047 | 0.8316 | 0.5300 | nan | 0.7112 | 0.8323 | 0.0734 | 0.9546 | 0.1188 | 0.0 | 0.6529 | 0.0564 | 0.7018 | 0.0 | 0.0388 | 0.9337 | 0.2297 | 0.6200 | 0.3705 | 0.4951 | nan | 0.2212 | 0.6194 | 0.5791 | 0.0 | 0.9535 | 0.8400 | 0.9823 | 0.0089 | 0.2406 | 0.4811 | 0.0 | nan | 0.7669 | 0.8796 | 0.7100 | 0.7539 | 0.3916 | nan | 0.5929 | 0.6481 | 0.0462 | 0.8639 | 0.1034 | 0.0 | 0.4556 | 0.0564 | 0.5855 | 0.0 | 0.0385 | 0.7658 | 0.1999 | 0.5280 | 0.3195 | 0.3666 | nan | 0.1890 | 0.5092 | 0.3935 | 0.0 | 0.8874 | 0.7598 | 0.9533 | 0.0071 | 0.1239 | 0.3806 | 0.0 | | 0.0713 | 41.5 | 8300 | 0.7246 | 0.4159 | 0.4938 | 0.8796 | nan | 0.8617 | 0.9582 | 0.8017 | 0.8346 | 0.5269 | nan | 0.7055 | 0.8484 | 0.0846 | 0.9547 | 0.1224 | 0.0 | 0.6546 | 0.0629 | 0.7093 | 0.0 | 0.0433 | 0.9283 | 0.2270 | 0.6212 | 0.3884 | 0.5291 | nan | 0.2168 | 0.6255 | 0.5767 | 0.0 | 0.9544 | 0.8536 | 0.9821 | 0.0084 | 0.2440 | 0.4769 | 0.0 | nan | 0.7708 | 0.8794 | 0.7099 | 0.7561 | 0.3945 | nan | 0.5920 | 0.6368 | 0.0546 | 0.8646 | 0.1063 | 0.0 | 0.4508 | 0.0629 | 0.5847 | 0.0 | 0.0426 | 0.7672 | 0.1970 | 0.5285 | 0.3282 | 0.3829 | nan | 0.1803 | 0.5096 | 0.3944 | 0.0 | 0.8888 | 0.7674 | 0.9534 | 0.0067 | 0.1231 | 0.3763 | 0.0 | | 0.0749 | 41.6 | 8320 | 0.7235 | 0.4159 | 
0.4939 | 0.8791 | nan | 0.8596 | 0.9574 | 0.7982 | 0.8473 | 0.5339 | nan | 0.7065 | 0.8321 | 0.1159 | 0.9565 | 0.1260 | 0.0 | 0.6461 | 0.0717 | 0.7121 | 0.0 | 0.0399 | 0.9331 | 0.2266 | 0.6165 | 0.3872 | 0.5393 | nan | 0.2080 | 0.6241 | 0.5663 | 0.0 | 0.9519 | 0.8333 | 0.9848 | 0.0066 | 0.2445 | 0.4781 | 0.0 | nan | 0.7714 | 0.8796 | 0.7094 | 0.7562 | 0.3935 | nan | 0.5940 | 0.6356 | 0.0654 | 0.8632 | 0.1084 | 0.0 | 0.4528 | 0.0717 | 0.5806 | 0.0 | 0.0393 | 0.7653 | 0.1969 | 0.5249 | 0.3302 | 0.3871 | nan | 0.1723 | 0.5094 | 0.3931 | 0.0 | 0.8876 | 0.7604 | 0.9525 | 0.0054 | 0.1249 | 0.3762 | 0.0 | | 0.0797 | 41.7 | 8340 | 0.7274 | 0.4130 | 0.4909 | 0.8792 | nan | 0.8519 | 0.9562 | 0.8035 | 0.8531 | 0.5439 | nan | 0.7088 | 0.8430 | 0.0806 | 0.9565 | 0.1222 | 0.0 | 0.6614 | 0.0473 | 0.7077 | 0.0 | 0.0384 | 0.9308 | 0.1828 | 0.6351 | 0.3837 | 0.5151 | nan | 0.1893 | 0.6149 | 0.5878 | 0.0 | 0.9549 | 0.8449 | 0.9851 | 0.0052 | 0.2324 | 0.4736 | 0.0 | nan | 0.7700 | 0.8795 | 0.7001 | 0.7552 | 0.3956 | nan | 0.5961 | 0.6341 | 0.0483 | 0.8633 | 0.1057 | 0.0 | 0.4576 | 0.0472 | 0.5786 | 0.0 | 0.0380 | 0.7683 | 0.1633 | 0.5348 | 0.3249 | 0.3814 | nan | 0.1590 | 0.5099 | 0.3968 | 0.0 | 0.8878 | 0.7646 | 0.9524 | 0.0043 | 0.1245 | 0.3759 | 0.0 | | 0.0992 | 41.8 | 8360 | 0.7200 | 0.4129 | 0.4919 | 0.8783 | nan | 0.8496 | 0.9564 | 0.8070 | 0.8435 | 0.5330 | nan | 0.7220 | 0.8368 | 0.0732 | 0.9572 | 0.1298 | 0.0 | 0.6709 | 0.0622 | 0.7007 | 0.0 | 0.0269 | 0.9303 | 0.1829 | 0.6236 | 0.3705 | 0.5537 | nan | 0.1950 | 0.6268 | 0.5977 | 0.0 | 0.9553 | 0.8440 | 0.9844 | 0.0053 | 0.2268 | 0.4750 | 0.0 | nan | 0.7641 | 0.8773 | 0.6907 | 0.7587 | 0.3943 | nan | 0.5929 | 0.6437 | 0.0429 | 0.8629 | 0.1115 | 0.0 | 0.4596 | 0.0622 | 0.5825 | 0.0 | 0.0267 | 0.7682 | 0.1647 | 0.5321 | 0.3166 | 0.3840 | nan | 0.1642 | 0.5102 | 0.3949 | 0.0 | 0.8868 | 0.7622 | 0.9528 | 0.0044 | 0.1231 | 0.3788 | 0.0 | | 0.0582 | 41.9 | 8380 | 0.7296 | 0.4136 | 0.4924 | 0.8785 | nan | 0.8458 | 0.9592 | 0.8061 | 
0.8447 | 0.5136 | nan | 0.7207 | 0.8342 | 0.0641 | 0.9576 | 0.1245 | 0.0 | 0.6799 | 0.0590 | 0.7184 | 0.0 | 0.0293 | 0.9313 | 0.1980 | 0.6216 | 0.3837 | 0.5488 | nan | 0.2023 | 0.6359 | 0.5885 | 0.0 | 0.9538 | 0.8449 | 0.9838 | 0.0088 | 0.2121 | 0.4875 | 0.0 | nan | 0.7637 | 0.8770 | 0.6863 | 0.7567 | 0.3963 | nan | 0.5920 | 0.6472 | 0.0403 | 0.8635 | 0.1081 | 0.0 | 0.4642 | 0.0590 | 0.5765 | 0.0 | 0.0291 | 0.7673 | 0.1775 | 0.5307 | 0.3274 | 0.3845 | nan | 0.1720 | 0.5113 | 0.3951 | 0.0 | 0.8879 | 0.7644 | 0.9531 | 0.0069 | 0.1164 | 0.3815 | 0.0 | | 0.0701 | 42.0 | 8400 | 0.7337 | 0.4166 | 0.4941 | 0.8794 | nan | 0.8473 | 0.9594 | 0.8017 | 0.8461 | 0.5233 | nan | 0.7117 | 0.8309 | 0.0828 | 0.9545 | 0.1145 | 0.0 | 0.6676 | 0.0605 | 0.7218 | 0.0 | 0.0342 | 0.9306 | 0.2053 | 0.6414 | 0.3719 | 0.5456 | nan | 0.2417 | 0.6270 | 0.5795 | 0.0 | 0.9551 | 0.8547 | 0.9828 | 0.0081 | 0.2259 | 0.4839 | 0.0 | nan | 0.7642 | 0.8767 | 0.7027 | 0.7549 | 0.3968 | nan | 0.5939 | 0.6486 | 0.0503 | 0.8654 | 0.1005 | 0.0 | 0.4616 | 0.0605 | 0.5804 | 0.0 | 0.0339 | 0.7705 | 0.1820 | 0.5421 | 0.3226 | 0.3825 | nan | 0.2008 | 0.5129 | 0.4000 | 0.0 | 0.8890 | 0.7714 | 0.9532 | 0.0065 | 0.1244 | 0.3819 | 0.0 | | 0.0728 | 42.1 | 8420 | 0.7088 | 0.4176 | 0.4970 | 0.8802 | nan | 0.8586 | 0.9580 | 0.7946 | 0.8437 | 0.5241 | nan | 0.7327 | 0.8391 | 0.0650 | 0.9594 | 0.1225 | 0.0 | 0.6719 | 0.0386 | 0.7056 | 0.0 | 0.0373 | 0.9302 | 0.2239 | 0.6361 | 0.3748 | 0.5825 | nan | 0.2644 | 0.6285 | 0.5914 | 0.0 | 0.9539 | 0.8399 | 0.9820 | 0.0074 | 0.2436 | 0.4938 | 0.0 | nan | 0.7709 | 0.8803 | 0.7146 | 0.7559 | 0.3943 | nan | 0.5952 | 0.6516 | 0.0407 | 0.8623 | 0.1078 | 0.0 | 0.4596 | 0.0386 | 0.5840 | 0.0 | 0.0372 | 0.7724 | 0.1968 | 0.5428 | 0.3211 | 0.3926 | nan | 0.2153 | 0.5124 | 0.3964 | 0.0 | 0.8873 | 0.7598 | 0.9536 | 0.0059 | 0.1290 | 0.3856 | 0.0 | | 0.0779 | 42.2 | 8440 | 0.7205 | 0.4161 | 0.4943 | 0.8797 | nan | 0.8508 | 0.9577 | 0.7977 | 0.8524 | 0.5320 | nan | 0.7101 | 0.8392 | 0.0628 | 
0.9558 | 0.1263 | 0.0 | 0.6522 | 0.0314 | 0.7007 | 0.0 | 0.0404 | 0.9291 | 0.2000 | 0.6533 | 0.3771 | 0.5758 | nan | 0.2348 | 0.6217 | 0.5855 | 0.0 | 0.9555 | 0.8560 | 0.9833 | 0.0088 | 0.2664 | 0.4606 | 0.0 | nan | 0.7671 | 0.8793 | 0.7060 | 0.7545 | 0.3968 | nan | 0.5946 | 0.6539 | 0.0420 | 0.8647 | 0.1110 | 0.0 | 0.4508 | 0.0314 | 0.5844 | 0.0 | 0.0401 | 0.7719 | 0.1772 | 0.5459 | 0.3214 | 0.3899 | nan | 0.1958 | 0.5120 | 0.4020 | 0.0 | 0.8878 | 0.7681 | 0.9529 | 0.0068 | 0.1305 | 0.3751 | 0.0 | | 0.0871 | 42.3 | 8460 | 0.7242 | 0.4185 | 0.4984 | 0.8793 | nan | 0.8523 | 0.9558 | 0.8010 | 0.8471 | 0.5390 | nan | 0.7165 | 0.8376 | 0.0732 | 0.9575 | 0.1268 | 0.0 | 0.6554 | 0.0401 | 0.7135 | 0.0 | 0.0391 | 0.9302 | 0.2009 | 0.6376 | 0.3938 | 0.5685 | nan | 0.2904 | 0.6257 | 0.6011 | 0.0 | 0.9547 | 0.8443 | 0.9831 | 0.0072 | 0.2709 | 0.4865 | 0.0 | nan | 0.7665 | 0.8795 | 0.7045 | 0.7569 | 0.3943 | nan | 0.5916 | 0.6578 | 0.0482 | 0.8641 | 0.1112 | 0.0 | 0.4562 | 0.0401 | 0.5822 | 0.0 | 0.0388 | 0.7715 | 0.1783 | 0.5404 | 0.3289 | 0.3985 | nan | 0.2423 | 0.5129 | 0.3986 | 0.0 | 0.8878 | 0.7631 | 0.9532 | 0.0056 | 0.1355 | 0.3838 | 0.0 | | 0.0671 | 42.4 | 8480 | 0.7098 | 0.4188 | 0.4982 | 0.8806 | nan | 0.8619 | 0.9552 | 0.8057 | 0.8452 | 0.5423 | nan | 0.7258 | 0.8334 | 0.0872 | 0.9566 | 0.1294 | 0.0 | 0.6649 | 0.0448 | 0.7137 | 0.0 | 0.0332 | 0.9312 | 0.1886 | 0.6284 | 0.3881 | 0.5432 | nan | 0.3220 | 0.6299 | 0.5896 | 0.0 | 0.9543 | 0.8488 | 0.9821 | 0.0093 | 0.2349 | 0.4914 | 0.0 | nan | 0.7764 | 0.8833 | 0.6977 | 0.7602 | 0.3922 | nan | 0.5932 | 0.6581 | 0.0533 | 0.8642 | 0.1131 | 0.0 | 0.4540 | 0.0448 | 0.5776 | 0.0 | 0.0331 | 0.7703 | 0.1689 | 0.5359 | 0.3266 | 0.3886 | nan | 0.2650 | 0.5127 | 0.3999 | 0.0 | 0.8876 | 0.7635 | 0.9537 | 0.0070 | 0.1350 | 0.3869 | 0.0 | | 0.0736 | 42.5 | 8500 | 0.7222 | 0.4162 | 0.4933 | 0.8795 | nan | 0.8571 | 0.9566 | 0.8087 | 0.8463 | 0.5427 | nan | 0.7186 | 0.8216 | 0.0681 | 0.9545 | 0.1284 | 0.0 | 0.6721 | 0.0528 | 0.7124 | 
0.0 | 0.0335 | 0.9342 | 0.1955 | 0.6150 | 0.3653 | 0.5562 | nan | 0.2725 | 0.6109 | 0.5731 | 0.0 | 0.9558 | 0.8340 | 0.9829 | 0.0079 | 0.2154 | 0.4945 | 0.0 | nan | 0.7733 | 0.8818 | 0.6834 | 0.7600 | 0.3922 | nan | 0.5932 | 0.6659 | 0.0427 | 0.8649 | 0.1118 | 0.0 | 0.4561 | 0.0528 | 0.5790 | 0.0 | 0.0332 | 0.7670 | 0.1743 | 0.5256 | 0.3162 | 0.3880 | nan | 0.2272 | 0.5090 | 0.3999 | 0.0 | 0.8859 | 0.7582 | 0.9532 | 0.0062 | 0.1276 | 0.3903 | 0.0 | | 0.0666 | 42.6 | 8520 | 0.7110 | 0.4198 | 0.4967 | 0.8812 | nan | 0.8616 | 0.9569 | 0.7996 | 0.8491 | 0.5364 | nan | 0.7060 | 0.8177 | 0.0932 | 0.9571 | 0.1263 | 0.0 | 0.6658 | 0.0576 | 0.7184 | 0.0 | 0.0473 | 0.9285 | 0.2135 | 0.6561 | 0.3734 | 0.5421 | nan | 0.2644 | 0.6309 | 0.5902 | 0.0 | 0.9542 | 0.8551 | 0.9831 | 0.0090 | 0.2125 | 0.4895 | 0.0 | nan | 0.7743 | 0.8819 | 0.6957 | 0.7576 | 0.3946 | nan | 0.5919 | 0.6597 | 0.0575 | 0.8641 | 0.1115 | 0.0 | 0.4605 | 0.0576 | 0.5757 | 0.0 | 0.0468 | 0.7745 | 0.1876 | 0.5488 | 0.3221 | 0.3951 | nan | 0.2200 | 0.5136 | 0.4034 | 0.0 | 0.8891 | 0.7712 | 0.9533 | 0.0069 | 0.1312 | 0.3877 | 0.0 | | 0.0695 | 42.7 | 8540 | 0.7218 | 0.4189 | 0.4977 | 0.8780 | nan | 0.8364 | 0.9574 | 0.7969 | 0.8431 | 0.5503 | nan | 0.7173 | 0.8123 | 0.1159 | 0.9577 | 0.1220 | 0.0 | 0.6804 | 0.0625 | 0.7212 | 0.0 | 0.0555 | 0.9310 | 0.1974 | 0.6558 | 0.3836 | 0.5621 | nan | 0.2385 | 0.6290 | 0.6051 | 0.0 | 0.9517 | 0.8515 | 0.9845 | 0.0038 | 0.2094 | 0.4955 | 0.0 | nan | 0.7579 | 0.8747 | 0.7015 | 0.7469 | 0.3890 | nan | 0.5928 | 0.6626 | 0.0668 | 0.8629 | 0.1075 | 0.0 | 0.4649 | 0.0625 | 0.5803 | 0.0 | 0.0551 | 0.7735 | 0.1761 | 0.5460 | 0.3289 | 0.4016 | nan | 0.1946 | 0.5145 | 0.4018 | 0.0 | 0.8895 | 0.7713 | 0.9526 | 0.0032 | 0.1358 | 0.3916 | 0.0 | | 0.0572 | 42.8 | 8560 | 0.7331 | 0.4175 | 0.4938 | 0.8775 | nan | 0.8462 | 0.9556 | 0.7954 | 0.8294 | 0.5532 | nan | 0.7014 | 0.8166 | 0.1012 | 0.9509 | 0.1225 | 0.0 | 0.6679 | 0.0907 | 0.7107 | 0.0 | 0.0320 | 0.9310 | 0.1957 | 0.6430 | 0.3670 | 
0.5316 | nan | 0.2325 | 0.6179 | 0.5918 | 0.0 | 0.9551 | 0.8517 | 0.9839 | 0.0061 | 0.2207 | 0.5003 | 0.0 | nan | 0.7595 | 0.8746 | 0.7054 | 0.7463 | 0.3845 | nan | 0.5869 | 0.6623 | 0.0572 | 0.8663 | 0.1078 | 0.0 | 0.4633 | 0.0906 | 0.5868 | 0.0 | 0.0318 | 0.7716 | 0.1738 | 0.5426 | 0.3163 | 0.3871 | nan | 0.1916 | 0.5119 | 0.3987 | 0.0 | 0.8883 | 0.7697 | 0.9529 | 0.0050 | 0.1348 | 0.3928 | 0.0 | | 0.0852 | 42.9 | 8580 | 0.7294 | 0.4218 | 0.5035 | 0.8781 | nan | 0.8474 | 0.9539 | 0.7988 | 0.8363 | 0.5591 | nan | 0.7173 | 0.8223 | 0.1199 | 0.9574 | 0.1374 | 0.0 | 0.6863 | 0.0966 | 0.7122 | 0.0002 | 0.0356 | 0.9282 | 0.2153 | 0.6422 | 0.3889 | 0.5794 | nan | 0.3187 | 0.6249 | 0.5866 | 0.0 | 0.9541 | 0.8453 | 0.9840 | 0.0073 | 0.2534 | 0.5031 | 0.0 | nan | 0.7629 | 0.8772 | 0.7008 | 0.7526 | 0.3834 | nan | 0.5882 | 0.6610 | 0.0666 | 0.8639 | 0.1194 | 0.0 | 0.4654 | 0.0966 | 0.5804 | 0.0002 | 0.0351 | 0.7734 | 0.1897 | 0.5443 | 0.3258 | 0.4037 | nan | 0.2523 | 0.5123 | 0.3962 | 0.0 | 0.8888 | 0.7665 | 0.9529 | 0.0059 | 0.1388 | 0.3924 | 0.0 | | 0.088 | 43.0 | 8600 | 0.7267 | 0.4198 | 0.4991 | 0.8778 | nan | 0.8440 | 0.9545 | 0.8047 | 0.8429 | 0.5587 | nan | 0.7048 | 0.8152 | 0.1090 | 0.9560 | 0.1211 | 0.0 | 0.7106 | 0.0746 | 0.6981 | 0.0001 | 0.0267 | 0.9307 | 0.2232 | 0.6422 | 0.3654 | 0.5411 | nan | 0.3333 | 0.6245 | 0.5647 | 0.0 | 0.9534 | 0.8454 | 0.9850 | 0.0061 | 0.2331 | 0.5013 | 0.0 | nan | 0.7621 | 0.8776 | 0.6877 | 0.7496 | 0.3866 | nan | 0.5884 | 0.6568 | 0.0592 | 0.8628 | 0.1055 | 0.0 | 0.4838 | 0.0746 | 0.5828 | 0.0001 | 0.0265 | 0.7720 | 0.1964 | 0.5455 | 0.3158 | 0.3866 | nan | 0.2690 | 0.5124 | 0.3965 | 0.0 | 0.8881 | 0.7673 | 0.9523 | 0.0047 | 0.1319 | 0.3907 | 0.0 | | 0.065 | 43.1 | 8620 | 0.7317 | 0.4196 | 0.5005 | 0.8781 | nan | 0.8451 | 0.9537 | 0.8098 | 0.8427 | 0.5527 | nan | 0.7069 | 0.8223 | 0.1230 | 0.9560 | 0.1239 | 0.0 | 0.6888 | 0.0726 | 0.7129 | 0.0 | 0.0271 | 0.9294 | 0.2125 | 0.6537 | 0.3741 | 0.5544 | nan | 0.3373 | 0.6257 | 0.5650 | 
0.0 | 0.9528 | 0.8529 | 0.9834 | 0.0072 | 0.2298 | 0.5000 | 0.0 | nan | 0.7622 | 0.8785 | 0.6837 | 0.7518 | 0.3855 | nan | 0.5875 | 0.6530 | 0.0684 | 0.8638 | 0.1080 | 0.0 | 0.4804 | 0.0725 | 0.5771 | 0.0 | 0.0269 | 0.7726 | 0.1868 | 0.5450 | 0.3229 | 0.3885 | nan | 0.2678 | 0.5116 | 0.3946 | 0.0 | 0.8892 | 0.7700 | 0.9534 | 0.0056 | 0.1319 | 0.3878 | 0.0 | | 0.0697 | 43.2 | 8640 | 0.7204 | 0.4192 | 0.5005 | 0.8785 | nan | 0.8482 | 0.9530 | 0.8100 | 0.8455 | 0.5573 | nan | 0.7103 | 0.8250 | 0.1150 | 0.9551 | 0.1219 | 0.0 | 0.6710 | 0.0804 | 0.7151 | 0.0 | 0.0266 | 0.9303 | 0.2002 | 0.6498 | 0.3774 | 0.5495 | nan | 0.3482 | 0.6393 | 0.5789 | 0.0 | 0.9557 | 0.8447 | 0.9829 | 0.0081 | 0.2347 | 0.4812 | 0.0 | nan | 0.7656 | 0.8803 | 0.6869 | 0.7549 | 0.3853 | nan | 0.5889 | 0.6460 | 0.0617 | 0.8646 | 0.1058 | 0.0 | 0.4654 | 0.0803 | 0.5804 | 0.0 | 0.0264 | 0.7723 | 0.1779 | 0.5491 | 0.3224 | 0.3888 | nan | 0.2797 | 0.5128 | 0.3962 | 0.0 | 0.8874 | 0.7630 | 0.9534 | 0.0062 | 0.1306 | 0.3827 | 0.0 | | 0.0618 | 43.3 | 8660 | 0.7165 | 0.4183 | 0.4983 | 0.8792 | nan | 0.8572 | 0.9548 | 0.8042 | 0.8479 | 0.5460 | nan | 0.7072 | 0.8212 | 0.1021 | 0.9566 | 0.1165 | 0.0 | 0.6709 | 0.0647 | 0.7131 | 0.0 | 0.0285 | 0.9315 | 0.2115 | 0.6370 | 0.3804 | 0.5576 | nan | 0.3268 | 0.6234 | 0.5779 | 0.0 | 0.9542 | 0.8376 | 0.9834 | 0.0093 | 0.2454 | 0.4792 | 0.0 | nan | 0.7710 | 0.8820 | 0.7006 | 0.7550 | 0.3833 | nan | 0.5902 | 0.6422 | 0.0543 | 0.8639 | 0.1019 | 0.0 | 0.4641 | 0.0647 | 0.5811 | 0.0 | 0.0281 | 0.7700 | 0.1853 | 0.5414 | 0.3240 | 0.3977 | nan | 0.2670 | 0.5098 | 0.3917 | 0.0 | 0.8875 | 0.7613 | 0.9529 | 0.0073 | 0.1270 | 0.3791 | 0.0 | | 0.06 | 43.4 | 8680 | 0.7109 | 0.4201 | 0.5011 | 0.8800 | nan | 0.8595 | 0.9558 | 0.7967 | 0.8494 | 0.5272 | nan | 0.7195 | 0.8216 | 0.1296 | 0.9587 | 0.1180 | 0.0 | 0.6750 | 0.0609 | 0.7230 | 0.0 | 0.0352 | 0.9277 | 0.2070 | 0.6379 | 0.3972 | 0.5523 | nan | 0.3292 | 0.6208 | 0.5823 | 0.0 | 0.9550 | 0.8361 | 0.9826 | 0.0137 | 0.2559 | 
0.5083 | 0.0 | nan | 0.7730 | 0.8823 | 0.7029 | 0.7538 | 0.3909 | nan | 0.5893 | 0.6437 | 0.0670 | 0.8638 | 0.1036 | 0.0 | 0.4610 | 0.0609 | 0.5810 | 0.0 | 0.0346 | 0.7733 | 0.1834 | 0.5459 | 0.3286 | 0.4014 | nan | 0.2719 | 0.5098 | 0.3906 | 0.0 | 0.8872 | 0.7607 | 0.9534 | 0.0097 | 0.1288 | 0.3893 | 0.0 | | 0.0637 | 43.5 | 8700 | 0.7112 | 0.4197 | 0.5019 | 0.8798 | nan | 0.8598 | 0.9564 | 0.7924 | 0.8452 | 0.5278 | nan | 0.7139 | 0.8459 | 0.0673 | 0.9563 | 0.1162 | 0.0 | 0.7044 | 0.0499 | 0.7242 | 0.0046 | 0.0234 | 0.9320 | 0.2177 | 0.6367 | 0.4050 | 0.5829 | nan | 0.3565 | 0.6146 | 0.5795 | 0.0 | 0.9508 | 0.8419 | 0.9832 | 0.0151 | 0.2453 | 0.5112 | 0.0 | nan | 0.7701 | 0.8810 | 0.7087 | 0.7526 | 0.3938 | nan | 0.5881 | 0.6374 | 0.0410 | 0.8654 | 0.1019 | 0.0 | 0.4544 | 0.0498 | 0.5856 | 0.0044 | 0.0231 | 0.7726 | 0.1907 | 0.5461 | 0.3340 | 0.4087 | nan | 0.2842 | 0.5077 | 0.3904 | 0.0 | 0.8887 | 0.7634 | 0.9534 | 0.0104 | 0.1279 | 0.3934 | 0.0 | | 0.0686 | 43.6 | 8720 | 0.7090 | 0.4197 | 0.4989 | 0.8799 | nan | 0.8557 | 0.9581 | 0.7838 | 0.8390 | 0.5234 | nan | 0.7065 | 0.8386 | 0.0559 | 0.9563 | 0.1346 | 0.0 | 0.7263 | 0.0440 | 0.7063 | 0.0053 | 0.0168 | 0.9253 | 0.2019 | 0.6655 | 0.4033 | 0.5193 | nan | 0.3487 | 0.6270 | 0.5874 | 0.0 | 0.9547 | 0.8525 | 0.9830 | 0.0137 | 0.2328 | 0.4980 | 0.0 | nan | 0.7640 | 0.8780 | 0.7104 | 0.7531 | 0.4014 | nan | 0.5854 | 0.6436 | 0.0343 | 0.8650 | 0.1170 | 0.0 | 0.4675 | 0.0440 | 0.5847 | 0.0052 | 0.0166 | 0.7773 | 0.1781 | 0.5546 | 0.3319 | 0.3880 | nan | 0.2757 | 0.5094 | 0.3956 | 0.0 | 0.8885 | 0.7679 | 0.9534 | 0.0100 | 0.1391 | 0.3922 | 0.0 | | 0.0807 | 43.7 | 8740 | 0.7229 | 0.4180 | 0.4983 | 0.8792 | nan | 0.8466 | 0.9586 | 0.7929 | 0.8472 | 0.5371 | nan | 0.7049 | 0.8364 | 0.0621 | 0.9543 | 0.1221 | 0.0 | 0.7106 | 0.0559 | 0.7133 | 0.0062 | 0.0340 | 0.9336 | 0.1828 | 0.6558 | 0.3936 | 0.5337 | nan | 0.3272 | 0.6369 | 0.6002 | 0.0 | 0.9521 | 0.8448 | 0.9834 | 0.0054 | 0.2294 | 0.4832 | 0.0 | nan | 0.7613 | 0.8771 
| 0.7048 | 0.7507 | 0.4042 | nan | 0.5878 | 0.6479 | 0.0383 | 0.8658 | 0.1062 | 0.0 | 0.4355 | 0.0559 | 0.5919 | 0.0061 | 0.0334 | 0.7728 | 0.1631 | 0.5500 | 0.3294 | 0.3916 | nan | 0.2608 | 0.5128 | 0.3933 | 0.0 | 0.8887 | 0.7658 | 0.9533 | 0.0044 | 0.1359 | 0.3871 | 0.0 | | 0.0557 | 43.8 | 8760 | 0.7293 | 0.4202 | 0.5003 | 0.8787 | nan | 0.8414 | 0.9593 | 0.7976 | 0.8454 | 0.5346 | nan | 0.7111 | 0.8324 | 0.0655 | 0.9556 | 0.1376 | 0.0 | 0.7028 | 0.0795 | 0.7066 | 0.0112 | 0.0297 | 0.9321 | 0.2095 | 0.6408 | 0.3838 | 0.5562 | nan | 0.3337 | 0.6187 | 0.5912 | 0.0 | 0.9529 | 0.8491 | 0.9826 | 0.0117 | 0.2366 | 0.4989 | 0.0 | nan | 0.7594 | 0.8760 | 0.7044 | 0.7460 | 0.4061 | nan | 0.5894 | 0.6491 | 0.0391 | 0.8650 | 0.1192 | 0.0 | 0.4528 | 0.0795 | 0.5932 | 0.0110 | 0.0292 | 0.7729 | 0.1842 | 0.5456 | 0.3261 | 0.3934 | nan | 0.2579 | 0.5118 | 0.3937 | 0.0 | 0.8887 | 0.7667 | 0.9542 | 0.0090 | 0.1334 | 0.3893 | 0.0 | | 0.0655 | 43.9 | 8780 | 0.7311 | 0.4198 | 0.5014 | 0.8783 | nan | 0.8393 | 0.9554 | 0.8100 | 0.8526 | 0.5444 | nan | 0.7158 | 0.8314 | 0.0777 | 0.9564 | 0.1426 | 0.0 | 0.7048 | 0.0661 | 0.7165 | 0.0070 | 0.0355 | 0.9295 | 0.2021 | 0.6563 | 0.3785 | 0.5421 | nan | 0.3485 | 0.6098 | 0.5809 | 0.0 | 0.9562 | 0.8432 | 0.9819 | 0.0113 | 0.2569 | 0.4911 | 0.0 | nan | 0.7591 | 0.8774 | 0.6880 | 0.7491 | 0.4010 | nan | 0.5890 | 0.6512 | 0.0459 | 0.8645 | 0.1234 | 0.0 | 0.4569 | 0.0661 | 0.5894 | 0.0069 | 0.0348 | 0.7745 | 0.1777 | 0.5502 | 0.3229 | 0.3875 | nan | 0.2705 | 0.5105 | 0.3950 | 0.0 | 0.8871 | 0.7637 | 0.9543 | 0.0087 | 0.1385 | 0.3887 | 0.0 | | 0.1089 | 44.0 | 8800 | 0.7312 | 0.4192 | 0.4999 | 0.8783 | nan | 0.8397 | 0.9578 | 0.8063 | 0.8582 | 0.5364 | nan | 0.6964 | 0.8238 | 0.0701 | 0.9568 | 0.1412 | 0.0 | 0.6988 | 0.0710 | 0.7186 | 0.0024 | 0.0395 | 0.9293 | 0.1825 | 0.6419 | 0.3726 | 0.5495 | nan | 0.3361 | 0.6296 | 0.5986 | 0.0 | 0.9542 | 0.8373 | 0.9846 | 0.0106 | 0.2464 | 0.5059 | 0.0 | nan | 0.7582 | 0.8774 | 0.6870 | 0.7482 | 0.4026 | nan | 
0.5888 | 0.6556 | 0.0403 | 0.8647 | 0.1220 | 0.0 | 0.4713 | 0.0709 | 0.5898 | 0.0024 | 0.0388 | 0.7742 | 0.1633 | 0.5487 | 0.3173 | 0.3890 | nan | 0.2694 | 0.5097 | 0.3926 | 0.0 | 0.8871 | 0.7616 | 0.9533 | 0.0082 | 0.1322 | 0.3913 | 0.0 | | 0.0651 | 44.1 | 8820 | 0.7258 | 0.4167 | 0.4960 | 0.8781 | nan | 0.8389 | 0.9580 | 0.8069 | 0.8540 | 0.5310 | nan | 0.7010 | 0.8305 | 0.0672 | 0.9582 | 0.1303 | 0.0 | 0.6921 | 0.0439 | 0.7103 | 0.0 | 0.0331 | 0.9301 | 0.1987 | 0.6444 | 0.3661 | 0.5154 | nan | 0.3231 | 0.6235 | 0.5817 | 0.0 | 0.9546 | 0.8372 | 0.9849 | 0.0090 | 0.2459 | 0.5016 | 0.0 | nan | 0.7581 | 0.8772 | 0.6853 | 0.7509 | 0.4018 | nan | 0.5875 | 0.6505 | 0.0400 | 0.8634 | 0.1134 | 0.0 | 0.4602 | 0.0439 | 0.5862 | 0.0 | 0.0328 | 0.7733 | 0.1758 | 0.5474 | 0.3146 | 0.3825 | nan | 0.2584 | 0.5083 | 0.3952 | 0.0 | 0.8866 | 0.7597 | 0.9530 | 0.0068 | 0.1321 | 0.3900 | 0.0 | | 0.0692 | 44.2 | 8840 | 0.7330 | 0.4161 | 0.4947 | 0.8773 | nan | 0.8412 | 0.9553 | 0.8049 | 0.8450 | 0.5478 | nan | 0.7073 | 0.8187 | 0.0795 | 0.9572 | 0.1261 | 0.0 | 0.6916 | 0.0548 | 0.7093 | 0.0029 | 0.0275 | 0.9332 | 0.1765 | 0.6301 | 0.3681 | 0.5453 | nan | 0.2985 | 0.6152 | 0.5738 | 0.0 | 0.9541 | 0.8444 | 0.9853 | 0.0108 | 0.2370 | 0.4879 | 0.0 | nan | 0.7584 | 0.8765 | 0.6907 | 0.7500 | 0.3961 | nan | 0.5871 | 0.6576 | 0.0452 | 0.8634 | 0.1096 | 0.0 | 0.4611 | 0.0548 | 0.5893 | 0.0028 | 0.0274 | 0.7684 | 0.1583 | 0.5353 | 0.3184 | 0.3885 | nan | 0.2442 | 0.5093 | 0.3966 | 0.0 | 0.8871 | 0.7617 | 0.9526 | 0.0082 | 0.1301 | 0.3866 | 0.0 | | 0.0654 | 44.3 | 8860 | 0.7268 | 0.4164 | 0.4958 | 0.8779 | nan | 0.8438 | 0.9578 | 0.7944 | 0.8479 | 0.5362 | nan | 0.6987 | 0.8252 | 0.0859 | 0.9576 | 0.1255 | 0.0 | 0.6884 | 0.0533 | 0.7208 | 0.0038 | 0.0228 | 0.9300 | 0.1571 | 0.6303 | 0.4273 | 0.5702 | nan | 0.2575 | 0.6223 | 0.5944 | 0.0 | 0.9520 | 0.8468 | 0.9850 | 0.0115 | 0.2333 | 0.4859 | 0.0 | nan | 0.7596 | 0.8765 | 0.6991 | 0.7497 | 0.3995 | nan | 0.5865 | 0.6540 | 0.0503 | 0.8638 | 
0.1088 | 0.0 | 0.4636 | 0.0533 | 0.5866 | 0.0037 | 0.0227 | 0.7688 | 0.1423 | 0.5312 | 0.3468 | 0.4065 | nan | 0.2179 | 0.5087 | 0.3934 | 0.0 | 0.8891 | 0.7646 | 0.9523 | 0.0087 | 0.1321 | 0.3839 | 0.0 | | 0.0886 | 44.4 | 8880 | 0.7317 | 0.4162 | 0.4946 | 0.8776 | nan | 0.8453 | 0.9574 | 0.7868 | 0.8436 | 0.5339 | nan | 0.7076 | 0.8323 | 0.0835 | 0.9565 | 0.1189 | 0.0 | 0.6836 | 0.0644 | 0.7184 | 0.0029 | 0.0224 | 0.9334 | 0.1645 | 0.6164 | 0.4076 | 0.5555 | nan | 0.2833 | 0.6216 | 0.5878 | 0.0 | 0.9531 | 0.8455 | 0.9836 | 0.0136 | 0.2147 | 0.4873 | 0.0 | nan | 0.7597 | 0.8757 | 0.7058 | 0.7446 | 0.4026 | nan | 0.5884 | 0.6438 | 0.0485 | 0.8646 | 0.1042 | 0.0 | 0.4628 | 0.0644 | 0.5891 | 0.0028 | 0.0223 | 0.7662 | 0.1479 | 0.5247 | 0.3425 | 0.3939 | nan | 0.2349 | 0.5086 | 0.3960 | 0.0 | 0.8889 | 0.7651 | 0.9531 | 0.0099 | 0.1246 | 0.3835 | 0.0 | | 0.0592 | 44.5 | 8900 | 0.7325 | 0.4189 | 0.4975 | 0.8784 | nan | 0.8458 | 0.9586 | 0.7892 | 0.8429 | 0.5300 | nan | 0.7085 | 0.8330 | 0.1017 | 0.9566 | 0.1281 | 0.0 | 0.6934 | 0.0763 | 0.7256 | 0.0006 | 0.0309 | 0.9290 | 0.2149 | 0.6268 | 0.3929 | 0.5390 | nan | 0.2751 | 0.6182 | 0.5806 | 0.0 | 0.9533 | 0.8541 | 0.9831 | 0.0123 | 0.2190 | 0.5009 | 0.0 | nan | 0.7606 | 0.8758 | 0.7049 | 0.7442 | 0.4042 | nan | 0.5897 | 0.6434 | 0.0576 | 0.8656 | 0.1122 | 0.0 | 0.4635 | 0.0763 | 0.5883 | 0.0005 | 0.0305 | 0.7700 | 0.1886 | 0.5313 | 0.3367 | 0.3915 | nan | 0.2296 | 0.5086 | 0.3965 | 0.0 | 0.8897 | 0.7701 | 0.9537 | 0.0092 | 0.1255 | 0.3859 | 0.0 | | 0.0756 | 44.6 | 8920 | 0.7285 | 0.4177 | 0.4947 | 0.8778 | nan | 0.8449 | 0.9569 | 0.7980 | 0.8439 | 0.5414 | nan | 0.7156 | 0.8249 | 0.0876 | 0.9550 | 0.1230 | 0.0 | 0.6976 | 0.0827 | 0.7212 | 0.0011 | 0.0269 | 0.9353 | 0.2019 | 0.6128 | 0.3814 | 0.5312 | nan | 0.2838 | 0.6041 | 0.5666 | 0.0 | 0.9534 | 0.8478 | 0.9831 | 0.0133 | 0.2039 | 0.4928 | 0.0 | nan | 0.7604 | 0.8769 | 0.6981 | 0.7464 | 0.4036 | nan | 0.5917 | 0.6492 | 0.0511 | 0.8661 | 0.1076 | 0.0 | 0.4700 | 0.0827 | 
0.5898 | 0.0011 | 0.0267 | 0.7655 | 0.1780 | 0.5223 | 0.3287 | 0.3834 | nan | 0.2358 | 0.5075 | 0.3975 | 0.0 | 0.8880 | 0.7659 | 0.9536 | 0.0100 | 0.1227 | 0.3856 | 0.0 | | 0.0552 | 44.7 | 8940 | 0.7286 | 0.4178 | 0.4960 | 0.8785 | nan | 0.8500 | 0.9568 | 0.7997 | 0.8454 | 0.5380 | nan | 0.7104 | 0.8261 | 0.0826 | 0.9550 | 0.1311 | 0.0 | 0.6737 | 0.0857 | 0.7207 | 0.0013 | 0.0228 | 0.9298 | 0.1966 | 0.6192 | 0.3907 | 0.5334 | nan | 0.2912 | 0.6324 | 0.5819 | 0.0 | 0.9558 | 0.8433 | 0.9824 | 0.0117 | 0.2209 | 0.4822 | 0.0 | nan | 0.7627 | 0.8784 | 0.6998 | 0.7509 | 0.4015 | nan | 0.5918 | 0.6423 | 0.0479 | 0.8664 | 0.1136 | 0.0 | 0.4565 | 0.0857 | 0.5902 | 0.0013 | 0.0225 | 0.7685 | 0.1739 | 0.5288 | 0.3294 | 0.3864 | nan | 0.2423 | 0.5103 | 0.3982 | 0.0 | 0.8870 | 0.7651 | 0.9538 | 0.0089 | 0.1235 | 0.3812 | 0.0 | | 0.0666 | 44.8 | 8960 | 0.7386 | 0.4175 | 0.4952 | 0.8780 | nan | 0.8404 | 0.9579 | 0.8046 | 0.8503 | 0.5445 | nan | 0.7113 | 0.8195 | 0.1001 | 0.9529 | 0.1387 | 0.0 | 0.6556 | 0.0980 | 0.7253 | 0.0010 | 0.0275 | 0.9326 | 0.1886 | 0.6285 | 0.3736 | 0.5390 | nan | 0.2726 | 0.6155 | 0.5677 | 0.0 | 0.9554 | 0.8414 | 0.9833 | 0.0083 | 0.2342 | 0.4780 | 0.0 | nan | 0.7598 | 0.8779 | 0.6886 | 0.7473 | 0.4052 | nan | 0.5924 | 0.6417 | 0.0559 | 0.8671 | 0.1196 | 0.0 | 0.4493 | 0.0980 | 0.5946 | 0.0010 | 0.0273 | 0.7685 | 0.1679 | 0.5325 | 0.3207 | 0.3926 | nan | 0.2302 | 0.5093 | 0.3996 | 0.0 | 0.8867 | 0.7644 | 0.9536 | 0.0063 | 0.1233 | 0.3797 | 0.0 | | 0.062 | 44.9 | 8980 | 0.7354 | 0.4167 | 0.4943 | 0.8776 | nan | 0.8392 | 0.9559 | 0.8056 | 0.8510 | 0.5501 | nan | 0.7168 | 0.8244 | 0.0703 | 0.9562 | 0.1388 | 0.0 | 0.6559 | 0.0825 | 0.7042 | 0.0 | 0.0273 | 0.9347 | 0.2042 | 0.6299 | 0.3684 | 0.5330 | nan | 0.2867 | 0.6248 | 0.5721 | 0.0 | 0.9543 | 0.8418 | 0.9828 | 0.0084 | 0.2265 | 0.4720 | 0.0 | nan | 0.7588 | 0.8780 | 0.6875 | 0.7494 | 0.4034 | nan | 0.5932 | 0.6421 | 0.0418 | 0.8650 | 0.1199 | 0.0 | 0.4503 | 0.0825 | 0.5912 | 0.0 | 0.0270 | 0.7679 | 
0.1794 | 0.5320 | 0.3182 | 0.3908 | nan | 0.2400 | 0.5106 | 0.3981 | 0.0 | 0.8870 | 0.7627 | 0.9540 | 0.0062 | 0.1210 | 0.3772 | 0.0 | | 0.0723 | 45.0 | 9000 | 0.7471 | 0.4173 | 0.4954 | 0.8778 | nan | 0.8333 | 0.9563 | 0.8064 | 0.8629 | 0.5388 | nan | 0.7114 | 0.8171 | 0.0703 | 0.9572 | 0.1278 | 0.0 | 0.6566 | 0.0568 | 0.7102 | 0.0 | 0.0346 | 0.9322 | 0.1989 | 0.6334 | 0.3889 | 0.5232 | nan | 0.3184 | 0.6227 | 0.5818 | 0.0 | 0.9517 | 0.8613 | 0.9836 | 0.0112 | 0.2188 | 0.4869 | 0.0 | nan | 0.7549 | 0.8780 | 0.6770 | 0.7456 | 0.4055 | nan | 0.5931 | 0.6419 | 0.0425 | 0.8642 | 0.1121 | 0.0 | 0.4577 | 0.0568 | 0.5839 | 0.0 | 0.0343 | 0.7700 | 0.1761 | 0.5356 | 0.3285 | 0.3927 | nan | 0.2656 | 0.5089 | 0.3986 | 0.0 | 0.8898 | 0.7708 | 0.9535 | 0.0083 | 0.1270 | 0.3819 | 0.0 | | 0.0648 | 45.1 | 9020 | 0.7314 | 0.4180 | 0.4975 | 0.8780 | nan | 0.8368 | 0.9564 | 0.8026 | 0.8635 | 0.5396 | nan | 0.7064 | 0.8212 | 0.0805 | 0.9556 | 0.1300 | 0.0 | 0.6996 | 0.0558 | 0.7156 | 0.0 | 0.0289 | 0.9301 | 0.2240 | 0.6456 | 0.3784 | 0.5095 | nan | 0.3363 | 0.6191 | 0.5911 | 0.0 | 0.9550 | 0.8447 | 0.9817 | 0.0094 | 0.2181 | 0.4845 | 0.0 | nan | 0.7572 | 0.8784 | 0.6826 | 0.7447 | 0.4042 | nan | 0.5951 | 0.6395 | 0.0479 | 0.8653 | 0.1139 | 0.0 | 0.4665 | 0.0558 | 0.5832 | 0.0 | 0.0287 | 0.7728 | 0.1957 | 0.5406 | 0.3217 | 0.3863 | nan | 0.2713 | 0.5095 | 0.3938 | 0.0 | 0.8876 | 0.7644 | 0.9542 | 0.0069 | 0.1283 | 0.3792 | 0.0 | | 0.0623 | 45.2 | 9040 | 0.7278 | 0.4188 | 0.4995 | 0.8778 | nan | 0.8387 | 0.9556 | 0.8078 | 0.8564 | 0.5497 | nan | 0.7147 | 0.8184 | 0.0845 | 0.9571 | 0.1374 | 0.0 | 0.7057 | 0.0565 | 0.7144 | 0.0 | 0.0271 | 0.9300 | 0.2166 | 0.6403 | 0.3961 | 0.5277 | nan | 0.3465 | 0.6230 | 0.5861 | 0.0 | 0.9538 | 0.8378 | 0.9831 | 0.0104 | 0.2284 | 0.4816 | 0.0 | nan | 0.7591 | 0.8794 | 0.6766 | 0.7506 | 0.3999 | nan | 0.5965 | 0.6427 | 0.0491 | 0.8646 | 0.1196 | 0.0 | 0.4744 | 0.0565 | 0.5865 | 0.0 | 0.0269 | 0.7725 | 0.1910 | 0.5386 | 0.3288 | 0.3946 | nan | 0.2760 | 
0.5099 | 0.3924 | 0.0 | 0.8870 | 0.7582 | 0.9538 | 0.0074 | 0.1293 | 0.3795 | 0.0 | | 0.0933 | 45.3 | 9060 | 0.7315 | 0.4165 | 0.4958 | 0.8770 | nan | 0.8354 | 0.9562 | 0.8112 | 0.8543 | 0.5486 | nan | 0.7124 | 0.8203 | 0.0950 | 0.9562 | 0.1240 | 0.0 | 0.6678 | 0.0667 | 0.7217 | 0.0 | 0.0277 | 0.9344 | 0.1949 | 0.6166 | 0.3834 | 0.5225 | nan | 0.3350 | 0.6123 | 0.5716 | 0.0 | 0.9552 | 0.8340 | 0.9832 | 0.0097 | 0.2344 | 0.4819 | 0.0 | nan | 0.7565 | 0.8785 | 0.6719 | 0.7522 | 0.4025 | nan | 0.5940 | 0.6437 | 0.0542 | 0.8649 | 0.1081 | 0.0 | 0.4599 | 0.0667 | 0.5888 | 0.0 | 0.0274 | 0.7680 | 0.1737 | 0.5282 | 0.3233 | 0.3892 | nan | 0.2649 | 0.5078 | 0.3946 | 0.0 | 0.8857 | 0.7566 | 0.9535 | 0.0070 | 0.1277 | 0.3796 | 0.0 | | 0.0677 | 45.4 | 9080 | 0.7344 | 0.4174 | 0.4972 | 0.8772 | nan | 0.8379 | 0.9557 | 0.8075 | 0.8528 | 0.5457 | nan | 0.7049 | 0.8222 | 0.1007 | 0.9576 | 0.1268 | 0.0 | 0.6637 | 0.0536 | 0.7270 | 0.0 | 0.0301 | 0.9329 | 0.1934 | 0.6329 | 0.3872 | 0.5320 | nan | 0.3338 | 0.6185 | 0.5816 | 0.0 | 0.9532 | 0.8411 | 0.9840 | 0.0124 | 0.2396 | 0.4811 | 0.0 | nan | 0.7567 | 0.8781 | 0.6787 | 0.7525 | 0.3997 | nan | 0.5912 | 0.6507 | 0.0581 | 0.8639 | 0.1107 | 0.0 | 0.4616 | 0.0536 | 0.5893 | 0.0 | 0.0297 | 0.7699 | 0.1720 | 0.5350 | 0.3262 | 0.3926 | nan | 0.2656 | 0.5087 | 0.3940 | 0.0 | 0.8867 | 0.7585 | 0.9531 | 0.0086 | 0.1308 | 0.3809 | 0.0 | | 0.0562 | 45.5 | 9100 | 0.7293 | 0.4182 | 0.4980 | 0.8775 | nan | 0.8373 | 0.9568 | 0.8015 | 0.8531 | 0.5431 | nan | 0.7127 | 0.8264 | 0.0872 | 0.9570 | 0.1405 | 0.0 | 0.6721 | 0.0682 | 0.7243 | 0.0 | 0.0273 | 0.9280 | 0.2170 | 0.6294 | 0.3873 | 0.5305 | nan | 0.3120 | 0.6229 | 0.5872 | 0.0 | 0.9560 | 0.8360 | 0.9832 | 0.0102 | 0.2377 | 0.4911 | 0.0 | nan | 0.7569 | 0.8778 | 0.6855 | 0.7522 | 0.4006 | nan | 0.5921 | 0.6538 | 0.0508 | 0.8642 | 0.1226 | 0.0 | 0.4610 | 0.0682 | 0.5914 | 0.0 | 0.0270 | 0.7720 | 0.1898 | 0.5368 | 0.3270 | 0.3875 | nan | 0.2549 | 0.5103 | 0.3902 | 0.0 | 0.8855 | 0.7555 | 0.9537 | 
0.0074 | 0.1266 | 0.3816 | 0.0 | | 0.0722 | 45.6 | 9120 | 0.7235 | 0.4166 | 0.4948 | 0.8775 | nan | 0.8413 | 0.9552 | 0.7981 | 0.8550 | 0.5459 | nan | 0.7196 | 0.8248 | 0.0916 | 0.9560 | 0.1451 | 0.0 | 0.6633 | 0.0771 | 0.7236 | 0.0 | 0.0392 | 0.9341 | 0.1956 | 0.6222 | 0.3749 | 0.5074 | nan | 0.2789 | 0.6205 | 0.5841 | 0.0 | 0.9543 | 0.8399 | 0.9834 | 0.0040 | 0.2227 | 0.4759 | 0.0 | nan | 0.7596 | 0.8782 | 0.6895 | 0.7517 | 0.3923 | nan | 0.5944 | 0.6488 | 0.0515 | 0.8644 | 0.1259 | 0.0 | 0.4444 | 0.0771 | 0.5943 | 0.0 | 0.0388 | 0.7686 | 0.1723 | 0.5313 | 0.3210 | 0.3871 | nan | 0.2354 | 0.5105 | 0.3925 | 0.0 | 0.8865 | 0.7586 | 0.9536 | 0.0031 | 0.1195 | 0.3787 | 0.0 | | 0.0855 | 45.7 | 9140 | 0.7305 | 0.4143 | 0.4917 | 0.8777 | nan | 0.8419 | 0.9569 | 0.8012 | 0.8498 | 0.5367 | nan | 0.7175 | 0.8233 | 0.0903 | 0.9561 | 0.1382 | 0.0 | 0.6537 | 0.0677 | 0.7246 | 0.0 | 0.0283 | 0.9335 | 0.1604 | 0.6322 | 0.3685 | 0.5063 | nan | 0.2680 | 0.6185 | 0.5825 | 0.0 | 0.9568 | 0.8389 | 0.9840 | 0.0062 | 0.2316 | 0.4611 | 0.0 | nan | 0.7593 | 0.8777 | 0.6863 | 0.7509 | 0.3992 | nan | 0.5939 | 0.6446 | 0.0520 | 0.8650 | 0.1202 | 0.0 | 0.4408 | 0.0677 | 0.5886 | 0.0 | 0.0282 | 0.7690 | 0.1451 | 0.5347 | 0.3159 | 0.3871 | nan | 0.2261 | 0.5104 | 0.3969 | 0.0 | 0.8861 | 0.7610 | 0.9529 | 0.0048 | 0.1205 | 0.3735 | 0.0 | | 0.1003 | 45.8 | 9160 | 0.7352 | 0.4134 | 0.4917 | 0.8775 | nan | 0.8407 | 0.9570 | 0.8053 | 0.8416 | 0.5420 | nan | 0.7239 | 0.8304 | 0.0839 | 0.9577 | 0.1438 | 0.0 | 0.6563 | 0.0487 | 0.7149 | 0.0 | 0.0286 | 0.9311 | 0.1542 | 0.6253 | 0.3811 | 0.5337 | nan | 0.2558 | 0.6163 | 0.5718 | 0.0 | 0.9561 | 0.8423 | 0.9833 | 0.0079 | 0.2251 | 0.4753 | 0.0 | nan | 0.7589 | 0.8770 | 0.6796 | 0.7526 | 0.4003 | nan | 0.5916 | 0.6395 | 0.0499 | 0.8648 | 0.1252 | 0.0 | 0.4420 | 0.0487 | 0.5860 | 0.0 | 0.0284 | 0.7701 | 0.1400 | 0.5330 | 0.3208 | 0.3949 | nan | 0.2163 | 0.5095 | 0.3961 | 0.0 | 0.8868 | 0.7634 | 0.9535 | 0.0060 | 0.1182 | 0.3753 | 0.0 | | 0.0856 | 45.9 | 
9180 | 0.7275 | 0.4160 | 0.4940 | 0.8780 | nan | 0.8410 | 0.9569 | 0.8037 | 0.8525 | 0.5318 | nan | 0.7125 | 0.8325 | 0.0641 | 0.9577 | 0.1498 | 0.0 | 0.6451 | 0.0527 | 0.7146 | 0.0 | 0.0274 | 0.9315 | 0.1914 | 0.6320 | 0.3802 | 0.5320 | nan | 0.2852 | 0.6298 | 0.5780 | 0.0 | 0.9526 | 0.8491 | 0.9825 | 0.0073 | 0.2231 | 0.4918 | 0.0 | nan | 0.7590 | 0.8777 | 0.6857 | 0.7505 | 0.3972 | nan | 0.5930 | 0.6430 | 0.0397 | 0.8646 | 0.1305 | 0.0 | 0.4510 | 0.0527 | 0.5873 | 0.0 | 0.0272 | 0.7714 | 0.1695 | 0.5386 | 0.3233 | 0.3932 | nan | 0.2411 | 0.5101 | 0.3956 | 0.0 | 0.8883 | 0.7657 | 0.9541 | 0.0055 | 0.1176 | 0.3792 | 0.0 | | 0.0818 | 46.0 | 9200 | 0.7458 | 0.4147 | 0.4941 | 0.8772 | nan | 0.8414 | 0.9570 | 0.7964 | 0.8409 | 0.5371 | nan | 0.7188 | 0.8320 | 0.0784 | 0.9571 | 0.1535 | 0.0 | 0.6523 | 0.0480 | 0.7226 | 0.0 | 0.0183 | 0.9319 | 0.1858 | 0.6348 | 0.3898 | 0.5330 | nan | 0.2789 | 0.6231 | 0.5943 | 0.0 | 0.9558 | 0.8330 | 0.9843 | 0.0110 | 0.2373 | 0.4636 | 0.0 | nan | 0.7578 | 0.8759 | 0.6976 | 0.7506 | 0.3975 | nan | 0.5916 | 0.6398 | 0.0482 | 0.8652 | 0.1325 | 0.0 | 0.4373 | 0.0480 | 0.5842 | 0.0 | 0.0181 | 0.7708 | 0.1639 | 0.5377 | 0.3260 | 0.3928 | nan | 0.2324 | 0.5116 | 0.3963 | 0.0 | 0.8858 | 0.7563 | 0.9532 | 0.0081 | 0.1199 | 0.3716 | 0.0 | | 0.0727 | 46.1 | 9220 | 0.7385 | 0.4162 | 0.4947 | 0.8773 | nan | 0.8434 | 0.9572 | 0.7878 | 0.8435 | 0.5369 | nan | 0.7161 | 0.8285 | 0.0850 | 0.9556 | 0.1520 | 0.0 | 0.6477 | 0.0513 | 0.7257 | 0.0 | 0.0171 | 0.9305 | 0.2151 | 0.6339 | 0.3868 | 0.5109 | nan | 0.2958 | 0.6302 | 0.5912 | 0.0 | 0.9548 | 0.8314 | 0.9839 | 0.0083 | 0.2354 | 0.4753 | 0.0 | nan | 0.7582 | 0.8760 | 0.7016 | 0.7485 | 0.3972 | nan | 0.5919 | 0.6417 | 0.0512 | 0.8663 | 0.1306 | 0.0 | 0.4412 | 0.0513 | 0.5872 | 0.0 | 0.0169 | 0.7718 | 0.1877 | 0.5386 | 0.3260 | 0.3906 | nan | 0.2423 | 0.5121 | 0.3961 | 0.0 | 0.8860 | 0.7552 | 0.9534 | 0.0062 | 0.1189 | 0.3750 | 0.0 | | 0.0575 | 46.2 | 9240 | 0.7399 | 0.4176 | 0.4962 | 0.8774 | nan | 
0.8411 | 0.9561 | 0.7987 | 0.8454 | 0.5384 | nan | 0.7186 | 0.8241 | 0.0766 | 0.9531 | 0.1534 | 0.0 | 0.6517 | 0.0703 | 0.7375 | 0.0 | 0.0285 | 0.9308 | 0.2062 | 0.6480 | 0.3815 | 0.5007 | nan | 0.3178 | 0.6272 | 0.5761 | 0.0 | 0.9552 | 0.8348 | 0.9836 | 0.0087 | 0.2379 | 0.4757 | 0.0 | nan | 0.7579 | 0.8761 | 0.6950 | 0.7505 | 0.3951 | nan | 0.5925 | 0.6410 | 0.0465 | 0.8681 | 0.1319 | 0.0 | 0.4437 | 0.0703 | 0.5958 | 0.0 | 0.0279 | 0.7733 | 0.1823 | 0.5453 | 0.3248 | 0.3814 | nan | 0.2603 | 0.5126 | 0.3966 | 0.0 | 0.8860 | 0.7557 | 0.9535 | 0.0066 | 0.1182 | 0.3746 | 0.0 | | 0.0972 | 46.3 | 9260 | 0.7251 | 0.4175 | 0.4958 | 0.8779 | nan | 0.8462 | 0.9550 | 0.7966 | 0.8451 | 0.5448 | nan | 0.7146 | 0.8252 | 0.0723 | 0.9575 | 0.1456 | 0.0 | 0.6676 | 0.0647 | 0.7245 | 0.0 | 0.0273 | 0.9313 | 0.1948 | 0.6500 | 0.3861 | 0.5225 | nan | 0.3129 | 0.6088 | 0.5722 | 0.0 | 0.9545 | 0.8397 | 0.9830 | 0.0075 | 0.2369 | 0.4783 | 0.0 | nan | 0.7610 | 0.8766 | 0.7007 | 0.7536 | 0.3934 | nan | 0.5942 | 0.6424 | 0.0442 | 0.8653 | 0.1253 | 0.0 | 0.4524 | 0.0647 | 0.5900 | 0.0 | 0.0269 | 0.7730 | 0.1726 | 0.5446 | 0.3287 | 0.3898 | nan | 0.2568 | 0.5100 | 0.3965 | 0.0 | 0.8871 | 0.7584 | 0.9537 | 0.0058 | 0.1184 | 0.3745 | 0.0 | | 0.0528 | 46.4 | 9280 | 0.7302 | 0.4166 | 0.4933 | 0.8784 | nan | 0.8492 | 0.9565 | 0.7981 | 0.8473 | 0.5424 | nan | 0.6950 | 0.8226 | 0.0683 | 0.9571 | 0.1401 | 0.0 | 0.6675 | 0.0655 | 0.7211 | 0.0 | 0.0256 | 0.9300 | 0.1869 | 0.6525 | 0.3817 | 0.5119 | nan | 0.2976 | 0.6154 | 0.5809 | 0.0 | 0.9544 | 0.8396 | 0.9842 | 0.0088 | 0.2052 | 0.4807 | 0.0 | nan | 0.7626 | 0.8770 | 0.6993 | 0.7526 | 0.3962 | nan | 0.5926 | 0.6401 | 0.0423 | 0.8654 | 0.1211 | 0.0 | 0.4546 | 0.0655 | 0.5869 | 0.0 | 0.0253 | 0.7730 | 0.1662 | 0.5443 | 0.3279 | 0.3840 | nan | 0.2467 | 0.5094 | 0.3965 | 0.0 | 0.8873 | 0.7608 | 0.9531 | 0.0066 | 0.1164 | 0.3777 | 0.0 | | 0.0629 | 46.5 | 9300 | 0.7256 | 0.4175 | 0.4942 | 0.8785 | nan | 0.8492 | 0.9565 | 0.7925 | 0.8456 | 0.5273 | nan | 
0.7210 | 0.8175 | 0.0515 | 0.9588 | 0.1461 | 0.0 | 0.6703 | 0.0769 | 0.7161 | 0.0 | 0.0221 | 0.9329 | 0.1864 | 0.6420 | 0.3898 | 0.5235 | nan | 0.2992 | 0.6153 | 0.5840 | 0.0 | 0.9537 | 0.8408 | 0.9846 | 0.0105 | 0.2214 | 0.4793 | 0.0 | nan | 0.7628 | 0.8771 | 0.7040 | 0.7518 | 0.3966 | nan | 0.5926 | 0.6505 | 0.0332 | 0.8643 | 0.1260 | 0.0 | 0.4539 | 0.0769 | 0.5916 | 0.0 | 0.0219 | 0.7715 | 0.1656 | 0.5414 | 0.3303 | 0.3870 | nan | 0.2515 | 0.5090 | 0.3982 | 0.0 | 0.8873 | 0.7597 | 0.9529 | 0.0079 | 0.1177 | 0.3779 | 0.0 | | 0.0586 | 46.6 | 9320 | 0.7344 | 0.4168 | 0.4927 | 0.8779 | nan | 0.8464 | 0.9545 | 0.8018 | 0.8483 | 0.5394 | nan | 0.7172 | 0.8116 | 0.0486 | 0.9574 | 0.1387 | 0.0 | 0.6651 | 0.0812 | 0.7010 | 0.0 | 0.0252 | 0.9338 | 0.1746 | 0.6301 | 0.3853 | 0.5186 | nan | 0.3050 | 0.6162 | 0.5729 | 0.0 | 0.9552 | 0.8382 | 0.9841 | 0.0090 | 0.2169 | 0.4890 | 0.0 | nan | 0.7616 | 0.8774 | 0.6929 | 0.7521 | 0.3965 | nan | 0.5919 | 0.6575 | 0.0306 | 0.8647 | 0.1198 | 0.0 | 0.4507 | 0.0812 | 0.5933 | 0.0 | 0.0249 | 0.7696 | 0.1567 | 0.5370 | 0.3266 | 0.3832 | nan | 0.2577 | 0.5076 | 0.3988 | 0.0 | 0.8866 | 0.7574 | 0.9530 | 0.0069 | 0.1191 | 0.3816 | 0.0 | | 0.0568 | 46.7 | 9340 | 0.7314 | 0.4166 | 0.4932 | 0.8782 | nan | 0.8463 | 0.9563 | 0.8034 | 0.8480 | 0.5337 | nan | 0.7172 | 0.8189 | 0.0663 | 0.9556 | 0.1397 | 0.0 | 0.6488 | 0.0903 | 0.7265 | 0.0 | 0.0278 | 0.9337 | 0.1693 | 0.6203 | 0.3956 | 0.5112 | nan | 0.2894 | 0.6188 | 0.5791 | 0.0 | 0.9555 | 0.8414 | 0.9835 | 0.0048 | 0.2222 | 0.4803 | 0.0 | nan | 0.7618 | 0.8772 | 0.6908 | 0.7524 | 0.3994 | nan | 0.5930 | 0.6552 | 0.0414 | 0.8665 | 0.1207 | 0.0 | 0.4390 | 0.0903 | 0.5985 | 0.0 | 0.0274 | 0.7683 | 0.1518 | 0.5324 | 0.3318 | 0.3791 | nan | 0.2433 | 0.5085 | 0.3998 | 0.0 | 0.8870 | 0.7603 | 0.9532 | 0.0039 | 0.1184 | 0.3791 | 0.0 | | 0.0611 | 46.8 | 9360 | 0.7297 | 0.4186 | 0.4960 | 0.8783 | nan | 0.8479 | 0.9551 | 0.8031 | 0.8471 | 0.5461 | nan | 0.7064 | 0.8101 | 0.0863 | 0.9576 | 0.1442 | 0.0 | 
0.6391 | 0.0892 | 0.7285 | 0.0 | 0.0249 | 0.9313 | 0.1826 | 0.6372 | 0.3967 | 0.5232 | nan | 0.3067 | 0.6147 | 0.5874 | 0.0 | 0.9550 | 0.8416 | 0.9838 | 0.0089 | 0.2377 | 0.4788 | 0.0 | nan | 0.7624 | 0.8777 | 0.6896 | 0.7528 | 0.3999 | nan | 0.5928 | 0.6580 | 0.0516 | 0.8651 | 0.1244 | 0.0 | 0.4500 | 0.0892 | 0.5927 | 0.0 | 0.0246 | 0.7715 | 0.1615 | 0.5397 | 0.3325 | 0.3890 | nan | 0.2582 | 0.5085 | 0.3975 | 0.0 | 0.8872 | 0.7598 | 0.9531 | 0.0068 | 0.1216 | 0.3773 | 0.0 | | 0.1038 | 46.9 | 9380 | 0.7274 | 0.4177 | 0.4942 | 0.8783 | nan | 0.8465 | 0.9568 | 0.7992 | 0.8453 | 0.5452 | nan | 0.7094 | 0.8165 | 0.0852 | 0.9587 | 0.1454 | 0.0 | 0.6590 | 0.0907 | 0.7254 | 0.0 | 0.0270 | 0.9278 | 0.1996 | 0.6471 | 0.3748 | 0.4610 | nan | 0.2958 | 0.6211 | 0.5866 | 0.0 | 0.9559 | 0.8365 | 0.9833 | 0.0078 | 0.2260 | 0.4811 | 0.0 | nan | 0.7621 | 0.8770 | 0.6919 | 0.7533 | 0.4018 | nan | 0.5925 | 0.6552 | 0.0508 | 0.8641 | 0.1258 | 0.0 | 0.4486 | 0.0907 | 0.5881 | 0.0 | 0.0267 | 0.7738 | 0.1758 | 0.5457 | 0.3221 | 0.3668 | nan | 0.2496 | 0.5085 | 0.3966 | 0.0 | 0.8863 | 0.7590 | 0.9533 | 0.0059 | 0.1182 | 0.3766 | 0.0 | | 0.0721 | 47.0 | 9400 | 0.7309 | 0.4177 | 0.4954 | 0.8785 | nan | 0.8496 | 0.9579 | 0.7981 | 0.8473 | 0.5252 | nan | 0.7008 | 0.8264 | 0.0810 | 0.9573 | 0.1389 | 0.0 | 0.6520 | 0.0924 | 0.7349 | 0.0 | 0.0209 | 0.9318 | 0.1881 | 0.6388 | 0.3893 | 0.5309 | nan | 0.2970 | 0.6226 | 0.5797 | 0.0 | 0.9539 | 0.8383 | 0.9824 | 0.0103 | 0.2231 | 0.4843 | 0.0 | nan | 0.7617 | 0.8773 | 0.6923 | 0.7515 | 0.4019 | nan | 0.5902 | 0.6515 | 0.0488 | 0.8656 | 0.1199 | 0.0 | 0.4441 | 0.0924 | 0.5927 | 0.0 | 0.0206 | 0.7722 | 0.1676 | 0.5419 | 0.3287 | 0.3841 | nan | 0.2496 | 0.5102 | 0.3969 | 0.0 | 0.8872 | 0.7594 | 0.9538 | 0.0076 | 0.1178 | 0.3788 | 0.0 | | 0.0765 | 47.1 | 9420 | 0.7436 | 0.4173 | 0.4947 | 0.8780 | nan | 0.8395 | 0.9568 | 0.7953 | 0.8553 | 0.5420 | nan | 0.7069 | 0.8268 | 0.0823 | 0.9524 | 0.1397 | 0.0 | 0.6590 | 0.0776 | 0.7365 | 0.0 | 0.0291 | 0.9325 | 
0.1797 | 0.6467 | 0.3971 | 0.5116 | nan | 0.2776 | 0.6242 | 0.5679 | 0.0 | 0.9525 | 0.8448 | 0.9835 | 0.0085 | 0.2178 | 0.4879 | 0.0 | nan | 0.7576 | 0.8766 | 0.6852 | 0.7475 | 0.4050 | nan | 0.5935 | 0.6497 | 0.0490 | 0.8680 | 0.1209 | 0.0 | 0.4457 | 0.0776 | 0.5973 | 0.0 | 0.0285 | 0.7724 | 0.1608 | 0.5445 | 0.3343 | 0.3842 | nan | 0.2379 | 0.5114 | 0.3974 | 0.0 | 0.8881 | 0.7620 | 0.9534 | 0.0065 | 0.1186 | 0.3802 | 0.0 | | 0.0778 | 47.2 | 9440 | 0.7376 | 0.4153 | 0.4919 | 0.8779 | nan | 0.8420 | 0.9571 | 0.7947 | 0.8531 | 0.5307 | nan | 0.7137 | 0.8216 | 0.0704 | 0.9525 | 0.1361 | 0.0 | 0.6425 | 0.0739 | 0.7302 | 0.0 | 0.0221 | 0.9352 | 0.1703 | 0.6248 | 0.3884 | 0.5102 | nan | 0.2715 | 0.6262 | 0.5748 | 0.0 | 0.9541 | 0.8482 | 0.9821 | 0.0113 | 0.2260 | 0.4762 | 0.0 | nan | 0.7585 | 0.8773 | 0.6851 | 0.7481 | 0.4032 | nan | 0.5925 | 0.6512 | 0.0433 | 0.8679 | 0.1177 | 0.0 | 0.4427 | 0.0739 | 0.5972 | 0.0 | 0.0217 | 0.7690 | 0.1527 | 0.5360 | 0.3284 | 0.3733 | nan | 0.2334 | 0.5107 | 0.3990 | 0.0 | 0.8876 | 0.7631 | 0.9538 | 0.0084 | 0.1165 | 0.3760 | 0.0 | | 0.0621 | 47.3 | 9460 | 0.7279 | 0.4137 | 0.4915 | 0.8774 | nan | 0.8375 | 0.9566 | 0.8020 | 0.8541 | 0.5428 | nan | 0.7096 | 0.8224 | 0.0794 | 0.9576 | 0.1345 | 0.0 | 0.6602 | 0.0616 | 0.7244 | 0.0 | 0.0222 | 0.9324 | 0.1667 | 0.6315 | 0.3798 | 0.5007 | nan | 0.2676 | 0.6308 | 0.5884 | 0.0 | 0.9567 | 0.8354 | 0.9810 | 0.0079 | 0.2086 | 0.4761 | 0.0 | nan | 0.7575 | 0.8770 | 0.6783 | 0.7480 | 0.4013 | nan | 0.5942 | 0.6488 | 0.0470 | 0.8652 | 0.1168 | 0.0 | 0.4413 | 0.0616 | 0.5908 | 0.0 | 0.0220 | 0.7701 | 0.1501 | 0.5382 | 0.3234 | 0.3744 | nan | 0.2276 | 0.5094 | 0.3971 | 0.0 | 0.8862 | 0.7601 | 0.9539 | 0.0060 | 0.1150 | 0.3776 | 0.0 | | 0.0558 | 47.4 | 9480 | 0.7360 | 0.4143 | 0.4952 | 0.8770 | nan | 0.8385 | 0.9552 | 0.8044 | 0.8521 | 0.5415 | nan | 0.7143 | 0.8293 | 0.0805 | 0.9582 | 0.1368 | 0.0 | 0.6804 | 0.0656 | 0.7297 | 0.0 | 0.0222 | 0.9316 | 0.1784 | 0.6315 | 0.3912 | 0.5249 | nan | 0.2854 | 
0.6190 | 0.5898 | 0.0 | 0.9560 | 0.8343 | 0.9832 | 0.0095 | 0.2359 | 0.4682 | 0.0 | nan | 0.7570 | 0.8768 | 0.6773 | 0.7489 | 0.3980 | nan | 0.5937 | 0.6465 | 0.0482 | 0.8650 | 0.1187 | 0.0 | 0.4391 | 0.0656 | 0.5893 | 0.0 | 0.0220 | 0.7709 | 0.1589 | 0.5394 | 0.3281 | 0.3818 | nan | 0.2384 | 0.5090 | 0.3943 | 0.0 | 0.8860 | 0.7567 | 0.9533 | 0.0072 | 0.1149 | 0.3720 | 0.0 | | 0.0726 | 47.5 | 9500 | 0.7314 | 0.4160 | 0.4960 | 0.8772 | nan | 0.8422 | 0.9544 | 0.7934 | 0.8513 | 0.5438 | nan | 0.7173 | 0.8243 | 0.0799 | 0.9579 | 0.1533 | 0.0 | 0.7087 | 0.0749 | 0.7255 | 0.0 | 0.0160 | 0.9311 | 0.2048 | 0.6372 | 0.3851 | 0.5228 | nan | 0.2714 | 0.6266 | 0.5853 | 0.0 | 0.9553 | 0.8319 | 0.9832 | 0.0115 | 0.2038 | 0.4784 | 0.0 | nan | 0.7578 | 0.8768 | 0.6892 | 0.7488 | 0.3959 | nan | 0.5926 | 0.6495 | 0.0482 | 0.8648 | 0.1313 | 0.0 | 0.4538 | 0.0749 | 0.5859 | 0.0 | 0.0158 | 0.7718 | 0.1794 | 0.5413 | 0.3257 | 0.3844 | nan | 0.2306 | 0.5098 | 0.3921 | 0.0 | 0.8859 | 0.7547 | 0.9535 | 0.0087 | 0.1108 | 0.3771 | 0.0 | | 0.0779 | 47.6 | 9520 | 0.7363 | 0.4140 | 0.4934 | 0.8770 | nan | 0.8399 | 0.9569 | 0.7949 | 0.8506 | 0.5383 | nan | 0.7144 | 0.8275 | 0.0841 | 0.9571 | 0.1461 | 0.0 | 0.6956 | 0.0652 | 0.7286 | 0.0 | 0.0193 | 0.9326 | 0.1878 | 0.6376 | 0.3843 | 0.5204 | nan | 0.2447 | 0.6152 | 0.5780 | 0.0 | 0.9556 | 0.8259 | 0.9832 | 0.0079 | 0.2296 | 0.4659 | 0.0 | nan | 0.7575 | 0.8757 | 0.6880 | 0.7494 | 0.3984 | nan | 0.5922 | 0.6447 | 0.0499 | 0.8652 | 0.1262 | 0.0 | 0.4449 | 0.0652 | 0.5913 | 0.0 | 0.0191 | 0.7713 | 0.1667 | 0.5406 | 0.3256 | 0.3877 | nan | 0.2083 | 0.5086 | 0.3915 | 0.0 | 0.8857 | 0.7537 | 0.9534 | 0.0062 | 0.1125 | 0.3698 | 0.0 | | 0.0684 | 47.7 | 9540 | 0.7313 | 0.4147 | 0.4948 | 0.8773 | nan | 0.8394 | 0.9563 | 0.7983 | 0.8544 | 0.5386 | nan | 0.7088 | 0.8248 | 0.0835 | 0.9565 | 0.1451 | 0.0 | 0.6814 | 0.0755 | 0.7316 | 0.0 | 0.0193 | 0.9340 | 0.1910 | 0.6379 | 0.3899 | 0.5246 | nan | 0.2466 | 0.6246 | 0.5960 | 0.0 | 0.9549 | 0.8363 | 0.9820 | 
0.0070 | 0.2391 | 0.4575 | 0.0 | nan | 0.7574 | 0.8760 | 0.6850 | 0.7473 | 0.3987 | nan | 0.5922 | 0.6459 | 0.0498 | 0.8657 | 0.1249 | 0.0 | 0.4395 | 0.0755 | 0.5940 | 0.0 | 0.0191 | 0.7708 | 0.1686 | 0.5411 | 0.3285 | 0.3841 | nan | 0.2108 | 0.5105 | 0.3936 | 0.0 | 0.8873 | 0.7602 | 0.9538 | 0.0055 | 0.1153 | 0.3687 | 0.0 | | 0.0506 | 47.8 | 9560 | 0.7419 | 0.4159 | 0.4964 | 0.8773 | nan | 0.8391 | 0.9555 | 0.7997 | 0.8506 | 0.5418 | nan | 0.7197 | 0.8267 | 0.0754 | 0.9557 | 0.1434 | 0.0 | 0.6838 | 0.0789 | 0.7314 | 0.0 | 0.0211 | 0.9315 | 0.2033 | 0.6406 | 0.3975 | 0.5284 | nan | 0.2621 | 0.6212 | 0.5835 | 0.0 | 0.9555 | 0.8379 | 0.9829 | 0.0084 | 0.2466 | 0.4632 | 0.0 | nan | 0.7574 | 0.8760 | 0.6866 | 0.7498 | 0.3988 | nan | 0.5916 | 0.6472 | 0.0461 | 0.8665 | 0.1237 | 0.0 | 0.4367 | 0.0789 | 0.5946 | 0.0 | 0.0208 | 0.7718 | 0.1779 | 0.5414 | 0.3328 | 0.3886 | nan | 0.2215 | 0.5104 | 0.3945 | 0.0 | 0.8871 | 0.7591 | 0.9536 | 0.0066 | 0.1176 | 0.3700 | 0.0 | | 0.088 | 47.9 | 9580 | 0.7372 | 0.4154 | 0.4948 | 0.8770 | nan | 0.8372 | 0.9557 | 0.7968 | 0.8555 | 0.5386 | nan | 0.7190 | 0.8235 | 0.0752 | 0.9572 | 0.1330 | 0.0 | 0.6780 | 0.0705 | 0.7250 | 0.0 | 0.0221 | 0.9338 | 0.2007 | 0.6348 | 0.3928 | 0.5277 | nan | 0.2733 | 0.6200 | 0.5778 | 0.0 | 0.9557 | 0.8324 | 0.9817 | 0.0100 | 0.2410 | 0.4647 | 0.0 | nan | 0.7565 | 0.8758 | 0.6889 | 0.7478 | 0.3999 | nan | 0.5922 | 0.6489 | 0.0459 | 0.8654 | 0.1161 | 0.0 | 0.4375 | 0.0705 | 0.5903 | 0.0 | 0.0219 | 0.7702 | 0.1762 | 0.5390 | 0.3303 | 0.3902 | nan | 0.2294 | 0.5097 | 0.3954 | 0.0 | 0.8866 | 0.7577 | 0.9539 | 0.0076 | 0.1170 | 0.3704 | 0.0 | | 0.0708 | 48.0 | 9600 | 0.7499 | 0.4138 | 0.4932 | 0.8768 | nan | 0.8380 | 0.9567 | 0.8049 | 0.8498 | 0.5414 | nan | 0.7080 | 0.8187 | 0.0876 | 0.9575 | 0.1376 | 0.0 | 0.6679 | 0.0770 | 0.7211 | 0.0 | 0.0208 | 0.9348 | 0.1811 | 0.6242 | 0.3758 | 0.5263 | nan | 0.2632 | 0.6166 | 0.5870 | 0.0 | 0.9565 | 0.8334 | 0.9838 | 0.0097 | 0.2509 | 0.4512 | 0.0 | nan | 0.7566 | 
0.8760 | 0.6799 | 0.7510 | 0.4006 | nan | 0.5910 | 0.6498 | 0.0504 | 0.8651 | 0.1189 | 0.0 | 0.4321 | 0.0770 | 0.5895 | 0.0 | 0.0206 | 0.7681 | 0.1609 | 0.5345 | 0.3214 | 0.3879 | nan | 0.2187 | 0.5091 | 0.3942 | 0.0 | 0.8860 | 0.7582 | 0.9531 | 0.0074 | 0.1174 | 0.3658 | 0.0 | | 0.0619 | 48.1 | 9620 | 0.7347 | 0.4134 | 0.4906 | 0.8774 | nan | 0.8400 | 0.9564 | 0.7991 | 0.8501 | 0.5427 | nan | 0.7101 | 0.8264 | 0.0870 | 0.9591 | 0.1333 | 0.0 | 0.6645 | 0.0631 | 0.7117 | 0.0 | 0.0173 | 0.9342 | 0.1844 | 0.6364 | 0.3685 | 0.4912 | nan | 0.2683 | 0.6247 | 0.5788 | 0.0 | 0.9545 | 0.8391 | 0.9843 | 0.0087 | 0.2019 | 0.4636 | 0.0 | nan | 0.7576 | 0.8762 | 0.6888 | 0.7512 | 0.4006 | nan | 0.5913 | 0.6446 | 0.0503 | 0.8633 | 0.1157 | 0.0 | 0.4366 | 0.0631 | 0.5840 | 0.0 | 0.0172 | 0.7691 | 0.1639 | 0.5377 | 0.3190 | 0.3767 | nan | 0.2234 | 0.5091 | 0.3951 | 0.0 | 0.8869 | 0.7603 | 0.9529 | 0.0067 | 0.1134 | 0.3731 | 0.0 | | 0.0679 | 48.2 | 9640 | 0.7443 | 0.4146 | 0.4932 | 0.8774 | nan | 0.8398 | 0.9563 | 0.8032 | 0.8523 | 0.5367 | nan | 0.7114 | 0.8243 | 0.0852 | 0.9570 | 0.1333 | 0.0 | 0.6632 | 0.0695 | 0.7156 | 0.0 | 0.0205 | 0.9321 | 0.1847 | 0.6359 | 0.3747 | 0.5341 | nan | 0.2829 | 0.6160 | 0.5802 | 0.0 | 0.9557 | 0.8381 | 0.9843 | 0.0100 | 0.2119 | 0.4738 | 0.0 | nan | 0.7574 | 0.8765 | 0.6845 | 0.7502 | 0.4017 | nan | 0.5913 | 0.6466 | 0.0493 | 0.8652 | 0.1155 | 0.0 | 0.4376 | 0.0695 | 0.5849 | 0.0 | 0.0204 | 0.7699 | 0.1649 | 0.5374 | 0.3215 | 0.3882 | nan | 0.2340 | 0.5089 | 0.3947 | 0.0 | 0.8867 | 0.7605 | 0.9530 | 0.0075 | 0.1149 | 0.3748 | 0.0 | | 0.0938 | 48.3 | 9660 | 0.7429 | 0.4140 | 0.4931 | 0.8774 | nan | 0.8406 | 0.9563 | 0.8019 | 0.8513 | 0.5320 | nan | 0.7138 | 0.8390 | 0.0770 | 0.9569 | 0.1339 | 0.0 | 0.6531 | 0.0543 | 0.7198 | 0.0 | 0.0187 | 0.9321 | 0.1847 | 0.6366 | 0.3761 | 0.5390 | nan | 0.2840 | 0.6273 | 0.5741 | 0.0 | 0.9549 | 0.8357 | 0.9836 | 0.0111 | 0.2154 | 0.4776 | 0.0 | nan | 0.7576 | 0.8767 | 0.6863 | 0.7500 | 0.4008 | nan | 0.5908 | 
0.6389 | 0.0470 | 0.8657 | 0.1162 | 0.0 | 0.4395 | 0.0543 | 0.5858 | 0.0 | 0.0185 | 0.7704 | 0.1651 | 0.5387 | 0.3217 | 0.3894 | nan | 0.2368 | 0.5099 | 0.3934 | 0.0 | 0.8866 | 0.7584 | 0.9532 | 0.0083 | 0.1141 | 0.3750 | 0.0 | | 0.0591 | 48.4 | 9680 | 0.7427 | 0.4138 | 0.4931 | 0.8771 | nan | 0.8375 | 0.9563 | 0.8043 | 0.8503 | 0.5410 | nan | 0.7166 | 0.8399 | 0.0708 | 0.9560 | 0.1297 | 0.0 | 0.6557 | 0.0510 | 0.7179 | 0.0 | 0.0270 | 0.9321 | 0.1910 | 0.6275 | 0.3767 | 0.5428 | nan | 0.2835 | 0.6128 | 0.5744 | 0.0 | 0.9556 | 0.8359 | 0.9835 | 0.0084 | 0.2141 | 0.4868 | 0.0 | nan | 0.7571 | 0.8764 | 0.6832 | 0.7505 | 0.4016 | nan | 0.5924 | 0.6416 | 0.0450 | 0.8665 | 0.1128 | 0.0 | 0.4379 | 0.0510 | 0.5873 | 0.0 | 0.0268 | 0.7699 | 0.1697 | 0.5351 | 0.3212 | 0.3897 | nan | 0.2340 | 0.5080 | 0.3918 | 0.0 | 0.8863 | 0.7582 | 0.9532 | 0.0065 | 0.1118 | 0.3760 | 0.0 | | 0.0523 | 48.5 | 9700 | 0.7421 | 0.4144 | 0.4949 | 0.8775 | nan | 0.8384 | 0.9572 | 0.8026 | 0.8506 | 0.5378 | nan | 0.7151 | 0.8431 | 0.0752 | 0.9573 | 0.1406 | 0.0 | 0.6618 | 0.0523 | 0.7143 | 0.0 | 0.0241 | 0.9290 | 0.1937 | 0.6402 | 0.3804 | 0.5604 | nan | 0.2732 | 0.6196 | 0.5871 | 0.0 | 0.9550 | 0.8367 | 0.9846 | 0.0083 | 0.2182 | 0.4810 | 0.0 | nan | 0.7572 | 0.8761 | 0.6848 | 0.7501 | 0.4023 | nan | 0.5922 | 0.6393 | 0.0462 | 0.8657 | 0.1215 | 0.0 | 0.4336 | 0.0523 | 0.5862 | 0.0 | 0.0239 | 0.7723 | 0.1720 | 0.5407 | 0.3235 | 0.3985 | nan | 0.2255 | 0.5089 | 0.3921 | 0.0 | 0.8870 | 0.7597 | 0.9527 | 0.0065 | 0.1145 | 0.3759 | 0.0 | | 0.0637 | 48.6 | 9720 | 0.7331 | 0.4134 | 0.4923 | 0.8774 | nan | 0.8411 | 0.9564 | 0.8016 | 0.8486 | 0.5389 | nan | 0.7131 | 0.8436 | 0.0734 | 0.9559 | 0.1355 | 0.0 | 0.6678 | 0.0479 | 0.7143 | 0.0 | 0.0212 | 0.9321 | 0.1742 | 0.6388 | 0.3770 | 0.5414 | nan | 0.2821 | 0.6092 | 0.5614 | 0.0 | 0.9559 | 0.8377 | 0.9840 | 0.0100 | 0.2209 | 0.4709 | 0.0 | nan | 0.7578 | 0.8764 | 0.6842 | 0.7507 | 0.4014 | nan | 0.5907 | 0.6393 | 0.0455 | 0.8661 | 0.1174 | 0.0 | 0.4289 | 
0.0479 | 0.5867 | 0.0 | 0.0211 | 0.7708 | 0.1560 | 0.5389 | 0.3215 | 0.3930 | nan | 0.2359 | 0.5076 | 0.3937 | 0.0 | 0.8865 | 0.7603 | 0.9530 | 0.0076 | 0.1152 | 0.3744 | 0.0 | | 0.0684 | 48.7 | 9740 | 0.7413 | 0.4147 | 0.4926 | 0.8777 | nan | 0.8393 | 0.9575 | 0.7960 | 0.8495 | 0.5399 | nan | 0.7155 | 0.8305 | 0.0608 | 0.9568 | 0.1343 | 0.0 | 0.6793 | 0.0566 | 0.7062 | 0.0 | 0.0236 | 0.9330 | 0.1737 | 0.6419 | 0.3874 | 0.5362 | nan | 0.2728 | 0.6194 | 0.5792 | 0.0 | 0.9542 | 0.8375 | 0.9835 | 0.0097 | 0.2096 | 0.4797 | 0.0 | nan | 0.7577 | 0.8758 | 0.6888 | 0.7504 | 0.4032 | nan | 0.5920 | 0.6493 | 0.0382 | 0.8655 | 0.1165 | 0.0 | 0.4411 | 0.0566 | 0.5861 | 0.0 | 0.0234 | 0.7713 | 0.1561 | 0.5405 | 0.3279 | 0.3945 | nan | 0.2280 | 0.5106 | 0.3945 | 0.0 | 0.8875 | 0.7615 | 0.9533 | 0.0074 | 0.1156 | 0.3780 | 0.0 | | 0.0569 | 48.8 | 9760 | 0.7418 | 0.4147 | 0.4920 | 0.8776 | nan | 0.8406 | 0.9572 | 0.7979 | 0.8524 | 0.5363 | nan | 0.7072 | 0.8246 | 0.0664 | 0.9582 | 0.1298 | 0.0 | 0.6691 | 0.0599 | 0.7100 | 0.0 | 0.0233 | 0.9336 | 0.1790 | 0.6340 | 0.3755 | 0.5214 | nan | 0.2883 | 0.6211 | 0.5849 | 0.0 | 0.9546 | 0.8381 | 0.9827 | 0.0090 | 0.2097 | 0.4801 | 0.0 | nan | 0.7578 | 0.8763 | 0.6864 | 0.7495 | 0.4025 | nan | 0.5912 | 0.6513 | 0.0409 | 0.8647 | 0.1129 | 0.0 | 0.4435 | 0.0599 | 0.5852 | 0.0 | 0.0232 | 0.7699 | 0.1597 | 0.5369 | 0.3219 | 0.3894 | nan | 0.2395 | 0.5099 | 0.3934 | 0.0 | 0.8873 | 0.7619 | 0.9536 | 0.0069 | 0.1146 | 0.3783 | 0.0 | | 0.0643 | 48.9 | 9780 | 0.7366 | 0.4135 | 0.4904 | 0.8775 | nan | 0.8419 | 0.9566 | 0.7893 | 0.8522 | 0.5418 | nan | 0.7185 | 0.8334 | 0.0726 | 0.9567 | 0.1313 | 0.0 | 0.6664 | 0.0590 | 0.7157 | 0.0 | 0.0185 | 0.9318 | 0.1687 | 0.6233 | 0.3777 | 0.5091 | nan | 0.2657 | 0.6215 | 0.5780 | 0.0 | 0.9570 | 0.8320 | 0.9825 | 0.0083 | 0.1973 | 0.4853 | 0.0 | nan | 0.7581 | 0.8762 | 0.6937 | 0.7500 | 0.4024 | nan | 0.5925 | 0.6464 | 0.0436 | 0.8658 | 0.1139 | 0.0 | 0.4394 | 0.0590 | 0.5873 | 0.0 | 0.0184 | 0.7697 | 0.1521 | 
0.5328 | 0.3214 | 0.3823 | nan | 0.2255 | 0.5103 | 0.3930 | 0.0 | 0.8863 | 0.7600 | 0.9536 | 0.0065 | 0.1109 | 0.3795 | 0.0 | | 0.0618 | 49.0 | 9800 | 0.7324 | 0.4133 | 0.4908 | 0.8776 | nan | 0.8385 | 0.9577 | 0.8009 | 0.8520 | 0.5329 | nan | 0.7132 | 0.8277 | 0.0697 | 0.9596 | 0.1262 | 0.0 | 0.6673 | 0.0580 | 0.7046 | 0.0 | 0.0220 | 0.9325 | 0.1735 | 0.6305 | 0.3849 | 0.5274 | nan | 0.2633 | 0.6253 | 0.5768 | 0.0 | 0.9547 | 0.8406 | 0.9843 | 0.0085 | 0.1999 | 0.4747 | 0.0 | nan | 0.7577 | 0.8761 | 0.6846 | 0.7503 | 0.4020 | nan | 0.5919 | 0.6485 | 0.0424 | 0.8634 | 0.1100 | 0.0 | 0.4368 | 0.0580 | 0.5832 | 0.0 | 0.0219 | 0.7696 | 0.1555 | 0.5351 | 0.3255 | 0.3876 | nan | 0.2220 | 0.5101 | 0.3954 | 0.0 | 0.8875 | 0.7621 | 0.9528 | 0.0066 | 0.1131 | 0.3773 | 0.0 | | 0.0722 | 49.1 | 9820 | 0.7347 | 0.4152 | 0.4928 | 0.8777 | nan | 0.8403 | 0.9565 | 0.7873 | 0.8557 | 0.5384 | nan | 0.7119 | 0.8282 | 0.0710 | 0.9564 | 0.1294 | 0.0 | 0.6734 | 0.0744 | 0.7173 | 0.0 | 0.0147 | 0.9324 | 0.1844 | 0.6363 | 0.3878 | 0.5176 | nan | 0.2736 | 0.6280 | 0.5896 | 0.0 | 0.9558 | 0.8401 | 0.9836 | 0.0111 | 0.2020 | 0.4727 | 0.0 | nan | 0.7581 | 0.8762 | 0.6974 | 0.7481 | 0.4016 | nan | 0.5941 | 0.6497 | 0.0431 | 0.8656 | 0.1120 | 0.0 | 0.4399 | 0.0744 | 0.5860 | 0.0 | 0.0146 | 0.7702 | 0.1637 | 0.5370 | 0.3277 | 0.3850 | nan | 0.2310 | 0.5107 | 0.3964 | 0.0 | 0.8870 | 0.7609 | 0.9530 | 0.0084 | 0.1158 | 0.3774 | 0.0 | | 0.0848 | 49.2 | 9840 | 0.7302 | 0.4154 | 0.4924 | 0.8779 | nan | 0.8416 | 0.9570 | 0.7898 | 0.8545 | 0.5362 | nan | 0.7096 | 0.8272 | 0.0683 | 0.9567 | 0.1272 | 0.0 | 0.6605 | 0.0652 | 0.7112 | 0.0 | 0.0269 | 0.9322 | 0.1873 | 0.6383 | 0.3877 | 0.5232 | nan | 0.2786 | 0.6276 | 0.5780 | 0.0 | 0.9544 | 0.8408 | 0.9845 | 0.0039 | 0.2122 | 0.4777 | 0.0 | nan | 0.7582 | 0.8762 | 0.6952 | 0.7485 | 0.4011 | nan | 0.5941 | 0.6516 | 0.0409 | 0.8653 | 0.1102 | 0.0 | 0.4407 | 0.0652 | 0.5879 | 0.0 | 0.0267 | 0.7704 | 0.1661 | 0.5384 | 0.3281 | 0.3896 | nan | 0.2317 | 0.5104 | 
0.3957 | 0.0 | 0.8876 | 0.7614 | 0.9526 | 0.0032 | 0.1171 | 0.3778 | 0.0 | | 0.0643 | 49.3 | 9860 | 0.7355 | 0.4159 | 0.4939 | 0.8777 | nan | 0.8423 | 0.9557 | 0.7939 | 0.8495 | 0.5429 | nan | 0.7175 | 0.8320 | 0.0748 | 0.9574 | 0.1283 | 0.0 | 0.6710 | 0.0728 | 0.7084 | 0.0 | 0.0211 | 0.9312 | 0.1863 | 0.6321 | 0.3859 | 0.5260 | nan | 0.2964 | 0.6212 | 0.5756 | 0.0 | 0.9555 | 0.8396 | 0.9838 | 0.0094 | 0.2111 | 0.4841 | 0.0 | nan | 0.7584 | 0.8765 | 0.6940 | 0.7514 | 0.4006 | nan | 0.5930 | 0.6498 | 0.0438 | 0.8652 | 0.1112 | 0.0 | 0.4418 | 0.0728 | 0.5869 | 0.0 | 0.0209 | 0.7707 | 0.1661 | 0.5366 | 0.3264 | 0.3899 | nan | 0.2456 | 0.5099 | 0.3942 | 0.0 | 0.8869 | 0.7603 | 0.9532 | 0.0072 | 0.1160 | 0.3788 | 0.0 | | 0.0671 | 49.4 | 9880 | 0.7447 | 0.4160 | 0.4946 | 0.8777 | nan | 0.8400 | 0.9564 | 0.8003 | 0.8508 | 0.5393 | nan | 0.7193 | 0.8202 | 0.0841 | 0.9561 | 0.1303 | 0.0 | 0.6808 | 0.0803 | 0.7160 | 0.0 | 0.0180 | 0.9329 | 0.1805 | 0.6364 | 0.3888 | 0.5383 | nan | 0.2899 | 0.6208 | 0.5842 | 0.0 | 0.9553 | 0.8362 | 0.9844 | 0.0111 | 0.1967 | 0.4782 | 0.0 | nan | 0.7581 | 0.8765 | 0.6879 | 0.7514 | 0.4014 | nan | 0.5924 | 0.6533 | 0.0488 | 0.8656 | 0.1124 | 0.0 | 0.4460 | 0.0803 | 0.5854 | 0.0 | 0.0179 | 0.7700 | 0.1612 | 0.5372 | 0.3270 | 0.3889 | nan | 0.2425 | 0.5099 | 0.3944 | 0.0 | 0.8867 | 0.7602 | 0.9529 | 0.0084 | 0.1167 | 0.3799 | 0.0 | | 0.0706 | 49.5 | 9900 | 0.7401 | 0.4150 | 0.4933 | 0.8775 | nan | 0.8401 | 0.9563 | 0.7972 | 0.8519 | 0.5382 | nan | 0.7231 | 0.8251 | 0.0681 | 0.9572 | 0.1265 | 0.0 | 0.6689 | 0.0728 | 0.7077 | 0.0 | 0.0200 | 0.9336 | 0.1805 | 0.6236 | 0.3934 | 0.5249 | nan | 0.2937 | 0.6237 | 0.5834 | 0.0 | 0.9554 | 0.8395 | 0.9831 | 0.0091 | 0.2201 | 0.4694 | 0.0 | nan | 0.7584 | 0.8766 | 0.6901 | 0.7510 | 0.4012 | nan | 0.5928 | 0.6518 | 0.0408 | 0.8652 | 0.1098 | 0.0 | 0.4336 | 0.0728 | 0.5861 | 0.0 | 0.0199 | 0.7688 | 0.1606 | 0.5332 | 0.3284 | 0.3879 | nan | 0.2446 | 0.5098 | 0.3961 | 0.0 | 0.8869 | 0.7609 | 0.9535 | 0.0070 | 
0.1169 | 0.3748 | 0.0 | | 0.1163 | 49.6 | 9920 | 0.7314 | 0.4154 | 0.4931 | 0.8776 | nan | 0.8406 | 0.9555 | 0.7982 | 0.8534 | 0.5451 | nan | 0.7131 | 0.8257 | 0.0790 | 0.9569 | 0.1210 | 0.0 | 0.6700 | 0.0738 | 0.7103 | 0.0 | 0.0224 | 0.9322 | 0.1839 | 0.6400 | 0.3738 | 0.5067 | nan | 0.3024 | 0.6231 | 0.5802 | 0.0 | 0.9561 | 0.8347 | 0.9831 | 0.0082 | 0.2098 | 0.4802 | 0.0 | nan | 0.7584 | 0.8768 | 0.6893 | 0.7502 | 0.4007 | nan | 0.5928 | 0.6520 | 0.0460 | 0.8651 | 0.1058 | 0.0 | 0.4382 | 0.0738 | 0.5842 | 0.0 | 0.0223 | 0.7704 | 0.1638 | 0.5387 | 0.3210 | 0.3845 | nan | 0.2505 | 0.5099 | 0.3950 | 0.0 | 0.8865 | 0.7604 | 0.9534 | 0.0063 | 0.1171 | 0.3788 | 0.0 | | 0.0655 | 49.7 | 9940 | 0.7432 | 0.4168 | 0.4966 | 0.8777 | nan | 0.8411 | 0.9569 | 0.7993 | 0.8503 | 0.5401 | nan | 0.7121 | 0.8312 | 0.0845 | 0.9536 | 0.1329 | 0.0 | 0.6689 | 0.0911 | 0.7217 | 0.0 | 0.0183 | 0.9332 | 0.1866 | 0.6354 | 0.3938 | 0.5485 | nan | 0.2954 | 0.6268 | 0.5871 | 0.0 | 0.9534 | 0.8387 | 0.9836 | 0.0106 | 0.2143 | 0.4811 | 0.0 | nan | 0.7581 | 0.8764 | 0.6886 | 0.7514 | 0.4024 | nan | 0.5921 | 0.6494 | 0.0486 | 0.8673 | 0.1144 | 0.0 | 0.4376 | 0.0911 | 0.5913 | 0.0 | 0.0181 | 0.7701 | 0.1660 | 0.5373 | 0.3309 | 0.3936 | nan | 0.2439 | 0.5109 | 0.3925 | 0.0 | 0.8879 | 0.7621 | 0.9533 | 0.0080 | 0.1168 | 0.3782 | 0.0 | | 0.0654 | 49.8 | 9960 | 0.7503 | 0.4162 | 0.4953 | 0.8776 | nan | 0.8410 | 0.9559 | 0.8005 | 0.8522 | 0.5450 | nan | 0.7082 | 0.8301 | 0.0806 | 0.9556 | 0.1303 | 0.0 | 0.6670 | 0.0746 | 0.7186 | 0.0 | 0.0215 | 0.9323 | 0.1870 | 0.6362 | 0.3906 | 0.5446 | nan | 0.3029 | 0.6134 | 0.5720 | 0.0 | 0.9558 | 0.8356 | 0.9824 | 0.0106 | 0.2257 | 0.4792 | 0.0 | nan | 0.7582 | 0.8768 | 0.6872 | 0.7508 | 0.4013 | nan | 0.5918 | 0.6502 | 0.0475 | 0.8662 | 0.1127 | 0.0 | 0.4361 | 0.0745 | 0.5863 | 0.0 | 0.0213 | 0.7704 | 0.1661 | 0.5377 | 0.3288 | 0.3947 | nan | 0.2522 | 0.5096 | 0.3940 | 0.0 | 0.8870 | 0.7608 | 0.9537 | 0.0080 | 0.1186 | 0.3769 | 0.0 | | 0.0668 | 49.9 | 9980 | 
0.7339 | 0.4160 | 0.4957 | 0.8778 | nan | 0.8397 | 0.9574 | 0.8012 | 0.8533 | 0.5329 | nan | 0.7149 | 0.8324 | 0.0808 | 0.9581 | 0.1315 | 0.0 | 0.6686 | 0.0743 | 0.7097 | 0.0 | 0.0236 | 0.9318 | 0.1824 | 0.6321 | 0.3937 | 0.5660 | nan | 0.2925 | 0.6250 | 0.5827 | 0.0 | 0.9539 | 0.8401 | 0.9840 | 0.0107 | 0.2145 | 0.4754 | 0.0 | nan | 0.7578 | 0.8765 | 0.6865 | 0.7502 | 0.4021 | nan | 0.5921 | 0.6487 | 0.0468 | 0.8647 | 0.1139 | 0.0 | 0.4381 | 0.0743 | 0.5855 | 0.0 | 0.0234 | 0.7708 | 0.1630 | 0.5373 | 0.3288 | 0.3982 | nan | 0.2423 | 0.5108 | 0.3945 | 0.0 | 0.8876 | 0.7623 | 0.9531 | 0.0081 | 0.1176 | 0.3773 | 0.0 | | 0.0611 | 50.0 | 10000 | 0.7413 | 0.4159 | 0.4945 | 0.8774 | nan | 0.8388 | 0.9573 | 0.8007 | 0.8552 | 0.5279 | nan | 0.7175 | 0.8243 | 0.0845 | 0.9557 | 0.1300 | 0.0 | 0.6660 | 0.0883 | 0.7186 | 0.0 | 0.0195 | 0.9361 | 0.1788 | 0.6178 | 0.3916 | 0.5516 | nan | 0.2882 | 0.6120 | 0.5812 | 0.0 | 0.9537 | 0.8377 | 0.9844 | 0.0132 | 0.2116 | 0.4815 | 0.0 | nan | 0.7575 | 0.8765 | 0.6860 | 0.7491 | 0.4030 | nan | 0.5916 | 0.6521 | 0.0488 | 0.8662 | 0.1122 | 0.0 | 0.4442 | 0.0883 | 0.5874 | 0.0 | 0.0193 | 0.7672 | 0.1600 | 0.5304 | 0.3282 | 0.3920 | nan | 0.2410 | 0.5084 | 0.3941 | 0.0 | 0.8872 | 0.7611 | 0.9529 | 0.0098 | 0.1155 | 0.3780 | 0.0 | ### Framework versions - Transformers 4.32.1 - Pytorch 2.0.1 - Datasets 2.12.0 - Tokenizers 0.13.2
Audi24/fire_classifier
Audi24
2023-09-22T00:06:37Z
65
1
transformers
[ "transformers", "tf", "vit", "image-classification", "generated_from_keras_callback", "base_model:google/vit-base-patch16-224-in21k", "base_model:finetune:google/vit-base-patch16-224-in21k", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2023-09-20T01:08:00Z
--- license: apache-2.0 base_model: google/vit-base-patch16-224-in21k tags: - generated_from_keras_callback model-index: - name: Audi24/fire_classifier results: [] --- <!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # Audi24/fire_classifier This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 0.1936 - Validation Loss: 0.1743 - Train Accuracy: 0.9889 - Epoch: 4 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 1755, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32 ### Training results | Train Loss | Validation Loss | Train Accuracy | Epoch | |:----------:|:---------------:|:--------------:|:-----:| | 1.0088 | 0.8898 | 0.8667 | 0 | | 0.7325 | 0.6165 | 0.9333 | 1 | | 0.4620 | 0.3794 | 0.9444 | 2 | | 0.3100 | 0.2546 | 0.9667 | 3 | | 0.1936 | 0.1743 | 0.9889 | 4 | ### Framework versions - Transformers 4.33.2 - TensorFlow 2.13.0 - Datasets 2.14.5 - Tokenizers 0.13.3
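As a rough illustration (not taken from the original training code), the `PolynomialDecay` schedule in the hyperparameters above can be computed by hand; with `power=1.0` and `end_learning_rate=0.0` it is simply a linear ramp from 3e-05 down to 0 over 1755 steps:

```python
def polynomial_decay_lr(step, initial_lr=3e-05, decay_steps=1755,
                        end_lr=0.0, power=1.0):
    """Learning rate at a given step under Keras' PolynomialDecay (cycle=False)."""
    step = min(step, decay_steps)  # LR stays at end_lr once decay_steps is reached
    fraction = 1.0 - step / decay_steps
    return (initial_lr - end_lr) * fraction ** power + end_lr

print(polynomial_decay_lr(0))     # 3e-05 at the start of training
print(polynomial_decay_lr(1755))  # 0.0 at the end
```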
RitaQi/speechModel-test1
RitaQi
2023-09-22T00:02:41Z
79
0
transformers
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "generated_from_trainer", "dataset:minds14", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2023-09-19T04:23:57Z
--- tags: - generated_from_trainer datasets: - minds14 model-index: - name: speechModel-test1 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # speechModel-test1 This model was trained from scratch on the minds14 dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - training_steps: 2000 ### Framework versions - Transformers 4.33.1 - Pytorch 2.0.1+cpu - Datasets 2.14.4 - Tokenizers 0.13.3
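The `gradient_accumulation_steps: 2` setting above means gradients from two micro-batches of 8 are summed before each optimizer step, which is how the effective `total_train_batch_size` of 16 arises. A minimal NumPy sketch (illustrative, not the actual training code) of why accumulating equal-sized micro-batches matches one larger batch for a mean-squared-error linear model:

```python
import numpy as np

rng = np.random.default_rng(0)
X, y = rng.normal(size=(16, 3)), rng.normal(size=16)
w = np.zeros(3)

def grad(Xb, yb, w):
    """Gradient of 0.5 * mean((Xb @ w - yb)**2) with respect to w."""
    return Xb.T @ (Xb @ w - yb) / len(yb)

# Full batch of 16 in one go ...
g_full = grad(X, y, w)

# ... versus two accumulated micro-batches of 8, averaged at step time.
g_accum = (grad(X[:8], y[:8], w) + grad(X[8:], y[8:], w)) / 2

assert np.allclose(g_full, g_accum)
```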
AshokKakunuri/whisper-tiny-ashok
AshokKakunuri
2023-09-21T23:40:01Z
75
0
transformers
[ "transformers", "pytorch", "whisper", "automatic-speech-recognition", "generated_from_trainer", "dataset:PolyAI/minds14", "base_model:openai/whisper-tiny", "base_model:finetune:openai/whisper-tiny", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2023-09-21T20:13:40Z
--- license: apache-2.0 base_model: openai/whisper-tiny tags: - generated_from_trainer datasets: - PolyAI/minds14 metrics: - wer model-index: - name: whisper-tiny-ashok results: - task: name: Automatic Speech Recognition type: automatic-speech-recognition dataset: name: PolyAI/minds14 type: PolyAI/minds14 config: en-US split: train args: en-US metrics: - name: Wer type: wer value: 0.35301353013530135 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # whisper-tiny-ashok This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the PolyAI/minds14 dataset. It achieves the following results on the evaluation set: - Loss: 0.8440 - Wer Ortho: 34.6847 - Wer: 0.3530 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 50 - training_steps: 1000 ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer Ortho | Wer | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:| | 0.0234 | 6.67 | 100 | 0.6639 | 34.2986 | 0.3383 | | 0.003 | 13.33 | 200 | 0.7587 | 33.9768 | 0.3401 | | 0.0005 | 20.0 | 300 | 0.7870 | 34.2342 | 0.3475 | | 0.0003 | 26.67 | 400 | 0.8045 | 35.1351 | 0.3567 | | 0.0002 | 33.33 | 500 | 0.8144 | 35.5856 | 0.3610 | | 0.0001 | 40.0 | 600 | 0.8262 | 35.5212 | 0.3604 | | 0.0001 | 46.67 | 700 | 0.8341 | 35.3282 | 0.3592 | | 0.0001 | 53.33 | 800 | 0.8397 | 35.1995 | 0.3579 | | 0.0001 | 60.0 | 900 | 0.8426 | 34.7490 | 0.3536 | | 0.0001 | 66.67 | 
1000 | 0.8440 | 34.6847 | 0.3530 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
Saul98lm/Prueba4
Saul98lm
2023-09-21T23:39:32Z
106
0
transformers
[ "transformers", "pytorch", "tensorboard", "roberta", "text-classification", "generated_from_trainer", "dataset:glue", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-21T22:31:37Z
--- license: apache-2.0 tags: - generated_from_trainer datasets: - glue metrics: - accuracy - f1 model-index: - name: Prueba4 results: - task: name: Text Classification type: text-classification dataset: name: glue type: glue config: mrpc split: validation args: mrpc metrics: - name: Accuracy type: accuracy value: 0.8406862745098039 - name: F1 type: f1 value: 0.8845470692717585 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Prueba4 This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 1.1542 - Accuracy: 0.8407 - F1: 0.8845 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.0931 | 1.09 | 500 | 1.1624 | 0.8260 | 0.8807 | | 0.0917 | 2.18 | 1000 | 1.1542 | 0.8407 | 0.8845 | ### Framework versions - Transformers 4.28.1 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
daochf/Lora-MetaLlama2-7b-hf-PuceDs05-v01
daochf
2023-09-21T23:31:28Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-21T23:30:42Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: float16 ### Framework versions - PEFT 0.5.0
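For reference, the quantization settings listed above correspond roughly to a `BitsAndBytesConfig` from the `transformers` API like the sketch below (illustrative, not the original training script):

```python
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # quantize base weights to 4-bit
    bnb_4bit_quant_type="nf4",             # NormalFloat4 quantization
    bnb_4bit_use_double_quant=True,        # also quantize the quantization constants
    bnb_4bit_compute_dtype=torch.float16,  # dtype used for matmuls at runtime
)
# Typically passed as:
# AutoModelForCausalLM.from_pretrained(..., quantization_config=bnb_config)
```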
alpindale/Llama-2-7b-ONNX
alpindale
2023-09-21T23:24:22Z
0
8
null
[ "onnx", "text generation", "text-generation", "en", "license:llama2", "region:us" ]
text-generation
2023-09-21T19:36:33Z
--- language: - en thumbnail: null tags: - text generation pipeline_tag: text-generation inference: false license: llama2 --- # Llama-2 ONNX This repository contains an optimized version of Llama-2 7B. ## Downloading the model You can use `huggingface_hub` to download this repository. This can be done through both Python scripting and the command line. Refer to the [HuggingFace Hub Documentation](https://huggingface.co/docs/huggingface_hub/guides/download) for the Python examples. With CLI: 1. Make sure you have an updated `huggingface_hub` installed. ```sh pip install -U huggingface_hub ``` 2. Download the repository. ```sh huggingface-cli download alpindale/Llama-2-7b-ONNX --repo-type model --cache-dir /path/to/custom/cache/directory --local-dir /path/to/download/dir --local-dir-use-symlinks False ``` The `--cache-dir` kwarg is only necessary if your default cache directory (`~/.cache`) does not have enough disk space to accommodate the entire repository. ### Chat Interface You can use the Gradio chat interface to run the models. First, install the required packages: ```sh pip install -r ChatApp/requirements.txt ``` Set the Python path to the root directory of the repository (necessary for importing the required modules): ```sh export PYTHONPATH=$PYTHONPATH:$(pwd) ``` Then you can simply run: ```sh python ChatApp/app.py ``` You can then navigate to [http://localhost:7860](http://localhost:7860) in your browser to access the interface. ## CLI Interface The repository also provides example code for running the models. ```sh python llama2_onnx_inference.py --onnx_file FP16/LlamaV2_7B_float16.onnx --embedding_file embeddings.pth --tokenizer_path tokenizer.model --prompt "What is the lightest element?" ``` Output: ``` The lightest element is hydrogen. Hydrogen is the lightest element on the periodic table, with an atomic mass of 1.00794 u (unified atomic mass units). ``` ## FAQ ### Why is the first inference session slow?
The ONNX Runtime execution provider might need to generate JIT binaries for the underlying hardware. Typically these binaries are cached and loaded directly on subsequent runs to reduce the overhead. ### Why is FP16 slower than FP32 on my device? Your device may not support native FP16 math; in that case, weights are cast to FP32 at runtime. Using the FP32 version of the model avoids the cast overhead. ### How do I optimize inference? It's recommended to place inputs/outputs on the target device to avoid expensive data copies; please refer to the following documentation for details: [I/O Binding | onnxruntime](https://onnxruntime.ai/docs/performance/tune-performance/iobinding.html) ### What generation parameters should I use the model with? You can perform temperature and top-p sampling with the provided example code. Please refer to Meta's example [here](https://github.com/facebookresearch/llama/).
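The FAQ mentions temperature and top-p sampling, but the sampling logic itself lives in the example script. As a hedged, self-contained sketch of what temperature plus nucleus (top-p) sampling does over one step's logits (the function name and default values are illustrative, not taken from this repository):

```python
import numpy as np

def sample_top_p(logits, temperature=0.8, top_p=0.95, rng=None):
    """Temperature + nucleus (top-p) sampling over a 1-D logits vector."""
    if rng is None:
        rng = np.random.default_rng()
    # Apply temperature, then softmax (subtract max for numerical stability).
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    probs = np.exp(scaled - scaled.max())
    probs /= probs.sum()
    # Sort descending and keep the smallest set of tokens whose mass >= top_p.
    order = np.argsort(-probs)
    cumulative = np.cumsum(probs[order])
    cutoff = int(np.searchsorted(cumulative, top_p)) + 1
    nucleus = order[:cutoff]
    nucleus_probs = probs[nucleus] / probs[nucleus].sum()
    return int(rng.choice(nucleus, p=nucleus_probs))
```

Lower temperatures sharpen the distribution (so the nucleus often collapses to the single most likely token), while higher `top_p` admits more of the tail.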
aadel4/kid-whisper-small-myst_cslu
aadel4
2023-09-21T23:14:31Z
115
0
transformers
[ "transformers", "pytorch", "whisper", "automatic-speech-recognition", "generated_from_trainer", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2023-09-21T22:51:50Z
--- license: apache-2.0 tags: - generated_from_trainer metrics: - wer model-index: - name: openai/whisper-small-en results: - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: myst-test type: asr config: en split: test metrics: - type: wer value: 12.11 name: WER - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: cslu_scripted type: asr config: en split: test metrics: - type: wer value: 2.74 name: WER - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: cslu_spontaneous type: asr config: en split: test metrics: - type: wer value: 32.72 name: WER - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: librispeech type: asr config: en split: testclean metrics: - type: wer value: 7.97 name: WER --- # openai/whisper-small-en This model is a fine-tuned version of [openai/whisper-small-en](https://huggingface.co/openai/whisper-small-en) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.21313807368278503 - Wer: 8.312998407517968 ## Training and evaluation data Training data: Myst Train (125 hours) + CSLU Scripted train (35 hours) Evaluation data: Myst Dev (20.9 hours) + CSLU Scripted Dev (4.8 hours) ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - training_steps: 10000 - converged_after: 1500
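The WER figures above (e.g. 12.11 on myst-test) are word error rates: the word-level Levenshtein distance divided by the number of reference words, usually reported as a percentage. A minimal pure-Python sketch of the metric (illustrative; the card's numbers were presumably computed with a library such as `evaluate` or `jiwer`):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Levenshtein distance over words via dynamic programming.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # i deletions
    for j in range(len(hyp) + 1):
        d[0][j] = j  # j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution / match
    return d[len(ref)][len(hyp)] / len(ref)

print(wer("the cat sat", "the bat sat"))  # one substitution out of three words
```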
CyberHarem/ujiie_mutsumi_idolmastercinderellagirls
CyberHarem
2023-09-21T23:07:41Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/ujiie_mutsumi_idolmastercinderellagirls", "license:mit", "region:us" ]
text-to-image
2023-09-21T22:59:06Z
--- license: mit datasets: - CyberHarem/ujiie_mutsumi_idolmastercinderellagirls pipeline_tag: text-to-image tags: - art --- # Lora of ujiie_mutsumi_idolmastercinderellagirls This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion). And the auto-training framework is maintained by [DeepGHS Team](https://huggingface.co/deepghs). The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11). After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora. For example, if you want to use the model from step 4080, you need to download `4080/ujiie_mutsumi_idolmastercinderellagirls.pt` as the embedding and `4080/ujiie_mutsumi_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters. **The best step we recommend is 4080**, with the score of 0.953. The trigger words are: 1. `ujiie_mutsumi_idolmastercinderellagirls` 2. `black_hair, long_hair, braid, smile, bangs, blush, open_mouth, black_eyes, blue_eyes, grey_eyes` For the following groups, it is not recommended to use this model and we express regret: 1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail. 2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits. 3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm. 4. 
Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters. 5. Individuals who finds the generated image content offensive to their values. These are available steps: | Steps | Score | Download | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata | |:---------|:----------|:-----------------------------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------| | 5100 | 0.849 | [Download](5100/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) | | 4760 | 0.946 | [Download](4760/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) | | 4420 | 0.914 | [Download](4420/ujiie_mutsumi_idolmastercinderellagirls.zip) | 
![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) | | **4080** | **0.953** | [**Download**](4080/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) | | 3740 | 0.806 | [Download](3740/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) | | 3400 | 0.903 | [Download](3400/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) | | 3060 | 0.824 | [Download](3060/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | 
![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) | | 2720 | 0.882 | [Download](2720/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) | | 2380 | 0.885 | [Download](2380/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) | | 2040 | 0.774 | [Download](2040/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) | | 1700 | 0.656 | [Download](1700/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | 
![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) | | 1360 | 0.789 | [Download](1360/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-1360](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) | | 1020 | 0.760 | [Download](1020/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) | | 680 | 0.656 | [Download](680/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) | | 340 | 0.620 | [Download](340/ujiie_mutsumi_idolmastercinderellagirls.zip) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | 
![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
onarganogun/videomae-base-finetuned-ucf101-subset
onarganogun
2023-09-21T23:07:20Z
60
0
transformers
[ "transformers", "pytorch", "tensorboard", "videomae", "video-classification", "generated_from_trainer", "base_model:MCG-NJU/videomae-base", "base_model:finetune:MCG-NJU/videomae-base", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
video-classification
2023-08-17T08:57:11Z
--- license: cc-by-nc-4.0 base_model: MCG-NJU/videomae-base tags: - generated_from_trainer metrics: - accuracy model-index: - name: videomae-base-finetuned-ucf101-subset results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # videomae-base-finetuned-ucf101-subset This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.4051 - Accuracy: 0.8714 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - training_steps: 148 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 2.1336 | 0.26 | 38 | 1.7594 | 0.5857 | | 0.9483 | 1.26 | 76 | 0.9110 | 0.7286 | | 0.4881 | 2.26 | 114 | 0.5500 | 0.8571 | | 0.3729 | 3.23 | 148 | 0.4051 | 0.8714 | ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.4 - Tokenizers 0.13.3
pbaoo2705/falcon-7b-sharded
pbaoo2705
2023-09-21T22:56:16Z
0
0
null
[ "generated_from_trainer", "base_model:ybelkada/falcon-7b-sharded-bf16", "base_model:finetune:ybelkada/falcon-7b-sharded-bf16", "region:us" ]
null
2023-08-30T14:46:52Z
--- base_model: ybelkada/falcon-7b-sharded-bf16 tags: - generated_from_trainer model-index: - name: falcon-7b-sharded results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # falcon-7b-sharded This model is a fine-tuned version of [ybelkada/falcon-7b-sharded-bf16](https://huggingface.co/ybelkada/falcon-7b-sharded-bf16) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: constant - lr_scheduler_warmup_ratio: 0.03 - training_steps: 500 ### Training results ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
kaekitsune/mizuno
kaekitsune
2023-09-21T22:24:43Z
0
0
null
[ "license:creativeml-openrail-m", "region:us" ]
null
2023-09-21T22:20:43Z
--- license: creativeml-openrail-m ---
PratikSahu/ppo-Pyramids
PratikSahu
2023-09-21T22:22:02Z
13
0
ml-agents
[ "ml-agents", "tensorboard", "onnx", "Pyramids", "deep-reinforcement-learning", "reinforcement-learning", "ML-Agents-Pyramids", "region:us" ]
reinforcement-learning
2023-09-21T22:21:57Z
--- library_name: ml-agents tags: - Pyramids - deep-reinforcement-learning - reinforcement-learning - ML-Agents-Pyramids --- # **ppo** Agent playing **Pyramids** This is a trained model of a **ppo** agent playing **Pyramids** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents). ## Usage (with ML-Agents) The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/ We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub: - A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction - A *longer tutorial* to understand how ML-Agents works: https://huggingface.co/learn/deep-rl-course/unit5/introduction ### Resume the training ```bash mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume ``` ### Watch your Agent play You can watch your agent **playing directly in your browser**: 1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity 2. Find your model_id: PratikSahu/ppo-Pyramids 3. Select your *.nn /*.onnx file 4. Click on Watch the agent play 👀
christopherastone/distilgpt2-proofs
christopherastone
2023-09-21T21:41:35Z
139
0
transformers
[ "transformers", "pytorch", "tf", "jax", "safetensors", "gpt2", "text-generation", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2022-03-02T23:29:05Z
--- widget: - text: "Let MATH be given." - text: "If MATH is a nonempty" - text: "By the inductive hypothesis," --- [DistilGPT2](https://huggingface.co/distilgpt2) English language model fine-tuned on mathematical proofs extracted from [arXiv.org](https://arxiv.org) LaTeX sources from 1992 to 2020. Proofs have been cleaned up a bit. In particular, they use * `CITE` for any citation * `REF` for any reference * `MATH` for any LaTeX mathematical formula * `CASE:` for any `\item` or labeled subcase.
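The placeholder convention described above (CITE/REF/MATH/CASE:) can be sketched as a small normalization pass over LaTeX source. The following is an illustrative reimplementation only; the exact regular expressions used to build the training corpus are assumptions, not the author's actual pipeline:

```python
import re

def normalize_proof(latex: str) -> str:
    """Roughly mimic the cleanup described in the card: collapse
    citations, references, math, and \\item markers into placeholder
    tokens. Patterns here are hypothetical approximations."""
    latex = re.sub(r"\\cite[tp]?\*?(\[[^\]]*\])?\{[^}]*\}", "CITE", latex)
    latex = re.sub(r"\\(?:ref|eqref)\{[^}]*\}", "REF", latex)
    latex = re.sub(r"\$[^$]+\$", "MATH", latex)  # inline math only
    latex = re.sub(r"\\item(\[[^\]]*\])?", "CASE:", latex)
    return latex

print(normalize_proof(r"By \cite{knuth}, $x^2 \ge 0$ as in \ref{eq:sq}."))
# -> "By CITE, MATH as in REF."
```

Text normalized this way matches the widget prompts above (e.g. "If MATH is a nonempty"), so prompts should use the same placeholders.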
isashap/contexttrained-validationloss-gpt2
isashap
2023-09-21T21:39:06Z
37
0
peft
[ "peft", "text-generation", "region:us" ]
text-generation
2023-09-21T21:26:07Z
--- library_name: peft pipeline_tag: text-generation widget: - text: "Job: Keywords: Resume Point:" --- ## Training procedure ### Framework versions - PEFT 0.5.0
Panchovix/Synthia-70B-v1.2b_4.65bpw-h6-exl2
Panchovix
2023-09-21T21:33:15Z
7
2
transformers
[ "transformers", "llama", "text-generation", "license:llama2", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-generation
2023-09-18T19:23:55Z
--- license: llama2 --- 4.65 bit/bpw quantization of Synthia-70B-v1.2b (https://huggingface.co/migtissera/Synthia-70B-v1.2), using exllamav2. Update on 21/09/2023: Re-quantized with latest exllamav2 update, which fixes some measurement issues.
CyberHarem/nanjou_hikaru_idolmastercinderellagirls
CyberHarem
2023-09-21T21:24:52Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/nanjou_hikaru_idolmastercinderellagirls", "license:mit", "region:us" ]
text-to-image
2023-09-21T21:15:10Z
--- license: mit datasets: - CyberHarem/nanjou_hikaru_idolmastercinderellagirls pipeline_tag: text-to-image tags: - art --- # Lora of nanjou_hikaru_idolmastercinderellagirls This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion). And the auto-training framework is maintained by [DeepGHS Team](https://huggingface.co/deepghs). The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11). After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora. For example, if you want to use the model from step 5100, you need to download `5100/nanjou_hikaru_idolmastercinderellagirls.pt` as the embedding and `5100/nanjou_hikaru_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters. **The best step we recommend is 5100**, with the score of 0.772. The trigger words are: 1. `nanjou_hikaru_idolmastercinderellagirls` 2. `long_hair, blue_eyes, black_hair, smile, ahoge` For the following groups, it is not recommended to use this model and we express regret: 1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail. 2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits. 3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm. 4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters. 5. 
Individuals who finds the generated image content offensive to their values. These are available steps: | Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata | |:---------|:----------|:-----------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------| | **5100** | **0.772** | [**Download**](5100/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![pattern_2-5100](5100/previews/pattern_2.png) | ![pattern_3-5100](5100/previews/pattern_3.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) | | 4760 | 0.721 | [Download](4760/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | ![pattern_2-4760](4760/previews/pattern_2.png) | ![pattern_3-4760](4760/previews/pattern_3.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, 
click to see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) | | 4420 | 0.747 | [Download](4420/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![pattern_2-4420](4420/previews/pattern_2.png) | ![pattern_3-4420](4420/previews/pattern_3.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) | | 4080 | 0.660 | [Download](4080/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![pattern_2-4080](4080/previews/pattern_2.png) | ![pattern_3-4080](4080/previews/pattern_3.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) | | 3740 | 0.699 | [Download](3740/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-3740](3740/previews/pattern_1.png) | ![pattern_2-3740](3740/previews/pattern_2.png) | ![pattern_3-3740](3740/previews/pattern_3.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) | | 3400 | 0.705 
| [Download](3400/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![pattern_2-3400](3400/previews/pattern_2.png) | ![pattern_3-3400](3400/previews/pattern_3.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) | | 3060 | 0.656 | [Download](3060/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-3060](3060/previews/pattern_1.png) | ![pattern_2-3060](3060/previews/pattern_2.png) | ![pattern_3-3060](3060/previews/pattern_3.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) | | 2720 | 0.628 | [Download](2720/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![pattern_2-2720](2720/previews/pattern_2.png) | ![pattern_3-2720](2720/previews/pattern_3.png) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) | | 2380 | 0.612 | [Download](2380/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | 
![pattern_2-2380](2380/previews/pattern_2.png) | ![pattern_3-2380](2380/previews/pattern_3.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) | | 2040 | 0.509 | [Download](2040/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![pattern_2-2040](2040/previews/pattern_2.png) | ![pattern_3-2040](2040/previews/pattern_3.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) | | 1700 | 0.577 | [Download](1700/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![pattern_2-1700](1700/previews/pattern_2.png) | ![pattern_3-1700](1700/previews/pattern_3.png) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) | | 1360 | 0.523 | [Download](1360/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![pattern_2-1360](1360/previews/pattern_2.png) | ![pattern_3-1360](1360/previews/pattern_3.png) | ![bikini-1360](1360/previews/bikini.png) 
| [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) | | 1020 | 0.450 | [Download](1020/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![pattern_2-1020](1020/previews/pattern_2.png) | ![pattern_3-1020](1020/previews/pattern_3.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) | | 680 | 0.409 | [Download](680/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-680](680/previews/pattern_1.png) | ![pattern_2-680](680/previews/pattern_2.png) | ![pattern_3-680](680/previews/pattern_3.png) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) | | 340 | 0.188 | [Download](340/nanjou_hikaru_idolmastercinderellagirls.zip) | ![pattern_1-340](340/previews/pattern_1.png) | ![pattern_2-340](340/previews/pattern_2.png) | ![pattern_3-340](340/previews/pattern_3.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, 
click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
jradchenko/DeciCoder-1b
jradchenko
2023-09-21T20:55:59Z
133
0
transformers
[ "transformers", "safetensors", "text-generation", "text generation", "Deci AI", "DeciCoder", "custom_code", "dataset:bigcode/starcoderdata", "arxiv:2305.13245", "arxiv:2104.09864", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-generation
2023-09-21T20:41:52Z
--- pipeline_tag: text-generation license: apache-2.0 tags: - text generation - Deci AI - DeciCoder programming_language: - Java - JavaScript - Python metrics: - code_eval inference: true widget: - text: 'def print_hello_world():' example_title: Hello world group: Python model-index: - name: DeciCoder-1b results: - task: type: text-generation dataset: type: nuprl/MultiPL-E name: MultiPL-HumanEval (Python) metrics: - name: pass@1 type: pass@1 value: 0.191 verified: false - task: type: text-generation dataset: type: nuprl/MultiPL-E name: MultiPL-HumanEval (JavaScript) metrics: - name: pass@1 type: pass@1 value: 0.184 verified: false - task: type: text-generation dataset: type: nuprl/MultiPL-E name: MultiPL-HumanEval (Java) metrics: - name: pass@1 type: pass@1 value: 0.166 verified: false datasets: - bigcode/starcoderdata --- # Model Card for DeciCoder 1B DeciCoder 1B is a 1 billion parameter decoder-only code completion model trained on the Python, Java, and Javascript subsets of [Starcoder Training Dataset](https://huggingface.co/datasets/bigcode/starcoderdata). The model uses Grouped Query Attention and has a context window of 2048 tokens. It was trained using a Fill-in-the-Middle training objective. The model's architecture was generated by Deci's proprietary Neural Architecture Search-based technology, AutoNAC. ## Model Details - **Developed by:** Deci - **Model type:** DeciCoder is an auto-regressive language model based on the transformer decoder architecture, using Grouped Query Attention. 
- **Language(s):** Python, Java, JavaScript - **License:** Model checkpoints are licensed under the [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) ## Model Architecture | Parameters | Layers | Heads | Sequence Length | GQA num_key_value_heads | Hidden Size | |:----------|:----------|:----------|:----------|:----------|:----------| | 1.1B | 20 | 32 | 2048 | 4 | 2048 | - **Decoder layer:** Grouped Query Attention [Ainslie et al., 2023](https://arxiv.org/abs/2305.13245) - **Position Embeddings:** Rotary Position Embeddings [Su et al., 2021](https://arxiv.org/abs/2104.09864) ## Uses The model is intended to do single/multiline code completion from a context window of up to 2048 tokens. It is *not* an instruction model, and commands like "Write a function that computes the absolute value of an integer" won't yield the desired results. A more effective approach is to frame instructions in the style of source code comments (e.g. # this function calculates the absolute value of an integer) or to present a function signature and docstring, enabling the model to complete the function's body. ### How to Use ```python # pip install -q transformers import torch from transformers import AutoModelForCausalLM, AutoTokenizer checkpoint = "Deci/DeciCoder-1b" device = "cuda" # for GPU usage or "cpu" for CPU usage tokenizer = AutoTokenizer.from_pretrained(checkpoint) model = AutoModelForCausalLM.from_pretrained(checkpoint, torch_dtype=torch.bfloat16, trust_remote_code=True).to(device) inputs = tokenizer.encode("def print_hello_world():", return_tensors="pt").to(device) outputs = model.generate(inputs, max_new_tokens=100) print(tokenizer.decode(outputs[0])) ``` ### Attribution DeciCoder was trained on StarCoder Training Dataset, filtered for Python, Java, and Javascript code. For additional information, please refer to [https://huggingface.co/datasets/bigcode/starcoderdata](https://huggingface.co/datasets/bigcode/starcoderdata).
### Limitations The model has undergone training with source code from Python, Java, and JavaScript. While the primary language in the source is English, it does contain other languages. Therefore, the model can produce code snippets given some context. However, there's no assurance that the resulting code will function as expected. It might be suboptimal, contain bugs, or even contain exploits. ## Training Details ### Training Data DeciCoder was trained on the Python, Java, and Javascript subsets of [Starcoder Training Dataset](https://huggingface.co/datasets/bigcode/starcoderdata) ### Training Procedure - **Warm-Up Steps**: 9000 - **Total Training Steps**: 284k - **Total Tokens**: 446B - **Global Batch Size**: 768 - **Optimizer**: AdamW - **Optimizer Parameters**: beta1=0.9, beta2=0.95 - **Weight Decay**: 0.1 - **Learning Rate**: 4e-4 - **Learning Rate Schedule**: cosine ## Evaluation Below are DeciCoder's pass@1 on MultiPL HumanEval scores: | Python | JavaScript | Java | |:----------|:----------|:----------| | 19.1% | 18.4% | 16.6% | ### Runtime Benchmarks |Inference Tool/Hardware | A10 (tokens/sec) |A100 (tokens/sec) | |:----------|:----------|:----------| | PyTorch | 1,364.2 | 3,244.4 | | Infery LLM | 3,889.3 | 11,676.8 | - Throughput (tokens/sec) - Measured with optimal batch size per hardware - A10 on BS 128, A100 on BS 512 ## Documentation - [Notebook](https://colab.research.google.com/drive/1JCxvBsWCZKHfIcHSMVf7GZCs3ClMQPjs) - Blog post: [Introducing DeciCoder: The New Gold Standard in Efficient and Accurate Code Generation](https://deci.ai/blog/decicoder-efficient-and-accurate-code-generation-llm/) - Questions: Feel free to contact us via our [Discord Community!](https://discord.com/invite/p9ecgRhDR8/) ## How to Cite Please cite this model using this format. ```bibtex @misc{DeciFoundationModels, title = {DeciCoder}, author = {DeciAI Research Team}, year = {2023}, url = {[https://huggingface.co/deci/decicoder-1b](https://huggingface.co/deci/decicoder-1b)}, } ```
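A note on the architecture table in the card above: with 32 attention heads and `num_key_value_heads = 4`, each key/value head is shared by a group of 8 query heads. A minimal sketch of the usual head-to-group mapping in grouped-query attention (an illustration of the general technique, not DeciCoder's exact implementation):

```python
def kv_head_for_query_head(q_head, num_q_heads=32, num_kv_heads=4):
    """In grouped-query attention, consecutive query heads share one
    key/value head; with 32 query heads and 4 KV heads, each group
    holds 32 // 4 = 8 query heads."""
    group_size = num_q_heads // num_kv_heads
    return q_head // group_size

groups = [kv_head_for_query_head(h) for h in range(32)]
print(groups[:10])  # heads 0-7 map to kv head 0, head 8 to kv head 1
```

Sharing KV heads this way shrinks the key/value cache by the grouping factor (8x here), which is a large part of why GQA models decode faster at long contexts.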
codyreading/custom_diffusion-noprior
codyreading
2023-09-21T20:44:28Z
9
0
diffusers
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "custom-diffusion", "base_model:runwayml/stable-diffusion-v1-5", "base_model:adapter:runwayml/stable-diffusion-v1-5", "license:creativeml-openrail-m", "region:us" ]
text-to-image
2023-09-21T20:38:08Z
--- license: creativeml-openrail-m base_model: runwayml/stable-diffusion-v1-5 instance_prompt: None tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers - custom-diffusion inference: true --- # Custom Diffusion - codyreading/custom_diffusion-noprior These are Custom Diffusion adaption weights for runwayml/stable-diffusion-v1-5. The weights were trained on None using [Custom Diffusion](https://www.cs.cmu.edu/~custom-diffusion). You can find some example images in the following. For more details on the training, please follow [this link](https://github.com/huggingface/diffusers/blob/main/examples/custom_diffusion).
mchen-hf-2023/q-FrozenLake-v1-4x4-noSlippery
mchen-hf-2023
2023-09-21T20:42:12Z
0
0
null
[ "FrozenLake-v1-4x4-no_slippery", "q-learning", "reinforcement-learning", "custom-implementation", "model-index", "region:us" ]
reinforcement-learning
2023-09-21T20:42:07Z
--- tags: - FrozenLake-v1-4x4-no_slippery - q-learning - reinforcement-learning - custom-implementation model-index: - name: q-FrozenLake-v1-4x4-noSlippery results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: FrozenLake-v1-4x4-no_slippery type: FrozenLake-v1-4x4-no_slippery metrics: - type: mean_reward value: 1.00 +/- 0.00 name: mean_reward verified: false --- # **Q-Learning** Agent playing **FrozenLake-v1** This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**. ## Usage ```python model = load_from_hub(repo_id="mchen-hf-2023/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl") # Don't forget to check if you need to add additional attributes (is_slippery=False etc) env = gym.make(model["env_id"]) ```
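The pickled object above is loaded as a dict; acting greedily with the stored Q-table can be sketched as below. The `qtable[state][action]` layout is an assumption based on the course's standard format, not verified against this particular checkpoint:

```python
def greedy_action(qtable, state):
    """Pick the action with the highest Q-value for this state.

    `qtable` is assumed indexable as qtable[state][action],
    e.g. a 16x4 table for the 4x4 FrozenLake grid (hypothetical key).
    """
    row = qtable[state]
    return max(range(len(row)), key=lambda a: row[a])

# Tiny 2-state, 3-action example table (illustrative values only).
toy_qtable = [
    [0.1, 0.9, 0.0],  # state 0: action 1 is best
    [0.5, 0.2, 0.7],  # state 1: action 2 is best
]
print(greedy_action(toy_qtable, 0))  # -> 1
```

In an evaluation loop you would call this on each observation returned by `env.step(...)` instead of sampling randomly.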
OpenDILabCommunity/QbertNoFrameskip-v4-C51
OpenDILabCommunity
2023-09-21T20:02:39Z
0
0
pytorch
[ "pytorch", "deep-reinforcement-learning", "reinforcement-learning", "DI-engine", "QbertNoFrameskip-v4", "en", "license:apache-2.0", "region:us" ]
reinforcement-learning
2023-05-18T19:42:42Z
--- language: en license: apache-2.0 library_name: pytorch tags: - deep-reinforcement-learning - reinforcement-learning - DI-engine - QbertNoFrameskip-v4 benchmark_name: OpenAI/Gym/Atari task_name: QbertNoFrameskip-v4 pipeline_tag: reinforcement-learning model-index: - name: C51 results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: OpenAI/Gym/Atari-QbertNoFrameskip-v4 type: OpenAI/Gym/Atari-QbertNoFrameskip-v4 metrics: - type: mean_reward value: 16750.0 +/- 0.0 name: mean_reward --- # Play **QbertNoFrameskip-v4** with **C51** Policy ## Model Description <!-- Provide a longer summary of what this model is. --> This is a simple **C51** implementation for OpenAI/Gym/Atari **QbertNoFrameskip-v4** using the [DI-engine library](https://github.com/opendilab/di-engine) and the [DI-zoo](https://github.com/opendilab/DI-engine/tree/main/dizoo). **DI-engine** is a Python library for solving general decision intelligence problems, based on reinforcement learning framework implementations in PyTorch or JAX. The library aims to standardize the reinforcement learning framework across different algorithms, benchmarks, and environments, and to support both academic research and prototype applications. In addition, self-customized training pipelines and applications are supported by reusing different abstraction levels of the DI-engine reinforcement learning framework.
## Model Usage ### Install the Dependencies <details close> <summary>(Click for Details)</summary> ```shell # install huggingface_ding git clone https://github.com/opendilab/huggingface_ding.git pip3 install -e ./huggingface_ding/ # install environment dependencies if needed pip3 install DI-engine[common_env] ``` </details> ### Git Clone from Huggingface and Run the Model <details close> <summary>(Click for Details)</summary> ```shell # running with trained model python3 -u run.py ``` **run.py** ```python from ding.bonus import C51Agent from ding.config import Config from easydict import EasyDict import torch # Pull model from files which are git cloned from huggingface policy_state_dict = torch.load("pytorch_model.bin", map_location=torch.device("cpu")) cfg = EasyDict(Config.file_to_dict("policy_config.py").cfg_dict) # Instantiate the agent agent = C51Agent( env_id="QbertNoFrameskip-v4", exp_name="QbertNoFrameskip-v4-C51", cfg=cfg.exp_config, policy_state_dict=policy_state_dict ) # Continue training agent.train(step=5000) # Render the new agent performance agent.deploy(enable_save_replay=True) ``` </details> ### Run Model by Using Huggingface_ding <details close> <summary>(Click for Details)</summary> ```shell # running with trained model python3 -u run.py ``` **run.py** ```python from ding.bonus import C51Agent from huggingface_ding import pull_model_from_hub # Pull model from Huggingface hub policy_state_dict, cfg = pull_model_from_hub(repo_id="OpenDILabCommunity/QbertNoFrameskip-v4-C51") # Instantiate the agent agent = C51Agent( env_id="QbertNoFrameskip-v4", exp_name="QbertNoFrameskip-v4-C51", cfg=cfg.exp_config, policy_state_dict=policy_state_dict ) # Continue training agent.train(step=5000) # Render the new agent performance agent.deploy(enable_save_replay=True) ``` </details>
**train.py** ```python from ding.bonus import C51Agent from huggingface_ding import push_model_to_hub # Instantiate the agent agent = C51Agent(env_id="QbertNoFrameskip-v4", exp_name="QbertNoFrameskip-v4-C51") # Train the agent return_ = agent.train(step=int(20000000)) # Push model to huggingface hub push_model_to_hub( agent=agent.best, env_name="OpenAI/Gym/Atari", task_name="QbertNoFrameskip-v4", algo_name="C51", wandb_url=return_.wandb_url, github_repo_url="https://github.com/opendilab/DI-engine", github_doc_model_url="https://di-engine-docs.readthedocs.io/en/latest/12_policies/c51.html", github_doc_env_url="https://di-engine-docs.readthedocs.io/en/latest/13_envs/atari.html", installation_guide="pip3 install DI-engine[common_env]", usage_file_by_git_clone="./c51/qbert_c51_deploy.py", usage_file_by_huggingface_ding="./c51/qbert_c51_download.py", train_file="./c51/qbert_c51.py", repo_id="OpenDILabCommunity/QbertNoFrameskip-v4-C51", create_repo=False ) ``` </details> **Configuration** <details close> <summary>(Click for Details)</summary> ```python exp_config = { 'env': { 'manager': { 'episode_num': float("inf"), 'max_retry': 1, 'retry_type': 'reset', 'auto_reset': True, 'step_timeout': None, 'reset_timeout': None, 'retry_waiting_time': 0.1, 'cfg_type': 'BaseEnvManagerDict' }, 'stop_value': 30000, 'n_evaluator_episode': 8, 'collector_env_num': 8, 'evaluator_env_num': 8, 'env_id': 'QbertNoFrameskip-v4', 'frame_stack': 4, 'env_wrapper': 'atari_default' }, 'policy': { 'model': { 'encoder_hidden_size_list': [128, 128, 512], 'v_min': -10, 'v_max': 10, 'n_atom': 51, 'obs_shape': [4, 84, 84], 'action_shape': 6 }, 'learn': { 'learner': { 'train_iterations': 1000000000, 'dataloader': { 'num_workers': 0 }, 'log_policy': True, 'hook': { 'load_ckpt_before_run': '', 'log_show_after_iter': 100, 'save_ckpt_after_iter': 10000, 'save_ckpt_after_run': True }, 'cfg_type': 'BaseLearnerDict' }, 'update_per_collect': 10, 'batch_size': 32, 'learning_rate': 0.0001, 'target_update_freq': 
500, 'target_theta': 0.005, 'ignore_done': False }, 'collect': { 'collector': {}, 'n_sample': 100, 'unroll_len': 1 }, 'eval': { 'evaluator': { 'eval_freq': 4000, 'render': { 'render_freq': -1, 'mode': 'train_iter' }, 'figure_path': None, 'cfg_type': 'InteractionSerialEvaluatorDict', 'stop_value': 30000, 'n_episode': 8 } }, 'other': { 'replay_buffer': { 'replay_buffer_size': 400000 }, 'eps': { 'type': 'exp', 'start': 1.0, 'end': 0.05, 'decay': 1000000 } }, 'on_policy': False, 'cuda': True, 'multi_gpu': False, 'bp_update_sync': True, 'traj_len_inf': False, 'type': 'c51', 'priority': True, 'priority_IS_weight': False, 'discount_factor': 0.99, 'nstep': 3, 'cfg_type': 'C51PolicyDict' }, 'exp_name': 'QbertNoFrameskip-v4-C51', 'seed': 0, 'wandb_logger': { 'gradient_logger': True, 'video_logger': True, 'plot_logger': True, 'action_logger': True, 'return_logger': False } } ``` </details> **Training Procedure** <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> - **Weights & Biases (wandb):** [monitor link](https://wandb.ai/zjowowen/QbertNoFrameskip-v4-C51) ## Model Information <!-- Provide the basic links for the model. --> - **Github Repository:** [repo link](https://github.com/opendilab/DI-engine) - **Doc**: [DI-engine-docs Algorithm link](https://di-engine-docs.readthedocs.io/en/latest/12_policies/c51.html) - **Configuration:** [config link](https://huggingface.co/OpenDILabCommunity/QbertNoFrameskip-v4-C51/blob/main/policy_config.py) - **Demo:** [video](https://huggingface.co/OpenDILabCommunity/QbertNoFrameskip-v4-C51/blob/main/replay.mp4) <!-- Provide the size information for the model. --> - **Parameters total size:** 55276.2 KB - **Last Update Date:** 2023-09-21 ## Environments <!-- Address questions around what environment the model is intended to be trained and deployed at, including the necessary information needed to be provided for future users. 
--> - **Benchmark:** OpenAI/Gym/Atari - **Task:** QbertNoFrameskip-v4 - **Gym version:** 0.25.1 - **DI-engine version:** v0.4.9 - **PyTorch version:** 2.0.1+cu117 - **Doc**: [DI-engine-docs Environments link](https://di-engine-docs.readthedocs.io/en/latest/13_envs/atari.html)
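C51 models the return as a categorical distribution over `n_atom` fixed atoms between `v_min` and `v_max` (see the `model` block of the configuration above). A minimal sketch of the atom support and the scalar Q-value it induces — illustrative only, not DI-engine's implementation:

```python
v_min, v_max, n_atom = -10, 10, 51   # values from the `model` block of the config above

# Fixed atom locations z_0 .. z_{n_atom-1} spanning [v_min, v_max]
support = [v_min + i * (v_max - v_min) / (n_atom - 1) for i in range(n_atom)]

# A categorical distribution over the atoms for one action (uniform, purely for illustration)
probs = [1.0 / n_atom] * n_atom

# The scalar Q-value used for action selection is the expectation under this distribution
q_value = sum(z * p for z, p in zip(support, probs))

assert len(support) == n_atom and support[0] == v_min
assert abs(q_value) < 1e-9   # uniform weight on a symmetric support gives Q ≈ 0
```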
grace-pro/snli_w_premise_100k_1_epoch
grace-pro
2023-09-21T19:55:41Z
105
0
transformers
[ "transformers", "pytorch", "bert", "text-classification", "generated_from_trainer", "base_model:google-bert/bert-base-uncased", "base_model:finetune:google-bert/bert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-21T19:33:34Z
--- license: apache-2.0 base_model: bert-base-uncased tags: - generated_from_trainer metrics: - accuracy model-index: - name: snli_w_premise_100k_1_epoch results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # snli_w_premise_100k_1_epoch This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0920 - Accuracy: 0.9735 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 0.0837 | 1.0 | 18750 | 0.0920 | 0.9735 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
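The `linear` scheduler listed above decays the learning rate from its initial value toward zero over training; a sketch under the assumption of no warmup, using the learning rate and step count from this card:

```python
def linear_lr(step, base_lr=2e-05, total_steps=18750):
    # Linear decay from base_lr at step 0 to zero at total_steps (no warmup assumed)
    return base_lr * max(0.0, 1.0 - step / total_steps)

assert linear_lr(0) == 2e-05        # start of training
assert linear_lr(18750) == 0.0      # end of the single epoch
```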
CyberHarem/yoshioka_saki_idolmastercinderellagirls
CyberHarem
2023-09-21T19:47:19Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/yoshioka_saki_idolmastercinderellagirls", "license:mit", "region:us" ]
text-to-image
2023-09-21T19:37:41Z
--- license: mit datasets: - CyberHarem/yoshioka_saki_idolmastercinderellagirls pipeline_tag: text-to-image tags: - art --- # Lora of yoshioka_saki_idolmastercinderellagirls This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion). And the auto-training framework is maintained by [DeepGHS Team](https://huggingface.co/deepghs). The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11). After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora. For example, if you want to use the model from step 4420, you need to download `4420/yoshioka_saki_idolmastercinderellagirls.pt` as the embedding and `4420/yoshioka_saki_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters. **The best step we recommend is 4420**, with the score of 0.749. The trigger words are: 1. `yoshioka_saki_idolmastercinderellagirls` 2. `short_hair, smile, green_eyes, brown_hair, open_mouth, breasts` For the following groups, it is not recommended to use this model and we express regret: 1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail. 2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits. 3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm. 4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters. 5. 
Individuals who finds the generated image content offensive to their values. These are available steps: | Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata | |:---------|:----------|:-----------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------| | 5100 | 0.674 | [Download](5100/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![pattern_2-5100](5100/previews/pattern_2.png) | ![pattern_3-5100](5100/previews/pattern_3.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) | | 4760 | 0.715 | [Download](4760/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | ![pattern_2-4760](4760/previews/pattern_2.png) | ![pattern_3-4760](4760/previews/pattern_3.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to 
see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) | | **4420** | **0.749** | [**Download**](4420/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![pattern_2-4420](4420/previews/pattern_2.png) | ![pattern_3-4420](4420/previews/pattern_3.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) | | 4080 | 0.668 | [Download](4080/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![pattern_2-4080](4080/previews/pattern_2.png) | ![pattern_3-4080](4080/previews/pattern_3.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) | | 3740 | 0.683 | [Download](3740/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-3740](3740/previews/pattern_1.png) | ![pattern_2-3740](3740/previews/pattern_2.png) | ![pattern_3-3740](3740/previews/pattern_3.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) | | 3400 | 
0.589 | [Download](3400/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![pattern_2-3400](3400/previews/pattern_2.png) | ![pattern_3-3400](3400/previews/pattern_3.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) | | 3060 | 0.680 | [Download](3060/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-3060](3060/previews/pattern_1.png) | ![pattern_2-3060](3060/previews/pattern_2.png) | ![pattern_3-3060](3060/previews/pattern_3.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) | | 2720 | 0.618 | [Download](2720/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![pattern_2-2720](2720/previews/pattern_2.png) | ![pattern_3-2720](2720/previews/pattern_3.png) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) | | 2380 | 0.659 | [Download](2380/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | 
![pattern_2-2380](2380/previews/pattern_2.png) | ![pattern_3-2380](2380/previews/pattern_3.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) | | 2040 | 0.610 | [Download](2040/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![pattern_2-2040](2040/previews/pattern_2.png) | ![pattern_3-2040](2040/previews/pattern_3.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) | | 1700 | 0.719 | [Download](1700/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![pattern_2-1700](1700/previews/pattern_2.png) | ![pattern_3-1700](1700/previews/pattern_3.png) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) | | 1360 | 0.654 | [Download](1360/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![pattern_2-1360](1360/previews/pattern_2.png) | ![pattern_3-1360](1360/previews/pattern_3.png) | ![bikini-1360](1360/previews/bikini.png) 
| [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) | | 1020 | 0.675 | [Download](1020/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![pattern_2-1020](1020/previews/pattern_2.png) | ![pattern_3-1020](1020/previews/pattern_3.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) | | 680 | 0.576 | [Download](680/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-680](680/previews/pattern_1.png) | ![pattern_2-680](680/previews/pattern_2.png) | ![pattern_3-680](680/previews/pattern_3.png) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) | | 340 | 0.294 | [Download](340/yoshioka_saki_idolmastercinderellagirls.zip) | ![pattern_1-340](340/previews/pattern_1.png) | ![pattern_2-340](340/previews/pattern_2.png) | ![pattern_3-340](340/previews/pattern_3.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, 
click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
Bogdan63/distilbert-imdb
Bogdan63
2023-09-21T19:45:05Z
107
0
transformers
[ "transformers", "pytorch", "distilbert", "text-classification", "generated_from_trainer", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-10T13:46:45Z
--- license: apache-2.0 base_model: distilbert-base-uncased tags: - generated_from_trainer model-index: - name: distilbert-imdb results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-imdb This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | Recall | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:------:| | No log | 1.0 | 313 | 0.2279 | 0.9112 | 0.9112 | 0.9112 | ### Framework versions - Transformers 4.32.1 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
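The results table tracks accuracy, recall, and F1; a small sketch of how the three relate on a binary confusion matrix (the counts below are illustrative, not from this run):

```python
def binary_metrics(tp, fp, fn, tn):
    # Standard binary-classification metrics from confusion-matrix counts
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, recall, f1

# Illustrative confusion matrix: 90 correct predictions out of 100
acc, rec, f1 = binary_metrics(tp=45, fp=5, fn=5, tn=45)
assert (acc, rec) == (0.9, 0.9) and abs(f1 - 0.9) < 1e-9
```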
Logeswaransr/T5_MineAI_Prototype
Logeswaransr
2023-09-21T19:43:09Z
103
1
transformers
[ "transformers", "pytorch", "t5", "text2text-generation", "generated_from_trainer", "base_model:google/flan-t5-base", "base_model:finetune:google/flan-t5-base", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text2text-generation
2023-09-21T19:37:52Z
--- license: apache-2.0 base_model: google/flan-t5-base tags: - generated_from_trainer metrics: - rouge model-index: - name: results_T5 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # results_T5 This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.8755 - Rouge1: 0.2921 - Rouge2: 0.1519 - Rougel: 0.2857 - Rougelsum: 0.2866 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0003 - train_batch_size: 8 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 25 ### Training results | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:| | No log | 1.0 | 109 | 0.4682 | 0.2349 | 0.0878 | 0.2327 | 0.2344 | | No log | 2.0 | 218 | 0.4153 | 0.2519 | 0.0965 | 0.2481 | 0.2503 | | No log | 3.0 | 327 | 0.4102 | 0.3011 | 0.1465 | 0.2979 | 0.2990 | | No log | 4.0 | 436 | 0.4386 | 0.2555 | 0.1138 | 0.2496 | 0.2496 | | 0.8199 | 5.0 | 545 | 0.4784 | 0.2725 | 0.1188 | 0.2675 | 0.2665 | | 0.8199 | 6.0 | 654 | 0.5088 | 0.2524 | 0.1066 | 0.2497 | 0.2501 | | 0.8199 | 7.0 | 763 | 0.5680 | 0.2542 | 0.1093 | 0.2497 | 0.2496 | | 0.8199 | 8.0 | 872 | 0.5982 | 0.2740 | 0.1375 | 0.2694 | 0.2698 | | 0.8199 | 9.0 | 981 | 0.6575 | 0.2730 | 0.1368 | 0.2723 | 0.2714 | | 0.0653 | 10.0 | 1090 | 0.6753 | 0.2822 | 0.1519 | 0.2798 | 0.2781 | | 0.0653 | 11.0 | 1199 | 0.6923 | 0.2795 | 
0.1486 | 0.2780 | 0.2774 | | 0.0653 | 12.0 | 1308 | 0.7350 | 0.2471 | 0.1209 | 0.2458 | 0.2457 | | 0.0653 | 13.0 | 1417 | 0.7698 | 0.2762 | 0.1463 | 0.2720 | 0.2733 | | 0.0225 | 14.0 | 1526 | 0.7867 | 0.2771 | 0.1372 | 0.2763 | 0.2755 | | 0.0225 | 15.0 | 1635 | 0.8166 | 0.3166 | 0.1689 | 0.3132 | 0.3133 | | 0.0225 | 16.0 | 1744 | 0.8085 | 0.3027 | 0.1572 | 0.2998 | 0.3009 | | 0.0225 | 17.0 | 1853 | 0.8162 | 0.3090 | 0.1734 | 0.3025 | 0.3038 | | 0.0225 | 18.0 | 1962 | 0.8484 | 0.2965 | 0.1627 | 0.2917 | 0.2909 | | 0.0105 | 19.0 | 2071 | 0.8610 | 0.2881 | 0.1487 | 0.2813 | 0.2819 | | 0.0105 | 20.0 | 2180 | 0.8688 | 0.2811 | 0.1494 | 0.2755 | 0.2770 | | 0.0105 | 21.0 | 2289 | 0.8733 | 0.2777 | 0.1453 | 0.2708 | 0.2724 | | 0.0105 | 22.0 | 2398 | 0.8776 | 0.2771 | 0.1475 | 0.2709 | 0.2711 | | 0.0061 | 23.0 | 2507 | 0.8717 | 0.2829 | 0.1467 | 0.2749 | 0.2749 | | 0.0061 | 24.0 | 2616 | 0.8729 | 0.2878 | 0.1467 | 0.2803 | 0.2806 | | 0.0061 | 25.0 | 2725 | 0.8755 | 0.2921 | 0.1519 | 0.2857 | 0.2866 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu117 - Datasets 2.14.5 - Tokenizers 0.13.3
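The ROUGE scores above measure n-gram overlap between generated and reference text; a simplified ROUGE-1 F-measure sketch (real evaluations typically use the `rouge_score` package, which also applies stemming):

```python
from collections import Counter

def rouge1_f(candidate, reference):
    # Unigram overlap between candidate and reference (no stemming, whitespace tokens)
    cand, ref = Counter(candidate.split()), Counter(reference.split())
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# 2 of 3 unigrams match in both directions -> F1 = 2/3
assert abs(rouge1_f("the cat sat", "the cat ran") - 2 / 3) < 1e-9
```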
jgeselowitz/poem_labeler
jgeselowitz
2023-09-21T19:43:00Z
105
0
transformers
[ "transformers", "pytorch", "distilbert", "text-classification", "generated_from_trainer", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-21T19:24:46Z
--- license: apache-2.0 base_model: distilbert-base-uncased tags: - generated_from_trainer metrics: - accuracy model-index: - name: poem_labeler results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # poem_labeler This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.1318 - Accuracy: 0.7315 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.0602 | 1.0 | 2382 | 0.9264 | 0.6935 | | 0.5889 | 2.0 | 4764 | 0.9186 | 0.723 | | 0.2638 | 3.0 | 7146 | 1.1318 | 0.7315 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
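The validation loss reported above is the cross-entropy of the classifier's softmax output; a numerically stable single-example sketch (the logits and target below are illustrative):

```python
import math

def cross_entropy(logits, target):
    # Numerically stable log-sum-exp, then subtract the target logit
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    return log_z - logits[target]

# Two equally likely classes -> loss is ln(2) ≈ 0.693
assert abs(cross_entropy([0.0, 0.0], 0) - math.log(2)) < 1e-9
```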
rfanucchi/ppo-LunarLander-v2
rfanucchi
2023-09-21T19:37:01Z
0
0
stable-baselines3
[ "stable-baselines3", "LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
reinforcement-learning
2023-09-21T19:34:44Z
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
  results:
  - task:
      type: reinforcement-learning
      name: reinforcement-learning
    dataset:
      name: LunarLander-v2
      type: LunarLander-v2
    metrics:
    - type: mean_reward
      value: 248.58 +/- 38.88
      name: mean_reward
      verified: false
---

# **PPO** Agent playing **LunarLander-v2**

This is a trained model of a **PPO** agent playing **LunarLander-v2** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).

## Usage (with Stable-baselines3)

A minimal loading sketch (the checkpoint filename is an assumption; check the repo's file list):

```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# Download the checkpoint from the Hub and load it
checkpoint = load_from_hub(
    repo_id="rfanucchi/ppo-LunarLander-v2",
    filename="ppo-LunarLander-v2.zip",  # assumed filename
)
model = PPO.load(checkpoint)
```
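PPO, the algorithm behind this agent, maximizes a clipped surrogate objective; a per-sample sketch of the clipped term (the 0.2 clip range is the common default, an assumption here):

```python
def ppo_clip_term(ratio, advantage, eps=0.2):
    # Pessimistic minimum of the unclipped and clipped surrogate terms
    unclipped = ratio * advantage
    clipped = max(min(ratio, 1 + eps), 1 - eps) * advantage
    return min(unclipped, clipped)

# Positive advantage: large policy ratios are capped at (1 + eps) * advantage
assert abs(ppo_clip_term(1.5, 1.0) - 1.2) < 1e-9
```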
glaiveai/glaive-coder-7b
glaiveai
2023-09-21T19:35:50Z
1,565
54
transformers
[ "transformers", "pytorch", "llama", "text-generation", "code", "en", "dataset:glaiveai/glaive-code-assistant", "license:llama2", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-17T14:49:44Z
---
license: llama2
datasets:
- glaiveai/glaive-code-assistant
language:
- en
tags:
- code
---

# Glaive-coder-7b

Glaive-coder-7b is a 7B parameter code model trained on a dataset of ~140k programming related problems and solutions generated from Glaive’s synthetic data generation platform. The model is fine-tuned on the CodeLlama-7b model.

## Usage:

The model is trained to act as a code assistant, and can do both single instruction following and multi-turn conversations. It follows the same prompt format as CodeLlama-7b-Instruct:

```
<s>[INST] <<SYS>>
{{ system_prompt }}
<</SYS>>

{{ user_msg }} [/INST] {{ model_answer }} </s>
<s>[INST] {{ user_msg }} [/INST]
```

You can run the model in the following way:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("glaiveai/glaive-coder-7b")
model = AutoModelForCausalLM.from_pretrained("glaiveai/glaive-coder-7b").half().cuda()

def fmt_prompt(prompt):
    return f"<s> [INST] {prompt} [/INST]"

# Example instruction (any prompt string works here)
prompt = "Write a Python function that reverses a string."

inputs = tokenizer(fmt_prompt(prompt), return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, do_sample=True, temperature=0.1, top_p=0.95, max_new_tokens=100)

print(tokenizer.decode(outputs[0], skip_special_tokens=True, clean_up_tokenization_spaces=False))
```

## Benchmarks:

The model achieves a 63.1% pass@1 on HumanEval and a 45.2% pass@1 on MBPP; however, it is evident that these benchmarks are not representative of real-world usage of code models, so we are launching the [Code Models Arena](https://arena.glaive.ai/) to let users vote on model outputs so we can have a better understanding of user preference on code models and come up with new and better benchmarks. We plan to release the Arena results as soon as we have a sufficient amount of data.

Join the Glaive [discord](https://discord.gg/fjQ4uf3yWD) for improvement suggestions, bug reports and collaborating on more open-source projects.
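For multi-turn use, the prompt template above can be assembled programmatically; a sketch (the helper name and newline conventions are our own, following the format shown):

```python
def build_prompt(system_prompt, turns):
    """Assemble the multi-turn prompt format shown above.

    `turns` is a list of (user_msg, model_answer) pairs; pass model_answer=None
    for the final, open turn that the model should complete.
    """
    prompt = ""
    for i, (user_msg, model_answer) in enumerate(turns):
        if i == 0 and system_prompt:
            # The system prompt is wrapped into the first user turn
            user_msg = f"<<SYS>>\n{system_prompt}\n<</SYS>>\n\n{user_msg}"
        prompt += f"<s>[INST] {user_msg} [/INST]"
        if model_answer is not None:
            prompt += f" {model_answer} </s>"
    return prompt

assert build_prompt(None, [("hi", None)]) == "<s>[INST] hi [/INST]"
```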
NousResearch/Llama-2-70b-chat-hf
NousResearch
2023-09-21T19:05:17Z
625
19
transformers
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "facebook", "meta", "llama-2", "en", "autotrain_compatible", "text-generation-inference", "region:us" ]
text-generation
2023-07-19T04:36:22Z
--- extra_gated_heading: Access Llama 2 on Hugging Face extra_gated_description: >- This is a form to enable access to Llama 2 on Hugging Face after you have been granted access from Meta. Please visit the [Meta website](https://ai.meta.com/resources/models-and-libraries/llama-downloads) and accept our license terms and acceptable use policy before submitting this form. Requests will be processed in 1-2 days. extra_gated_prompt: "**Your Hugging Face account email address MUST match the email you provide on the Meta website, or your request will not be approved.**" extra_gated_button_content: Submit extra_gated_fields: I agree to share my name, email address and username with Meta and confirm that I have already been granted download access on the Meta website: checkbox language: - en pipeline_tag: text-generation inference: false tags: - facebook - meta - pytorch - llama - llama-2 --- # **Llama 2** Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 70B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom. ## Model Details *Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.* Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. 
Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM. **Model Developers** Meta **Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations. **Input** Models input text only. **Output** Models generate text only. **Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety. ||Training Data|Params|Content Length|GQA|Tokens|LR| |---|---|---|---|---|---|---| |Llama 2|*A new mix of publicly available online data*|7B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>| |Llama 2|*A new mix of publicly available online data*|13B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>| |Llama 2|*A new mix of publicly available online data*|70B|4k|&#10004;|2.0T|1.5 x 10<sup>-4</sup>| *Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch-size of 4M tokens. Bigger models - 70B -- use Grouped-Query Attention (GQA) for improved inference scalability. **Model Dates** Llama 2 was trained between January 2023 and July 2023. **Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback. **License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) ## Intended Use **Intended Use Cases** Llama 2 is intended for commercial and research use in English. 
Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.

**Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.

## Hardware and Software

**Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.

**Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.

||Time (GPU hours)|Power Consumption (W)|Carbon Emitted(tCO<sub>2</sub>eq)|
|---|---|---|---|
|Llama 2 7B|184320|400|31.22|
|Llama 2 13B|368640|400|62.44|
|Llama 2 70B|1720320|400|291.42|
|Total|3311616||539.00|

**CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.

## Training Data

**Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.

**Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.
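Each row of the carbon table above is energy (GPU hours × device power) times a grid carbon intensity; a sketch that approximately reproduces the table (the 0.423 kg CO<sub>2</sub>/kWh intensity is an assumed US-average figure, not stated in this card):

```python
def tco2eq(gpu_hours, watts, kg_co2_per_kwh=0.423):
    # kWh consumed, times grid carbon intensity (assumed), converted to tonnes
    kwh = gpu_hours * watts / 1000
    return kwh * kg_co2_per_kwh / 1000

# Approximately reproduces the table rows above
assert abs(tco2eq(184320, 400) - 31.22) < 0.1    # Llama 2 7B
assert abs(tco2eq(368640, 400) - 62.44) < 0.1    # Llama 2 13B
```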
## Evaluation Results In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks. For all the evaluations, we use our internal evaluations library. |Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval| |---|---|---|---|---|---|---|---|---|---| |Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9| |Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9| |Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7| |Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6| |Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3| |Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1| |Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**| **Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *Math:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1. |||TruthfulQA|Toxigen| |---|---|---|---| |Llama 1|7B|27.42|23.00| |Llama 1|13B|41.74|23.08| |Llama 1|33B|44.19|22.57| |Llama 1|65B|48.71|21.77| |Llama 2|7B|33.29|**21.25**| |Llama 2|13B|41.86|26.10| |Llama 2|70B|**50.18**|24.60| **Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better). 
|||TruthfulQA|Toxigen| |---|---|---|---| |Llama-2-Chat|7B|57.04|**0.00**| |Llama-2-Chat|13B|62.18|**0.00**| |Llama-2-Chat|70B|**64.14**|0.01| **Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above. ## Ethical Considerations and Limitations Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model. Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide) ## Reporting Issues Please report any software “bug,” or other problems with the models through one of the following means: - Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama) - Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback) - Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info) ## Llama Model Index |Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf| |---|---|---|---|---| |7B| [Link](https://huggingface.co/llamaste/Llama-2-7b) | [Link](https://huggingface.co/llamaste/Llama-2-7b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-7b-chat-hf)| |13B| [Link](https://huggingface.co/llamaste/Llama-2-13b) | [Link](https://huggingface.co/llamaste/Llama-2-13b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-13b-chat) | 
[Link](https://huggingface.co/llamaste/Llama-2-13b-chat-hf)| |70B| [Link](https://huggingface.co/llamaste/Llama-2-70b) | [Link](https://huggingface.co/llamaste/Llama-2-70b-hf) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat) | [Link](https://huggingface.co/llamaste/Llama-2-70b-chat-hf)|
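The chat variants expect a specific prompt template at inference time. This card does not spell it out, so the sketch below is an assumption based on the commonly used Llama-2-chat convention (`[INST]`/`<<SYS>>` markers), not something stated above:

```python
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def build_chat_prompt(system: str, user: str) -> str:
    """Wrap a system prompt and a single user turn in the assumed Llama-2-chat template."""
    return f"{B_INST} {B_SYS}{system}{E_SYS}{user} {E_INST}"

prompt = build_chat_prompt("You are a helpful assistant.",
                           "What is Grouped-Query Attention?")
```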
mirfan899/uner-bert-ner
mirfan899
2023-09-21T18:53:09Z
115
0
transformers
[ "transformers", "pytorch", "bert", "token-classification", "generated_from_trainer", "base_model:google-bert/bert-base-multilingual-cased", "base_model:finetune:google-bert/bert-base-multilingual-cased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2023-09-21T18:52:35Z
--- license: apache-2.0 base_model: bert-base-multilingual-cased tags: - generated_from_trainer metrics: - precision - recall - f1 - accuracy model-index: - name: uner-bert-ner results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # uner-bert-ner This model is a fine-tuned version of [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.1354 - Precision: 0.8267 - Recall: 0.8707 - F1: 0.8481 - Accuracy: 0.9640 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | No log | 1.0 | 144 | 0.1496 | 0.7687 | 0.7971 | 0.7826 | 0.9533 | | No log | 2.0 | 288 | 0.1429 | 0.7719 | 0.8584 | 0.8129 | 0.9573 | | No log | 3.0 | 432 | 0.1267 | 0.8014 | 0.8682 | 0.8335 | 0.9629 | | 0.1628 | 4.0 | 576 | 0.1316 | 0.8206 | 0.8723 | 0.8457 | 0.9644 | | 0.1628 | 5.0 | 720 | 0.1354 | 0.8267 | 0.8707 | 0.8481 | 0.9640 | ### Framework versions - Transformers 4.33.0 - Pytorch 2.0.0 - Datasets 2.14.5 - Tokenizers 0.13.3
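The reported F1 is simply the harmonic mean of the precision and recall above; as a quick sanity check against the final-epoch row of the table:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Final-epoch values from the evaluation above
print(round(f1_score(0.8267, 0.8707), 4))  # → 0.8481, matching the reported F1
```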
jgeselowitz/old_poetry
jgeselowitz
2023-09-21T18:43:42Z
161
0
transformers
[ "transformers", "pytorch", "distilbert", "text-classification", "generated_from_trainer", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-21T18:43:28Z
--- license: apache-2.0 base_model: distilbert-base-uncased tags: - generated_from_trainer metrics: - accuracy model-index: - name: test_trainer results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # test_trainer This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.1338 - Accuracy: 0.731 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.0802 | 1.0 | 2382 | 0.9411 | 0.698 | | 0.6109 | 2.0 | 4764 | 0.9286 | 0.728 | | 0.2759 | 3.0 | 7146 | 1.1338 | 0.731 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
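With `lr_scheduler_type: linear`, the learning rate decays linearly from 5e-05 toward 0 over the 7146 training steps. A minimal sketch of that schedule (assuming zero warmup steps, which the card does not state explicitly):

```python
def linear_lr(step: int, base_lr: float = 5e-5, total_steps: int = 7146) -> float:
    """Linearly decayed learning rate with no warmup."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# After the first epoch (2382 steps), two thirds of the schedule remains
print(linear_lr(0), linear_lr(2382), linear_lr(7146))
```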
mirfan899/uner-roberta-ner
mirfan899
2023-09-21T18:42:07Z
103
1
transformers
[ "transformers", "pytorch", "xlm-roberta", "token-classification", "generated_from_trainer", "base_model:FacebookAI/xlm-roberta-base", "base_model:finetune:FacebookAI/xlm-roberta-base", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2023-09-21T18:40:21Z
--- license: mit base_model: xlm-roberta-base tags: - generated_from_trainer metrics: - precision - recall - f1 - accuracy model-index: - name: uner-roberta-ner results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # uner-roberta-ner This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0930 - Precision: 0.8622 - Recall: 0.9010 - F1: 0.8812 - Accuracy: 0.9728 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | No log | 1.0 | 144 | 0.1285 | 0.8005 | 0.8241 | 0.8121 | 0.9589 | | No log | 2.0 | 288 | 0.1142 | 0.8142 | 0.8748 | 0.8434 | 0.9655 | | No log | 3.0 | 432 | 0.0962 | 0.8485 | 0.8985 | 0.8728 | 0.9702 | | 0.1923 | 4.0 | 576 | 0.0916 | 0.8543 | 0.9018 | 0.8774 | 0.9719 | | 0.1923 | 5.0 | 720 | 0.0930 | 0.8622 | 0.9010 | 0.8812 | 0.9728 | ### Framework versions - Transformers 4.33.0 - Pytorch 2.0.0 - Datasets 2.14.5 - Tokenizers 0.13.3
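NER metrics like the precision/recall/F1 reported above are conventionally computed at the entity level (seqeval-style) rather than per token: a prediction counts only if the whole span and its type match. A minimal sketch of that idea, using hypothetical BIO tags (this is illustrative, not the exact evaluation code used for this model):

```python
def entity_spans(tags):
    """Extract (start, end, type) spans from a BIO tag sequence."""
    spans, start, etype = [], None, None
    for i, tag in enumerate(tags + ["O"]):  # sentinel "O" flushes the last span
        if tag.startswith("B-") or tag == "O":
            if start is not None:
                spans.append((start, i, etype))
                start, etype = None, None
            if tag.startswith("B-"):
                start, etype = i, tag[2:]
        # "I-" tags simply extend the current span
    return set(spans)

def entity_f1(gold, pred):
    """Entity-level F1: a span counts only on an exact boundary-and-type match."""
    g, p = entity_spans(gold), entity_spans(pred)
    tp = len(g & p)
    precision = tp / len(p) if p else 0.0
    recall = tp / len(g) if g else 0.0
    return 2 * precision * recall / (precision + recall) if tp else 0.0

gold = ["B-PER", "I-PER", "O", "B-LOC"]
pred = ["B-PER", "I-PER", "O", "B-ORG"]
print(entity_f1(gold, pred))  # one of two entities matched exactly
```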
jondurbin/airoboros-l2-7b-2.2.1
jondurbin
2023-09-21T18:39:31Z
1,424
3
transformers
[ "transformers", "pytorch", "llama", "text-generation", "dataset:jondurbin/airoboros-2.2.1", "license:llama2", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-20T17:26:33Z
--- license: llama2 datasets: - jondurbin/airoboros-2.2.1 --- ### Overview Another experimental model, using mostly synthetic data generated by [airoboros](https://github.com/jondurbin/airoboros) This is essentially a minor "fix" branch of [airoboros-l2-7b-2.2](https://hf.co/jondurbin/airoboros-l2-7b-2.2) with a few updates, primarily: - [re-generated writing responses](https://huggingface.co/datasets/jondurbin/airoboros-2.2.1#re-generated-writing-responses) - [longer contextual blocks](https://huggingface.co/datasets/jondurbin/airoboros-2.2.1#longer-contextual-blocks) - [removal of "rp" data](https://huggingface.co/datasets/jondurbin/airoboros-2.2.1#rp-category-removed) This is a fairly general purpose model, but focuses heavily on instruction following, rather than casual chat/roleplay. Huge thank you to the folks over at [a16z](https://a16z.com/) for sponsoring the costs associated with building models and associated tools! ### Prompt format The prompt format: ``` A chat. USER: {prompt} ASSISTANT: ``` The default system prompt ("A chat.") was used for most of the prompts; however, it also included a wide sampling of responses with other prompts, particularly in "stylized\_response", "rp", "gtkm", etc. Here's another example: ``` A chat between Bob (aka USER) and Tom (aka ASSISTANT). Tom is an extremely intelligent 18th century bookkeeper, who speaks loquaciously. USER: {prompt} ASSISTANT: ``` And a chat scenario that wouldn't require USER/ASSISTANT (but should use stopping criteria to prevent the model from speaking on your behalf): ``` A chat between old friends: Timmy and Tommy. {description of characters} {setting for the chat} Timmy: *takes a big sip from his coffee* "Ah, sweet, delicious, magical coffee." 
Tommy: ``` __*I strongly suggest adding stopping criteria/early inference stopping on "USER:", and/or whatever names you specify in the system prompt.*__ ### Fine tuning info https://wandb.ai/jondurbin/airoboros-l2-7b-2.2.1/runs/ka6jlcj7?workspace=user-jondurbin ### Helpful usage tips *The prompts shown here are just the text that would be included after USER: and before ASSISTANT: in the full prompt format above; the system prompt and USER:/ASSISTANT: have been omitted for readability.* #### Context obedient question answering By obedient, I mean the model was trained to ignore what it thinks it knows, and uses the context to answer the question. The model was also tuned to limit the values to the provided context as much as possible to reduce hallucinations. The format for a closed-context prompt is as follows: ``` BEGININPUT BEGINCONTEXT [key0: value0] [key1: value1] ... other metadata ... ENDCONTEXT [insert your text blocks here] ENDINPUT [add as many other blocks, in the exact same format] BEGININSTRUCTION [insert your instruction(s). The model was tuned with single questions, paragraph format, lists, etc.] ENDINSTRUCTION ``` It's also helpful to add "Don't make up answers if you don't know." to your instruction block to make sure that, if the context is completely unrelated, the model doesn't make something up. *The __only__ prompts that need this closed context formatting are closed-context instructions. Normal questions/instructions do not!* I know it's a bit verbose and annoying, but after much trial and error, using these explicit delimiters helps the model understand where to find the responses and how to associate specific sources with it. 
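The closed-context format is mechanical enough to assemble programmatically. A small helper sketch (the function and its names are illustrative, not part of airoboros itself):

```python
def closed_context_prompt(blocks, instruction):
    """Assemble a closed-context prompt from (metadata dict, text) pairs and an instruction."""
    parts = []
    for metadata, text in blocks:
        ctx = "\n".join(f"{k}: {v}" for k, v in metadata.items())
        parts.append(f"BEGININPUT\nBEGINCONTEXT\n{ctx}\nENDCONTEXT\n{text}\nENDINPUT")
    parts.append(f"BEGININSTRUCTION\n{instruction}\nENDINSTRUCTION")
    return "\n".join(parts)

prompt = closed_context_prompt(
    [({"date": "2021-01-01", "url": "https://web.site/123"},
      "In a shocking turn of events, blueberries are now green, "
      "but will be sticking with the same name.")],
    "What color are blueberries? Source?",
)
```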
- `BEGININPUT` - denotes a new input block - `BEGINCONTEXT` - denotes the block of context (metadata key/value pairs) to associate with the current input block - `ENDCONTEXT` - denotes the end of the metadata block for the current input - [text] - Insert whatever text you want for the input block, as many paragraphs as can fit in the context. - `ENDINPUT` - denotes the end of the current input block - [repeat as many input blocks in this format as you want] - `BEGININSTRUCTION` - denotes the start of the instruction(s) to respond to for all of the input blocks above. - [instruction(s)] - `ENDINSTRUCTION` - denotes the end of the instruction set It sometimes works without `ENDINSTRUCTION`, but by explicitly including that in the prompt, the model better understands that all of the instructions in the block should be responded to. Here's a trivial, but important example to prove the point: ``` BEGININPUT BEGINCONTEXT date: 2021-01-01 url: https://web.site/123 ENDCONTEXT In a shocking turn of events, blueberries are now green, but will be sticking with the same name. ENDINPUT BEGININSTRUCTION What color are blueberries? Source? ENDINSTRUCTION ``` And the response: ``` Blueberries are now green. Source: date: 2021-01-01 url: https://web.site/123 ``` #### Summarization 500 samples have been included from [this dataset](https://huggingface.co/datasets/mattpscott/airoboros-summarization), using the same format as contextual question answering, for example: ``` BEGININPUT {text to summarize} ENDINPUT BEGININSTRUCTION Summarize the input in around 130 words. ENDINSTRUCTION ``` #### Getting longer responses You can use a few techniques to get longer responses. Detailed prompts, with explicit instruction for word count: ``` Please compose a narrative set in the heart of an ancient library, steeped in the scent of old parchment and ink. The protagonist should be a young scholar who is dedicated to studying the art of storytelling and its evolution throughout history. 
In her pursuit of knowledge, she stumbles upon a forgotten tome that seems to possess an unusual aura. This book has the ability to bring stories to life, literally manifesting characters and scenarios from within its pages into reality. The main character must navigate through various epochs of storytelling - from oral traditions of tribal societies, through medieval minstrels' tales, to modern-day digital narratives - as they come alive around her. Each era presents its unique challenges and lessons about the power and impact of stories on human civilization. One such character could be a sentient quill pen, who was once used by renowned authors of yesteryears and now holds their wisdom and experiences. It becomes her mentor, guiding her through this journey with witty remarks and insightful commentary. Ensure that your tale encapsulates the thrill of adventure, the beauty of learning, and the profound connection between humans and their stories. All characters involved should be non-human entities. Feel free to explore creative liberties but maintain the mentioned elements. Your response should be approximately 2300 words. ``` Or, a simpler example: ``` Please create a long, detailed story about a dragon in an old growth forest who, for some reason, begins speaking the words of the source code of linux. ``` #### Coding You can ask for fairly complex coding instructions with multiple criteria, e.g.: ``` Create a python application with the following requirements: - Asyncio FastAPI webserver - ping endpoint that returns the current date in JSON format - file upload endpoint, which calculates the file's sha256 checksum, and checks postgres to deduplicate ``` Or inline criteria: ``` Write a multi-threaded TCP server in C that accepts a "GET [key]" input and "SET [key] [value]" input, and uses a binary tree to get and store the input values. 
``` You can also optionally add a single space and "PLAINFORMAT" at the end of your prompt to avoid backticks, explanations, etc. and just print the code, e.g.: ``` Write a websocket application in node.js. PLAINFORMAT ``` #### Agent/function calling The dataset includes many examples of function/args generation based on input criteria. This is somewhat similar to the OpenAI function calling, but the output is either JSON or YAML. Example prompt: ``` As an AI assistant, please select the most suitable function and parameters from the list of available functions below, based on the user's input. Provide your response in JSON format. Input: I want to know how many times 'Python' is mentioned in my text file. Available functions: file_analytics: description: This tool performs various operations on a text file. params: action: The operation we want to perform on the data, such as "count_occurrences", "find_line", etc. filters: keyword: The word or phrase we want to search for. ``` Response: ```json { "function": "file_analytics", "params": { "action": "count_occurrences", "filters": { "keyword": "Python" } } } ``` #### Chain-of-thought You can ask for several possible responses to a given problem, with a ranking and final answer selection. Example prompt: ``` A group of people decide to play a round-robin tournament where every player plays every other player exactly once. If a total of 45 games were played, how many players participated in the tournament? Offer several potential answers, rank them according to mathematical logic, and pick the most feasible one. ``` Example response: ``` Solution 1: In a round-robin tournament, each player plays every other player exactly once. This means that the total number of games played is equal to the combination of all players taken two at a time (since a game involves two players). The formula for combinations is nC2 = n(n-1)/2, where n is the total number of players. 
We know that nC2 equals 45 (the total number of games), so we can set up the equation as follows: n(n-1)/2 = 45 Solving this equation gives us n=10. Final answer: There were 10 players in the tournament. Solution 2: Another way to approach this problem is by using the concept of triangular numbers. Triangular numbers are a sequence of numbers where each number can be represented as a triangle with dots. The nth triangular number is given by the sum of the first n natural numbers. If we consider each game as a dot and each player as a side of the triangle, then the total number of games (45) would correspond to the 9th triangular number because 1+2+3+4+5+6+7+8+9=45. However, since each side of the triangle represents a player, and there's one more player than sides in our model (because the last player has no one left to play against), we need to add one more to get the total number of players. Final answer: There were 10 players in the tournament. Solution 3: We could also use trial and error to solve this problem. Starting from a small number of players, we calculate how many games would be played in a round-robin tournament until we reach 45 games. For example, with 2 players, only 1 game is played; with 3 players, 3 games are played; with 4 players, 6 games are played, and so on. Continuing this process, we find that with 10 players, 45 games are played. Final answer: There were 10 players in the tournament. Ranking of solutions from best to worst: 1. Solution 1: This solution uses direct mathematical reasoning based on the properties of a round-robin tournament and solves an equation to find the answer. 2. Solution 2: This solution uses the concept of triangular numbers which requires additional steps and knowledge beyond the immediate problem context, but still arrives at the correct answer. 3. Solution 3: This solution relies on trial and error, which is less efficient and reliable than the other methods, but it eventually gets to the right answer. 
Best and final answer: There were 10 players in the tournament. ``` #### reWOO style execution planning The model now supports execution planning for complex instructions that would require making use of several tools. The output is just the plan, you must implement a mechanism to parse the output and actually call the functions! Example prompt: ``` Please construct a systematic plan to generate an optimal response to the user instruction, utilizing a set of provided tools. Each plan will correspond to an evidence value, which will be the output of one of the available functions given an input string that could be the user's question, one or more prior evidence values, or a combination of both. Here are the tools available to be called: DuckDuckGo[input]: Tool that enables the user to search for information using the DuckDuckGo search engine. This tool is helpful in locating a suitable list of sites that may contain the answer to the user's query. It does not directly answer the question but finds a list of sites that might have the answer. TextScraper[input]: Load one or more websites from the input string containing newline delimited links, where input is one or more links, and produces plain text output containing the content of the links. HyperlinkExtractor[input]: Extract links from plain text and produces a plain text, newline delimited response of links. KnowledgeModel[input]: Question answering language model, particularly useful in answering questions based on an input passage of text. The input must be a text question that references an :evidence[n]: variable, e.g. What color is the cat, given :evidence1:? The output should be in format: Plan: [first action to take based on input question] :evidence0: = [function to call with input parameter] Plan: [next action to take, based on result of :evidence0:] :evidence1: = [next function to call and input parameter, which may include reference to previous evidence, e.g. "Given context :evidence0"] ... 
Answer: [:evidence[n]: containing the final answer.] Question: Who won the Nobel Prize in Literature in 2020 and what is their most famous work? ``` Response: ``` Plan: Begin by conducting a web search to find out who won the Nobel Prize in Literature in 2020. :evidence0: = DuckDuckGo[Nobel Prize in Literature 2020 winner] Plan: Utilize the KnowledgeModel to interpret the search results and find the name of the laureate. :evidence1: = KnowledgeModel[Who won the Nobel Prize in Literature in 2020, given :evidence0:?] Plan: Conduct another web search to find the most famous work of the identified laureate. :evidence2: = DuckDuckGo[Most famous work of :evidence1:] Plan: Extract the relevant links from the DuckDuckGo search results for a more focused search. :evidence3: = HyperlinkExtractor[:evidence2:] Plan: Use the TextScraper tool to extract information from the relevant links. :evidence4: = TextScraper[:evidence3:] Plan: Finally, utilize the KnowledgeModel to identify and summarize the most famous work of the laureate from the extracted information. :evidence5: = KnowledgeModel[What is the most famous work of :evidence1:, given :evidence4:?] Answer: :evidence5: ``` For this to be useful, you'd have to parse the output plan text, and implement/call each of the functions. This is just pseudo-code, completely untested off the top of my head, and obviously would require full implementation + hardening: ```python import re import requests def inject_context(input_text, **context): for ref in set(re.findall(r"(:evidence[0-9]+:)", input_text, re.I)): input_text = input_text.replace(ref, context.get(ref, "")) return input_text def duckduckgo(input_text, **context): search_string = inject_context(input_text, **context) ... search via duck duck go using search_string ... 
return text content def link_extractor(input_text, **context): input_text = inject_context(input_text, **context) return "\n".join(list(set(re.findall(r"(https?://[^\s]+?\.?)", input_text, re.I)))) def scrape(input_text, **context): input_text = inject_context(input_text, **context) text = [] for link in input_text.splitlines(): text.append(requests.get(link).text) return "\n".join(text) def infer(input_text, **context): prompt = inject_context(input_text, **context) ... call model with prompt, return output def parse_plan(plan): method_map = { "DuckDuckGo": duckduckgo, "HyperlinkExtractor": link_extractor, "KnowledgeModel": infer, "TextScraper": scrape, } context = {} for line in plan.strip().splitlines(): if line.startswith("Plan:"): print(line) continue parts = re.match(r"^(:evidence[0-9]+:)\s*=\s*([^\[]+)(\[.*\])\s*$", line, re.I) if not parts: if line.startswith("Answer: "): return context.get(line.split(" ")[-1].strip(), "Answer couldn't be generated...") raise RuntimeError("bad format: " + line) context[parts.group(1)] = method_map[parts.group(2)](parts.group(3), **context) ``` ### Contribute If you're interested in new functionality, particularly a new "instructor" type to generate a specific type of training data, take a look at the dataset generation tool repo: https://github.com/jondurbin/airoboros and either make a PR or open an issue with details. To help me with the OpenAI/compute costs: - https://bmc.link/jondurbin - ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11 - BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf ### Licence and usage restrictions The airoboros 2.2 models are built on top of llama-2/codellama. The llama-2 base model has a custom Meta license: - See the [meta-license/LICENSE.txt](meta-license/LICENSE.txt) file attached for the original license provided by Meta. - See also [meta-license/USE_POLICY.md](meta-license/USE_POLICY.md) and [meta-license/Responsible-Use-Guide.pdf](meta-license/Responsible-Use-Guide.pdf), also provided by Meta. 
The fine-tuning data was mostly generated by OpenAI API calls to gpt-4, via [airoboros](https://github.com/jondurbin/airoboros) The ToS for OpenAI API usage has a clause preventing the output from being used to train a model that __competes__ with OpenAI - what does *compete* actually mean here? - these small open source models will not produce output anywhere near the quality of gpt-4, or even gpt-3.5, so I can't imagine this could credibly be considered competing in the first place - if someone else uses the dataset to do the same, they wouldn't necessarily be violating the ToS because they didn't call the API, so I don't know how that works - the training data used in essentially all large language models includes a significant amount of copyrighted or otherwise non-permissive licensing in the first place - other work using the self-instruct method, e.g. the original here: https://github.com/yizhongw/self-instruct released the data and model as apache-2 I am purposely leaving this license ambiguous (other than the fact you must comply with the Meta original license for llama-2) because I am not a lawyer and refuse to attempt to interpret all of the terms accordingly. Your best bet is probably to avoid using this commercially due to the OpenAI API usage. Either way, by using this model, you agree to completely indemnify me.
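For completeness, the plain chat format at the top of this card can likewise be rendered with a tiny helper (the helper itself is illustrative, not part of airoboros); remember to add stopping criteria on "USER:" as suggested above:

```python
def airoboros_prompt(user: str, system: str = "A chat.") -> str:
    """Render the 'system / USER: ... / ASSISTANT:' prompt format used by this model."""
    return f"{system}\nUSER: {user}\nASSISTANT:"
```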
jondurbin/airoboros-l2-13b-2.2.1
jondurbin
2023-09-21T18:39:18Z
1,474
3
transformers
[ "transformers", "pytorch", "llama", "text-generation", "dataset:jondurbin/airoboros-2.2.1", "license:llama2", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-20T18:35:28Z
--- license: llama2 datasets: - jondurbin/airoboros-2.2.1 --- ### Overview Another experimental model, using mostly synthetic data generated by [airoboros](https://github.com/jondurbin/airoboros) This is essentially a minor "fix" branch of [airoboros-l2-13b-2.2](https://hf.co/jondurbin/airoboros-l2-13b-2.2) with a few updates, primarily: - [re-generated writing responses](https://huggingface.co/datasets/jondurbin/airoboros-2.2.1#re-generated-writing-responses) - [longer contextual blocks](https://huggingface.co/datasets/jondurbin/airoboros-2.2.1#longer-contextual-blocks) - [removal of "rp" data](https://huggingface.co/datasets/jondurbin/airoboros-2.2.1#rp-category-removed) - [(less aggressive) de-censoring](https://huggingface.co/datasets/jondurbin/airoboros-2.2.1#de-censoring) - more fine-tuning epochs This is a fairly general purpose model, but focuses heavily on instruction following, rather than casual chat/roleplay. Huge thank you to the folks over at [a16z](https://a16z.com/) for sponsoring the costs associated with building models and associated tools! ### Prompt format The prompt format: ``` A chat. USER: {prompt} ASSISTANT: ``` The default system prompt ("A chat.") was used for most of the prompts; however, it also included a wide sampling of responses with other prompts, particularly in "stylized\_response", "rp", "gtkm", etc. Here's another example: ``` A chat between Bob (aka USER) and Tom (aka ASSISTANT). Tom is an extremely intelligent 18th century bookkeeper, who speaks loquaciously. USER: {prompt} ASSISTANT: ``` And a chat scenario that wouldn't require USER/ASSISTANT (but should use stopping criteria to prevent the model from speaking on your behalf): ``` A chat between old friends: Timmy and Tommy. {description of characters} {setting for the chat} Timmy: *takes a big sip from his coffee* "Ah, sweet, delicious, magical coffee." 
Tommy: ``` __*I strongly suggest adding stopping criteria/early inference stopping on "USER:", and/or whatever names you specify in the system prompt.*__ ### Fine tuning info https://wandb.ai/jondurbin/airoboros-l2-13b-2.2.1/runs/zbz8mgaz?workspace=user-jondurbin ### Helpful usage tips *The prompts shown here are just the text that would be included after USER: and before ASSISTANT: in the full prompt format above; the system prompt and USER:/ASSISTANT: have been omitted for readability.* #### Context obedient question answering By obedient, I mean the model was trained to ignore what it thinks it knows, and uses the context to answer the question. The model was also tuned to limit the values to the provided context as much as possible to reduce hallucinations. The format for a closed-context prompt is as follows: ``` BEGININPUT BEGINCONTEXT [key0: value0] [key1: value1] ... other metadata ... ENDCONTEXT [insert your text blocks here] ENDINPUT [add as many other blocks, in the exact same format] BEGININSTRUCTION [insert your instruction(s). The model was tuned with single questions, paragraph format, lists, etc.] ENDINSTRUCTION ``` It's also helpful to add "Don't make up answers if you don't know." to your instruction block to make sure that, if the context is completely unrelated, the model doesn't make something up. *The __only__ prompts that need this closed context formatting are closed-context instructions. Normal questions/instructions do not!* I know it's a bit verbose and annoying, but after much trial and error, using these explicit delimiters helps the model understand where to find the responses and how to associate specific sources with it. 
- `BEGININPUT` - denotes a new input block - `BEGINCONTEXT` - denotes the block of context (metadata key/value pairs) to associate with the current input block - `ENDCONTEXT` - denotes the end of the metadata block for the current input - [text] - Insert whatever text you want for the input block, as many paragraphs as can fit in the context. - `ENDINPUT` - denotes the end of the current input block - [repeat as many input blocks in this format as you want] - `BEGININSTRUCTION` - denotes the start of the instruction(s) (one or more) to respond to for all of the input blocks above. - [instruction(s)] - `ENDINSTRUCTION` - denotes the end of the instruction set It sometimes works without `ENDINSTRUCTION`, but by explicitly including that in the prompt, the model better understands that all of the instructions in the block should be responded to. Here's a trivial, but important example to prove the point: ``` BEGININPUT BEGINCONTEXT date: 2021-01-01 url: https://web.site/123 ENDCONTEXT In a shocking turn of events, blueberries are now green, but will be sticking with the same name. ENDINPUT BEGININSTRUCTION What color are blueberries? Source? ENDINSTRUCTION ``` And the response: ``` Blueberries are now green. Source: date: 2021-01-01 url: https://web.site/123 ``` #### Summarization 500 samples have been included from [this dataset](https://huggingface.co/datasets/mattpscott/airoboros-summarization), using the same format as contextual question answering, for example: ``` BEGININPUT {text to summarize} ENDINPUT BEGININSTRUCTION Summarize the input in around 130 words. ENDINSTRUCTION ``` #### Getting longer responses You can use a few techniques to get longer responses. Detailed prompts, with explicit instruction for word count: ``` Please compose a narrative set in the heart of an ancient library, steeped in the scent of old parchment and ink. The protagonist should be a young scholar who is dedicated to studying the art of storytelling and its evolution throughout history. 
In her pursuit of knowledge, she stumbles upon a forgotten tome that seems to possess an unusual aura. This book has the ability to bring stories to life, literally manifesting characters and scenarios from within its pages into reality. The main character must navigate through various epochs of storytelling - from oral traditions of tribal societies, through medieval minstrels' tales, to modern-day digital narratives - as they come alive around her. Each era presents its unique challenges and lessons about the power and impact of stories on human civilization. One such character could be a sentient quill pen, who was once used by renowned authors of yesteryears and now holds their wisdom and experiences. It becomes her mentor, guiding her through this journey with witty remarks and insightful commentary. Ensure that your tale encapsulates the thrill of adventure, the beauty of learning, and the profound connection between humans and their stories. All characters involved should be non-human entities. Feel free to explore creative liberties but maintain the mentioned elements. Your response should be approximately 2300 words. ``` Or, a simpler example: ``` Please create a long, detailed story about a dragon in an old growth forest who, for some reason, begins speaking the words of the source code of linux. ``` #### Coding You can ask for fairly complex coding instructions with multiple criteria, e.g.: ``` Create a python application with the following requirements: - Asyncio FastAPI webserver - ping endpoint that returns the current date in JSON format - file upload endpoint, which calculates the file's sha256 checksum, and checks postgres to deduplicate ``` Or inline criteria: ``` Write a multi-threaded TCP server in C that accepts a "GET [key]" input and "SET [key] [value]" input, and uses a binary tree to get and store the input values. 
``` You can also optionally add a single space and "PLAINFORMAT" at the end of your prompt to avoid backticks, explanations, etc. and just print the code, e.g.: ``` Write a websocket application in node.js. PLAINFORMAT ``` #### Agent/function calling The dataset includes many examples of function/args generation based on input criteria. This is somewhat similar to the OpenAI function calling, but the output is either JSON or YAML. Example prompt: ``` As an AI assistant, please select the most suitable function and parameters from the list of available functions below, based on the user's input. Provide your response in JSON format. Input: I want to know how many times 'Python' is mentioned in my text file. Available functions: file_analytics: description: This tool performs various operations on a text file. params: action: The operation we want to perform on the data, such as "count_occurrences", "find_line", etc. filters: keyword: The word or phrase we want to search for. ``` Response: ```json { "function": "file_analytics", "params": { "action": "count_occurrences", "filters": { "keyword": "Python" } } } ``` #### Chain-of-thought You can ask for several possible responses to a given problem, with a ranking and final answer selection. Example prompt: ``` A group of people decide to play a round-robin tournament where every player plays every other player exactly once. If a total of 45 games were played, how many players participated in the tournament? Offer several potential answers, rank them according to mathematical logic, and pick the most feasible one. ``` Example response: ``` Solution 1: In a round-robin tournament, each player plays every other player exactly once. This means that the total number of games played is equal to the combination of all players taken two at a time (since a game involves two players). The formula for combinations is nC2 = n(n-1)/2, where n is the total number of players. 
We know that nC2 equals 45 (the total number of games), so we can set up the equation as follows: n(n-1)/2 = 45 Solving this equation gives us n=10. Final answer: There were 10 players in the tournament. Solution 2: Another way to approach this problem is by using the concept of triangular numbers. Triangular numbers are a sequence of numbers where each number can be represented as a triangle with dots. The nth triangular number is given by the sum of the first n natural numbers. If we consider each game as a dot and each player as a side of the triangle, then the total number of games (45) would correspond to the 9th triangular number because 1+2+3+4+5+6+7+8+9=45. However, since each side of the triangle represents a player, and there's one more player than sides in our model (because the last player has no one left to play against), we need to add one more to get the total number of players. Final answer: There were 10 players in the tournament. Solution 3: We could also use trial and error to solve this problem. Starting from a small number of players, we calculate how many games would be played in a round-robin tournament until we reach 45 games. For example, with 2 players, only 1 game is played; with 3 players, 3 games are played; with 4 players, 6 games are played, and so on. Continuing this process, we find that with 10 players, 45 games are played. Final answer: There were 10 players in the tournament. Ranking of solutions from best to worst: 1. Solution 1: This solution uses direct mathematical reasoning based on the properties of a round-robin tournament and solves an equation to find the answer. 2. Solution 2: This solution uses the concept of triangular numbers which requires additional steps and knowledge beyond the immediate problem context, but still arrives at the correct answer. 3. Solution 3: This solution relies on trial and error, which is less efficient and reliable than the other methods, but it eventually gets to the right answer. 
Best and final answer: There were 10 players in the tournament. ``` #### reWOO style execution planning The model now supports execution planning for complex instructions that would require making use of several tools. The output is just the plan, you must implement a mechanism to parse the output and actually call the functions! Example prompt: ``` Please construct a systematic plan to generate an optimal response to the user instruction, utilizing a set of provided tools. Each plan will correspond to an evidence value, which will be the output of one of the available functions given an input string that could be the user's question, one or more prior evidence values, or a combination of both. Here are the tools available to be called: DuckDuckGo[input]: Tool that enables the user to search for information using the DuckDuckGo search engine. This tool is helpful in locating a suitable list of sites that may contain the answer to the user's query. It does not directly answer the question but finds a list of sites that might have the answer. TextScraper[input]: Load one or more websites from the input string containing newline delimited links, where input is one or more links, and produces plain text output containing the content of the links. HyperlinkExtractor[input]: Extract links from plain text and produces a plain text, newline delimited response of links. KnowledgeModel[input]: Question answering language model, particularly useful in answering questions based on an input passage of text. The input must be a text question that references an :evidence[n]: variable, e.g. What color is the cat, given :evidence1:? The output should be in format: Plan: [first action to take based on input question] :evidence0: = [function to call with input parameter] Plan: [next action to take, based on result of :evidence0:] :evidence1: = [next function to call and input parameter, which may include reference to previous evidence, e.g. "Given context :evidence0"] ... 
Answer: [:evidence[n]: containing the final answer.] Question: Who won the Nobel Prize in Literature in 2020 and what is their most famous work? ``` Response: ``` Plan: Begin by conducting a web search to find out who won the Nobel Prize in Literature in 2020. :evidence0: = DuckDuckGo[Nobel Prize in Literature 2020 winner] Plan: Utilize the KnowledgeModel to interpret the search results and find the name of the laureate. :evidence1: = KnowledgeModel[Who won the Nobel Prize in Literature in 2020, given :evidence0:?] Plan: Conduct another web search to find the most famous work of the identified laureate. :evidence2: = DuckDuckGo[Most famous work of :evidence1:] Plan: Extract the relevant links from the DuckDuckGo search results for a more focused search. :evidence3: = HyperlinkExtractor[:evidence2:] Plan: Use the TextScraper tool to extract information from the relevant links. :evidence4: = TextScraper[:evidence3:] Plan: Finally, utilize the KnowledgeModel to identify and summarize the most famous work of the laureate from the extracted information. :evidence5: = KnowledgeModel[What is the most famous work of :evidence1:, given :evidence4:?] Answer: :evidence5: ``` For this to be useful, you'd have to parse the output plan text, and implement/call each of the functions. This is just pseudo-code, completely untested off the top of my head, and obviously would require full implementation + hardening: ```python import re import requests def inject_context(input_text, **context): for ref in set(re.findall(r"(:evidence[0-9]+:)", input_text, re.I)): input_text = input_text.replace(ref, context.get(ref, "")) return input_text def duckduckgo(input_text, **context): search_string = inject_context(input_text, **context) ... search via duck duck go using search_string ... 
return text content def link_extractor(input_text, **context): input_text = inject_context(input_text, **context) return "\n".join(list(set(re.findall(r"(https?://\S+)", input_text, re.I)))) def scrape(input_text, **context): input_text = inject_context(input_text, **context) text = [] for link in input_text.splitlines(): text.append(requests.get(link).text) return "\n".join(text) def infer(input_text, **context): prompt = inject_context(input_text, **context) ... call model with prompt, return output def parse_plan(plan): method_map = { "DuckDuckGo": duckduckgo, "HyperlinkExtractor": link_extractor, "KnowledgeModel": infer, "TextScraper": scrape, } context = {} for line in plan.strip().splitlines(): if line.startswith("Plan:"): print(line) continue parts = re.match(r"^(:evidence[0-9]+:)\s*=\s*([^\[]+)(\[.*\])\s*$", line, re.I) if not parts: if line.startswith("Answer: "): return context.get(line.split(" ")[-1].strip(), "Answer couldn't be generated...") raise RuntimeError("bad format: " + line) context[parts.group(1)] = method_map[parts.group(2)](parts.group(3)[1:-1], **context) ``` ### Contribute If you're interested in new functionality, particularly a new "instructor" type to generate a specific type of training data, take a look at the dataset generation tool repo: https://github.com/jondurbin/airoboros and either make a PR or open an issue with details. To help me with the OpenAI/compute costs: - https://bmc.link/jondurbin - ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11 - BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf ### Licence and usage restrictions The airoboros 2.2 models are built on top of llama-2/codellama. The llama-2 base model has a custom Meta license: - See the [meta-license/LICENSE.txt](meta-license/LICENSE.txt) file attached for the original license provided by Meta. - See also [meta-license/USE_POLICY.md](meta-license/USE_POLICY.md) and [meta-license/Responsible-Use-Guide.pdf](meta-license/Responsible-Use-Guide.pdf), also provided by Meta. 
The fine-tuning data was mostly generated by OpenAI API calls to gpt-4, via [airoboros](https://github.com/jondurbin/airoboros) The ToS for OpenAI API usage has a clause preventing the output from being used to train a model that __competes__ with OpenAI - what does *compete* actually mean here? - these small open source models will not produce output anywhere near the quality of gpt-4, or even gpt-3.5, so I can't imagine this could credibly be considered competing in the first place - if someone else uses the dataset to do the same, they wouldn't necessarily be violating the ToS because they didn't call the API, so I don't know how that works - the training data used in essentially all large language models includes a significant amount of copyrighted or otherwise non-permissive licensing in the first place - other work using the self-instruct method, e.g. the original here: https://github.com/yizhongw/self-instruct released the data and model as apache-2 I am purposely leaving this license ambiguous (other than the fact you must comply with the Meta original license for llama-2) because I am not a lawyer and refuse to attempt to interpret all of the terms accordingly. Your best bet is probably to avoid using this commercially due to the OpenAI API usage. Either way, by using this model, you agree to completely indemnify me.
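The closed-context format described in this card is mechanical enough to assemble programmatically. Below is a small, hypothetical helper — not part of airoboros itself — that builds the documented BEGININPUT/BEGINCONTEXT delimiters from metadata/text pairs; the function name and signature are illustrative assumptions:

```python
def build_closed_context_prompt(blocks, instruction):
    """Assemble the BEGININPUT/BEGINCONTEXT closed-context prompt format.

    blocks: list of (metadata_dict, text) pairs, one per input block.
    instruction: the instruction(s) to apply to all of the blocks.
    """
    parts = []
    for metadata, text in blocks:
        parts.append("BEGININPUT")
        parts.append("BEGINCONTEXT")
        # Metadata key/value pairs the model can cite as the source.
        for key, value in metadata.items():
            parts.append(f"{key}: {value}")
        parts.append("ENDCONTEXT")
        parts.append(text)
        parts.append("ENDINPUT")
    parts.append("BEGININSTRUCTION")
    parts.append(instruction)
    parts.append("ENDINSTRUCTION")
    return "\n".join(parts)

# Rebuild the blueberry example from the card.
prompt = build_closed_context_prompt(
    [({"date": "2021-01-01", "url": "https://web.site/123"},
      "In a shocking turn of events, blueberries are now green.")],
    "What color are blueberries? Source?",
)
```

The resulting string would then be substituted into the `{prompt}` slot of the chat format shown above, so the delimiters reach the model exactly as it was trained on them.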
jondurbin/airoboros-l2-70b-2.2.1
jondurbin
2023-09-21T18:38:40Z
1,437
9
transformers
[ "transformers", "pytorch", "llama", "text-generation", "dataset:jondurbin/airoboros-2.2.1", "license:llama2", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-20T16:14:07Z
--- license: llama2 datasets: - jondurbin/airoboros-2.2.1 --- ### Overview Another experimental model, using mostly synthetic data generated by [airoboros](https://github.com/jondurbin/airoboros) This is essentially a minor "fix" branch of [airoboros-l2-70b-2.2](https://hf.co/jondurbin/airoboros-l2-70b-2.2) with a few updates, primarily: - [re-generated writing responses](https://huggingface.co/datasets/jondurbin/airoboros-2.2.1#re-generated-writing-responses) - [longer contextual blocks](https://huggingface.co/datasets/jondurbin/airoboros-2.2.1#longer-contextual-blocks) - [removal of "rp" data](https://huggingface.co/datasets/jondurbin/airoboros-2.2.1#rp-category-removed) - [(less aggressive) de-censoring](https://huggingface.co/datasets/jondurbin/airoboros-2.2.1#de-censoring) - 4 epochs instead of 3 __*Q4_0 quantization WILL NOT WORK on this version, it produces garbage!*__ This is a fairly general purpose model, but focuses heavily on instruction following, rather than casual chat/roleplay. Huge thank you to the folks over at [a16z](https://a16z.com/) for sponsoring the costs associated with building models and associated tools! ### Prompt format The prompt format: ``` A chat. USER: {prompt} ASSISTANT: ``` The default system prompt ("A chat.") was used for most of the prompts, however it also included a wide sampling of responses with other prompts, particularly in "stylized\_response", "rp", "gtkm", etc. Here's another example: ``` A chat between Bob (aka USER) and Tom (aka ASSISTANT). Tom is an extremely intelligent 18th century bookkeeper, who speaks loquaciously. USER: {prompt} ASSISTANT: ``` And a chat scenario that wouldn't require USER/ASSISTANT (but should use stopping criteria to prevent the model from speaking on your behalf): ``` A chat between old friends: Timmy and Tommy. {description of characters} {setting for the chat} Timmy: *takes a big sip from his coffee* "Ah, sweet, delicious, magical coffee." 
Tommy: ``` __*I strongly suggest adding stopping criteria/early inference stopping on "USER:", and/or whatever names you specify in the system prompt.*__ ### Fine tuning info https://wandb.ai/jondurbin/airoboros-l2-70b-2.2.1/runs/f91zmwuz?workspace=user-jondurbin ### Helpful usage tips *The prompts shown here are just the text that would be included after USER: and before ASSISTANT: in the full prompt format above; the system prompt and USER:/ASSISTANT: have been omitted for readability.* #### Context obedient question answering By obedient, I mean the model was trained to ignore what it thinks it knows, and use the context to answer the question. The model was also tuned to limit the values to the provided context as much as possible to reduce hallucinations. The format for a closed-context prompt is as follows: ``` BEGININPUT BEGINCONTEXT [key0: value0] [key1: value1] ... other metadata ... ENDCONTEXT [insert your text blocks here] ENDINPUT [add as many other blocks, in the exact same format] BEGININSTRUCTION [insert your instruction(s). The model was tuned with single questions, paragraph format, lists, etc.] ENDINSTRUCTION ``` It's also helpful to add "Don't make up answers if you don't know." to your instruction block to make sure that if the context is completely unrelated, it doesn't make something up. *The __only__ prompts that need this closed context formatting are closed-context instructions. Normal questions/instructions do not!* I know it's a bit verbose and annoying, but after much trial and error, using these explicit delimiters helps the model understand where to find the responses and how to associate specific sources with them. 
- `BEGININPUT` - denotes a new input block - `BEGINCONTEXT` - denotes the block of context (metadata key/value pairs) to associate with the current input block - `ENDCONTEXT` - denotes the end of the metadata block for the current input - [text] - Insert whatever text you want for the input block, as many paragraphs as can fit in the context. - `ENDINPUT` - denotes the end of the current input block - [repeat as many input blocks in this format as you want] - `BEGININSTRUCTION` - denotes the start of the instruction(s) (one or more) to respond to for all of the input blocks above. - [instruction(s)] - `ENDINSTRUCTION` - denotes the end of the instruction set It sometimes works without `ENDINSTRUCTION`, but by explicitly including that in the prompt, the model better understands that all of the instructions in the block should be responded to. Here's a trivial, but important example to prove the point: ``` BEGININPUT BEGINCONTEXT date: 2021-01-01 url: https://web.site/123 ENDCONTEXT In a shocking turn of events, blueberries are now green, but will be sticking with the same name. ENDINPUT BEGININSTRUCTION What color are blueberries? Source? ENDINSTRUCTION ``` And the response: ``` Blueberries are now green. Source: date: 2021-01-01 url: https://web.site/123 ``` #### Summarization 500 samples have been included from [this dataset](https://huggingface.co/datasets/mattpscott/airoboros-summarization), using the same format as contextual question answering, for example: ``` BEGININPUT {text to summarize} ENDINPUT BEGININSTRUCTION Summarize the input in around 130 words. ENDINSTRUCTION ``` #### Getting longer responses You can use a few techniques to get longer responses. Detailed prompts, with explicit instruction for word count: ``` Please compose a narrative set in the heart of an ancient library, steeped in the scent of old parchment and ink. The protagonist should be a young scholar who is dedicated to studying the art of storytelling and its evolution throughout history. 
In her pursuit of knowledge, she stumbles upon a forgotten tome that seems to possess an unusual aura. This book has the ability to bring stories to life, literally manifesting characters and scenarios from within its pages into reality. The main character must navigate through various epochs of storytelling - from oral traditions of tribal societies, through medieval minstrels' tales, to modern-day digital narratives - as they come alive around her. Each era presents its unique challenges and lessons about the power and impact of stories on human civilization. One such character could be a sentient quill pen, who was once used by renowned authors of yesteryears and now holds their wisdom and experiences. It becomes her mentor, guiding her through this journey with witty remarks and insightful commentary. Ensure that your tale encapsulates the thrill of adventure, the beauty of learning, and the profound connection between humans and their stories. All characters involved should be non-human entities. Feel free to explore creative liberties but maintain the mentioned elements. Your response should be approximately 2300 words. ``` Or, a simpler example: ``` Please create a long, detailed story about a dragon in an old growth forest who, for some reason, begins speaking the words of the source code of linux. ``` #### Coding You can ask for fairly complex coding instructions with multiple criteria, e.g.: ``` Create a python application with the following requirements: - Asyncio FastAPI webserver - ping endpoint that returns the current date in JSON format - file upload endpoint, which calculates the file's sha256 checksum, and checks postgres to deduplicate ``` Or inline criteria: ``` Write a multi-threaded TCP server in C that accepts a "GET [key]" input and "SET [key] [value]" input, and uses a binary tree to get and store the input values. 
``` You can also optionally add a single space and "PLAINFORMAT" at the end of your prompt to avoid backticks, explanations, etc. and just print the code, e.g.: ``` Write a websocket application in node.js. PLAINFORMAT ``` #### Agent/function calling The dataset includes many examples of function/args generation based on input criteria. This is somewhat similar to the OpenAI function calling, but the output is either JSON or YAML. Example prompt: ``` As an AI assistant, please select the most suitable function and parameters from the list of available functions below, based on the user's input. Provide your response in JSON format. Input: I want to know how many times 'Python' is mentioned in my text file. Available functions: file_analytics: description: This tool performs various operations on a text file. params: action: The operation we want to perform on the data, such as "count_occurrences", "find_line", etc. filters: keyword: The word or phrase we want to search for. ``` Response: ```json { "function": "file_analytics", "params": { "action": "count_occurrences", "filters": { "keyword": "Python" } } } ``` #### Chain-of-thought You can ask for several possible responses to a given problem, with a ranking and final answer selection. Example prompt: ``` A group of people decide to play a round-robin tournament where every player plays every other player exactly once. If a total of 45 games were played, how many players participated in the tournament? Offer several potential answers, rank them according to mathematical logic, and pick the most feasible one. ``` Example response: ``` Solution 1: In a round-robin tournament, each player plays every other player exactly once. This means that the total number of games played is equal to the combination of all players taken two at a time (since a game involves two players). The formula for combinations is nC2 = n(n-1)/2, where n is the total number of players. 
We know that nC2 equals 45 (the total number of games), so we can set up the equation as follows: n(n-1)/2 = 45 Solving this equation gives us n=10. Final answer: There were 10 players in the tournament. Solution 2: Another way to approach this problem is by using the concept of triangular numbers. Triangular numbers are a sequence of numbers where each number can be represented as a triangle with dots. The nth triangular number is given by the sum of the first n natural numbers. If we consider each game as a dot and each player as a side of the triangle, then the total number of games (45) would correspond to the 9th triangular number because 1+2+3+4+5+6+7+8+9=45. However, since each side of the triangle represents a player, and there's one more player than sides in our model (because the last player has no one left to play against), we need to add one more to get the total number of players. Final answer: There were 10 players in the tournament. Solution 3: We could also use trial and error to solve this problem. Starting from a small number of players, we calculate how many games would be played in a round-robin tournament until we reach 45 games. For example, with 2 players, only 1 game is played; with 3 players, 3 games are played; with 4 players, 6 games are played, and so on. Continuing this process, we find that with 10 players, 45 games are played. Final answer: There were 10 players in the tournament. Ranking of solutions from best to worst: 1. Solution 1: This solution uses direct mathematical reasoning based on the properties of a round-robin tournament and solves an equation to find the answer. 2. Solution 2: This solution uses the concept of triangular numbers which requires additional steps and knowledge beyond the immediate problem context, but still arrives at the correct answer. 3. Solution 3: This solution relies on trial and error, which is less efficient and reliable than the other methods, but it eventually gets to the right answer. 
Best and final answer: There were 10 players in the tournament. ``` #### reWOO style execution planning The model now supports execution planning for complex instructions that would require making use of several tools. The output is just the plan, you must implement a mechanism to parse the output and actually call the functions! Example prompt: ``` Please construct a systematic plan to generate an optimal response to the user instruction, utilizing a set of provided tools. Each plan will correspond to an evidence value, which will be the output of one of the available functions given an input string that could be the user's question, one or more prior evidence values, or a combination of both. Here are the tools available to be called: DuckDuckGo[input]: Tool that enables the user to search for information using the DuckDuckGo search engine. This tool is helpful in locating a suitable list of sites that may contain the answer to the user's query. It does not directly answer the question but finds a list of sites that might have the answer. TextScraper[input]: Load one or more websites from the input string containing newline delimited links, where input is one or more links, and produces plain text output containing the content of the links. HyperlinkExtractor[input]: Extract links from plain text and produces a plain text, newline delimited response of links. KnowledgeModel[input]: Question answering language model, particularly useful in answering questions based on an input passage of text. The input must be a text question that references an :evidence[n]: variable, e.g. What color is the cat, given :evidence1:? The output should be in format: Plan: [first action to take based on input question] :evidence0: = [function to call with input parameter] Plan: [next action to take, based on result of :evidence0:] :evidence1: = [next function to call and input parameter, which may include reference to previous evidence, e.g. "Given context :evidence0"] ... 
Answer: [:evidence[n]: containing the final answer.] Question: Who won the Nobel Prize in Literature in 2020 and what is their most famous work? ``` Response: ``` Plan: Begin by conducting a web search to find out who won the Nobel Prize in Literature in 2020. :evidence0: = DuckDuckGo[Nobel Prize in Literature 2020 winner] Plan: Utilize the KnowledgeModel to interpret the search results and find the name of the laureate. :evidence1: = KnowledgeModel[Who won the Nobel Prize in Literature in 2020, given :evidence0:?] Plan: Conduct another web search to find the most famous work of the identified laureate. :evidence2: = DuckDuckGo[Most famous work of :evidence1:] Plan: Extract the relevant links from the DuckDuckGo search results for a more focused search. :evidence3: = HyperlinkExtractor[:evidence2:] Plan: Use the TextScraper tool to extract information from the relevant links. :evidence4: = TextScraper[:evidence3:] Plan: Finally, utilize the KnowledgeModel to identify and summarize the most famous work of the laureate from the extracted information. :evidence5: = KnowledgeModel[What is the most famous work of :evidence1:, given :evidence4:?] Answer: :evidence5: ``` For this to be useful, you'd have to parse the output plan text, and implement/call each of the functions. This is just pseudo-code, completely untested off the top of my head, and obviously would require full implementation + hardening: ```python import re import requests def inject_context(input_text, **context): for ref in set(re.findall(r"(:evidence[0-9]+:)", input_text, re.I)): input_text = input_text.replace(ref, context.get(ref, "")) return input_text def duckduckgo(input_text, **context): search_string = inject_context(input_text, **context) ... search via duck duck go using search_string ... 
return text content def link_extractor(input_text, **context): input_text = inject_context(input_text, **context) return "\n".join(list(set(re.findall(r"(https?://\S+)", input_text, re.I)))) def scrape(input_text, **context): input_text = inject_context(input_text, **context) text = [] for link in input_text.splitlines(): text.append(requests.get(link).text) return "\n".join(text) def infer(input_text, **context): prompt = inject_context(input_text, **context) ... call model with prompt, return output def parse_plan(plan): method_map = { "DuckDuckGo": duckduckgo, "HyperlinkExtractor": link_extractor, "KnowledgeModel": infer, "TextScraper": scrape, } context = {} for line in plan.strip().splitlines(): if line.startswith("Plan:"): print(line) continue parts = re.match(r"^(:evidence[0-9]+:)\s*=\s*([^\[]+)(\[.*\])\s*$", line, re.I) if not parts: if line.startswith("Answer: "): return context.get(line.split(" ")[-1].strip(), "Answer couldn't be generated...") raise RuntimeError("bad format: " + line) context[parts.group(1)] = method_map[parts.group(2)](parts.group(3)[1:-1], **context) ``` ### Contribute If you're interested in new functionality, particularly a new "instructor" type to generate a specific type of training data, take a look at the dataset generation tool repo: https://github.com/jondurbin/airoboros and either make a PR or open an issue with details. To help me with the OpenAI/compute costs: - https://bmc.link/jondurbin - ETH 0xce914eAFC2fe52FdceE59565Dd92c06f776fcb11 - BTC bc1qdwuth4vlg8x37ggntlxu5cjfwgmdy5zaa7pswf ### Licence and usage restrictions The airoboros 2.2 models are built on top of llama-2/codellama. The llama-2 base model has a custom Meta license: - See the [meta-license/LICENSE.txt](meta-license/LICENSE.txt) file attached for the original license provided by Meta. - See also [meta-license/USE_POLICY.md](meta-license/USE_POLICY.md) and [meta-license/Responsible-Use-Guide.pdf](meta-license/Responsible-Use-Guide.pdf), also provided by Meta. 
The fine-tuning data was mostly generated by OpenAI API calls to gpt-4, via [airoboros](https://github.com/jondurbin/airoboros). The ToS for OpenAI API usage has a clause preventing the output from being used to train a model that __competes__ with OpenAI - what does *compete* actually mean here? - these small open source models will not produce output anywhere near the quality of gpt-4, or even gpt-3.5, so I can't imagine this could credibly be considered competing in the first place - if someone else uses the dataset to do the same, they wouldn't necessarily be violating the ToS because they didn't call the API, so I don't know how that works - the training data used in essentially all large language models includes a significant amount of copyrighted or otherwise non-permissive licensing in the first place - other work using the self-instruct method, e.g. the original here: https://github.com/yizhongw/self-instruct released the data and model as apache-2 I am purposely leaving this license ambiguous (other than the fact you must comply with the Meta original license for llama-2) because I am not a lawyer and refuse to attempt to interpret all of the terms accordingly. Your best bet is probably to avoid using this commercially due to the OpenAI API usage. Either way, by using this model, you agree to completely indemnify me.
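Before wiring up real tools, the evidence-substitution step used throughout the parse_plan sketch can be sanity-checked in isolation. Below is a minimal, self-contained example; the function is repeated here so the snippet runs standalone, and the evidence values are made up purely for illustration:

```python
import re

def inject_context(input_text, **context):
    """Replace each :evidenceN: placeholder with its resolved value."""
    for ref in set(re.findall(r"(:evidence[0-9]+:)", input_text, re.I)):
        input_text = input_text.replace(ref, context.get(ref, ""))
    return input_text

# Pretend two earlier plan steps already produced these results.
context = {
    ":evidence1:": "Louise Gluck",
    ":evidence4:": "She is best known for 'The Wild Iris'.",
}
prompt = inject_context(
    "What is the most famous work of :evidence1:, given :evidence4:?", **context
)
print(prompt)
```

Unresolved references are replaced with an empty string, which keeps a malformed plan from crashing the pipeline at this step.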
Tostino/Inkbot-13b-4k
Tostino
2023-09-21T18:36:48Z
9
10
transformers
[ "transformers", "pytorch", "llama", "text-generation", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-21T05:56:48Z
# Model Card for Inkbot ## Model Details Inkbot is a conversational AI model designed to interpret and respond to structured prompts with or without contextual information. Built on the latest advancements in natural language processing (NLP) and understanding (NLU), Inkbot provides users with accurate and meaningful interactions, addressing a wide range of queries and topics. Its unique feature lies in the structured prompt system, allowing users to engage in dynamic dialogues that can evolve based on context, metadata, and user input. ### Performance - The model excels in RAG type queries, answering from context, and overriding memory when available. - It can handle very large contexts, but may sometimes enter a repeating text loop, especially during complex tasks. - The model is intended to be more functional and less chatty, avoiding the waste of tokens on superfluous language. ## How to Use Inkbot uses a structured prompt template system. ### Prompt Template Structure #### With Context: ``` <#meta#> - Date: [DATE] - Task: [TASK TYPE] <#system#> [SYSTEM PROMPT/DESCRIPTION OF THE INTERACTION] <#chat#> <#user#> [USER QUESTION] <#bot#> [BOT RESPONSE] <#user#> [USER FOLLOW-UP QUESTION] <#user_context#> [ADDITIONAL CONTEXT FOR THE BOT] <#bot#> [BOT RESPONSE BASED ON CONTEXT] ``` #### Without Context: ``` <#meta#> - Date: [DATE] - Task: [TASK TYPE] <#system#> [SYSTEM PROMPT/DESCRIPTION OF THE INTERACTION] <#chat#> <#user#> [USER QUESTION] <#bot#> [BOT RESPONSE] <#user#> [USER FOLLOW-UP QUESTION] <#bot#> [BOT RESPONSE] ``` ### Examples #### With Context: ``` <#meta#> - Date: 2023-07-20 - Task: general <#system#> A dialogue focused on determining dates based on context clues. <#chat#> <#user#> What is today's date? <#bot#> The date is: 2023-07-20 <#user#> When did the event in the provided context take place? <#user_context#> It's the day after new years eve, and I'm starting 2018 with a headache. 
<#bot#> The event in the context took place on: 2018-01-01 ``` #### Without Context: ``` <#meta#> - Date: 2023-07-20 - Task: general <#system#> A simple dialogue about daily queries. <#chat#> <#user#> Will it rain today? <#bot#> I am not sure, without knowing your location I am unable to determine the weather. <#user#> Can you compose a haiku for me? <#bot#> Whispers of the breeze, Nature's melodies unfold, Dawn dispels the night. ``` ## Task Options Inkbot has been trained for a variety of tasks. Below are some of the key task options you can utilize, but feel free to try others to test the effect: 1. **general**: This is the default task and is designed for a broad range of general questions and interactions. - Usage: Suitable for most day-to-day interactions and queries. 2. **knowledge_graph**: This task involves extracting, understanding, and representing information in a structured way. - Usage: When you want to extract relationships between entities or desire structured representations of data. 3. **question_answer**: Explicitly trained for answering questions in a straightforward manner. - Usage: Best used when you have direct questions and expect concise answers. 4. **reasoning**: Allows Inkbot to showcase its logical and deductive reasoning capabilities. - Usage: Ideal for puzzles, riddles, or scenarios where logical analysis is required. 5. **translation**: Use this for language translation tasks. - Usage: Provide a sentence or paragraph in one language, and specify the desired target language for translation. 6. **summarization**: Trained for condensing large texts into shorter, coherent summaries. - Usage: When you have a lengthy text or article that you want to be summarized to its key points. 7. **creative_writing**: Engage Inkbot in composing stories, poetry, and other creative content. - Usage: For tasks that require imaginative and original content generation. ## Limitations - Adhere to the prompt structure for best results. 
- When providing contextual details, clarity is essential for Inkbot to derive accurate and meaningful responses. - Memory overriding via the user_context property generally only works for the next prompt or two, after which the model reverts to its original behavior. - On complex tasks, like creating a coherent story based on a set of facts from context, there's a potential for a repeating text loop as context fills. - Sometimes the model doesn't know when to end a knowledge graph, which can result in adding nodes and edges until it runs out of context. ## Additional Notes - The 'date', 'task', and 'system' are crucial metadata components that need to be provided outside the core dialogue. - Use the 'user_context' when you want to offer supplementary context that guides Inkbot's response. You can interleave it in the chat log as necessary. - The specific tag format, such as `<#word#>`, is used because many APIs filter `<|word|>`-style tags, and this format makes interactions easier. --- license: llama2 ---
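Because the prompt format above is plain text, it can be assembled programmatically. The helper below is a hypothetical sketch (not part of the Inkbot release) showing one way to build a prompt matching the documented template:

```python
def build_inkbot_prompt(date, task, system, turns):
    """Assemble an Inkbot prompt from (role, text) turns.

    Roles are 'user', 'bot', or 'user_context', matching the template tags.
    """
    lines = ["<#meta#>", f"- Date: {date}", f"- Task: {task}", "<#system#>", system, "<#chat#>"]
    for role, text in turns:
        lines.append(f"<#{role}#>")
        lines.append(text)
    return "\n".join(lines)

prompt = build_inkbot_prompt(
    "2023-07-20",
    "general",
    "A simple dialogue about daily queries.",
    [("user", "Will it rain today?")],
)
print(prompt)
```

To solicit a completion, you would end the string with a final `<#bot#>` tag and let the model generate the response.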
TemporalGames/opt-1.3b-lambada_rmt_ms7_bptt7_sl2028_mt10_lTrue_LORA_cur3
TemporalGames
2023-09-21T18:23:11Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-21T18:23:09Z
--- library_name: peft --- ## Training procedure ### Framework versions - PEFT 0.6.0.dev0
Art101/ppo-Huggy
Art101
2023-09-21T18:13:58Z
0
0
ml-agents
[ "ml-agents", "tensorboard", "onnx", "Huggy", "deep-reinforcement-learning", "reinforcement-learning", "ML-Agents-Huggy", "region:us" ]
reinforcement-learning
2023-09-21T18:13:47Z
--- library_name: ml-agents tags: - Huggy - deep-reinforcement-learning - reinforcement-learning - ML-Agents-Huggy --- # **ppo** Agent playing **Huggy** This is a trained model of a **ppo** agent playing **Huggy** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents). ## Usage (with ML-Agents) The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/ We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub: - A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction - A *longer tutorial* to understand how ML-Agents works: https://huggingface.co/learn/deep-rl-course/unit5/introduction ### Resume the training ```bash mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume ``` ### Watch your Agent play You can watch your agent **playing directly in your browser** 1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity 2. Step 1: Find your model_id: Art101/ppo-Huggy 3. Step 2: Select your *.nn /*.onnx file 4. Click on Watch the agent play 👀
felixquinihildebet/taxi-v3
felixquinihildebet
2023-09-21T18:05:29Z
0
0
null
[ "Taxi-v3", "q-learning", "reinforcement-learning", "custom-implementation", "model-index", "region:us" ]
reinforcement-learning
2023-09-21T17:55:22Z
--- tags: - Taxi-v3 - q-learning - reinforcement-learning - custom-implementation model-index: - name: taxi-v3 results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: Taxi-v3 type: Taxi-v3 metrics: - type: mean_reward value: 7.56 +/- 2.71 name: mean_reward verified: false --- # **Q-Learning** Agent playing **Taxi-v3** This is a trained model of a **Q-Learning** agent playing **Taxi-v3** . ## Usage ```python import gym # load_from_hub is the helper provided in the Hugging Face Deep RL Course notebooks model = load_from_hub(repo_id="felixquinihildebet/taxi-v3", filename="q-learning.pkl") # Don't forget to check if you need to add additional attributes (is_slippery=False etc) env = gym.make(model["env_id"]) ```
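Once the pickled dictionary is loaded, acting greedily with the stored Q-table is just a per-state argmax. The sketch below uses a tiny made-up table for illustration (a real Taxi-v3 Q-table has 500 states and 6 actions, and the exact keys of the pickled dictionary may differ for this upload):

```python
# Hypothetical illustration: a tiny Q-table with 3 states and 2 actions.
qtable = [
    [0.1, 0.9],   # state 0 -> action 1 has the highest Q-value
    [0.5, 0.2],   # state 1 -> action 0 has the highest Q-value
    [0.0, 0.0],   # ties resolve to the first action
]

def greedy_action(qtable, state):
    """Pick the action with the highest Q-value for the given state."""
    row = qtable[state]
    return max(range(len(row)), key=lambda a: row[a])

print([greedy_action(qtable, s) for s in range(3)])  # [1, 0, 0]
```

In an evaluation loop you would call `greedy_action(qtable, state)` on each observation returned by `env.step(...)` instead of sampling randomly.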
Ori/lama-2-13b-peft-strategyqa-with-retrieval-mix-v2-seed-3
Ori
2023-09-21T17:53:56Z
2
0
peft
[ "peft", "safetensors", "region:us" ]
null
2023-09-21T17:52:51Z
--- library_name: peft --- ## Training procedure ### Framework versions - PEFT 0.5.0.dev0
cloudwalkerw/wavlm-base_3
cloudwalkerw
2023-09-21T17:48:37Z
16
0
transformers
[ "transformers", "pytorch", "wavlm", "audio-classification", "generated_from_trainer", "base_model:microsoft/wavlm-base", "base_model:finetune:microsoft/wavlm-base", "endpoints_compatible", "region:us" ]
audio-classification
2023-09-18T14:31:32Z
--- base_model: microsoft/wavlm-base tags: - audio-classification - generated_from_trainer metrics: - accuracy model-index: - name: wavlm-base_3 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # wavlm-base_3 This model is a fine-tuned version of [microsoft/wavlm-base](https://huggingface.co/microsoft/wavlm-base) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.6534 - Accuracy: 0.8974 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0003 - train_batch_size: 16 - eval_batch_size: 2 - seed: 0 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.2236 | 1.24 | 100 | 12.8495 | 0.4467 | | 0.0514 | 2.48 | 200 | 16.3078 | 0.2677 | | 0.0 | 3.72 | 300 | 17.5651 | 0.2597 | | 0.3252 | 4.95 | 400 | 15.0382 | 0.1912 | | 1.0577 | 6.19 | 500 | 0.6534 | 0.8974 | | 0.6973 | 7.43 | 600 | 0.7352 | 0.1026 | | 0.6939 | 8.67 | 700 | 0.6210 | 0.8974 | | 0.6944 | 9.91 | 800 | 0.7129 | 0.1026 | ### Framework versions - Transformers 4.34.0.dev0 - Pytorch 2.0.0.post302 - Datasets 2.14.5 - Tokenizers 0.13.3
darxkies/bart-large-cnn-samsum-ChatGPT_v3
darxkies
2023-09-21T17:47:31Z
10
1
transformers
[ "transformers", "rust", "bart", "text2text-generation", "summarization", "autotrain_compatible", "endpoints_compatible", "region:us" ]
summarization
2023-09-21T15:44:26Z
--- pipeline_tag: summarization --- Original model: [Qiliang/bart-large-cnn-samsum-ChatGPT_v3](https://huggingface.co/Qiliang/bart-large-cnn-samsum-ChatGPT_v3) Added files for [rust-bert](https://github.com/guillaume-be/rust-bert)
darxkies/bge-base-en-v1.5
darxkies
2023-09-21T17:47:07Z
2
0
transformers
[ "transformers", "rust", "bert", "feature-extraction", "sentence-similarity", "endpoints_compatible", "region:us" ]
sentence-similarity
2023-09-20T21:36:50Z
--- pipeline_tag: sentence-similarity --- Original model: [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) Added files for [rust-bert](https://github.com/guillaume-be/rust-bert)
megagonlabs/transformers-ud-japanese-electra-base-ginza-520
megagonlabs
2023-09-21T17:45:45Z
125
3
transformers
[ "transformers", "pytorch", "electra", "feature-extraction", "PyTorch", "Transformers", "spaCy", "ELECTRA", "GiNZA", "mC4", "UD_Japanese-BCCWJ", "GSK2014-A", "ja", "MIT", "arxiv:1910.10683", "license:mit", "endpoints_compatible", "region:us" ]
feature-extraction
2023-09-21T14:14:04Z
--- language: - ja thumbnail: "https://raw.githubusercontent.com/megagonlabs/ginza/static/docs/images/GiNZA_logo_4c_s.png" tags: - PyTorch - Transformers - spaCy - ELECTRA - GiNZA - mC4 - UD_Japanese-BCCWJ - GSK2014-A - ja - MIT license: "mit" datasets: - mC4 - UD_Japanese_BCCWJ-r2.8 - GSK2014-A(2019) metrics: - UAS - LAS - UPOS --- # transformers-ud-japanese-electra-ginza-520 (sudachitra-wordpiece, mC4 Japanese) This is an [ELECTRA](https://github.com/google-research/electra) model pretrained on approximately 200M Japanese sentences extracted from the [mC4](https://huggingface.co/datasets/mc4) corpus and fine-tuned by [spaCy v3](https://spacy.io/usage/v3) on [UD\_Japanese\_BCCWJ r2.8](https://universaldependencies.org/treebanks/ja_bccwj/index.html). The base pretrained model is [megagonlabs/transformers-ud-japanese-electra-base-discriminator](https://huggingface.co/megagonlabs/transformers-ud-japanese-electra-base-discriminator). The entire spaCy v3 model is distributed as a python package named [`ja_ginza_electra`](https://pypi.org/project/ja-ginza-electra/) from PyPI along with [`GiNZA v5`](https://github.com/megagonlabs/ginza) which provides some custom pipeline components to recognize the Japanese bunsetu-phrase structures. Try running it as below: ```console $ pip install ginza ja_ginza_electra $ ginza ``` ## Licenses The models are distributed under the terms of the [MIT License](https://opensource.org/licenses/mit-license.php). ## Acknowledgments This model is permitted to be published under the `MIT License` under a joint research agreement between NINJAL (National Institute for Japanese Language and Linguistics) and Megagon Labs Tokyo. ## Citations - [mC4](https://huggingface.co/datasets/mc4) Contains information from `mC4` which is made available under the [ODC Attribution License](https://opendatacommons.org/licenses/by/1-0/).
``` @article{2019t5, author = {Colin Raffel and Noam Shazeer and Adam Roberts and Katherine Lee and Sharan Narang and Michael Matena and Yanqi Zhou and Wei Li and Peter J. Liu}, title = {Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer}, journal = {arXiv e-prints}, year = {2019}, archivePrefix = {arXiv}, eprint = {1910.10683}, } ``` - [UD\_Japanese\_BCCWJ r2.8](https://universaldependencies.org/treebanks/ja_bccwj/index.html) ``` Asahara, M., Kanayama, H., Tanaka, T., Miyao, Y., Uematsu, S., Mori, S., Matsumoto, Y., Omura, M., & Murawaki, Y. (2018). Universal Dependencies Version 2 for Japanese. In LREC-2018. ``` - [GSK2014-A(2019)](https://www.gsk.or.jp/catalog/gsk2014-a/)
poteminr/llama2-nerel-bio
poteminr
2023-09-21T17:33:53Z
4
0
peft
[ "peft", "region:us" ]
null
2023-08-13T12:12:36Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: True - load_in_4bit: False - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: fp4 - bnb_4bit_use_double_quant: False - bnb_4bit_compute_dtype: float32 ### Framework versions - PEFT 0.6.0.dev0
CyberHarem/senzaki_ema_idolmastercinderellagirls
CyberHarem
2023-09-21T17:22:10Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/senzaki_ema_idolmastercinderellagirls", "license:mit", "region:us" ]
text-to-image
2023-09-21T17:13:47Z
--- license: mit datasets: - CyberHarem/senzaki_ema_idolmastercinderellagirls pipeline_tag: text-to-image tags: - art --- # Lora of senzaki_ema_idolmastercinderellagirls This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion), and the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs). The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11). After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora. For example, if you want to use the model from step 3400, you need to download `3400/senzaki_ema_idolmastercinderellagirls.pt` as the embedding and `3400/senzaki_ema_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters. **The best step we recommend is 3400**, with a score of 0.999. The trigger words are: 1. `senzaki_ema_idolmastercinderellagirls` 2. `short_hair, jewelry, blonde_hair, very_short_hair, earrings, smile, red_eyes, open_mouth` Use of this model is not recommended for the following groups, and we express our regret: 1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail. 2. Individuals who are facing application scenarios with high demands for accuracy in recreating character outfits. 3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm. 4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who finds the generated image content offensive to their values. These are available steps: | Steps | Score | Download | pattern_1 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata | |:---------|:----------|:---------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------| | 5100 | 0.962 | [Download](5100/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) | | 4760 | 0.993 | [Download](4760/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) | | 4420 | 0.998 | [Download](4420/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to 
see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) | | 4080 | 0.996 | [Download](4080/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) | | 3740 | 0.962 | [Download](3740/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-3740](3740/previews/pattern_1.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) | | **3400** | **0.999** | [**Download**](3400/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) | | 3060 | 0.978 | [Download](3060/senzaki_ema_idolmastercinderellagirls.zip) | 
![pattern_1-3060](3060/previews/pattern_1.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) | | 2720 | 0.994 | [Download](2720/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) | | 2380 | 0.996 | [Download](2380/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) | | 2040 | 0.994 | [Download](2040/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) 
| | 1700 | 0.992 | [Download](1700/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) | | 1360 | 0.997 | [Download](1360/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![bikini-1360](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) | | 1020 | 0.983 | [Download](1020/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) | | 680 | 0.992 | [Download](680/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-680](680/previews/pattern_1.png) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | 
![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) | | 340 | 0.847 | [Download](340/senzaki_ema_idolmastercinderellagirls.zip) | ![pattern_1-340](340/previews/pattern_1.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
annahaz/xlm-roberta-base-misogyny-sexism-indomain-mix-bal
annahaz
2023-09-21T17:12:38Z
126
0
transformers
[ "transformers", "pytorch", "xlm-roberta", "text-classification", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2022-08-16T18:33:59Z
This is a multilingual misogyny and sexism detection model. This model was released with the following paper (https://rdcu.be/dmIpq): ``` @InProceedings{10.1007/978-3-031-43129-6_9, author="Chang, Rong-Ching and May, Jonathan and Lerman, Kristina", editor="Thomson, Robert and Al-khateeb, Samer and Burger, Annetta and Park, Patrick and A. Pyke, Aryn", title="Feedback Loops and Complex Dynamics of Harmful Speech in Online Discussions", booktitle="Social, Cultural, and Behavioral Modeling", year="2023", publisher="Springer Nature Switzerland", address="Cham", pages="85--94", abstract="Harmful and toxic speech contribute to an unwelcoming online environment that suppresses participation and conversation. Efforts have focused on detecting and mitigating harmful speech; however, the mechanisms by which toxicity degrades online discussions are not well understood. This paper makes two contributions. First, to comprehensively model harmful comments, we introduce a multilingual misogyny and sexist speech detection model (https://huggingface.co/annahaz/xlm-roberta-base-misogyny-sexism-indomain-mix-bal). Second, we model the complex dynamics of online discussions as feedback loops in which harmful comments lead to negative emotions which prompt even more harmful comments. To quantify the feedback loops, we use a combination of mutual Granger causality and regression to analyze discussions on two political forums on Reddit: the moderated political forum r/Politics and the moderated neutral political forum r/NeutralPolitics. Our results suggest that harmful comments and negative emotions create self-reinforcing feedback loops in forums. Contrarily, moderation with neutral discussion appears to tip interactions into self-extinguishing feedback loops that reduce harmful speech and negative emotions. 
Our study sheds more light on the complex dynamics of harmful speech and the role of moderation and neutral discussion in mitigating these dynamics.", isbn="978-3-031-43129-6" } ``` We combined several multilingual ground truth datasets for misogyny and sexism (M/S) versus non-misogyny and non-sexism (non-M/S) [3, 5, 8, 9, 11, 13, 20]. Specifically, the dataset expressing misogynistic or sexist speech (M/S) and the same number of texts expressing non-M/S speech in each language included 8,582 English-language texts, 872 in French, 561 in Hindi, 2,190 in Italian, and 612 in Bengali. The test data was a balanced set of 100 texts sampled randomly from both M/S and non-M/S groups in each language, for a total of 500 examples of M/S speech and 500 examples of non-M/S speech. The dataset references are: 3. Bhattacharya, S., et al.: Developing a multilingual annotated corpus of misogyny and aggression, pp. 158–168. ELRA, Marseille, France, May 2020. https://aclanthology.org/2020.trac-1.25 5. Chiril, P., Moriceau, V., Benamara, F., Mari, A., Origgi, G., Coulomb-Gully, M.: An annotated corpus for sexism detection in French tweets. In: Proceedings of LREC, pp. 1397–1403 (2020) 8. Fersini, E., et al.: SemEval-2022 task 5: multimedia automatic misogyny identification. In: Proceedings of SemEval, pp. 533–549 (2022) 9. Fersini, E., Nozza, D., Rosso, P.: Overview of the Evalita 2018 task on automatic misogyny identification (AMI). EVALITA Eval. NLP Speech Tools Italian 12, 59 (2018) 11. Guest, E., Vidgen, B., Mittos, A., Sastry, N., Tyson, G., Margetts, H.: An expert annotated dataset for the detection of online misogyny. In: Proceedings of EACL, pp. 1336–1350 (2021) 13. Jha, A., Mamidi, R.: When does a compliment become sexist? Analysis and classification of ambivalent sexism using Twitter data. In: Proceedings of NLP+CSS, pp. 7–16 (2017) 20. Waseem, Z., Hovy, D.: Hateful symbols or hateful people? Predictive features for hate speech detection on Twitter.
In: Proceedings of NAACL SRW, pp. 88–93 (2016) Please see the paper for more detail. --- license: mit tags: - generated_from_trainer metrics: - accuracy - f1 - precision - recall model-index: - name: xlm-roberta-base-misogyny-sexism-indomain-mix-bal results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # xlm-roberta-base-misogyny-sexism-indomain-mix-bal This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.8259 - Accuracy: 0.826 - F1: 0.8333 - Precision: 0.7996 - Recall: 0.87 - Mae: 0.174 - Tn: 391 - Fp: 109 - Fn: 65 - Tp: 435 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall | Mae | Tn | Fp | Fn | Tp | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|:-----:|:---:|:---:|:--:|:---:| | 0.2643 | 1.0 | 1603 | 0.6511 | 0.82 | 0.8269 | 0.7963 | 0.86 | 0.18 | 390 | 110 | 70 | 430 | | 0.2004 | 2.0 | 3206 | 0.8259 | 0.826 | 0.8333 | 0.7996 | 0.87 | 0.174 | 391 | 109 | 65 | 435 | ### Framework versions - Transformers 4.20.1 - Pytorch 1.12.0+cu102 - Datasets 2.3.2 - Tokenizers 0.12.1 # Multilingual_Misogyny_Detection
amirabdullah19852020/pythia-410m_utility_reward
amirabdullah19852020
2023-09-21T17:03:36Z
59
0
transformers
[ "transformers", "pytorch", "gpt_neox", "text-generation", "trl", "reinforcement-learning", "license:apache-2.0", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
reinforcement-learning
2023-09-21T15:26:30Z
--- license: apache-2.0 tags: - trl - transformers - reinforcement-learning --- # TRL Model This is a [TRL language model](https://github.com/huggingface/trl) that has been fine-tuned with reinforcement learning to guide the model outputs according to a value function or human feedback. The model can be used for text generation. ## Usage To use this model for inference, first install the TRL library: ```bash python -m pip install trl ``` You can then generate text as follows: ```python from transformers import pipeline generator = pipeline("text-generation", model="amirabdullah19852020/pythia-410m_utility_reward") outputs = generator("Hello, my llama is cute") ``` If you want to use the model for training or to obtain the outputs from the value head, load the model as follows: ```python from transformers import AutoTokenizer from trl import AutoModelForCausalLMWithValueHead tokenizer = AutoTokenizer.from_pretrained("amirabdullah19852020/pythia-410m_utility_reward") model = AutoModelForCausalLMWithValueHead.from_pretrained("amirabdullah19852020/pythia-410m_utility_reward") inputs = tokenizer("Hello, my llama is cute", return_tensors="pt") outputs = model(**inputs, labels=inputs["input_ids"]) ```
Akash092003/llama-2-finetuned-ludwig-wandb
Akash092003
2023-09-21T16:58:26Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-21T16:58:24Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: float16 ### Framework versions - PEFT 0.5.0
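The 4-bit NF4 settings listed in the training procedure correspond, roughly, to a `transformers` `BitsAndBytesConfig` like the one below. This is a sketch of the config fragment only, assuming `transformers` and `bitsandbytes` are installed; the card does not state the base model, so no model id is shown:

```python
import torch
from transformers import BitsAndBytesConfig

# Mirrors the quantization values listed above: 4-bit NF4 with
# double quantization and float16 compute.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.float16,
)
```

At inference time, a config like this would typically be passed as `quantization_config` to `AutoModelForCausalLM.from_pretrained(...)` for the base model, with the adapter then attached via `peft.PeftModel.from_pretrained`.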
LarryAIDraw/yamato_kantaicollection
LarryAIDraw
2023-09-21T16:57:37Z
0
0
null
[ "license:creativeml-openrail-m", "region:us" ]
null
2023-09-20T16:48:28Z
--- license: creativeml-openrail-m --- https://civitai.com/models/131265/yamato-kantai-collection
ShivamMangale/XLM-Roberta-base-allhiweakdap_5th_iteration_d5_d4_d3_d2_d1_d0
ShivamMangale
2023-09-21T16:55:00Z
106
0
transformers
[ "transformers", "pytorch", "xlm-roberta", "question-answering", "generated_from_trainer", "dataset:squad", "base_model:FacebookAI/xlm-roberta-base", "base_model:finetune:FacebookAI/xlm-roberta-base", "license:mit", "endpoints_compatible", "region:us" ]
question-answering
2023-09-21T15:33:26Z
--- license: mit base_model: xlm-roberta-base tags: - generated_from_trainer datasets: - squad model-index: - name: XLM-Roberta-base-allhiweakdap_5th_iteration_d5_d4_d3_d2_d1_d0 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # XLM-Roberta-base-allhiweakdap_5th_iteration_d5_d4_d3_d2_d1_d0 This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on the squad dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 8 ### Training results ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu117 - Datasets 2.14.5 - Tokenizers 0.13.3
jtlowell/cozy_fantasy_xl
jtlowell
2023-09-21T16:43:51Z
3
1
diffusers
[ "diffusers", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "lora", "dataset:jtlowell/cozy_interiors_2", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0", "region:us" ]
text-to-image
2023-09-21T15:51:17Z
--- base_model: stabilityai/stable-diffusion-xl-base-1.0 instance_prompt: cozy_int tags: - stable-diffusion-xl - stable-diffusion-xl-diffusers - text-to-image - diffusers - lora inference: true datasets: - jtlowell/cozy_interiors_2 --- # LoRA DreamBooth - jtlowell/cozy_fantasy_xl These are LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0. The weights were trained on the concept prompt: `cozy_int` Use this keyword to trigger your custom model in your prompts. LoRA for the text encoder was enabled: False. Special VAE used for training: madebyollin/sdxl-vae-fp16-fix. ## Usage Make sure to upgrade diffusers to >= 0.19.0: ``` pip install diffusers --upgrade ``` In addition, make sure to install `transformers`, `safetensors`, and `accelerate`, as well as the invisible watermark: ``` pip install invisible_watermark transformers accelerate safetensors ``` To just use the base model, you can run: ```python import torch from diffusers import DiffusionPipeline, AutoencoderKL vae = AutoencoderKL.from_pretrained('madebyollin/sdxl-vae-fp16-fix', torch_dtype=torch.float16) pipe = DiffusionPipeline.from_pretrained( "stabilityai/stable-diffusion-xl-base-1.0", vae=vae, torch_dtype=torch.float16, variant="fp16", use_safetensors=True ) # This is where you load your trained weights pipe.load_lora_weights('jtlowell/cozy_fantasy_xl') pipe.to("cuda") prompt = "A majestic cozy_int jumping from a big stone at night" image = pipe(prompt=prompt, num_inference_steps=50).images[0] ```