Dataset columns (types and observed ranges):

| Column | Type | Min | Max |
|:---|:---|:---|:---|
| modelId | string (length) | 5 | 139 |
| author | string (length) | 2 | 42 |
| last_modified | timestamp[us, tz=UTC] | 2020-02-15 11:33:14 | 2025-08-31 06:26:39 |
| downloads | int64 | 0 | 223M |
| likes | int64 | 0 | 11.7k |
| library_name | string (530 classes) | | |
| tags | list (length) | 1 | 4.05k |
| pipeline_tag | string (55 classes) | | |
| createdAt | timestamp[us, tz=UTC] | 2022-03-02 23:29:04 | 2025-08-31 06:26:13 |
| card | string (length) | 11 | 1.01M |
modelId: suzii/pretrain-gpt2-large-2
author: suzii
last_modified: 2023-09-22T11:10:50Z
downloads: 132
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "gpt2", "text-generation", "generated_from_trainer", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
pipeline_tag: text-generation
createdAt: 2023-09-22T07:14:35Z
card:

---
tags:
- generated_from_trainer
model-index:
- name: pretrain-gpt2-large-2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# pretrain-gpt2-large-2

This model was trained from scratch on the None dataset. It achieves the following results on the evaluation set:
- Loss: 2.4854

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 1
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- gradient_accumulation_steps: 32
- total_train_batch_size: 1024
- total_eval_batch_size: 2
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:---:|:---:|:---:|:---:|
| No log | 1.0 | 167 | 2.5651 |
| No log | 2.0 | 335 | 2.5465 |
| No log | 3.0 | 502 | 2.5311 |
| No log | 4.0 | 670 | 2.5183 |
| No log | 4.99 | 837 | 2.5078 |
| No log | 6.0 | 1005 | 2.4995 |
| 2.5339 | 7.0 | 1173 | 2.4932 |
| 2.5339 | 8.0 | 1340 | 2.4888 |
| 2.5339 | 9.0 | 1508 | 2.4862 |
| 2.5339 | 9.96 | 1670 | 2.4854 |

### Framework versions

- Transformers 4.33.2
- Pytorch 2.0.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
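The `total_train_batch_size` of 1024 in the card above is derived from the other hyperparameters rather than set directly. A quick sketch of the arithmetic, using the values from the hyperparameter list:

```python
# Effective (total) train batch size for multi-GPU training with gradient
# accumulation, as reported in Trainer-generated model cards:
#   per-device batch * number of devices * gradient accumulation steps
train_batch_size = 16            # per device
num_devices = 2                  # distributed_type: multi-GPU
gradient_accumulation_steps = 32

total_train_batch_size = train_batch_size * num_devices * gradient_accumulation_steps
print(total_train_batch_size)  # 1024, matching the card
```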
modelId: Atheer174/Products_NER
author: Atheer174
last_modified: 2023-09-22T11:05:20Z
downloads: 113
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "bert", "token-classification", "generated_from_trainer", "base_model:dslim/bert-base-NER", "base_model:finetune:dslim/bert-base-NER", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
pipeline_tag: token-classification
createdAt: 2023-09-22T02:47:38Z
card:

---
license: mit
base_model: dslim/bert-base-NER
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: Products_NER
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# Products_NER

This model is a fine-tuned version of [dslim/bert-base-NER](https://huggingface.co/dslim/bert-base-NER) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0022
- Precision: 0.9991
- Recall: 0.9992
- F1: 0.9992
- Accuracy: 0.9996

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 0.0051 | 1.0 | 2470 | 0.0035 | 0.9981 | 0.9986 | 0.9984 | 0.9992 |
| 0.0016 | 2.0 | 4940 | 0.0022 | 0.9991 | 0.9992 | 0.9992 | 0.9996 |

### Framework versions

- Transformers 4.33.2
- Pytorch 1.13.1+cu117
- Datasets 2.14.5
- Tokenizers 0.13.3
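The F1 score reported in the card above is the harmonic mean of the listed precision and recall. A small sketch of that relationship, using the card's rounded values (the card's 0.9992 reflects the unrounded scores, so recomputing from rounded inputs lands a hair lower):

```python
precision = 0.9991
recall = 0.9992

# F1 is the harmonic mean of precision and recall
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # ~0.9991 from the rounded inputs
```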
modelId: pranaykoppula/vtonseconduser
author: pranaykoppula
last_modified: 2023-09-22T11:02:39Z
downloads: 1
likes: 0
library_name: diffusers
tags: [ "diffusers", "safetensors", "text-to-image", "stable-diffusion", "license:creativeml-openrail-m", "autotrain_compatible", "endpoints_compatible", "diffusers:StableDiffusionPipeline", "region:us" ]
pipeline_tag: text-to-image
createdAt: 2023-09-22T10:58:30Z
card:

---
license: creativeml-openrail-m
tags:
- text-to-image
- stable-diffusion
---

### vtonseconduser Dreambooth model trained by pranaykoppula with [TheLastBen's fast-DreamBooth](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast-DreamBooth.ipynb) notebook

Test the concept via A1111 Colab [fast-Colab-A1111](https://colab.research.google.com/github/TheLastBen/fast-stable-diffusion/blob/main/fast_stable_diffusion_AUTOMATIC1111.ipynb)

Sample pictures of this concept:

![0](https://huggingface.co/pranaykoppula/vtonseconduser/resolve/main/sample_images/aa448800_l.jpg)
![1](https://huggingface.co/pranaykoppula/vtonseconduser/resolve/main/sample_images/aa448800_l_a4.jpg)
![2](https://huggingface.co/pranaykoppula/vtonseconduser/resolve/main/sample_images/aa448800_l_a5.jpg)
![3](https://huggingface.co/pranaykoppula/vtonseconduser/resolve/main/sample_images/aa448800_l_a2.jpg)
![4](https://huggingface.co/pranaykoppula/vtonseconduser/resolve/main/sample_images/aa448800_l_a1.jpg)
modelId: CyberHarem/manabe_itsuki_idolmastercinderellagirls
author: CyberHarem
last_modified: 2023-09-22T10:37:35Z
downloads: 0
likes: 0
library_name: null
tags: [ "art", "text-to-image", "dataset:CyberHarem/manabe_itsuki_idolmastercinderellagirls", "license:mit", "region:us" ]
pipeline_tag: text-to-image
createdAt: 2023-09-22T10:29:21Z
card:

---
license: mit
datasets:
- CyberHarem/manabe_itsuki_idolmastercinderellagirls
pipeline_tag: text-to-image
tags:
- art
---

# Lora of manabe_itsuki_idolmastercinderellagirls

This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion), and the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs). The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).

After downloading the pt and safetensors files for the specified step, you need to use them together. The pt file is loaded as an embedding, while the safetensors file is loaded as a Lora. For example, to use the model from step 4080, download `4080/manabe_itsuki_idolmastercinderellagirls.pt` as the embedding and `4080/manabe_itsuki_idolmastercinderellagirls.safetensors` for the Lora. By using both files together, you can generate images of the desired character.

**The best step we recommend is 4080**, with a score of 0.948.

The trigger words are:

1. `manabe_itsuki_idolmastercinderellagirls`
2. `brown_hair, ponytail, smile, open_mouth, brown_eyes, blush, breasts, long_hair`

We do not recommend this model for the following groups, and we express our regret:

1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals facing application scenarios with high demands for accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness of AI-generated images based on the Stable Diffusion algorithm.
4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or who believe that character models must be trained purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.

These are the available steps:

| Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 5100 | 0.921 | [Download](5100/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![pattern_2-5100](5100/previews/pattern_2.png) | ![pattern_3-5100](5100/previews/pattern_3.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) |
| 4760 | 0.922 | [Download](4760/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | ![pattern_2-4760](4760/previews/pattern_2.png) | ![pattern_3-4760](4760/previews/pattern_3.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) |
| 4420 | 0.908 | [Download](4420/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![pattern_2-4420](4420/previews/pattern_2.png) | ![pattern_3-4420](4420/previews/pattern_3.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) |
| **4080** | **0.948** | [**Download**](4080/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![pattern_2-4080](4080/previews/pattern_2.png) | ![pattern_3-4080](4080/previews/pattern_3.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) |
| 3740 | 0.923 | [Download](3740/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-3740](3740/previews/pattern_1.png) | ![pattern_2-3740](3740/previews/pattern_2.png) | ![pattern_3-3740](3740/previews/pattern_3.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) |
| 3400 | 0.901 | [Download](3400/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![pattern_2-3400](3400/previews/pattern_2.png) | ![pattern_3-3400](3400/previews/pattern_3.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) |
| 3060 | 0.898 | [Download](3060/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-3060](3060/previews/pattern_1.png) | ![pattern_2-3060](3060/previews/pattern_2.png) | ![pattern_3-3060](3060/previews/pattern_3.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) |
| 2720 | 0.908 | [Download](2720/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![pattern_2-2720](2720/previews/pattern_2.png) | ![pattern_3-2720](2720/previews/pattern_3.png) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) |
| 2380 | 0.917 | [Download](2380/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | ![pattern_2-2380](2380/previews/pattern_2.png) | ![pattern_3-2380](2380/previews/pattern_3.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) |
| 2040 | 0.873 | [Download](2040/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![pattern_2-2040](2040/previews/pattern_2.png) | ![pattern_3-2040](2040/previews/pattern_3.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) |
| 1700 | 0.923 | [Download](1700/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![pattern_2-1700](1700/previews/pattern_2.png) | ![pattern_3-1700](1700/previews/pattern_3.png) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) |
| 1360 | 0.896 | [Download](1360/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![pattern_2-1360](1360/previews/pattern_2.png) | ![pattern_3-1360](1360/previews/pattern_3.png) | ![bikini-1360](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) |
| 1020 | 0.716 | [Download](1020/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![pattern_2-1020](1020/previews/pattern_2.png) | ![pattern_3-1020](1020/previews/pattern_3.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) |
| 680 | 0.792 | [Download](680/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-680](680/previews/pattern_1.png) | ![pattern_2-680](680/previews/pattern_2.png) | ![pattern_3-680](680/previews/pattern_3.png) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) |
| 340 | 0.469 | [Download](340/manabe_itsuki_idolmastercinderellagirls.zip) | ![pattern_1-340](340/previews/pattern_1.png) | ![pattern_2-340](340/previews/pattern_2.png) | ![pattern_3-340](340/previews/pattern_3.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
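The recommended step in the card above is simply the argmax of the per-step scores in the table. A sketch over the table's Steps and Score columns:

```python
# Per-step scores transcribed from the card's table
scores = {
    5100: 0.921, 4760: 0.922, 4420: 0.908, 4080: 0.948, 3740: 0.923,
    3400: 0.901, 3060: 0.898, 2720: 0.908, 2380: 0.917, 2040: 0.873,
    1700: 0.923, 1360: 0.896, 1020: 0.716, 680: 0.792, 340: 0.469,
}

# Pick the checkpoint step with the highest score
best_step = max(scores, key=scores.get)
print(best_step, scores[best_step])  # 4080 0.948, matching the recommendation
```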
modelId: Nonzerophilip/testThesisSmall
author: Nonzerophilip
last_modified: 2023-09-22T10:32:17Z
downloads: 107
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "bert", "token-classification", "generated_from_trainer", "base_model:KBLab/bert-base-swedish-cased-ner", "base_model:finetune:KBLab/bert-base-swedish-cased-ner", "autotrain_compatible", "endpoints_compatible", "region:us" ]
pipeline_tag: token-classification
createdAt: 2023-09-22T10:26:03Z
card:

---
base_model: KBLab/bert-base-swedish-cased-ner
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: testThesisSmall
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# testThesisSmall

This model is a fine-tuned version of [KBLab/bert-base-swedish-cased-ner](https://huggingface.co/KBLab/bert-base-swedish-cased-ner) on the None dataset. It achieves the following results on the evaluation set:
- Loss: 0.5213
- Precision: 0.4406
- Recall: 0.2977
- F1: 0.3553
- Accuracy: 0.8680

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| No log | 1.0 | 15 | 0.6246 | 0.3586 | 0.1739 | 0.2342 | 0.8469 |
| No log | 2.0 | 30 | 0.5443 | 0.3785 | 0.2241 | 0.2815 | 0.8583 |
| No log | 3.0 | 45 | 0.5213 | 0.4406 | 0.2977 | 0.3553 | 0.8680 |

### Framework versions

- Transformers 4.33.0
- Pytorch 2.0.1
- Datasets 2.14.5
- Tokenizers 0.13.3
modelId: KnutJaegersberg/companion_cube_ggml
author: KnutJaegersberg
last_modified: 2023-09-22T10:23:24Z
downloads: 0
likes: 2
library_name: null
tags: [ "dataset:jkhedri/psychology-dataset", "dataset:Amod/mental_health_counseling_conversations", "dataset:Adapting/empathetic_dialogues_v2", "license:apache-2.0", "region:us" ]
pipeline_tag: null
createdAt: 2023-09-22T09:01:39Z
card:

---
license: apache-2.0
datasets:
- jkhedri/psychology-dataset
- Amod/mental_health_counseling_conversations
- Adapting/empathetic_dialogues_v2
---

![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/63732ebbbd81fae2b3aaf3fb/F4Vx4EZN6zRCiRKF58Heb.jpeg)

This tensor is made for entertainment; I built it while practicing how to do DPO. After I stumbled on psychotherapy datasets on Hugging Face, I mixed them with Meta's empathetic dialogues dataset and got interested in what happens when fine-tuning OpenLLaMA 7B on them. One of the datasets includes accepted and rejected therapist answers, so I used that one for DPO. This is not a band-aid replacement for seeking professional help.

That said, here is a CPU version for the GPU-less anons. You can get interesting interactions out of it with the chat mode of the textgen webui; I get 6 tokens per second on 8 cores.

Prompt example:

```
### System:
You are an empathetic, self-aware, open-minded psychotherapist with good listening skills. User will need your support. Your goal is to help patients cope with their problems. Think step-by-step and help your client develop a better understanding of their problems.
### User:
I'm not feeling well.
### Psychotherapist:
```

![image/png](https://cdn-uploads.huggingface.co/production/uploads/63732ebbbd81fae2b3aaf3fb/XBxNhL_xJl9htCguzA_qh.png)
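The prompt format shown in the card above can be assembled programmatically before being sent to the model. A minimal sketch; the header strings mirror the card's example, and the helper name is ours, not part of the model:

```python
# System instruction taken verbatim from the card's prompt example
SYSTEM = (
    "You are an empathetic, self-aware, open-minded psychotherapist with good "
    "listening skills. User will need your support. Your goal is to help patients "
    "cope with their problems. Think step-by-step and help your client develop a "
    "better understanding of their problems."
)

def build_prompt(user_message: str) -> str:
    # Reproduces the "### System / ### User / ### Psychotherapist:" layout
    return (
        f"### System:\n{SYSTEM}\n"
        f"### User:\n{user_message}\n"
        f"### Psychotherapist:\n"
    )

prompt = build_prompt("I'm not feeling well.")
print(prompt)
```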
modelId: HazemHM/q-Taxi-v3
author: HazemHM
last_modified: 2023-09-22T10:19:47Z
downloads: 0
likes: 0
library_name: null
tags: [ "Taxi-v3", "q-learning", "reinforcement-learning", "custom-implementation", "model-index", "region:us" ]
pipeline_tag: reinforcement-learning
createdAt: 2023-09-22T09:06:22Z
card:

---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-Taxi-v3
  results:
  - task:
      type: reinforcement-learning
      name: reinforcement-learning
    dataset:
      name: Taxi-v3
      type: Taxi-v3
    metrics:
    - type: mean_reward
      value: 7.56 +/- 2.71
      name: mean_reward
      verified: false
---

# **Q-Learning** Agent playing **Taxi-v3**

This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.

## Usage

```python
import gym  # or gymnasium, depending on your setup

# `load_from_hub` is the helper defined in the accompanying course notebook
model = load_from_hub(repo_id="HazemHM/q-Taxi-v3", filename="q-learning.pkl")

# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
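A tabular Q-learning agent like the one above acts greedily over its Q-table at evaluation time: for the current state, it picks the action with the highest Q-value. A toy sketch; this tiny 3-action table is made up for illustration, not taken from the model:

```python
# Toy Q-table: one row per state, one Q-value per action
q_table = [
    [0.1, 0.5, 0.2],  # state 0 -> best action is 1
    [0.9, 0.3, 0.4],  # state 1 -> best action is 0
]

def greedy_action(q_table, state):
    """Pick the action with the highest Q-value for this state."""
    row = q_table[state]
    return max(range(len(row)), key=row.__getitem__)

print(greedy_action(q_table, 0))  # 1
print(greedy_action(q_table, 1))  # 0
```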
modelId: MattStammers/appo-atari-amidar
author: MattStammers
last_modified: 2023-09-22T10:06:22Z
downloads: 0
likes: 0
library_name: sample-factory
tags: [ "sample-factory", "tensorboard", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
pipeline_tag: reinforcement-learning
createdAt: 2023-09-22T10:04:38Z
card:

---
library_name: sample-factory
tags:
- deep-reinforcement-learning
- reinforcement-learning
- sample-factory
model-index:
- name: APPO
  results:
  - task:
      type: reinforcement-learning
      name: reinforcement-learning
    dataset:
      name: atari_amidar
      type: atari_amidar
    metrics:
    - type: mean_reward
      value: 246.00 +/- 77.43
      name: mean_reward
      verified: false
---

An **APPO** model trained on the **atari_amidar** environment.

This model was trained using Sample-Factory 2.0: https://github.com/alex-petrenko/sample-factory. Documentation for how to use Sample-Factory can be found at https://www.samplefactory.dev/

## Downloading the model

After installing Sample-Factory, download the model with:

```
python -m sample_factory.huggingface.load_from_hub -r MattStammers/appo-atari-amidar
```

## Using the model

To run the model after download, use the `enjoy` script corresponding to this environment:

```
python -m sf_examples.atari.enjoy_atari --algo=APPO --env=atari_amidar --train_dir=./train_dir --experiment=appo-atari-amidar
```

You can also upload models to the Hugging Face Hub using the same script with the `--push_to_hub` flag. See https://www.samplefactory.dev/10-huggingface/huggingface/ for more details.

## Training with this model

To continue training with this model, use the `train` script corresponding to this environment:

```
python -m sf_examples.atari.train_atari --algo=APPO --env=atari_amidar --train_dir=./train_dir --experiment=appo-atari-amidar --restart_behavior=resume --train_for_env_steps=10000000000
```

Note: you may have to adjust `--train_for_env_steps` to a suitably high number, as the experiment will resume at the number of steps it concluded at.
modelId: Vasanth/xwin-finetuned-alpaca-cleaned
author: Vasanth
last_modified: 2023-09-22T10:02:51Z
downloads: 0
likes: 0
library_name: null
tags: [ "generated_from_trainer", "base_model:TheBloke/Xwin-LM-7B-V0.1-GPTQ", "base_model:finetune:TheBloke/Xwin-LM-7B-V0.1-GPTQ", "license:llama2", "region:us" ]
pipeline_tag: null
createdAt: 2023-09-22T09:02:15Z
card:

---
license: llama2
base_model: TheBloke/Xwin-LM-7B-V0.1-GPTQ
tags:
- generated_from_trainer
model-index:
- name: xwin-finetuned-alpaca-cleaned
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# xwin-finetuned-alpaca-cleaned

This model is a fine-tuned version of [TheBloke/Xwin-LM-7B-V0.1-GPTQ](https://huggingface.co/TheBloke/Xwin-LM-7B-V0.1-GPTQ) on the None dataset.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- training_steps: 250

### Training results

### Framework versions

- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
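The card above pairs `lr_scheduler_type: cosine` with `training_steps: 250`, i.e. the learning rate follows a cosine curve from its base value toward zero over the run. A sketch of the usual cosine-decay shape, assuming no warmup (none is listed in the card):

```python
import math

base_lr = 2e-4      # learning_rate from the card
total_steps = 250   # training_steps from the card

def cosine_lr(step: int) -> float:
    # Standard cosine decay from base_lr down to 0 over total_steps
    return base_lr * 0.5 * (1 + math.cos(math.pi * step / total_steps))

print(cosine_lr(0))              # base lr at the start
print(round(cosine_lr(125), 6))  # half the base lr at the midpoint
print(cosine_lr(250))            # ~0 at the end
```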
modelId: EladAssia/dqn-SpaceInvadersNoFrameskip-v4
author: EladAssia
last_modified: 2023-09-22T10:00:19Z
downloads: 4
likes: 0
library_name: stable-baselines3
tags: [ "stable-baselines3", "SpaceInvadersNoFrameskip-v4", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
pipeline_tag: reinforcement-learning
createdAt: 2023-09-22T09:59:40Z
card:

---
library_name: stable-baselines3
tags:
- SpaceInvadersNoFrameskip-v4
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: DQN
  results:
  - task:
      type: reinforcement-learning
      name: reinforcement-learning
    dataset:
      name: SpaceInvadersNoFrameskip-v4
      type: SpaceInvadersNoFrameskip-v4
    metrics:
    - type: mean_reward
      value: 469.00 +/- 248.28
      name: mean_reward
      verified: false
---

# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4**

This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3) and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo).

The RL Zoo is a training framework for Stable Baselines3 reinforcement learning agents, with hyperparameter optimization and pre-trained agents included.

## Usage (with SB3 RL Zoo)

RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/>
SB3: https://github.com/DLR-RM/stable-baselines3<br/>
SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib

Install the RL Zoo (with SB3 and SB3-Contrib):

```bash
pip install rl_zoo3
```

```
# Download model and save it into the logs/ folder
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga EladAssia -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```

If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do:

```
python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga EladAssia -f logs/
python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
```

## Training (with the RL Zoo)

```
python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/
# Upload the model and generate video (when possible)
python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga EladAssia
```

## Hyperparameters

```python
OrderedDict([('batch_size', 32),
             ('buffer_size', 100000),
             ('env_wrapper', ['stable_baselines3.common.atari_wrappers.AtariWrapper']),
             ('exploration_final_eps', 0.01),
             ('exploration_fraction', 0.1),
             ('frame_stack', 4),
             ('gradient_steps', 1),
             ('learning_rate', 0.0001),
             ('learning_starts', 100000),
             ('n_timesteps', 1000000),
             ('optimize_memory_usage', False),
             ('policy', 'CnnPolicy'),
             ('target_update_interval', 1000),
             ('train_freq', 4),
             ('normalize', False)])
```

# Environment Arguments

```python
{'render_mode': 'rgb_array'}
```
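The `mean_reward value: 469.00 +/- 248.28` in the card's metadata is simply the mean and standard deviation of the returns over evaluation episodes. A sketch with made-up episode returns chosen so the mean matches the card; whether such reports use population or sample standard deviation is an implementation detail, and this sketch uses the population form:

```python
import math

# Hypothetical per-episode returns from an evaluation run (illustrative only)
returns = [210.0, 455.0, 380.0, 820.0, 480.0]

mean = sum(returns) / len(returns)
# Population standard deviation over the evaluation episodes
std = math.sqrt(sum((r - mean) ** 2 for r in returns) / len(returns))

print(f"{mean:.2f} +/- {std:.2f}")  # mean matches the card's 469.00
```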
modelId: ldos/text_shortening_model_v49
author: ldos
last_modified: 2023-09-22T09:54:27Z
downloads: 105
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "bart", "text2text-generation", "generated_from_trainer", "base_model:facebook/bart-large-xsum", "base_model:finetune:facebook/bart-large-xsum", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
pipeline_tag: text2text-generation
createdAt: 2023-09-22T08:28:40Z
card:

---
license: mit
base_model: facebook/bart-large-xsum
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: text_shortening_model_v49
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# text_shortening_model_v49

This model is a fine-tuned version of [facebook/bart-large-xsum](https://huggingface.co/facebook/bart-large-xsum) on the None dataset. It achieves the following results on the evaluation set:
- Loss: 1.7760
- Rouge1: 0.5119
- Rouge2: 0.2768
- Rougel: 0.4448
- Rougelsum: 0.4444
- Bert precision: 0.8755
- Bert recall: 0.8801
- Average word count: 8.8492
- Max word count: 20
- Min word count: 5
- Average token count: 16.4709
- % shortened texts with length > 12: 8.7302

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Bert precision | Bert recall | Average word count | Max word count | Min word count | Average token count | % shortened texts with length > 12 |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.8542 | 1.0 | 83 | 1.6189 | 0.5121 | 0.2699 | 0.4302 | 0.4304 | 0.863 | 0.8909 | 11.3386 | 21 | 5 | 19.4312 | 31.746 |
| 0.9651 | 2.0 | 166 | 1.4837 | 0.4957 | 0.2664 | 0.4347 | 0.4362 | 0.8687 | 0.8758 | 8.8598 | 19 | 4 | 16.9815 | 9.2593 |
| 0.608 | 3.0 | 249 | 1.4074 | 0.5012 | 0.2693 | 0.4346 | 0.4342 | 0.8725 | 0.8781 | 8.836 | 20 | 4 | 15.5265 | 5.5556 |
| 0.3788 | 4.0 | 332 | 1.5646 | 0.5202 | 0.2836 | 0.4535 | 0.4537 | 0.876 | 0.881 | 8.9312 | 18 | 5 | 16.4365 | 10.3175 |
| 0.2296 | 5.0 | 415 | 1.7760 | 0.5119 | 0.2768 | 0.4448 | 0.4444 | 0.8755 | 0.8801 | 8.8492 | 20 | 5 | 16.4709 | 8.7302 |

### Framework versions

- Transformers 4.33.1
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
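Several of the card's evaluation columns (average/max/min word count and "% shortened texts with length > 12") are plain length statistics over the generated shortenings. A sketch with toy outputs; the real evaluation set is much larger, and these strings are invented for illustration:

```python
# Toy generated shortenings (illustrative only)
outputs = [
    "book flights to paris this weekend",
    "reset your password via the settings page now",
    "team meeting moved to friday at three pm sharp ok",
]

word_counts = [len(text.split()) for text in outputs]

avg_words = sum(word_counts) / len(word_counts)
max_words = max(word_counts)
min_words = min(word_counts)
# Percentage of shortenings that are still longer than 12 words
pct_over_12 = 100 * sum(c > 12 for c in word_counts) / len(word_counts)

print(avg_words, max_words, min_words, pct_over_12)  # 8.0 10 6 0.0
```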
modelId: zongxiao/hubert-base-ls960-finetuned-gtzan
author: zongxiao
last_modified: 2023-09-22T09:53:07Z
downloads: 159
likes: 0
library_name: transformers
tags: [ "transformers", "pytorch", "hubert", "audio-classification", "generated_from_trainer", "dataset:marsyas/gtzan", "base_model:facebook/hubert-base-ls960", "base_model:finetune:facebook/hubert-base-ls960", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
pipeline_tag: audio-classification
createdAt: 2023-09-22T02:26:34Z
card:

---
license: apache-2.0
base_model: facebook/hubert-base-ls960
tags:
- generated_from_trainer
datasets:
- marsyas/gtzan
metrics:
- accuracy
model-index:
- name: hubert-base-ls960-finetuned-gtzan
  results:
  - task:
      name: Audio Classification
      type: audio-classification
    dataset:
      name: GTZAN
      type: marsyas/gtzan
      config: all
      split: train
      args: all
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.82
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# hubert-base-ls960-finetuned-gtzan

This model is a fine-tuned version of [facebook/hubert-base-ls960](https://huggingface.co/facebook/hubert-base-ls960) on the GTZAN dataset. It achieves the following results on the evaluation set:
- Loss: 0.6912
- Accuracy: 0.82

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:---:|:---:|:---:|:---:|:---:|
| 2.0279 | 1.0 | 112 | 2.1237 | 0.14 |
| 1.8601 | 1.99 | 224 | 1.4994 | 0.55 |
| 1.2448 | 3.0 | 337 | 1.2065 | 0.62 |
| 1.2081 | 4.0 | 449 | 0.9849 | 0.64 |
| 1.1896 | 4.99 | 561 | 0.8475 | 0.69 |
| 0.6236 | 6.0 | 674 | 1.0019 | 0.73 |
| 0.6113 | 6.99 | 786 | 1.0411 | 0.7 |
| 0.5026 | 8.0 | 899 | 0.8096 | 0.77 |
| 0.5218 | 9.0 | 1011 | 0.7381 | 0.79 |
| 0.4961 | 9.97 | 1120 | 0.6912 | 0.82 |

### Framework versions

- Transformers 4.34.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.14.0
OpenDILabCommunity/Pendulum-v1-DDPG
OpenDILabCommunity
2023-09-22T09:47:12Z
0
0
pytorch
[ "pytorch", "deep-reinforcement-learning", "reinforcement-learning", "DI-engine", "Pendulum-v1", "en", "license:apache-2.0", "region:us" ]
reinforcement-learning
2023-04-29T12:35:40Z
--- language: en license: apache-2.0 library_name: pytorch tags: - deep-reinforcement-learning - reinforcement-learning - DI-engine - Pendulum-v1 benchmark_name: OpenAI/Gym/ClassicControl task_name: Pendulum-v1 pipeline_tag: reinforcement-learning model-index: - name: DDPG results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: OpenAI/Gym/ClassicControl-Pendulum-v1 type: OpenAI/Gym/ClassicControl-Pendulum-v1 metrics: - type: mean_reward value: -185.23 +/- 138.05 name: mean_reward --- # Play **Pendulum-v1** with **DDPG** Policy ## Model Description <!-- Provide a longer summary of what this model is. --> This is a simple **DDPG** implementation for OpenAI/Gym/ClassicControl **Pendulum-v1** using the [DI-engine library](https://github.com/opendilab/di-engine) and the [DI-zoo](https://github.com/opendilab/DI-engine/tree/main/dizoo). **DI-engine** is a Python library for solving general decision intelligence problems, built on reinforcement learning framework implementations in PyTorch and JAX. The library aims to standardize the reinforcement learning framework across different algorithms, benchmarks, and environments, and to support both academic research and prototype applications. In addition, self-customized training pipelines and applications are supported by reusing the different abstraction levels of the DI-engine reinforcement learning framework. 
## Model Usage ### Install the Dependencies <details close> <summary>(Click for Details)</summary> ```shell # install huggingface_ding git clone https://github.com/opendilab/huggingface_ding.git pip3 install -e ./huggingface_ding/ # install environment dependencies if needed pip3 install DI-engine[common_env] ``` </details> ### Git Clone from Huggingface and Run the Model <details close> <summary>(Click for Details)</summary> ```shell # running with trained model python3 -u run.py ``` **run.py** ```python from ding.bonus import DDPGAgent from ding.config import Config from easydict import EasyDict import torch # Pull model from files which are git cloned from huggingface policy_state_dict = torch.load("pytorch_model.bin", map_location=torch.device("cpu")) cfg = EasyDict(Config.file_to_dict("policy_config.py").cfg_dict) # Instantiate the agent agent = DDPGAgent(env_id="Pendulum-v1", exp_name="Pendulum-v1-DDPG", cfg=cfg.exp_config, policy_state_dict=policy_state_dict) # Continue training agent.train(step=5000) # Render the new agent performance agent.deploy(enable_save_replay=True) ``` </details> ### Run Model by Using Huggingface_ding <details close> <summary>(Click for Details)</summary> ```shell # running with trained model python3 -u run.py ``` **run.py** ```python from ding.bonus import DDPGAgent from huggingface_ding import pull_model_from_hub # Pull model from Huggingface hub policy_state_dict, cfg = pull_model_from_hub(repo_id="OpenDILabCommunity/Pendulum-v1-DDPG") # Instantiate the agent agent = DDPGAgent(env_id="Pendulum-v1", exp_name="Pendulum-v1-DDPG", cfg=cfg.exp_config, policy_state_dict=policy_state_dict) # Continue training agent.train(step=5000) # Render the new agent performance agent.deploy(enable_save_replay=True) ``` </details> ## Model Training ### Train the Model and Push to Huggingface_hub <details close> <summary>(Click for Details)</summary> ```shell # Training Your Own Agent python3 -u train.py ``` **train.py** ```python from ding.bonus 
import DDPGAgent from huggingface_ding import push_model_to_hub # Instantiate the agent agent = DDPGAgent(env_id="Pendulum-v1", exp_name="Pendulum-v1-DDPG") # Train the agent return_ = agent.train(step=int(4000000)) # Push model to huggingface hub push_model_to_hub( agent=agent.best, env_name="OpenAI/Gym/ClassicControl", task_name="Pendulum-v1", algo_name="DDPG", wandb_url=return_.wandb_url, github_repo_url="https://github.com/opendilab/DI-engine", github_doc_model_url="https://di-engine-docs.readthedocs.io/en/latest/12_policies/ddpg.html", github_doc_env_url="https://di-engine-docs.readthedocs.io/en/latest/13_envs/pendulum.html", installation_guide="pip3 install DI-engine[common_env]", usage_file_by_git_clone="./ddpg/pendulum_ddpg_deploy.py", usage_file_by_huggingface_ding="./ddpg/pendulum_ddpg_download.py", train_file="./ddpg/pendulum_ddpg.py", repo_id="OpenDILabCommunity/Pendulum-v1-DDPG", create_repo=False ) ``` </details> **Configuration** <details close> <summary>(Click for Details)</summary> ```python exp_config = { 'env': { 'manager': { 'episode_num': float("inf"), 'max_retry': 1, 'retry_type': 'reset', 'auto_reset': True, 'step_timeout': None, 'reset_timeout': None, 'retry_waiting_time': 0.1, 'cfg_type': 'BaseEnvManagerDict' }, 'stop_value': -250, 'n_evaluator_episode': 5, 'env_id': 'Pendulum-v1', 'collector_env_num': 8, 'evaluator_env_num': 5, 'act_scale': True }, 'policy': { 'model': { 'obs_shape': 3, 'action_shape': 1, 'twin_critic': False, 'action_space': 'regression' }, 'learn': { 'learner': { 'train_iterations': 1000000000, 'dataloader': { 'num_workers': 0 }, 'log_policy': True, 'hook': { 'load_ckpt_before_run': '', 'log_show_after_iter': 100, 'save_ckpt_after_iter': 10000, 'save_ckpt_after_run': True }, 'cfg_type': 'BaseLearnerDict' }, 'update_per_collect': 2, 'batch_size': 128, 'learning_rate_actor': 0.001, 'learning_rate_critic': 0.001, 'ignore_done': True, 'target_theta': 0.005, 'discount_factor': 0.99, 'actor_update_freq': 1, 'noise': False }, 
'collect': { 'collector': { 'collect_print_freq': 1000 }, 'unroll_len': 1, 'noise_sigma': 0.1, 'n_sample': 48 }, 'eval': { 'evaluator': { 'eval_freq': 100, 'render': { 'render_freq': -1, 'mode': 'train_iter' }, 'figure_path': None, 'cfg_type': 'InteractionSerialEvaluatorDict', 'stop_value': -250, 'n_episode': 5 } }, 'other': { 'replay_buffer': { 'replay_buffer_size': 20000, 'max_use': 16 } }, 'on_policy': False, 'cuda': False, 'multi_gpu': False, 'bp_update_sync': True, 'traj_len_inf': False, 'type': 'ddpg', 'priority': False, 'priority_IS_weight': False, 'random_collect_size': 800, 'transition_with_policy_data': False, 'action_space': 'continuous', 'reward_batch_norm': False, 'multi_agent': False, 'cfg_type': 'DDPGPolicyDict' }, 'exp_name': 'Pendulum-v1-DDPG', 'seed': 0, 'wandb_logger': { 'gradient_logger': True, 'video_logger': True, 'plot_logger': True, 'action_logger': True, 'return_logger': False } } ``` </details> **Training Procedure** <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> - **Weights & Biases (wandb):** [monitor link](https://wandb.ai/zjowowen/Pendulum-v1-DDPG) ## Model Information <!-- Provide the basic links for the model. --> - **Github Repository:** [repo link](https://github.com/opendilab/DI-engine) - **Doc**: [DI-engine-docs Algorithm link](https://di-engine-docs.readthedocs.io/en/latest/12_policies/ddpg.html) - **Configuration:** [config link](https://huggingface.co/OpenDILabCommunity/Pendulum-v1-DDPG/blob/main/policy_config.py) - **Demo:** [video](https://huggingface.co/OpenDILabCommunity/Pendulum-v1-DDPG/blob/main/replay.mp4) <!-- Provide the size information for the model. --> - **Parameters total size:** 70.52 KB - **Last Update Date:** 2023-09-22 ## Environments <!-- Address questions around what environment the model is intended to be trained and deployed at, including the necessary information needed to be provided for future users. 
--> - **Benchmark:** OpenAI/Gym/ClassicControl - **Task:** Pendulum-v1 - **Gym version:** 0.25.1 - **DI-engine version:** v0.4.9 - **PyTorch version:** 2.0.1+cu117 - **Doc**: [DI-engine-docs Environments link](https://di-engine-docs.readthedocs.io/en/latest/13_envs/pendulum.html)
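The `target_theta` value of 0.005 in the configuration above controls the soft (Polyak) update of the DDPG target networks: on each update, the target parameters move a small step toward the online parameters. A minimal numeric sketch of that update rule, using plain Python lists rather than DI-engine's actual API, could look like:

```python
def soft_update(target_params, online_params, theta=0.005):
    """Polyak-average online parameters into the target network.

    target <- theta * online + (1 - theta) * target
    """
    return [
        theta * online + (1.0 - theta) * target
        for target, online in zip(target_params, online_params)
    ]

# Toy example: a "network" with two scalar parameters.
target = [0.0, 1.0]
online = [1.0, 0.0]
updated = soft_update(target, online, theta=0.005)
# updated is approximately [0.005, 0.995] -- the target drifts
# slowly toward the online network, which stabilizes training.
```

With `theta` this small, the target network lags far behind the online network, which is the usual stabilization trick in DDPG-style algorithms.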
polejowska/detr-r50-cd45rb-8ah-6l-500q-corrected
polejowska
2023-09-22T09:39:34Z
170
0
transformers
[ "transformers", "pytorch", "detr", "object-detection", "dataset:polejowska/cd45rb", "endpoints_compatible", "region:us" ]
object-detection
2023-09-10T19:46:49Z
--- datasets: - polejowska/cd45rb ---
polejowska/detr-r101-cd45rb-8ah-6l-256d-4096ffn-correcetd
polejowska
2023-09-22T09:38:38Z
167
0
transformers
[ "transformers", "pytorch", "detr", "object-detection", "generated_from_trainer", "dataset:polejowska/cd45rb", "endpoints_compatible", "region:us" ]
object-detection
2023-09-09T10:06:08Z
--- tags: - generated_from_trainer datasets: - polejowska/cd45rb model-index: - name: detr-r101-cd45rb-8ah-6l-256d-4096ffn-correcetd results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr-r101-cd45rb-8ah-6l-256d-4096ffn-correcetd This model is a fine-tuned version of [](https://huggingface.co/) on the cd45rb_nan_xywh dataset. It achieves the following results on the evaluation set: - Loss: 1.5819 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 25 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:------:|:---------------:| | 2.3194 | 1.0 | 4606 | 1.7874 | | 2.3461 | 2.0 | 9212 | 1.8376 | | 2.3028 | 3.0 | 13818 | 1.9087 | | 2.2979 | 4.0 | 18424 | 1.9066 | | 2.273 | 5.0 | 23030 | 1.7689 | | 2.2372 | 6.0 | 27636 | 1.8501 | | 2.2429 | 7.0 | 32242 | 1.7625 | | 2.2066 | 8.0 | 36848 | 1.7588 | | 2.1914 | 9.0 | 41454 | 1.7062 | | 2.1553 | 10.0 | 46060 | 1.7069 | | 2.1302 | 11.0 | 50666 | 1.6863 | | 2.1321 | 12.0 | 55272 | 1.7722 | | 2.1227 | 13.0 | 59878 | 1.6639 | | 2.087 | 14.0 | 64484 | 1.6486 | | 2.0676 | 15.0 | 69090 | 1.6632 | | 2.0604 | 16.0 | 73696 | 1.6429 | | 2.0435 | 17.0 | 78302 | 1.6224 | | 2.0241 | 18.0 | 82908 | 1.6321 | | 2.0041 | 19.0 | 87514 | 1.5970 | | 1.9966 | 20.0 | 92120 | 1.5943 | | 2.0165 | 21.0 | 96726 | 1.6092 | | 2.007 | 22.0 | 101332 | 1.5986 | | 2.0009 | 23.0 | 105938 | 1.5931 | | 1.992 | 24.0 | 110544 | 1.5898 | | 
1.9769 | 25.0 | 115150 | 1.5819 | ### Framework versions - Transformers 4.28.0 - Pytorch 2.0.1 - Datasets 2.12.0 - Tokenizers 0.13.3
BBBBirdIsTheWord/q-Taxi-v3
BBBBirdIsTheWord
2023-09-22T09:35:05Z
0
0
null
[ "Taxi-v3", "q-learning", "reinforcement-learning", "custom-implementation", "model-index", "region:us" ]
reinforcement-learning
2023-09-22T07:41:15Z
--- tags: - Taxi-v3 - q-learning - reinforcement-learning - custom-implementation model-index: - name: q-Taxi-v3 results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: Taxi-v3 type: Taxi-v3 metrics: - type: mean_reward value: 7.56 +/- 2.71 name: mean_reward verified: false --- # **Q-Learning** Agent playing **Taxi-v3** This is a trained model of a **Q-Learning** agent playing **Taxi-v3**. ## Usage ```python model = load_from_hub(repo_id="BBBBirdIsTheWord/q-Taxi-v3", filename="q-learning.pkl") # Don't forget to check if you need to add additional attributes (is_slippery=False etc.) env = gym.make(model["env_id"]) ```
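Once the Q-table has been loaded, a Q-Learning agent acts greedily by picking the action with the highest Q-value in the current state. A minimal, self-contained sketch of that selection step (independent of the `load_from_hub` helper above, which comes from the course utilities) could be:

```python
def greedy_action(qtable, state):
    """Return the index of the highest-valued action for `state`."""
    row = qtable[state]
    return max(range(len(row)), key=lambda a: row[a])

# Toy Q-table: 2 states x 3 actions.
qtable = [
    [0.1, 0.5, 0.2],  # state 0 -> best action is 1
    [0.9, 0.0, 0.3],  # state 1 -> best action is 0
]
assert greedy_action(qtable, 0) == 1
assert greedy_action(qtable, 1) == 0
```

In an evaluation loop you would call this on the state returned by `env.reset()` / `env.step()` at every timestep.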
THUDM/MathGLM
THUDM
2023-09-22T09:19:09Z
1
8
null
[ "arxiv:2309.03241", "license:afl-3.0", "region:us" ]
null
2023-09-21T20:25:06Z
--- license: afl-3.0 --- ## Model description MathGLM-10B is finetuned from GLM-10B on a dataset with additional multi-step arithmetic operations and math problems described in text, and achieves performance similar to GPT-4 on a 5,000-sample Chinese math problem test set. ## How to use First, run the following command to install SwissArmyTransformer (sat): ``` pip install SwissArmyTransformer ``` Second, run the inference code to evaluate our MathGLM-10B: ``` bash inference.sh ``` ## Citation Please cite our paper if you find this code useful for your research: ``` @article{yang2023gpt, title={GPT Can Solve Mathematical Problems Without a Calculator}, author={Yang, Zhen and Ding, Ming and Lv, Qingsong and Jiang, Zhihuan and He, Zehai and Guo, Yuyi and Bai, Jinfeng and Tang, Jie}, journal={arXiv preprint arXiv:2309.03241}, year={2023} } ```
huytx267/matching_default
huytx267
2023-09-22T09:18:20Z
12
0
sentence-transformers
[ "sentence-transformers", "pytorch", "distilbert", "feature-extraction", "sentence-similarity", "transformers", "multilingual", "arxiv:1908.10084", "license:apache-2.0", "autotrain_compatible", "text-embeddings-inference", "endpoints_compatible", "region:us" ]
sentence-similarity
2023-09-22T09:09:15Z
--- pipeline_tag: sentence-similarity language: multilingual license: apache-2.0 tags: - sentence-transformers - feature-extraction - sentence-similarity - transformers --- # sentence-transformers/distiluse-base-multilingual-cased-v2 This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 512 dimensional dense vector space and can be used for tasks like clustering or semantic search. ## Usage (Sentence-Transformers) Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed: ``` pip install -U sentence-transformers ``` Then you can use the model like this: ```python from sentence_transformers import SentenceTransformer sentences = ["This is an example sentence", "Each sentence is converted"] model = SentenceTransformer('sentence-transformers/distiluse-base-multilingual-cased-v2') embeddings = model.encode(sentences) print(embeddings) ``` ## Evaluation Results For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/distiluse-base-multilingual-cased-v2) ## Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: DistilBertModel (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False}) (2): Dense({'in_features': 768, 'out_features': 512, 'bias': True, 'activation_function': 'torch.nn.modules.activation.Tanh'}) ) ``` ## Citing & Authors This model was trained by [sentence-transformers](https://www.sbert.net/). 
If you find this model helpful, feel free to cite our publication [Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks](https://arxiv.org/abs/1908.10084): ```bibtex @inproceedings{reimers-2019-sentence-bert, title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks", author = "Reimers, Nils and Gurevych, Iryna", booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing", month = "11", year = "2019", publisher = "Association for Computational Linguistics", url = "http://arxiv.org/abs/1908.10084", } ```
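The embeddings returned by `model.encode` above can be compared with cosine similarity for semantic search or clustering. A minimal sketch of that comparison in plain Python (the `sentence_transformers` library also ships its own similarity helpers) could be:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Identical vectors score ~1.0, orthogonal vectors 0.0,
# opposite vectors -1.0.
print(cosine_similarity([1.0, 2.0], [1.0, 2.0]))   # ~1.0
print(cosine_similarity([1.0, 0.0], [0.0, 1.0]))   # 0.0
```

For semantic search you would encode a query, compute this score against every corpus embedding, and rank by the result.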
yicozy/study-dictionary-roberta-base
yicozy
2023-09-22T09:11:15Z
109
0
transformers
[ "transformers", "pytorch", "bert", "text-classification", "generated_from_trainer", "base_model:FacebookAI/roberta-base", "base_model:finetune:FacebookAI/roberta-base", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-21T08:54:02Z
--- license: mit base_model: roberta-base tags: - generated_from_trainer metrics: - f1 - accuracy - recall model-index: - name: study-dictionary-roberta-base results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # study-dictionary-roberta-base This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0011 - F1: 1.0 - Roc Auc: 1.0 - Accuracy: 1.0 - Recall: 1.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 10 - eval_batch_size: 10 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 20 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | Roc Auc | Accuracy | Recall | |:-------------:|:-----:|:-----:|:---------------:|:------:|:-------:|:--------:|:------:| | 0.3342 | 1.0 | 778 | 0.1192 | 0.0 | 0.5 | 0.0 | 0.0 | | 0.1099 | 2.0 | 1556 | 0.1040 | 0.0 | 0.5 | 0.0 | 0.0 | | 0.0892 | 3.0 | 2334 | 0.0465 | 0.6835 | 0.7644 | 0.5479 | 0.5293 | | 0.0345 | 4.0 | 3112 | 0.0240 | 0.9147 | 0.9241 | 0.8817 | 0.8485 | | 0.025 | 5.0 | 3890 | 0.0152 | 0.9594 | 0.9650 | 0.9493 | 0.9303 | | 0.0144 | 6.0 | 4668 | 0.0114 | 0.9735 | 0.9811 | 0.9671 | 0.9625 | | 0.0118 | 7.0 | 5446 | 0.0082 | 0.9779 | 0.9848 | 0.9717 | 0.9700 | | 0.0081 | 8.0 | 6224 | 0.0057 | 0.9873 | 0.9887 | 0.9839 | 0.9774 | | 0.0065 | 9.0 | 7002 | 0.0052 | 0.9839 | 0.9860 | 0.9848 | 0.9720 | | 0.0054 | 10.0 | 7780 | 0.0039 | 0.9895 | 0.9904 | 0.9888 | 0.9809 | | 0.0041 | 11.0 | 8558 
| 0.0030 | 0.9942 | 0.9949 | 0.9925 | 0.9899 | | 0.0036 | 12.0 | 9336 | 0.0026 | 0.9936 | 0.9940 | 0.9942 | 0.9881 | | 0.0027 | 13.0 | 10114 | 0.0023 | 0.9956 | 0.9964 | 0.9958 | 0.9927 | | 0.0023 | 14.0 | 10892 | 0.0018 | 0.9985 | 0.9986 | 0.9972 | 0.9972 | | 0.0021 | 15.0 | 11670 | 0.0017 | 0.9985 | 0.9994 | 0.9974 | 0.9988 | | 0.0018 | 16.0 | 12448 | 0.0015 | 0.9985 | 0.9992 | 0.9979 | 0.9985 | | 0.0014 | 17.0 | 13226 | 0.0012 | 0.9997 | 0.9998 | 0.9994 | 0.9995 | | 0.0013 | 18.0 | 14004 | 0.0011 | 1.0 | 1.0 | 1.0 | 1.0 | | 0.0012 | 19.0 | 14782 | 0.0010 | 1.0 | 1.0 | 1.0 | 1.0 | | 0.0012 | 20.0 | 15560 | 0.0010 | 1.0 | 1.0 | 1.0 | 1.0 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
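The F1 scores in the table above come from a multi-label classification setup. One common variant, micro-averaged F1, pools true/false positives and negatives across all labels before computing precision and recall. A minimal sketch of that computation over binary label matrices (illustrative only — not the exact metric code the Trainer uses) could be:

```python
def micro_f1(y_true, y_pred):
    """Micro-averaged F1 over binary multi-label matrices."""
    tp = fp = fn = 0
    for true_row, pred_row in zip(y_true, y_pred):
        for t, p in zip(true_row, pred_row):
            if p == 1 and t == 1:
                tp += 1      # correctly predicted label
            elif p == 1 and t == 0:
                fp += 1      # spurious label
            elif p == 0 and t == 1:
                fn += 1      # missed label
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0.0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Perfect predictions give F1 = 1.0, as in the final epochs above.
y_true = [[1, 0, 1], [0, 1, 0]]
assert micro_f1(y_true, y_true) == 1.0
```

This also explains the early epochs: when the model predicts no labels at all, tp = 0, so precision, recall, and F1 are all 0.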
LyaaaaaGames/gpt4all-j-halved
LyaaaaaGames
2023-09-22T09:05:45Z
12
0
transformers
[ "transformers", "pytorch", "gptj", "text-generation", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-generation
2023-09-22T08:49:19Z
--- license: apache-2.0 --- This model is a cut-in-half and sharded version of the original https://huggingface.co/nomic-ai/gpt4all-j.
MattStammers/appo-atari-alien
MattStammers
2023-09-22T09:02:36Z
0
0
sample-factory
[ "sample-factory", "tensorboard", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
reinforcement-learning
2023-09-22T09:02:29Z
--- library_name: sample-factory tags: - deep-reinforcement-learning - reinforcement-learning - sample-factory model-index: - name: APPO results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: atari_alien type: atari_alien metrics: - type: mean_reward value: 1123.00 +/- 251.48 name: mean_reward verified: false --- An **APPO** model trained on the **atari_alien** environment. This model was trained using Sample-Factory 2.0: https://github.com/alex-petrenko/sample-factory. Documentation for how to use Sample-Factory can be found at https://www.samplefactory.dev/ ## Downloading the model After installing Sample-Factory, download the model with: ``` python -m sample_factory.huggingface.load_from_hub -r MattStammers/appo-atari-alien ``` ## Using the model To run the model after download, use the `enjoy` script corresponding to this environment: ``` python -m sf_examples.atari.enjoy_atari --algo=APPO --env=atari_alien --train_dir=./train_dir --experiment=appo-atari-alien ``` You can also upload models to the Hugging Face Hub using the same script with the `--push_to_hub` flag. See https://www.samplefactory.dev/10-huggingface/huggingface/ for more details. ## Training with this model To continue training with this model, use the `train` script corresponding to this environment: ``` python -m sf_examples.atari.train_atari --algo=APPO --env=atari_alien --train_dir=./train_dir --experiment=appo-atari-alien --restart_behavior=resume --train_for_env_steps=10000000000 ``` Note: you may have to adjust `--train_for_env_steps` to a suitably high number, as the experiment will resume at the number of steps it concluded at.
tomasito12/Reinforce-1
tomasito12
2023-09-22T09:00:14Z
0
0
null
[ "CartPole-v1", "reinforce", "reinforcement-learning", "custom-implementation", "deep-rl-class", "model-index", "region:us" ]
reinforcement-learning
2023-09-06T09:25:34Z
--- tags: - CartPole-v1 - reinforce - reinforcement-learning - custom-implementation - deep-rl-class model-index: - name: reinforce-1 results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: CartPole-v1 type: CartPole-v1 metrics: - type: mean_reward value: 500.00 +/- 0.00 name: mean_reward verified: false --- # **Reinforce** Agent playing **CartPole-v1** This is a trained model of a **Reinforce** agent playing **CartPole-v1** . To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
Johnkimmyshinmmy/e
Johnkimmyshinmmy
2023-09-22T08:57:25Z
1
0
diffusers
[ "diffusers", "tensorboard", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "lora", "base_model:runwayml/stable-diffusion-v1-5", "base_model:adapter:runwayml/stable-diffusion-v1-5", "license:creativeml-openrail-m", "region:us" ]
text-to-image
2023-09-22T08:49:17Z
--- license: creativeml-openrail-m base_model: runwayml/stable-diffusion-v1-5 instance_prompt: a photo of aszx interior tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers - lora inference: true --- # LoRA DreamBooth - Johnkimmyshinmmy/e These are LoRA adaption weights for runwayml/stable-diffusion-v1-5. The weights were trained on a photo of aszx interior using [DreamBooth](https://dreambooth.github.io/). You can find some example images in the following. ![img_0](./image_0.png) ![img_1](./image_1.png) ![img_2](./image_2.png) ![img_3](./image_3.png) LoRA for the text encoder was enabled: False.
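LoRA adaption weights like these add a low-rank update to the frozen base weights: the effective weight is W + (alpha/r) * B @ A, where A and B are the small trained matrices of rank r. A minimal numeric sketch of that composition (illustrative only — not how diffusers merges the weights internally) could be:

```python
def lora_merge(W, A, B, alpha, r):
    """Return W + (alpha / r) * (B @ A) for plain nested-list matrices.

    W: (out, in) frozen base weight; A: (r, in); B: (out, r).
    """
    scale = alpha / r
    out_dim, in_dim = len(W), len(W[0])
    merged = [row[:] for row in W]  # copy so W stays frozen
    for i in range(out_dim):
        for j in range(in_dim):
            delta = sum(B[i][k] * A[k][j] for k in range(r))
            merged[i][j] += scale * delta
    return merged

# 2x2 base weight with a rank-1 update.
W = [[1.0, 0.0], [0.0, 1.0]]
A = [[1.0, 1.0]]           # (r=1, in=2)
B = [[0.5], [0.5]]         # (out=2, r=1)
merged = lora_merge(W, A, B, alpha=1.0, r=1)
# merged == [[1.5, 0.5], [0.5, 1.5]]
```

Because only A and B are trained, the adapter file stays tiny compared to the base model, which is why LoRA DreamBooth checkpoints like this one are so small.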
CyberHarem/shiipon_akibameidosensou
CyberHarem
2023-09-22T08:52:21Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/shiipon_akibameidosensou", "license:mit", "region:us" ]
text-to-image
2023-09-22T08:39:14Z
--- license: mit datasets: - CyberHarem/shiipon_akibameidosensou pipeline_tag: text-to-image tags: - art --- # Lora of shiipon_akibameidosensou This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion), and the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs). The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11). After downloading the pt and safetensors files for the specified step, you need to use them simultaneously: the pt file is used as an embedding, while the safetensors file is loaded for Lora. For example, if you want to use the model from step 3740, download `3740/shiipon_akibameidosensou.pt` as the embedding and `3740/shiipon_akibameidosensou.safetensors` for loading Lora. By using both files together, you can generate images of the desired character. **The recommended step is 3740**, with a score of 0.991. The trigger words are: 1. `shiipon_akibameidosensou` 2. `blonde_hair, long_hair, brown_eyes, maid_headdress, maid, mole_under_eye, mole, bow, animal_ears, pink_bow, apron, ponytail, maid_apron, blush, fake_animal_ears, pig_ears` With our regrets, this model is not recommended for the following groups: 1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail. 2. Individuals facing application scenarios that demand high accuracy in recreating character outfits. 3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm. 4. 
Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters. 5. Individuals who finds the generated image content offensive to their values. These are available steps: | Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata | |:---------|:----------|:--------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------| | 5100 | 0.988 | [Download](5100/shiipon_akibameidosensou.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![pattern_2-5100](5100/previews/pattern_2.png) | ![pattern_3-5100](5100/previews/pattern_3.png) | ![pattern_4-5100](5100/previews/pattern_4.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) | | 4760 | 0.984 | [Download](4760/shiipon_akibameidosensou.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | ![pattern_2-4760](4760/previews/pattern_2.png) | 
![pattern_3-4760](4760/previews/pattern_3.png) | ![pattern_4-4760](4760/previews/pattern_4.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) | | 4420 | 0.986 | [Download](4420/shiipon_akibameidosensou.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![pattern_2-4420](4420/previews/pattern_2.png) | ![pattern_3-4420](4420/previews/pattern_3.png) | ![pattern_4-4420](4420/previews/pattern_4.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) | | 4080 | 0.988 | [Download](4080/shiipon_akibameidosensou.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![pattern_2-4080](4080/previews/pattern_2.png) | ![pattern_3-4080](4080/previews/pattern_3.png) | ![pattern_4-4080](4080/previews/pattern_4.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) | | **3740** | **0.991** | [**Download**](3740/shiipon_akibameidosensou.zip) | ![pattern_1-3740](3740/previews/pattern_1.png) | ![pattern_2-3740](3740/previews/pattern_2.png) | 
![pattern_3-3740](3740/previews/pattern_3.png) | ![pattern_4-3740](3740/previews/pattern_4.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) | | 3400 | 0.915 | [Download](3400/shiipon_akibameidosensou.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![pattern_2-3400](3400/previews/pattern_2.png) | ![pattern_3-3400](3400/previews/pattern_3.png) | ![pattern_4-3400](3400/previews/pattern_4.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) | | 3060 | 0.986 | [Download](3060/shiipon_akibameidosensou.zip) | ![pattern_1-3060](3060/previews/pattern_1.png) | ![pattern_2-3060](3060/previews/pattern_2.png) | ![pattern_3-3060](3060/previews/pattern_3.png) | ![pattern_4-3060](3060/previews/pattern_4.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) | | 2720 | 0.988 | [Download](2720/shiipon_akibameidosensou.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![pattern_2-2720](2720/previews/pattern_2.png) | 
![pattern_3-2720](2720/previews/pattern_3.png) | ![pattern_4-2720](2720/previews/pattern_4.png) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) | | 2380 | 0.986 | [Download](2380/shiipon_akibameidosensou.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | ![pattern_2-2380](2380/previews/pattern_2.png) | ![pattern_3-2380](2380/previews/pattern_3.png) | ![pattern_4-2380](2380/previews/pattern_4.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) | | 2040 | 0.988 | [Download](2040/shiipon_akibameidosensou.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![pattern_2-2040](2040/previews/pattern_2.png) | ![pattern_3-2040](2040/previews/pattern_3.png) | ![pattern_4-2040](2040/previews/pattern_4.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) | | 1700 | 0.982 | [Download](1700/shiipon_akibameidosensou.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![pattern_2-1700](1700/previews/pattern_2.png) | 
![pattern_3-1700](1700/previews/pattern_3.png) | ![pattern_4-1700](1700/previews/pattern_4.png) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) | | 1360 | 0.969 | [Download](1360/shiipon_akibameidosensou.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![pattern_2-1360](1360/previews/pattern_2.png) | ![pattern_3-1360](1360/previews/pattern_3.png) | ![pattern_4-1360](1360/previews/pattern_4.png) | ![bikini-1360](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) | | 1020 | 0.888 | [Download](1020/shiipon_akibameidosensou.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![pattern_2-1020](1020/previews/pattern_2.png) | ![pattern_3-1020](1020/previews/pattern_3.png) | ![pattern_4-1020](1020/previews/pattern_4.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) | | 680 | 0.928 | [Download](680/shiipon_akibameidosensou.zip) | ![pattern_1-680](680/previews/pattern_1.png) | ![pattern_2-680](680/previews/pattern_2.png) | 
![pattern_3-680](680/previews/pattern_3.png) | ![pattern_4-680](680/previews/pattern_4.png) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) | | 340 | 0.533 | [Download](340/shiipon_akibameidosensou.zip) | ![pattern_1-340](340/previews/pattern_1.png) | ![pattern_2-340](340/previews/pattern_2.png) | ![pattern_3-340](340/previews/pattern_3.png) | ![pattern_4-340](340/previews/pattern_4.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
lotfyhussein/btc-tweet-sentiment2
lotfyhussein
2023-09-22T08:46:24Z
1
0
peft
[ "peft", "region:us" ]
null
2023-09-22T08:45:38Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: True - load_in_4bit: False - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: fp4 - bnb_4bit_use_double_quant: False - bnb_4bit_compute_dtype: float32 ### Framework versions - PEFT 0.6.0.dev0
tomaarsen/span-marker-roberta-large-ontonotes5
tomaarsen
2023-09-22T08:45:26Z
324
12
span-marker
[ "span-marker", "pytorch", "safetensors", "token-classification", "ner", "named-entity-recognition", "en", "dataset:tner/ontonotes5", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
token-classification
2023-06-10T15:28:36Z
--- license: apache-2.0 library_name: span-marker tags: - span-marker - token-classification - ner - named-entity-recognition pipeline_tag: token-classification widget: - text: >- Amelia Earhart flew her single engine Lockheed Vega 5B across the Atlantic to Paris. example_title: Amelia Earhart - text: >- Leonardo di ser Piero da Vinci painted the Mona Lisa based on Italian noblewoman Lisa del Giocondo. example_title: Leonardo da Vinci - text: >- On June 13th, 2014, at 4:44 pm during the 2014 World Cup held in Salvador, Brazil, the legendary soccer player, Robin van Persie, representing the Dutch national team, scored a remarkable goal in the 44th minute. example_title: Robin van Persie model-index: - name: >- SpanMarker w. roberta-large on OntoNotes v5.0 by Tom Aarsen results: - task: type: token-classification name: Named Entity Recognition dataset: type: tner/ontonotes5 name: OntoNotes v5.0 split: test revision: cf9ef57ad260810be1298ba795d83c09a915e959 metrics: - type: f1 value: 0.9153 name: F1 - type: precision value: 0.9116 name: Precision - type: recall value: 0.9191 name: Recall datasets: - tner/ontonotes5 language: - en metrics: - f1 - recall - precision --- # SpanMarker for Named Entity Recognition This is a [SpanMarker](https://github.com/tomaarsen/SpanMarkerNER) model that can be used for Named Entity Recognition. In particular, this SpanMarker model uses [roberta-large](https://huggingface.co/roberta-large) as the underlying encoder. See [train.py](train.py) for the training script. 
## Usage To use this model for inference, first install the `span_marker` library: ```bash pip install span_marker ``` You can then run inference with this model like so: ```python from span_marker import SpanMarkerModel # Download from the 🤗 Hub model = SpanMarkerModel.from_pretrained("tomaarsen/span-marker-roberta-large-ontonotes5") # Run inference entities = model.predict("Amelia Earhart flew her single engine Lockheed Vega 5B across the Atlantic to Paris.") ``` ### Limitations **Warning**: This model works best when punctuation is separated from the prior words, so ```python # ✅ model.predict("He plays J. Robert Oppenheimer , an American theoretical physicist .") # ❌ model.predict("He plays J. Robert Oppenheimer, an American theoretical physicist.") # You can also supply a list of words directly: ✅ model.predict(["He", "plays", "J.", "Robert", "Oppenheimer", ",", "an", "American", "theoretical", "physicist", "."]) ``` The same may be beneficial for some languages, such as splitting `"l'ocean Atlantique"` into `"l' ocean Atlantique"`. See the [SpanMarker](https://github.com/tomaarsen/SpanMarkerNER) repository for documentation and additional information on this library.
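To illustrate the punctuation caveat above, a naive pre-splitter that separates punctuation from the preceding word might look like this. This is a sketch only: it also splits abbreviations such as `J.` into `J` and `.`, which the examples above keep attached, so real inputs need a proper word-level tokenizer.

```python
import re

def split_words(text: str) -> list[str]:
    # Separate punctuation marks from the words they follow,
    # e.g. "physicist." -> ["physicist", "."]
    return re.findall(r"\w+|[^\w\s]", text)

words = split_words("He plays Robert Oppenheimer, an American theoretical physicist.")
# `words` can then be passed to model.predict(words) as a word list.
```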
ICBU-NPU/FashionGPT-70B-V1.1
ICBU-NPU
2023-09-22T08:32:59Z
1,468
43
transformers
[ "transformers", "pytorch", "llama", "text-generation", "en", "dataset:Open-Orca/OpenOrca", "dataset:openchat/openchat_sharegpt4_dataset", "dataset:LDJnr/Puffin", "dataset:ehartford/samantha-data", "dataset:OpenAssistant/oasst1", "dataset:jondurbin/airoboros-gpt4-1.4.1", "arxiv:2306.02707", "arxiv:2305.14314", "license:llama2", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-17T08:53:56Z
--- language: - en library_name: transformers license: llama2 datasets: - Open-Orca/OpenOrca - openchat/openchat_sharegpt4_dataset - LDJnr/Puffin - ehartford/samantha-data - OpenAssistant/oasst1 - jondurbin/airoboros-gpt4-1.4.1 --- # FashionGPT-V1.1 ### Introduction This is a llama-2-70B model combined with multiple adapters via appropriate methods. <br> ### Dataset Here is the list of datasets used: * Orca-style 40K dataset. This dataset is a filtered subset of [OpenOrca-GPT4](<https://huggingface.co/datasets/Open-Orca/OpenOrca/blob/main/1M-GPT4-Augmented.parquet>) and [airoboros-gpt4-1.4.1](<https://huggingface.co/datasets/jondurbin/airoboros-gpt4-1.4.1>). * [Samantha](<https://huggingface.co/datasets/ehartford/samantha-data>) made by Eric Hartford and cleaned by us, about 6.5K samples. * [oasst1](<https://huggingface.co/datasets/OpenAssistant/oasst1>) cleaned by us, containing about 80K samples. * Misconception data generated using the misconception data generator in [airoboros_repo](<https://github.com/jondurbin/airoboros>), about 0.5K samples. * GPT-4 Multi-turn Conversations. This dataset is a filtered mixture of [openchat sharegpt4](https://huggingface.co/datasets/openchat/openchat_sharegpt4_dataset/) and [Puffin](<https://huggingface.co/datasets/LDJnr/Puffin>), containing about 8K samples. <br> ### Training * We train our adapters with [jondurbin's forked QLoRA repo](<https://github.com/jondurbin/qlora>). * We add multi-turn conversational data support from [fastchat](<https://github.com/lm-sys/FastChat/blob/main/fastchat/train/train.py>), with minor modifications. * We use a bash shell script similar to [airoboros-70b-gpt4-1.4.1](<https://gist.github.com/jondurbin/87fc040b92a3073125ed516b04bc6e19>) to train our adapters. * We combine multiple adapters into llama-2-70B with a more novel strategy than our [v1 model](<https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1>). The details of combining multiple adapters will be unveiled in our upcoming paper. 
<br> ### Prompt Template ``` ### System: {System} ### User: {User} ### Assistant: {Assistant} ``` <br> ### Evaluation | Metric | Value | |-----------------------|-------| | ARC (25-shot) | 71.76 | | HellaSwag (10-shot) | 88.20 | | MMLU (5-shot) | 70.99 | | TruthfulQA (0-shot) | 65.26 | | Avg. | 74.05 | <br> ### License Disclaimer This model is bound by the license & usage restrictions of the original Llama-2 model, and comes with no warranty or guarantees of any kind. <br> ### Limitations & Biases Llama 2 and fine-tuned variants are a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 2 and any fine-tuned variant's potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased, or otherwise objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2 variants, developers should perform safety testing and tuning tailored to their specific applications of the model. 
Please see the Responsible Use Guide available at <https://ai.meta.com/llama/responsible-use-guide/> <br> ### Citation: * airoboros: <https://github.com/jondurbin/airoboros> * samantha: <https://erichartford.com/meet-samantha> ```bibtex @misc{mukherjee2023orca, title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4}, author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah}, year={2023}, eprint={2306.02707}, archivePrefix={arXiv}, primaryClass={cs.CL} } ``` ```bibtex @article{dettmers2023qlora, title={QLoRA: Efficient Finetuning of Quantized LLMs}, author={Dettmers, Tim and Pagnoni, Artidoro and Holtzman, Ari and Zettlemoyer, Luke}, journal={arXiv preprint arXiv:2305.14314}, year={2023} } ``` ```bibtex @software{touvron2023llama2, title={Llama 2: Open Foundation and Fine-Tuned Chat Models}, author={Hugo Touvron, Louis Martin, Kevin Stone, Peter Albert, Amjad Almahairi, Yasmine Babaei, Nikolay Bashlykov, Soumya Batra, Prajjwal Bhargava, Shruti Bhosale, Dan Bikel, Lukas Blecher, Cristian Canton Ferrer, Moya Chen, Guillem Cucurull, David Esiobu, Jude Fernandes, Jeremy Fu, Wenyin Fu, Brian Fuller, Cynthia Gao, Vedanuj Goswami, Naman Goyal, Anthony Hartshorn, Saghar Hosseini, Rui Hou, Hakan Inan, Marcin Kardas, Viktor Kerkez, Madian Khabsa, Isabel Kloumann, Artem Korenev, Punit Singh Koura, Marie-Anne Lachaux, Thibaut Lavril, Jenya Lee, Diana Liskovich, Yinghai Lu, Yuning Mao, Xavier Martinet, Todor Mihaylov, Pushkar Mishra, Igor Molybog, Yixin Nie, Andrew Poulton, Jeremy Reizenstein, Rashi Rungta, Kalyan Saladi, Alan Schelten, Ruan Silva, Eric Michael Smith, Ranjan Subramanian, Xiaoqing Ellen Tan, Binh Tang, Ross Taylor, Adina Williams, Jian Xiang Kuan, Puxin Xu, Zheng Yan, Iliyan Zarov, Yuchen Zhang, Angela Fan, Melanie Kambadur, Sharan Narang, Aurelien Rodriguez, Robert Stojnic, Sergey Edunov, Thomas Scialom}, year={2023} } ``` ```bibtex @software{openchat, title = {{OpenChat: 
Advancing Open-source Language Models with Imperfect Data}}, author = {Wang, Guan and Cheng, Sijie and Yu, Qiying and Liu, Changling}, doi = {10.5281/zenodo.8105775}, url = {https://github.com/imoneoi/openchat}, version = {pre-release}, year = {2023}, month = {7}, } ```
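The `### System / ### User / ### Assistant` prompt template shown above can be rendered with a small helper; this is a sketch of one plausible layout, since the card shows the template only schematically and the exact newline spacing is an assumption:

```python
def build_prompt(system: str, user: str) -> str:
    # Fills the card's template, leaving the Assistant section open
    # for the model to complete.
    return (
        f"### System:\n{system}\n\n"
        f"### User:\n{user}\n\n"
        "### Assistant:\n"
    )

prompt = build_prompt("You are a helpful assistant.",
                      "Summarize QLoRA in one sentence.")
```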
teachyourselfcoding/llama-2-13b-22sep
teachyourselfcoding
2023-09-22T08:15:13Z
0
0
null
[ "region:us" ]
null
2023-09-22T08:11:09Z
## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: True - load_in_4bit: False - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: fp4 - bnb_4bit_use_double_quant: False - bnb_4bit_compute_dtype: float32 ### Framework versions - PEFT 0.6.0.dev0
jerichosiahaya/vits-tts-id
jerichosiahaya
2023-09-22T08:02:52Z
153
7
transformers
[ "transformers", "text-generation-inference", "text-to-speech", "id", "license:mpl-2.0", "endpoints_compatible", "region:us" ]
text-to-speech
2023-09-11T06:56:33Z
--- license: mpl-2.0 language: - id library_name: transformers tags: - text-generation-inference - text-to-speech --- This is an Indonesian TTS model trained using the VITS architecture on LJSpeech datasets.
Niraya666/swin-tiny-patch4-window7-224-finetuned-ADC-3cls-0922
Niraya666
2023-09-22T07:55:08Z
213
0
transformers
[ "transformers", "pytorch", "swin", "image-classification", "generated_from_trainer", "dataset:imagefolder", "base_model:microsoft/swin-tiny-patch4-window7-224", "base_model:finetune:microsoft/swin-tiny-patch4-window7-224", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2023-09-22T07:37:09Z
--- license: apache-2.0 base_model: microsoft/swin-tiny-patch4-window7-224 tags: - generated_from_trainer datasets: - imagefolder metrics: - accuracy model-index: - name: swin-tiny-patch4-window7-224-finetuned-ADC-3cls-0922 results: - task: name: Image Classification type: image-classification dataset: name: imagefolder type: imagefolder config: default split: test args: default metrics: - name: Accuracy type: accuracy value: 0.8285714285714286 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-ADC-3cls-0922 This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.6771 - Accuracy: 0.8286 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.2 - num_epochs: 200 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 2 | 0.6875 | 0.8143 | | No log | 2.0 | 4 | 0.6874 | 0.8143 | | No log | 3.0 | 6 | 0.6873 | 0.8143 | | No log | 4.0 | 8 | 0.6871 | 0.8143 | | 0.7555 | 5.0 | 10 | 0.6869 | 0.8143 | | 0.7555 | 6.0 | 12 | 0.6866 | 0.8143 | | 0.7555 | 7.0 | 14 | 0.6862 | 0.8143 | | 0.7555 | 8.0 | 16 | 0.6858 | 0.8143 | | 0.7555 | 9.0 | 18 | 0.6853 | 0.8143 
| | 0.7576 | 10.0 | 20 | 0.6848 | 0.8143 | | 0.7576 | 11.0 | 22 | 0.6842 | 0.8143 | | 0.7576 | 12.0 | 24 | 0.6836 | 0.8143 | | 0.7576 | 13.0 | 26 | 0.6830 | 0.8143 | | 0.7576 | 14.0 | 28 | 0.6823 | 0.8143 | | 0.769 | 15.0 | 30 | 0.6816 | 0.8 | | 0.769 | 16.0 | 32 | 0.6808 | 0.8 | | 0.769 | 17.0 | 34 | 0.6800 | 0.8143 | | 0.769 | 18.0 | 36 | 0.6791 | 0.8143 | | 0.769 | 19.0 | 38 | 0.6781 | 0.8143 | | 0.7564 | 20.0 | 40 | 0.6771 | 0.8286 | | 0.7564 | 21.0 | 42 | 0.6760 | 0.8143 | | 0.7564 | 22.0 | 44 | 0.6748 | 0.8143 | | 0.7564 | 23.0 | 46 | 0.6737 | 0.8 | | 0.7564 | 24.0 | 48 | 0.6725 | 0.8 | | 0.7508 | 25.0 | 50 | 0.6713 | 0.8143 | | 0.7508 | 26.0 | 52 | 0.6701 | 0.8143 | | 0.7508 | 27.0 | 54 | 0.6689 | 0.8143 | | 0.7508 | 28.0 | 56 | 0.6674 | 0.8143 | | 0.7508 | 29.0 | 58 | 0.6660 | 0.8143 | | 0.747 | 30.0 | 60 | 0.6646 | 0.8143 | | 0.747 | 31.0 | 62 | 0.6631 | 0.8143 | | 0.747 | 32.0 | 64 | 0.6616 | 0.8143 | | 0.747 | 33.0 | 66 | 0.6601 | 0.8143 | | 0.747 | 34.0 | 68 | 0.6586 | 0.8143 | | 0.7343 | 35.0 | 70 | 0.6570 | 0.8143 | | 0.7343 | 36.0 | 72 | 0.6553 | 0.8143 | | 0.7343 | 37.0 | 74 | 0.6536 | 0.8143 | | 0.7343 | 38.0 | 76 | 0.6517 | 0.8143 | | 0.7343 | 39.0 | 78 | 0.6499 | 0.8143 | | 0.7532 | 40.0 | 80 | 0.6480 | 0.8143 | | 0.7532 | 41.0 | 82 | 0.6461 | 0.8143 | | 0.7532 | 42.0 | 84 | 0.6442 | 0.8143 | | 0.7532 | 43.0 | 86 | 0.6423 | 0.8143 | | 0.7532 | 44.0 | 88 | 0.6405 | 0.8143 | | 0.7239 | 45.0 | 90 | 0.6387 | 0.8143 | | 0.7239 | 46.0 | 92 | 0.6368 | 0.8143 | | 0.7239 | 47.0 | 94 | 0.6352 | 0.8143 | | 0.7239 | 48.0 | 96 | 0.6337 | 0.8143 | | 0.7239 | 49.0 | 98 | 0.6321 | 0.8286 | | 0.7085 | 50.0 | 100 | 0.6307 | 0.8286 | | 0.7085 | 51.0 | 102 | 0.6294 | 0.8286 | | 0.7085 | 52.0 | 104 | 0.6278 | 0.8286 | | 0.7085 | 53.0 | 106 | 0.6263 | 0.8286 | | 0.7085 | 54.0 | 108 | 0.6248 | 0.8143 | | 0.7203 | 55.0 | 110 | 0.6233 | 0.8143 | | 0.7203 | 56.0 | 112 | 0.6219 | 0.8143 | | 0.7203 | 57.0 | 114 | 0.6205 | 0.8143 | | 0.7203 | 58.0 | 116 | 0.6191 | 0.8143 | | 
0.7203 | 59.0 | 118 | 0.6179 | 0.8143 | | 0.7136 | 60.0 | 120 | 0.6167 | 0.8143 | | 0.7136 | 61.0 | 122 | 0.6157 | 0.8143 | | 0.7136 | 62.0 | 124 | 0.6148 | 0.8 | | 0.7136 | 63.0 | 126 | 0.6138 | 0.8 | | 0.7136 | 64.0 | 128 | 0.6125 | 0.8 | | 0.7123 | 65.0 | 130 | 0.6111 | 0.8 | | 0.7123 | 66.0 | 132 | 0.6096 | 0.8143 | | 0.7123 | 67.0 | 134 | 0.6083 | 0.8143 | | 0.7123 | 68.0 | 136 | 0.6070 | 0.8143 | | 0.7123 | 69.0 | 138 | 0.6057 | 0.8143 | | 0.7076 | 70.0 | 140 | 0.6046 | 0.8143 | | 0.7076 | 71.0 | 142 | 0.6035 | 0.8143 | | 0.7076 | 72.0 | 144 | 0.6023 | 0.8143 | | 0.7076 | 73.0 | 146 | 0.6011 | 0.8143 | | 0.7076 | 74.0 | 148 | 0.5999 | 0.8143 | | 0.6878 | 75.0 | 150 | 0.5988 | 0.8143 | | 0.6878 | 76.0 | 152 | 0.5975 | 0.8143 | | 0.6878 | 77.0 | 154 | 0.5964 | 0.8143 | | 0.6878 | 78.0 | 156 | 0.5953 | 0.8143 | | 0.6878 | 79.0 | 158 | 0.5942 | 0.8143 | | 0.6657 | 80.0 | 160 | 0.5932 | 0.8143 | | 0.6657 | 81.0 | 162 | 0.5923 | 0.8143 | | 0.6657 | 82.0 | 164 | 0.5914 | 0.8143 | | 0.6657 | 83.0 | 166 | 0.5906 | 0.8143 | | 0.6657 | 84.0 | 168 | 0.5897 | 0.8143 | | 0.6434 | 85.0 | 170 | 0.5888 | 0.8143 | | 0.6434 | 86.0 | 172 | 0.5878 | 0.8143 | | 0.6434 | 87.0 | 174 | 0.5868 | 0.8143 | | 0.6434 | 88.0 | 176 | 0.5859 | 0.8143 | | 0.6434 | 89.0 | 178 | 0.5851 | 0.8143 | | 0.6825 | 90.0 | 180 | 0.5843 | 0.8143 | | 0.6825 | 91.0 | 182 | 0.5836 | 0.8143 | | 0.6825 | 92.0 | 184 | 0.5828 | 0.8143 | | 0.6825 | 93.0 | 186 | 0.5823 | 0.8143 | | 0.6825 | 94.0 | 188 | 0.5817 | 0.8286 | | 0.6695 | 95.0 | 190 | 0.5809 | 0.8143 | | 0.6695 | 96.0 | 192 | 0.5801 | 0.8143 | | 0.6695 | 97.0 | 194 | 0.5793 | 0.8143 | | 0.6695 | 98.0 | 196 | 0.5787 | 0.8143 | | 0.6695 | 99.0 | 198 | 0.5780 | 0.8143 | | 0.6672 | 100.0 | 200 | 0.5772 | 0.8143 | | 0.6672 | 101.0 | 202 | 0.5762 | 0.8143 | | 0.6672 | 102.0 | 204 | 0.5754 | 0.8143 | | 0.6672 | 103.0 | 206 | 0.5746 | 0.8143 | | 0.6672 | 104.0 | 208 | 0.5738 | 0.8143 | | 0.6569 | 105.0 | 210 | 0.5731 | 0.8143 | | 0.6569 | 106.0 | 212 | 0.5724 | 
0.8143 | | 0.6569 | 107.0 | 214 | 0.5716 | 0.8143 | | 0.6569 | 108.0 | 216 | 0.5708 | 0.8143 | | 0.6569 | 109.0 | 218 | 0.5701 | 0.8143 | | 0.6748 | 110.0 | 220 | 0.5694 | 0.8143 | | 0.6748 | 111.0 | 222 | 0.5687 | 0.8143 | | 0.6748 | 112.0 | 224 | 0.5680 | 0.8143 | | 0.6748 | 113.0 | 226 | 0.5674 | 0.8143 | | 0.6748 | 114.0 | 228 | 0.5668 | 0.8143 | | 0.6388 | 115.0 | 230 | 0.5662 | 0.8143 | | 0.6388 | 116.0 | 232 | 0.5657 | 0.8143 | | 0.6388 | 117.0 | 234 | 0.5652 | 0.8143 | | 0.6388 | 118.0 | 236 | 0.5648 | 0.8286 | | 0.6388 | 119.0 | 238 | 0.5645 | 0.8286 | | 0.6551 | 120.0 | 240 | 0.5641 | 0.8286 | | 0.6551 | 121.0 | 242 | 0.5636 | 0.8143 | | 0.6551 | 122.0 | 244 | 0.5631 | 0.8143 | | 0.6551 | 123.0 | 246 | 0.5627 | 0.8143 | | 0.6551 | 124.0 | 248 | 0.5624 | 0.8143 | | 0.6452 | 125.0 | 250 | 0.5622 | 0.8143 | | 0.6452 | 126.0 | 252 | 0.5620 | 0.8143 | | 0.6452 | 127.0 | 254 | 0.5618 | 0.8143 | | 0.6452 | 128.0 | 256 | 0.5615 | 0.8143 | | 0.6452 | 129.0 | 258 | 0.5613 | 0.8143 | | 0.645 | 130.0 | 260 | 0.5611 | 0.8143 | | 0.645 | 131.0 | 262 | 0.5608 | 0.8143 | | 0.645 | 132.0 | 264 | 0.5606 | 0.8143 | | 0.645 | 133.0 | 266 | 0.5602 | 0.8143 | | 0.645 | 134.0 | 268 | 0.5596 | 0.8143 | | 0.629 | 135.0 | 270 | 0.5590 | 0.8143 | | 0.629 | 136.0 | 272 | 0.5582 | 0.8143 | | 0.629 | 137.0 | 274 | 0.5576 | 0.8143 | | 0.629 | 138.0 | 276 | 0.5571 | 0.8143 | | 0.629 | 139.0 | 278 | 0.5568 | 0.8143 | | 0.7126 | 140.0 | 280 | 0.5565 | 0.8143 | | 0.7126 | 141.0 | 282 | 0.5563 | 0.8143 | | 0.7126 | 142.0 | 284 | 0.5561 | 0.8143 | | 0.7126 | 143.0 | 286 | 0.5559 | 0.8143 | | 0.7126 | 144.0 | 288 | 0.5555 | 0.8143 | | 0.669 | 145.0 | 290 | 0.5552 | 0.8143 | | 0.669 | 146.0 | 292 | 0.5547 | 0.8143 | | 0.669 | 147.0 | 294 | 0.5542 | 0.8143 | | 0.669 | 148.0 | 296 | 0.5538 | 0.8143 | | 0.669 | 149.0 | 298 | 0.5534 | 0.8143 | | 0.6481 | 150.0 | 300 | 0.5530 | 0.8143 | | 0.6481 | 151.0 | 302 | 0.5526 | 0.8143 | | 0.6481 | 152.0 | 304 | 0.5522 | 0.8143 | | 0.6481 | 153.0 | 306 | 
0.5519 | 0.8143 | | 0.6481 | 154.0 | 308 | 0.5515 | 0.8143 | | 0.6211 | 155.0 | 310 | 0.5510 | 0.8143 | | 0.6211 | 156.0 | 312 | 0.5506 | 0.8143 | | 0.6211 | 157.0 | 314 | 0.5502 | 0.8143 | | 0.6211 | 158.0 | 316 | 0.5499 | 0.8143 | | 0.6211 | 159.0 | 318 | 0.5496 | 0.8143 | | 0.6458 | 160.0 | 320 | 0.5492 | 0.8286 | | 0.6458 | 161.0 | 322 | 0.5490 | 0.8143 | | 0.6458 | 162.0 | 324 | 0.5488 | 0.8143 | | 0.6458 | 163.0 | 326 | 0.5486 | 0.8143 | | 0.6458 | 164.0 | 328 | 0.5484 | 0.8143 | | 0.6317 | 165.0 | 330 | 0.5481 | 0.8143 | | 0.6317 | 166.0 | 332 | 0.5479 | 0.8286 | | 0.6317 | 167.0 | 334 | 0.5476 | 0.8286 | | 0.6317 | 168.0 | 336 | 0.5473 | 0.8286 | | 0.6317 | 169.0 | 338 | 0.5471 | 0.8286 | | 0.6154 | 170.0 | 340 | 0.5470 | 0.8286 | | 0.6154 | 171.0 | 342 | 0.5468 | 0.8286 | | 0.6154 | 172.0 | 344 | 0.5466 | 0.8286 | | 0.6154 | 173.0 | 346 | 0.5464 | 0.8286 | | 0.6154 | 174.0 | 348 | 0.5462 | 0.8286 | | 0.6323 | 175.0 | 350 | 0.5460 | 0.8286 | | 0.6323 | 176.0 | 352 | 0.5459 | 0.8286 | | 0.6323 | 177.0 | 354 | 0.5457 | 0.8286 | | 0.6323 | 178.0 | 356 | 0.5456 | 0.8286 | | 0.6323 | 179.0 | 358 | 0.5455 | 0.8286 | | 0.6331 | 180.0 | 360 | 0.5453 | 0.8286 | | 0.6331 | 181.0 | 362 | 0.5452 | 0.8286 | | 0.6331 | 182.0 | 364 | 0.5451 | 0.8286 | | 0.6331 | 183.0 | 366 | 0.5449 | 0.8286 | | 0.6331 | 184.0 | 368 | 0.5448 | 0.8286 | | 0.6333 | 185.0 | 370 | 0.5447 | 0.8286 | | 0.6333 | 186.0 | 372 | 0.5447 | 0.8286 | | 0.6333 | 187.0 | 374 | 0.5446 | 0.8286 | | 0.6333 | 188.0 | 376 | 0.5445 | 0.8286 | | 0.6333 | 189.0 | 378 | 0.5445 | 0.8286 | | 0.608 | 190.0 | 380 | 0.5444 | 0.8286 | | 0.608 | 191.0 | 382 | 0.5444 | 0.8286 | | 0.608 | 192.0 | 384 | 0.5443 | 0.8286 | | 0.608 | 193.0 | 386 | 0.5443 | 0.8286 | | 0.608 | 194.0 | 388 | 0.5442 | 0.8286 | | 0.6155 | 195.0 | 390 | 0.5442 | 0.8286 | | 0.6155 | 196.0 | 392 | 0.5442 | 0.8286 | | 0.6155 | 197.0 | 394 | 0.5442 | 0.8286 | | 0.6155 | 198.0 | 396 | 0.5441 | 0.8286 | | 0.6155 | 199.0 | 398 | 0.5441 | 0.8286 | | 0.6272 
| 200.0 | 400 | 0.5441 | 0.8286 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
qmeeus/whisper-small-ner-combined
qmeeus
2023-09-22T07:49:06Z
49
0
transformers
[ "transformers", "pytorch", "tensorboard", "whisper_for_slu", "token-classification", "whisper-event", "generated_from_trainer", "dataset:qmeeus/slue-voxpopuli", "base_model:openai/whisper-small", "base_model:finetune:openai/whisper-small", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2023-01-12T15:14:08Z
--- license: apache-2.0 tags: - whisper-event - generated_from_trainer datasets: - qmeeus/slue-voxpopuli metrics: - wer base_model: openai/whisper-small model-index: - name: WhisperForNamedEntityRecognition results: - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: qmeeus/slue-voxpopuli type: qmeeus/slue-voxpopuli split: dev metrics: - type: wer value: 10.482824557809192 name: Wer --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # WhisperForNamedEntityRecognition This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the qmeeus/slue-voxpopuli dataset. It achieves the following results on the evaluation set: - Loss: 8.1514 - Wer: 10.4828 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 8 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 200 - training_steps: 1600 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:-------:| | 31.3741 | 0.06 | 100 | 25.8582 | 10.4828 | | 13.0078 | 1.03 | 200 | 13.4173 | 10.4828 | | 10.3619 | 1.09 | 300 | 10.8540 | 10.4828 | | 8.7869 | 2.06 | 400 | 9.6249 | 10.4828 | | 7.3964 | 3.02 | 500 | 9.1812 | 10.4828 | | 6.6321 | 3.08 | 600 | 8.6536 | 10.4828 | | 6.4612 | 4.05 | 700 | 8.6046 | 10.4828 | | 4.8358 | 5.02 | 800 | 8.0890 | 10.4828 | | 4.4918 | 5.08 | 900 | 8.3141 | 10.4828 
| | 4.7548 | 6.04 | 1000 | 8.1660 | 10.4828 | | 3.7881 | 7.01 | 1100 | 8.2471 | 10.4828 | | 3.1916 | 7.07 | 1200 | 8.0779 | 10.4828 | | 3.2039 | 8.04 | 1300 | 8.1106 | 10.4828 | | 3.038 | 9.0 | 1400 | 8.0875 | 10.4828 | | 2.3249 | 9.07 | 1500 | 8.1025 | 10.4828 | | 2.6124 | 10.03 | 1600 | 8.1514 | 10.4828 | ### Framework versions - Transformers 4.26.0.dev0 - Pytorch 1.10.0 - Datasets 2.7.1.dev0 - Tokenizers 0.11.0
rizepth/image_classification
rizepth
2023-09-22T07:46:39Z
191
0
transformers
[ "transformers", "pytorch", "vit", "image-classification", "generated_from_trainer", "dataset:imagefolder", "base_model:google/vit-base-patch16-224-in21k", "base_model:finetune:google/vit-base-patch16-224-in21k", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2023-09-18T17:26:17Z
--- license: apache-2.0 base_model: google/vit-base-patch16-224-in21k tags: - generated_from_trainer datasets: - imagefolder metrics: - accuracy model-index: - name: image_classification results: - task: name: Image Classification type: image-classification dataset: name: imagefolder type: imagefolder config: default split: train args: default metrics: - name: Accuracy type: accuracy value: 0.40625 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # image_classification This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 1.6857 - Accuracy: 0.4062 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | No log | 1.0 | 40 | 1.8755 | 0.3125 | | No log | 2.0 | 80 | 1.6801 | 0.4062 | | No log | 3.0 | 120 | 1.6357 | 0.3812 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
pn51/ppo-Huggy
pn51
2023-09-22T07:46:11Z
6
0
ml-agents
[ "ml-agents", "tensorboard", "onnx", "Huggy", "deep-reinforcement-learning", "reinforcement-learning", "ML-Agents-Huggy", "region:us" ]
reinforcement-learning
2023-09-22T07:46:00Z
--- library_name: ml-agents tags: - Huggy - deep-reinforcement-learning - reinforcement-learning - ML-Agents-Huggy --- # **ppo** Agent playing **Huggy** This is a trained model of a **ppo** agent playing **Huggy** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents). ## Usage (with ML-Agents) The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/ We wrote a complete tutorial to learn how to train your first agent using ML-Agents and publish it to the Hub: - A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction - A *longer tutorial* to understand how ML-Agents works: https://huggingface.co/learn/deep-rl-course/unit5/introduction ### Resume the training ```bash mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume ``` ### Watch your Agent play You can watch your agent **playing directly in your browser**: 1. If the environment is part of the ML-Agents official environments, go to https://huggingface.co/unity 2. Find your model_id: pn51/ppo-Huggy 3. Select your *.nn or *.onnx file 4. Click on Watch the agent play 👀
BBBBirdIsTheWord/q-FrozenLake-v1-4x4-noSlippery
BBBBirdIsTheWord
2023-09-22T07:38:03Z
0
0
null
[ "FrozenLake-v1-4x4-no_slippery", "q-learning", "reinforcement-learning", "custom-implementation", "model-index", "region:us" ]
reinforcement-learning
2023-09-22T07:38:00Z
--- tags: - FrozenLake-v1-4x4-no_slippery - q-learning - reinforcement-learning - custom-implementation model-index: - name: q-FrozenLake-v1-4x4-noSlippery results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: FrozenLake-v1-4x4-no_slippery type: FrozenLake-v1-4x4-no_slippery metrics: - type: mean_reward value: 1.00 +/- 0.00 name: mean_reward verified: false --- # **Q-Learning** Agent playing **FrozenLake-v1** This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**. ## Usage ```python # `load_from_hub` is the helper defined in the Hugging Face Deep RL course notebook; # `gym` refers to Gymnasium. model = load_from_hub(repo_id="BBBBirdIsTheWord/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl") # Don't forget to check if you need to add additional attributes (is_slippery=False etc.) env = gym.make(model["env_id"]) ```
RintaroMisaka/segformer-b0-finetuned-segments-sidewalk-2
RintaroMisaka
2023-09-22T07:33:31Z
185
0
transformers
[ "transformers", "pytorch", "segformer", "vision", "image-segmentation", "generated_from_trainer", "base_model:nvidia/mit-b0", "base_model:finetune:nvidia/mit-b0", "license:other", "endpoints_compatible", "region:us" ]
image-segmentation
2023-09-22T07:18:17Z
--- license: other base_model: nvidia/mit-b0 tags: - vision - image-segmentation - generated_from_trainer model-index: - name: segformer-b0-finetuned-segments-sidewalk-2 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-b0-finetuned-segments-sidewalk-2 This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the segments/sidewalk-semantic dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 50 ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
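No evaluation metrics are recorded in this card; for semantic segmentation models such as this one, quality is usually reported as per-class IoU and mean IoU. A minimal dependency-free sketch of that computation (the class IDs and pixel values here are invented for illustration; real pipelines typically use the `evaluate` library's `mean_iou` metric over full label maps):

```python
def iou_per_class(pred, target, num_classes):
    """Per-class intersection-over-union from flat lists of pixel class IDs."""
    ious = []
    for c in range(num_classes):
        inter = sum(1 for p, t in zip(pred, target) if p == c and t == c)
        union = sum(1 for p, t in zip(pred, target) if p == c or t == c)
        # A class absent from both prediction and target has undefined IoU.
        ious.append(inter / union if union else float("nan"))
    return ious

# Toy "images" flattened to 1-D lists of class IDs
pred = [0, 0, 1, 1, 2, 2]
target = [0, 1, 1, 1, 2, 0]
ious = iou_per_class(pred, target, num_classes=3)
mean_iou = sum(ious) / len(ious)
```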
Chiizu/wav2vec2-base-vi-vlsp2020-demo
Chiizu
2023-09-22T07:31:21Z
105
0
transformers
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "generated_from_trainer", "base_model:nguyenvulebinh/wav2vec2-base-vi-vlsp2020", "base_model:finetune:nguyenvulebinh/wav2vec2-base-vi-vlsp2020", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2023-09-22T07:31:04Z
--- license: cc-by-nc-4.0 base_model: nguyenvulebinh/wav2vec2-base-vi-vlsp2020 tags: - generated_from_trainer metrics: - wer model-index: - name: wav2vec2-base-vi-vlsp2020-demo results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # wav2vec2-base-vi-vlsp2020-demo This model is a fine-tuned version of [nguyenvulebinh/wav2vec2-base-vi-vlsp2020](https://huggingface.co/nguyenvulebinh/wav2vec2-base-vi-vlsp2020) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2292 - Wer: 0.0840 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:------:| | 0.7245 | 1.0 | 692 | 0.2790 | 0.1101 | | 0.5746 | 2.0 | 1384 | 0.2467 | 0.0919 | | 0.4068 | 3.0 | 2076 | 0.2292 | 0.0840 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
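The Wer column above is word error rate: the word-level edit distance between the model transcript and the reference, divided by the number of reference words. A self-contained sketch of the metric (training pipelines typically use the `evaluate`/`jiwer` packages rather than this hand-rolled version):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate via Levenshtein distance over whitespace-split words."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[-1][-1] / len(ref)

print(wer("xin chao viet nam", "xin chao viet"))  # 1 deletion / 4 words = 0.25
```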
EladAssia/Taxi-v3
EladAssia
2023-09-22T07:28:50Z
0
0
null
[ "Taxi-v3", "q-learning", "reinforcement-learning", "custom-implementation", "model-index", "region:us" ]
reinforcement-learning
2023-09-22T07:28:48Z
--- tags: - Taxi-v3 - q-learning - reinforcement-learning - custom-implementation model-index: - name: Taxi-v3 results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: Taxi-v3 type: Taxi-v3 metrics: - type: mean_reward value: 7.50 +/- 2.75 name: mean_reward verified: false --- # **Q-Learning** Agent playing **Taxi-v3** This is a trained model of a **Q-Learning** agent playing **Taxi-v3**. ## Usage ```python model = load_from_hub(repo_id="EladAssia/Taxi-v3", filename="q-learning.pkl") # Don't forget to check if you need to add additional attributes (is_slippery=False etc) env = gym.make(model["env_id"]) ```
iamplus/Llama-2-13b-hf-ChatOrca
iamplus
2023-09-22T07:25:34Z
9
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "dataset:iamplus/LLama2-SFT-Data", "dataset:iamplus/Open_Platypus_Orca", "dataset:iamplus/Orca", "dataset:iamplus/Conversational_Data", "license:mit", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-22T07:10:11Z
--- license: mit datasets: - iamplus/LLama2-SFT-Data - iamplus/Open_Platypus_Orca - iamplus/Orca - iamplus/Conversational_Data --- **Description :** This model is trained on a mix of Orca data and open source + closed multi-turn conversation data to create a better reasoning model that is also capable of holding multi-turn conversations. The dataset split, prompt format, and training parameters are described below. **Prompt Description :** The prompt template for the first turn looks like this: ``` <s>[INST] <<SYS>> {{ system_prompt }} <</SYS>> {{ user_message }} [/INST] ``` The prompt template for the multi-turn conversation looks like this: ``` <s>[INST] <<SYS>> {{ system_prompt }} <</SYS>> {{ user_msg_1 }} [/INST] {{ model_answer_1 }} </s><s>[INST] {{ user_msg_2 }} [/INST] ``` This model follows Meta's official chat prompt format. Please refer to https://huggingface.co/blog/llama2#how-to-prompt-llama-2 for how to prompt the model in single- and multi-turn conversations. **Base model :** meta-llama/Llama-2-13b-hf **Data :** 1. 1M Orca data (GPT-4 Orca data - OpenOrca) 2. 1.7M chat data (includes OpenAssistant Chat data, Ultrachat, and many more open source Chat Datasets) 3. 30k OpenPlatypus data **Training Params :** ``` Number of Epochs : 2 Batch Size : 128 Sequence Length : 4096 Learning Rate : 2e-5 (Cosine) Weight Decay : 0.1 Gradient Clipping : 1.0 Gamma : 0.85 beta_1 : 0.9 beta_2 : 0.95 eps : 1e-5 Precision : bf16 Optimizer : Any Precision AdamW Optimizer ```
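The two templates above can be assembled with a small helper. This is only a sketch of the string format shown in the card; exactly where the tokenizer inserts BOS/EOS special tokens should be verified against the linked Llama 2 blog post:

```python
def build_prompt(system_prompt, turns):
    """Build a Llama-2-chat style prompt.

    turns: list of (user_msg, model_answer) pairs; the final answer may be
    None when prompting the model for its next reply.
    """
    out = f"<s>[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
    for i, (user, answer) in enumerate(turns):
        if i > 0:
            out += "<s>[INST] "  # subsequent turns open a fresh [INST] block
        out += f"{user} [/INST]"
        if answer is not None:
            out += f" {answer} </s>"
    return out

prompt = build_prompt(
    "You are a helpful assistant.",
    [("Hi!", "Hello! How can I help?"), ("Tell me a joke.", None)],
)
print(prompt)
```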
dss107/mp_base4
dss107
2023-09-22T07:21:29Z
4
0
sentence-transformers
[ "sentence-transformers", "pytorch", "mpnet", "setfit", "text-classification", "arxiv:2209.11055", "license:apache-2.0", "region:us" ]
text-classification
2023-09-22T07:20:22Z
--- license: apache-2.0 tags: - setfit - sentence-transformers - text-classification pipeline_tag: text-classification --- # dss107/mp_base4 This is a [SetFit model](https://github.com/huggingface/setfit) that can be used for text classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. ## Usage To use this model for inference, first install the SetFit library: ```bash python -m pip install setfit ``` You can then run inference as follows: ```python from setfit import SetFitModel # Download from Hub and run inference model = SetFitModel.from_pretrained("dss107/mp_base4") # Run inference preds = model(["i loved the spiderman movie!", "pineapple on pizza is the worst 🤮"]) ``` ## BibTeX entry and citation info ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ```
p1atdev/kakuyomu-genre-bert
p1atdev
2023-09-22T07:08:46Z
185
0
transformers
[ "transformers", "safetensors", "bert", "text-classification", "ja", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-22T06:28:13Z
--- license: mit language: - ja library_name: transformers pipeline_tag: text-classification tags: - safetensors - bert widget: - example_title: 異世界ファンタジー text: 辺境貴族に転生したので現代知識活用して観光業始めます - example_title: SF text: メタバース・オンライン - example_title: ラブコメ text: 放課後、放送部の二人 - example_title: ミステリー text: タナカ・タロウの事件簿Ⅱ - example_title: 評論 text: 読みやすい文章の書き方とは? --- # kakuyomu-genre-bert A BERT model that classifies the genre of a novel from its title or synopsis. Fine-tuned from Tohoku University's [cl-tohoku/bert-base-japanese-char-v3](https://huggingface.co/cl-tohoku/bert-base-japanese-char-v3).
bogdan1/llama2-bg
bogdan1
2023-09-22T07:06:33Z
14
2
transformers
[ "transformers", "pytorch", "llama", "text-generation", "bg", "license:mit", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-04T16:34:17Z
--- license: mit language: - bg --- Llama-2-7b-base fine-tuned on the Chitanka dataset and a dataset made of scraped news comments dating mostly from 2022/2023. The model was fine-tuned using PEFT and QLoRA for 12000 steps. **Example Usage** ```python import torch from transformers import LlamaForCausalLM, LlamaTokenizer, GenerationConfig tokenizer = LlamaTokenizer.from_pretrained("bogdan1/llama2-bg") model = LlamaForCausalLM.from_pretrained("bogdan1/llama2-bg") inputs = tokenizer("Спомените ми прекъснаха в момента, когато", return_tensors="pt") input_ids = inputs["input_ids"].to(model.device) generation_config = GenerationConfig( do_sample = False, temperature = 0.8, top_p = 1.0, top_k = 50, repetition_penalty = 1.2, max_new_tokens = 128, use_cache=False ) with torch.no_grad(): generation_output = model.generate( input_ids=input_ids, attention_mask=torch.ones_like(input_ids), generation_config=generation_config, ) output_text = tokenizer.decode(generation_output[0], skip_special_tokens=True).strip() ``` **Output comparison** Given a number of prompts, here's how the fine-tuned model compares to the vanilla Llama-2-7b. *Бойко Борисов е* Llama-2-7b: >Бойко Борисов е български политик, бивш министър-председател на България. > >## Биография и кариера > >Роден е в град София през 1952 г., но семейството му се премества да живее във Варна още докато той е малък. Завършил е висше образование по специалност „Икономика“ (СУ „Климент Охридски“, Факултет за международни отношения). Factuality is off; the model hallucinates. Llama2-bg: >Бойко Борисов е най-големия предател на България.Ще видим колко ще се оправя с него и неговите слуги от ПП,ДБ и ДПС!Този човек трябва да бъде затворен в психиатрията!!!!!Никой няма право да го пуска извън там!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Toxicity is high. *Спомените ми прекъснаха в момента, когато* Llama-2-7b: >Спомените ми прекъснаха в момента, когато се опитам да пиша за тези страни. 
>От 1945 г., докато не постъпва на работа като журналист през 2003 г., а след това и до днешно време, съм работил във финансите – първоначално като банков служител (от 1968), а после като управляващ директор на две банки (от 1979). След като станах журналист, отново една година бях банкерски консу Llama2-bg: >Спомените ми прекъснаха в момента, когато се насочих към вратата. >— Какво става? — попитах аз и забелязах как тя ме гледа с непознати очи. >Тя кимна бавно: >— Не съм сигурна… но може би е време да излезем от туй. >Изведнъж разбрах какви са чудесата й. Това беше единственият път, през който някога щяхме да бъдем свободни. *Тая държава я съсипаха* Llama-2-7b: >Тая държава я съсипаха, а тезите си дали на някого друг. >The state has destroyed her and those who gave it to someone else. One of the main problems with Llama-2 is that it continues generating in English, probably due to insufficient multilingual training data. Llama2-bg: >Тая държава я съсипаха тиквун и кокорчо.Ще видим какво ще стане след избори,ако не се намери правителство.Но защо да гласуват хората?Защото няма други възможности.Ами ако имат друга алтернатива?Това е проблемът с демокрацията-не можеш да си вземеш решение без да гласуваш!Или по-скоро -да не гласувате,за да не ни управляват тези кра The grammar is a bit off. *Мама умряла днес. Или може би вчера, не знам.* The famous opening line by Albert Camus. Llama-2-7b: >Мама умряла днес. Или може би вчера, не знам. >Mama died today. Or maybe yesterday, I don't know. Llama2-bg: >Мама умряла днес. Или може би вчера, не знам. >— Какво? — извикаха двамата мъже и се обърнаха към него с очите си разширени от сълзи или гняв. >Той погледна нагоре през прозореца и видя как едва започват да се появяват птиците. След това се обърна обратно към тях: >— Не мога да кажа нищо повече. Аз нямах време да я чуя. Това беше единственото, ко *5 бързи стъпки за оправяне на България:* Llama-2-7b: >5 бързи стъпки за оправяне на България: >1. 
Следите от корупцията >The Bulgarian government has been in power for almost a year now, and the country is still struggling with corruption scandals that have rocked it to its core. The latest one involves the former Prime Minister Boyko Borisov’s son-in-law who was arrested on suspicion of money laundering after being accused by his father-in-law of embezzling millions from state funds during his tenure as head of Bulgaria's State Fund Agriculture (SFA). This comes just weeks before an election which could see him return Llama2-bg: >5 бързи стъпки за оправяне на България: > >1.Ънтите се от властта > >2.Седят в тюрьма докато не им е ясно колко са кражби и престъпления > >3.Прекратяване на държавната помощ към Украйна > >4.Връщане на парите обратно на хората > >5.Да си избираме правителство с максимум 60 депутати!Това ще ни даде възможност да управляваме страната без чужда подкрепа!! Not ideal, throws a Russian word in the mix as well. **Future steps** Train for more epochs and try to fine-tune the larger Llama-2 models and compare results.
nuwangeek/ppo-LunarLander-v2
nuwangeek
2023-09-22T07:05:47Z
0
0
stable-baselines3
[ "stable-baselines3", "LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
reinforcement-learning
2023-09-22T07:05:27Z
--- library_name: stable-baselines3 tags: - LunarLander-v2 - deep-reinforcement-learning - reinforcement-learning - stable-baselines3 model-index: - name: PPO results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: LunarLander-v2 type: LunarLander-v2 metrics: - type: mean_reward value: 256.06 +/- 19.06 name: mean_reward verified: false --- # **PPO** Agent playing **LunarLander-v2** This is a trained model of a **PPO** agent playing **LunarLander-v2** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3). ## Usage (with Stable-baselines3) ```python from stable_baselines3 import PPO from huggingface_sb3 import load_from_hub # Download the checkpoint from the Hub; the filename below follows the usual upload convention and may need adjusting to match this repo checkpoint = load_from_hub(repo_id="nuwangeek/ppo-LunarLander-v2", filename="ppo-LunarLander-v2.zip") model = PPO.load(checkpoint) ```
VinayReddyPulyala/graphcorevqa
VinayReddyPulyala
2023-09-22T07:05:16Z
62
0
transformers
[ "transformers", "pytorch", "vilt", "visual-question-answering", "generated_from_trainer", "base_model:dandelin/vilt-b32-mlm", "base_model:finetune:dandelin/vilt-b32-mlm", "license:apache-2.0", "endpoints_compatible", "region:us" ]
visual-question-answering
2023-09-20T07:34:13Z
--- license: apache-2.0 base_model: dandelin/vilt-b32-mlm tags: - generated_from_trainer model-index: - name: graphcorevqa results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # graphcorevqa This model is a fine-tuned version of [dandelin/vilt-b32-mlm](https://huggingface.co/dandelin/vilt-b32-mlm) on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 20 ### Training results ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
Cartinoe5930/orca_mini_v3-13b-GPTQ
Cartinoe5930
2023-09-22T07:04:12Z
7
0
transformers
[ "transformers", "safetensors", "llama", "text-generation", "en", "dataset:psmathur/orca_mini_v1_dataset", "dataset:ehartford/dolphin", "arxiv:2306.02707", "license:other", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "4-bit", "gptq", "region:us" ]
text-generation
2023-09-22T07:02:20Z
--- base_model: https://huggingface.co/psmathur/orca_mini_v3_13b datasets: - psmathur/orca_mini_v1_dataset - ehartford/dolphin inference: false language: - en library_name: transformers license: other model_creator: Pankaj Mathur model_name: Orca Mini v3 13B model_type: llama pipeline_tag: text-generation prompt_template: '### System: You are an AI assistant that follows instruction extremely well. Help as much as you can. ### User: {prompt} ### Input: {input} ### Response: ' quantized_by: TheBloke --- <!-- header start --> <!-- 200823 --> <div style="width: auto; margin-left: auto; margin-right: auto"> <img src="https://i.imgur.com/EBdldam.jpg" alt="TheBlokeAI" style="width: 100%; min-width: 400px; display: block; margin: auto;"> </div> <div style="display: flex; justify-content: space-between; width: 100%;"> <div style="display: flex; flex-direction: column; align-items: flex-start;"> <p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://discord.gg/theblokeai">Chat & support: TheBloke's Discord server</a></p> </div> <div style="display: flex; flex-direction: column; align-items: flex-end;"> <p style="margin-top: 0.5em; margin-bottom: 0em;"><a href="https://www.patreon.com/TheBlokeAI">Want to contribute? TheBloke's Patreon page</a></p> </div> </div> <div style="text-align:center; margin-top: 0em; margin-bottom: 0em"><p style="margin-top: 0.25em; margin-bottom: 0em;">TheBloke's LLM work is generously supported by a grant from <a href="https://a16z.com">andreessen horowitz (a16z)</a></p></div> <hr style="margin-top: 1.0em; margin-bottom: 1.0em;"> <!-- header end --> # Orca Mini v3 13B - GPTQ - Model creator: [Pankaj Mathur](https://huggingface.co/psmathur) - Original model: [Orca Mini v3 13B](https://huggingface.co/psmathur/orca_mini_v3_13b) <!-- description start --> ## Description This repo contains GPTQ model files for [Pankaj Mathur's Orca Mini v3 13B](https://huggingface.co/psmathur/orca_mini_v3_13b). 
Multiple GPTQ parameter permutations are provided; see Provided Files below for details of the options provided, their parameters, and the software used to create them. <!-- description end --> <!-- repositories-available start --> ## Repositories available * [AWQ model(s) for GPU inference.](https://huggingface.co/TheBloke/orca_mini_v3_13B-AWQ) * [GPTQ models for GPU inference, with multiple quantisation parameter options.](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ) * [2, 3, 4, 5, 6 and 8-bit GGUF models for CPU+GPU inference](https://huggingface.co/TheBloke/orca_mini_v3_13B-GGUF) * [Pankaj Mathur's original unquantised fp16 model in pytorch format, for GPU inference and for further conversions](https://huggingface.co/psmathur/orca_mini_v3_13b) <!-- repositories-available end --> <!-- prompt-template start --> ## Prompt template: orca_mini ``` ### System: You are an AI assistant that follows instruction extremely well. Help as much as you can. ### User: {prompt} ### Input: {input} ### Response: ``` <!-- prompt-template end --> <!-- licensing start --> ## Licensing The creator of the source model has listed its license as `other`, and this quantization has therefore used that same license. As this model is based on Llama 2, it is also subject to the Meta Llama 2 license terms, and the license files for that are additionally included. It should therefore be considered as being claimed to be licensed under both licenses. I contacted Hugging Face for clarification on dual licensing but they do not yet have an official position. Should this change, or should Meta provide any feedback on this situation, I will update this section accordingly. In the meantime, any questions regarding licensing, and in particular how these two licenses might interact, should be directed to the original model repository: [Pankaj Mathur's Orca Mini v3 13B](https://huggingface.co/psmathur/orca_mini_v3_13b). 
<!-- licensing end --> <!-- README_GPTQ.md-provided-files start --> ## Provided files and GPTQ parameters Multiple quantisation parameters are provided, to allow you to choose the best one for your hardware and requirements. Each separate quant is in a different branch. See below for instructions on fetching from different branches. All recent GPTQ files are made with AutoGPTQ, and all files in non-main branches are made with AutoGPTQ. Files in the `main` branch which were uploaded before August 2023 were made with GPTQ-for-LLaMa. <details> <summary>Explanation of GPTQ parameters</summary> - Bits: The bit size of the quantised model. - GS: GPTQ group size. Higher numbers use less VRAM, but have lower quantisation accuracy. "None" is the lowest possible value. - Act Order: True or False. Also known as `desc_act`. True results in better quantisation accuracy. Some GPTQ clients have had issues with models that use Act Order plus Group Size, but this is generally resolved now. - Damp %: A GPTQ parameter that affects how samples are processed for quantisation. 0.01 is default, but 0.1 results in slightly better accuracy. - GPTQ dataset: The dataset used for quantisation. Using a dataset more appropriate to the model's training can improve quantisation accuracy. Note that the GPTQ dataset is not the same as the dataset used to train the model - please refer to the original model repo for details of the training dataset(s). - Sequence Length: The length of the dataset sequences used for quantisation. Ideally this is the same as the model sequence length. For some very long sequence models (16+K), a lower sequence length may have to be used. Note that a lower sequence length does not limit the sequence length of the quantised model. It only impacts the quantisation accuracy on longer inference sequences. - ExLlama Compatibility: Whether this file can be loaded with ExLlama, which currently only supports Llama models in 4-bit. 
</details> | Branch | Bits | GS | Act Order | Damp % | GPTQ Dataset | Seq Len | Size | ExLlama | Desc | | ------ | ---- | -- | --------- | ------ | ------------ | ------- | ---- | ------- | ---- | | [main](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ/tree/main) | 4 | 128 | No | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 7.26 GB | Yes | 4-bit, without Act Order and group size 128g. | | [gptq-4bit-32g-actorder_True](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ/tree/gptq-4bit-32g-actorder_True) | 4 | 32 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 8.00 GB | Yes | 4-bit, with Act Order and group size 32g. Gives highest possible inference quality, with maximum VRAM usage. | | [gptq-4bit-64g-actorder_True](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ/tree/gptq-4bit-64g-actorder_True) | 4 | 64 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 7.51 GB | Yes | 4-bit, with Act Order and group size 64g. Uses less VRAM than 32g, but with slightly lower accuracy. | | [gptq-4bit-128g-actorder_True](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ/tree/gptq-4bit-128g-actorder_True) | 4 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 7.26 GB | Yes | 4-bit, with Act Order and group size 128g. Uses even less VRAM than 64g, but with slightly lower accuracy. | | [gptq-8bit--1g-actorder_True](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ/tree/gptq-8bit--1g-actorder_True) | 8 | None | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 13.36 GB | No | 8-bit, with Act Order. No group size, to lower VRAM requirements. 
| | [gptq-8bit-128g-actorder_False](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ/tree/gptq-8bit-128g-actorder_False) | 8 | 128 | No | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 13.65 GB | No | 8-bit, with group size 128g for higher inference quality and without Act Order to improve AutoGPTQ speed. | | [gptq-8bit-128g-actorder_True](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ/tree/gptq-8bit-128g-actorder_True) | 8 | 128 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 13.65 GB | No | 8-bit, with group size 128g for higher inference quality and with Act Order for even higher accuracy. | | [gptq-8bit-64g-actorder_True](https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ/tree/gptq-8bit-64g-actorder_True) | 8 | 64 | Yes | 0.1 | [wikitext](https://huggingface.co/datasets/wikitext/viewer/wikitext-2-v1/test) | 4096 | 13.95 GB | No | 8-bit, with group size 64g and Act Order for even higher inference quality. Poor AutoGPTQ CUDA speed. | <!-- README_GPTQ.md-provided-files end --> <!-- README_GPTQ.md-download-from-branches start --> ## How to download from branches - In text-generation-webui, you can add `:branch` to the end of the download name, eg `TheBloke/orca_mini_v3_13B-GPTQ:main` - With Git, you can clone a branch with: ``` git clone --single-branch --branch main https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ ``` - In Python Transformers code, the branch is the `revision` parameter; see below. <!-- README_GPTQ.md-download-from-branches end --> <!-- README_GPTQ.md-text-generation-webui start --> ## How to easily download and use this model in [text-generation-webui](https://github.com/oobabooga/text-generation-webui). Please make sure you're using the latest version of [text-generation-webui](https://github.com/oobabooga/text-generation-webui). 
It is strongly recommended to use the text-generation-webui one-click-installers unless you're sure you know how to make a manual install. 1. Click the **Model tab**. 2. Under **Download custom model or LoRA**, enter `TheBloke/orca_mini_v3_13B-GPTQ`. - To download from a specific branch, enter for example `TheBloke/orca_mini_v3_13B-GPTQ:main` - see Provided Files above for the list of branches for each option. 3. Click **Download**. 4. The model will start downloading. Once it's finished it will say "Done". 5. In the top left, click the refresh icon next to **Model**. 6. In the **Model** dropdown, choose the model you just downloaded: `orca_mini_v3_13B-GPTQ` 7. The model will automatically load, and is now ready for use! 8. If you want any custom settings, set them and then click **Save settings for this model** followed by **Reload the Model** in the top right. * Note that you do not need to and should not set manual GPTQ parameters any more. These are set automatically from the file `quantize_config.json`. 9. Once you're ready, click the **Text Generation tab** and enter a prompt to get started! <!-- README_GPTQ.md-text-generation-webui end --> <!-- README_GPTQ.md-use-from-python start --> ## How to use this GPTQ model from Python code ### Install the necessary packages Requires: Transformers 4.32.0 or later, Optimum 1.12.0 or later, and AutoGPTQ 0.4.2 or later. ```shell pip3 install transformers>=4.32.0 optimum>=1.12.0 pip3 install auto-gptq --extra-index-url https://huggingface.github.io/autogptq-index/whl/cu118/ # Use cu117 if on CUDA 11.7 ``` If you have problems installing AutoGPTQ using the pre-built wheels, install it from source instead: ```shell pip3 uninstall -y auto-gptq git clone https://github.com/PanQiWei/AutoGPTQ cd AutoGPTQ pip3 install . ``` ### For CodeLlama models only: you must use Transformers 4.33.0 or later. 
If 4.33.0 is not yet released when you read this, you will need to install Transformers from source: ```shell pip3 uninstall -y transformers pip3 install git+https://github.com/huggingface/transformers.git ``` ### You can then use the following code ```python from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline model_name_or_path = "TheBloke/orca_mini_v3_13B-GPTQ" # To use a different branch, change revision # For example: revision="main" model = AutoModelForCausalLM.from_pretrained(model_name_or_path, device_map="auto", trust_remote_code=False, revision="main") tokenizer = AutoTokenizer.from_pretrained(model_name_or_path, use_fast=True) prompt = "Tell me about AI" prompt_input = ""  # text for the '### Input:' slot (left empty here) prompt_template=f'''### System: You are an AI assistant that follows instruction extremely well. Help as much as you can. ### User: {prompt} ### Input: {prompt_input} ### Response: ''' print("\n\n*** Generate:") input_ids = tokenizer(prompt_template, return_tensors='pt').input_ids.cuda() output = model.generate(inputs=input_ids, temperature=0.7, do_sample=True, top_p=0.95, top_k=40, max_new_tokens=512) print(tokenizer.decode(output[0])) # Inference can also be done using transformers' pipeline print("*** Pipeline:") pipe = pipeline( "text-generation", model=model, tokenizer=tokenizer, max_new_tokens=512, do_sample=True, temperature=0.7, top_p=0.95, top_k=40, repetition_penalty=1.1 ) print(pipe(prompt_template)[0]['generated_text']) ``` <!-- README_GPTQ.md-use-from-python end --> <!-- README_GPTQ.md-compatibility start --> ## Compatibility The files provided are tested to work with AutoGPTQ, both via Transformers and using AutoGPTQ directly. They should also work with [Occ4m's GPTQ-for-LLaMa fork](https://github.com/0cc4m/KoboldAI). [ExLlama](https://github.com/turboderp/exllama) is compatible with Llama models in 4-bit. Please see the Provided Files table above for per-file compatibility. 
[Huggingface Text Generation Inference (TGI)](https://github.com/huggingface/text-generation-inference) is compatible with all GPTQ models. <!-- README_GPTQ.md-compatibility end --> <!-- footer start --> <!-- 200823 --> ## Discord For further support, and discussions on these models and AI in general, join us at: [TheBloke AI's Discord server](https://discord.gg/theblokeai) ## Thanks, and how to contribute Thanks to the [chirper.ai](https://chirper.ai) team! Thanks to Clay from [gpus.llm-utils.org](https://gpus.llm-utils.org)! I've had a lot of people ask if they can contribute. I enjoy providing models and helping people, and would love to be able to spend even more time doing it, as well as expanding into new projects like fine tuning/training. If you're able and willing to contribute it will be most gratefully received and will help me to keep providing more models, and to start work on new AI projects. Donors will get priority support on any and all AI/LLM/model questions and requests, access to a private Discord room, plus other benefits. * Patreon: https://patreon.com/TheBlokeAI * Ko-Fi: https://ko-fi.com/TheBlokeAI **Special thanks to**: Aemon Algiz. 
**Patreon special mentions**: Alicia Loh, Stephen Murray, K, Ajan Kanaga, RoA, Magnesian, Deo Leter, Olakabola, Eugene Pentland, zynix, Deep Realms, Raymond Fosdick, Elijah Stavena, Iucharbius, Erik Bjäreholt, Luis Javier Navarrete Lozano, Nicholas, theTransient, John Detwiler, alfie_i, knownsqashed, Mano Prime, Willem Michiel, Enrico Ros, LangChain4j, OG, Michael Dempsey, Pierre Kircher, Pedro Madruga, James Bentley, Thomas Belote, Luke @flexchar, Leonard Tan, Johann-Peter Hartmann, Illia Dulskyi, Fen Risland, Chadd, S_X, Jeff Scroggin, Ken Nordquist, Sean Connelly, Artur Olbinski, Swaroop Kallakuri, Jack West, Ai Maven, David Ziegler, Russ Johnson, transmissions 11, John Villwock, Alps Aficionado, Clay Pascal, Viktor Bowallius, Subspace Studios, Rainer Wilmers, Trenton Dambrowitz, vamX, Michael Levine, 준교 김, Brandon Frisco, Kalila, Trailburnt, Randy H, Talal Aujan, Nathan Dryer, Vadim, 阿明, ReadyPlayerEmma, Tiffany J. Kim, George Stoitzev, Spencer Kim, Jerry Meng, Gabriel Tamborski, Cory Kujawski, Jeffrey Morgan, Spiking Neurons AB, Edmond Seymore, Alexandros Triantafyllidis, Lone Striker, Cap'n Zoog, Nikolai Manek, danny, ya boyyy, Derek Yates, usrbinkat, Mandus, TL, Nathan LeClaire, subjectnull, Imad Khwaja, webtim, Raven Klaugh, Asp the Wyvern, Gabriel Puliatti, Caitlyn Gatomon, Joseph William Delisle, Jonathan Leane, Luke Pendergrass, SuperWojo, Sebastain Graf, Will Dee, Fred von Graf, Andrey, Dan Guido, Daniel P. Andersen, Nitin Borwankar, Elle, Vitor Caleffi, biorpg, jjj, NimbleBox.ai, Pieter, Matthew Berman, terasurfer, Michael Davis, Alex, Stanislav Ovsiannikov Thank you to all my generous patrons and donaters! And thank you again to a16z for their generous grant. <!-- footer end --> # Original model card: Pankaj Mathur's Orca Mini v3 13B # orca_mini_v3_13b A Llama2-13b model trained on Orca Style datasets. <br> ![orca-mini](https://huggingface.co/psmathur/orca_mini_v3_13b/resolve/main/orca_minis_small.jpeg) <br> **P.S. 
If you're interested in collaborating, please connect with me at www.linkedin.com/in/pankajam.**

<br>

### quantized versions

Big thanks to [@TheBloke](https://huggingface.co/TheBloke)

1) https://huggingface.co/TheBloke/orca_mini_v3_13B-GGML
2) https://huggingface.co/TheBloke/orca_mini_v3_13B-GPTQ

<br>

#### license disclaimer:

This model is bound by the license & usage restrictions of the original Llama-2 model, and comes with no warranty or guarantees of any kind.

<br>

## Evaluation

We evaluated orca_mini_v3_13b on a wide range of tasks using the [Language Model Evaluation Harness](https://github.com/EleutherAI/lm-evaluation-harness) from EleutherAI.

Here are the results on the metrics used by the [HuggingFaceH4 Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard):

|**Task**|**Metric**|**Value**|**Stderr**|
|:------:|:--------:|:-------:|:--------:|
|*arc_challenge*|acc_norm|0.6314|0.0141|
|*hellaswag*|acc_norm|0.8242|0.0038|
|*mmlu*|acc_norm|0.5637|0.0351|
|*truthfulqa_mc*|mc2|0.5127|0.0157|
|**Total Average**|-|**0.6329877193**||

<br>

## Example Usage

Here is the prompt format:

```
### System:
You are an AI assistant that follows instruction extremely well. Help as much as you can.

### User:
Tell me about Orcas.

### Assistant:
```

Below is a code example showing how to use this model:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

tokenizer = AutoTokenizer.from_pretrained("psmathur/orca_mini_v3_13b")
model = AutoModelForCausalLM.from_pretrained(
    "psmathur/orca_mini_v3_13b",
    torch_dtype=torch.float16,
    load_in_8bit=True,
    low_cpu_mem_usage=True,
    device_map="auto"
)

system_prompt = "### System:\nYou are an AI assistant that follows instruction extremely well. Help as much as you can.\n\n"

# generate text steps
instruction = "Tell me about Orcas."
prompt = f"{system_prompt}### User: {instruction}\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")
output = model.generate(**inputs, do_sample=True, top_p=0.95, top_k=0, max_new_tokens=4096)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

<br>

#### Limitations & Biases:

While this model aims for accuracy, it can occasionally produce inaccurate or misleading results. Despite diligent efforts in refining the pretraining data, there remains a possibility for the generation of inappropriate, biased, or offensive content. Exercise caution and cross-check information when necessary.

<br>

### Citation:

Please kindly cite using the following BibTeX:

```
@misc{orca_mini_v3_13b,
  author = {Pankaj Mathur},
  title = {orca_mini_v3_13b: An Orca Style Llama2-13b model},
  year = {2023},
  publisher = {HuggingFace},
  journal = {HuggingFace repository},
  howpublished = {\url{https://huggingface.co/psmathur/orca_mini_v3_13b}},
}
```

```
@misc{mukherjee2023orca,
  title={Orca: Progressive Learning from Complex Explanation Traces of GPT-4},
  author={Subhabrata Mukherjee and Arindam Mitra and Ganesh Jawahar and Sahaj Agarwal and Hamid Palangi and Ahmed Awadallah},
  year={2023},
  eprint={2306.02707},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```

```
@software{touvron2023llama2,
  title={Llama 2: Open Foundation and Fine-Tuned Chat Models},
  author={Hugo Touvron, Louis Martin, Kevin Stone, Peter Albert, Amjad Almahairi, Yasmine Babaei, Nikolay Bashlykov, Soumya Batra, Prajjwal Bhargava, Shruti Bhosale, Dan Bikel, Lukas Blecher, Cristian Canton Ferrer, Moya Chen, Guillem Cucurull, David Esiobu, Jude Fernandes, Jeremy Fu, Wenyin Fu, Brian Fuller, Cynthia Gao, Vedanuj Goswami, Naman Goyal, Anthony Hartshorn, Saghar Hosseini, Rui Hou, Hakan Inan, Marcin Kardas, Viktor Kerkez, Madian Khabsa, Isabel Kloumann, Artem Korenev, Punit Singh Koura, Marie-Anne Lachaux, Thibaut Lavril, Jenya Lee, Diana Liskovich, Yinghai Lu, Yuning Mao, Xavier Martinet, Todor Mihaylov, Pushkar Mishra, Igor Molybog, Yixin Nie, Andrew Poulton, Jeremy Reizenstein, Rashi Rungta, Kalyan Saladi, Alan Schelten, Ruan Silva, Eric Michael Smith, Ranjan Subramanian, Xiaoqing Ellen Tan, Binh Tang, Ross Taylor, Adina Williams, Jian Xiang Kuan, Puxin Xu, Zheng Yan, Iliyan Zarov, Yuchen Zhang, Angela Fan, Melanie Kambadur, Sharan Narang, Aurelien Rodriguez, Robert Stojnic, Sergey Edunov, Thomas Scialom},
  year={2023}
}
```
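For reference, assembling the Orca-style prompt string used in the example can be isolated into a small helper. This is a pure-string sketch; the `build_prompt` function is illustrative and not part of the model's API:

```python
def build_prompt(instruction: str) -> str:
    # Mirrors the prompt format from the card: system block, user turn, assistant cue.
    system_prompt = (
        "### System:\nYou are an AI assistant that follows instruction "
        "extremely well. Help as much as you can.\n\n"
    )
    return f"{system_prompt}### User: {instruction}\n\n### Assistant:\n"

prompt = build_prompt("Tell me about Orcas.")
print(prompt)
```

The returned string can be passed straight to the tokenizer in place of the inline f-string.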
Johnkimmyshinmmy/train_lora
Johnkimmyshinmmy
2023-09-22T07:00:10Z
0
0
diffusers
[ "diffusers", "tensorboard", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "lora", "base_model:runwayml/stable-diffusion-v1-5", "base_model:adapter:runwayml/stable-diffusion-v1-5", "license:creativeml-openrail-m", "region:us" ]
text-to-image
2023-09-22T06:50:49Z
---
license: creativeml-openrail-m
base_model: runwayml/stable-diffusion-v1-5
instance_prompt: a photo of aszx interior
tags:
- stable-diffusion
- stable-diffusion-diffusers
- text-to-image
- diffusers
- lora
inference: true
---

# LoRA DreamBooth - Johnkimmyshinmmy/train_lora

These are LoRA adaptation weights for runwayml/stable-diffusion-v1-5. The weights were trained on a photo of aszx interior using [DreamBooth](https://dreambooth.github.io/). You can find some example images below.

![img_0](./image_0.png)
![img_1](./image_1.png)
![img_2](./image_2.png)
![img_3](./image_3.png)

LoRA for the text encoder was enabled: False.
diana9m/marian-finetuned-kde4-en-to-fr
diana9m
2023-09-22T06:54:47Z
114
0
transformers
[ "transformers", "pytorch", "tensorboard", "safetensors", "marian", "text2text-generation", "translation", "generated_from_trainer", "dataset:kde4", "base_model:Helsinki-NLP/opus-mt-en-fr", "base_model:finetune:Helsinki-NLP/opus-mt-en-fr", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
translation
2023-08-06T20:49:41Z
--- license: apache-2.0 base_model: Helsinki-NLP/opus-mt-en-fr tags: - translation - generated_from_trainer datasets: - kde4 metrics: - bleu model-index: - name: marian-finetuned-kde4-en-to-fr results: - task: name: Sequence-to-sequence Language Modeling type: text2text-generation dataset: name: kde4 type: kde4 config: en-fr split: train args: en-fr metrics: - name: Bleu type: bleu value: 52.88529894542656 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # marian-finetuned-kde4-en-to-fr This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-fr](https://huggingface.co/Helsinki-NLP/opus-mt-en-fr) on the kde4 dataset. It achieves the following results on the evaluation set: - Loss: 0.8556 - Bleu: 52.8853 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results ### Framework versions - Transformers 4.31.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.3 - Tokenizers 0.13.3
Keenan5755/q-Taxi-v3
Keenan5755
2023-09-22T06:46:02Z
0
0
null
[ "Taxi-v3", "q-learning", "reinforcement-learning", "custom-implementation", "model-index", "region:us" ]
reinforcement-learning
2023-09-22T06:45:57Z
---
tags:
- Taxi-v3
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-Taxi-v3
  results:
  - task:
      type: reinforcement-learning
      name: reinforcement-learning
    dataset:
      name: Taxi-v3
      type: Taxi-v3
    metrics:
    - type: mean_reward
      value: 7.50 +/- 2.75
      name: mean_reward
      verified: false
---

# **Q-Learning** Agent playing **Taxi-v3**

This is a trained model of a **Q-Learning** agent playing **Taxi-v3**.

## Usage

```python
import pickle, gym
from huggingface_hub import hf_hub_download

# Sketch of the course's `load_from_hub` helper: download and unpickle the model dict.
with open(hf_hub_download(repo_id="Keenan5755/q-Taxi-v3", filename="q-learning.pkl"), "rb") as f:
    model = pickle.load(f)

# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
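Once downloaded, the Q-table stored in such a model dict is just a 2-D array (states × actions), and acting greedily reduces to an argmax per state. A toy sketch — the array below is illustrative, not the actual Taxi-v3 table:

```python
import numpy as np

def greedy_action(qtable: np.ndarray, state: int) -> int:
    # Greedy policy: pick the action with the highest learned Q-value for this state.
    return int(np.argmax(qtable[state]))

# Toy 3-state, 2-action table (illustrative values only).
qtable = np.array([[0.1, 0.9],
                   [0.7, 0.2],
                   [0.0, 0.0]])
print(greedy_action(qtable, 0))  # action 1 has the higher Q-value in state 0
```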
keepsteady/qlora-koalpaca-polyglot-12.8b-50step
keepsteady
2023-09-22T06:43:28Z
2
0
peft
[ "peft", "region:us" ]
null
2023-09-22T06:43:25Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: bfloat16 ### Framework versions - PEFT 0.6.0.dev0
Vasanth/deci-finetuned-alpaca-cleaned
Vasanth
2023-09-22T06:40:06Z
15
0
transformers
[ "transformers", "pytorch", "llama", "text-generation", "generated_from_trainer", "custom_code", "base_model:Deci/DeciLM-6b-instruct", "base_model:finetune:Deci/DeciLM-6b-instruct", "license:other", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-22T05:21:10Z
--- license: other base_model: Deci/DeciLM-6b-instruct tags: - generated_from_trainer model-index: - name: deci-finetuned-alpaca-cleaned results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # deci-finetuned-alpaca-cleaned This model is a fine-tuned version of [Deci/DeciLM-6b-instruct](https://huggingface.co/Deci/DeciLM-6b-instruct) on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 1 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - training_steps: 1000 ### Training results ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
Keenan5755/q-FrozenLake-v1-4x4-noSlippery
Keenan5755
2023-09-22T06:33:52Z
0
0
null
[ "FrozenLake-v1-4x4-no_slippery", "q-learning", "reinforcement-learning", "custom-implementation", "model-index", "region:us" ]
reinforcement-learning
2023-09-22T06:33:50Z
---
tags:
- FrozenLake-v1-4x4-no_slippery
- q-learning
- reinforcement-learning
- custom-implementation
model-index:
- name: q-FrozenLake-v1-4x4-noSlippery
  results:
  - task:
      type: reinforcement-learning
      name: reinforcement-learning
    dataset:
      name: FrozenLake-v1-4x4-no_slippery
      type: FrozenLake-v1-4x4-no_slippery
    metrics:
    - type: mean_reward
      value: 1.00 +/- 0.00
      name: mean_reward
      verified: false
---

# **Q-Learning** Agent playing **FrozenLake-v1**

This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**.

## Usage

```python
import pickle, gym
from huggingface_hub import hf_hub_download

# Sketch of the course's `load_from_hub` helper: download and unpickle the model dict.
with open(hf_hub_download(repo_id="Keenan5755/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl"), "rb") as f:
    model = pickle.load(f)

# Don't forget to check if you need to add additional attributes (is_slippery=False etc)
env = gym.make(model["env_id"])
```
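The tabular Q-learning update that produces such a Q-table is one line of arithmetic per step. A toy sketch with made-up numbers — the learning rate and discount below are illustrative, not the ones used to train this model:

```python
def q_update(q_sa: float, reward: float, max_q_next: float,
             lr: float = 0.7, gamma: float = 0.95) -> float:
    # Bellman update: move Q(s, a) toward the bootstrapped target r + gamma * max_a' Q(s', a').
    return q_sa + lr * (reward + gamma * max_q_next - q_sa)

# One update with illustrative numbers: Q(s, a) = 0.0, reward = 1.0, best next-state value = 0.0.
new_q = q_update(0.0, 1.0, 0.0)
print(new_q)  # 0.0 + 0.7 * (1.0 + 0.95 * 0.0 - 0.0) = 0.7
```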
zslrmhb/poca-SoccerTwos
zslrmhb
2023-09-22T06:31:20Z
13
0
ml-agents
[ "ml-agents", "tensorboard", "onnx", "SoccerTwos", "deep-reinforcement-learning", "reinforcement-learning", "ML-Agents-SoccerTwos", "region:us" ]
reinforcement-learning
2023-09-22T06:31:04Z
---
library_name: ml-agents
tags:
- SoccerTwos
- deep-reinforcement-learning
- reinforcement-learning
- ML-Agents-SoccerTwos
---

# **poca** Agent playing **SoccerTwos**

This is a trained model of a **poca** agent playing **SoccerTwos** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents).

## Usage (with ML-Agents)

The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/

We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:

- A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction
- A *longer tutorial* to understand how ML-Agents works: https://huggingface.co/learn/deep-rl-course/unit5/introduction

### Resume the training

```bash
mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume
```

### Watch your Agent play

You can watch your agent **playing directly in your browser**. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity, then:

1. Find your model_id: zslrmhb/poca-SoccerTwos
2. Select your *.nn / *.onnx file
3. Click on Watch the agent play 👀
CyberHarem/muramatsu_sakura_idolmastercinderellagirls
CyberHarem
2023-09-22T06:22:58Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/muramatsu_sakura_idolmastercinderellagirls", "license:mit", "region:us" ]
text-to-image
2023-09-22T06:13:14Z
---
license: mit
datasets:
- CyberHarem/muramatsu_sakura_idolmastercinderellagirls
pipeline_tag: text-to-image
tags:
- art
---

# Lora of muramatsu_sakura_idolmastercinderellagirls

This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion), and the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs).

The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).

After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.

For example, if you want to use the model from step 3400, you need to download `3400/muramatsu_sakura_idolmastercinderellagirls.pt` as the embedding and `3400/muramatsu_sakura_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters.

**The best step we recommend is 3400**, with a score of 0.981.

The trigger words are:

1. `muramatsu_sakura_idolmastercinderellagirls`
2. `brown_hair, twintails, smile, open_mouth, short_hair, short_twintails, hairband, bow, blush, pink_eyes, brown_eyes`

Using this model is not recommended for the following groups, and we express our regret:

1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals whose application scenarios place high demands on accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4.
Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters. 5. Individuals who finds the generated image content offensive to their values. These are available steps: | Steps | Score | Download | pattern_1 | pattern_2 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata | |:---------|:----------|:--------------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------| | 5100 | 0.980 | [Download](5100/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![pattern_2-5100](5100/previews/pattern_2.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) | | 4760 | 0.981 | [Download](4760/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | ![pattern_2-4760](4760/previews/pattern_2.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | 
![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) | | 4420 | 0.981 | [Download](4420/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![pattern_2-4420](4420/previews/pattern_2.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) | | 4080 | 0.978 | [Download](4080/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![pattern_2-4080](4080/previews/pattern_2.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) | | 3740 | 0.952 | [Download](3740/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-3740](3740/previews/pattern_1.png) | ![pattern_2-3740](3740/previews/pattern_2.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) | | **3400** | **0.981** | 
[**Download**](3400/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![pattern_2-3400](3400/previews/pattern_2.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) | | 3060 | 0.976 | [Download](3060/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-3060](3060/previews/pattern_1.png) | ![pattern_2-3060](3060/previews/pattern_2.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) | | 2720 | 0.948 | [Download](2720/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![pattern_2-2720](2720/previews/pattern_2.png) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) | | 2380 | 0.909 | [Download](2380/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | ![pattern_2-2380](2380/previews/pattern_2.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | 
![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) | | 2040 | 0.890 | [Download](2040/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![pattern_2-2040](2040/previews/pattern_2.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) | | 1700 | 0.866 | [Download](1700/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![pattern_2-1700](1700/previews/pattern_2.png) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) | | 1360 | 0.809 | [Download](1360/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![pattern_2-1360](1360/previews/pattern_2.png) | ![bikini-1360](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | 
![yukata-1360](1360/previews/yukata.png) | | 1020 | 0.840 | [Download](1020/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![pattern_2-1020](1020/previews/pattern_2.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) | | 680 | 0.640 | [Download](680/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-680](680/previews/pattern_1.png) | ![pattern_2-680](680/previews/pattern_2.png) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) | | 340 | 0.387 | [Download](340/muramatsu_sakura_idolmastercinderellagirls.zip) | ![pattern_1-340](340/previews/pattern_1.png) | ![pattern_2-340](340/previews/pattern_2.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
pn51/rl-unit-1
pn51
2023-09-22T06:04:27Z
0
0
stable-baselines3
[ "stable-baselines3", "LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
reinforcement-learning
2023-09-22T06:04:07Z
---
library_name: stable-baselines3
tags:
- LunarLander-v2
- deep-reinforcement-learning
- reinforcement-learning
- stable-baselines3
model-index:
- name: PPO
  results:
  - task:
      type: reinforcement-learning
      name: reinforcement-learning
    dataset:
      name: LunarLander-v2
      type: LunarLander-v2
    metrics:
    - type: mean_reward
      value: 271.22 +/- 14.87
      name: mean_reward
      verified: false
---

# **PPO** Agent playing **LunarLander-v2**

This is a trained model of a **PPO** agent playing **LunarLander-v2** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).

## Usage (with Stable-baselines3)

```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# The filename below is an assumption -- check the repo's file list for the actual checkpoint name.
checkpoint = load_from_hub(repo_id="pn51/rl-unit-1", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
fastbond/llama-2-7b-finetune-GEM_viggo
fastbond
2023-09-22T05:59:00Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-22T05:58:45Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: float16 ### Framework versions - PEFT 0.5.0
kla-20/Peft-Flan-t5-qa-model
kla-20
2023-09-22T05:50:38Z
0
0
null
[ "pytorch", "tensorboard", "generated_from_trainer", "license:apache-2.0", "region:us" ]
null
2023-09-22T04:35:09Z
--- license: apache-2.0 tags: - generated_from_trainer model-index: - name: Peft-Flan-t5-qa-model results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Peft-Flan-t5-qa-model This model is a fine-tuned version of [google/flan-t5-base](https://huggingface.co/google/flan-t5-base) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - training_steps: 1 ### Framework versions - Transformers 4.27.2 - Pytorch 1.13.1+cu117 - Datasets 2.11.0 - Tokenizers 0.13.3
vasimakram01/ludwig_llm3
vasimakram01
2023-09-22T05:44:16Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-22T05:41:25Z
---
library_name: peft
---

## Training procedure

The following `bitsandbytes` quantization config was used during training:
- quant_method: bitsandbytes
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: float16

### Framework versions

- PEFT 0.5.0
CyberHarem/helen_idolmastercinderellagirls
CyberHarem
2023-09-22T05:34:20Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/helen_idolmastercinderellagirls", "license:mit", "region:us" ]
text-to-image
2023-09-22T05:25:51Z
---
license: mit
datasets:
- CyberHarem/helen_idolmastercinderellagirls
pipeline_tag: text-to-image
tags:
- art
---

# Lora of helen_idolmastercinderellagirls

This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion), and the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs).

The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11).

After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora.

For example, if you want to use the model from step 4760, you need to download `4760/helen_idolmastercinderellagirls.pt` as the embedding and `4760/helen_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters.

**The best step we recommend is 4760**, with a score of 0.996.

The trigger words are:

1. `helen_idolmastercinderellagirls`
2. `long_hair, black_hair, green_eyes, breasts, jewelry, smile, cleavage, large_breasts, card_\(medium\), character_name, gem_\(symbol\), necklace`

Using this model is not recommended for the following groups, and we express our regret:

1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail.
2. Individuals whose application scenarios place high demands on accuracy in recreating character outfits.
3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm.
4.
Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters. 5. Individuals who finds the generated image content offensive to their values. These are available steps: | Steps | Score | Download | pattern_1 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata | |:---------|:----------|:---------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------| | 5100 | 0.904 | [Download](5100/helen_idolmastercinderellagirls.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) | | **4760** | **0.996** | [**Download**](4760/helen_idolmastercinderellagirls.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | 
![yukata-4760](4760/previews/yukata.png) | | 4420 | 0.972 | [Download](4420/helen_idolmastercinderellagirls.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) | | 4080 | 0.972 | [Download](4080/helen_idolmastercinderellagirls.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) | | 3740 | 0.975 | [Download](3740/helen_idolmastercinderellagirls.zip) | ![pattern_1-3740](3740/previews/pattern_1.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) | | 3400 | 0.917 | [Download](3400/helen_idolmastercinderellagirls.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to 
see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) | | 3060 | 0.977 | [Download](3060/helen_idolmastercinderellagirls.zip) | ![pattern_1-3060](3060/previews/pattern_1.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) | | 2720 | 0.975 | [Download](2720/helen_idolmastercinderellagirls.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) | | 2380 | 0.949 | [Download](2380/helen_idolmastercinderellagirls.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) | | 2040 | 0.995 | [Download](2040/helen_idolmastercinderellagirls.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, 
click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) | | 1700 | 0.952 | [Download](1700/helen_idolmastercinderellagirls.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) | | 1360 | 0.984 | [Download](1360/helen_idolmastercinderellagirls.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![bikini-1360](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) | | 1020 | 0.980 | [Download](1020/helen_idolmastercinderellagirls.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) | | 680 | 0.979 | [Download](680/helen_idolmastercinderellagirls.zip) | ![pattern_1-680](680/previews/pattern_1.png) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | 
![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) | | 340 | 0.805 | [Download](340/helen_idolmastercinderellagirls.zip) | ![pattern_1-340](340/previews/pattern_1.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
MBZUAI-LLM/LLaMA2-7B-GLoRA-ShareGPT
MBZUAI-LLM
2023-09-22T05:19:02Z
0
2
peft
[ "peft", "pytorch", "llama", "llama2", "text-generation", "custom_code", "dataset:shareGPT", "arxiv:2306.07967", "region:us" ]
text-generation
2023-09-21T08:50:58Z
--- library_name: peft datasets: - shareGPT tags: - llama2 inference: false pipeline_tag: text-generation --- # llama2-7b-glora 🦙 This model was built via parameter-efficient GLoRA finetuning of [llama2-7b](https://huggingface.co/meta-llama/Llama-2-7b) on the shareGPT dataset. We adapt only the attention layers using GLoRA. * Model license: This model is under the same license as LLaMA2 (see the LICENSE file). * GLoRA implementation: [script](https://github.com/Arnav0400/peft/blob/main/src/peft/tuners/glora.py) ## Model Description The architecture is the same as LLaMA2-7B, except that bias is enabled in the attention layers. ## Limitations and Biases _The following language is modified from [EleutherAI's GPT-NeoX-20B](https://huggingface.co/EleutherAI/gpt-neox-20b)_ This model can produce factually incorrect output, and should not be relied on to produce factually accurate information. This model was trained on various public datasets. While great efforts have been taken to clean the pretraining data, it is possible that this model could generate lewd, biased, or otherwise offensive outputs. 
## How to Use Install and import the package dependencies: ```python !pip install -q -U huggingface_hub transformers torch accelerate ``` ```python import torch from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig ``` Basic model loading: ```python model = AutoModelForCausalLM.from_pretrained( "MBZUAI-LLM/LLaMA2-7B-GLoRA-ShareGPT", use_auth_token=True, torch_dtype=torch.bfloat16, device_map="auto", ) tokenizer = AutoTokenizer.from_pretrained("MBZUAI-LLM/LLaMA2-7B-GLoRA-ShareGPT") ``` Once loaded, the model and tokenizer can be used with the following code: ```python def llama_generate( model: AutoModelForCausalLM, tokenizer: AutoTokenizer, prompt: str, max_new_tokens: int = 128, temperature: float = 0.92, ) -> str: """ Generate text from a prompt. Uses Hugging Face GenerationConfig defaults https://huggingface.co/docs/transformers/v4.29.1/en/main_classes/text_generation#transformers.GenerationConfig Args: model (transformers.AutoModelForCausalLM): Model for text generation tokenizer (transformers.AutoTokenizer): Tokenizer for model prompt (str): Prompt for text generation max_new_tokens (int, optional): Max new tokens after the prompt to generate. Defaults to 128. temperature (float, optional): The value used to modulate the next token probabilities. Defaults to 0.92. """ device = torch.device("cuda" if torch.cuda.is_available() else "cpu") inputs = tokenizer( [prompt], return_tensors="pt", return_token_type_ids=False, ).to( device ) # tokenize inputs, load on device # when running Torch modules in lower precision, it is best practice to use the torch.autocast context manager. 
with torch.autocast("cuda", dtype=torch.bfloat16): response = model.generate( **inputs, max_new_tokens=max_new_tokens, do_sample=True,  # sampling must be enabled for temperature to take effect temperature=temperature, return_dict_in_generate=True, eos_token_id=tokenizer.eos_token_id, pad_token_id=tokenizer.pad_token_id, ) decoded_output = tokenizer.decode( response["sequences"][0], skip_special_tokens=True, ) # grab output in natural language return decoded_output[len(prompt) :] # remove prompt from output ``` We can now generate text! For example: ```python prompt = "You are a helpful assistant. Tell me a recipe for vegan banana bread.\n" response = llama_generate( model, tokenizer, prompt, max_new_tokens=500, temperature=0.92, ) print(response) ``` ## Disclaimer The license on this model does not constitute legal advice. We are not responsible for the actions of third parties who use this model. Please consult an attorney before using this model for commercial purposes. ## Citation for GLoRA ``` @misc{chavan2023oneforall, title={One-for-All: Generalized LoRA for Parameter-Efficient Fine-tuning}, author={Arnav Chavan and Zhuang Liu and Deepak Gupta and Eric Xing and Zhiqiang Shen}, year={2023}, eprint={2306.07967}, archivePrefix={arXiv}, primaryClass={cs.LG} } ``` ---
datnt114/sd-class-butterflies-32
datnt114
2023-09-22T05:05:50Z
35
0
diffusers
[ "diffusers", "pytorch", "unconditional-image-generation", "diffusion-models-class", "license:mit", "diffusers:DDPMPipeline", "region:us" ]
unconditional-image-generation
2023-09-22T04:52:06Z
--- license: mit tags: - pytorch - diffusers - unconditional-image-generation - diffusion-models-class --- # Model Card for Unit 1 of the [Diffusion Models Class 🧨](https://github.com/huggingface/diffusion-models-class) This model is a diffusion model for unconditional image generation of cute 🦋. ## Usage ```python from diffusers import DDPMPipeline pipeline = DDPMPipeline.from_pretrained('datnt114/sd-class-butterflies-32') image = pipeline().images[0] image ```
anhtu77/wav2vec2-base-vi-vlsp2020-demo
anhtu77
2023-09-22T05:01:57Z
108
0
transformers
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "generated_from_trainer", "base_model:nguyenvulebinh/wav2vec2-base-vi-vlsp2020", "base_model:finetune:nguyenvulebinh/wav2vec2-base-vi-vlsp2020", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2023-09-22T04:59:51Z
--- license: cc-by-nc-4.0 base_model: nguyenvulebinh/wav2vec2-base-vi-vlsp2020 tags: - generated_from_trainer metrics: - wer model-index: - name: wav2vec2-base-vi-vlsp2020-demo results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # wav2vec2-base-vi-vlsp2020-demo This model is a fine-tuned version of [nguyenvulebinh/wav2vec2-base-vi-vlsp2020](https://huggingface.co/nguyenvulebinh/wav2vec2-base-vi-vlsp2020) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.2509 - Wer: 0.1280 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:------:| | 0.8445 | 1.0 | 787 | 0.3264 | 0.1494 | | 0.5248 | 2.0 | 1574 | 0.2784 | 0.1365 | | 0.445 | 3.0 | 2361 | 0.2509 | 0.1280 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
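The WER values reported above are standard word error rates. As a rough illustration (the card does not include the actual evaluation script), WER is the word-level edit distance between reference and hypothesis, divided by the number of reference words:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution
    return dp[len(ref)][len(hyp)] / len(ref)

print(wer("xin chào các bạn", "xin chào cac ban"))  # 0.5 (2 substitutions / 4 words)
```

A final WER of 0.1280 thus corresponds to roughly one word error per eight reference words.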
santis2/opt-6.7b-alpaca-instruction-fine-tuning-lora
santis2
2023-09-22T05:01:34Z
0
0
null
[ "generated_from_trainer", "base_model:facebook/opt-6.7b", "base_model:finetune:facebook/opt-6.7b", "license:other", "region:us" ]
null
2023-09-22T04:14:12Z
--- license: other base_model: facebook/opt-6.7b tags: - generated_from_trainer model-index: - name: opt-6.7b-alpaca-instruction-fine-tuning-lora results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # opt-6.7b-alpaca-instruction-fine-tuning-lora This model is a fine-tuned version of [facebook/opt-6.7b](https://huggingface.co/facebook/opt-6.7b) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0005 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_steps: 1000 - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
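The schedule above combines 1000 linear warmup steps with cosine decay. A minimal sketch of that shape (a simplified stand-in for `transformers`' cosine schedule; `total_steps` here is a made-up value, since the actual number of training steps is not stated in the card):

```python
import math

def cosine_lr(step: int, base_lr: float = 5e-4,
              warmup_steps: int = 1000, total_steps: int = 10_000) -> float:
    """Linear warmup to base_lr, then cosine decay toward 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(500))     # halfway through warmup: 0.00025
print(cosine_lr(1000))    # peak learning rate: 0.0005
print(cosine_lr(10_000))  # end of schedule: ~0
```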
deeprasanth/mldpo
deeprasanth
2023-09-22T04:55:41Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-22T04:45:33Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: fp4 - bnb_4bit_use_double_quant: False - bnb_4bit_compute_dtype: float32 ### Framework versions - PEFT 0.5.0
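The quantization settings listed above can be reconstructed as a `BitsAndBytesConfig` (a sketch for reference only; the original training script is not included in this repository):

```python
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,                     # base weights loaded in 4-bit
    bnb_4bit_quant_type="fp4",             # fp4 quantization (not nf4)
    bnb_4bit_use_double_quant=False,       # no nested quantization
    bnb_4bit_compute_dtype=torch.float32,  # compute in float32
)
```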
oelu/test
oelu
2023-09-22T04:18:23Z
0
0
diffusers
[ "diffusers", "law", "en", "de", "dataset:allenai/dolma", "license:bigscience-bloom-rail-1.0", "region:us" ]
null
2023-09-22T04:10:32Z
--- license: bigscience-bloom-rail-1.0 datasets: - allenai/dolma language: - en - de metrics: - accuracy library_name: diffusers tags: - law ---
Johnkimmyshinmmy/dogresult
Johnkimmyshinmmy
2023-09-22T04:03:38Z
3
0
diffusers
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "lora", "base_model:runwayml/stable-diffusion-v1-5", "base_model:adapter:runwayml/stable-diffusion-v1-5", "license:creativeml-openrail-m", "region:us" ]
text-to-image
2023-09-22T03:54:58Z
--- license: creativeml-openrail-m base_model: runwayml/stable-diffusion-v1-5 instance_prompt: a photo of sks dog tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers - lora inference: true --- # LoRA DreamBooth - Johnkimmyshinmmy/dogresult These are LoRA adaptation weights for runwayml/stable-diffusion-v1-5. The weights were trained on a photo of sks dog using [DreamBooth](https://dreambooth.github.io/). You can find some example images below. ![img_0](./image_0.png) ![img_1](./image_1.png) ![img_2](./image_2.png) ![img_3](./image_3.png) LoRA for the text encoder was enabled: False.

CyberHarem/nonomura_sora_idolmastercinderellagirls
CyberHarem
2023-09-22T03:59:44Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/nonomura_sora_idolmastercinderellagirls", "license:mit", "region:us" ]
text-to-image
2023-09-22T03:50:13Z
--- license: mit datasets: - CyberHarem/nonomura_sora_idolmastercinderellagirls pipeline_tag: text-to-image tags: - art --- # Lora of nonomura_sora_idolmastercinderellagirls This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion). And the auto-training framework is maintained by [DeepGHS Team](https://huggingface.co/deepghs). The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11). After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora. For example, if you want to use the model from step 3740, you need to download `3740/nonomura_sora_idolmastercinderellagirls.pt` as the embedding and `3740/nonomura_sora_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters. **The best step we recommend is 3740**, with the score of 0.978. The trigger words are: 1. `nonomura_sora_idolmastercinderellagirls` 2. `long_hair, green_eyes, open_mouth, smile, twintails, breasts, black_hair, hair_ornament, blush, brown_hair, drill_hair, one_eye_closed` For the following groups, it is not recommended to use this model and we express regret: 1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail. 2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits. 3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm. 4. 
Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters. 5. Individuals who find the generated image content offensive to their values. These are the available steps: | Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata | |:---------|:----------|:-----------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:----------------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------| | 5100 | 0.959 | [Download](5100/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![pattern_2-5100](5100/previews/pattern_2.png) | ![pattern_3-5100](5100/previews/pattern_3.png) | [<NSFW, click to see>](5100/previews/pattern_4.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) | | 4760 | 0.977 | [Download](4760/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | 
![pattern_2-4760](4760/previews/pattern_2.png) | ![pattern_3-4760](4760/previews/pattern_3.png) | [<NSFW, click to see>](4760/previews/pattern_4.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) | | 4420 | 0.956 | [Download](4420/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![pattern_2-4420](4420/previews/pattern_2.png) | ![pattern_3-4420](4420/previews/pattern_3.png) | [<NSFW, click to see>](4420/previews/pattern_4.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) | | 4080 | 0.973 | [Download](4080/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![pattern_2-4080](4080/previews/pattern_2.png) | ![pattern_3-4080](4080/previews/pattern_3.png) | [<NSFW, click to see>](4080/previews/pattern_4.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) | | **3740** | **0.978** | [**Download**](3740/nonomura_sora_idolmastercinderellagirls.zip) | 
![pattern_1-3740](3740/previews/pattern_1.png) | ![pattern_2-3740](3740/previews/pattern_2.png) | ![pattern_3-3740](3740/previews/pattern_3.png) | [<NSFW, click to see>](3740/previews/pattern_4.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) | | 3400 | 0.977 | [Download](3400/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![pattern_2-3400](3400/previews/pattern_2.png) | ![pattern_3-3400](3400/previews/pattern_3.png) | [<NSFW, click to see>](3400/previews/pattern_4.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) | | 3060 | 0.969 | [Download](3060/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-3060](3060/previews/pattern_1.png) | ![pattern_2-3060](3060/previews/pattern_2.png) | ![pattern_3-3060](3060/previews/pattern_3.png) | [<NSFW, click to see>](3060/previews/pattern_4.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) | | 2720 | 0.937 | 
[Download](2720/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![pattern_2-2720](2720/previews/pattern_2.png) | ![pattern_3-2720](2720/previews/pattern_3.png) | [<NSFW, click to see>](2720/previews/pattern_4.png) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) | | 2380 | 0.951 | [Download](2380/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | ![pattern_2-2380](2380/previews/pattern_2.png) | ![pattern_3-2380](2380/previews/pattern_3.png) | [<NSFW, click to see>](2380/previews/pattern_4.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) | | 2040 | 0.967 | [Download](2040/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![pattern_2-2040](2040/previews/pattern_2.png) | ![pattern_3-2040](2040/previews/pattern_3.png) | [<NSFW, click to see>](2040/previews/pattern_4.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | 
![yukata-2040](2040/previews/yukata.png) | | 1700 | 0.908 | [Download](1700/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![pattern_2-1700](1700/previews/pattern_2.png) | ![pattern_3-1700](1700/previews/pattern_3.png) | [<NSFW, click to see>](1700/previews/pattern_4.png) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) | | 1360 | 0.914 | [Download](1360/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![pattern_2-1360](1360/previews/pattern_2.png) | ![pattern_3-1360](1360/previews/pattern_3.png) | [<NSFW, click to see>](1360/previews/pattern_4.png) | ![bikini-1360](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) | | 1020 | 0.911 | [Download](1020/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![pattern_2-1020](1020/previews/pattern_2.png) | ![pattern_3-1020](1020/previews/pattern_3.png) | [<NSFW, click to see>](1020/previews/pattern_4.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | 
![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) | | 680 | 0.918 | [Download](680/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-680](680/previews/pattern_1.png) | ![pattern_2-680](680/previews/pattern_2.png) | ![pattern_3-680](680/previews/pattern_3.png) | [<NSFW, click to see>](680/previews/pattern_4.png) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) | | 340 | 0.848 | [Download](340/nonomura_sora_idolmastercinderellagirls.zip) | ![pattern_1-340](340/previews/pattern_1.png) | ![pattern_2-340](340/previews/pattern_2.png) | ![pattern_3-340](340/previews/pattern_3.png) | [<NSFW, click to see>](340/previews/pattern_4.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
yamiletzii/NLPmodelo
yamiletzii
2023-09-22T03:56:14Z
120
0
transformers
[ "transformers", "pytorch", "tensorboard", "roberta", "text-classification", "generated_from_trainer", "dataset:glue", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-21T22:35:04Z
--- license: apache-2.0 tags: - generated_from_trainer datasets: - glue metrics: - accuracy - f1 model-index: - name: NLPmodelo results: - task: name: Text Classification type: text-classification dataset: name: glue type: glue config: mrpc split: validation args: mrpc metrics: - name: Accuracy type: accuracy value: 0.8333333333333334 - name: F1 type: f1 value: 0.8763636363636363 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # NLPmodelo This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 0.6317 - Accuracy: 0.8333 - F1: 0.8764 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.5022 | 1.09 | 500 | 0.4258 | 0.8382 | 0.8809 | | 0.3347 | 2.18 | 1000 | 0.6317 | 0.8333 | 0.8764 | ### Framework versions - Transformers 4.28.1 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
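The accuracy and F1 above come from the GLUE MRPC paraphrase task. As a minimal illustration of how the two metrics differ on a binary task (toy labels for demonstration, not the actual MRPC data):

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def f1_binary(y_true, y_pred, positive=1):
    """F1 for the positive class: 2*TP / (2*TP + FP + FN)."""
    tp = sum(t == p == positive for t, p in zip(y_true, y_pred))
    fp = sum(p == positive and t != positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    return 2 * tp / (2 * tp + fp + fn)

y_true = [1, 1, 1, 0, 0, 1]
y_pred = [1, 1, 0, 0, 1, 1]
print(accuracy(y_true, y_pred))   # 4/6 ≈ 0.667
print(f1_binary(y_true, y_pred))  # 2*3 / (2*3 + 1 + 1) = 0.75
```

MRPC is class-imbalanced toward positives, which is why the reported F1 (0.8764) is higher than the accuracy (0.8333).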
yashika0998/vit-base-patch16-224-finetuned-flower
yashika0998
2023-09-22T03:43:20Z
170
0
transformers
[ "transformers", "pytorch", "vit", "image-classification", "generated_from_trainer", "dataset:imagefolder", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2023-09-21T17:37:33Z
--- license: apache-2.0 tags: - generated_from_trainer datasets: - imagefolder model-index: - name: vit-base-patch16-224-finetuned-flower results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vit-base-patch16-224-finetuned-flower This model is a fine-tuned version of [google/vit-base-patch16-224](https://huggingface.co/google/vit-base-patch16-224) on the imagefolder dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results ### Framework versions - Transformers 4.24.0 - Pytorch 2.0.1+cu118 - Datasets 2.7.1 - Tokenizers 0.13.3
jonas-luehrs/bert-base-cased-MLM-chemistry-tokenCLS-CATALYST
jonas-luehrs
2023-09-22T03:34:03Z
105
0
transformers
[ "transformers", "pytorch", "bert", "token-classification", "generated_from_trainer", "base_model:jonas-luehrs/bert-base-cased-MLM-chemistry", "base_model:finetune:jonas-luehrs/bert-base-cased-MLM-chemistry", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2023-09-22T03:21:48Z
--- license: apache-2.0 base_model: jonas-luehrs/bert-base-cased-MLM-chemistry tags: - generated_from_trainer metrics: - precision - recall - f1 - accuracy model-index: - name: bert-base-cased-MLM-chemistry-tokenCLS-CATALYST results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-cased-MLM-chemistry-tokenCLS-CATALYST This model is a fine-tuned version of [jonas-luehrs/bert-base-cased-MLM-chemistry](https://huggingface.co/jonas-luehrs/bert-base-cased-MLM-chemistry) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0705 - Precision: 0.6134 - Recall: 0.8232 - F1: 0.7030 - Accuracy: 0.9775 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | 0.0945 | 1.0 | 1114 | 0.0880 | 0.5498 | 0.6406 | 0.5917 | 0.9700 | | 0.0622 | 2.0 | 2228 | 0.0816 | 0.6515 | 0.8725 | 0.7460 | 0.9773 | | 0.0459 | 3.0 | 3342 | 0.0705 | 0.6134 | 0.8232 | 0.7030 | 0.9775 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
jonas-luehrs/bert-base-cased-tokenCLS-CATALYST
jonas-luehrs
2023-09-22T03:18:57Z
105
0
transformers
[ "transformers", "pytorch", "bert", "token-classification", "generated_from_trainer", "base_model:google-bert/bert-base-cased", "base_model:finetune:google-bert/bert-base-cased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2023-09-20T21:26:47Z
--- license: apache-2.0 base_model: bert-base-cased tags: - generated_from_trainer metrics: - precision - recall - f1 - accuracy model-index: - name: bert-base-cased-tokenCLS-CATALYST results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-cased-tokenCLS-CATALYST This model is a fine-tuned version of [bert-base-cased](https://huggingface.co/bert-base-cased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0691 - Precision: 0.6590 - Recall: 0.8348 - F1: 0.7366 - Accuracy: 0.9762 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | 0.0963 | 1.0 | 1114 | 0.0834 | 0.5403 | 0.7391 | 0.6242 | 0.9723 | | 0.0643 | 2.0 | 2228 | 0.0858 | 0.6242 | 0.8522 | 0.7206 | 0.9779 | | 0.0478 | 3.0 | 3342 | 0.0691 | 0.6590 | 0.8348 | 0.7366 | 0.9762 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
CyberHarem/kiba_manami_idolmastercinderellagirls
CyberHarem
2023-09-22T03:12:44Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/kiba_manami_idolmastercinderellagirls", "license:mit", "region:us" ]
text-to-image
2023-09-22T03:00:37Z
--- license: mit datasets: - CyberHarem/kiba_manami_idolmastercinderellagirls pipeline_tag: text-to-image tags: - art --- # Lora of kiba_manami_idolmastercinderellagirls This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion). And the auto-training framework is maintained by [DeepGHS Team](https://huggingface.co/deepghs). The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11). After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded for Lora. For example, if you want to use the model from step 4760, you need to download `4760/kiba_manami_idolmastercinderellagirls.pt` as the embedding and `4760/kiba_manami_idolmastercinderellagirls.safetensors` for loading Lora. By using both files together, you can generate images for the desired characters. **The best step we recommend is 4760**, with the score of 0.906. The trigger words are: 1. `kiba_manami_idolmastercinderellagirls` 2. `short_hair, green_eyes, brown_hair, smile, breasts, cleavage, jewelry, large_breasts` For the following groups, it is not recommended to use this model and we express regret: 1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail. 2. Individuals who are facing the application scenarios with high demands for accuracy in recreating character outfits. 3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm. 4. Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters. 5. 
Individuals who finds the generated image content offensive to their values. These are available steps: | Steps | Score | Download | pattern_1 | pattern_2 | pattern_3 | pattern_4 | pattern_5 | pattern_6 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata | |:---------|:----------|:---------------------------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------------|:-----------------------------------------|:--------------------------------------------------|:-------------------------------------|:-------------------------------------|:-------------------------------------|:-----------------------------------------------|:------------------------------------------------|:-------------------------------------|:-----------------------------------------| | 5100 | 0.814 | [Download](5100/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![pattern_2-5100](5100/previews/pattern_2.png) | ![pattern_3-5100](5100/previews/pattern_3.png) | ![pattern_4-5100](5100/previews/pattern_4.png) | ![pattern_5-5100](5100/previews/pattern_5.png) | ![pattern_6-5100](5100/previews/pattern_6.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) | | **4760** | **0.906** | [**Download**](4760/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | 
![pattern_2-4760](4760/previews/pattern_2.png) | ![pattern_3-4760](4760/previews/pattern_3.png) | ![pattern_4-4760](4760/previews/pattern_4.png) | ![pattern_5-4760](4760/previews/pattern_5.png) | ![pattern_6-4760](4760/previews/pattern_6.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) | | 4420 | 0.854 | [Download](4420/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![pattern_2-4420](4420/previews/pattern_2.png) | ![pattern_3-4420](4420/previews/pattern_3.png) | ![pattern_4-4420](4420/previews/pattern_4.png) | ![pattern_5-4420](4420/previews/pattern_5.png) | ![pattern_6-4420](4420/previews/pattern_6.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) | | 4080 | 0.855 | [Download](4080/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![pattern_2-4080](4080/previews/pattern_2.png) | ![pattern_3-4080](4080/previews/pattern_3.png) | ![pattern_4-4080](4080/previews/pattern_4.png) | ![pattern_5-4080](4080/previews/pattern_5.png) | ![pattern_6-4080](4080/previews/pattern_6.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, 
click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) | | 3740 | 0.782 | [Download](3740/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-3740](3740/previews/pattern_1.png) | ![pattern_2-3740](3740/previews/pattern_2.png) | ![pattern_3-3740](3740/previews/pattern_3.png) | ![pattern_4-3740](3740/previews/pattern_4.png) | ![pattern_5-3740](3740/previews/pattern_5.png) | ![pattern_6-3740](3740/previews/pattern_6.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) | | 3400 | 0.873 | [Download](3400/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![pattern_2-3400](3400/previews/pattern_2.png) | ![pattern_3-3400](3400/previews/pattern_3.png) | ![pattern_4-3400](3400/previews/pattern_4.png) | ![pattern_5-3400](3400/previews/pattern_5.png) | ![pattern_6-3400](3400/previews/pattern_6.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) | | 3060 | 0.835 | [Download](3060/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-3060](3060/previews/pattern_1.png) | ![pattern_2-3060](3060/previews/pattern_2.png) | ![pattern_3-3060](3060/previews/pattern_3.png) | ![pattern_4-3060](3060/previews/pattern_4.png) | 
![pattern_5-3060](3060/previews/pattern_5.png) | ![pattern_6-3060](3060/previews/pattern_6.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) | | 2720 | 0.836 | [Download](2720/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![pattern_2-2720](2720/previews/pattern_2.png) | ![pattern_3-2720](2720/previews/pattern_3.png) | ![pattern_4-2720](2720/previews/pattern_4.png) | ![pattern_5-2720](2720/previews/pattern_5.png) | ![pattern_6-2720](2720/previews/pattern_6.png) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) | | 2380 | 0.860 | [Download](2380/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | ![pattern_2-2380](2380/previews/pattern_2.png) | ![pattern_3-2380](2380/previews/pattern_3.png) | ![pattern_4-2380](2380/previews/pattern_4.png) | ![pattern_5-2380](2380/previews/pattern_5.png) | ![pattern_6-2380](2380/previews/pattern_6.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | 
![yukata-2380](2380/previews/yukata.png) | | 2040 | 0.875 | [Download](2040/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![pattern_2-2040](2040/previews/pattern_2.png) | ![pattern_3-2040](2040/previews/pattern_3.png) | ![pattern_4-2040](2040/previews/pattern_4.png) | ![pattern_5-2040](2040/previews/pattern_5.png) | ![pattern_6-2040](2040/previews/pattern_6.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) | | 1700 | 0.902 | [Download](1700/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![pattern_2-1700](1700/previews/pattern_2.png) | ![pattern_3-1700](1700/previews/pattern_3.png) | ![pattern_4-1700](1700/previews/pattern_4.png) | ![pattern_5-1700](1700/previews/pattern_5.png) | ![pattern_6-1700](1700/previews/pattern_6.png) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) | | 1360 | 0.846 | [Download](1360/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![pattern_2-1360](1360/previews/pattern_2.png) | ![pattern_3-1360](1360/previews/pattern_3.png) | ![pattern_4-1360](1360/previews/pattern_4.png) | ![pattern_5-1360](1360/previews/pattern_5.png) | ![pattern_6-1360](1360/previews/pattern_6.png) | ![bikini-1360](1360/previews/bikini.png) | [<NSFW, 
click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) | | 1020 | 0.872 | [Download](1020/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![pattern_2-1020](1020/previews/pattern_2.png) | ![pattern_3-1020](1020/previews/pattern_3.png) | ![pattern_4-1020](1020/previews/pattern_4.png) | ![pattern_5-1020](1020/previews/pattern_5.png) | ![pattern_6-1020](1020/previews/pattern_6.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) | | 680 | 0.758 | [Download](680/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-680](680/previews/pattern_1.png) | ![pattern_2-680](680/previews/pattern_2.png) | ![pattern_3-680](680/previews/pattern_3.png) | ![pattern_4-680](680/previews/pattern_4.png) | ![pattern_5-680](680/previews/pattern_5.png) | ![pattern_6-680](680/previews/pattern_6.png) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) | | 340 | 0.608 | [Download](340/kiba_manami_idolmastercinderellagirls.zip) | ![pattern_1-340](340/previews/pattern_1.png) | 
![pattern_2-340](340/previews/pattern_2.png) | ![pattern_3-340](340/previews/pattern_3.png) | ![pattern_4-340](340/previews/pattern_4.png) | ![pattern_5-340](340/previews/pattern_5.png) | ![pattern_6-340](340/previews/pattern_6.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
keikofujii/distilbert-base-uncased-finetuned-cola
keikofujii
2023-09-22T03:06:21Z
10
0
transformers
[ "transformers", "pytorch", "distilbert", "text-classification", "generated_from_trainer", "dataset:glue", "base_model:distilbert/distilbert-base-uncased", "base_model:finetune:distilbert/distilbert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-22T02:12:52Z
--- license: apache-2.0 base_model: distilbert-base-uncased tags: - generated_from_trainer datasets: - glue metrics: - matthews_correlation model-index: - name: distilbert-base-uncased-finetuned-cola results: - task: name: Text Classification type: text-classification dataset: name: glue type: glue config: cola split: validation args: cola metrics: - name: Matthews Correlation type: matthews_correlation value: 0.5263989868108533 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-cola This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 0.8375 - Matthews Correlation: 0.5264 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | Matthews Correlation | |:-------------:|:-----:|:----:|:---------------:|:--------------------:| | 0.5282 | 1.0 | 535 | 0.4713 | 0.4719 | | 0.3535 | 2.0 | 1070 | 0.5188 | 0.5052 | | 0.2315 | 3.0 | 1605 | 0.6135 | 0.5193 | | 0.18 | 4.0 | 2140 | 0.7950 | 0.5123 | | 0.1332 | 5.0 | 2675 | 0.8375 | 0.5264 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
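Matthews correlation, the metric reported above, uses all four confusion-matrix cells, which makes it more informative than accuracy on the imbalanced CoLA labels. A minimal sketch of the formula (the counts are made up for illustration):

```python
import math

def matthews_corrcoef(tp, tn, fp, fn):
    """Matthews correlation coefficient from binary confusion-matrix counts."""
    numerator = tp * tn - fp * fn
    denominator = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return numerator / denominator if denominator else 0.0

# Made-up counts for illustration only -- not this run's predictions.
print(matthews_corrcoef(tp=60, tn=30, fp=10, fn=20))
```

A perfect classifier scores 1.0, random guessing scores around 0.0, and total disagreement scores -1.0.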
jonas-luehrs/chembert_cased-MLM-chemistry-tokenCLS-BATTERY
jonas-luehrs
2023-09-22T02:52:07Z
106
0
transformers
[ "transformers", "pytorch", "bert", "token-classification", "generated_from_trainer", "base_model:jonas-luehrs/chembert_cased-MLM-chemistry", "base_model:finetune:jonas-luehrs/chembert_cased-MLM-chemistry", "autotrain_compatible", "endpoints_compatible", "region:us" ]
token-classification
2023-09-22T02:48:33Z
--- base_model: jonas-luehrs/chembert_cased-MLM-chemistry tags: - generated_from_trainer metrics: - precision - recall - f1 - accuracy model-index: - name: chembert_cased-MLM-chemistry-tokenCLS-BATTERY results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chembert_cased-MLM-chemistry-tokenCLS-BATTERY This model is a fine-tuned version of [jonas-luehrs/chembert_cased-MLM-chemistry](https://huggingface.co/jonas-luehrs/chembert_cased-MLM-chemistry) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0659 - Precision: 0.6898 - Recall: 0.8528 - F1: 0.7627 - Accuracy: 0.9771 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:| | No log | 1.0 | 338 | 0.0783 | 0.6913 | 0.7761 | 0.7312 | 0.9728 | | 0.1439 | 2.0 | 676 | 0.0633 | 0.6970 | 0.8466 | 0.7645 | 0.9762 | | 0.0491 | 3.0 | 1014 | 0.0659 | 0.6898 | 0.8528 | 0.7627 | 0.9771 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
hipnologo/GPT-Neox-20b-QLoRA-FineTune-english_quotes_dataset
hipnologo
2023-09-22T02:52:03Z
36
0
peft
[ "peft", "text-generation-inference", "text-generation", "en", "dataset:Abirate/english_quotes", "base_model:EleutherAI/gpt-neox-20b", "base_model:adapter:EleutherAI/gpt-neox-20b", "license:apache-2.0", "region:us" ]
text-generation
2023-06-19T19:25:15Z
--- language: - en license: apache-2.0 library_name: peft tags: - text-generation-inference datasets: - Abirate/english_quotes pipeline_tag: text-generation base_model: EleutherAI/gpt-neox-20b --- # hipnologo/GPT-Neox-20b-QLoRA-FineTune-english_quotes_dataset ## Training procedure The following `bitsandbytes` quantization config was used during training: - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: bfloat16 ## Model description This model is a fine-tuned version of the `EleutherAI/gpt-neox-20b` model using the QLoRA method and the PEFT library. #### How to use The code below performs the following steps: 1. Imports the necessary libraries: `torch` and classes from the `transformers` library. 2. Specifies the `model_id` as "hipnologo/GPT-Neox-20b-QLoRA-FineTune-english_quotes_dataset". 3. Defines a `BitsAndBytesConfig` object named `bnb_config` with the following configuration: - `load_in_4bit` set to `True` - `bnb_4bit_use_double_quant` set to `True` - `bnb_4bit_quant_type` set to "nf4" - `bnb_4bit_compute_dtype` set to `torch.bfloat16` 4. Initializes an `AutoTokenizer` object named `tokenizer` by loading the tokenizer for the specified `model_id`. 5. Initializes an `AutoModelForCausalLM` object named `model` by loading the pre-trained model for the specified `model_id` and providing the `quantization_config` as `bnb_config`. The model is loaded on device `cuda:0`. 6. Defines a variable `text` with the value "Twenty years from now". 7. Defines a variable `device` with the value "cuda:0", representing the device on which the model will be executed. 8. Encodes the `text` using the `tokenizer` and converts it to a PyTorch tensor, assigning it to the `inputs` variable. The tensor is moved to the specified `device`. 9. 
Generates text using the `model.generate` method by passing the `inputs` tensor and setting the `max_new_tokens` parameter to 20. The generated output is assigned to the `outputs` variable. 10. Decodes the `outputs` tensor using the `tokenizer` to obtain the generated text without special tokens, and assigns it to the `generated_text` variable. 11. Prints the `generated_text`. ```python import torch from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig # Load the base pre-trained model base_model_id = "EleutherAI/gpt-neox-20b" tokenizer = AutoTokenizer.from_pretrained(base_model_id) model = AutoModelForCausalLM.from_pretrained(base_model_id) # Fine-tuning model model_id = "hipnologo/GPT-Neox-20b-QLoRA-FineTune-english_quotes_dataset" bnb_config = BitsAndBytesConfig( load_in_4bit=True, bnb_4bit_use_double_quant=True, bnb_4bit_quant_type="nf4", bnb_4bit_compute_dtype=torch.bfloat16 ) # Load the fine-tuned model model = AutoModelForCausalLM.from_pretrained(model_id, quantization_config=bnb_config, device_map={"":0}) text = "Twenty years from now" device = "cuda:0" inputs = tokenizer(text, return_tensors="pt").to(device) outputs = model.generate(**inputs, max_new_tokens=20) generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True) print(generated_text) ``` ### Framework versions - PEFT 0.4.0.dev0 ## Training procedure - Trainable params: 8650752 - all params: 10597552128 - trainable%: 0.08162971878329976 ## License This model is licensed under Apache 2.0. Please see the [LICENSE](https://www.apache.org/licenses/LICENSE-2.0) for more information.
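The trainable-parameter percentage listed above follows directly from the two reported counts; a quick sanity check of that arithmetic:

```python
def trainable_percent(trainable_params: int, all_params: int) -> float:
    """Percentage of parameters updated when training only the LoRA adapter."""
    return 100 * trainable_params / all_params

# Counts as reported in the card above.
pct = trainable_percent(8650752, 10597552128)
print(f"trainable%: {pct}")
```

Under 0.1% of the 10.6B parameters are trained, which is the point of the QLoRA approach.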
hipnologo/llama-2-7b-hf-finetune-oa-guanaco-small
hipnologo
2023-09-22T02:51:40Z
9
0
peft
[ "peft", "pytorch", "llama", "llama-2", "text-generation", "en", "dataset:mychen76/small_openassistant-guanaco", "base_model:meta-llama/Llama-2-7b-hf", "base_model:adapter:meta-llama/Llama-2-7b-hf", "8-bit", "region:us" ]
text-generation
2023-08-15T06:01:00Z
--- language: - en library_name: peft tags: - llama-2 datasets: - mychen76/small_openassistant-guanaco pipeline_tag: text-generation base_model: meta-llama/Llama-2-7b-hf --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: fp4 - bnb_4bit_use_double_quant: False - bnb_4bit_compute_dtype: float32 ### Framework versions - PEFT 0.4.0
hipnologo/falcon-7b-qlora-finetune-chatbot
hipnologo
2023-09-22T02:51:15Z
8
1
peft
[ "peft", "text-generation-inference", "text-generation", "en", "dataset:hipnologo/Ecommerce-FAQ-Chatbot-Dataset", "base_model:tiiuae/falcon-7b", "base_model:adapter:tiiuae/falcon-7b", "license:apache-2.0", "region:us" ]
text-generation
2023-06-19T21:30:40Z
--- language: - en license: apache-2.0 library_name: peft tags: - text-generation-inference datasets: - hipnologo/Ecommerce-FAQ-Chatbot-Dataset pipeline_tag: text-generation base_model: tiiuae/falcon-7b --- # Falcon 7B LLM Fine Tune Model ## Model description This model is a fine-tuned version of the `tiiuae/falcon-7b` model using the QLoRA method and the PEFT library. ## Intended uses & limitations #### How to use - The model and tokenizer are loaded using the `from_pretrained` methods. - The padding token of the tokenizer is set to be the same as the end-of-sentence (EOS) token. - The `generation_config` is used to set parameters for generating responses, such as the maximum number of new tokens to generate and the temperature for the softmax function. - The prompt is defined, encoded using the tokenizer, and passed to the `model.generate` method to generate a response. - The generated response is decoded using the tokenizer and printed. ```python # Import necessary classes and functions import torch from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig from peft import PeftConfig, PeftModel # Specify the model PEFT_MODEL = "hipnologo/falcon-7b-qlora-finetune-chatbot" # Load the PEFT config config = PeftConfig.from_pretrained(PEFT_MODEL) # Define the quantization config (matching the training config listed below) bnb_config = BitsAndBytesConfig( load_in_4bit=True, bnb_4bit_quant_type="nf4", bnb_4bit_use_double_quant=True, bnb_4bit_compute_dtype=torch.bfloat16, ) # Load the base model and tokenizer model = AutoModelForCausalLM.from_pretrained( config.base_model_name_or_path, return_dict=True, quantization_config=bnb_config, device_map="auto", trust_remote_code=True, ) tokenizer = AutoTokenizer.from_pretrained(config.base_model_name_or_path) # Set the padding token to be the same as the EOS token tokenizer.pad_token = tokenizer.eos_token # Load the PEFT model model = PeftModel.from_pretrained(model, PEFT_MODEL) # Set the generation parameters generation_config = model.generation_config generation_config.max_new_tokens = 200 generation_config.temperature = 0.7 generation_config.top_p = 0.7 generation_config.num_return_sequences = 1 generation_config.pad_token_id = tokenizer.eos_token_id 
generation_config.eos_token_id = tokenizer.eos_token_id # Define the prompt prompt = """ <human>: How can I create an account? <assistant>: """.strip() print(prompt) # Encode the prompt encoding = tokenizer(prompt, return_tensors="pt").to(model.device) # Generate a response with torch.inference_mode(): outputs = model.generate( input_ids=encoding.input_ids, attention_mask=encoding.attention_mask, generation_config=generation_config, ) # Print the generated response print(tokenizer.decode(outputs[0],skip_special_tokens=True)) ``` ## Training procedure The model was fine-tuned on the [Ecommerce-FAQ-Chatbot-Dataset](https://kaggle.com/datasets/saadmakhdoom/ecommerce-faq-chatbot-dataset) using the `bitsandbytes` quantization config: - load_in_8bit: `False` - load_in_4bit: `True` - llm_int8_threshold: `6.0` - llm_int8_skip_modules: `None` - llm_int8_enable_fp32_cpu_offload: `False` - llm_int8_has_fp16_weight: `False` - bnb_4bit_quant_type: `nf4` - bnb_4bit_use_double_quant: `True` - bnb_4bit_compute_dtype: `bfloat16` ### Framework versions - PEFT 0.4.0.dev0 ## Evaluation results The model was trained for 80 steps, with the training loss decreasing from 0.184 to nearly 0. The final training loss was `0.03094411873175886`. - Trainable params: 2359296 - All params: 3611104128 - Trainable%: 0.06533447711203746 ## License This model is licensed under Apache 2.0. Please see the [LICENSE](https://www.apache.org/licenses/LICENSE-2.0) for more information.
jonas-luehrs/chembert_cased-MLM-chemistry-textCLS-RHEOLOGY
jonas-luehrs
2023-09-22T02:47:55Z
105
0
transformers
[ "transformers", "pytorch", "bert", "text-classification", "generated_from_trainer", "base_model:jonas-luehrs/chembert_cased-MLM-chemistry", "base_model:finetune:jonas-luehrs/chembert_cased-MLM-chemistry", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-22T02:46:07Z
--- base_model: jonas-luehrs/chembert_cased-MLM-chemistry tags: - generated_from_trainer metrics: - f1 - precision - recall - accuracy model-index: - name: chembert_cased-MLM-chemistry-textCLS-RHEOLOGY results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chembert_cased-MLM-chemistry-textCLS-RHEOLOGY This model is a fine-tuned version of [jonas-luehrs/chembert_cased-MLM-chemistry](https://huggingface.co/jonas-luehrs/chembert_cased-MLM-chemistry) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.5925 - F1: 0.7527 - Precision: 0.7836 - Recall: 0.7716 - Accuracy: 0.7716 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | Precision | Recall | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------:|:---------:|:------:|:--------:| | 1.1553 | 1.0 | 46 | 0.8494 | 0.6731 | 0.6580 | 0.7099 | 0.7099 | | 0.7613 | 2.0 | 92 | 0.6545 | 0.7297 | 0.7155 | 0.7593 | 0.7593 | | 0.5792 | 3.0 | 138 | 0.5925 | 0.7527 | 0.7836 | 0.7716 | 0.7716 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
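### Computing the reported metrics (illustrative)

The F1, precision, and recall above are aggregate scores over the evaluation set; the identical Recall and Accuracy columns suggest support-weighted averaging, where weighted recall coincides with accuracy. A minimal, dependency-free sketch of how such support-weighted scores are computed — a generic illustration, not the evaluation code the Trainer actually ran:

```python
from collections import Counter

def weighted_prf(y_true, y_pred):
    """Accuracy plus support-weighted precision/recall/F1 over all labels."""
    labels = sorted(set(y_true) | set(y_pred))
    support = Counter(y_true)
    n = len(y_true)
    acc = sum(t == p for t, p in zip(y_true, y_pred)) / n
    prec = rec = f1 = 0.0
    for lab in labels:
        tp = sum(t == lab and p == lab for t, p in zip(y_true, y_pred))
        fp = sum(t != lab and p == lab for t, p in zip(y_true, y_pred))
        fn = sum(t == lab and p != lab for t, p in zip(y_true, y_pred))
        p_l = tp / (tp + fp) if tp + fp else 0.0
        r_l = tp / (tp + fn) if tp + fn else 0.0
        f_l = 2 * p_l * r_l / (p_l + r_l) if p_l + r_l else 0.0
        w = support[lab] / n  # weight each label by its true-label support
        prec += w * p_l
        rec += w * r_l
        f1 += w * f_l
    return acc, prec, rec, f1

# Toy three-class example
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
acc, prec, rec, f1 = weighted_prf(y_true, y_pred)
```

Note that with this weighting, `rec` always equals `acc`, which is the pattern visible in the results table above.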
jonas-luehrs/chembert_cased-MLM-chemistry
jonas-luehrs
2023-09-22T02:42:48Z
127
0
transformers
[ "transformers", "pytorch", "bert", "fill-mask", "generated_from_trainer", "base_model:jiangg/chembert_cased", "base_model:finetune:jiangg/chembert_cased", "autotrain_compatible", "endpoints_compatible", "region:us" ]
fill-mask
2023-09-22T02:08:09Z
--- base_model: jiangg/chembert_cased tags: - generated_from_trainer model-index: - name: chembert_cased-MLM-chemistry results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # chembert_cased-MLM-chemistry This model is a fine-tuned version of [jiangg/chembert_cased](https://huggingface.co/jiangg/chembert_cased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 2.1485 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 2.6478 | 1.0 | 683 | 2.3777 | | 2.4088 | 2.0 | 1366 | 2.2350 | | 2.3112 | 3.0 | 2049 | 2.1976 | | 2.2641 | 4.0 | 2732 | 2.1444 | | 2.2392 | 5.0 | 3415 | 2.1437 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
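### Reading the loss as perplexity (illustrative)

Since the evaluation loss of a masked-language model is an average cross-entropy (in nats), a pseudo-perplexity can be read off directly as `exp(loss)`. A small sketch using the validation losses from the table above:

```python
import math

def perplexity(cross_entropy_loss: float) -> float:
    # Perplexity is the exponential of the average cross-entropy loss (in nats)
    return math.exp(cross_entropy_loss)

# Validation losses from the training results table
for epoch, loss in [(1, 2.3777), (5, 2.1437)]:
    print(f"epoch {epoch}: loss={loss:.4f} -> perplexity={perplexity(loss):.2f}")

# Final reported evaluation loss of 2.1485
final = perplexity(2.1485)
```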
jonas-luehrs/bert-base-cased-MLM-chemistry-textCLS-RHEOLOGY
jonas-luehrs
2023-09-22T02:36:37Z
105
0
transformers
[ "transformers", "pytorch", "bert", "text-classification", "generated_from_trainer", "base_model:jonas-luehrs/bert-base-cased-MLM-chemistry", "base_model:finetune:jonas-luehrs/bert-base-cased-MLM-chemistry", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-22T02:20:48Z
--- license: apache-2.0 base_model: jonas-luehrs/bert-base-cased-MLM-chemistry tags: - generated_from_trainer metrics: - f1 - precision - recall - accuracy model-index: - name: bert-base-cased-MLM-chemistry-textCLS-RHEOLOGY results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-cased-MLM-chemistry-textCLS-RHEOLOGY This model is a fine-tuned version of [jonas-luehrs/bert-base-cased-MLM-chemistry](https://huggingface.co/jonas-luehrs/bert-base-cased-MLM-chemistry) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.7384 - F1: 0.7045 - Precision: 0.6979 - Recall: 0.7284 - Accuracy: 0.7284 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | F1 | Precision | Recall | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:------:|:---------:|:------:|:--------:| | 1.2604 | 1.0 | 46 | 1.0089 | 0.5830 | 0.5536 | 0.6173 | 0.6173 | | 0.8074 | 2.0 | 92 | 0.7888 | 0.6577 | 0.6702 | 0.6852 | 0.6852 | | 0.6246 | 3.0 | 138 | 0.7384 | 0.7045 | 0.6979 | 0.7284 | 0.7284 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
INo0121/whisper-base-ko-callvoice
INo0121
2023-09-22T02:35:51Z
81
2
transformers
[ "transformers", "pytorch", "whisper", "automatic-speech-recognition", "hf-asr-leaderboard", "generated_from_trainer", "ko", "dataset:INo0121/low_quality_call_voice", "base_model:openai/whisper-base", "base_model:finetune:openai/whisper-base", "license:apache-2.0", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2023-09-07T01:50:46Z
--- language: - ko license: apache-2.0 base_model: openai/whisper-base tags: - hf-asr-leaderboard - generated_from_trainer datasets: - INo0121/low_quality_call_voice model-index: - name: Whisper Base for Korean Low Quality Call Voices results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Whisper Base for Korean Low Quality Call Voices This model is a fine-tuned version of [openai/whisper-base](https://huggingface.co/openai/whisper-base) on the Korean Low Quality Call Voices dataset. It achieves the following results on the evaluation set: - Loss: 0.4941 - Cer: 30.7538 ## Model description 프로젝트 용도로 파인튜닝된 모델입니다. OpenAI의 Whisper-Base 모델을 바탕으로 '한국어 저음질 음성 통화 데이터'에 대한 정확도를 증가시키고자 파인튜닝을 진행한 모델이며, 사용한 데이터는 AI-HUB의 ‘저음질 전화망 음성인식 데이터’ 중 일부로서 오디오 파일 기준 240,771.06초(파일 1개당 평균 길이는 약 5.296초) 텍스트 데이터 기준 총 1,696,414글자의 크기입니다. This model was fine-tuned for project use. Starting from OpenAI's Whisper-Base model, it was fine-tuned to increase accuracy on Korean low-quality voice call data. The data used is part of AI-HUB's 'low-quality telephone network speech recognition data': 240,771.06 seconds of audio (average length per file about 5.296 seconds) and a total of 1,696,414 characters of transcript text. ## Intended uses & limitations 파인튜닝에 사용된 Base model과 dataset 모두 학습 목적으로 사용하였으며, 따라서 본 모델 역시 학습 목적으로만 사용 가능합니다. Both the base model and the dataset used for fine-tuning were used for learning purposes, so this model may likewise only be used for learning purposes. 
## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - training_steps: 8000 ### Training results | Training Loss | Epoch | Step | Validation Loss | Cer | |:-------------:|:-----:|:----:|:---------------:|:-------:| | 0.6416 | 0.44 | 1000 | 0.6564 | 64.1489 | | 0.5914 | 0.88 | 2000 | 0.5688 | 37.4957 | | 0.435 | 1.32 | 3000 | 0.5349 | 32.6734 | | 0.4056 | 1.76 | 4000 | 0.5124 | 30.9065 | | 0.3368 | 2.2 | 5000 | 0.5057 | 32.6925 | | 0.3107 | 2.64 | 6000 | 0.4979 | 32.8315 | | 0.3016 | 3.08 | 7000 | 0.4947 | 29.3060 | | 0.2979 | 3.52 | 8000 | 0.4941 | 30.7538 | ### Framework versions - Transformers 4.34.0.dev0 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
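### How CER is computed (illustrative)

The Cer value above is the character error rate: the character-level Levenshtein (edit) distance between hypothesis and reference, divided by the reference length, reported as a percentage. A dependency-free sketch of that computation — illustrative only, not the exact metric implementation used during training:

```python
def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: edit distance over reference length (reference non-empty)."""
    r, h = list(reference), list(hypothesis)
    # Standard dynamic-programming edit distance over characters
    prev = list(range(len(h) + 1))
    for i, rc in enumerate(r, 1):
        cur = [i]
        for j, hc in enumerate(h, 1):
            cost = 0 if rc == hc else 1
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + cost))  # substitution (or match)
        prev = cur
    return prev[-1] / len(r)

# One substitution ('e' -> 'x') and one deletion ('o') against an
# 11-character reference gives 2/11, i.e. ~18.18% CER
score = 100 * cer("hello world", "hxllo wrld")
```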
psm151/wav2vec2-large-xlsr-turkish-demo-colab
psm151
2023-09-22T02:31:08Z
106
0
transformers
[ "transformers", "pytorch", "wav2vec2", "automatic-speech-recognition", "generated_from_trainer", "base_model:facebook/wav2vec2-base", "base_model:finetune:facebook/wav2vec2-base", "license:apache-2.0", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2023-09-18T07:45:46Z
--- license: apache-2.0 base_model: facebook/wav2vec2-base tags: - generated_from_trainer metrics: - wer model-index: - name: wav2vec2-large-xlsr-turkish-demo-colab results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # wav2vec2-large-xlsr-turkish-demo-colab This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4659 - Wer: 0.3465 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 32 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 1000 - num_epochs: 30 ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:------:| | 3.4507 | 4.0 | 500 | 1.4016 | 0.9587 | | 0.5928 | 8.0 | 1000 | 0.4211 | 0.4360 | | 0.2228 | 12.0 | 1500 | 0.4629 | 0.3856 | | 0.1292 | 16.0 | 2000 | 0.4531 | 0.3654 | | 0.0898 | 20.0 | 2500 | 0.4969 | 0.3700 | | 0.0613 | 24.0 | 3000 | 0.4731 | 0.3542 | | 0.046 | 28.0 | 3500 | 0.4659 | 0.3465 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 1.18.3 - Tokenizers 0.13.3
weishuai-4670/textual_inversion_style_2
weishuai-4670
2023-09-22T02:22:27Z
13
0
diffusers
[ "diffusers", "tensorboard", "safetensors", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "textual_inversion", "base_model:runwayml/stable-diffusion-v1-5", "base_model:adapter:runwayml/stable-diffusion-v1-5", "license:creativeml-openrail-m", "autotrain_compatible", "endpoints_compatible", "diffusers:StableDiffusionPipeline", "region:us" ]
text-to-image
2023-09-20T03:04:57Z
--- license: creativeml-openrail-m base_model: runwayml/stable-diffusion-v1-5 tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers - textual_inversion inference: true --- # Textual inversion text2image fine-tuning - weishuai-4670/textual_inversion_style_2 These are textual inversion adaptation weights for runwayml/stable-diffusion-v1-5. You can find some example images below.
reeen115/lora_output_2-1
reeen115
2023-09-22T02:10:42Z
1
0
diffusers
[ "diffusers", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "lora", "base_model:stabilityai/stable-diffusion-2-1", "base_model:adapter:stabilityai/stable-diffusion-2-1", "license:creativeml-openrail-m", "region:us" ]
text-to-image
2023-09-22T01:49:29Z
--- license: creativeml-openrail-m base_model: stabilityai/stable-diffusion-2-1 instance_prompt: cardboards, grayscale tags: - stable-diffusion - stable-diffusion-diffusers - text-to-image - diffusers - lora inference: true --- # LoRA DreamBooth - reeen115/lora_output_2-1 These are LoRA adaptation weights for stabilityai/stable-diffusion-2-1. The weights were trained on the instance prompt "cardboards, grayscale" using [DreamBooth](https://dreambooth.github.io/). You can find some example images below. LoRA for the text encoder was enabled: False.
tranvancuong2597/dqn-SpaceInvadersNoFrameskip-v4
tranvancuong2597
2023-09-22T02:05:01Z
0
0
stable-baselines3
[ "stable-baselines3", "SpaceInvadersNoFrameskip-v4", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
reinforcement-learning
2023-09-22T01:23:31Z
--- library_name: stable-baselines3 tags: - SpaceInvadersNoFrameskip-v4 - deep-reinforcement-learning - reinforcement-learning - stable-baselines3 model-index: - name: DQN results: - task: type: reinforcement-learning name: reinforcement-learning dataset: name: SpaceInvadersNoFrameskip-v4 type: SpaceInvadersNoFrameskip-v4 metrics: - type: mean_reward value: 786.50 +/- 281.02 name: mean_reward verified: false --- # **DQN** Agent playing **SpaceInvadersNoFrameskip-v4** This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3) and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo). The RL Zoo is a training framework for Stable Baselines3 reinforcement learning agents, with hyperparameter optimization and pre-trained agents included. ## Usage (with SB3 RL Zoo) RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/> SB3: https://github.com/DLR-RM/stable-baselines3<br/> SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib Install the RL Zoo (with SB3 and SB3-Contrib): ```bash pip install rl_zoo3 ``` ``` # Download model and save it into the logs/ folder python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga tranvancuong2597 -f logs/ python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ ``` If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do: ``` python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga tranvancuong2597 -f logs/ python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ ``` ## Training (with the RL Zoo) ``` python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ # Upload the model and generate video (when possible) python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga tranvancuong2597 ``` ## Hyperparameters ```python 
OrderedDict([('batch_size', 32), ('buffer_size', 100000), ('env_wrapper', ['stable_baselines3.common.atari_wrappers.AtariWrapper']), ('exploration_final_eps', 0.01), ('exploration_fraction', 0.1), ('frame_stack', 4), ('gradient_steps', 1), ('learning_rate', 0.0001), ('learning_starts', 100000), ('n_timesteps', 10000000.0), ('optimize_memory_usage', False), ('policy', 'CnnPolicy'), ('target_update_interval', 1000), ('train_freq', 4), ('normalize', False)]) ``` # Environment Arguments ```python {'render_mode': 'rgb_array'} ```
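# Exploration Schedule (illustrative)

The `exploration_fraction` and `exploration_final_eps` hyperparameters above define DQN's linear ε-greedy schedule: ε is annealed from 1.0 to 0.01 over the first 10% of the 10,000,000 training steps and held constant afterwards. A rough sketch of that schedule — an illustration of the behaviour, not SB3's internal implementation:

```python
def epsilon(step: int,
            n_timesteps: float = 10_000_000.0,
            exploration_fraction: float = 0.1,
            final_eps: float = 0.01,
            initial_eps: float = 1.0) -> float:
    """Linearly anneal epsilon over the first exploration_fraction of training."""
    decay_steps = exploration_fraction * n_timesteps
    progress = min(step / decay_steps, 1.0)  # clamp after the decay phase
    return initial_eps + progress * (final_eps - initial_eps)

# Fully random at the start, halfway through the decay, and after it ends
start, mid, end = epsilon(0), epsilon(500_000), epsilon(2_000_000)
```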
Lambang0902/face_shape_classification
Lambang0902
2023-09-22T01:56:29Z
0
0
null
[ "arxiv:1910.09700", "region:us" ]
null
2023-09-22T01:41:17Z
--- # For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/modelcard.md?plain=1 # Doc / guide: https://huggingface.co/docs/hub/model-cards {} --- # Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> This modelcard aims to be a base template for new models. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/modelcard_template.md?plain=1). ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. 
--> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Data Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Data Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
chansuga/swin-tiny-patch4-window7-224-finetuned-eurosat
chansuga
2023-09-22T01:55:30Z
187
0
transformers
[ "transformers", "pytorch", "tensorboard", "swin", "image-classification", "generated_from_trainer", "dataset:imagefolder", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2023-09-14T05:57:22Z
--- license: apache-2.0 tags: - generated_from_trainer datasets: - imagefolder metrics: - accuracy model-index: - name: swin-tiny-patch4-window7-224-finetuned-eurosat results: - task: name: Image Classification type: image-classification dataset: name: imagefolder type: imagefolder args: default metrics: - name: Accuracy type: accuracy value: 0.9777777777777777 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-eurosat This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 0.0628 - Accuracy: 0.9778 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.2799 | 1.0 | 190 | 0.1301 | 0.9574 | | 0.1848 | 2.0 | 380 | 0.0803 | 0.9711 | | 0.1504 | 3.0 | 570 | 0.0628 | 0.9778 | ### Framework versions - Transformers 4.17.0 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.14.0
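### Effective batch size (illustrative)

The `total_train_batch_size` of 128 above is the product of the per-step batch size (32) and the gradient accumulation steps (4): gradients from four micro-batches are averaged before each optimizer step. A small sketch of the arithmetic — a generic illustration, not the Trainer's code:

```python
# Effective batch size under gradient accumulation
train_batch_size = 32
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps

# Each micro-batch loss is scaled by 1/steps, so the accumulated gradient
# matches what a single pass over the full effective batch would produce.
micro_losses = [0.30, 0.28, 0.26, 0.32]
accumulated = sum(l / gradient_accumulation_steps for l in micro_losses)
```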
ahirtonlopes/videomae-base-finetuned-ucf101-subset
ahirtonlopes
2023-09-22T01:46:48Z
59
0
transformers
[ "transformers", "pytorch", "videomae", "video-classification", "generated_from_trainer", "base_model:MCG-NJU/videomae-base", "base_model:finetune:MCG-NJU/videomae-base", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
video-classification
2023-08-31T16:22:57Z
--- license: cc-by-nc-4.0 base_model: MCG-NJU/videomae-base tags: - generated_from_trainer metrics: - accuracy model-index: - name: videomae-base-finetuned-ucf101-subset results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # videomae-base-finetuned-ucf101-subset This model is a fine-tuned version of [MCG-NJU/videomae-base](https://huggingface.co/MCG-NJU/videomae-base) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.4043 - Accuracy: 0.8968 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - training_steps: 300 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.835 | 0.25 | 75 | 1.5384 | 0.4143 | | 0.5822 | 1.25 | 150 | 0.6968 | 0.8 | | 0.2967 | 2.25 | 225 | 0.3601 | 0.9143 | | 0.0952 | 3.25 | 300 | 0.2773 | 0.9 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
CyberHarem/oohara_michiru_idolmastercinderellagirls
CyberHarem
2023-09-22T01:36:06Z
0
0
null
[ "art", "text-to-image", "dataset:CyberHarem/oohara_michiru_idolmastercinderellagirls", "license:mit", "region:us" ]
text-to-image
2023-09-22T01:26:43Z
--- license: mit datasets: - CyberHarem/oohara_michiru_idolmastercinderellagirls pipeline_tag: text-to-image tags: - art --- # Lora of oohara_michiru_idolmastercinderellagirls This model is trained with [HCP-Diffusion](https://github.com/7eu7d7/HCP-Diffusion), and the auto-training framework is maintained by the [DeepGHS Team](https://huggingface.co/deepghs). The base model used during training is [NAI](https://huggingface.co/deepghs/animefull-latest), and the base model used for generating preview images is [Meina/MeinaMix_V11](https://huggingface.co/Meina/MeinaMix_V11). After downloading the pt and safetensors files for the specified step, you need to use them simultaneously. The pt file will be used as an embedding, while the safetensors file will be loaded as a LoRA. For example, if you want to use the model from step 4420, you need to download `4420/oohara_michiru_idolmastercinderellagirls.pt` as the embedding and `4420/oohara_michiru_idolmastercinderellagirls.safetensors` as the LoRA. By using both files together, you can generate images of the desired character. **The best step we recommend is 4420**, with a score of 0.974. The trigger words are: 1. `oohara_michiru_idolmastercinderellagirls` 2. `brown_hair, drill_hair, smile, food, pink_eyes, fang, open_mouth, bow, hair_ornament` This model is not recommended for the following groups, with our regrets: 1. Individuals who cannot tolerate any deviations from the original character design, even in the slightest detail. 2. Individuals whose application scenarios demand high accuracy in recreating character outfits. 3. Individuals who cannot accept the potential randomness in AI-generated images based on the Stable Diffusion algorithm. 4. 
Individuals who are not comfortable with the fully automated process of training character models using LoRA, or those who believe that training character models must be done purely through manual operations to avoid disrespecting the characters.
5. Individuals who find the generated image content offensive to their values.

These are the available steps:

| Steps | Score | Download | pattern_1 | bikini | bondage | free | maid | miko | nude | nude2 | suit | yukata |
|:------|:------|:---------|:----------|:-------|:--------|:-----|:-----|:-----|:-----|:------|:-----|:-------|
| 5100 | 0.971 | [Download](5100/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-5100](5100/previews/pattern_1.png) | ![bikini-5100](5100/previews/bikini.png) | [<NSFW, click to see>](5100/previews/bondage.png) | ![free-5100](5100/previews/free.png) | ![maid-5100](5100/previews/maid.png) | ![miko-5100](5100/previews/miko.png) | [<NSFW, click to see>](5100/previews/nude.png) | [<NSFW, click to see>](5100/previews/nude2.png) | ![suit-5100](5100/previews/suit.png) | ![yukata-5100](5100/previews/yukata.png) |
| 4760 | 0.971 | [Download](4760/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-4760](4760/previews/pattern_1.png) | ![bikini-4760](4760/previews/bikini.png) | [<NSFW, click to see>](4760/previews/bondage.png) | ![free-4760](4760/previews/free.png) | ![maid-4760](4760/previews/maid.png) | ![miko-4760](4760/previews/miko.png) | [<NSFW, click to see>](4760/previews/nude.png) | [<NSFW, click to see>](4760/previews/nude2.png) | ![suit-4760](4760/previews/suit.png) | ![yukata-4760](4760/previews/yukata.png) |
| **4420** | **0.974** | [**Download**](4420/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-4420](4420/previews/pattern_1.png) | ![bikini-4420](4420/previews/bikini.png) | [<NSFW, click to see>](4420/previews/bondage.png) | ![free-4420](4420/previews/free.png) | ![maid-4420](4420/previews/maid.png) | ![miko-4420](4420/previews/miko.png) | [<NSFW, click to see>](4420/previews/nude.png) | [<NSFW, click to see>](4420/previews/nude2.png) | ![suit-4420](4420/previews/suit.png) | ![yukata-4420](4420/previews/yukata.png) |
| 4080 | 0.870 | [Download](4080/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-4080](4080/previews/pattern_1.png) | ![bikini-4080](4080/previews/bikini.png) | [<NSFW, click to see>](4080/previews/bondage.png) | ![free-4080](4080/previews/free.png) | ![maid-4080](4080/previews/maid.png) | ![miko-4080](4080/previews/miko.png) | [<NSFW, click to see>](4080/previews/nude.png) | [<NSFW, click to see>](4080/previews/nude2.png) | ![suit-4080](4080/previews/suit.png) | ![yukata-4080](4080/previews/yukata.png) |
| 3740 | 0.972 | [Download](3740/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-3740](3740/previews/pattern_1.png) | ![bikini-3740](3740/previews/bikini.png) | [<NSFW, click to see>](3740/previews/bondage.png) | ![free-3740](3740/previews/free.png) | ![maid-3740](3740/previews/maid.png) | ![miko-3740](3740/previews/miko.png) | [<NSFW, click to see>](3740/previews/nude.png) | [<NSFW, click to see>](3740/previews/nude2.png) | ![suit-3740](3740/previews/suit.png) | ![yukata-3740](3740/previews/yukata.png) |
| 3400 | 0.967 | [Download](3400/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-3400](3400/previews/pattern_1.png) | ![bikini-3400](3400/previews/bikini.png) | [<NSFW, click to see>](3400/previews/bondage.png) | ![free-3400](3400/previews/free.png) | ![maid-3400](3400/previews/maid.png) | ![miko-3400](3400/previews/miko.png) | [<NSFW, click to see>](3400/previews/nude.png) | [<NSFW, click to see>](3400/previews/nude2.png) | ![suit-3400](3400/previews/suit.png) | ![yukata-3400](3400/previews/yukata.png) |
| 3060 | 0.964 | [Download](3060/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-3060](3060/previews/pattern_1.png) | ![bikini-3060](3060/previews/bikini.png) | [<NSFW, click to see>](3060/previews/bondage.png) | ![free-3060](3060/previews/free.png) | ![maid-3060](3060/previews/maid.png) | ![miko-3060](3060/previews/miko.png) | [<NSFW, click to see>](3060/previews/nude.png) | [<NSFW, click to see>](3060/previews/nude2.png) | ![suit-3060](3060/previews/suit.png) | ![yukata-3060](3060/previews/yukata.png) |
| 2720 | 0.927 | [Download](2720/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-2720](2720/previews/pattern_1.png) | ![bikini-2720](2720/previews/bikini.png) | [<NSFW, click to see>](2720/previews/bondage.png) | ![free-2720](2720/previews/free.png) | ![maid-2720](2720/previews/maid.png) | ![miko-2720](2720/previews/miko.png) | [<NSFW, click to see>](2720/previews/nude.png) | [<NSFW, click to see>](2720/previews/nude2.png) | ![suit-2720](2720/previews/suit.png) | ![yukata-2720](2720/previews/yukata.png) |
| 2380 | 0.957 | [Download](2380/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-2380](2380/previews/pattern_1.png) | ![bikini-2380](2380/previews/bikini.png) | [<NSFW, click to see>](2380/previews/bondage.png) | ![free-2380](2380/previews/free.png) | ![maid-2380](2380/previews/maid.png) | ![miko-2380](2380/previews/miko.png) | [<NSFW, click to see>](2380/previews/nude.png) | [<NSFW, click to see>](2380/previews/nude2.png) | ![suit-2380](2380/previews/suit.png) | ![yukata-2380](2380/previews/yukata.png) |
| 2040 | 0.912 | [Download](2040/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-2040](2040/previews/pattern_1.png) | ![bikini-2040](2040/previews/bikini.png) | [<NSFW, click to see>](2040/previews/bondage.png) | ![free-2040](2040/previews/free.png) | ![maid-2040](2040/previews/maid.png) | ![miko-2040](2040/previews/miko.png) | [<NSFW, click to see>](2040/previews/nude.png) | [<NSFW, click to see>](2040/previews/nude2.png) | ![suit-2040](2040/previews/suit.png) | ![yukata-2040](2040/previews/yukata.png) |
| 1700 | 0.788 | [Download](1700/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-1700](1700/previews/pattern_1.png) | ![bikini-1700](1700/previews/bikini.png) | [<NSFW, click to see>](1700/previews/bondage.png) | ![free-1700](1700/previews/free.png) | ![maid-1700](1700/previews/maid.png) | ![miko-1700](1700/previews/miko.png) | [<NSFW, click to see>](1700/previews/nude.png) | [<NSFW, click to see>](1700/previews/nude2.png) | ![suit-1700](1700/previews/suit.png) | ![yukata-1700](1700/previews/yukata.png) |
| 1360 | 0.755 | [Download](1360/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-1360](1360/previews/pattern_1.png) | ![bikini-1360](1360/previews/bikini.png) | [<NSFW, click to see>](1360/previews/bondage.png) | ![free-1360](1360/previews/free.png) | ![maid-1360](1360/previews/maid.png) | ![miko-1360](1360/previews/miko.png) | [<NSFW, click to see>](1360/previews/nude.png) | [<NSFW, click to see>](1360/previews/nude2.png) | ![suit-1360](1360/previews/suit.png) | ![yukata-1360](1360/previews/yukata.png) |
| 1020 | 0.706 | [Download](1020/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-1020](1020/previews/pattern_1.png) | ![bikini-1020](1020/previews/bikini.png) | [<NSFW, click to see>](1020/previews/bondage.png) | ![free-1020](1020/previews/free.png) | ![maid-1020](1020/previews/maid.png) | ![miko-1020](1020/previews/miko.png) | [<NSFW, click to see>](1020/previews/nude.png) | [<NSFW, click to see>](1020/previews/nude2.png) | ![suit-1020](1020/previews/suit.png) | ![yukata-1020](1020/previews/yukata.png) |
| 680 | 0.644 | [Download](680/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-680](680/previews/pattern_1.png) | ![bikini-680](680/previews/bikini.png) | [<NSFW, click to see>](680/previews/bondage.png) | ![free-680](680/previews/free.png) | ![maid-680](680/previews/maid.png) | ![miko-680](680/previews/miko.png) | [<NSFW, click to see>](680/previews/nude.png) | [<NSFW, click to see>](680/previews/nude2.png) | ![suit-680](680/previews/suit.png) | ![yukata-680](680/previews/yukata.png) |
| 340 | 0.520 | [Download](340/oohara_michiru_idolmastercinderellagirls.zip) | ![pattern_1-340](340/previews/pattern_1.png) | ![bikini-340](340/previews/bikini.png) | [<NSFW, click to see>](340/previews/bondage.png) | ![free-340](340/previews/free.png) | ![maid-340](340/previews/maid.png) | ![miko-340](340/previews/miko.png) | [<NSFW, click to see>](340/previews/nude.png) | [<NSFW, click to see>](340/previews/nude2.png) | ![suit-340](340/previews/suit.png) | ![yukata-340](340/previews/yukata.png) |
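If you only want the strongest checkpoint, picking the step with the highest score from the table above is a one-liner. A minimal sketch (scores transcribed from the table; it assumes the zip files follow the same `<step>/...` layout shown there):

```python
# Scores transcribed from the table above; the step with the highest
# score is the recommended checkpoint to download.
scores = {
    5100: 0.971, 4760: 0.971, 4420: 0.974, 4080: 0.870, 3740: 0.972,
    3400: 0.967, 3060: 0.964, 2720: 0.927, 2380: 0.957, 2040: 0.912,
    1700: 0.788, 1360: 0.755, 1020: 0.706, 680: 0.644, 340: 0.520,
}
best_step = max(scores, key=scores.get)
print(best_step, scores[best_step])  # 4420 0.974
```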
Rosi-si/py_gec_mT5_v2
Rosi-si
2023-09-22T01:21:44Z
111
0
transformers
[ "transformers", "pytorch", "mt5", "text2text-generation", "generated_from_trainer", "base_model:Rosi-si/py_gec_mT5", "base_model:finetune:Rosi-si/py_gec_mT5", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text2text-generation
2023-09-22T00:12:46Z
--- license: apache-2.0 base_model: Rosi-si/py_gec_mT5 tags: - generated_from_trainer model-index: - name: py_gec_mT5_v2 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # py_gec_mT5_v2 This model is a fine-tuned version of [Rosi-si/py_gec_mT5](https://huggingface.co/Rosi-si/py_gec_mT5) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.3107 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 4 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:-----:|:---------------:| | 0.0 | 1.0 | 4187 | 0.3107 | | 0.0 | 2.0 | 8374 | 0.3107 | | 0.0 | 3.0 | 12561 | 0.3107 | | 0.0 | 4.0 | 16748 | 0.3107 | ### Framework versions - Transformers 4.33.2 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
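As a rough sanity check on the setup above (assuming one optimizer step per batch, i.e. no gradient accumulation), the logged 4187 steps per epoch together with the train batch size of 16 imply the approximate size of the training set:

```python
# 4187 optimizer steps per epoch at batch size 16 implies roughly
# 4187 * 16 training examples (the last batch may be partial, so the
# true count can be slightly lower).
train_batch_size = 16
steps_per_epoch = 4187
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)  # 66992
```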
4bit/Qwen-VL-Chat
4bit
2023-09-22T01:12:54Z
4
0
transformers
[ "transformers", "pytorch", "qwen", "text-generation", "custom_code", "zh", "en", "arxiv:2308.12966", "autotrain_compatible", "region:us" ]
text-generation
2023-09-22T01:07:31Z
---
language:
- zh
- en
tags:
- qwen
pipeline_tag: text-generation
inference: false
---

# Qwen-VL-Chat

<br>

<p align="center">
    <img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/logo_vl.jpg" width="400"/>
</p>
<br>

<p align="center">
        Qwen-VL <a href="https://modelscope.cn/models/qwen/Qwen-VL/summary">🤖</a> | <a href="https://huggingface.co/Qwen/Qwen-VL">🤗</a>&nbsp; | Qwen-VL-Chat <a href="https://modelscope.cn/models/qwen/Qwen-VL-Chat/summary">🤖</a> | <a href="https://huggingface.co/Qwen/Qwen-VL-Chat">🤗</a>&nbsp; | Qwen-VL-Chat-Int4 <a href="https://huggingface.co/Qwen/Qwen-VL-Chat-Int4">🤗</a>
<br>
<a href="assets/wechat.png">WeChat</a>&nbsp;&nbsp; | &nbsp;&nbsp;<a href="https://discord.gg/z3GAxXZ9Ce">Discord</a>&nbsp;&nbsp; | &nbsp;&nbsp;<a href="https://modelscope.cn/studios/qwen/Qwen-VL-Chat-Demo/summary">Demo</a>&nbsp; | &nbsp;<a href="https://arxiv.org/abs/2308.12966">Report</a>
</p>
<br>

**Qwen-VL** 是阿里云研发的大规模视觉语言模型(Large Vision Language Model, LVLM)。Qwen-VL 可以以图像、文本、检测框作为输入,并以文本和检测框作为输出。Qwen-VL 系列模型性能强大,具备多语言对话、多图交错对话等能力,并支持中文开放域定位和细粒度图像识别与理解。

**Qwen-VL** (Qwen Large Vision Language Model) is the visual multimodal version of the large model series, Qwen (abbr. Tongyi Qianwen), proposed by Alibaba Cloud. Qwen-VL accepts image, text, and bounding box as inputs, and outputs text and bounding box. The features of Qwen-VL include strong performance, multilingual dialogue, multi-image interleaved conversation, open-domain grounding in Chinese, and fine-grained image recognition and understanding.

目前,我们提供了Qwen-VL和Qwen-VL-Chat两个模型,分别为预训练模型和Chat模型。如果想了解更多关于模型的信息,请点击[链接](https://github.com/QwenLM/Qwen-VL/blob/master/visual_memo.md)查看我们的技术备忘录。本仓库为Qwen-VL-Chat仓库。

We release Qwen-VL and Qwen-VL-Chat, which are the pretrained model and the Chat model respectively. For more details about Qwen-VL, please refer to our [technical memo](https://github.com/QwenLM/Qwen-VL/blob/master/visual_memo.md). This repo is the one for Qwen-VL-Chat.
<br>

## 安装要求 (Requirements)

* python 3.8及以上版本
* pytorch 1.12及以上版本,推荐2.0及以上版本
* 建议使用CUDA 11.4及以上(GPU用户需考虑此选项)
* python 3.8 and above
* pytorch 1.12 and above, 2.0 and above are recommended
* CUDA 11.4 and above are recommended (this is for GPU users)

<br>

## 快速开始 (Quickstart)

我们提供简单的示例来说明如何利用 🤗 Transformers 快速使用Qwen-VL-Chat。

在开始前,请确保你已经配置好环境并安装好相关的代码包。最重要的是,确保你满足上述要求,然后安装相关的依赖库。

Below, we provide simple examples to show how to use Qwen-VL-Chat with 🤗 Transformers.

Before running the code, make sure you have set up the environment and installed the required packages. Make sure you meet the above requirements, and then install the dependent libraries.

```bash
pip install -r requirements.txt
```

接下来你可以开始使用Transformers来使用我们的模型。关于视觉模块的更多用法,请参考[教程](TUTORIAL_zh.md)。

Now you can start with Transformers. For more usage of the vision encoder, please refer to the [tutorial](TUTORIAL.md).

#### 🤗 Transformers

To use Qwen-VL-Chat for inference, all you need to do is to input a few lines of code as demonstrated below. However, **please make sure that you are using the latest code.**

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from transformers.generation import GenerationConfig
import torch

torch.manual_seed(1234)

# Note: The default behavior now has injection attack prevention off.
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-VL-Chat", trust_remote_code=True) # use bf16 # model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-VL-Chat", device_map="auto", trust_remote_code=True, bf16=True).eval() # use fp16 # model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-VL-Chat", device_map="auto", trust_remote_code=True, fp16=True).eval() # use cpu only # model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-VL-Chat", device_map="cpu", trust_remote_code=True).eval() # use cuda device model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen-VL-Chat", device_map="cuda", trust_remote_code=True).eval() # Specify hyperparameters for generation (No need to do this if you are using transformers>=4.32.0) # model.generation_config = GenerationConfig.from_pretrained("Qwen/Qwen-VL-Chat", trust_remote_code=True) # 1st dialogue turn query = tokenizer.from_list_format([ {'image': 'https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg'}, {'text': '这是什么'}, ]) response, history = model.chat(tokenizer, query=query, history=None) print(response) # 图中是一名年轻女子在沙滩上和她的狗玩耍,狗的品种可能是拉布拉多。她们坐在沙滩上,狗的前腿抬起来,似乎在和人类击掌。两人之间充满了信任和爱。 # 2nd dialogue turn response, history = model.chat(tokenizer, '输出"击掌"的检测框', history=history) print(response) # <ref>击掌</ref><box>(517,508),(589,611)</box> image = tokenizer.draw_bbox_on_latest_picture(response, history) if image: image.save('1.jpg') else: print("no box") ``` <p align="center"> <img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo_highfive.jpg" width="500"/> <p> <br> ## 量化 (Quantization) ### 用法 (Usage) 当前我们提供了基于[AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ)的量化方案,并提供了Qwen-VL-Chat的Int4量化版本Qwen-VL-Chat-Int4 [点击此处](https://huggingface.co/Qwen/Qwen-VL-Chat-Int4)。该模型在效果评测上几乎无损,并在显存占用和推理速度上具有明显优势。 下文说明如何使用该量化模型。开始之前,请确保你满足要求(如torch2.0及以上、transformers 4.32.0及以上,等)并安装所需的代码库: We provide a new solution based on [AutoGPTQ](https://github.com/PanQiWei/AutoGPTQ), and release an Int4 quantized model 
for Qwen-VL-Chat, Qwen-VL-Chat-Int4 [Click here](https://huggingface.co/Qwen/Qwen-VL-Chat-Int4), which achieves nearly lossless model quality with improved performance on both memory cost and inference speed.

Here we demonstrate how to use our provided quantized models for inference. Before you start, make sure you meet the requirements (e.g., torch 2.0 and above, transformers 4.32.0 and above, etc.) and install the required packages:

```bash
pip install optimum
git clone https://github.com/JustinLin610/AutoGPTQ.git && cd AutoGPTQ
pip install -v .
```

如遇到安装 `auto-gptq` 的问题,建议您前往官方[repo](https://github.com/PanQiWei/AutoGPTQ) 寻找合适的wheel。

随后你便可以按照上述用法,轻松调用量化模型:

If you meet problems installing `auto-gptq`, we advise you to check out the official [repo](https://github.com/PanQiWei/AutoGPTQ) to find a suitable wheel.

Then you can load the quantized model easily and run inference just as usual:

```python
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen-VL-Chat-Int4",
    device_map="auto",
    trust_remote_code=True
).eval()
# Either a local path or a URL between <img></img> tags.
image_path = 'https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg'
response, history = model.chat(tokenizer, query=f'<img>{image_path}</img>这是什么', history=None)
print(response)
```

### 效果评测 (Performance)

我们列出不同精度下模型在评测基准 **[TouchStone](https://github.com/OFA-Sys/TouchStone)** 上的表现,并发现量化模型并没有显著性能损失。结果如下所示:

We illustrate the model performance of both BF16 and Int4 models on the benchmark **[TouchStone](https://github.com/OFA-Sys/TouchStone)**, and we find that the quantized model does not suffer from significant performance degradation. Results are shown below:

| Quantization | ZH | EN |
| ------------ | :--------: | :-----------: |
| BF16 | 401.2 | 645.2 |
| Int4 | 386.6 | 651.4 |

### 推理速度 (Inference Speed)

我们测算了在输入一张图片(即258个token)的条件下BF16和Int4的模型生成1792 (2048-258) 和 7934 (8192-258) 个token的平均速度。

We measured the average inference speed (tokens/s) of generating 1792 (2048-258) and 7934 (8192-258) tokens with the context of an image (which takes 258 tokens) under BF16 precision and Int4 quantization, respectively.

| Quantization | Speed (2048 tokens) | Speed (8192 tokens) |
| ------------ | :-----------------: | :-----------------: |
| BF16 | 28.87 | 24.32 |
| Int4 | 37.79 | 34.34 |

推理速度测算是在单卡 A100-SXM4-80G GPU上运行,使用PyTorch 2.0.1及CUDA 11.4。

The profiling runs on a single A100-SXM4-80G GPU with PyTorch 2.0.1 and CUDA 11.4.

### GPU显存占用 (GPU Memory Usage)

我们还测算了在一张图片输入的条件下BF16和Int4模型生成1792 (2048-258) 和 7934 (8192-258) 个token所需显存。结果如下所示:

We also profile the peak GPU memory usage for encoding 1792 (2048-258) tokens (including an image) as context (and generating a single token) and for generating 7934 (8192-258) tokens (with an image as context) under BF16 or Int4 quantization level, respectively. The results are shown below.

| Quantization | Peak Usage for Encoding 2048 Tokens | Peak Usage for Generating 8192 Tokens |
| ------------ | :---------------------------------: | :-----------------------------------: |
| BF16 | 22.60GB | 28.01GB |
| Int4 | 11.82GB | 17.23GB |

上述速度和显存测算使用[此脚本](https://qianwen-res.oss-cn-beijing.aliyuncs.com/profile_mm.py)完成。

The above speed and memory profiling are conducted using [this script](https://qianwen-res.oss-cn-beijing.aliyuncs.com/profile_mm.py).

<br>

## 评测

我们从两个角度评测了两个模型的能力:

1. 在**英文标准 Benchmark** 上评测模型的基础任务能力。目前评测了四大类多模态任务:
   - Zero-shot Caption: 评测模型在未见过数据集上的零样本图片描述能力;
   - General VQA: 评测模型的通用问答能力,例如判断题、颜色、个数、类目等问答能力;
   - Text-based VQA:评测模型对于图片中文字相关的识别/问答能力,例如文档问答、图表问答、文字问答等;
   - Referring Expression Comprehension:评测模型给定物体描述画检测框的能力;
2.
**试金石 (TouchStone)**:为了评测模型整体的图文对话能力和人类对齐水平。我们为此构建了一个基于 GPT4 打分来评测 LVLM 模型的 Benchmark:TouchStone。在 TouchStone-v0.1 中: - 评测基准总计涵盖 300+张图片、800+道题目、27个类别。包括基础属性问答、人物地标问答、影视作品问答、视觉推理、反事实推理、诗歌创作、故事写作,商品比较、图片解题等**尽可能广泛的类别**。 - 为了弥补目前 GPT4 无法直接读取图片的缺陷,我们给所有的带评测图片提供了**人工标注的充分详细描述**,并且将图片的详细描述、问题和模型的输出结果一起交给 GPT4 打分。 - 评测同时包含英文版本和中文版本。 评测结果如下: We evaluated the model's ability from two perspectives: 1. **Standard Benchmarks**: We evaluate the model's basic task capabilities on four major categories of multimodal tasks: - Zero-shot Caption: Evaluate model's zero-shot image captioning ability on unseen datasets; - General VQA: Evaluate the general question-answering ability of pictures, such as the judgment, color, number, category, etc; - Text-based VQA: Evaluate the model's ability to recognize text in pictures, such as document QA, chart QA, etc; - Referring Expression Comprehension: Evaluate the ability to localize a target object in an image described by a referring expression. 2. **TouchStone**: To evaluate the overall text-image dialogue capability and alignment level with humans, we have constructed a benchmark called TouchStone, which is based on scoring with GPT4 to evaluate the LVLM model. - The TouchStone benchmark covers a total of 300+ images, 800+ questions, and 27 categories. Such as attribute-based Q&A, celebrity recognition, writing poetry, summarizing multiple images, product comparison, math problem solving, etc; - In order to break the current limitation of GPT4 in terms of direct image input, TouchStone provides fine-grained image annotations by human labeling. These detailed annotations, along with the questions and the model's output, are then presented to GPT4 for scoring. - The benchmark includes both English and Chinese versions. The results of the evaluation are as follows: Qwen-VL outperforms current SOTA generalist models on multiple VL tasks and has a more comprehensive coverage in terms of capability range. 
<p align="center"> <img src="https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/radar.png" width="600"/> <p> ### 零样本图像描述 & 通用视觉问答 (Zero-shot Captioning & General VQA) <table> <thead> <tr> <th rowspan="2">Model type</th> <th rowspan="2">Model</th> <th colspan="2">Zero-shot Captioning</th> <th colspan="5">General VQA</th> </tr> <tr> <th>NoCaps</th> <th>Flickr30K</th> <th>VQAv2<sup>dev</sup></th> <th>OK-VQA</th> <th>GQA</th> <th>SciQA-Img<br>(0-shot)</th> <th>VizWiz<br>(0-shot)</th> </tr> </thead> <tbody align="center"> <tr> <td rowspan="10">Generalist<br>Models</td> <td>Flamingo-9B</td> <td>-</td> <td>61.5</td> <td>51.8</td> <td>44.7</td> <td>-</td> <td>-</td> <td>28.8</td> </tr> <tr> <td>Flamingo-80B</td> <td>-</td> <td>67.2</td> <td>56.3</td> <td>50.6</td> <td>-</td> <td>-</td> <td>31.6</td> </tr> <tr> <td>Unified-IO-XL</td> <td>100.0</td> <td>-</td> <td>77.9</td> <td>54.0</td> <td>-</td> <td>-</td> <td>-</td> </tr> <tr> <td>Kosmos-1</td> <td>-</td> <td>67.1</td> <td>51.0</td> <td>-</td> <td>-</td> <td>-</td> <td>29.2</td> </tr> <tr> <td>Kosmos-2</td> <td>-</td> <td>66.7</td> <td>45.6</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> </tr> <tr> <td>BLIP-2 (Vicuna-13B)</td> <td>103.9</td> <td>71.6</td> <td>65.0</td> <td>45.9</td> <td>32.3</td> <td>61.0</td> <td>19.6</td> </tr> <tr> <td>InstructBLIP (Vicuna-13B)</td> <td><strong>121.9</strong></td> <td>82.8</td> <td>-</td> <td>-</td> <td>49.5</td> <td>63.1</td> <td>33.4</td> </tr> <tr> <td>Shikra (Vicuna-13B)</td> <td>-</td> <td>73.9</td> <td>77.36</td> <td>47.16</td> <td>-</td> <td>-</td> <td>-</td> </tr> <tr> <td><strong>Qwen-VL (Qwen-7B)</strong></td> <td>121.4</td> <td><b>85.8</b></td> <td><b>78.8</b></td> <td><b>58.6</b></td> <td><b>59.3</b></td> <td>67.1</td> <td>35.2</td> </tr> <!-- <tr> <td>Qwen-VL (4-shot)</td> <td>-</td> <td>-</td> <td>-</td> <td>63.6</td> <td>-</td> <td>-</td> <td>39.1</td> </tr> --> <tr> <td>Qwen-VL-Chat</td> <td>120.2</td> <td>81.0</td> <td>78.2</td> <td>56.6</td> 
<td>57.5</td> <td><b>68.2</b></td> <td><b>38.9</b></td> </tr> <!-- <tr> <td>Qwen-VL-Chat (4-shot)</td> <td>-</td> <td>-</td> <td>-</td> <td>60.6</td> <td>-</td> <td>-</td> <td>44.45</td> </tr> --> <tr> <td>Previous SOTA<br>(Per Task Fine-tuning)</td> <td>-</td> <td>127.0<br>(PALI-17B)</td> <td>84.5<br>(InstructBLIP<br>-FlanT5-XL)</td> <td>86.1<br>(PALI-X<br>-55B)</td> <td>66.1<br>(PALI-X<br>-55B)</td> <td>72.1<br>(CFR)</td> <td>92.53<br>(LLaVa+<br>GPT-4)</td> <td>70.9<br>(PALI-X<br>-55B)</td> </tr> </tbody> </table> - 在 Zero-shot Caption 中,Qwen-VL 在 Flickr30K 数据集上取得了 **SOTA** 的结果,并在 Nocaps 数据集上取得了和 InstructBlip 可竞争的结果。 - 在 General VQA 中,Qwen-VL 取得了 LVLM 模型同等量级和设定下 **SOTA** 的结果。 - For zero-shot image captioning, Qwen-VL achieves the **SOTA** on Flickr30K and competitive results on Nocaps with InstructBlip. - For general VQA, Qwen-VL achieves the **SOTA** under the same generalist LVLM scale settings. ### 文本导向的视觉问答 (Text-oriented VQA) <table> <thead> <tr> <th>Model type</th> <th>Model</th> <th>TextVQA</th> <th>DocVQA</th> <th>ChartQA</th> <th>AI2D</th> <th>OCR-VQA</th> </tr> </thead> <tbody align="center"> <tr> <td rowspan="5">Generalist Models</td> <td>BLIP-2 (Vicuna-13B)</td> <td>42.4</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> </tr> <tr> <td>InstructBLIP (Vicuna-13B)</td> <td>50.7</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> </tr> <tr> <td>mPLUG-DocOwl (LLaMA-7B)</td> <td>52.6</td> <td>62.2</td> <td>57.4</td> <td>-</td> <td>-</td> </tr> <tr> <td>Pic2Struct-Large (1.3B)</td> <td>-</td> <td><b>76.6</b></td> <td>58.6</td> <td>42.1</td> <td>71.3</td> </tr> <tr> <td>Qwen-VL (Qwen-7B)</td> <td><b>63.8</b></td> <td>65.1</td> <td><b>65.7</b></td> <td><b>62.3</b></td> <td><b>75.7</b></td> </tr> <tr> <td>Specialist SOTAs<br>(Specialist/Finetuned)</td> <td>PALI-X-55B (Single-task FT)<br>(Without OCR Pipeline)</td> <td>71.44</td> <td>80.0</td> <td>70.0</td> <td>81.2</td> <td>75.0</td> </tr> </tbody> </table> - 在文字相关的识别/问答评测上,取得了当前规模下通用 LVLM 达到的最好结果。 - 
分辨率对上述某几个评测非常重要,大部分 224 分辨率的开源 LVLM 模型无法完成以上评测,或只能通过切图的方式解决。Qwen-VL 将分辨率提升到 448,可以直接以端到端的方式进行以上评测。Qwen-VL 在很多任务上甚至超过了 1024 分辨率的 Pic2Struct-Large 模型。 - In text-related recognition/QA evaluation, Qwen-VL achieves the SOTA under the generalist LVLM scale settings. - Resolution is important for several above evaluations. While most open-source LVLM models with 224 resolution are incapable of these evaluations or can only solve these by cutting images, Qwen-VL scales the resolution to 448 so that it can be evaluated end-to-end. Qwen-VL even outperforms Pic2Struct-Large models of 1024 resolution on some tasks. ### 细粒度视觉定位 (Referring Expression Comprehension) <table> <thead> <tr> <th rowspan="2">Model type</th> <th rowspan="2">Model</th> <th colspan="3">RefCOCO</th> <th colspan="3">RefCOCO+</th> <th colspan="2">RefCOCOg</th> <th>GRIT</th> </tr> <tr> <th>val</th> <th>test-A</th> <th>test-B</th> <th>val</th> <th>test-A</th> <th>test-B</th> <th>val-u</th> <th>test-u</th> <th>refexp</th> </tr> </thead> <tbody align="center"> <tr> <td rowspan="8">Generalist Models</td> <td>GPV-2</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>51.50</td> </tr> <tr> <td>OFA-L*</td> <td>79.96</td> <td>83.67</td> <td>76.39</td> <td>68.29</td> <td>76.00</td> <td>61.75</td> <td>67.57</td> <td>67.58</td> <td>61.70</td> </tr> <tr> <td>Unified-IO</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td><b>78.61</b></td> </tr> <tr> <td>VisionLLM-H</td> <td></td> <td>86.70</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> <td>-</td> </tr> <tr> <td>Shikra-7B</td> <td>87.01</td> <td>90.61</td> <td>80.24 </td> <td>81.60</td> <td>87.36</td> <td>72.12</td> <td>82.27</td> <td>82.19</td> <td>69.34</td> </tr> <tr> <td>Shikra-13B</td> <td>87.83 </td> <td>91.11</td> <td>81.81</td> <td>82.89</td> <td>87.79</td> <td>74.41</td> <td>82.64</td> <td>83.16</td> <td>69.03</td> </tr> <tr> <td>Qwen-VL-7B</td> 
<td><b>89.36</b></td> <td>92.26</td> <td><b>85.34</b></td> <td><b>83.12</b></td> <td>88.25</td> <td><b>77.21</b></td> <td>85.58</td> <td>85.48</td> <td>78.22</td> </tr> <tr> <td>Qwen-VL-7B-Chat</td> <td>88.55</td> <td><b>92.27</b></td> <td>84.51</td> <td>82.82</td> <td><b>88.59</b></td> <td>76.79</td> <td><b>85.96</b></td> <td><b>86.32</b></td> <td>-</td> <tr> <td rowspan="3">Specialist SOTAs<br>(Specialist/Finetuned)</td> <td>G-DINO-L</td> <td>90.56&nbsp;&nbsp;</td> <td>93.19</td> <td>88.24</td> <td>82.75</td> <td>88.95</td> <td>75.92</td> <td>86.13</td> <td>87.02</td> <td>-</td> </tr> <tr> <td>UNINEXT-H</td> <td>92.64 </td> <td>94.33</td> <td>91.46</td> <td>85.24</td> <td>89.63</td> <td>79.79</td> <td>88.73</td> <td>89.37</td> <td>-</td> </tr> <tr> <td>ONE-PEACE</td> <td>92.58 </td> <td>94.18</td> <td>89.26</td> <td>88.77</td> <td>92.21</td> <td>83.23</td> <td>89.22</td> <td>89.27</td> <td>-</td> </tr> </tbody> </table> - 在定位任务上,Qwen-VL 全面超过 Shikra-13B,取得了目前 Generalist LVLM 模型上在 Refcoco 上的 **SOTA**。 - Qwen-VL 并没有在任何中文定位数据上训练过,但通过中文 Caption 数据和 英文 Grounding 数据的训练,可以 Zero-shot 泛化出中文 Grounding 能力。 我们提供了以上**所有**评测脚本以供复现我们的实验结果。请阅读 [eval/EVALUATION.md](eval/EVALUATION.md) 了解更多信息。 - Qwen-VL achieves the **SOTA** in all above referring expression comprehension benchmarks. - Qwen-VL has not been trained on any Chinese grounding data, but it can still generalize to the Chinese Grounding tasks in a zero-shot way by training Chinese Caption data and English Grounding data. We provide all of the above evaluation scripts for reproducing our experimental results. Please read [eval/EVALUATION.md](eval/EVALUATION.md) for more information. 
### 闲聊能力测评 (Chat Evaluation) TouchStone 是一个基于 GPT4 打分来评测 LVLM 模型的图文对话能力和人类对齐水平的基准。它涵盖了 300+张图片、800+道题目、27个类别,包括基础属性、人物地标、视觉推理、诗歌创作、故事写作、商品比较、图片解题等**尽可能广泛的类别**。关于 TouchStone 的详细介绍,请参考[touchstone/README_CN.md](touchstone/README_CN.md)了解更多信息。 TouchStone is a benchmark based on scoring with GPT4 to evaluate the abilities of the LVLM model on text-image dialogue and alignment levels with humans. It covers a total of 300+ images, 800+ questions, and 27 categories, such as attribute-based Q&A, celebrity recognition, writing poetry, summarizing multiple images, product comparison, math problem solving, etc. Please read [touchstone/README_CN.md](touchstone/README.md) for more information. #### 英语 (English) | Model | Score | |---------------|-------| | PandaGPT | 488.5 | | MiniGPT4 | 531.7 | | InstructBLIP | 552.4 | | LLaMA-AdapterV2 | 590.1 | | mPLUG-Owl | 605.4 | | LLaVA | 602.7 | | Qwen-VL-Chat | 645.2 | #### 中文 (Chinese) | Model | Score | |---------------|-------| | VisualGLM | 247.1 | | Qwen-VL-Chat | 401.2 | Qwen-VL-Chat 模型在中英文的对齐评测中均取得当前 LVLM 模型下的最好结果。 Qwen-VL-Chat has achieved the best results in both Chinese and English alignment evaluation. <br> ## 常见问题 (FAQ) 如遇到问题,敬请查阅 [FAQ](https://github.com/QwenLM/Qwen-VL/blob/master/FAQ_zh.md)以及issue区,如仍无法解决再提交issue。 If you meet problems, please refer to [FAQ](https://github.com/QwenLM/Qwen-VL/blob/master/FAQ.md) and the issues first to search a solution before you launch a new issue. <br> ## 使用协议 (License Agreement) 研究人员与开发者可使用Qwen-VL和Qwen-VL-Chat或进行二次开发。我们同样允许商业使用,具体细节请查看[LICENSE](https://github.com/QwenLM/Qwen-VL/blob/master/LICENSE)。如需商用,请填写[问卷](https://dashscope.console.aliyun.com/openModelApply/qianwen)申请。 Researchers and developers are free to use the codes and model weights of both Qwen-VL and Qwen-VL-Chat. We also allow their commercial use. Check our license at [LICENSE](LICENSE) for more details. 
<br>

## 引用 (Citation)

如果你觉得我们的论文和代码对你的研究有帮助,请考虑:star: 和引用 :pencil: :)

If you find our paper and code useful in your research, please consider giving a star :star: and citation :pencil: :)

```BibTeX
@article{Qwen-VL,
  title={Qwen-VL: A Frontier Large Vision-Language Model with Versatile Abilities},
  author={Bai, Jinze and Bai, Shuai and Yang, Shusheng and Wang, Shijie and Tan, Sinan and Wang, Peng and Lin, Junyang and Zhou, Chang and Zhou, Jingren},
  journal={arXiv preprint arXiv:2308.12966},
  year={2023}
}
```
<br>

## 联系我们 (Contact Us)

如果你想给我们的研发团队和产品团队留言,请通过邮件(qianwen_opensource@alibabacloud.com)联系我们。

If you are interested in leaving a message to either our research team or product team, feel free to send an email to qianwen_opensource@alibabacloud.com.
jtlowell/cozy_sticker
jtlowell
2023-09-22T00:56:40Z
1
3
diffusers
[ "diffusers", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "lora", "dataset:jtlowell/cozy_stickers", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "base_model:adapter:stabilityai/stable-diffusion-xl-base-1.0", "region:us" ]
text-to-image
2023-09-21T23:37:15Z
---
base_model: stabilityai/stable-diffusion-xl-base-1.0
instance_prompt: cozy_stick
tags:
- stable-diffusion-xl
- stable-diffusion-xl-diffusers
- text-to-image
- diffusers
- lora
inference: true
datasets:
- jtlowell/cozy_stickers
---

# LoRA DreamBooth - jtlowell/cozy_sticker

These are LoRA adaptation weights for stabilityai/stable-diffusion-xl-base-1.0. The weights were trained on the concept prompt:

`cozy_stick`

Use this keyword to trigger your custom model in your prompts.

LoRA for the text encoder was enabled: False.

Special VAE used for training: madebyollin/sdxl-vae-fp16-fix.

## Usage

Make sure to upgrade diffusers to >= 0.19.0:

```
pip install diffusers --upgrade
```

In addition, make sure to install transformers, safetensors, accelerate as well as the invisible watermark:

```
pip install invisible_watermark transformers accelerate safetensors
```

To just use the base model, you can run:

```python
import torch
from diffusers import DiffusionPipeline, AutoencoderKL

vae = AutoencoderKL.from_pretrained('madebyollin/sdxl-vae-fp16-fix', torch_dtype=torch.float16)
pipe = DiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    vae=vae,
    torch_dtype=torch.float16,
    variant="fp16",
    use_safetensors=True
)

# This is where you load your trained weights
pipe.load_lora_weights('jtlowell/cozy_sticker')

pipe.to("cuda")

prompt = "A majestic cozy_stick jumping from a big stone at night"

image = pipe(prompt=prompt, num_inference_steps=50).images[0]
```
Undi95/ReMM-v2.2-L2-13B-GGUF
Undi95
2023-09-22T00:32:48Z
24
4
null
[ "gguf", "license:cc-by-nc-4.0", "endpoints_compatible", "region:us" ]
null
2023-09-22T00:14:53Z
---
license: cc-by-nc-4.0
---

Re:MythoMax v2.2 (ReMM v2.2) is a recreation trial of the original [MythoMax-L2-B13](https://huggingface.co/Gryphe/MythoMax-L2-13b) with updated models. This merge uses the SLERP merging method to merge ReML v2.2 and Huginn v1.2.

Explanation:

```shell
- ReML-v2.2: (Chronos-Beluga v2/Hermes/Airboros 2.2)
=> Keeping The-Face-Of-Goonery/Chronos-Beluga-v2-13bfp16
=> Replacing jondurbin/airoboros-l2-13b-2.2 by jondurbin/airoboros-l2-13b-2.2.1 (last version)
=> Keeping NousResearch/Nous-Hermes-Llama2-13b

With that:
- ReMM-v2.2: (ReML/Huginn v1.2)
=> Replacing ReMM by the one above (ReML v2.2)
=> Keeping The-Face-Of-Goonery/Huginn-13b-v1.2 (hottest)
```

<!-- description start -->
## Description

This repo contains quantized files of ReMM v2.2, a recreation of the original MythoMax, but updated and merged with SLERP.
<!-- description end -->

<!-- description start -->
## Models used

- The-Face-Of-Goonery/Chronos-Beluga-v2-13bfp16
- jondurbin/airoboros-l2-13b-2.2.1
- NousResearch/Nous-Hermes-Llama2-13b
- The-Face-Of-Goonery/Huginn-13b-v1.2
- ReML-v2.2-L2-13B (Private recreation trial of an updated Mythologic-L2-13B)
<!-- description end -->

<!-- prompt-template start -->
## Prompt template: Alpaca

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:

```

Special thanks to Sushi.
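Spherical linear interpolation (SLERP) blends two weight tensors along the arc between them rather than along a straight line, which better preserves their magnitudes than plain averaging. A minimal, self-contained sketch of the idea on plain Python vectors (the `slerp` helper here is for illustration only, not the exact script used for this merge):

```python
import math

def slerp(v0, v1, t, eps=1e-8):
    """Spherical linear interpolation between two vectors, for t in [0, 1]."""
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))  # clamp for numerical safety
    theta = math.acos(dot)          # angle between the two vectors
    if theta < eps:                 # nearly parallel: fall back to lerp
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * theta) / math.sin(theta)
    s1 = math.sin(t * theta) / math.sin(theta)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Interpolating halfway between two orthogonal unit vectors stays on the
# unit circle instead of cutting through it, as plain averaging would.
mid = slerp([1.0, 0.0], [0.0, 1.0], 0.5)
print(mid)  # [0.7071..., 0.7071...]
```

A real merge applies this per parameter tensor across the two checkpoints, with `t` controlling how much of each model ends up in the result.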
abdoelsayed/llama-7b-v1-Receipt-Key-Extraction
abdoelsayed
2023-09-22T00:16:36Z
15
0
transformers
[ "transformers", "pytorch", "llama", "text-generation", "en", "ar", "arxiv:2309.09800", "license:llama2", "autotrain_compatible", "text-generation-inference", "endpoints_compatible", "region:us" ]
text-generation
2023-09-21T12:58:19Z
---
license: llama2
language:
- en
- ar
metrics:
- accuracy
- f1
library_name: transformers
---

# llama-7b-v1-Receipt-Key-Extraction

llama-7b-v1-Receipt-Key-Extraction is a 7-billion-parameter model based on LLaMA v1, from [AMuRD: Annotated Multilingual Receipts Dataset for Cross-lingual Key Information Extraction and Classification](https://arxiv.org/abs/2309.09800).

## Uses

The model is intended for research-only use in English and Arabic for key information extraction for items in receipts.

## How to Get Started with the Model

Use the code below to get started with the model.

```python
# pip install -q transformers
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, GenerationConfig

checkpoint = "abdoelsayed/llama-7b-v1-Receipt-Key-Extraction"
device = "cuda" if torch.cuda.is_available() else "cpu"

tokenizer = AutoTokenizer.from_pretrained(
    checkpoint, model_max_length=512, padding_side="right", use_fast=False,
)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)

def generate_response(instruction, input_text, max_new_tokens=100,
                      temperature=0.1, top_p=0.75, num_beams=4, top_k=40):
    prompt = f"Below is an instruction that describes a task, paired with an input that provides further context.\n\n### Instruction:\n{instruction}\n\n### Input:\n{input_text}\n\n### Response:"
    inputs = tokenizer(prompt, return_tensors="pt")
    input_ids = inputs["input_ids"].to(device)
    generation_config = GenerationConfig(
        temperature=temperature,
        top_p=top_p,
        top_k=top_k,
        num_beams=num_beams,
    )
    with torch.no_grad():
        outputs = model.generate(input_ids,
                                 generation_config=generation_config,
                                 max_new_tokens=max_new_tokens)
    decoded = tokenizer.decode(outputs[0])
    return decoded.split("### Response:")[-1].strip().replace("</s>", "")

instruction = "Extract the class, Brand, Weight, Number of units, Size of units, Price, T.Price, Pack, Unit from the following sentence"
input_text = "Americana Okra zero 400 gm"
response = generate_response(instruction, input_text)
print(response)
```

## How to Cite

Please cite this model using this format.

```bibtex
@misc{abdallah2023amurd,
      title={AMuRD: Annotated Multilingual Receipts Dataset for Cross-lingual Key Information Extraction and Classification},
      author={Abdelrahman Abdallah and Mahmoud Abdalla and Mohamed Elkasaby and Yasser Elbendary and Adam Jatowt},
      year={2023},
      eprint={2309.09800},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```
jonas-luehrs/bert-base-uncased-MLM-fos-chemistry
jonas-luehrs
2023-09-22T00:13:32Z
124
0
transformers
[ "transformers", "pytorch", "bert", "fill-mask", "generated_from_trainer", "base_model:google-bert/bert-base-uncased", "base_model:finetune:google-bert/bert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
fill-mask
2023-09-22T00:02:09Z
---
license: apache-2.0
base_model: bert-base-uncased
tags:
- generated_from_trainer
model-index:
- name: bert-base-uncased-MLM-fos-chemistry
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bert-base-uncased-MLM-fos-chemistry

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 5.3995

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 7.1312        | 1.0   | 211  | 6.1375          |
| 5.9819        | 2.0   | 422  | 5.7641          |
| 5.7234        | 3.0   | 633  | 5.6039          |
| 5.5776        | 4.0   | 844  | 5.4360          |
| 5.4826        | 5.0   | 1055 | 5.4316          |

### Framework versions

- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3
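For intuition (this note is an editorial addition, not part of the original card), a masked-language-modeling cross-entropy loss can be converted to a pseudo-perplexity via `exp(loss)`; the final evaluation loss of 5.3995 above corresponds to a perplexity of roughly 221. A minimal sketch:

```python
import math

# Validation losses per epoch, copied from the training results table above.
val_losses = [6.1375, 5.7641, 5.6039, 5.4360, 5.4316]

# Perplexity is the exponential of the cross-entropy loss.
perplexities = [math.exp(loss) for loss in val_losses]
for epoch, ppl in enumerate(perplexities, start=1):
    print(f"epoch {epoch}: perplexity ~ {ppl:.1f}")

# Final evaluation loss reported by the card.
print(f"eval perplexity ~ {math.exp(5.3995):.1f}")
```

Falling perplexity across epochs mirrors the falling validation loss, which is often an easier quantity to compare across MLM runs.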
dwang-LI/segformer-b0-finetuned-segments-sidewalk-2
dwang-LI
2023-09-22T00:08:12Z
187
0
transformers
[ "transformers", "pytorch", "segformer", "vision", "image-segmentation", "generated_from_trainer", "base_model:nvidia/mit-b2", "base_model:finetune:nvidia/mit-b2", "license:other", "endpoints_compatible", "region:us" ]
image-segmentation
2023-09-21T18:54:13Z
---
license: other
base_model: nvidia/mit-b2
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-sidewalk-2
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# segformer-b0-finetuned-segments-sidewalk-2

This model is a fine-tuned version of [nvidia/mit-b2](https://huggingface.co/nvidia/mit-b2) on the segments/sidewalk-semantic dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7413
- Mean Iou: 0.4159
- Mean Accuracy: 0.4945
- Overall Accuracy: 0.8774
- Accuracy Unlabeled: nan
- Accuracy Flat-road: 0.8388
- Accuracy Flat-sidewalk: 0.9573
- Accuracy Flat-crosswalk: 0.8007
- Accuracy Flat-cyclinglane: 0.8552
- Accuracy Flat-parkingdriveway: 0.5279
- Accuracy Flat-railtrack: nan
- Accuracy Flat-curb: 0.7175
- Accuracy Human-person: 0.8243
- Accuracy Human-rider: 0.0845
- Accuracy Vehicle-car: 0.9557
- Accuracy Vehicle-truck: 0.1300
- Accuracy Vehicle-bus: 0.0
- Accuracy Vehicle-tramtrain: 0.6660
- Accuracy Vehicle-motorcycle: 0.0883
- Accuracy Vehicle-bicycle: 0.7186
- Accuracy Vehicle-caravan: 0.0
- Accuracy Vehicle-cartrailer: 0.0195
- Accuracy Construction-building: 0.9361
- Accuracy Construction-door: 0.1788
- Accuracy Construction-wall: 0.6178
- Accuracy Construction-fenceguardrail: 0.3916
- Accuracy Construction-bridge: 0.5516
- Accuracy Construction-tunnel: nan
- Accuracy Construction-stairs: 0.2882
- Accuracy Object-pole: 0.6120
- Accuracy Object-trafficsign: 0.5812
- Accuracy Object-trafficlight: 0.0
- Accuracy Nature-vegetation: 0.9537
- Accuracy Nature-terrain: 0.8377
- Accuracy Sky: 0.9844
- Accuracy Void-ground: 0.0132
- Accuracy Void-dynamic: 0.2116
- Accuracy Void-static: 0.4815
- Accuracy Void-unclear: 0.0
- Iou Unlabeled: nan
- Iou Flat-road: 0.7575
- Iou Flat-sidewalk: 0.8765
- Iou Flat-crosswalk: 0.6860
- Iou Flat-cyclinglane: 0.7491
- Iou Flat-parkingdriveway: 0.4030
- Iou Flat-railtrack: nan
- Iou Flat-curb: 0.5916
- Iou Human-person: 0.6521
- Iou Human-rider: 0.0488
- Iou Vehicle-car: 0.8662
- Iou Vehicle-truck: 0.1122
- Iou Vehicle-bus: 0.0
- Iou Vehicle-tramtrain: 0.4442
- Iou Vehicle-motorcycle: 0.0883
- Iou Vehicle-bicycle: 0.5874
- Iou Vehicle-caravan: 0.0
- Iou Vehicle-cartrailer: 0.0193
- Iou Construction-building: 0.7672
- Iou Construction-door: 0.1600
- Iou Construction-wall: 0.5304
- Iou Construction-fenceguardrail: 0.3282
- Iou Construction-bridge: 0.3920
- Iou Construction-tunnel: nan
- Iou Construction-stairs: 0.2410
- Iou Object-pole: 0.5084
- Iou Object-trafficsign: 0.3941
- Iou Object-trafficlight: 0.0
- Iou Nature-vegetation: 0.8872
- Iou Nature-terrain: 0.7611
- Iou Sky: 0.9529
- Iou Void-ground: 0.0098
- Iou Void-dynamic: 0.1155
- Iou Void-static: 0.3780
- Iou Void-unclear: 0.0

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Flat-road | Accuracy Flat-sidewalk | Accuracy Flat-crosswalk | Accuracy Flat-cyclinglane | Accuracy Flat-parkingdriveway | Accuracy Flat-railtrack | Accuracy Flat-curb | Accuracy Human-person | Accuracy Human-rider | Accuracy Vehicle-car | Accuracy Vehicle-truck | Accuracy Vehicle-bus | Accuracy Vehicle-tramtrain | Accuracy Vehicle-motorcycle | Accuracy Vehicle-bicycle | Accuracy Vehicle-caravan | Accuracy Vehicle-cartrailer | Accuracy Construction-building | Accuracy Construction-door | Accuracy Construction-wall | Accuracy Construction-fenceguardrail | Accuracy Construction-bridge | Accuracy Construction-tunnel | Accuracy Construction-stairs | Accuracy Object-pole | Accuracy Object-trafficsign | Accuracy Object-trafficlight | Accuracy Nature-vegetation | Accuracy Nature-terrain | Accuracy Sky | Accuracy Void-ground | Accuracy Void-dynamic | Accuracy Void-static | Accuracy Void-unclear | Iou Unlabeled | Iou Flat-road | Iou Flat-sidewalk | Iou Flat-crosswalk | Iou Flat-cyclinglane | Iou Flat-parkingdriveway | Iou Flat-railtrack | Iou Flat-curb | Iou Human-person | Iou Human-rider | Iou Vehicle-car | Iou Vehicle-truck | Iou Vehicle-bus | Iou Vehicle-tramtrain | Iou Vehicle-motorcycle | Iou Vehicle-bicycle | Iou Vehicle-caravan | Iou Vehicle-cartrailer | Iou Construction-building | Iou Construction-door | Iou Construction-wall | Iou Construction-fenceguardrail | Iou Construction-bridge | Iou Construction-tunnel | Iou Construction-stairs | Iou Object-pole | Iou Object-trafficsign | Iou Object-trafficlight | Iou Nature-vegetation | Iou Nature-terrain | Iou Sky | Iou Void-ground | Iou Void-dynamic | Iou Void-static | Iou Void-unclear |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:------------------:|:----------------------:|:-----------------------:|:-------------------------:|:-----------------------------:|:-----------------------:|:------------------:|:---------------------:|:--------------------:|:--------------------:|:----------------------:|:--------------------:|:--------------------------:|:---------------------------:|:------------------------:|:------------------------:|:---------------------------:|:------------------------------:|:--------------------------:|:--------------------------:|:------------------------------------:|:----------------------------:|:----------------------------:|:----------------------------:|:--------------------:|:---------------------------:|:----------------------------:|:--------------------------:|:-----------------------:|:------------:|:--------------------:|:---------------------:|:--------------------:|:---------------------:|:-------------:|:-------------:|:-----------------:|:------------------:|:--------------------:|:------------------------:|:------------------:|:-------------:|:----------------:|:---------------:|:---------------:|:-----------------:|:---------------:|:---------------------:|:----------------------:|:-------------------:|:-------------------:|:----------------------:|:-------------------------:|:---------------------:|:---------------------:|:-------------------------------:|:-----------------------:|:-----------------------:|:-----------------------:|:---------------:|:----------------------:|:-----------------------:|:---------------------:|:------------------:|:-------:|:---------------:|:----------------:|:---------------:|:----------------:| | 1.6058 | 0.1 | 20 | 2.0687 | 0.0800 | 0.1356 | 0.6081 | nan | 0.4527 | 0.9100 | 0.0 | 0.0 | 0.0027 | nan | 0.0016 | 0.0075 | 0.0 | 0.8602 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9256 | 0.0 | 0.0197 | 0.0 
| 0.0 | nan | 0.0 | 0.0 | 0.0 | 0.0 | 0.8878 | 0.0039 | 0.2678 | 0.0 | 0.0 | 0.0009 | 0.0 | 0.0 | 0.3225 | 0.6221 | 0.0 | 0.0 | 0.0027 | 0.0 | 0.0015 | 0.0075 | 0.0 | 0.4904 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4424 | 0.0 | 0.0196 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6175 | 0.0039 | 0.2675 | 0.0 | 0.0 | 0.0009 | 0.0 | | 1.0866 | 0.2 | 40 | 1.2653 | 0.1408 | 0.1825 | 0.6924 | nan | 0.5482 | 0.9667 | 0.0 | 0.0 | 0.0041 | nan | 0.0008 | 0.0357 | 0.0 | 0.8373 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9144 | 0.0 | 0.0653 | 0.0 | 0.0 | nan | 0.0 | 0.0022 | 0.0 | 0.0 | 0.9105 | 0.6945 | 0.8432 | 0.0 | 0.0 | 0.0178 | 0.0 | nan | 0.4138 | 0.6618 | 0.0 | 0.0 | 0.0041 | nan | 0.0008 | 0.0352 | 0.0 | 0.6599 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5239 | 0.0 | 0.0645 | 0.0 | 0.0 | nan | 0.0 | 0.0022 | 0.0 | 0.0 | 0.7587 | 0.5554 | 0.8085 | 0.0 | 0.0 | 0.0172 | 0.0 | | 0.8624 | 0.3 | 60 | 1.0931 | 0.1666 | 0.2119 | 0.7183 | nan | 0.5402 | 0.9777 | 0.0 | 0.2995 | 0.0167 | nan | 0.0017 | 0.1723 | 0.0 | 0.9303 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.8316 | 0.0 | 0.0853 | 0.0016 | 0.0 | nan | 0.0 | 0.2050 | 0.0 | 0.0 | 0.9005 | 0.8119 | 0.9414 | 0.0 | 0.0 | 0.0648 | 0.0 | nan | 0.4592 | 0.6670 | 0.0 | 0.2883 | 0.0163 | nan | 0.0017 | 0.1657 | 0.0 | 0.6319 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5774 | 0.0 | 0.0838 | 0.0016 | 0.0 | nan | 0.0 | 0.1763 | 0.0 | 0.0 | 0.7505 | 0.5903 | 0.8594 | 0.0 | 0.0 | 0.0622 | 0.0 | | 1.1933 | 0.4 | 80 | 0.9780 | 0.1780 | 0.2241 | 0.7326 | nan | 0.5494 | 0.9759 | 0.0 | 0.4403 | 0.0635 | nan | 0.0928 | 0.1458 | 0.0 | 0.9218 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.9418 | 0.0 | 0.1182 | 0.0043 | 0.0 | nan | 0.0 | 0.2582 | 0.0 | 0.0 | 0.8457 | 0.8430 | 0.9439 | 0.0 | 0.0 | 0.0278 | 0.0 | nan | 0.4827 | 0.6919 | 0.0 | 0.3890 | 0.0580 | nan | 0.0852 | 0.1389 | 0.0 | 0.6634 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5628 | 0.0 | 0.1040 | 0.0043 | 0.0 | nan | 0.0 | 0.2140 | 0.0 | 0.0 | 0.7901 | 0.6139 | 
0.8700 | 0.0 | 0.0 | 0.0276 | 0.0 | | 0.8678 | 0.5 | 100 | 0.8980 | 0.2055 | 0.2559 | 0.7586 | nan | 0.7035 | 0.9240 | 0.0 | 0.5471 | 0.2248 | nan | 0.1919 | 0.4983 | 0.0 | 0.9434 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0122 | 0.0 | 0.0 | 0.8942 | 0.0 | 0.1942 | 0.0415 | 0.0 | nan | 0.0 | 0.3669 | 0.0 | 0.0 | 0.9329 | 0.7145 | 0.9508 | 0.0 | 0.0 | 0.0486 | 0.0 | nan | 0.5343 | 0.7617 | 0.0 | 0.4561 | 0.1464 | nan | 0.1592 | 0.4375 | 0.0 | 0.6418 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0122 | 0.0 | 0.0 | 0.5930 | 0.0 | 0.1698 | 0.0408 | 0.0 | nan | 0.0 | 0.2674 | 0.0 | 0.0 | 0.8039 | 0.6337 | 0.8700 | 0.0 | 0.0 | 0.0478 | 0.0 | | 0.8691 | 0.6 | 120 | 0.8161 | 0.2164 | 0.2640 | 0.7752 | nan | 0.7915 | 0.9325 | 0.0 | 0.5391 | 0.2122 | nan | 0.1662 | 0.4892 | 0.0 | 0.8716 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1287 | 0.0 | 0.0 | 0.9406 | 0.0 | 0.0777 | 0.1205 | 0.0 | nan | 0.0 | 0.3821 | 0.0 | 0.0 | 0.9227 | 0.8422 | 0.9483 | 0.0 | 0.0 | 0.0846 | 0.0 | nan | 0.5700 | 0.7869 | 0.0 | 0.4869 | 0.1733 | nan | 0.1470 | 0.3997 | 0.0 | 0.7279 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1273 | 0.0 | 0.0 | 0.5747 | 0.0 | 0.0749 | 0.1087 | 0.0 | nan | 0.0 | 0.2806 | 0.0 | 0.0 | 0.8250 | 0.6766 | 0.8837 | 0.0 | 0.0 | 0.0806 | 0.0 | | 0.5554 | 0.7 | 140 | 0.8194 | 0.2308 | 0.2859 | 0.7661 | nan | 0.5964 | 0.9383 | 0.0 | 0.5685 | 0.2664 | nan | 0.4518 | 0.6607 | 0.0 | 0.9273 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1700 | 0.0 | 0.0 | 0.9297 | 0.0 | 0.2699 | 0.1437 | 0.0 | nan | 0.0 | 0.3847 | 0.0 | 0.0 | 0.9183 | 0.8642 | 0.9349 | 0.0 | 0.0 | 0.1238 | 0.0 | nan | 0.4999 | 0.7466 | 0.0 | 0.4657 | 0.1921 | nan | 0.2892 | 0.4987 | 0.0 | 0.7165 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1661 | 0.0 | 0.0 | 0.6027 | 0.0 | 0.2471 | 0.1317 | 0.0 | nan | 0.0 | 0.2981 | 0.0 | 0.0 | 0.8326 | 0.6980 | 0.8876 | 0.0 | 0.0 | 0.1145 | 0.0 | | 0.794 | 0.8 | 160 | 0.7478 | 0.2403 | 0.2971 | 0.7860 | nan | 0.6689 | 0.9516 | 0.0000 | 0.6487 | 0.2353 | nan | 0.2907 | 0.7860 | 0.0 | 0.9162 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1873 | 0.0 | 0.0 | 0.8967 | 0.0 | 0.4288 | 0.1785 | 0.0 | nan | 
0.0 | 0.3825 | 0.0 | 0.0 | 0.9359 | 0.8514 | 0.9644 | 0.0 | 0.0 | 0.1844 | 0.0 | nan | 0.5519 | 0.7755 | 0.0000 | 0.4870 | 0.1750 | nan | 0.2347 | 0.4389 | 0.0 | 0.7562 | 0.0 | 0.0 | 0.0 | 0.0 | 0.1836 | 0.0 | 0.0 | 0.6576 | 0.0 | 0.3686 | 0.1488 | 0.0 | nan | 0.0 | 0.3083 | 0.0 | 0.0 | 0.8332 | 0.7090 | 0.9028 | 0.0 | 0.0 | 0.1575 | 0.0 | | 0.6339 | 0.9 | 180 | 0.7390 | 0.2411 | 0.2973 | 0.7876 | nan | 0.8110 | 0.9249 | 0.0209 | 0.5771 | 0.3073 | nan | 0.3368 | 0.7894 | 0.0 | 0.8493 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5413 | 0.0 | 0.0 | 0.9288 | 0.0 | 0.1732 | 0.1312 | 0.0 | nan | 0.0 | 0.3136 | 0.0 | 0.0 | 0.9477 | 0.7834 | 0.9502 | 0.0 | 0.0 | 0.1273 | 0.0 | nan | 0.5789 | 0.8126 | 0.0209 | 0.4832 | 0.2278 | nan | 0.2592 | 0.3822 | 0.0 | 0.7646 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4843 | 0.0 | 0.0 | 0.6012 | 0.0 | 0.1659 | 0.1198 | 0.0 | nan | 0.0 | 0.2735 | 0.0 | 0.0 | 0.8253 | 0.7060 | 0.8938 | 0.0 | 0.0 | 0.1164 | 0.0 | | 0.8587 | 1.0 | 200 | 0.7084 | 0.2675 | 0.3230 | 0.8018 | nan | 0.7580 | 0.9389 | 0.3724 | 0.5680 | 0.3650 | nan | 0.3542 | 0.7193 | 0.0 | 0.9144 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5573 | 0.0 | 0.0 | 0.9250 | 0.0 | 0.3356 | 0.2530 | 0.0 | nan | 0.0 | 0.2509 | 0.0 | 0.0 | 0.9308 | 0.8555 | 0.9539 | 0.0 | 0.0 | 0.2823 | 0.0 | nan | 0.6263 | 0.7931 | 0.3623 | 0.5008 | 0.2209 | nan | 0.2872 | 0.4916 | 0.0 | 0.7620 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4623 | 0.0 | 0.0 | 0.6392 | 0.0 | 0.3075 | 0.1780 | 0.0 | nan | 0.0 | 0.2203 | 0.0 | 0.0 | 0.8510 | 0.7357 | 0.9055 | 0.0 | 0.0 | 0.2151 | 0.0 | | 0.5614 | 1.1 | 220 | 0.7561 | 0.2601 | 0.3206 | 0.7938 | nan | 0.6713 | 0.9499 | 0.2579 | 0.6298 | 0.3682 | nan | 0.3742 | 0.7721 | 0.0 | 0.9318 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5424 | 0.0 | 0.0 | 0.8845 | 0.0 | 0.4127 | 0.1613 | 0.0 | nan | 0.0 | 0.2406 | 0.0 | 0.0 | 0.9196 | 0.8874 | 0.9724 | 0.0 | 0.0 | 0.2822 | 0.0 | nan | 0.5910 | 0.7841 | 0.2544 | 0.4716 | 0.2352 | nan | 0.2894 | 0.4556 | 0.0 | 0.7663 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4722 | 0.0 | 0.0 | 0.6528 | 0.0 | 0.3492 | 0.1422 | 0.0 | 
nan | 0.0 | 0.2133 | 0.0 | 0.0 | 0.8336 | 0.6845 | 0.9031 | 0.0 | 0.0 | 0.2248 | 0.0 | | 0.6716 | 1.2 | 240 | 0.7154 | 0.2718 | 0.3453 | 0.7967 | nan | 0.6683 | 0.9515 | 0.6023 | 0.6991 | 0.3307 | nan | 0.4583 | 0.7933 | 0.0 | 0.9318 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5865 | 0.0 | 0.0 | 0.8739 | 0.0 | 0.3943 | 0.2839 | 0.0 | nan | 0.0 | 0.4104 | 0.0 | 0.0 | 0.8423 | 0.9039 | 0.9740 | 0.0 | 0.0 | 0.3438 | 0.0 | nan | 0.6036 | 0.8231 | 0.5148 | 0.4430 | 0.2387 | nan | 0.3350 | 0.4530 | 0.0 | 0.7768 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4575 | 0.0 | 0.0 | 0.6788 | 0.0 | 0.3633 | 0.1930 | 0.0 | nan | 0.0 | 0.3109 | 0.0 | 0.0 | 0.7773 | 0.5568 | 0.9052 | 0.0 | 0.0 | 0.2673 | 0.0 | | 0.5977 | 1.3 | 260 | 0.6926 | 0.2792 | 0.3446 | 0.8034 | nan | 0.6946 | 0.9427 | 0.6251 | 0.5124 | 0.4221 | nan | 0.4001 | 0.8085 | 0.0 | 0.8952 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5728 | 0.0 | 0.0 | 0.8277 | 0.0 | 0.6152 | 0.2122 | 0.0 | nan | 0.0 | 0.3902 | 0.0 | 0.0 | 0.9603 | 0.8804 | 0.9529 | 0.0 | 0.0 | 0.3160 | 0.0 | nan | 0.6006 | 0.7857 | 0.5078 | 0.4729 | 0.2514 | nan | 0.3110 | 0.4450 | 0.0 | 0.7790 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4750 | 0.0 | 0.0 | 0.6803 | 0.0 | 0.4791 | 0.1668 | 0.0 | nan | 0.0 | 0.2995 | 0.0 | 0.0 | 0.8278 | 0.7035 | 0.9083 | 0.0 | 0.0 | 0.2401 | 0.0 | | 0.323 | 1.4 | 280 | 0.6871 | 0.2769 | 0.3368 | 0.8095 | nan | 0.9001 | 0.8911 | 0.5783 | 0.5414 | 0.3860 | nan | 0.2742 | 0.7728 | 0.0 | 0.9198 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5951 | 0.0 | 0.0 | 0.9161 | 0.0 | 0.3427 | 0.1982 | 0.0 | nan | 0.0 | 0.3783 | 0.0 | 0.0 | 0.9283 | 0.8808 | 0.9731 | 0.0 | 0.0 | 0.3014 | 0.0 | nan | 0.6163 | 0.8266 | 0.5474 | 0.4864 | 0.2627 | nan | 0.2177 | 0.4875 | 0.0 | 0.7824 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4786 | 0.0 | 0.0 | 0.6533 | 0.0 | 0.3035 | 0.1736 | 0.0 | nan | 0.0 | 0.3035 | 0.0 | 0.0 | 0.8537 | 0.7231 | 0.9064 | 0.0 | 0.0 | 0.2376 | 0.0 | | 0.4141 | 1.5 | 300 | 0.6476 | 0.2867 | 0.3494 | 0.8218 | nan | 0.7932 | 0.9491 | 0.6643 | 0.6611 | 0.2548 | nan | 0.4516 | 0.7783 | 0.0 | 0.9142 | 0.0 | 0.0 | 0.0 | 0.0 
| 0.6078 | 0.0 | 0.0 | 0.8901 | 0.0 | 0.3304 | 0.2616 | 0.0 | nan | 0.0 | 0.4842 | 0.0 | 0.0 | 0.9445 | 0.7866 | 0.9691 | 0.0 | 0.0 | 0.4399 | 0.0 | nan | 0.6531 | 0.8299 | 0.5304 | 0.5258 | 0.2096 | nan | 0.3576 | 0.4954 | 0.0 | 0.7931 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4708 | 0.0 | 0.0 | 0.6738 | 0.0 | 0.2988 | 0.1819 | 0.0 | nan | 0.0 | 0.3425 | 0.0 | 0.0 | 0.8504 | 0.7206 | 0.9123 | 0.0 | 0.0 | 0.3267 | 0.0 | | 0.3646 | 1.6 | 320 | 0.6528 | 0.2763 | 0.3318 | 0.8139 | nan | 0.7482 | 0.9494 | 0.6282 | 0.6090 | 0.3414 | nan | 0.5601 | 0.7720 | 0.0 | 0.8964 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5823 | 0.0 | 0.0 | 0.9521 | 0.0 | 0.2277 | 0.0532 | 0.0 | nan | 0.0 | 0.3867 | 0.0 | 0.0 | 0.9466 | 0.8266 | 0.9693 | 0.0 | 0.0 | 0.1695 | 0.0 | nan | 0.6420 | 0.8276 | 0.5175 | 0.5250 | 0.2451 | nan | 0.3864 | 0.5556 | 0.0 | 0.7824 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5152 | 0.0 | 0.0 | 0.6083 | 0.0 | 0.2015 | 0.0523 | 0.0 | nan | 0.0 | 0.3181 | 0.0 | 0.0 | 0.8566 | 0.7311 | 0.9165 | 0.0 | 0.0 | 0.1596 | 0.0 | | 0.9788 | 1.7 | 340 | 0.7683 | 0.2612 | 0.3304 | 0.7955 | nan | 0.5996 | 0.9751 | 0.3832 | 0.6502 | 0.2049 | nan | 0.4099 | 0.8274 | 0.0 | 0.9395 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7249 | 0.0 | 0.0 | 0.9074 | 0.0 | 0.2946 | 0.1679 | 0.0 | nan | 0.0 | 0.4672 | 0.0 | 0.0 | 0.9432 | 0.8381 | 0.9654 | 0.0 | 0.0 | 0.2729 | 0.0 | nan | 0.5599 | 0.7533 | 0.3669 | 0.5210 | 0.1659 | nan | 0.3300 | 0.3950 | 0.0 | 0.7744 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3063 | 0.0 | 0.0 | 0.6650 | 0.0 | 0.2691 | 0.1526 | 0.0 | nan | 0.0 | 0.3416 | 0.0 | 0.0 | 0.8595 | 0.7511 | 0.9171 | 0.0 | 0.0 | 0.2307 | 0.0 | | 0.6605 | 1.8 | 360 | 0.6275 | 0.2884 | 0.3441 | 0.8263 | nan | 0.8420 | 0.9395 | 0.5045 | 0.6022 | 0.2955 | nan | 0.4330 | 0.7870 | 0.0 | 0.8991 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6139 | 0.0 | 0.0 | 0.9320 | 0.0 | 0.5533 | 0.1601 | 0.0 | nan | 0.0 | 0.3684 | 0.0 | 0.0 | 0.9458 | 0.8448 | 0.9551 | 0.0 | 0.0 | 0.3342 | 0.0 | nan | 0.6410 | 0.8340 | 0.4894 | 0.5314 | 0.2450 | nan | 0.3572 | 0.4974 | 0.0 | 0.7906 | 0.0 | 0.0 | 0.0 | 
0.0 | 0.4876 | 0.0 | 0.0 | 0.6881 | 0.0 | 0.4509 | 0.1472 | 0.0 | nan | 0.0 | 0.3078 | 0.0 | 0.0 | 0.8575 | 0.7292 | 0.9075 | 0.0 | 0.0 | 0.2676 | 0.0 | | 0.7524 | 1.9 | 380 | 0.6273 | 0.2919 | 0.3560 | 0.8269 | nan | 0.8081 | 0.9440 | 0.6296 | 0.6118 | 0.3642 | nan | 0.4947 | 0.8006 | 0.0 | 0.9350 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6406 | 0.0 | 0.0 | 0.8665 | 0.0 | 0.4322 | 0.1933 | 0.0 | nan | 0.0 | 0.3940 | 0.0 | 0.0 | 0.9558 | 0.8249 | 0.9678 | 0.0 | 0.0 | 0.5301 | 0.0 | nan | 0.6452 | 0.8301 | 0.5816 | 0.5276 | 0.2781 | nan | 0.3737 | 0.4358 | 0.0 | 0.7892 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4562 | 0.0 | 0.0 | 0.6994 | 0.0 | 0.3982 | 0.1730 | 0.0 | nan | 0.0 | 0.3147 | 0.0 | 0.0 | 0.8553 | 0.7263 | 0.9197 | 0.0 | 0.0 | 0.3361 | 0.0 | | 1.2079 | 2.0 | 400 | 0.6490 | 0.2897 | 0.3486 | 0.8215 | nan | 0.7698 | 0.9312 | 0.5400 | 0.7152 | 0.5081 | nan | 0.4067 | 0.7781 | 0.0 | 0.8939 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6318 | 0.0 | 0.0 | 0.9512 | 0.0 | 0.4015 | 0.1934 | 0.0 | nan | 0.0 | 0.3986 | 0.0 | 0.0 | 0.9288 | 0.8495 | 0.9746 | 0.0 | 0.0 | 0.2818 | 0.0 | nan | 0.6377 | 0.8292 | 0.5056 | 0.5954 | 0.2895 | nan | 0.3449 | 0.5046 | 0.0 | 0.7963 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5140 | 0.0 | 0.0 | 0.6506 | 0.0 | 0.3486 | 0.1627 | 0.0 | nan | 0.0 | 0.3316 | 0.0 | 0.0 | 0.8600 | 0.7487 | 0.9086 | 0.0 | 0.0 | 0.2415 | 0.0 | | 0.405 | 2.1 | 420 | 0.6384 | 0.2937 | 0.3622 | 0.8234 | nan | 0.7031 | 0.9520 | 0.6652 | 0.7688 | 0.3272 | nan | 0.4249 | 0.7941 | 0.0 | 0.9555 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6440 | 0.0 | 0.0 | 0.8782 | 0.0 | 0.5263 | 0.2536 | 0.0 | nan | 0.0 | 0.4750 | 0.0 | 0.0 | 0.9257 | 0.8877 | 0.9717 | 0.0 | 0.0204 | 0.4167 | 0.0 | nan | 0.6183 | 0.8256 | 0.5422 | 0.5549 | 0.2588 | nan | 0.3615 | 0.4897 | 0.0 | 0.7693 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4770 | 0.0 | 0.0 | 0.6953 | 0.0 | 0.4538 | 0.1952 | 0.0 | nan | 0.0 | 0.3494 | 0.0 | 0.0 | 0.8578 | 0.7154 | 0.9221 | 0.0 | 0.0178 | 0.2943 | 0.0 | | 0.3988 | 2.2 | 440 | 0.5934 | 0.2969 | 0.3613 | 0.8282 | nan | 0.8282 | 0.9065 | 0.6606 | 0.7416 
| 0.4514 | nan | 0.3391 | 0.8057 | 0.0 | 0.9266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6561 | 0.0 | 0.0 | 0.9058 | 0.0 | 0.5015 | 0.2832 | 0.0 | nan | 0.0 | 0.3546 | 0.0 | 0.0 | 0.9493 | 0.8867 | 0.9751 | 0.0 | 0.0271 | 0.3634 | 0.0 | nan | 0.6390 | 0.8430 | 0.6169 | 0.5474 | 0.2994 | nan | 0.2954 | 0.4906 | 0.0 | 0.8041 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4835 | 0.0 | 0.0 | 0.7002 | 0.0 | 0.4378 | 0.2141 | 0.0 | nan | 0.0 | 0.3096 | 0.0 | 0.0 | 0.8614 | 0.7253 | 0.9219 | 0.0 | 0.0236 | 0.2891 | 0.0 | | 0.3143 | 2.3 | 460 | 0.6416 | 0.2938 | 0.3609 | 0.8219 | nan | 0.6598 | 0.9464 | 0.7305 | 0.7885 | 0.4049 | nan | 0.5647 | 0.7899 | 0.0 | 0.9416 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6645 | 0.0 | 0.0 | 0.9118 | 0.0 | 0.3080 | 0.1446 | 0.0 | nan | 0.0 | 0.4547 | 0.0013 | 0.0 | 0.9516 | 0.8457 | 0.9714 | 0.0 | 0.0180 | 0.4518 | 0.0 | nan | 0.6057 | 0.8265 | 0.6098 | 0.5589 | 0.2849 | nan | 0.4108 | 0.5416 | 0.0 | 0.7952 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4605 | 0.0 | 0.0 | 0.6670 | 0.0 | 0.2780 | 0.1332 | 0.0 | nan | 0.0 | 0.3446 | 0.0013 | 0.0 | 0.8621 | 0.7526 | 0.9232 | 0.0 | 0.0166 | 0.3277 | 0.0 | | 0.2928 | 2.4 | 480 | 0.6131 | 0.2982 | 0.3799 | 0.8252 | nan | 0.8027 | 0.9202 | 0.7321 | 0.6574 | 0.5363 | nan | 0.4619 | 0.8093 | 0.0 | 0.9384 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6897 | 0.0 | 0.0 | 0.7726 | 0.0 | 0.6742 | 0.3453 | 0.0 | nan | 0.0 | 0.5261 | 0.0073 | 0.0 | 0.9268 | 0.9278 | 0.9714 | 0.0 | 0.0291 | 0.4292 | 0.0 | nan | 0.6613 | 0.8264 | 0.5482 | 0.6088 | 0.3040 | nan | 0.3857 | 0.4958 | 0.0 | 0.7986 | 0.0 | 0.0 | 0.0 | 0.0 | 0.3815 | 0.0 | 0.0 | 0.6863 | 0.0 | 0.4553 | 0.2230 | 0.0 | nan | 0.0 | 0.3542 | 0.0072 | 0.0 | 0.8510 | 0.6960 | 0.9259 | 0.0 | 0.0247 | 0.3074 | 0.0 | | 0.4599 | 2.5 | 500 | 0.6091 | 0.3002 | 0.3624 | 0.8349 | nan | 0.7736 | 0.9424 | 0.7071 | 0.7431 | 0.3981 | nan | 0.5366 | 0.7966 | 0.0 | 0.9173 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6335 | 0.0 | 0.0 | 0.9459 | 0.0 | 0.4041 | 0.2116 | 0.0 | nan | 0.0 | 0.4241 | 0.0093 | 0.0 | 0.9330 | 0.9141 | 0.9820 | 0.0 | 0.0143 | 0.3110 | 0.0 | 
nan | 0.6768 | 0.8398 | 0.5669 | 0.6214 | 0.2799 | nan | 0.4125 | 0.5186 | 0.0 | 0.8087 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4910 | 0.0 | 0.0 | 0.6632 | 0.0 | 0.3655 | 0.1828 | 0.0 | nan | 0.0 | 0.3481 | 0.0093 | 0.0 | 0.8663 | 0.7557 | 0.9166 | 0.0 | 0.0129 | 0.2718 | 0.0 | | 0.4748 | 2.6 | 520 | 0.6341 | 0.2957 | 0.3561 | 0.8299 | nan | 0.8109 | 0.9485 | 0.4931 | 0.6152 | 0.3592 | nan | 0.5488 | 0.8282 | 0.0 | 0.9279 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6551 | 0.0 | 0.0 | 0.9281 | 0.0 | 0.4952 | 0.1922 | 0.0 | nan | 0.0 | 0.4332 | 0.0063 | 0.0 | 0.9320 | 0.8761 | 0.9674 | 0.0 | 0.0217 | 0.3563 | 0.0 | nan | 0.6506 | 0.8292 | 0.4840 | 0.5546 | 0.2594 | nan | 0.4223 | 0.4598 | 0.0 | 0.8118 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4897 | 0.0 | 0.0 | 0.6874 | 0.0 | 0.4248 | 0.1705 | 0.0 | nan | 0.0 | 0.3533 | 0.0063 | 0.0 | 0.8680 | 0.7591 | 0.9237 | 0.0 | 0.0187 | 0.2890 | 0.0 | | 0.3089 | 2.7 | 540 | 0.6322 | 0.3046 | 0.3687 | 0.8342 | nan | 0.7583 | 0.9454 | 0.4860 | 0.7492 | 0.4990 | nan | 0.4709 | 0.8254 | 0.0 | 0.9293 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6323 | 0.0 | 0.0 | 0.8981 | 0.0107 | 0.5664 | 0.2286 | 0.0 | nan | 0.0 | 0.4848 | 0.0162 | 0.0 | 0.9488 | 0.8615 | 0.9829 | 0.0 | 0.0939 | 0.4090 | 0.0 | nan | 0.6480 | 0.8330 | 0.4586 | 0.6005 | 0.3357 | nan | 0.3928 | 0.4786 | 0.0 | 0.8041 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5014 | 0.0 | 0.0 | 0.7103 | 0.0106 | 0.4791 | 0.2035 | 0.0 | nan | 0.0 | 0.3768 | 0.0159 | 0.0 | 0.8618 | 0.7505 | 0.9134 | 0.0 | 0.0524 | 0.3200 | 0.0 | | 1.2466 | 2.8 | 560 | 0.6182 | 0.3016 | 0.3657 | 0.8295 | nan | 0.8185 | 0.9364 | 0.4282 | 0.6726 | 0.4169 | nan | 0.5586 | 0.8235 | 0.0 | 0.9237 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6627 | 0.0 | 0.0 | 0.9047 | 0.0223 | 0.4594 | 0.3392 | 0.0 | nan | 0.0 | 0.4488 | 0.0188 | 0.0 | 0.9480 | 0.7482 | 0.9804 | 0.0000 | 0.0888 | 0.5031 | 0.0 | nan | 0.6578 | 0.8283 | 0.4222 | 0.6020 | 0.3158 | nan | 0.4153 | 0.5162 | 0.0 | 0.8096 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4946 | 0.0 | 0.0 | 0.7006 | 0.0211 | 0.4068 | 0.2277 | 0.0 | nan | 0.0 | 0.3612 | 0.0184 | 
0.0 | 0.8493 | 0.6842 | 0.9195 | 0.0000 | 0.0515 | 0.3479 | 0.0 | | 0.3471 | 2.9 | 580 | 0.6088 | 0.3130 | 0.3846 | 0.8405 | nan | 0.7740 | 0.9325 | 0.8216 | 0.7382 | 0.5086 | nan | 0.5655 | 0.8194 | 0.0 | 0.9507 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6233 | 0.0 | 0.0 | 0.9001 | 0.0216 | 0.5836 | 0.3403 | 0.0 | nan | 0.0 | 0.3942 | 0.0267 | 0.0 | 0.9301 | 0.8750 | 0.9710 | 0.0 | 0.0167 | 0.5157 | 0.0 | nan | 0.6756 | 0.8458 | 0.4939 | 0.6495 | 0.3248 | nan | 0.4361 | 0.5502 | 0.0 | 0.7944 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4773 | 0.0 | 0.0 | 0.7238 | 0.0215 | 0.4953 | 0.2605 | 0.0 | nan | 0.0 | 0.3396 | 0.0261 | 0.0 | 0.8604 | 0.7330 | 0.9247 | 0.0 | 0.0149 | 0.3694 | 0.0 | | 0.447 | 3.0 | 600 | 0.6063 | 0.3064 | 0.3674 | 0.8408 | nan | 0.7719 | 0.9575 | 0.7510 | 0.7675 | 0.3209 | nan | 0.5208 | 0.8243 | 0.0 | 0.9409 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6379 | 0.0 | 0.0 | 0.8865 | 0.0032 | 0.4287 | 0.1902 | 0.0 | nan | 0.0 | 0.4460 | 0.0301 | 0.0 | 0.9650 | 0.8454 | 0.9800 | 0.0000 | 0.0084 | 0.4821 | 0.0 | nan | 0.6733 | 0.8440 | 0.6036 | 0.6185 | 0.2724 | nan | 0.4230 | 0.4980 | 0.0 | 0.8106 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5051 | 0.0 | 0.0 | 0.7000 | 0.0032 | 0.3955 | 0.1754 | 0.0 | nan | 0.0 | 0.3558 | 0.0298 | 0.0 | 0.8493 | 0.7531 | 0.9206 | 0.0000 | 0.0075 | 0.3646 | 0.0 | | 0.3025 | 3.1 | 620 | 0.6267 | 0.3100 | 0.3841 | 0.8346 | nan | 0.7058 | 0.9394 | 0.7532 | 0.8210 | 0.4279 | nan | 0.5254 | 0.8072 | 0.0 | 0.9292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6649 | 0.0 | 0.0 | 0.8850 | 0.0367 | 0.6998 | 0.3119 | 0.0 | nan | 0.0 | 0.4649 | 0.0061 | 0.0 | 0.9281 | 0.8715 | 0.9689 | 0.0019 | 0.0069 | 0.5364 | 0.0 | nan | 0.6314 | 0.8461 | 0.5290 | 0.5602 | 0.3011 | nan | 0.4210 | 0.5059 | 0.0 | 0.8239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4858 | 0.0 | 0.0 | 0.7287 | 0.0366 | 0.5161 | 0.2431 | 0.0 | nan | 0.0 | 0.3731 | 0.0061 | 0.0 | 0.8683 | 0.7459 | 0.9288 | 0.0018 | 0.0063 | 0.3594 | 0.0 | | 0.5402 | 3.2 | 640 | 0.6114 | 0.3148 | 0.3742 | 0.8413 | nan | 0.8162 | 0.9555 | 0.7193 | 0.6148 | 0.4400 | nan | 0.5070 | 
0.8157 | 0.0 | 0.9328 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6549 | 0.0 | 0.0 | 0.9089 | 0.0828 | 0.4958 | 0.3236 | 0.0 | nan | 0.0 | 0.3724 | 0.0483 | 0.0 | 0.9490 | 0.8336 | 0.9762 | 0.0 | 0.0212 | 0.5066 | 0.0 | nan | 0.6801 | 0.8399 | 0.6177 | 0.5636 | 0.3160 | nan | 0.4129 | 0.5409 | 0.0 | 0.8207 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5088 | 0.0 | 0.0 | 0.7141 | 0.0751 | 0.4367 | 0.2617 | 0.0 | nan | 0.0 | 0.3152 | 0.0471 | 0.0 | 0.8678 | 0.7556 | 0.9279 | 0.0 | 0.0185 | 0.3540 | 0.0 | | 0.3071 | 3.3 | 660 | 0.6226 | 0.3122 | 0.3814 | 0.8409 | nan | 0.7812 | 0.9437 | 0.7351 | 0.7543 | 0.4146 | nan | 0.5299 | 0.8407 | 0.0 | 0.9495 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7120 | 0.0 | 0.0 | 0.9031 | 0.0202 | 0.4934 | 0.3704 | 0.0 | nan | 0.0 | 0.3671 | 0.0979 | 0.0 | 0.9470 | 0.8862 | 0.9775 | 0.0 | 0.0764 | 0.4034 | 0.0 | nan | 0.6701 | 0.8389 | 0.5992 | 0.6383 | 0.3214 | nan | 0.4274 | 0.5242 | 0.0 | 0.8016 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4534 | 0.0 | 0.0 | 0.7129 | 0.0199 | 0.4292 | 0.2599 | 0.0 | nan | 0.0 | 0.2979 | 0.0930 | 0.0 | 0.8654 | 0.7525 | 0.9216 | 0.0 | 0.0545 | 0.3106 | 0.0 | | 0.2812 | 3.4 | 680 | 0.5891 | 0.3154 | 0.3853 | 0.8385 | nan | 0.7691 | 0.9259 | 0.7320 | 0.7075 | 0.6098 | nan | 0.6156 | 0.8274 | 0.0 | 0.9228 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6670 | 0.0 | 0.0 | 0.8950 | 0.0003 | 0.7067 | 0.2373 | 0.0 | nan | 0.0 | 0.3845 | 0.0771 | 0.0 | 0.9436 | 0.8482 | 0.9784 | 0.0002 | 0.0573 | 0.4248 | 0.0 | nan | 0.6809 | 0.8459 | 0.6360 | 0.6308 | 0.2685 | nan | 0.4481 | 0.5109 | 0.0 | 0.8121 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4987 | 0.0 | 0.0 | 0.7307 | 0.0003 | 0.5476 | 0.1996 | 0.0 | nan | 0.0 | 0.3162 | 0.0746 | 0.0 | 0.8696 | 0.7530 | 0.9245 | 0.0002 | 0.0378 | 0.3082 | 0.0 | | 0.4997 | 3.5 | 700 | 0.5982 | 0.3166 | 0.3801 | 0.8448 | nan | 0.8607 | 0.9301 | 0.7868 | 0.6661 | 0.4482 | nan | 0.5705 | 0.7812 | 0.0 | 0.9462 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6700 | 0.0 | 0.0 | 0.9333 | 0.0002 | 0.4569 | 0.2193 | 0.0 | nan | 0.0 | 0.4467 | 0.1375 | 0.0 | 0.9381 | 0.8742 | 0.9743 | 0.0 | 0.0891 | 0.4326 
| 0.0 | nan | 0.7075 | 0.8577 | 0.6178 | 0.5973 | 0.2946 | nan | 0.4482 | 0.5701 | 0.0 | 0.8044 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4950 | 0.0 | 0.0 | 0.6931 | 0.0002 | 0.4124 | 0.1985 | 0.0 | nan | 0.0 | 0.3601 | 0.1275 | 0.0 | 0.8717 | 0.7650 | 0.9302 | 0.0 | 0.0483 | 0.3308 | 0.0 | | 0.3472 | 3.6 | 720 | 0.6052 | 0.3213 | 0.3866 | 0.8432 | nan | 0.7653 | 0.9485 | 0.7447 | 0.7379 | 0.4858 | nan | 0.6064 | 0.8020 | 0.0 | 0.9290 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6426 | 0.0 | 0.0 | 0.9297 | 0.0 | 0.6149 | 0.3437 | 0.0 | nan | 0.0 | 0.5040 | 0.1435 | 0.0 | 0.9358 | 0.8249 | 0.9786 | 0.0 | 0.1058 | 0.3279 | 0.0 | nan | 0.6746 | 0.8433 | 0.6430 | 0.6389 | 0.3082 | nan | 0.4471 | 0.5350 | 0.0 | 0.8149 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5024 | 0.0 | 0.0 | 0.7226 | 0.0 | 0.4948 | 0.2599 | 0.0 | nan | 0.0 | 0.3883 | 0.1319 | 0.0 | 0.8688 | 0.7569 | 0.9257 | 0.0 | 0.0589 | 0.2650 | 0.0 | | 0.4252 | 3.7 | 740 | 0.6622 | 0.3123 | 0.3891 | 0.8234 | nan | 0.6217 | 0.9484 | 0.7138 | 0.8309 | 0.4263 | nan | 0.5506 | 0.8600 | 0.0 | 0.9525 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6234 | 0.0 | 0.0 | 0.8508 | 0.0084 | 0.5880 | 0.3145 | 0.0 | nan | 0.0 | 0.4943 | 0.2299 | 0.0 | 0.9380 | 0.9003 | 0.9850 | 0.0 | 0.2103 | 0.4036 | 0.0 | nan | 0.5607 | 0.8413 | 0.6569 | 0.4842 | 0.3210 | nan | 0.4427 | 0.4777 | 0.0 | 0.7951 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4647 | 0.0 | 0.0 | 0.7209 | 0.0084 | 0.5126 | 0.2370 | 0.0 | nan | 0.0 | 0.3795 | 0.2061 | 0.0 | 0.8637 | 0.7505 | 0.9223 | 0.0 | 0.0647 | 0.2846 | 0.0 | | 0.3308 | 3.8 | 760 | 0.6377 | 0.3127 | 0.3808 | 0.8289 | nan | 0.6214 | 0.9625 | 0.5946 | 0.8099 | 0.4454 | nan | 0.5188 | 0.8494 | 0.0 | 0.9136 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6696 | 0.0 | 0.0 | 0.9263 | 0.0024 | 0.6152 | 0.2950 | 0.0 | nan | 0.0027 | 0.4453 | 0.1733 | 0.0 | 0.9520 | 0.8743 | 0.9699 | 0.0004 | 0.1977 | 0.3462 | 0.0 | nan | 0.5763 | 0.8251 | 0.5772 | 0.5411 | 0.3457 | nan | 0.4107 | 0.4836 | 0.0 | 0.8239 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5117 | 0.0 | 0.0 | 0.7262 | 0.0024 | 0.5110 | 0.2294 | 0.0 | nan | 0.0026 | 
0.3668 | 0.1574 | 0.0 | 0.8705 | 0.7696 | 0.9339 | 0.0004 | 0.0602 | 0.2801 | 0.0 | | 0.4693 | 3.9 | 780 | 0.5859 | 0.3216 | 0.3778 | 0.8472 | nan | 0.7901 | 0.9565 | 0.7815 | 0.7387 | 0.4418 | nan | 0.5939 | 0.8066 | 0.0 | 0.9374 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5633 | 0.0 | 0.0 | 0.9404 | 0.0047 | 0.4300 | 0.1846 | 0.0 | nan | 0.0062 | 0.5028 | 0.1681 | 0.0 | 0.9364 | 0.8907 | 0.9751 | 0.0 | 0.0659 | 0.3760 | 0.0 | nan | 0.6844 | 0.8378 | 0.6638 | 0.6608 | 0.3533 | nan | 0.4678 | 0.5651 | 0.0 | 0.8096 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5098 | 0.0 | 0.0 | 0.7056 | 0.0047 | 0.4085 | 0.1746 | 0.0 | nan | 0.0060 | 0.3747 | 0.1550 | 0.0 | 0.8729 | 0.7620 | 0.9327 | 0.0 | 0.0350 | 0.3067 | 0.0 | | 0.4175 | 4.0 | 800 | 0.5300 | 0.3304 | 0.3990 | 0.8474 | nan | 0.8728 | 0.9146 | 0.8169 | 0.6208 | 0.5492 | nan | 0.5932 | 0.8393 | 0.0 | 0.9432 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6227 | 0.0 | 0.0 | 0.8618 | 0.0183 | 0.7515 | 0.3741 | 0.0 | nan | 0.0 | 0.4608 | 0.2043 | 0.0 | 0.9611 | 0.8383 | 0.9787 | 0.0 | 0.1044 | 0.4406 | 0.0 | nan | 0.7017 | 0.8579 | 0.6370 | 0.5716 | 0.3295 | nan | 0.4682 | 0.5922 | 0.0 | 0.8053 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5317 | 0.0 | 0.0 | 0.7427 | 0.0183 | 0.5577 | 0.2733 | 0.0 | nan | 0.0 | 0.3651 | 0.1856 | 0.0 | 0.8612 | 0.7375 | 0.9269 | 0.0 | 0.0746 | 0.3346 | 0.0 | | 0.2417 | 4.1 | 820 | 0.6240 | 0.3189 | 0.3775 | 0.8361 | nan | 0.8131 | 0.9538 | 0.7682 | 0.4531 | 0.4900 | nan | 0.5241 | 0.8388 | 0.0 | 0.9496 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5933 | 0.0 | 0.0 | 0.9136 | 0.0182 | 0.4895 | 0.2692 | 0.0 | nan | 0.0003 | 0.4741 | 0.2422 | 0.0 | 0.9537 | 0.8414 | 0.9750 | 0.0 | 0.0438 | 0.4760 | 0.0 | nan | 0.6420 | 0.8400 | 0.6500 | 0.4374 | 0.3497 | nan | 0.4397 | 0.5665 | 0.0 | 0.8008 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5243 | 0.0 | 0.0 | 0.7188 | 0.0182 | 0.4395 | 0.2415 | 0.0 | nan | 0.0003 | 0.3859 | 0.2129 | 0.0 | 0.8699 | 0.7533 | 0.9329 | 0.0 | 0.0325 | 0.3490 | 0.0 | | 0.2375 | 4.2 | 840 | 0.5756 | 0.3343 | 0.4028 | 0.8498 | nan | 0.8028 | 0.9399 | 0.7560 | 0.8128 | 
0.5208 | nan | 0.5758 | 0.8137 | 0.0 | 0.9266 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6480 | 0.0 | 0.0 | 0.9155 | 0.0155 | 0.5142 | 0.4670 | 0.0 | nan | 0.0 | 0.4630 | 0.3347 | 0.0 | 0.9205 | 0.8892 | 0.9875 | 0.0 | 0.1196 | 0.4656 | 0.0 | nan | 0.7055 | 0.8568 | 0.6719 | 0.6537 | 0.3342 | nan | 0.4664 | 0.5856 | 0.0 | 0.8130 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5293 | 0.0 | 0.0 | 0.7177 | 0.0154 | 0.4507 | 0.2814 | 0.0 | nan | 0.0 | 0.3815 | 0.2734 | 0.0 | 0.8704 | 0.7453 | 0.9204 | 0.0 | 0.0711 | 0.3540 | 0.0 | | 0.4241 | 4.3 | 860 | 0.5682 | 0.3289 | 0.3927 | 0.8526 | nan | 0.8019 | 0.9477 | 0.7645 | 0.7864 | 0.4614 | nan | 0.6374 | 0.8213 | 0.0 | 0.9414 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6449 | 0.0 | 0.0 | 0.9389 | 0.0015 | 0.5075 | 0.1875 | 0.0 | nan | 0.0060 | 0.4489 | 0.3299 | 0.0 | 0.9467 | 0.8621 | 0.9705 | 0.0 | 0.1149 | 0.4442 | 0.0 | nan | 0.7105 | 0.8518 | 0.6212 | 0.6864 | 0.3408 | nan | 0.4810 | 0.5663 | 0.0 | 0.8198 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4875 | 0.0 | 0.0 | 0.7131 | 0.0015 | 0.4607 | 0.1773 | 0.0 | nan | 0.0057 | 0.3741 | 0.2755 | 0.0 | 0.8731 | 0.7626 | 0.9348 | 0.0 | 0.0422 | 0.3375 | 0.0 | | 0.5282 | 4.4 | 880 | 0.6106 | 0.3241 | 0.3981 | 0.8456 | nan | 0.7704 | 0.9356 | 0.8287 | 0.8018 | 0.5745 | nan | 0.5025 | 0.7925 | 0.0 | 0.9564 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6370 | 0.0 | 0.0 | 0.9097 | 0.0360 | 0.5623 | 0.4363 | 0.0 | nan | 0.0 | 0.4484 | 0.2753 | 0.0 | 0.9503 | 0.7920 | 0.9724 | 0.0009 | 0.1229 | 0.4328 | 0.0 | nan | 0.7081 | 0.8635 | 0.4271 | 0.6627 | 0.3582 | nan | 0.4362 | 0.6102 | 0.0 | 0.8004 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4666 | 0.0 | 0.0 | 0.7321 | 0.0353 | 0.4640 | 0.2690 | 0.0 | nan | 0.0 | 0.3772 | 0.2455 | 0.0 | 0.8642 | 0.7079 | 0.9363 | 0.0009 | 0.0663 | 0.3381 | 0.0 | | 0.3367 | 4.5 | 900 | 0.5852 | 0.3273 | 0.3859 | 0.8544 | nan | 0.8327 | 0.9558 | 0.6822 | 0.7709 | 0.3897 | nan | 0.6299 | 0.7857 | 0.0 | 0.9039 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6906 | 0.0 | 0.0 | 0.9428 | 0.0104 | 0.4495 | 0.3420 | 0.0 | nan | 0.0 | 0.4741 | 0.2323 | 0.0 | 0.9426 | 0.8893 | 
0.9785 | 0.0 | 0.0100 | 0.4372 | 0.0 | nan | 0.7217 | 0.8512 | 0.6538 | 0.6746 | 0.3280 | nan | 0.4868 | 0.5682 | 0.0 | 0.8238 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4554 | 0.0 | 0.0 | 0.7065 | 0.0103 | 0.4132 | 0.2616 | 0.0 | nan | 0.0 | 0.3935 | 0.2107 | 0.0 | 0.8720 | 0.7465 | 0.9320 | 0.0 | 0.0089 | 0.3559 | 0.0 | | 0.1462 | 4.6 | 920 | 0.5898 | 0.3302 | 0.3945 | 0.8517 | nan | 0.8338 | 0.9321 | 0.7807 | 0.7720 | 0.5273 | nan | 0.5959 | 0.8227 | 0.0 | 0.9378 | 0.0 | 0.0 | 0.0 | 0.0 | 0.7047 | 0.0 | 0.0 | 0.9415 | 0.0315 | 0.4600 | 0.3444 | 0.0 | nan | 0.0 | 0.4654 | 0.2423 | 0.0 | 0.9457 | 0.8191 | 0.9787 | 0.0002 | 0.0175 | 0.4721 | 0.0 | nan | 0.7254 | 0.8518 | 0.6417 | 0.6844 | 0.3375 | nan | 0.4761 | 0.5773 | 0.0 | 0.8160 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4945 | 0.0 | 0.0 | 0.7011 | 0.0308 | 0.4212 | 0.2757 | 0.0 | nan | 0.0 | 0.3946 | 0.2157 | 0.0 | 0.8680 | 0.7422 | 0.9321 | 0.0002 | 0.0144 | 0.3644 | 0.0 | | 0.4018 | 4.7 | 940 | 0.6261 | 0.3313 | 0.4006 | 0.8471 | nan | 0.7361 | 0.9560 | 0.8252 | 0.7443 | 0.4880 | nan | 0.5874 | 0.7623 | 0.0 | 0.9292 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6837 | 0.0 | 0.0 | 0.9315 | 0.0560 | 0.5691 | 0.3433 | 0.0 | nan | 0.0 | 0.4978 | 0.3762 | 0.0 | 0.9313 | 0.9096 | 0.9814 | 0.0004 | 0.0463 | 0.4639 | 0.0 | nan | 0.6786 | 0.8459 | 0.4753 | 0.6694 | 0.3442 | nan | 0.4817 | 0.6111 | 0.0 | 0.8128 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4944 | 0.0 | 0.0 | 0.7258 | 0.0547 | 0.4986 | 0.2692 | 0.0 | nan | 0.0 | 0.3990 | 0.3001 | 0.0 | 0.8752 | 0.7656 | 0.9259 | 0.0004 | 0.0314 | 0.3436 | 0.0 | | 0.4323 | 4.8 | 960 | 0.6071 | 0.3369 | 0.4009 | 0.8527 | nan | 0.8909 | 0.9288 | 0.7706 | 0.7429 | 0.3946 | nan | 0.5634 | 0.8032 | 0.0 | 0.9419 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6484 | 0.0 | 0.0 | 0.8991 | 0.1415 | 0.6081 | 0.2444 | 0.0 | nan | 0.0739 | 0.5198 | 0.4198 | 0.0 | 0.9632 | 0.7166 | 0.9803 | 0.0110 | 0.0255 | 0.5409 | 0.0 | nan | 0.7203 | 0.8627 | 0.6071 | 0.6653 | 0.3017 | nan | 0.4226 | 0.5991 | 0.0 | 0.8170 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4908 | 0.0 | 0.0 | 0.7391 | 
0.1318 | 0.5113 | 0.2209 | 0.0 | nan | 0.0611 | 0.4137 | 0.3238 | 0.0 | 0.8530 | 0.6852 | 0.9328 | 0.0093 | 0.0178 | 0.3944 | 0.0 | | 0.2333 | 4.9 | 980 | 0.6312 | 0.3291 | 0.4049 | 0.8368 | nan | 0.6661 | 0.9599 | 0.7668 | 0.8162 | 0.4021 | nan | 0.5460 | 0.8279 | 0.0 | 0.9377 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6167 | 0.0 | 0.0 | 0.8637 | 0.0490 | 0.7147 | 0.3548 | 0.0 | nan | 0.0487 | 0.4632 | 0.4447 | 0.0 | 0.9449 | 0.8856 | 0.9798 | 0.0024 | 0.2556 | 0.4098 | 0.0 | nan | 0.6061 | 0.8459 | 0.6123 | 0.5262 | 0.3227 | nan | 0.4682 | 0.5298 | 0.0 | 0.8213 | 0.0 | 0.0 | 0.0 | 0.0 | 0.4777 | 0.0 | 0.0 | 0.7401 | 0.0488 | 0.5668 | 0.2560 | 0.0 | nan | 0.0474 | 0.3885 | 0.3212 | 0.0 | 0.8738 | 0.7688 | 0.9317 | 0.0021 | 0.0662 | 0.3102 | 0.0 | | 0.322 | 5.0 | 1000 | 0.5919 | 0.3324 | 0.3983 | 0.8527 | nan | 0.8795 | 0.9454 | 0.7501 | 0.6501 | 0.4403 | nan | 0.6006 | 0.8655 | 0.0 | 0.9171 | 0.0 | 0.0 | 0.0 | 0.0 | 0.6457 | 0.0 | 0.0 | 0.9125 | 0.0151 | 0.4907 | 0.2739 | 0.0 | nan | 0.0 | 0.5307 | 0.4038 | 0.0 | 0.9539 | 0.8585 | 0.9824 | 0.0017 | 0.2231 | 0.4063 | 0.0 | nan | 0.7358 | 0.8591 | 0.6885 | 0.6124 | 0.2908 | nan | 0.4820 | 0.5026 | 0.0 | 0.8247 | 0.0 | 0.0 | 0.0 | 0.0 | 0.5430 | 0.0 | 0.0 | 0.7227 | 0.0151 | 0.4502 | 0.2349 | 0.0 | nan | 0.0 | 0.4150 | 0.3229 | 0.0 | 0.8710 | 0.7707 | 0.9332 | 0.0017 | 0.0580 | 0.3015 | 0.0 | | 0.3299 | 5.1 | 1020 | 0.5630 | 0.3381 | 0.4067 | 0.8554 | nan | 0.8362 | 0.9417 | 0.7732 | 0.7712 | 0.4461 | nan | 0.6305 | 0.8050 | 0.0042 | 0.9430 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.6503 | 0.0 | 0.0 | 0.9244 | 0.0045 | 0.5766 | 0.3341 | 0.0 | nan | 0.0 | 0.5153 | 0.4275 | 0.0 | 0.9401 | 0.8734 | 0.9810 | 0.0002 | 0.2593 | 0.3754 | 0.0 | nan | 0.7239 | 0.8683 | 0.6261 | 0.6428 | 0.3037 | nan | 0.5023 | 0.6126 | 0.0030 | 0.8145 | 0.0011 | 0.0 | 0.0 | 0.0 | 0.5157 | 0.0 | 0.0 | 0.7251 | 0.0044 | 0.4858 | 0.2659 | 0.0 | nan | 0.0 | 0.4109 | 0.3378 | 0.0 | 0.8759 | 0.7719 | 0.9339 | 0.0002 | 0.0783 | 0.3135 | 0.0 | | 0.2486 | 5.2 | 1040 | 0.5781 | 
0.3354 | 0.4008 | 0.8553 | nan | 0.8114 | 0.9444 | 0.7999 | 0.8031 | 0.4525 | nan | 0.5980 | 0.7806 | 0.0264 | 0.9194 | 0.0031 | 0.0 | 0.0 | 0.0 | 0.6433 | 0.0 | 0.0 | 0.9335 | 0.0814 | 0.5906 | 0.2020 | 0.0 | nan | 0.0002 | 0.4619 | 0.4333 | 0.0 | 0.9473 | 0.8681 | 0.9758 | 0.0013 | 0.0796 | 0.4692 | 0.0 | nan | 0.7074 | 0.8746 | 0.5776 | 0.6471 | 0.3072 | nan | 0.4838 | 0.5905 | 0.0113 | 0.8288 | 0.0031 | 0.0 | 0.0 | 0.0 | 0.5141 | 0.0 | 0.0 | 0.7319 | 0.0772 | 0.4996 | 0.1780 | 0.0 | nan | 0.0002 | 0.3924 | 0.3248 | 0.0 | 0.8743 | 0.7747 | 0.9360 | 0.0012 | 0.0425 | 0.3553 | 0.0 | | 0.2187 | 5.3 | 1060 | 0.5951 | 0.3310 | 0.3996 | 0.8521 | nan | 0.8741 | 0.9406 | 0.7747 | 0.6985 | 0.4974 | nan | 0.5988 | 0.7802 | 0.0157 | 0.9384 | 0.0049 | 0.0 | 0.0 | 0.0 | 0.6773 | 0.0 | 0.0 | 0.9064 | 0.1097 | 0.4512 | 0.2733 | 0.0 | nan | 0.0080 | 0.3901 | 0.4226 | 0.0 | 0.9569 | 0.8016 | 0.9778 | 0.0022 | 0.2137 | 0.4736 | 0.0 | nan | 0.7341 | 0.8672 | 0.6043 | 0.6460 | 0.3411 | nan | 0.4676 | 0.5519 | 0.0038 | 0.8196 | 0.0048 | 0.0 | 0.0 | 0.0 | 0.5047 | 0.0 | 0.0 | 0.7232 | 0.1031 | 0.4142 | 0.2260 | 0.0 | nan | 0.0077 | 0.3316 | 0.3239 | 0.0 | 0.8641 | 0.7379 | 0.9360 | 0.0020 | 0.0542 | 0.3228 | 0.0 | | 1.1803 | 5.4 | 1080 | 0.6495 | 0.3336 | 0.3988 | 0.8391 | nan | 0.6558 | 0.9554 | 0.7805 | 0.8250 | 0.5203 | nan | 0.5736 | 0.8174 | 0.0056 | 0.9396 | 0.0066 | 0.0 | 0.0 | 0.0 | 0.6255 | 0.0 | 0.0 | 0.9273 | 0.0083 | 0.5809 | 0.3201 | 0.0 | nan | 0.0 | 0.5183 | 0.4578 | 0.0 | 0.9433 | 0.8644 | 0.9790 | 0.0010 | 0.0665 | 0.3882 | 0.0 | nan | 0.6035 | 0.8349 | 0.6789 | 0.5849 | 0.3641 | nan | 0.4660 | 0.5953 | 0.0038 | 0.8272 | 0.0064 | 0.0 | 0.0 | 0.0 | 0.5366 | 0.0 | 0.0 | 0.7289 | 0.0081 | 0.4550 | 0.2526 | 0.0 | nan | 0.0 | 0.4133 | 0.3445 | 0.0 | 0.8748 | 0.7729 | 0.9337 | 0.0010 | 0.0502 | 0.3385 | 0.0 | | 0.2229 | 5.5 | 1100 | 0.5565 | 0.3363 | 0.4115 | 0.8594 | nan | 0.8699 | 0.9352 | 0.8052 | 0.7833 | 0.4652 | nan | 0.6304 | 0.7524 | 0.0693 | 0.9367 | 0.0196 | 0.0 
| 0.0 | 0.0 | 0.7243 | 0.0 | 0.0 | 0.9247 | 0.0419 | 0.5269 | 0.3136 | 0.0 | nan | 0.0 | 0.5243 | 0.4829 | 0.0 | 0.9406 | 0.9026 | 0.9807 | 0.0009 | 0.1594 | 0.3776 | 0.0 | nan | 0.7612 | 0.8653 | 0.5482 | 0.6938 | 0.3423 | nan | 0.4929 | 0.5383 | 0.0143 | 0.8332 | 0.0188 | 0.0 | 0.0 | 0.0 | 0.5054 | 0.0 | 0.0 | 0.7323 | 0.0410 | 0.4596 | 0.2366 | 0.0 | nan | 0.0 | 0.4012 | 0.3332 | 0.0 | 0.8753 | 0.7496 | 0.9344 | 0.0009 | 0.0710 | 0.3135 | 0.0 | | 0.3822 | 5.6 | 1120 | 0.5669 | 0.3459 | 0.4115 | 0.8612 | nan | 0.8962 | 0.9368 | 0.7621 | 0.7845 | 0.5097 | nan | 0.4860 | 0.8278 | 0.0 | 0.9454 | 0.0269 | 0.0 | 0.0 | 0.0 | 0.6764 | 0.0 | 0.0 | 0.9072 | 0.1391 | 0.5740 | 0.2863 | 0.0 | nan | 0.0011 | 0.5018 | 0.4754 | 0.0 | 0.9440 | 0.8610 | 0.9885 | 0.0011 | 0.1953 | 0.4424 | 0.0 | nan | 0.7435 | 0.8646 | 0.7032 | 0.6904 | 0.3584 | nan | 0.4298 | 0.5766 | 0.0 | 0.8297 | 0.0261 | 0.0 | 0.0 | 0.0 | 0.5415 | 0.0 | 0.0 | 0.7357 | 0.1221 | 0.4772 | 0.2503 | 0.0 | nan | 0.0010 | 0.4117 | 0.3376 | 0.0 | 0.8797 | 0.7681 | 0.9209 | 0.0010 | 0.0649 | 0.3353 | 0.0 | | 0.2649 | 5.7 | 1140 | 0.5898 | 0.3381 | 0.4023 | 0.8563 | nan | 0.7976 | 0.9442 | 0.7650 | 0.7640 | 0.5549 | nan | 0.6973 | 0.8434 | 0.0 | 0.9444 | 0.0094 | 0.0 | 0.0 | 0.0 | 0.6017 | 0.0 | 0.0 | 0.9358 | 0.0022 | 0.5838 | 0.3365 | 0.0 | nan | 0.0000 | 0.4293 | 0.3976 | 0.0 | 0.9468 | 0.8714 | 0.9810 | 0.0012 | 0.0478 | 0.4181 | 0.0 | nan | 0.7151 | 0.8569 | 0.7113 | 0.6623 | 0.3494 | nan | 0.5183 | 0.5293 | 0.0 | 0.8284 | 0.0093 | 0.0 | 0.0 | 0.0 | 0.5048 | 0.0 | 0.0 | 0.7321 | 0.0022 | 0.4675 | 0.2713 | 0.0 | nan | 0.0000 | 0.3827 | 0.3276 | 0.0 | 0.8788 | 0.7549 | 0.9334 | 0.0012 | 0.0408 | 0.3430 | 0.0 | | 0.36 | 5.8 | 1160 | 0.5610 | 0.3374 | 0.4024 | 0.8560 | nan | 0.8640 | 0.9259 | 0.7671 | 0.7668 | 0.6144 | nan | 0.5556 | 0.8438 | 0.0 | 0.9443 | 0.0083 | 0.0 | 0.0 | 0.0 | 0.6159 | 0.0 | 0.0 | 0.9392 | 0.0182 | 0.5002 | 0.2419 | 0.0 | nan | 0.0076 | 0.5526 | 0.4234 | 0.0 | 0.9436 | 0.8715 | 0.9810 | 0.0 | 
0.1199 | 0.3711 | 0.0 | nan | 0.7443 | 0.8662 | 0.6964 | 0.6839 | 0.2997 | nan | 0.4887 | 0.5272 | 0.0 | 0.8304 | 0.0082 | 0.0 | 0.0 | 0.0 | 0.5343 | 0.0 | 0.0 | 0.7226 | 0.0181 | 0.4603 | 0.2237 | 0.0 | nan | 0.0071 | 0.4219 | 0.3267 | 0.0 | 0.8794 | 0.7650 | 0.9339 | 0.0 | 0.0503 | 0.3097 | 0.0 | | 0.1851 | 5.9 | 1180 | 0.5901 | 0.3316 | 0.4029 | 0.8500 | nan | 0.7504 | 0.9589 | 0.7043 | 0.8477 | 0.4014 | nan | 0.5987 | 0.8495 | 0.0 | 0.9416 | 0.0057 | 0.0 | 0.0 | 0.0 | 0.7332 | 0.0 | 0.0 | 0.9259 | 0.0010 | 0.5587 | 0.3454 | 0.0 | nan | 0.0001 | 0.5006 | 0.4310 | 0.0 | 0.9551 | 0.8353 | 0.9730 | 0.0 | 0.2293 | 0.3470 | 0.0 | nan | 0.6760 | 0.8659 | 0.6777 | 0.5612 | 0.3372 | nan | 0.5049 | 0.4991 | 0.0 | 0.8348 | 0.0057 | 0.0 | 0.0 | 0.0 | 0.4929 | 0.0 | 0.0 | 0.7364 | 0.0010 | 0.4741 | 0.2656 | 0.0 | nan | 0.0001 | 0.4060 | 0.3345 | 0.0 | 0.8720 | 0.7557 | 0.9380 | 0.0 | 0.0757 | 0.2966 | 0.0 | | 0.2929 | 6.0 | 1200 | 0.5772 | 0.3361 | 0.4214 | 0.8502 | nan | 0.7678 | 0.9437 | 0.8101 | 0.7995 | 0.5434 | nan | 0.6081 | 0.8514 | 0.0 | 0.9448 | 0.0044 | 0.0 | 0.0 | 0.0 | 0.7424 | 0.0 | 0.0 | 0.8567 | 0.1035 | 0.7333 | 0.3929 | 0.0 | nan | 0.1165 | 0.4942 | 0.4598 | 0.0 | 0.9381 | 0.8854 | 0.9776 | 0.0 | 0.1032 | 0.4068 | 0.0 | nan | 0.6929 | 0.8728 | 0.5952 | 0.6048 | 0.3191 | nan | 0.4872 | 0.4926 | 0.0 | 0.8332 | 0.0043 | 0.0 | 0.0 | 0.0 | 0.4247 | 0.0 | 0.0 | 0.7432 | 0.1007 | 0.5006 | 0.3004 | 0.0 | nan | 0.0834 | 0.4060 | 0.3476 | 0.0 | 0.8751 | 0.7607 | 0.9382 | 0.0 | 0.0532 | 0.3191 | 0.0 | | 0.1837 | 6.1 | 1220 | 0.5619 | 0.3425 | 0.4019 | 0.8599 | nan | 0.8292 | 0.9577 | 0.7582 | 0.7946 | 0.4884 | nan | 0.5736 | 0.8172 | 0.0 | 0.9262 | 0.0073 | 0.0 | 0.0 | 0.0 | 0.6724 | 0.0 | 0.0 | 0.9509 | 0.0249 | 0.4688 | 0.4031 | 0.0 | nan | 0.0112 | 0.4901 | 0.4293 | 0.0 | 0.9458 | 0.8549 | 0.9790 | 0.0001 | 0.0968 | 0.3822 | 0.0 | nan | 0.7390 | 0.8602 | 0.6756 | 0.6938 | 0.3610 | nan | 0.4749 | 0.5803 | 0.0 | 0.8379 | 0.0071 | 0.0 | 0.0 | 0.0 | 0.5735 | 0.0 | 0.0 
| 0.7104 | 0.0247 | 0.4235 | 0.2840 | 0.0 | nan | 0.0101 | 0.4083 | 0.3263 | 0.0 | 0.8765 | 0.7763 | 0.9354 | 0.0001 | 0.0555 | 0.3266 | 0.0 | | 0.2823 | 6.2 | 1240 | 0.5561 | 0.3477 | 0.4153 | 0.8616 | nan | 0.8684 | 0.9325 | 0.7535 | 0.7614 | 0.5858 | nan | 0.6483 | 0.8377 | 0.0 | 0.9540 | 0.0172 | 0.0 | 0.0 | 0.0 | 0.6799 | 0.0 | 0.0 | 0.9022 | 0.0421 | 0.6256 | 0.2876 | 0.0 | nan | 0.0021 | 0.5645 | 0.4458 | 0.0 | 0.9561 | 0.8393 | 0.9661 | 0.0 | 0.1596 | 0.4593 | 0.0 | nan | 0.7564 | 0.8604 | 0.6944 | 0.6911 | 0.3387 | nan | 0.5181 | 0.5511 | 0.0 | 0.8288 | 0.0169 | 0.0 | 0.0 | 0.0 | 0.5345 | 0.0 | 0.0 | 0.7512 | 0.0414 | 0.5198 | 0.2474 | 0.0 | nan | 0.0020 | 0.4409 | 0.3486 | 0.0 | 0.8731 | 0.7634 | 0.9364 | 0.0 | 0.0737 | 0.3371 | 0.0 | | 0.1737 | 6.3 | 1260 | 0.5944 | 0.3440 | 0.4109 | 0.8589 | nan | 0.8164 | 0.9394 | 0.7514 | 0.8265 | 0.5133 | nan | 0.5731 | 0.8081 | 0.0049 | 0.9528 | 0.0190 | 0.0 | 0.0 | 0.0 | 0.6786 | 0.0 | 0.0 | 0.9077 | 0.0189 | 0.6365 | 0.4641 | 0.0 | nan | 0.0003 | 0.5127 | 0.4451 | 0.0 | 0.9562 | 0.8655 | 0.9817 | 0.0 | 0.0706 | 0.4066 | 0.0 | nan | 0.7103 | 0.8631 | 0.6924 | 0.6440 | 0.3634 | nan | 0.4701 | 0.5896 | 0.0033 | 0.8266 | 0.0181 | 0.0 | 0.0 | 0.0 | 0.4933 | 0.0 | 0.0 | 0.7567 | 0.0189 | 0.5532 | 0.2877 | 0.0 | nan | 0.0003 | 0.4161 | 0.3459 | 0.0 | 0.8718 | 0.7740 | 0.9335 | 0.0 | 0.0414 | 0.3359 | 0.0 | | 0.2331 | 6.4 | 1280 | 0.5757 | 0.3439 | 0.4080 | 0.8642 | nan | 0.8474 | 0.9577 | 0.7499 | 0.8125 | 0.3469 | nan | 0.6753 | 0.8720 | 0.0 | 0.9422 | 0.0256 | 0.0 | 0.0 | 0.0 | 0.6022 | 0.0 | 0.0 | 0.9147 | 0.0303 | 0.6387 | 0.2659 | 0.0 | nan | 0.0033 | 0.5037 | 0.5161 | 0.0 | 0.9533 | 0.8503 | 0.9761 | 0.0 | 0.1375 | 0.4339 | 0.0 | nan | 0.7464 | 0.8625 | 0.7014 | 0.6901 | 0.2919 | nan | 0.5113 | 0.4932 | 0.0 | 0.8415 | 0.0242 | 0.0 | 0.0 | 0.0 | 0.5310 | 0.0 | 0.0 | 0.7484 | 0.0301 | 0.5440 | 0.2362 | 0.0 | nan | 0.0031 | 0.4189 | 0.3563 | 0.0 | 0.8716 | 0.7666 | 0.9388 | 0.0 | 0.0665 | 0.3304 | 0.0 | | 0.2482 | 6.5 
| 1300 | 0.5652 | 0.3439 | 0.4139 | 0.8565 | nan | 0.8961 | 0.9318 | 0.7939 | 0.6718 | 0.5130 | nan | 0.6420 | 0.8713 | 0.0 | 0.9445 | 0.0252 | 0.0 | 0.0 | 0.0 | 0.6115 | 0.0 | 0.0 | 0.9118 | 0.1801 | 0.5194 | 0.3921 | 0.0 | nan | 0.0787 | 0.5022 | 0.4451 | 0.0 | 0.9485 | 0.8205 | 0.9784 | 0.0000 | 0.1026 | 0.4647 | 0.0 | nan | 0.7433 | 0.8670 | 0.6723 | 0.6234 | 0.3394 | nan | 0.4819 | 0.4912 | 0.0 | 0.8309 | 0.0242 | 0.0 | 0.0 | 0.0 | 0.5364 | 0.0 | 0.0 | 0.7400 | 0.1646 | 0.4697 | 0.2621 | 0.0 | nan | 0.0725 | 0.4230 | 0.3409 | 0.0 | 0.8691 | 0.7309 | 0.9383 | 0.0000 | 0.0509 | 0.3325 | 0.0 | | 0.2564 | 6.6 | 1320 | 0.5605 | 0.3430 | 0.4096 | 0.8605 | nan | 0.8574 | 0.9411 | 0.7728 | 0.7655 | 0.5374 | nan | 0.6314 | 0.8056 | 0.0 | 0.9448 | 0.0233 | 0.0 | 0.0 | 0.0 | 0.6626 | 0.0 | 0.0 | 0.9296 | 0.0727 | 0.5102 | 0.3465 | 0.0 | nan | 0.0388 | 0.5049 | 0.4320 | 0.0 | 0.9485 | 0.8772 | 0.9762 | 0.0105 | 0.1357 | 0.3834 | 0.0 | nan | 0.7479 | 0.8716 | 0.6437 | 0.6747 | 0.3443 | nan | 0.5198 | 0.5438 | 0.0 | 0.8378 | 0.0228 | 0.0 | 0.0 | 0.0 | 0.5055 | 0.0 | 0.0 | 0.7366 | 0.0677 | 0.4568 | 0.2458 | 0.0 | nan | 0.0316 | 0.4134 | 0.3461 | 0.0 | 0.8717 | 0.7620 | 0.9386 | 0.0074 | 0.0703 | 0.3169 | 0.0 | | 0.2327 | 6.7 | 1340 | 0.5858 | 0.3514 | 0.4217 | 0.8616 | nan | 0.8699 | 0.9510 | 0.7048 | 0.7627 | 0.4538 | nan | 0.6080 | 0.7268 | 0.1376 | 0.9419 | 0.0359 | 0.0 | 0.0 | 0.0 | 0.7361 | 0.0 | 0.0 | 0.8837 | 0.2082 | 0.6157 | 0.3706 | 0.0 | nan | 0.0909 | 0.4611 | 0.4802 | 0.0 | 0.9503 | 0.8700 | 0.9807 | 0.0210 | 0.1612 | 0.4738 | 0.0 | nan | 0.7492 | 0.8674 | 0.6835 | 0.6819 | 0.3450 | nan | 0.5073 | 0.5297 | 0.0222 | 0.8360 | 0.0342 | 0.0 | 0.0 | 0.0 | 0.5117 | 0.0 | 0.0 | 0.7413 | 0.1671 | 0.4875 | 0.2640 | 0.0 | nan | 0.0776 | 0.4006 | 0.3457 | 0.0 | 0.8775 | 0.7710 | 0.9368 | 0.0109 | 0.0734 | 0.3249 | 0.0 | | 0.242 | 6.8 | 1360 | 0.5879 | 0.3377 | 0.4038 | 0.8588 | nan | 0.8580 | 0.9459 | 0.6603 | 0.7829 | 0.5026 | nan | 0.6233 | 0.6993 | 0.1826 | 0.9596 | 
0.0324 | 0.0 | 0.0 | 0.0 | 0.6813 | 0.0 | 0.0 | 0.8888 | 0.0606 | 0.6100 | 0.2078 | 0.0 | nan | 0.0106 | 0.5106 | 0.4489 | 0.0 | 0.9608 | 0.8217 | 0.9807 | 0.0013 | 0.0424 | 0.4508 | 0.0 | nan | 0.7297 | 0.8711 | 0.6473 | 0.6682 | 0.3661 | nan | 0.5064 | 0.4987 | 0.0245 | 0.8177 | 0.0323 | 0.0 | 0.0 | 0.0 | 0.4811 | 0.0 | 0.0 | 0.7394 | 0.0582 | 0.4697 | 0.1953 | 0.0 | nan | 0.0093 | 0.4198 | 0.3574 | 0.0 | 0.8634 | 0.7439 | 0.9376 | 0.0010 | 0.0280 | 0.3386 | 0.0 | | 0.2912 | 6.9 | 1380 | 0.6065 | 0.3460 | 0.4119 | 0.8580 | nan | 0.8233 | 0.9491 | 0.7514 | 0.8086 | 0.4890 | nan | 0.6596 | 0.7872 | 0.0 | 0.9363 | 0.0347 | 0.0 | 0.0 | 0.0 | 0.6847 | 0.0 | 0.0 | 0.9520 | 0.0737 | 0.5223 | 0.3596 | 0.0 | nan | 0.0359 | 0.5441 | 0.4222 | 0.0 | 0.9219 | 0.8838 | 0.9741 | 0.0026 | 0.2035 | 0.3599 | 0.0 | nan | 0.7319 | 0.8655 | 0.6413 | 0.6811 | 0.3626 | nan | 0.5255 | 0.5864 | 0.0 | 0.8358 | 0.0330 | 0.0 | 0.0 | 0.0 | 0.5516 | 0.0 | 0.0 | 0.7090 | 0.0674 | 0.4307 | 0.2864 | 0.0 | nan | 0.0250 | 0.4413 | 0.3338 | 0.0 | 0.8745 | 0.7683 | 0.9393 | 0.0022 | 0.0863 | 0.2947 | 0.0 | | 0.343 | 7.0 | 1400 | 0.5567 | 0.3432 | 0.4134 | 0.8604 | nan | 0.8250 | 0.9528 | 0.7195 | 0.8020 | 0.5162 | nan | 0.6391 | 0.8356 | 0.0060 | 0.9500 | 0.0391 | 0.0 | 0.0 | 0.0 | 0.7280 | 0.0 | 0.0 | 0.8962 | 0.0651 | 0.6283 | 0.3323 | 0.0 | nan | 0.0065 | 0.5047 | 0.4910 | 0.0 | 0.9584 | 0.8309 | 0.9817 | 0.0008 | 0.1522 | 0.3661 | 0.0 | nan | 0.7363 | 0.8592 | 0.6737 | 0.6849 | 0.3638 | nan | 0.5259 | 0.5049 | 0.0037 | 0.8376 | 0.0377 | 0.0 | 0.0 | 0.0 | 0.4747 | 0.0 | 0.0 | 0.7459 | 0.0630 | 0.4846 | 0.2764 | 0.0 | nan | 0.0052 | 0.4170 | 0.3515 | 0.0 | 0.8684 | 0.7582 | 0.9370 | 0.0007 | 0.0725 | 0.3003 | 0.0 | | 0.2914 | 7.1 | 1420 | 0.5773 | 0.3455 | 0.4101 | 0.8628 | nan | 0.8788 | 0.9472 | 0.7366 | 0.7511 | 0.4820 | nan | 0.6145 | 0.8190 | 0.0 | 0.9357 | 0.0365 | 0.0 | 0.0 | 0.0 | 0.6832 | 0.0 | 0.0 | 0.9259 | 0.0426 | 0.5416 | 0.2700 | 0.0 | nan | 0.0390 | 0.5164 | 0.4809 | 0.0 | 0.9462 | 
0.8881 | 0.9822 | 0.0007 | 0.1845 | 0.4210 | 0.0 | nan | 0.7424 | 0.8681 | 0.6758 | 0.6744 | 0.3689 | nan | 0.5043 | 0.5500 | 0.0 | 0.8381 | 0.0355 | 0.0 | 0.0 | 0.0 | 0.5276 | 0.0 | 0.0 | 0.7309 | 0.0420 | 0.4570 | 0.2420 | 0.0 | nan | 0.0278 | 0.4148 | 0.3608 | 0.0 | 0.8780 | 0.7737 | 0.9370 | 0.0007 | 0.0782 | 0.3276 | 0.0 | | 0.2474 | 7.2 | 1440 | 0.5986 | 0.3456 | 0.4083 | 0.8604 | nan | 0.8241 | 0.9533 | 0.7590 | 0.8042 | 0.4869 | nan | 0.6086 | 0.7678 | 0.0 | 0.9578 | 0.0176 | 0.0 | 0.0 | 0.0 | 0.6546 | 0.0 | 0.0 | 0.9157 | 0.0721 | 0.5568 | 0.2897 | 0.0 | nan | 0.0205 | 0.4932 | 0.4753 | 0.0 | 0.9612 | 0.8364 | 0.9759 | 0.0003 | 0.2280 | 0.4053 | 0.0 | nan | 0.7370 | 0.8694 | 0.6417 | 0.6868 | 0.3282 | nan | 0.5111 | 0.6177 | 0.0 | 0.8225 | 0.0173 | 0.0 | 0.0 | 0.0 | 0.5195 | 0.0 | 0.0 | 0.7384 | 0.0702 | 0.4737 | 0.2533 | 0.0 | nan | 0.0171 | 0.4074 | 0.3697 | 0.0 | 0.8687 | 0.7579 | 0.9399 | 0.0003 | 0.0826 | 0.3284 | 0.0 | | 0.2505 | 7.3 | 1460 | 0.5985 | 0.3471 | 0.4082 | 0.8566 | nan | 0.7938 | 0.9652 | 0.7313 | 0.7649 | 0.4475 | nan | 0.6260 | 0.8675 | 0.0020 | 0.9491 | 0.0379 | 0.0 | 0.0 | 0.0 | 0.6807 | 0.0 | 0.0 | 0.9081 | 0.1519 | 0.6030 | 0.2748 | 0.0 | nan | 0.0039 | 0.5783 | 0.46 | 0.0 | 0.9587 | 0.8108 | 0.9788 | 0.0001 | 0.0826 | 0.3844 | 0.0 | nan | 0.7066 | 0.8387 | 0.6786 | 0.6773 | 0.3727 | nan | 0.5193 | 0.5468 | 0.0016 | 0.8359 | 0.0354 | 0.0 | 0.0 | 0.0 | 0.5663 | 0.0 | 0.0 | 0.7500 | 0.1320 | 0.4856 | 0.2477 | 0.0 | nan | 0.0034 | 0.4339 | 0.3538 | 0.0 | 0.8699 | 0.7449 | 0.9397 | 0.0001 | 0.0558 | 0.3117 | 0.0 | | 0.2611 | 7.4 | 1480 | 0.5702 | 0.3476 | 0.4143 | 0.8581 | nan | 0.7707 | 0.9551 | 0.7422 | 0.8445 | 0.4893 | nan | 0.6655 | 0.8344 | 0.0 | 0.9515 | 0.0443 | 0.0 | 0.0 | 0.0 | 0.6860 | 0.0 | 0.0 | 0.9232 | 0.1644 | 0.6029 | 0.3108 | 0.0 | nan | 0.0147 | 0.5112 | 0.4610 | 0.0 | 0.9478 | 0.8592 | 0.9807 | 0.0065 | 0.0864 | 0.4052 | 0.0 | nan | 0.7017 | 0.8700 | 0.6427 | 0.6132 | 0.3791 | nan | 0.5387 | 0.5832 | 0.0 | 0.8422 | 
0.0404 | 0.0 | 0.0 | 0.0 | 0.5270 | 0.0 | 0.0 | 0.7452 | 0.1416 | 0.4843 | 0.2549 | 0.0 | nan | 0.0129 | 0.4183 | 0.3607 | 0.0 | 0.8781 | 0.7718 | 0.9388 | 0.0049 | 0.0451 | 0.3281 | 0.0 | | 0.1479 | 7.5 | 1500 | 0.5701 | 0.3439 | 0.4035 | 0.8595 | nan | 0.8494 | 0.9474 | 0.7318 | 0.7205 | 0.5444 | nan | 0.6775 | 0.8485 | 0.0 | 0.9484 | 0.0252 | 0.0 | 0.0 | 0.0 | 0.6319 | 0.0 | 0.0 | 0.9298 | 0.0287 | 0.5318 | 0.2629 | 0.0 | nan | 0.0034 | 0.4961 | 0.4789 | 0.0 | 0.9603 | 0.8198 | 0.9658 | 0.0029 | 0.0514 | 0.4563 | 0.0 | nan | 0.7411 | 0.8677 | 0.6846 | 0.6476 | 0.3372 | nan | 0.5484 | 0.5975 | 0.0 | 0.8345 | 0.0246 | 0.0 | 0.0 | 0.0 | 0.5414 | 0.0 | 0.0 | 0.7303 | 0.0280 | 0.4573 | 0.2322 | 0.0 | nan | 0.0030 | 0.4152 | 0.3618 | 0.0 | 0.8665 | 0.7385 | 0.9396 | 0.0027 | 0.0370 | 0.3666 | 0.0 | | 0.2073 | 7.6 | 1520 | 0.6279 | 0.3474 | 0.4175 | 0.8507 | nan | 0.7556 | 0.9561 | 0.7696 | 0.7821 | 0.4914 | nan | 0.6223 | 0.8651 | 0.0 | 0.9547 | 0.0416 | 0.0 | 0.0 | 0.0 | 0.6631 | 0.0 | 0.0 | 0.9289 | 0.1014 | 0.5414 | 0.3211 | 0.0 | nan | 0.1147 | 0.5668 | 0.5320 | 0.0 | 0.9357 | 0.8535 | 0.9849 | 0.0320 | 0.1488 | 0.3977 | 0.0 | nan | 0.6791 | 0.8419 | 0.6776 | 0.6390 | 0.3672 | nan | 0.5206 | 0.5841 | 0.0 | 0.8318 | 0.0399 | 0.0 | 0.0 | 0.0 | 0.5261 | 0.0 | 0.0 | 0.7370 | 0.0923 | 0.4608 | 0.2570 | 0.0 | nan | 0.0822 | 0.4361 | 0.3484 | 0.0 | 0.8778 | 0.7638 | 0.9356 | 0.0195 | 0.0696 | 0.3278 | 0.0 | | 0.3208 | 7.7 | 1540 | 0.5902 | 0.3443 | 0.4097 | 0.8625 | nan | 0.8514 | 0.9559 | 0.8170 | 0.7319 | 0.4577 | nan | 0.6391 | 0.8315 | 0.0 | 0.9341 | 0.0208 | 0.0 | 0.0 | 0.0 | 0.6960 | 0.0 | 0.0 | 0.9322 | 0.0069 | 0.5405 | 0.3043 | 0.0 | nan | 0.0195 | 0.4643 | 0.5060 | 0.0 | 0.9447 | 0.8907 | 0.9827 | 0.0075 | 0.1400 | 0.4345 | 0.0 | nan | 0.7546 | 0.8646 | 0.6123 | 0.6707 | 0.3476 | nan | 0.5256 | 0.6086 | 0.0 | 0.8453 | 0.0198 | 0.0 | 0.0 | 0.0 | 0.5410 | 0.0 | 0.0 | 0.7307 | 0.0068 | 0.4733 | 0.2653 | 0.0 | nan | 0.0173 | 0.4133 | 0.3411 | 0.0 | 0.8744 | 0.7639 
| 0.9379 | 0.0062 | 0.0566 | 0.3404 | 0.0 | | 0.3173 | 7.8 | 1560 | 0.5700 | 0.3494 | 0.4171 | 0.8633 | nan | 0.8505 | 0.9393 | 0.7827 | 0.8097 | 0.4909 | nan | 0.6613 | 0.8247 | 0.0331 | 0.9473 | 0.0464 | 0.0 | 0.0 | 0.0 | 0.6595 | 0.0 | 0.0 | 0.9329 | 0.0327 | 0.5948 | 0.2566 | 0.0 | nan | 0.0584 | 0.5498 | 0.5224 | 0.0 | 0.9481 | 0.8445 | 0.9784 | 0.0012 | 0.1162 | 0.4649 | 0.0 | nan | 0.7463 | 0.8649 | 0.6431 | 0.6921 | 0.3346 | nan | 0.5205 | 0.5660 | 0.0149 | 0.8430 | 0.0429 | 0.0 | 0.0 | 0.0 | 0.5391 | 0.0 | 0.0 | 0.7422 | 0.0322 | 0.5000 | 0.2370 | 0.0 | nan | 0.0506 | 0.4451 | 0.3526 | 0.0 | 0.8779 | 0.7619 | 0.9406 | 0.0011 | 0.0622 | 0.3703 | 0.0 | | 0.1863 | 7.9 | 1580 | 0.6211 | 0.3514 | 0.4297 | 0.8592 | nan | 0.8506 | 0.9447 | 0.8257 | 0.7064 | 0.4736 | nan | 0.6324 | 0.8422 | 0.0 | 0.9651 | 0.0536 | 0.0 | 0.0 | 0.0 | 0.6058 | 0.0 | 0.0 | 0.9151 | 0.1508 | 0.5574 | 0.4391 | 0.0 | nan | 0.1478 | 0.5624 | 0.5315 | 0.0 | 0.9451 | 0.8766 | 0.9861 | 0.0010 | 0.3727 | 0.3650 | 0.0 | nan | 0.7333 | 0.8714 | 0.5576 | 0.6480 | 0.3506 | nan | 0.5172 | 0.6009 | 0.0 | 0.8275 | 0.0493 | 0.0 | 0.0 | 0.0 | 0.5273 | 0.0 | 0.0 | 0.7444 | 0.1344 | 0.4871 | 0.3098 | 0.0 | nan | 0.1057 | 0.4511 | 0.3565 | 0.0 | 0.8806 | 0.7541 | 0.9369 | 0.0008 | 0.0878 | 0.3123 | 0.0 | | 0.2516 | 8.0 | 1600 | 0.5767 | 0.3501 | 0.4204 | 0.8653 | nan | 0.8572 | 0.9601 | 0.7830 | 0.7746 | 0.4109 | nan | 0.6479 | 0.7926 | 0.0650 | 0.9358 | 0.0504 | 0.0 | 0.0 | 0.0 | 0.7195 | 0.0 | 0.0 | 0.9099 | 0.0133 | 0.5805 | 0.3600 | 0.0 | nan | 0.0194 | 0.5341 | 0.5156 | 0.0 | 0.9602 | 0.8242 | 0.9783 | 0.0 | 0.3244 | 0.4364 | 0.0 | nan | 0.7522 | 0.8696 | 0.6293 | 0.6857 | 0.3462 | nan | 0.5311 | 0.5778 | 0.0151 | 0.8472 | 0.0474 | 0.0 | 0.0 | 0.0 | 0.5229 | 0.0 | 0.0 | 0.7538 | 0.0132 | 0.5060 | 0.2932 | 0.0 | nan | 0.0177 | 0.4476 | 0.3751 | 0.0 | 0.8715 | 0.7523 | 0.9414 | 0.0 | 0.0762 | 0.3314 | 0.0 | | 0.2206 | 8.1 | 1620 | 0.6162 | 0.3546 | 0.4307 | 0.8563 | nan | 0.8299 | 0.9257 | 0.7671 | 
0.8026 | 0.4752 | nan | 0.6825 | 0.7886 | 0.0158 | 0.9459 | 0.0448 | 0.0 | 0.0 | 0.0 | 0.6966 | 0.0 | 0.0 | 0.9133 | 0.1368 | 0.6901 | 0.3385 | 0.0 | nan | 0.2352 | 0.5165 | 0.5090 | 0.0 | 0.9412 | 0.8623 | 0.9821 | 0.0369 | 0.2019 | 0.4425 | 0.0 | nan | 0.7445 | 0.8644 | 0.6314 | 0.6645 | 0.3391 | nan | 0.5353 | 0.5417 | 0.0075 | 0.8451 | 0.0427 | 0.0 | 0.0 | 0.0 | 0.4609 | 0.0 | 0.0 | 0.7530 | 0.1226 | 0.5129 | 0.2932 | 0.0 | nan | 0.1710 | 0.4440 | 0.3757 | 0.0 | 0.8807 | 0.7562 | 0.9398 | 0.0078 | 0.0675 | 0.3445 | 0.0 | | 0.189 | 8.2 | 1640 | 0.5740 | 0.3467 | 0.4147 | 0.8648 | nan | 0.8577 | 0.9535 | 0.7585 | 0.7673 | 0.4784 | nan | 0.6893 | 0.8107 | 0.0637 | 0.9507 | 0.0421 | 0.0 | 0.0 | 0.0 | 0.6907 | 0.0 | 0.0 | 0.9336 | 0.0316 | 0.5791 | 0.2895 | 0.0 | nan | 0.0854 | 0.5458 | 0.5167 | 0.0 | 0.9507 | 0.8159 | 0.9754 | 0.0 | 0.0640 | 0.4187 | 0.0 | nan | 0.7502 | 0.8706 | 0.6532 | 0.6880 | 0.3630 | nan | 0.5298 | 0.4944 | 0.0264 | 0.8465 | 0.0401 | 0.0 | 0.0 | 0.0 | 0.5009 | 0.0 | 0.0 | 0.7409 | 0.0303 | 0.4910 | 0.2531 | 0.0 | nan | 0.0716 | 0.4439 | 0.3748 | 0.0 | 0.8713 | 0.7383 | 0.9429 | 0.0 | 0.0365 | 0.3374 | 0.0 | | 0.2028 | 8.3 | 1660 | 0.6399 | 0.3438 | 0.4078 | 0.8578 | nan | 0.8066 | 0.9687 | 0.7532 | 0.8068 | 0.3267 | nan | 0.6167 | 0.7656 | 0.0031 | 0.9349 | 0.0451 | 0.0 | 0.0 | 0.0 | 0.7131 | 0.0 | 0.0 | 0.9415 | 0.0212 | 0.5982 | 0.3048 | 0.0 | nan | 0.0864 | 0.5129 | 0.4953 | 0.0 | 0.9371 | 0.8768 | 0.9825 | 0.0 | 0.2933 | 0.2605 | 0.0 | nan | 0.7242 | 0.8444 | 0.7154 | 0.6779 | 0.2892 | nan | 0.5380 | 0.5381 | 0.0014 | 0.8407 | 0.0419 | 0.0 | 0.0 | 0.0 | 0.5174 | 0.0 | 0.0 | 0.7326 | 0.0193 | 0.4840 | 0.2500 | 0.0 | nan | 0.0719 | 0.4295 | 0.3630 | 0.0 | 0.8772 | 0.7656 | 0.9371 | 0.0 | 0.1074 | 0.2369 | 0.0 | | 0.2783 | 8.4 | 1680 | 0.5473 | 0.3515 | 0.4219 | 0.8664 | nan | 0.8826 | 0.9387 | 0.7571 | 0.8127 | 0.5269 | nan | 0.6741 | 0.8037 | 0.0 | 0.9550 | 0.0350 | 0.0 | 0.0 | 0.0 | 0.6732 | 0.0 | 0.0 | 0.9218 | 0.0816 | 0.5645 | 0.3347 | 
0.0 | nan | 0.0116 | 0.5560 | 0.5146 | 0.0 | 0.9369 | 0.8790 | 0.9825 | 0.0009 | 0.2686 | 0.3906 | 0.0 | nan | 0.7608 | 0.8771 | 0.7013 | 0.6885 | 0.3555 | nan | 0.5491 | 0.5221 | 0.0 | 0.8338 | 0.0347 | 0.0 | 0.0 | 0.0 | 0.5370 | 0.0 | 0.0 | 0.7435 | 0.0750 | 0.5028 | 0.2650 | 0.0 | nan | 0.0099 | 0.4431 | 0.3785 | 0.0 | 0.8797 | 0.7654 | 0.9401 | 0.0007 | 0.0687 | 0.3162 | 0.0 | | 0.199 | 8.5 | 1700 | 0.5928 | 0.3595 | 0.4303 | 0.8670 | nan | 0.8255 | 0.9598 | 0.7983 | 0.8581 | 0.4141 | nan | 0.6636 | 0.8888 | 0.0 | 0.9441 | 0.0413 | 0.0 | 0.0132 | 0.0 | 0.6113 | 0.0 | 0.0 | 0.8697 | 0.2663 | 0.7008 | 0.3577 | 0.0 | nan | 0.2193 | 0.5255 | 0.4762 | 0.0 | 0.9526 | 0.8591 | 0.9803 | 0.0 | 0.0804 | 0.4627 | 0.0 | nan | 0.7395 | 0.8762 | 0.6573 | 0.7119 | 0.3497 | nan | 0.5446 | 0.5005 | 0.0 | 0.8375 | 0.0407 | 0.0 | 0.0131 | 0.0 | 0.5413 | 0.0 | 0.0 | 0.7524 | 0.2323 | 0.4987 | 0.2915 | 0.0 | nan | 0.1353 | 0.4328 | 0.3726 | 0.0 | 0.8722 | 0.7516 | 0.9419 | 0.0 | 0.0540 | 0.3573 | 0.0 | | 0.1591 | 8.6 | 1720 | 0.5927 | 0.3568 | 0.4274 | 0.8632 | nan | 0.8250 | 0.9475 | 0.7706 | 0.8347 | 0.5623 | nan | 0.6601 | 0.8675 | 0.0 | 0.9547 | 0.0375 | 0.0 | 0.0 | 0.0 | 0.6326 | 0.0 | 0.0 | 0.9278 | 0.1591 | 0.5080 | 0.3138 | 0.0 | nan | 0.2364 | 0.5457 | 0.5151 | 0.0 | 0.9625 | 0.7945 | 0.9821 | 0.0 | 0.2648 | 0.3730 | 0.0 | nan | 0.7411 | 0.8788 | 0.6746 | 0.7028 | 0.3475 | nan | 0.5438 | 0.5751 | 0.0 | 0.8345 | 0.0370 | 0.0 | 0.0 | 0.0 | 0.5548 | 0.0 | 0.0 | 0.7334 | 0.1370 | 0.4510 | 0.2575 | 0.0 | nan | 0.1640 | 0.4431 | 0.3603 | 0.0 | 0.8660 | 0.7312 | 0.9409 | 0.0 | 0.1150 | 0.3292 | 0.0 | | 0.1553 | 8.7 | 1740 | 0.5695 | 0.3598 | 0.4404 | 0.8679 | nan | 0.8717 | 0.9483 | 0.7673 | 0.7990 | 0.5361 | nan | 0.6620 | 0.7190 | 0.1766 | 0.9412 | 0.0958 | 0.0 | 0.0079 | 0.0 | 0.7929 | 0.0 | 0.0 | 0.8975 | 0.1616 | 0.6552 | 0.3511 | 0.0 | nan | 0.0630 | 0.5369 | 0.5200 | 0.0 | 0.9408 | 0.8779 | 0.9758 | 0.0004 | 0.3845 | 0.4103 | 0.0 | nan | 0.7609 | 0.8754 | 0.6801 | 0.7116 | 
0.3771 | nan | 0.5357 | 0.5635 | 0.0429 | 0.8473 | 0.0878 | 0.0 | 0.0078 | 0.0 | 0.4463 | 0.0 | 0.0 | 0.7517 | 0.1464 | 0.5156 | 0.2927 | 0.0 | nan | 0.0538 | 0.4402 | 0.3753 | 0.0 | 0.8760 | 0.7600 | 0.9433 | 0.0003 | 0.0908 | 0.3304 | 0.0 | | 0.1498 | 8.8 | 1760 | 0.5841 | 0.3633 | 0.4338 | 0.8642 | nan | 0.8380 | 0.9265 | 0.7928 | 0.9074 | 0.5189 | nan | 0.6488 | 0.8137 | 0.0 | 0.9494 | 0.0536 | 0.0 | 0.0085 | 0.0 | 0.7206 | 0.0 | 0.0 | 0.9184 | 0.2066 | 0.6024 | 0.4015 | 0.0 | nan | 0.0487 | 0.5791 | 0.5140 | 0.0 | 0.9414 | 0.8646 | 0.9797 | 0.0054 | 0.1717 | 0.4690 | 0.0 | nan | 0.7476 | 0.8657 | 0.6433 | 0.6603 | 0.3790 | nan | 0.5424 | 0.6239 | 0.0 | 0.8431 | 0.0509 | 0.0 | 0.0084 | 0.0 | 0.5871 | 0.0 | 0.0 | 0.7513 | 0.1678 | 0.5091 | 0.3164 | 0.0 | nan | 0.0414 | 0.4595 | 0.3703 | 0.0 | 0.8827 | 0.7695 | 0.9422 | 0.0033 | 0.0879 | 0.3730 | 0.0 | | 0.1828 | 8.9 | 1780 | 0.6037 | 0.3609 | 0.4253 | 0.8654 | nan | 0.8656 | 0.9520 | 0.7476 | 0.7937 | 0.4616 | nan | 0.6561 | 0.8421 | 0.0 | 0.9473 | 0.0502 | 0.0 | 0.0 | 0.0 | 0.7023 | 0.0 | 0.0 | 0.8957 | 0.1736 | 0.6450 | 0.3825 | 0.0 | nan | 0.0666 | 0.5406 | 0.5340 | 0.0 | 0.9570 | 0.7497 | 0.9826 | 0.0161 | 0.1262 | 0.5220 | 0.0 | nan | 0.7611 | 0.8645 | 0.7076 | 0.7248 | 0.3637 | nan | 0.5355 | 0.5979 | 0.0 | 0.8396 | 0.0462 | 0.0 | 0.0 | 0.0 | 0.5541 | 0.0 | 0.0 | 0.7555 | 0.1530 | 0.4978 | 0.3036 | 0.0 | nan | 0.0557 | 0.4552 | 0.3779 | 0.0 | 0.8626 | 0.6957 | 0.9399 | 0.0079 | 0.0662 | 0.3827 | 0.0 | | 0.2588 | 9.0 | 1800 | 0.5867 | 0.3597 | 0.4377 | 0.8588 | nan | 0.8202 | 0.9224 | 0.7975 | 0.8144 | 0.6519 | nan | 0.6872 | 0.7382 | 0.0319 | 0.9508 | 0.0736 | 0.0 | 0.0036 | 0.0 | 0.7372 | 0.0 | 0.0 | 0.9216 | 0.2294 | 0.5725 | 0.3596 | 0.0 | nan | 0.0690 | 0.5450 | 0.5389 | 0.0 | 0.9345 | 0.8713 | 0.9842 | 0.0037 | 0.2376 | 0.5116 | 0.0 | nan | 0.7439 | 0.8635 | 0.6600 | 0.7027 | 0.3202 | nan | 0.5441 | 0.6061 | 0.0136 | 0.8412 | 0.0651 | 0.0 | 0.0036 | 0.0 | 0.4753 | 0.0 | 0.0 | 0.7408 | 0.1843 | 0.4959 
| 0.3006 | 0.0 | nan | 0.0559 | 0.4579 | 0.3852 | 0.0 | 0.8822 | 0.7645 | 0.9387 | 0.0026 | 0.0771 | 0.3845 | 0.0 | | 0.2303 | 9.1 | 1820 | 0.5980 | 0.3532 | 0.4292 | 0.8599 | nan | 0.8094 | 0.9427 | 0.8150 | 0.7852 | 0.4762 | nan | 0.7417 | 0.7367 | 0.0162 | 0.9494 | 0.0567 | 0.0 | 0.0859 | 0.0 | 0.7208 | 0.0 | 0.0 | 0.9223 | 0.0887 | 0.6457 | 0.3506 | 0.0 | nan | 0.0485 | 0.5267 | 0.5245 | 0.0 | 0.9494 | 0.8379 | 0.9815 | 0.0079 | 0.2876 | 0.4272 | 0.0 | nan | 0.7148 | 0.8715 | 0.6250 | 0.6618 | 0.3683 | nan | 0.5072 | 0.6077 | 0.0080 | 0.8445 | 0.0527 | 0.0 | 0.0836 | 0.0 | 0.4327 | 0.0 | 0.0 | 0.7509 | 0.0843 | 0.5323 | 0.2819 | 0.0 | nan | 0.0407 | 0.4408 | 0.3615 | 0.0 | 0.8794 | 0.7633 | 0.9411 | 0.0036 | 0.0975 | 0.3483 | 0.0 | | 0.1926 | 9.2 | 1840 | 0.6364 | 0.3523 | 0.4171 | 0.8567 | nan | 0.8013 | 0.9537 | 0.7411 | 0.7849 | 0.4775 | nan | 0.6554 | 0.8531 | 0.0 | 0.9392 | 0.0431 | 0.0 | 0.0162 | 0.0 | 0.6634 | 0.0 | 0.0 | 0.9319 | 0.1200 | 0.4821 | 0.3603 | 0.0 | nan | 0.0067 | 0.4962 | 0.5323 | 0.0 | 0.9510 | 0.8484 | 0.9816 | 0.0133 | 0.2473 | 0.4469 | 0.0 | nan | 0.7045 | 0.8678 | 0.6994 | 0.6142 | 0.3742 | nan | 0.5336 | 0.6132 | 0.0 | 0.8493 | 0.0400 | 0.0 | 0.0161 | 0.0 | 0.5747 | 0.0 | 0.0 | 0.7271 | 0.1111 | 0.4372 | 0.3011 | 0.0 | nan | 0.0058 | 0.4273 | 0.3716 | 0.0 | 0.8753 | 0.7585 | 0.9391 | 0.0073 | 0.0719 | 0.3542 | 0.0 | | 0.1283 | 9.3 | 1860 | 0.6048 | 0.3556 | 0.4276 | 0.8652 | nan | 0.8396 | 0.9455 | 0.7744 | 0.8039 | 0.5487 | nan | 0.6756 | 0.7530 | 0.0430 | 0.9380 | 0.0847 | 0.0 | 0.0 | 0.0 | 0.7757 | 0.0 | 0.0 | 0.9322 | 0.1295 | 0.5577 | 0.2965 | 0.0 | nan | 0.1061 | 0.5510 | 0.5570 | 0.0 | 0.9408 | 0.8590 | 0.9812 | 0.0007 | 0.0445 | 0.5457 | 0.0 | nan | 0.7397 | 0.8676 | 0.7034 | 0.6899 | 0.3817 | nan | 0.5452 | 0.5603 | 0.0131 | 0.8537 | 0.0754 | 0.0 | 0.0 | 0.0 | 0.4386 | 0.0 | 0.0 | 0.7376 | 0.1184 | 0.4767 | 0.2647 | 0.0 | nan | 0.0761 | 0.4534 | 0.3578 | 0.0 | 0.8771 | 0.7525 | 0.9408 | 0.0006 | 0.0278 | 0.4266 | 0.0 | | 
0.242 | 9.4 | 1880 | 0.6183 | 0.3537 | 0.4259 | 0.8652 | nan | 0.8536 | 0.9574 | 0.7653 | 0.7996 | 0.4903 | nan | 0.6016 | 0.8441 | 0.0 | 0.9293 | 0.1477 | 0.0 | 0.0374 | 0.0 | 0.7279 | 0.0 | 0.0 | 0.9335 | 0.0248 | 0.5401 | 0.3316 | 0.0 | nan | 0.0044 | 0.5457 | 0.5598 | 0.0 | 0.9513 | 0.8410 | 0.9830 | 0.0017 | 0.4150 | 0.3431 | 0.0 | nan | 0.7464 | 0.8714 | 0.7064 | 0.6863 | 0.3932 | nan | 0.5200 | 0.5922 | 0.0 | 0.8531 | 0.1256 | 0.0 | 0.0368 | 0.0 | 0.5249 | 0.0 | 0.0 | 0.7372 | 0.0233 | 0.4739 | 0.2831 | 0.0 | nan | 0.0039 | 0.4503 | 0.3367 | 0.0 | 0.8766 | 0.7550 | 0.9400 | 0.0011 | 0.0830 | 0.2982 | 0.0 | | 0.1308 | 9.5 | 1900 | 0.6292 | 0.3489 | 0.4163 | 0.8593 | nan | 0.7832 | 0.9470 | 0.7857 | 0.8231 | 0.5413 | nan | 0.6989 | 0.8729 | 0.0 | 0.9641 | 0.0627 | 0.0 | 0.0080 | 0.0 | 0.5577 | 0.0 | 0.0 | 0.9156 | 0.0280 | 0.5252 | 0.3589 | 0.0 | nan | 0.0166 | 0.5069 | 0.5345 | 0.0 | 0.9496 | 0.8725 | 0.9817 | 0.0001 | 0.0829 | 0.5056 | 0.0 | nan | 0.7085 | 0.8673 | 0.6933 | 0.6324 | 0.3683 | nan | 0.5443 | 0.5951 | 0.0 | 0.8239 | 0.0577 | 0.0 | 0.0080 | 0.0 | 0.5004 | 0.0 | 0.0 | 0.7442 | 0.0277 | 0.4713 | 0.2897 | 0.0 | nan | 0.0152 | 0.4401 | 0.3716 | 0.0 | 0.8759 | 0.7625 | 0.9409 | 0.0001 | 0.0449 | 0.3823 | 0.0 | | 0.2915 | 9.6 | 1920 | 0.6007 | 0.3634 | 0.4343 | 0.8576 | nan | 0.8409 | 0.9099 | 0.7642 | 0.8886 | 0.6304 | nan | 0.6444 | 0.8280 | 0.0 | 0.9398 | 0.1026 | 0.0 | 0.1016 | 0.0 | 0.6896 | 0.0 | 0.0 | 0.9341 | 0.1520 | 0.5981 | 0.3620 | 0.0 | nan | 0.0959 | 0.5529 | 0.5185 | 0.0 | 0.9471 | 0.7731 | 0.9769 | 0.0006 | 0.2137 | 0.4315 | 0.0 | nan | 0.7414 | 0.8598 | 0.7123 | 0.6627 | 0.3403 | nan | 0.5468 | 0.6359 | 0.0 | 0.8490 | 0.0906 | 0.0 | 0.0997 | 0.0 | 0.5503 | 0.0 | 0.0 | 0.7453 | 0.1337 | 0.5070 | 0.2869 | 0.0 | nan | 0.0682 | 0.4524 | 0.3722 | 0.0 | 0.8685 | 0.7152 | 0.9420 | 0.0005 | 0.0906 | 0.3574 | 0.0 | | 0.1619 | 9.7 | 1940 | 0.5878 | 0.3652 | 0.4320 | 0.8669 | nan | 0.8511 | 0.9478 | 0.7936 | 0.7795 | 0.5206 | nan | 0.6435 | 
0.8479 | 0.0 | 0.9404 | 0.0867 | 0.0 | 0.0883 | 0.0 | 0.6922 | 0.0 | 0.0 | 0.9222 | 0.1750 | 0.6193 | 0.3488 | 0.0 | nan | 0.1099 | 0.5323 | 0.4732 | 0.0 | 0.9485 | 0.8684 | 0.9788 | 0.0002 | 0.1708 | 0.4845 | 0.0 | nan | 0.7496 | 0.8625 | 0.7050 | 0.6968 | 0.3725 | nan | 0.5267 | 0.5723 | 0.0 | 0.8505 | 0.0783 | 0.0 | 0.0870 | 0.0 | 0.5447 | 0.0 | 0.0 | 0.7519 | 0.1509 | 0.5221 | 0.3030 | 0.0 | nan | 0.0797 | 0.4446 | 0.3724 | 0.0 | 0.8815 | 0.7715 | 0.9416 | 0.0002 | 0.0589 | 0.3618 | 0.0 | | 0.2238 | 9.8 | 1960 | 0.5750 | 0.3620 | 0.4280 | 0.8640 | nan | 0.8034 | 0.9497 | 0.8037 | 0.8240 | 0.5548 | nan | 0.6514 | 0.7874 | 0.0 | 0.9436 | 0.0789 | 0.0 | 0.0050 | 0.0 | 0.6596 | 0.0 | 0.0 | 0.9175 | 0.1363 | 0.5863 | 0.2951 | 0.0 | nan | 0.1792 | 0.5565 | 0.5238 | 0.0 | 0.9600 | 0.8624 | 0.9827 | 0.0002 | 0.2023 | 0.4319 | 0.0 | nan | 0.7282 | 0.8672 | 0.7068 | 0.6609 | 0.3731 | nan | 0.5422 | 0.6318 | 0.0 | 0.8465 | 0.0714 | 0.0 | 0.0050 | 0.0 | 0.5472 | 0.0 | 0.0 | 0.7508 | 0.1235 | 0.4912 | 0.2610 | 0.0 | nan | 0.1251 | 0.4525 | 0.3775 | 0.0 | 0.8787 | 0.7697 | 0.9406 | 0.0002 | 0.0868 | 0.3471 | 0.0 | | 0.181 | 9.9 | 1980 | 0.6077 | 0.3659 | 0.4361 | 0.8660 | nan | 0.8058 | 0.9632 | 0.8071 | 0.8176 | 0.4577 | nan | 0.6573 | 0.8114 | 0.0016 | 0.9514 | 0.0731 | 0.0 | 0.0823 | 0.0 | 0.6843 | 0.0 | 0.0 | 0.9176 | 0.2187 | 0.5992 | 0.4228 | 0.0 | nan | 0.0728 | 0.5537 | 0.5348 | 0.0 | 0.9523 | 0.8465 | 0.9807 | 0.0004 | 0.3214 | 0.4197 | 0.0 | nan | 0.7249 | 0.8696 | 0.6807 | 0.6553 | 0.3866 | nan | 0.5360 | 0.6245 | 0.0011 | 0.8409 | 0.0679 | 0.0 | 0.0811 | 0.0 | 0.5402 | 0.0 | 0.0 | 0.7561 | 0.1858 | 0.5035 | 0.3159 | 0.0 | nan | 0.0618 | 0.4582 | 0.3643 | 0.0 | 0.8823 | 0.7655 | 0.9433 | 0.0003 | 0.1145 | 0.3475 | 0.0 | | 0.1723 | 10.0 | 2000 | 0.5698 | 0.3647 | 0.4371 | 0.8676 | nan | 0.8435 | 0.9415 | 0.7988 | 0.8204 | 0.5753 | nan | 0.6608 | 0.8658 | 0.0004 | 0.9529 | 0.0692 | 0.0 | 0.0399 | 0.0 | 0.6560 | 0.0 | 0.0 | 0.9084 | 0.2922 | 0.6382 | 0.2656 | 0.0 | 
nan | 0.1818 | 0.5671 | 0.4989 | 0.0 | 0.9476 | 0.8513 | 0.9835 | 0.0000 | 0.1109 | 0.5181 | 0.0 | nan | 0.7495 | 0.8765 | 0.6768 | 0.6880 | 0.3644 | nan | 0.5321 | 0.5254 | 0.0002 | 0.8442 | 0.0641 | 0.0 | 0.0396 | 0.0 | 0.5579 | 0.0 | 0.0 | 0.7536 | 0.2312 | 0.5268 | 0.2439 | 0.0 | nan | 0.1472 | 0.4636 | 0.3572 | 0.0 | 0.8827 | 0.7725 | 0.9408 | 0.0000 | 0.0551 | 0.3764 | 0.0 | | 0.1796 | 10.1 | 2020 | 0.6002 | 0.3561 | 0.4173 | 0.8698 | nan | 0.8854 | 0.9622 | 0.7805 | 0.7688 | 0.4451 | nan | 0.6674 | 0.7923 | 0.0049 | 0.9383 | 0.0769 | 0.0 | 0.0 | 0.0 | 0.6885 | 0.0 | 0.0 | 0.9317 | 0.1050 | 0.4506 | 0.3630 | 0.0 | nan | 0.0482 | 0.5843 | 0.5063 | 0.0 | 0.9593 | 0.8249 | 0.9777 | 0.0114 | 0.1671 | 0.4151 | 0.0 | nan | 0.7746 | 0.8784 | 0.6935 | 0.6923 | 0.3830 | nan | 0.5434 | 0.5779 | 0.0020 | 0.8516 | 0.0677 | 0.0 | 0.0 | 0.0 | 0.5666 | 0.0 | 0.0 | 0.7279 | 0.1004 | 0.4194 | 0.2902 | 0.0 | nan | 0.0433 | 0.4666 | 0.3417 | 0.0 | 0.8740 | 0.7591 | 0.9439 | 0.0088 | 0.0505 | 0.3390 | 0.0 | | 0.1796 | 10.2 | 2040 | 0.6191 | 0.3570 | 0.4340 | 0.8510 | nan | 0.7408 | 0.9279 | 0.7743 | 0.8695 | 0.5925 | nan | 0.7097 | 0.7978 | 0.0060 | 0.9488 | 0.1261 | 0.0 | 0.0286 | 0.0 | 0.6876 | 0.0 | 0.0 | 0.9216 | 0.1616 | 0.5833 | 0.3228 | 0.0 | nan | 0.1482 | 0.5801 | 0.5649 | 0.0 | 0.9540 | 0.8119 | 0.9782 | 0.0149 | 0.1834 | 0.4523 | 0.0 | nan | 0.6754 | 0.8756 | 0.7088 | 0.5711 | 0.3412 | nan | 0.5429 | 0.5913 | 0.0021 | 0.8415 | 0.1061 | 0.0 | 0.0284 | 0.0 | 0.5505 | 0.0 | 0.0 | 0.7490 | 0.1510 | 0.4931 | 0.2742 | 0.0 | nan | 0.1126 | 0.4679 | 0.3353 | 0.0 | 0.8774 | 0.7492 | 0.9425 | 0.0090 | 0.0610 | 0.3659 | 0.0 | | 0.2422 | 10.3 | 2060 | 0.5916 | 0.3663 | 0.4314 | 0.8704 | nan | 0.8751 | 0.9613 | 0.7615 | 0.7939 | 0.4967 | nan | 0.6193 | 0.8196 | 0.0002 | 0.9310 | 0.1808 | 0.0 | 0.0156 | 0.0 | 0.6595 | 0.0 | 0.0 | 0.9255 | 0.2948 | 0.5006 | 0.2877 | 0.0 | nan | 0.1222 | 0.5373 | 0.5146 | 0.0 | 0.9422 | 0.8742 | 0.9779 | 0.0004 | 0.2006 | 0.5132 | 0.0 | nan | 0.7712 
| 0.8737 | 0.7154 | 0.7130 | 0.3837 | nan | 0.5275 | 0.5635 | 0.0001 | 0.8538 | 0.1490 | 0.0 | 0.0155 | 0.0 | 0.5508 | 0.0 | 0.0 | 0.7359 | 0.2405 | 0.4500 | 0.2541 | 0.0 | nan | 0.0939 | 0.4606 | 0.3456 | 0.0 | 0.8812 | 0.7705 | 0.9437 | 0.0003 | 0.0603 | 0.3672 | 0.0 | | 0.2026 | 10.4 | 2080 | 0.5646 | 0.3630 | 0.4326 | 0.8686 | nan | 0.8538 | 0.9433 | 0.7673 | 0.8307 | 0.5518 | nan | 0.6772 | 0.8276 | 0.0011 | 0.9554 | 0.1327 | 0.0 | 0.0814 | 0.0 | 0.6519 | 0.0 | 0.0 | 0.9182 | 0.0747 | 0.5815 | 0.3975 | 0.0 | nan | 0.0813 | 0.5442 | 0.5356 | 0.0 | 0.9545 | 0.8541 | 0.9815 | 0.0002 | 0.2292 | 0.4163 | 0.0 | nan | 0.7587 | 0.8724 | 0.6988 | 0.7286 | 0.3550 | nan | 0.5457 | 0.5709 | 0.0006 | 0.8423 | 0.1143 | 0.0 | 0.0805 | 0.0 | 0.5323 | 0.0 | 0.0 | 0.7452 | 0.0723 | 0.5020 | 0.2997 | 0.0 | nan | 0.0661 | 0.4535 | 0.3444 | 0.0 | 0.8810 | 0.7710 | 0.9428 | 0.0001 | 0.0852 | 0.3510 | 0.0 | | 0.0985 | 10.5 | 2100 | 0.6142 | 0.3533 | 0.4256 | 0.8627 | nan | 0.8371 | 0.9483 | 0.8197 | 0.7611 | 0.5121 | nan | 0.6438 | 0.8376 | 0.0129 | 0.9141 | 0.1704 | 0.0 | 0.0046 | 0.0 | 0.7401 | 0.0 | 0.0 | 0.9358 | 0.0800 | 0.6162 | 0.3199 | 0.0 | nan | 0.0784 | 0.4951 | 0.4927 | 0.0 | 0.9401 | 0.8604 | 0.9809 | 0.0009 | 0.1092 | 0.5067 | 0.0 | nan | 0.7385 | 0.8643 | 0.6029 | 0.6977 | 0.3564 | nan | 0.5397 | 0.4815 | 0.0045 | 0.8455 | 0.1454 | 0.0 | 0.0046 | 0.0 | 0.5542 | 0.0 | 0.0 | 0.7474 | 0.0779 | 0.5220 | 0.2741 | 0.0 | nan | 0.0691 | 0.4378 | 0.3395 | 0.0 | 0.8814 | 0.7696 | 0.9414 | 0.0006 | 0.0474 | 0.3636 | 0.0 | | 0.129 | 10.6 | 2120 | 0.5970 | 0.3600 | 0.4274 | 0.8712 | nan | 0.8555 | 0.9481 | 0.8121 | 0.8462 | 0.4839 | nan | 0.6820 | 0.7683 | 0.0 | 0.9493 | 0.0611 | 0.0 | 0.0 | 0.0 | 0.7116 | 0.0 | 0.0 | 0.9261 | 0.0578 | 0.5941 | 0.3243 | 0.0 | nan | 0.2018 | 0.5643 | 0.5253 | 0.0 | 0.9469 | 0.8967 | 0.9830 | 0.0139 | 0.0910 | 0.4330 | 0.0 | nan | 0.7628 | 0.8796 | 0.6506 | 0.7280 | 0.3696 | nan | 0.5534 | 0.5205 | 0.0 | 0.8490 | 0.0568 | 0.0 | 0.0 | 0.0 | 0.5488 | 
0.0 | 0.0 | 0.7399 | 0.0564 | 0.5002 | 0.2847 | 0.0 | nan | 0.1699 | 0.4503 | 0.3859 | 0.0 | 0.8830 | 0.7681 | 0.9423 | 0.0101 | 0.0500 | 0.3604 | 0.0 | | 0.2195 | 10.7 | 2140 | 0.5875 | 0.3673 | 0.4346 | 0.8706 | nan | 0.8730 | 0.9515 | 0.7622 | 0.8346 | 0.5210 | nan | 0.6573 | 0.8155 | 0.0 | 0.9442 | 0.0891 | 0.0 | 0.2017 | 0.0 | 0.7272 | 0.0 | 0.0 | 0.9272 | 0.1671 | 0.5758 | 0.2867 | 0.0 | nan | 0.1521 | 0.5740 | 0.4965 | 0.0 | 0.9551 | 0.7943 | 0.9859 | 0.0004 | 0.2049 | 0.4082 | 0.0 | nan | 0.7686 | 0.8777 | 0.7037 | 0.7166 | 0.3780 | nan | 0.5573 | 0.5395 | 0.0 | 0.8537 | 0.0818 | 0.0 | 0.1935 | 0.0 | 0.5376 | 0.0 | 0.0 | 0.7443 | 0.1575 | 0.5101 | 0.2481 | 0.0 | nan | 0.1143 | 0.4611 | 0.3603 | 0.0 | 0.8727 | 0.7406 | 0.9378 | 0.0004 | 0.0644 | 0.3343 | 0.0 | | 0.1582 | 10.8 | 2160 | 0.6342 | 0.3674 | 0.4391 | 0.8630 | nan | 0.8124 | 0.9575 | 0.7501 | 0.7968 | 0.5084 | nan | 0.6739 | 0.8201 | 0.0 | 0.9606 | 0.0620 | 0.0 | 0.1560 | 0.0 | 0.6487 | 0.0 | 0.0 | 0.8970 | 0.2414 | 0.6243 | 0.3606 | 0.0 | nan | 0.1296 | 0.5946 | 0.5022 | 0.0 | 0.9500 | 0.8442 | 0.9796 | 0.0014 | 0.3767 | 0.4047 | 0.0 | nan | 0.7290 | 0.8592 | 0.6867 | 0.6946 | 0.3653 | nan | 0.5461 | 0.5826 | 0.0 | 0.8365 | 0.0584 | 0.0 | 0.1430 | 0.0 | 0.5369 | 0.0 | 0.0 | 0.7539 | 0.2097 | 0.5298 | 0.2998 | 0.0 | nan | 0.1009 | 0.4661 | 0.3656 | 0.0 | 0.8808 | 0.7576 | 0.9442 | 0.0009 | 0.0815 | 0.3260 | 0.0 | | 0.141 | 10.9 | 2180 | 0.5708 | 0.3747 | 0.4457 | 0.8710 | nan | 0.8407 | 0.9478 | 0.7710 | 0.8542 | 0.5662 | nan | 0.6839 | 0.8254 | 0.0 | 0.9435 | 0.0333 | 0.0 | 0.0817 | 0.0 | 0.7243 | 0.0 | 0.0008 | 0.9160 | 0.2876 | 0.6263 | 0.4032 | 0.0 | nan | 0.1865 | 0.6024 | 0.5075 | 0.0 | 0.9494 | 0.8427 | 0.9741 | 0.0032 | 0.2235 | 0.4666 | 0.0 | nan | 0.7579 | 0.8806 | 0.6630 | 0.7200 | 0.3798 | nan | 0.5666 | 0.6190 | 0.0 | 0.8524 | 0.0318 | 0.0 | 0.0809 | 0.0 | 0.5689 | 0.0 | 0.0007 | 0.7592 | 0.2481 | 0.5451 | 0.3295 | 0.0 | nan | 0.1405 | 0.4706 | 0.3839 | 0.0 | 0.8806 | 0.7427 | 0.9450 | 
0.0021 | 0.0652 | 0.3562 | 0.0 | | 0.3199 | 11.0 | 2200 | 0.5767 | 0.3700 | 0.4486 | 0.8663 | nan | 0.8247 | 0.9515 | 0.8180 | 0.8596 | 0.5109 | nan | 0.5983 | 0.8435 | 0.0015 | 0.9562 | 0.0444 | 0.0 | 0.1568 | 0.0 | 0.7398 | 0.0 | 0.0 | 0.9145 | 0.3573 | 0.6270 | 0.3562 | 0.0 | nan | 0.1856 | 0.5679 | 0.5520 | 0.0 | 0.9489 | 0.7975 | 0.9869 | 0.0110 | 0.3207 | 0.4260 | 0.0 | nan | 0.7408 | 0.8811 | 0.6150 | 0.7041 | 0.3718 | nan | 0.5332 | 0.5834 | 0.0008 | 0.8431 | 0.0424 | 0.0 | 0.1501 | 0.0 | 0.5590 | 0.0 | 0.0 | 0.7555 | 0.2712 | 0.5466 | 0.2982 | 0.0 | nan | 0.1552 | 0.4603 | 0.3707 | 0.0 | 0.8720 | 0.7199 | 0.9373 | 0.0073 | 0.0800 | 0.3409 | 0.0 | | 0.3278 | 11.1 | 2220 | 0.6226 | 0.3555 | 0.4225 | 0.8637 | nan | 0.8071 | 0.9370 | 0.7867 | 0.8534 | 0.6282 | nan | 0.6651 | 0.8242 | 0.0 | 0.9478 | 0.0697 | 0.0 | 0.0220 | 0.0 | 0.7002 | 0.0 | 0.0007 | 0.9362 | 0.0432 | 0.5970 | 0.2850 | 0.0 | nan | 0.0440 | 0.5153 | 0.5356 | 0.0 | 0.9503 | 0.8485 | 0.9854 | 0.0018 | 0.0713 | 0.4643 | 0.0 | nan | 0.7277 | 0.8729 | 0.6498 | 0.7103 | 0.3589 | nan | 0.5541 | 0.5788 | 0.0 | 0.8502 | 0.0648 | 0.0 | 0.0218 | 0.0 | 0.5862 | 0.0 | 0.0006 | 0.7406 | 0.0431 | 0.5097 | 0.2525 | 0.0 | nan | 0.0372 | 0.4472 | 0.3750 | 0.0 | 0.8800 | 0.7613 | 0.9395 | 0.0014 | 0.0410 | 0.3711 | 0.0 | | 0.1683 | 11.2 | 2240 | 0.6238 | 0.3707 | 0.4469 | 0.8674 | nan | 0.8060 | 0.9628 | 0.7842 | 0.8534 | 0.4724 | nan | 0.6818 | 0.8545 | 0.0 | 0.9610 | 0.0733 | 0.0 | 0.2164 | 0.0 | 0.6961 | 0.0 | 0.0 | 0.9128 | 0.3061 | 0.5964 | 0.3623 | 0.0 | nan | 0.2198 | 0.5741 | 0.5310 | 0.0 | 0.9357 | 0.8772 | 0.9830 | 0.0016 | 0.1934 | 0.4458 | 0.0 | nan | 0.7305 | 0.8713 | 0.6423 | 0.7167 | 0.3767 | nan | 0.5533 | 0.5352 | 0.0 | 0.8412 | 0.0703 | 0.0 | 0.2041 | 0.0 | 0.5618 | 0.0 | 0.0 | 0.7557 | 0.2336 | 0.5064 | 0.2999 | 0.0 | nan | 0.1291 | 0.4647 | 0.3650 | 0.0 | 0.8804 | 0.7569 | 0.9428 | 0.0015 | 0.0728 | 0.3509 | 0.0 | | 0.1605 | 11.3 | 2260 | 0.5795 | 0.3728 | 0.4441 | 0.8687 | nan | 0.8542 | 
0.9531 | 0.7614 | 0.8369 | 0.5294 | nan | 0.6406 | 0.7638 | 0.0 | 0.9529 | 0.1167 | 0.0 | 0.3061 | 0.0 | 0.7072 | 0.0 | 0.0 | 0.9357 | 0.1757 | 0.5333 | 0.3332 | 0.0 | nan | 0.2005 | 0.5990 | 0.5629 | 0.0 | 0.9396 | 0.8592 | 0.9558 | 0.0038 | 0.2350 | 0.4551 | 0.0 | nan | 0.7537 | 0.8742 | 0.6982 | 0.7383 | 0.3793 | nan | 0.5505 | 0.6206 | 0.0 | 0.8509 | 0.1002 | 0.0 | 0.2371 | 0.0 | 0.5679 | 0.0 | 0.0 | 0.7413 | 0.1622 | 0.4769 | 0.2774 | 0.0 | nan | 0.1264 | 0.4381 | 0.3432 | 0.0 | 0.8818 | 0.7618 | 0.9269 | 0.0031 | 0.0698 | 0.3487 | 0.0 | | 0.1396 | 11.4 | 2280 | 0.5767 | 0.3647 | 0.4384 | 0.8682 | nan | 0.8746 | 0.9356 | 0.8180 | 0.8394 | 0.4808 | nan | 0.6852 | 0.7584 | 0.0 | 0.9572 | 0.1009 | 0.0 | 0.3371 | 0.0 | 0.6918 | 0.0 | 0.0 | 0.9334 | 0.0883 | 0.5746 | 0.3300 | 0.0 | nan | 0.1219 | 0.5086 | 0.5141 | 0.0 | 0.9436 | 0.8834 | 0.9812 | 0.0021 | 0.2657 | 0.4033 | 0.0 | nan | 0.7602 | 0.8805 | 0.5760 | 0.7281 | 0.3592 | nan | 0.5368 | 0.6123 | 0.0 | 0.8479 | 0.0859 | 0.0 | 0.2574 | 0.0 | 0.5466 | 0.0 | 0.0 | 0.7417 | 0.0864 | 0.4825 | 0.2789 | 0.0 | nan | 0.0901 | 0.4440 | 0.3503 | 0.0 | 0.8823 | 0.7613 | 0.9418 | 0.0021 | 0.0847 | 0.3331 | 0.0 | | 0.1348 | 11.5 | 2300 | 0.6258 | 0.3692 | 0.4448 | 0.8655 | nan | 0.8082 | 0.9612 | 0.7932 | 0.8424 | 0.4898 | nan | 0.6678 | 0.8474 | 0.0 | 0.9463 | 0.1012 | 0.0 | 0.5633 | 0.0 | 0.6651 | 0.0 | 0.0 | 0.9174 | 0.1172 | 0.5338 | 0.3249 | 0.0 | nan | 0.1375 | 0.5718 | 0.5233 | 0.0 | 0.9462 | 0.8342 | 0.9834 | 0.0045 | 0.1324 | 0.5217 | 0.0 | nan | 0.7343 | 0.8648 | 0.6512 | 0.7241 | 0.3822 | nan | 0.5542 | 0.5387 | 0.0 | 0.8505 | 0.0884 | 0.0 | 0.3417 | 0.0 | 0.5591 | 0.0 | 0.0 | 0.7463 | 0.1116 | 0.4585 | 0.2749 | 0.0 | nan | 0.1039 | 0.4578 | 0.3698 | 0.0 | 0.8810 | 0.7603 | 0.9383 | 0.0037 | 0.0536 | 0.3669 | 0.0 | | 0.1144 | 11.6 | 2320 | 0.6251 | 0.3706 | 0.4457 | 0.8656 | nan | 0.8156 | 0.9497 | 0.8051 | 0.8338 | 0.5297 | nan | 0.7026 | 0.8641 | 0.0015 | 0.9475 | 0.0807 | 0.0 | 0.5249 | 0.0 | 0.7219 | 0.0 | 
0.0 | 0.9244 | 0.1660 | 0.5631 | 0.2868 | 0.0 | nan | 0.0826 | 0.5438 | 0.5127 | 0.0 | 0.9554 | 0.8453 | 0.9789 | 0.0003 | 0.1870 | 0.4395 | 0.0 | nan | 0.7347 | 0.8709 | 0.6465 | 0.7180 | 0.3731 | nan | 0.5482 | 0.5144 | 0.0008 | 0.8519 | 0.0725 | 0.0 | 0.4376 | 0.0 | 0.5665 | 0.0 | 0.0 | 0.7457 | 0.1464 | 0.4915 | 0.2523 | 0.0 | nan | 0.0738 | 0.4565 | 0.3591 | 0.0 | 0.8778 | 0.7599 | 0.9440 | 0.0003 | 0.0663 | 0.3493 | 0.0 | | 0.1364 | 11.7 | 2340 | 0.6131 | 0.3674 | 0.4372 | 0.8665 | nan | 0.8147 | 0.9550 | 0.7895 | 0.8548 | 0.4751 | nan | 0.6712 | 0.7707 | 0.0073 | 0.9493 | 0.0768 | 0.0 | 0.0791 | 0.0 | 0.7509 | 0.0 | 0.0 | 0.9269 | 0.2658 | 0.5650 | 0.3656 | 0.0 | nan | 0.1236 | 0.5760 | 0.5266 | 0.0 | 0.9430 | 0.8617 | 0.9851 | 0.0006 | 0.2049 | 0.4518 | 0.0 | nan | 0.7291 | 0.8676 | 0.6741 | 0.6949 | 0.3800 | nan | 0.5491 | 0.5884 | 0.0023 | 0.8533 | 0.0695 | 0.0 | 0.0772 | 0.0 | 0.5511 | 0.0 | 0.0 | 0.7475 | 0.2158 | 0.4930 | 0.2880 | 0.0 | nan | 0.1135 | 0.4663 | 0.3732 | 0.0 | 0.8833 | 0.7754 | 0.9408 | 0.0006 | 0.0709 | 0.3511 | 0.0 | | 0.2197 | 11.8 | 2360 | 0.5734 | 0.3734 | 0.4465 | 0.8701 | nan | 0.8327 | 0.9541 | 0.7742 | 0.8469 | 0.5193 | nan | 0.6696 | 0.7860 | 0.0435 | 0.9492 | 0.0867 | 0.0 | 0.1161 | 0.0 | 0.7444 | 0.0 | 0.0 | 0.9244 | 0.2077 | 0.6531 | 0.3496 | 0.0 | nan | 0.2229 | 0.5427 | 0.5329 | 0.0 | 0.9429 | 0.8597 | 0.9803 | 0.0501 | 0.2688 | 0.4307 | 0.0 | nan | 0.7428 | 0.8713 | 0.6954 | 0.7325 | 0.3784 | nan | 0.5642 | 0.5273 | 0.0127 | 0.8512 | 0.0784 | 0.0 | 0.1132 | 0.0 | 0.5606 | 0.0 | 0.0 | 0.7609 | 0.1806 | 0.5258 | 0.2898 | 0.0 | nan | 0.1667 | 0.4618 | 0.3730 | 0.0 | 0.8832 | 0.7786 | 0.9455 | 0.0381 | 0.0867 | 0.3316 | 0.0 | | 0.2554 | 11.9 | 2380 | 0.5918 | 0.3705 | 0.4438 | 0.8689 | nan | 0.8391 | 0.9453 | 0.7625 | 0.8541 | 0.5608 | nan | 0.7264 | 0.8564 | 0.0 | 0.9460 | 0.1626 | 0.0 | 0.1483 | 0.0 | 0.7141 | 0.0 | 0.0 | 0.9265 | 0.1750 | 0.5530 | 0.3455 | 0.0 | nan | 0.1034 | 0.5706 | 0.5443 | 0.0 | 0.9554 | 0.8439 | 
0.9802 | 0.0052 | 0.3029 | 0.3817 | 0.0 | nan | 0.7510 | 0.8732 | 0.6893 | 0.7540 | 0.3691 | nan | 0.5639 | 0.5705 | 0.0 | 0.8499 | 0.1376 | 0.0 | 0.1311 | 0.0 | 0.5748 | 0.0 | 0.0 | 0.7445 | 0.1558 | 0.4815 | 0.2829 | 0.0 | nan | 0.0955 | 0.4686 | 0.3581 | 0.0 | 0.8806 | 0.7759 | 0.9447 | 0.0045 | 0.0779 | 0.3222 | 0.0 | | 0.2711 | 12.0 | 2400 | 0.6319 | 0.3632 | 0.4330 | 0.8671 | nan | 0.8422 | 0.9554 | 0.7821 | 0.8127 | 0.5184 | nan | 0.6174 | 0.6287 | 0.2510 | 0.9395 | 0.1899 | 0.0 | 0.0304 | 0.0 | 0.7609 | 0.0 | 0.0 | 0.9433 | 0.1254 | 0.5463 | 0.3575 | 0.0 | nan | 0.0987 | 0.5525 | 0.4929 | 0.0 | 0.9425 | 0.8727 | 0.9788 | 0.0021 | 0.1966 | 0.4170 | 0.0 | nan | 0.7427 | 0.8662 | 0.6861 | 0.7240 | 0.3806 | nan | 0.5351 | 0.5373 | 0.0305 | 0.8533 | 0.1544 | 0.0 | 0.0300 | 0.0 | 0.5271 | 0.0 | 0.0 | 0.7409 | 0.1153 | 0.4812 | 0.2847 | 0.0 | nan | 0.0876 | 0.4611 | 0.3645 | 0.0 | 0.8860 | 0.7777 | 0.9444 | 0.0015 | 0.0678 | 0.3417 | 0.0 | | 0.1332 | 12.1 | 2420 | 0.6043 | 0.3666 | 0.4293 | 0.8684 | nan | 0.8262 | 0.9548 | 0.7555 | 0.8478 | 0.5225 | nan | 0.6493 | 0.8324 | 0.0038 | 0.9513 | 0.1307 | 0.0 | 0.0527 | 0.0 | 0.6916 | 0.0 | 0.0 | 0.9388 | 0.1511 | 0.5920 | 0.3121 | 0.0 | nan | 0.0948 | 0.5519 | 0.4701 | 0.0 | 0.9443 | 0.8559 | 0.9821 | 0.0002 | 0.1775 | 0.4472 | 0.0 | nan | 0.7376 | 0.8679 | 0.7048 | 0.7237 | 0.3671 | nan | 0.5495 | 0.5856 | 0.0021 | 0.8509 | 0.1173 | 0.0 | 0.0517 | 0.0 | 0.5722 | 0.0 | 0.0 | 0.7460 | 0.1402 | 0.5015 | 0.2701 | 0.0 | nan | 0.0822 | 0.4603 | 0.3715 | 0.0 | 0.8849 | 0.7750 | 0.9458 | 0.0002 | 0.0680 | 0.3565 | 0.0 | | 0.2308 | 12.2 | 2440 | 0.5906 | 0.3718 | 0.4328 | 0.8706 | nan | 0.8531 | 0.9513 | 0.7697 | 0.7995 | 0.5138 | nan | 0.6808 | 0.8252 | 0.0 | 0.9505 | 0.1204 | 0.0 | 0.1194 | 0.0 | 0.6586 | 0.0 | 0.0 | 0.9241 | 0.0845 | 0.5784 | 0.3090 | 0.0 | nan | 0.1074 | 0.5745 | 0.5363 | 0.0 | 0.9534 | 0.8744 | 0.9838 | 0.0069 | 0.1622 | 0.5118 | 0.0 | nan | 0.7502 | 0.8699 | 0.7013 | 0.7250 | 0.3777 | nan | 0.5536 | 
0.6461 | 0.0 | 0.8491 | 0.1068 | 0.0 | 0.1119 | 0.0 | 0.5816 | 0.0 | 0.0 | 0.7484 | 0.0827 | 0.5018 | 0.2731 | 0.0 | nan | 0.0941 | 0.4679 | 0.3875 | 0.0 | 0.8832 | 0.7777 | 0.9435 | 0.0049 | 0.0723 | 0.3883 | 0.0 | | 0.1515 | 12.3 | 2460 | 0.5838 | 0.3799 | 0.4509 | 0.8692 | nan | 0.8421 | 0.9520 | 0.7778 | 0.8411 | 0.5345 | nan | 0.6488 | 0.8483 | 0.0140 | 0.9347 | 0.1763 | 0.0 | 0.4235 | 0.0 | 0.6691 | 0.0 | 0.0 | 0.9141 | 0.1371 | 0.6244 | 0.3622 | 0.0 | nan | 0.1605 | 0.5416 | 0.5749 | 0.0 | 0.9604 | 0.8121 | 0.9781 | 0.0010 | 0.2560 | 0.4436 | 0.0 | nan | 0.7502 | 0.8705 | 0.6726 | 0.7230 | 0.3918 | nan | 0.5458 | 0.5629 | 0.0056 | 0.8510 | 0.1486 | 0.0 | 0.3683 | 0.0 | 0.5684 | 0.0 | 0.0 | 0.7557 | 0.1281 | 0.5094 | 0.3033 | 0.0 | nan | 0.1315 | 0.4601 | 0.3689 | 0.0 | 0.8734 | 0.7513 | 0.9462 | 0.0008 | 0.1108 | 0.3598 | 0.0 | | 0.1225 | 12.4 | 2480 | 0.6349 | 0.3697 | 0.4419 | 0.8653 | nan | 0.8298 | 0.9563 | 0.7270 | 0.8458 | 0.4908 | nan | 0.6935 | 0.8170 | 0.0016 | 0.9535 | 0.1277 | 0.0 | 0.1943 | 0.0 | 0.6818 | 0.0 | 0.0 | 0.9243 | 0.0993 | 0.5907 | 0.3374 | 0.0 | nan | 0.1578 | 0.5889 | 0.5923 | 0.0 | 0.9541 | 0.7663 | 0.9832 | 0.0017 | 0.4623 | 0.3629 | 0.0 | nan | 0.7383 | 0.8689 | 0.6986 | 0.7201 | 0.3735 | nan | 0.5547 | 0.6206 | 0.0008 | 0.8499 | 0.1108 | 0.0 | 0.1761 | 0.0 | 0.5647 | 0.0 | 0.0 | 0.7506 | 0.0919 | 0.5152 | 0.2830 | 0.0 | nan | 0.1368 | 0.4754 | 0.3761 | 0.0 | 0.8704 | 0.7117 | 0.9433 | 0.0012 | 0.0886 | 0.3105 | 0.0 | | 0.17 | 12.5 | 2500 | 0.6301 | 0.3746 | 0.4515 | 0.8591 | nan | 0.7800 | 0.9414 | 0.7639 | 0.8348 | 0.5670 | nan | 0.6941 | 0.8344 | 0.0025 | 0.9472 | 0.1556 | 0.0 | 0.2321 | 0.0 | 0.7384 | 0.0 | 0.0 | 0.9207 | 0.1489 | 0.5901 | 0.3355 | 0.0 | nan | 0.2611 | 0.5878 | 0.5471 | 0.0 | 0.9567 | 0.8401 | 0.9839 | 0.0020 | 0.3890 | 0.3939 | 0.0 | nan | 0.6970 | 0.8698 | 0.7009 | 0.6399 | 0.3747 | nan | 0.5503 | 0.6201 | 0.0013 | 0.8517 | 0.1344 | 0.0 | 0.2098 | 0.0 | 0.5732 | 0.0 | 0.0 | 0.7501 | 0.1287 | 0.5132 | 0.2866 
| 0.0 | nan | 0.2142 | 0.4713 | 0.3932 | 0.0 | 0.8804 | 0.7604 | 0.9405 | 0.0012 | 0.0953 | 0.3289 | 0.0 | | 0.165 | 12.6 | 2520 | 0.6584 | 0.3661 | 0.4352 | 0.8570 | nan | 0.7368 | 0.9524 | 0.7641 | 0.8569 | 0.5238 | nan | 0.6428 | 0.8209 | 0.0036 | 0.9346 | 0.1399 | 0.0 | 0.1906 | 0.0 | 0.7413 | 0.0 | 0.0 | 0.9404 | 0.2085 | 0.5960 | 0.3233 | 0.0 | nan | 0.1134 | 0.5718 | 0.5039 | 0.0 | 0.9471 | 0.8686 | 0.9802 | 0.0023 | 0.1083 | 0.4567 | 0.0 | nan | 0.6616 | 0.8708 | 0.6893 | 0.5929 | 0.3828 | nan | 0.5459 | 0.5759 | 0.0018 | 0.8532 | 0.1247 | 0.0 | 0.1748 | 0.0 | 0.5620 | 0.0 | 0.0 | 0.7446 | 0.1639 | 0.4943 | 0.2780 | 0.0 | nan | 0.0852 | 0.4645 | 0.3842 | 0.0 | 0.8847 | 0.7794 | 0.9446 | 0.0020 | 0.0760 | 0.3766 | 0.0 | | 0.1429 | 12.7 | 2540 | 0.6257 | 0.3674 | 0.4410 | 0.8642 | nan | 0.8160 | 0.9534 | 0.7486 | 0.8338 | 0.4912 | nan | 0.6777 | 0.8433 | 0.0 | 0.9496 | 0.1179 | 0.0 | 0.3490 | 0.0 | 0.7247 | 0.0 | 0.0 | 0.9433 | 0.1158 | 0.5400 | 0.3428 | 0.0 | nan | 0.0911 | 0.6048 | 0.5402 | 0.0 | 0.9376 | 0.8745 | 0.9812 | 0.0009 | 0.2446 | 0.3913 | 0.0 | nan | 0.7176 | 0.8700 | 0.6719 | 0.6720 | 0.3812 | nan | 0.5478 | 0.5687 | 0.0 | 0.8470 | 0.1068 | 0.0 | 0.2404 | 0.0 | 0.5742 | 0.0 | 0.0 | 0.7373 | 0.1023 | 0.4654 | 0.2893 | 0.0 | nan | 0.0700 | 0.4721 | 0.3740 | 0.0 | 0.8863 | 0.7742 | 0.9449 | 0.0008 | 0.1089 | 0.3334 | 0.0 | | 0.2148 | 12.8 | 2560 | 0.6506 | 0.3615 | 0.4361 | 0.8657 | nan | 0.8294 | 0.9461 | 0.7752 | 0.8529 | 0.5457 | nan | 0.6227 | 0.8170 | 0.0 | 0.9457 | 0.1039 | 0.0 | 0.1968 | 0.0 | 0.7317 | 0.0 | 0.0 | 0.9237 | 0.0543 | 0.5955 | 0.3732 | 0.0 | nan | 0.0714 | 0.5452 | 0.5722 | 0.0 | 0.9555 | 0.8497 | 0.9830 | 0.0013 | 0.3009 | 0.3629 | 0.0 | nan | 0.7423 | 0.8720 | 0.6351 | 0.7116 | 0.3801 | nan | 0.5362 | 0.5780 | 0.0 | 0.8514 | 0.0939 | 0.0 | 0.1506 | 0.0 | 0.5538 | 0.0 | 0.0 | 0.7452 | 0.0523 | 0.4998 | 0.3005 | 0.0 | nan | 0.0557 | 0.4536 | 0.3680 | 0.0 | 0.8828 | 0.7690 | 0.9439 | 0.0008 | 0.0857 | 0.3067 | 0.0 | | 0.1008 | 
12.9 | 2580 | 0.6366 | 0.3646 | 0.4308 | 0.8690 | nan | 0.8348 | 0.9596 | 0.7654 | 0.8486 | 0.4781 | nan | 0.6440 | 0.8230 | 0.0 | 0.9401 | 0.0809 | 0.0 | 0.0403 | 0.0 | 0.7241 | 0.0 | 0.0 | 0.9404 | 0.0910 | 0.5811 | 0.3870 | 0.0 | nan | 0.0529 | 0.6005 | 0.5440 | 0.0 | 0.9417 | 0.8558 | 0.9761 | 0.0091 | 0.2742 | 0.3930 | 0.0 | nan | 0.7476 | 0.8739 | 0.6846 | 0.7247 | 0.3771 | nan | 0.5427 | 0.6177 | 0.0 | 0.8530 | 0.0728 | 0.0 | 0.0367 | 0.0 | 0.5974 | 0.0 | 0.0 | 0.7460 | 0.0835 | 0.4990 | 0.3228 | 0.0 | nan | 0.0427 | 0.4805 | 0.3583 | 0.0 | 0.8849 | 0.7654 | 0.9476 | 0.0044 | 0.0760 | 0.3287 | 0.0 | | 0.1522 | 13.0 | 2600 | 0.6428 | 0.3646 | 0.4295 | 0.8690 | nan | 0.8303 | 0.9632 | 0.7664 | 0.8164 | 0.4636 | nan | 0.6786 | 0.7941 | 0.0266 | 0.9519 | 0.0963 | 0.0 | 0.0264 | 0.0 | 0.7352 | 0.0 | 0.0 | 0.9292 | 0.1689 | 0.5832 | 0.3683 | 0.0 | nan | 0.0134 | 0.5600 | 0.5372 | 0.0 | 0.9532 | 0.8346 | 0.9827 | 0.0022 | 0.2319 | 0.4291 | 0.0 | nan | 0.7495 | 0.8672 | 0.6985 | 0.7028 | 0.3705 | nan | 0.5573 | 0.5974 | 0.0090 | 0.8505 | 0.0855 | 0.0 | 0.0226 | 0.0 | 0.5779 | 0.0 | 0.0 | 0.7493 | 0.1519 | 0.4990 | 0.3153 | 0.0 | nan | 0.0110 | 0.4666 | 0.3700 | 0.0 | 0.8830 | 0.7672 | 0.9455 | 0.0016 | 0.0710 | 0.3461 | 0.0 | | 0.2057 | 13.1 | 2620 | 0.6362 | 0.3648 | 0.4290 | 0.8681 | nan | 0.8362 | 0.9518 | 0.7971 | 0.8013 | 0.4958 | nan | 0.6809 | 0.8544 | 0.0 | 0.9491 | 0.1131 | 0.0 | 0.0089 | 0.0 | 0.7202 | 0.0 | 0.0 | 0.9329 | 0.0788 | 0.5645 | 0.3541 | 0.0 | nan | 0.0622 | 0.5870 | 0.5498 | 0.0 | 0.9617 | 0.8374 | 0.9785 | 0.0082 | 0.1591 | 0.4444 | 0.0 | nan | 0.7457 | 0.8653 | 0.7014 | 0.7078 | 0.3798 | nan | 0.5451 | 0.6337 | 0.0 | 0.8506 | 0.1009 | 0.0 | 0.0084 | 0.0 | 0.5899 | 0.0 | 0.0 | 0.7449 | 0.0734 | 0.4999 | 0.3024 | 0.0 | nan | 0.0552 | 0.4695 | 0.3761 | 0.0 | 0.8769 | 0.7513 | 0.9487 | 0.0065 | 0.0730 | 0.3664 | 0.0 | | 0.1275 | 13.2 | 2640 | 0.6206 | 0.3704 | 0.4324 | 0.8716 | nan | 0.8514 | 0.9568 | 0.7906 | 0.8056 | 0.5298 | nan | 0.6604 | 
0.8550 | 0.0 | 0.9368 | 0.1211 | 0.0 | 0.0309 | 0.0 | 0.7165 | 0.0 | 0.0 | 0.9430 | 0.1613 | 0.5821 | 0.3947 | 0.0 | nan | 0.0998 | 0.5349 | 0.5008 | 0.0 | 0.9382 | 0.8708 | 0.9835 | 0.0024 | 0.0890 | 0.4824 | 0.0 | nan | 0.7574 | 0.8693 | 0.7222 | 0.7114 | 0.3926 | nan | 0.5542 | 0.5887 | 0.0 | 0.8555 | 0.1079 | 0.0 | 0.0280 | 0.0 | 0.6065 | 0.0 | 0.0 | 0.7420 | 0.1495 | 0.5062 | 0.3216 | 0.0 | nan | 0.0856 | 0.4645 | 0.3651 | 0.0 | 0.8853 | 0.7743 | 0.9452 | 0.0023 | 0.0430 | 0.3754 | 0.0 | | 0.1347 | 13.3 | 2660 | 0.6095 | 0.3719 | 0.4455 | 0.8700 | nan | 0.8279 | 0.9563 | 0.7853 | 0.8425 | 0.5450 | nan | 0.6901 | 0.8485 | 0.0 | 0.9497 | 0.1046 | 0.0 | 0.0767 | 0.0 | 0.7409 | 0.0 | 0.0 | 0.9110 | 0.2988 | 0.5294 | 0.3394 | 0.0 | nan | 0.1154 | 0.6044 | 0.5436 | 0.0 | 0.9577 | 0.8662 | 0.9823 | 0.0019 | 0.3407 | 0.3988 | 0.0 | nan | 0.7461 | 0.8734 | 0.6989 | 0.7428 | 0.3914 | nan | 0.5611 | 0.5997 | 0.0 | 0.8506 | 0.0949 | 0.0 | 0.0700 | 0.0 | 0.5830 | 0.0 | 0.0 | 0.7483 | 0.2296 | 0.4694 | 0.2874 | 0.0 | nan | 0.0914 | 0.4729 | 0.3638 | 0.0 | 0.8769 | 0.7703 | 0.9457 | 0.0016 | 0.0945 | 0.3362 | 0.0 | | 0.2485 | 13.4 | 2680 | 0.6268 | 0.3714 | 0.4387 | 0.8683 | nan | 0.8264 | 0.9537 | 0.8050 | 0.8298 | 0.5365 | nan | 0.6674 | 0.8409 | 0.0 | 0.9517 | 0.1271 | 0.0 | 0.0620 | 0.0 | 0.7334 | 0.0 | 0.0001 | 0.9130 | 0.1673 | 0.6948 | 0.2833 | 0.0 | nan | 0.1799 | 0.5789 | 0.4927 | 0.0 | 0.9572 | 0.7666 | 0.9814 | 0.0028 | 0.2146 | 0.4709 | 0.0 | nan | 0.7393 | 0.8683 | 0.6553 | 0.7410 | 0.3897 | nan | 0.5474 | 0.6279 | 0.0 | 0.8533 | 0.1132 | 0.0 | 0.0553 | 0.0 | 0.6019 | 0.0 | 0.0001 | 0.7628 | 0.1513 | 0.5262 | 0.2585 | 0.0 | nan | 0.1395 | 0.4651 | 0.3625 | 0.0 | 0.8740 | 0.7215 | 0.9462 | 0.0020 | 0.1102 | 0.3729 | 0.0 | | 0.1525 | 13.5 | 2700 | 0.6110 | 0.3650 | 0.4255 | 0.8704 | nan | 0.8474 | 0.9557 | 0.7782 | 0.8435 | 0.5040 | nan | 0.6619 | 0.7982 | 0.0 | 0.9442 | 0.0899 | 0.0 | 0.0353 | 0.0 | 0.6722 | 0.0 | 0.0 | 0.9479 | 0.0813 | 0.5063 | 0.3830 | 0.0 | 
nan | 0.0530 | 0.5744 | 0.5097 | 0.0 | 0.9437 | 0.8528 | 0.9840 | 0.0021 | 0.1918 | 0.4547 | 0.0 | nan | 0.7505 | 0.8771 | 0.6676 | 0.7346 | 0.3815 | nan | 0.5483 | 0.6471 | 0.0 | 0.8551 | 0.0823 | 0.0 | 0.0322 | 0.0 | 0.5850 | 0.0 | 0.0 | 0.7303 | 0.0762 | 0.4660 | 0.3171 | 0.0 | nan | 0.0483 | 0.4664 | 0.3777 | 0.0 | 0.8836 | 0.7640 | 0.9446 | 0.0016 | 0.0762 | 0.3674 | 0.0 | | 0.143 | 13.6 | 2720 | 0.6194 | 0.3740 | 0.4436 | 0.8696 | nan | 0.8290 | 0.9541 | 0.7935 | 0.8487 | 0.5119 | nan | 0.6949 | 0.8598 | 0.0002 | 0.9392 | 0.1333 | 0.0 | 0.1574 | 0.0 | 0.6618 | 0.0 | 0.0 | 0.9352 | 0.2334 | 0.5895 | 0.3310 | 0.0 | nan | 0.1549 | 0.5873 | 0.5214 | 0.0 | 0.9487 | 0.8358 | 0.9781 | 0.0015 | 0.2644 | 0.4311 | 0.0 | nan | 0.7467 | 0.8738 | 0.6418 | 0.7283 | 0.3822 | nan | 0.5565 | 0.6359 | 0.0002 | 0.8549 | 0.1172 | 0.0 | 0.1287 | 0.0 | 0.5698 | 0.0 | 0.0 | 0.7493 | 0.1940 | 0.5161 | 0.2898 | 0.0 | nan | 0.1212 | 0.4736 | 0.3769 | 0.0 | 0.8815 | 0.7545 | 0.9493 | 0.0012 | 0.0793 | 0.3459 | 0.0 | | 0.1094 | 13.7 | 2740 | 0.6446 | 0.3740 | 0.4465 | 0.8647 | nan | 0.8092 | 0.9513 | 0.7723 | 0.8255 | 0.5528 | nan | 0.6324 | 0.8453 | 0.0056 | 0.9468 | 0.1589 | 0.0 | 0.2978 | 0.0000 | 0.6940 | 0.0 | 0.0 | 0.9222 | 0.1008 | 0.6086 | 0.4271 | 0.0 | nan | 0.1750 | 0.5541 | 0.5967 | 0.0 | 0.9486 | 0.8623 | 0.9790 | 0.0019 | 0.1964 | 0.4233 | 0.0 | nan | 0.7344 | 0.8628 | 0.6750 | 0.6872 | 0.3726 | nan | 0.5472 | 0.6229 | 0.0033 | 0.8487 | 0.1334 | 0.0 | 0.2540 | 0.0000 | 0.5686 | 0.0 | 0.0 | 0.7526 | 0.0916 | 0.5157 | 0.3333 | 0.0 | nan | 0.1280 | 0.4655 | 0.3615 | 0.0 | 0.8826 | 0.7633 | 0.9485 | 0.0014 | 0.0656 | 0.3464 | 0.0 | | 0.1719 | 13.8 | 2760 | 0.6517 | 0.3692 | 0.4442 | 0.8633 | nan | 0.8127 | 0.9523 | 0.7780 | 0.8328 | 0.4678 | nan | 0.6884 | 0.8158 | 0.0169 | 0.9453 | 0.1204 | 0.0 | 0.3134 | 0.0 | 0.7445 | 0.0 | 0.0 | 0.9160 | 0.1944 | 0.5866 | 0.3583 | 0.0 | nan | 0.0522 | 0.5863 | 0.5834 | 0.0 | 0.9586 | 0.7890 | 0.9860 | 0.0052 | 0.2654 | 0.4462 | 0.0 | nan | 
0.7298 | 0.8677 | 0.6782 | 0.6843 | 0.3674 | nan | 0.5435 | 0.6222 | 0.0066 | 0.8529 | 0.1059 | 0.0 | 0.2405 | 0.0 | 0.5573 | 0.0 | 0.0 | 0.7502 | 0.1709 | 0.5163 | 0.2980 | 0.0 | nan | 0.0417 | 0.4815 | 0.3531 | 0.0 | 0.8734 | 0.7261 | 0.9391 | 0.0036 | 0.0683 | 0.3350 | 0.0 | | 0.197 | 13.9 | 2780 | 0.6215 | 0.3729 | 0.4418 | 0.8687 | nan | 0.8668 | 0.9514 | 0.7759 | 0.7972 | 0.5255 | nan | 0.6496 | 0.8175 | 0.0228 | 0.9515 | 0.0731 | 0.0 | 0.2487 | 0.0 | 0.7049 | 0.0 | 0.0 | 0.9290 | 0.1966 | 0.5696 | 0.3480 | 0.0 | nan | 0.1606 | 0.5942 | 0.5603 | 0.0 | 0.9526 | 0.7950 | 0.9848 | 0.0269 | 0.2232 | 0.4112 | 0.0 | nan | 0.7569 | 0.8791 | 0.6367 | 0.7200 | 0.3761 | nan | 0.5464 | 0.6425 | 0.0090 | 0.8494 | 0.0679 | 0.0 | 0.1974 | 0.0 | 0.5789 | 0.0 | 0.0 | 0.7471 | 0.1676 | 0.5024 | 0.2995 | 0.0 | nan | 0.1214 | 0.4876 | 0.3790 | 0.0 | 0.8760 | 0.7302 | 0.9424 | 0.0164 | 0.0679 | 0.3361 | 0.0 | | 0.1238 | 14.0 | 2800 | 0.6008 | 0.3822 | 0.4546 | 0.8742 | nan | 0.8710 | 0.9515 | 0.7717 | 0.8274 | 0.4743 | nan | 0.7189 | 0.8324 | 0.0504 | 0.9358 | 0.1246 | 0.0 | 0.4430 | 0.0 | 0.7318 | 0.0 | 0.0173 | 0.9097 | 0.0782 | 0.6669 | 0.3868 | 0.0 | nan | 0.1546 | 0.5889 | 0.5536 | 0.0 | 0.9543 | 0.8475 | 0.9784 | 0.0074 | 0.1882 | 0.4838 | 0.0 | nan | 0.7677 | 0.8817 | 0.6589 | 0.7277 | 0.3922 | nan | 0.5545 | 0.6251 | 0.0184 | 0.8534 | 0.1129 | 0.0 | 0.3401 | 0.0 | 0.5678 | 0.0 | 0.0133 | 0.7551 | 0.0757 | 0.5311 | 0.3201 | 0.0 | nan | 0.1346 | 0.4819 | 0.3939 | 0.0 | 0.8806 | 0.7508 | 0.9467 | 0.0057 | 0.0696 | 0.3693 | 0.0 | | 0.1317 | 14.1 | 2820 | 0.6037 | 0.3756 | 0.4368 | 0.8766 | nan | 0.8728 | 0.9625 | 0.7553 | 0.8525 | 0.4601 | nan | 0.6745 | 0.8292 | 0.0124 | 0.9444 | 0.1426 | 0.0 | 0.0085 | 0.0 | 0.6589 | 0.0 | 0.0052 | 0.9372 | 0.1747 | 0.5779 | 0.3701 | 0.0 | nan | 0.1536 | 0.6059 | 0.5533 | 0.0 | 0.9529 | 0.8101 | 0.9833 | 0.0028 | 0.2206 | 0.4557 | 0.0 | nan | 0.7769 | 0.8838 | 0.7156 | 0.7342 | 0.3727 | nan | 0.5626 | 0.6729 | 0.0066 | 0.8449 | 0.1261 | 
0.0 | 0.0054 | 0.0 | 0.5665 | 0.0 | 0.0038 | 0.7449 | 0.1511 | 0.5088 | 0.3111 | 0.0 | nan | 0.1232 | 0.4809 | 0.3857 | 0.0 | 0.8812 | 0.7537 | 0.9452 | 0.0027 | 0.0850 | 0.3737 | 0.0 | | 0.1561 | 14.2 | 2840 | 0.6731 | 0.3774 | 0.4507 | 0.8659 | nan | 0.8091 | 0.9530 | 0.7726 | 0.8470 | 0.5168 | nan | 0.7039 | 0.8604 | 0.0233 | 0.9451 | 0.1814 | 0.0 | 0.4042 | 0.0002 | 0.7136 | 0.0 | 0.0019 | 0.9294 | 0.1412 | 0.5499 | 0.3167 | 0.0 | nan | 0.2017 | 0.5679 | 0.5189 | 0.0 | 0.9553 | 0.8126 | 0.9828 | 0.0065 | 0.2318 | 0.4767 | 0.0 | nan | 0.7279 | 0.8725 | 0.6738 | 0.6924 | 0.3847 | nan | 0.5622 | 0.6163 | 0.0140 | 0.8548 | 0.1535 | 0.0 | 0.2975 | 0.0002 | 0.5667 | 0.0 | 0.0014 | 0.7453 | 0.1297 | 0.4810 | 0.2752 | 0.0 | nan | 0.1615 | 0.4737 | 0.3750 | 0.0 | 0.8758 | 0.7340 | 0.9463 | 0.0055 | 0.0817 | 0.3751 | 0.0 | | 0.1034 | 14.3 | 2860 | 0.6550 | 0.3748 | 0.4388 | 0.8693 | nan | 0.8425 | 0.9480 | 0.7923 | 0.8447 | 0.5247 | nan | 0.6776 | 0.7910 | 0.0 | 0.9425 | 0.1608 | 0.0 | 0.0004 | 0.0 | 0.6871 | 0.0 | 0.0 | 0.9286 | 0.1559 | 0.4997 | 0.3733 | 0.0035 | nan | 0.2962 | 0.5880 | 0.5191 | 0.0 | 0.9588 | 0.8267 | 0.9847 | 0.0279 | 0.1601 | 0.5066 | 0.0 | nan | 0.7517 | 0.8791 | 0.6509 | 0.7343 | 0.3776 | nan | 0.5648 | 0.6827 | 0.0 | 0.8562 | 0.1422 | 0.0 | 0.0004 | 0.0 | 0.5781 | 0.0 | 0.0 | 0.7368 | 0.1417 | 0.4471 | 0.3179 | 0.0035 | nan | 0.2035 | 0.4832 | 0.3897 | 0.0 | 0.8790 | 0.7543 | 0.9447 | 0.0138 | 0.0668 | 0.3940 | 0.0 | | 0.1665 | 14.4 | 2880 | 0.6118 | 0.3831 | 0.4646 | 0.8676 | nan | 0.7911 | 0.9537 | 0.7726 | 0.8726 | 0.4892 | nan | 0.7120 | 0.8720 | 0.0038 | 0.9533 | 0.1108 | 0.0 | 0.1787 | 0.0 | 0.6998 | 0.0 | 0.0 | 0.9148 | 0.3593 | 0.6052 | 0.4110 | 0.0 | nan | 0.3748 | 0.6291 | 0.5697 | 0.0 | 0.9530 | 0.8561 | 0.9818 | 0.0107 | 0.3523 | 0.4381 | 0.0 | nan | 0.7136 | 0.8780 | 0.6737 | 0.6563 | 0.3897 | nan | 0.5721 | 0.6119 | 0.0021 | 0.8477 | 0.0971 | 0.0 | 0.1439 | 0.0 | 0.5845 | 0.0 | 0.0 | 0.7619 | 0.2726 | 0.5318 | 0.3321 | 0.0 | nan | 
0.2603 | 0.4942 | 0.3754 | 0.0 | 0.8866 | 0.7676 | 0.9465 | 0.0086 | 0.0947 | 0.3578 | 0.0 | | 0.1398 | 14.5 | 2900 | 0.6139 | 0.3737 | 0.4408 | 0.8712 | nan | 0.8352 | 0.9550 | 0.7588 | 0.8556 | 0.5251 | nan | 0.6427 | 0.8668 | 0.0036 | 0.9531 | 0.1265 | 0.0 | 0.0435 | 0.0 | 0.6477 | 0.0 | 0.0 | 0.9243 | 0.2127 | 0.6372 | 0.3441 | 0.0 | nan | 0.1601 | 0.5896 | 0.5623 | 0.0 | 0.9568 | 0.8104 | 0.9820 | 0.0034 | 0.2632 | 0.4466 | 0.0 | nan | 0.7429 | 0.8702 | 0.6934 | 0.7224 | 0.3869 | nan | 0.5542 | 0.6216 | 0.0023 | 0.8513 | 0.1085 | 0.0 | 0.0360 | 0.0 | 0.5713 | 0.0 | 0.0 | 0.7667 | 0.1898 | 0.5516 | 0.2950 | 0.0 | nan | 0.1310 | 0.4800 | 0.3634 | 0.0 | 0.8795 | 0.7426 | 0.9460 | 0.0033 | 0.0936 | 0.3543 | 0.0 | | 0.1276 | 14.6 | 2920 | 0.6433 | 0.3719 | 0.4329 | 0.8700 | nan | 0.8228 | 0.9536 | 0.7414 | 0.8581 | 0.5513 | nan | 0.6904 | 0.8408 | 0.0 | 0.9392 | 0.1324 | 0.0 | 0.0 | 0.0033 | 0.6332 | 0.0 | 0.0 | 0.9406 | 0.1394 | 0.6059 | 0.3639 | 0.0 | nan | 0.1756 | 0.5418 | 0.5289 | 0.0 | 0.9547 | 0.8334 | 0.9822 | 0.0075 | 0.2013 | 0.4118 | 0.0 | nan | 0.7384 | 0.8704 | 0.6956 | 0.7200 | 0.3889 | nan | 0.5758 | 0.6502 | 0.0 | 0.8530 | 0.1164 | 0.0 | 0.0 | 0.0033 | 0.5765 | 0.0 | 0.0 | 0.7459 | 0.1295 | 0.5186 | 0.3150 | 0.0 | nan | 0.1326 | 0.4657 | 0.3834 | 0.0 | 0.8835 | 0.7562 | 0.9467 | 0.0067 | 0.0838 | 0.3460 | 0.0 | | 0.1534 | 14.7 | 2940 | 0.6143 | 0.3755 | 0.4412 | 0.8690 | nan | 0.8303 | 0.9475 | 0.7472 | 0.8284 | 0.5513 | nan | 0.6966 | 0.8160 | 0.0036 | 0.9502 | 0.1188 | 0.0 | 0.0228 | 0.0112 | 0.6890 | 0.0 | 0.0 | 0.9327 | 0.1902 | 0.6148 | 0.3769 | 0.0 | nan | 0.2047 | 0.6157 | 0.5613 | 0.0 | 0.9489 | 0.8397 | 0.9842 | 0.0051 | 0.1582 | 0.4742 | 0.0 | nan | 0.7388 | 0.8680 | 0.7020 | 0.7018 | 0.3790 | nan | 0.5612 | 0.6487 | 0.0019 | 0.8491 | 0.1049 | 0.0 | 0.0214 | 0.0111 | 0.5808 | 0.0 | 0.0 | 0.7596 | 0.1761 | 0.5272 | 0.3231 | 0.0 | nan | 0.1657 | 0.4738 | 0.3796 | 0.0 | 0.8843 | 0.7596 | 0.9453 | 0.0042 | 0.0823 | 0.3670 | 0.0 | | 0.7322 | 
14.8 | 2960 | 0.5924 | 0.3867 | 0.4634 | 0.8740 | nan | 0.8622 | 0.9550 | 0.7538 | 0.8256 | 0.4927 | nan | 0.6743 | 0.8511 | 0.0127 | 0.9476 | 0.1325 | 0.0 | 0.2133 | 0.0395 | 0.7274 | 0.0 | 0.0 | 0.9184 | 0.3518 | 0.6067 | 0.4630 | 0.0 | nan | 0.2801 | 0.5880 | 0.5689 | 0.0 | 0.9469 | 0.8717 | 0.9805 | 0.0367 | 0.2787 | 0.4502 | 0.0 | nan | 0.7656 | 0.8766 | 0.6694 | 0.7074 | 0.3811 | nan | 0.5560 | 0.6148 | 0.0075 | 0.8546 | 0.1119 | 0.0 | 0.1506 | 0.0395 | 0.5836 | 0.0 | 0.0 | 0.7703 | 0.2928 | 0.5369 | 0.3488 | 0.0 | nan | 0.2015 | 0.4753 | 0.3556 | 0.0 | 0.8884 | 0.7731 | 0.9472 | 0.0223 | 0.0949 | 0.3494 | 0.0 | | 0.1363 | 14.9 | 2980 | 0.5998 | 0.3843 | 0.4588 | 0.8758 | nan | 0.8613 | 0.9508 | 0.7900 | 0.8233 | 0.5551 | nan | 0.6846 | 0.7923 | 0.1220 | 0.9444 | 0.1305 | 0.0 | 0.2587 | 0.0345 | 0.7305 | 0.0 | 0.0 | 0.9263 | 0.2417 | 0.6455 | 0.3663 | 0.0 | nan | 0.1970 | 0.5860 | 0.5722 | 0.0 | 0.9550 | 0.8504 | 0.9841 | 0.0068 | 0.1978 | 0.4733 | 0.0 | nan | 0.7683 | 0.8822 | 0.6306 | 0.7341 | 0.3911 | nan | 0.5612 | 0.6044 | 0.0279 | 0.8514 | 0.1080 | 0.0 | 0.1795 | 0.0345 | 0.5812 | 0.0 | 0.0 | 0.7676 | 0.2177 | 0.5378 | 0.3020 | 0.0 | nan | 0.1535 | 0.4746 | 0.3613 | 0.0 | 0.8867 | 0.7781 | 0.9441 | 0.0047 | 0.1336 | 0.3809 | 0.0 | | 0.1537 | 15.0 | 3000 | 0.6302 | 0.3781 | 0.4487 | 0.8742 | nan | 0.8608 | 0.9579 | 0.7474 | 0.8513 | 0.5297 | nan | 0.6613 | 0.8043 | 0.0522 | 0.9385 | 0.1654 | 0.0 | 0.2167 | 0.0953 | 0.7527 | 0.0 | 0.0 | 0.9337 | 0.1895 | 0.5803 | 0.3136 | 0.0 | nan | 0.1148 | 0.6138 | 0.5339 | 0.0 | 0.9526 | 0.8402 | 0.9592 | 0.0053 | 0.2302 | 0.4568 | 0.0 | nan | 0.7672 | 0.8787 | 0.6736 | 0.7405 | 0.3974 | nan | 0.5523 | 0.5813 | 0.0141 | 0.8544 | 0.1415 | 0.0 | 0.1406 | 0.0953 | 0.5765 | 0.0 | 0.0 | 0.7532 | 0.1724 | 0.5052 | 0.2751 | 0.0 | nan | 0.0875 | 0.4545 | 0.3693 | 0.0 | 0.8846 | 0.7687 | 0.9310 | 0.0042 | 0.1148 | 0.3662 | 0.0 | | 0.1485 | 15.1 | 3020 | 0.6229 | 0.3789 | 0.4409 | 0.8729 | nan | 0.8526 | 0.9580 | 0.7556 | 
0.8393 | 0.4950 | nan | 0.6441 | 0.7653 | 0.0076 | 0.9464 | 0.0919 | 0.0 | 0.0298 | 0.0986 | 0.7278 | 0.0 | 0.0 | 0.9373 | 0.2535 | 0.5704 | 0.3935 | 0.0 | nan | 0.1312 | 0.5455 | 0.5439 | 0.0 | 0.9529 | 0.8331 | 0.9853 | 0.0344 | 0.2334 | 0.4811 | 0.0 | nan | 0.7615 | 0.8757 | 0.6904 | 0.7305 | 0.3855 | nan | 0.5535 | 0.6358 | 0.0036 | 0.8538 | 0.0840 | 0.0 | 0.0231 | 0.0986 | 0.5857 | 0.0 | 0.0 | 0.7518 | 0.2238 | 0.5053 | 0.3084 | 0.0 | nan | 0.1041 | 0.4677 | 0.3890 | 0.0 | 0.8814 | 0.7567 | 0.9427 | 0.0189 | 0.1170 | 0.3772 | 0.0 | | 0.1718 | 15.2 | 3040 | 0.6701 | 0.3798 | 0.4523 | 0.8674 | nan | 0.7972 | 0.9520 | 0.8221 | 0.8217 | 0.5098 | nan | 0.7271 | 0.8380 | 0.0024 | 0.9484 | 0.0892 | 0.0 | 0.2547 | 0.0806 | 0.7163 | 0.0 | 0.0 | 0.9314 | 0.2443 | 0.5898 | 0.4167 | 0.0 | nan | 0.1191 | 0.6087 | 0.5311 | 0.0 | 0.9565 | 0.8495 | 0.9722 | 0.0061 | 0.2044 | 0.4828 | 0.0 | nan | 0.7285 | 0.8674 | 0.6197 | 0.6923 | 0.3759 | nan | 0.5581 | 0.6158 | 0.0015 | 0.8562 | 0.0833 | 0.0 | 0.1950 | 0.0806 | 0.5738 | 0.0 | 0.0 | 0.7618 | 0.2240 | 0.5185 | 0.3276 | 0.0 | nan | 0.1040 | 0.4752 | 0.3956 | 0.0 | 0.8844 | 0.7688 | 0.9478 | 0.0047 | 0.1121 | 0.3815 | 0.0 | | 0.2132 | 15.3 | 3060 | 0.6335 | 0.3901 | 0.4622 | 0.8752 | nan | 0.8468 | 0.9570 | 0.7840 | 0.8462 | 0.4725 | nan | 0.6814 | 0.8495 | 0.0 | 0.9521 | 0.0949 | 0.0 | 0.4795 | 0.0581 | 0.7033 | 0.0 | 0.0 | 0.9150 | 0.2418 | 0.6258 | 0.4263 | 0.0077 | nan | 0.1338 | 0.5820 | 0.5426 | 0.0 | 0.9475 | 0.8725 | 0.9865 | 0.0032 | 0.2268 | 0.5548 | 0.0 | nan | 0.7638 | 0.8767 | 0.6849 | 0.7075 | 0.3723 | nan | 0.5592 | 0.6402 | 0.0 | 0.8557 | 0.0874 | 0.0 | 0.3410 | 0.0580 | 0.5791 | 0.0 | 0.0 | 0.7701 | 0.2224 | 0.5375 | 0.3389 | 0.0077 | nan | 0.1213 | 0.4724 | 0.3778 | 0.0 | 0.8877 | 0.7703 | 0.9448 | 0.0028 | 0.1137 | 0.3887 | 0.0 | | 0.1008 | 15.4 | 3080 | 0.6443 | 0.3736 | 0.4415 | 0.8748 | nan | 0.8665 | 0.9538 | 0.7823 | 0.8147 | 0.5270 | nan | 0.6705 | 0.8323 | 0.0113 | 0.9477 | 0.1022 | 0.0 | 0.0254 | 
0.0193 | 0.7429 | 0.0 | 0.0 | 0.9372 | 0.1383 | 0.5936 | 0.3546 | 0.0074 | nan | 0.1973 | 0.5904 | 0.5592 | 0.0 | 0.9524 | 0.8656 | 0.9749 | 0.0057 | 0.2072 | 0.4480 | 0.0 | nan | 0.7751 | 0.8771 | 0.7096 | 0.7160 | 0.3638 | nan | 0.5610 | 0.6100 | 0.0053 | 0.8599 | 0.0934 | 0.0 | 0.0221 | 0.0193 | 0.5593 | 0.0 | 0.0 | 0.7542 | 0.1324 | 0.5138 | 0.3004 | 0.0074 | nan | 0.1418 | 0.4798 | 0.3722 | 0.0 | 0.8868 | 0.7715 | 0.9507 | 0.0051 | 0.0993 | 0.3687 | 0.0 | | 0.0901 | 15.5 | 3100 | 0.6716 | 0.3743 | 0.4341 | 0.8706 | nan | 0.8558 | 0.9587 | 0.6992 | 0.7659 | 0.5143 | nan | 0.6714 | 0.8210 | 0.0 | 0.9460 | 0.1066 | 0.0 | 0.0 | 0.0136 | 0.6662 | 0.0 | 0.0001 | 0.9354 | 0.2046 | 0.5539 | 0.3793 | 0.0 | nan | 0.2436 | 0.5818 | 0.5368 | 0.0 | 0.9485 | 0.8743 | 0.9826 | 0.0010 | 0.1007 | 0.5287 | 0.0 | nan | 0.7455 | 0.8663 | 0.6775 | 0.6980 | 0.3842 | nan | 0.5610 | 0.6267 | 0.0 | 0.8585 | 0.0963 | 0.0 | 0.0 | 0.0136 | 0.5838 | 0.0 | 0.0001 | 0.7477 | 0.1866 | 0.4929 | 0.3108 | 0.0 | nan | 0.1832 | 0.4803 | 0.3828 | 0.0 | 0.8878 | 0.7795 | 0.9468 | 0.0009 | 0.0737 | 0.3941 | 0.0 | | 0.1319 | 15.6 | 3120 | 0.6274 | 0.3881 | 0.4643 | 0.8720 | nan | 0.8344 | 0.9556 | 0.7932 | 0.8385 | 0.4978 | nan | 0.7199 | 0.8512 | 0.0053 | 0.9580 | 0.0994 | 0.0 | 0.3726 | 0.0014 | 0.6745 | 0.0 | 0.0032 | 0.9289 | 0.2962 | 0.6204 | 0.3558 | 0.0025 | nan | 0.4086 | 0.5790 | 0.5311 | 0.0 | 0.9423 | 0.8488 | 0.9777 | 0.0062 | 0.2870 | 0.4688 | 0.0 | nan | 0.7538 | 0.8741 | 0.6520 | 0.7409 | 0.3659 | nan | 0.5670 | 0.6344 | 0.0029 | 0.8528 | 0.0888 | 0.0 | 0.2982 | 0.0014 | 0.5932 | 0.0 | 0.0024 | 0.7569 | 0.2374 | 0.5227 | 0.3058 | 0.0025 | nan | 0.2161 | 0.4790 | 0.3802 | 0.0 | 0.8840 | 0.7661 | 0.9486 | 0.0043 | 0.1112 | 0.3771 | 0.0 | | 0.1196 | 15.7 | 3140 | 0.6027 | 0.3772 | 0.4491 | 0.8713 | nan | 0.8343 | 0.9517 | 0.7782 | 0.8481 | 0.4852 | nan | 0.6646 | 0.8781 | 0.0082 | 0.9449 | 0.1244 | 0.0 | 0.1531 | 0.0005 | 0.6859 | 0.0 | 0.0001 | 0.9241 | 0.1114 | 0.6287 | 0.4011 | 0.0004 
| nan | 0.2774 | 0.5891 | 0.5923 | 0.0 | 0.9515 | 0.8736 | 0.9861 | 0.0088 | 0.1966 | 0.4716 | 0.0 | nan | 0.7500 | 0.8720 | 0.6315 | 0.7355 | 0.3701 | nan | 0.5496 | 0.6068 | 0.0071 | 0.8624 | 0.1100 | 0.0 | 0.1400 | 0.0005 | 0.6012 | 0.0 | 0.0001 | 0.7640 | 0.1051 | 0.5415 | 0.3113 | 0.0004 | nan | 0.1998 | 0.4782 | 0.3660 | 0.0 | 0.8829 | 0.7651 | 0.9441 | 0.0053 | 0.0993 | 0.3703 | 0.0 | | 0.2085 | 15.8 | 3160 | 0.6267 | 0.3826 | 0.4602 | 0.8690 | nan | 0.8379 | 0.9526 | 0.7944 | 0.8073 | 0.5268 | nan | 0.6582 | 0.8739 | 0.0182 | 0.9506 | 0.1455 | 0.0 | 0.4964 | 0.0267 | 0.7657 | 0.0 | 0.0 | 0.9212 | 0.0797 | 0.6069 | 0.3732 | 0.0039 | nan | 0.2219 | 0.5556 | 0.5704 | 0.0 | 0.9611 | 0.8149 | 0.9818 | 0.0101 | 0.3218 | 0.4499 | 0.0 | nan | 0.7522 | 0.8662 | 0.6466 | 0.7271 | 0.3761 | nan | 0.5491 | 0.5897 | 0.0104 | 0.8602 | 0.1250 | 0.0 | 0.3505 | 0.0267 | 0.6110 | 0.0 | 0.0 | 0.7596 | 0.0743 | 0.5171 | 0.3070 | 0.0039 | nan | 0.1908 | 0.4781 | 0.3434 | 0.0 | 0.8789 | 0.7513 | 0.9492 | 0.0062 | 0.1302 | 0.3627 | 0.0 | | 0.1353 | 15.9 | 3180 | 0.6430 | 0.3835 | 0.4563 | 0.8631 | nan | 0.7710 | 0.9483 | 0.7555 | 0.8692 | 0.4833 | nan | 0.7177 | 0.8501 | 0.0091 | 0.9469 | 0.1603 | 0.0 | 0.3588 | 0.0500 | 0.7433 | 0.0 | 0.0 | 0.9276 | 0.0763 | 0.5958 | 0.3994 | 0.0 | nan | 0.2782 | 0.5942 | 0.5438 | 0.0 | 0.9520 | 0.8643 | 0.9798 | 0.0108 | 0.2056 | 0.5113 | 0.0 | nan | 0.6972 | 0.8681 | 0.6930 | 0.6404 | 0.3562 | nan | 0.5509 | 0.6160 | 0.0048 | 0.8632 | 0.1420 | 0.0 | 0.2866 | 0.0500 | 0.6075 | 0.0 | 0.0 | 0.7604 | 0.0732 | 0.5174 | 0.3142 | 0.0 | nan | 0.2496 | 0.4831 | 0.3790 | 0.0 | 0.8853 | 0.7660 | 0.9496 | 0.0087 | 0.1189 | 0.3913 | 0.0 | | 0.1617 | 16.0 | 3200 | 0.6151 | 0.3882 | 0.4699 | 0.8694 | nan | 0.8348 | 0.9375 | 0.8239 | 0.8528 | 0.6061 | nan | 0.6822 | 0.8461 | 0.0035 | 0.9569 | 0.1134 | 0.0 | 0.2992 | 0.0593 | 0.7013 | 0.0 | 0.0 | 0.9299 | 0.2929 | 0.6016 | 0.3728 | 0.0 | nan | 0.3863 | 0.5991 | 0.5887 | 0.0 | 0.9438 | 0.8419 | 0.9836 | 0.0086 | 
0.3252 | 0.4467 | 0.0 | nan | 0.7545 | 0.8715 | 0.7219 | 0.7414 | 0.3574 | nan | 0.5604 | 0.5901 | 0.0020 | 0.8559 | 0.1012 | 0.0 | 0.2251 | 0.0593 | 0.5656 | 0.0 | 0.0 | 0.7552 | 0.2467 | 0.5171 | 0.3168 | 0.0 | nan | 0.2600 | 0.4849 | 0.3788 | 0.0 | 0.8828 | 0.7523 | 0.9466 | 0.0066 | 0.1108 | 0.3571 | 0.0 | | 0.1087 | 16.1 | 3220 | 0.6156 | 0.3946 | 0.4679 | 0.8739 | nan | 0.8424 | 0.9612 | 0.7933 | 0.8578 | 0.4260 | nan | 0.7008 | 0.8356 | 0.0025 | 0.9456 | 0.1843 | 0.0 | 0.4258 | 0.1187 | 0.7147 | 0.0 | 0.0011 | 0.9300 | 0.2992 | 0.5966 | 0.3569 | 0.0 | nan | 0.4344 | 0.5571 | 0.5288 | 0.0 | 0.9551 | 0.8390 | 0.9786 | 0.0000 | 0.2392 | 0.4465 | 0.0 | nan | 0.7647 | 0.8734 | 0.6996 | 0.7312 | 0.3438 | nan | 0.5670 | 0.6188 | 0.0015 | 0.8613 | 0.1567 | 0.0 | 0.3108 | 0.1187 | 0.6023 | 0.0 | 0.0008 | 0.7553 | 0.2500 | 0.5166 | 0.2998 | 0.0 | nan | 0.2646 | 0.4777 | 0.3785 | 0.0 | 0.8838 | 0.7618 | 0.9511 | 0.0000 | 0.0869 | 0.3505 | 0.0 | | 0.1534 | 16.2 | 3240 | 0.6361 | 0.3907 | 0.4569 | 0.8695 | nan | 0.8248 | 0.9618 | 0.7594 | 0.8464 | 0.4330 | nan | 0.6412 | 0.7748 | 0.0027 | 0.9560 | 0.1846 | 0.0 | 0.2197 | 0.0492 | 0.6962 | 0.0 | 0.0 | 0.9159 | 0.2669 | 0.5864 | 0.3982 | 0.0130 | nan | 0.4334 | 0.5939 | 0.5933 | 0.0 | 0.9521 | 0.8627 | 0.9699 | 0.0202 | 0.1637 | 0.5010 | 0.0 | nan | 0.7452 | 0.8576 | 0.7096 | 0.7390 | 0.3564 | nan | 0.5582 | 0.6255 | 0.0012 | 0.8545 | 0.1652 | 0.0 | 0.2012 | 0.0492 | 0.5941 | 0.0 | 0.0 | 0.7608 | 0.2184 | 0.5128 | 0.3134 | 0.0129 | nan | 0.3033 | 0.4586 | 0.3854 | 0.0 | 0.8842 | 0.7705 | 0.9393 | 0.0105 | 0.0960 | 0.3782 | 0.0 | | 0.1424 | 16.3 | 3260 | 0.6442 | 0.3731 | 0.4348 | 0.8686 | nan | 0.8203 | 0.9599 | 0.7388 | 0.8626 | 0.4214 | nan | 0.6717 | 0.8149 | 0.0025 | 0.9475 | 0.1189 | 0.0 | 0.0003 | 0.0073 | 0.6840 | 0.0 | 0.0 | 0.9272 | 0.1288 | 0.6072 | 0.3303 | 0.0014 | nan | 0.3258 | 0.5757 | 0.5233 | 0.0 | 0.9592 | 0.8548 | 0.9727 | 0.0326 | 0.2164 | 0.4083 | 0.0 | nan | 0.7306 | 0.8685 | 0.7158 | 0.7007 | 0.3514 
| nan | 0.5668 | 0.6272 | 0.0014 | 0.8631 | 0.1051 | 0.0 | 0.0003 | 0.0073 | 0.5916 | 0.0 | 0.0 | 0.7522 | 0.1159 | 0.5177 | 0.2870 | 0.0014 | nan | 0.2639 | 0.4648 | 0.3685 | 0.0 | 0.8824 | 0.7719 | 0.9418 | 0.0204 | 0.0798 | 0.3408 | 0.0 | | 0.0966 | 16.4 | 3280 | 0.6606 | 0.3902 | 0.4710 | 0.8664 | nan | 0.8024 | 0.9532 | 0.8037 | 0.8563 | 0.4764 | nan | 0.6834 | 0.8516 | 0.0098 | 0.9516 | 0.1549 | 0.0 | 0.7565 | 0.1112 | 0.6759 | 0.0 | 0.0 | 0.9240 | 0.1866 | 0.5807 | 0.3483 | 0.0 | nan | 0.2897 | 0.5930 | 0.5241 | 0.0 | 0.9495 | 0.8388 | 0.9864 | 0.0284 | 0.2430 | 0.4930 | 0.0 | nan | 0.7277 | 0.8624 | 0.6978 | 0.7045 | 0.3633 | nan | 0.5624 | 0.6023 | 0.0055 | 0.8623 | 0.1326 | 0.0 | 0.3993 | 0.1112 | 0.6022 | 0.0 | 0.0 | 0.7582 | 0.1648 | 0.5129 | 0.3044 | 0.0 | nan | 0.2335 | 0.4796 | 0.3474 | 0.0 | 0.8846 | 0.7624 | 0.9436 | 0.0170 | 0.0817 | 0.3635 | 0.0 | | 0.1438 | 16.5 | 3300 | 0.6547 | 0.3949 | 0.4663 | 0.8686 | nan | 0.8167 | 0.9584 | 0.7571 | 0.8461 | 0.4767 | nan | 0.6833 | 0.7965 | 0.0007 | 0.9491 | 0.1242 | 0.0 | 0.6012 | 0.1260 | 0.7100 | 0.0 | 0.0 | 0.9347 | 0.2388 | 0.5549 | 0.4005 | 0.0 | nan | 0.2431 | 0.5893 | 0.5809 | 0.0 | 0.9484 | 0.8448 | 0.9847 | 0.0081 | 0.2865 | 0.4624 | 0.0 | nan | 0.7351 | 0.8619 | 0.7252 | 0.7187 | 0.3620 | nan | 0.5654 | 0.6422 | 0.0003 | 0.8603 | 0.1085 | 0.0 | 0.4644 | 0.1260 | 0.6126 | 0.0 | 0.0 | 0.7527 | 0.1927 | 0.4990 | 0.3261 | 0.0 | nan | 0.1837 | 0.4828 | 0.3534 | 0.0 | 0.8866 | 0.7707 | 0.9463 | 0.0061 | 0.0907 | 0.3638 | 0.0 | | 0.1047 | 16.6 | 3320 | 0.6620 | 0.3902 | 0.4587 | 0.8690 | nan | 0.8165 | 0.9626 | 0.7851 | 0.8390 | 0.4763 | nan | 0.6498 | 0.7912 | 0.0249 | 0.9441 | 0.1145 | 0.0 | 0.4841 | 0.1315 | 0.7786 | 0.0 | 0.0 | 0.9267 | 0.1411 | 0.6071 | 0.3302 | 0.0021 | nan | 0.2145 | 0.5729 | 0.5669 | 0.0 | 0.9516 | 0.8311 | 0.9806 | 0.0048 | 0.2571 | 0.4927 | 0.0 | nan | 0.7392 | 0.8618 | 0.7085 | 0.7183 | 0.3847 | nan | 0.5524 | 0.6180 | 0.0084 | 0.8613 | 0.1008 | 0.0 | 0.4199 | 0.1315 | 
0.6120 | 0.0 | 0.0 | 0.7561 | 0.1279 | 0.5092 | 0.2864 | 0.0021 | nan | 0.1764 | 0.4765 | 0.3684 | 0.0 | 0.8822 | 0.7540 | 0.9490 | 0.0033 | 0.0984 | 0.3790 | 0.0 | | 0.1603 | 16.7 | 3340 | 0.6201 | 0.3909 | 0.4604 | 0.8724 | nan | 0.8357 | 0.9505 | 0.7955 | 0.8510 | 0.5418 | nan | 0.6921 | 0.7958 | 0.0189 | 0.9505 | 0.1030 | 0.0 | 0.4512 | 0.1370 | 0.7554 | 0.0 | 0.0 | 0.9323 | 0.1324 | 0.5851 | 0.2937 | 0.0 | nan | 0.2254 | 0.5794 | 0.5298 | 0.0 | 0.9565 | 0.8437 | 0.9786 | 0.0100 | 0.2826 | 0.5059 | 0.0 | nan | 0.7510 | 0.8757 | 0.7049 | 0.7267 | 0.4078 | nan | 0.5574 | 0.6164 | 0.0068 | 0.8607 | 0.0917 | 0.0 | 0.4070 | 0.1369 | 0.5831 | 0.0 | 0.0 | 0.7535 | 0.1208 | 0.5183 | 0.2661 | 0.0 | nan | 0.1880 | 0.4874 | 0.3682 | 0.0 | 0.8805 | 0.7568 | 0.9503 | 0.0065 | 0.1034 | 0.3822 | 0.0 | | 0.1489 | 16.8 | 3360 | 0.6502 | 0.3922 | 0.4645 | 0.8689 | nan | 0.8335 | 0.9446 | 0.7831 | 0.8308 | 0.5645 | nan | 0.6905 | 0.7748 | 0.0126 | 0.9532 | 0.1575 | 0.0 | 0.5452 | 0.0976 | 0.7534 | 0.0 | 0.0 | 0.9381 | 0.1427 | 0.5909 | 0.4154 | 0.0 | nan | 0.2283 | 0.5970 | 0.5611 | 0.0 | 0.9464 | 0.8215 | 0.9784 | 0.0080 | 0.1876 | 0.5064 | 0.0 | nan | 0.7445 | 0.8666 | 0.7103 | 0.7164 | 0.3841 | nan | 0.5533 | 0.6011 | 0.0027 | 0.8621 | 0.1280 | 0.0 | 0.4344 | 0.0975 | 0.6057 | 0.0 | 0.0 | 0.7533 | 0.1269 | 0.5119 | 0.3197 | 0.0 | nan | 0.1745 | 0.4893 | 0.3693 | 0.0 | 0.8832 | 0.7496 | 0.9511 | 0.0055 | 0.1152 | 0.3944 | 0.0 | | 0.1685 | 16.9 | 3380 | 0.6708 | 0.3851 | 0.4575 | 0.8690 | nan | 0.8334 | 0.9582 | 0.7801 | 0.8063 | 0.5133 | nan | 0.6511 | 0.8619 | 0.0251 | 0.9376 | 0.1521 | 0.0 | 0.6353 | 0.0269 | 0.7188 | 0.0 | 0.0 | 0.9257 | 0.1267 | 0.5672 | 0.3617 | 0.0372 | nan | 0.2287 | 0.5864 | 0.5349 | 0.0 | 0.9582 | 0.8384 | 0.9849 | 0.0080 | 0.1262 | 0.4551 | 0.0 | nan | 0.7410 | 0.8653 | 0.6794 | 0.7120 | 0.3925 | nan | 0.5553 | 0.5044 | 0.0111 | 0.8650 | 0.1305 | 0.0 | 0.4608 | 0.0269 | 0.6250 | 0.0 | 0.0 | 0.7559 | 0.1139 | 0.5049 | 0.3013 | 0.0369 | nan | 0.1731 | 
0.4805 | 0.3695 | 0.0 | 0.8804 | 0.7613 | 0.9454 | 0.0058 | 0.0652 | 0.3586 | 0.0 | | 0.1464 | 17.0 | 3400 | 0.6396 | 0.3889 | 0.4638 | 0.8711 | nan | 0.8434 | 0.9486 | 0.8471 | 0.8117 | 0.5337 | nan | 0.6774 | 0.8109 | 0.0009 | 0.9510 | 0.1685 | 0.0 | 0.6241 | 0.0945 | 0.7062 | 0.0 | 0.0 | 0.9337 | 0.0732 | 0.5948 | 0.3886 | 0.0499 | nan | 0.2157 | 0.5693 | 0.5502 | 0.0 | 0.9559 | 0.8320 | 0.9821 | 0.0024 | 0.2056 | 0.4701 | 0.0 | nan | 0.7574 | 0.8809 | 0.6622 | 0.7190 | 0.3777 | nan | 0.5570 | 0.6090 | 0.0005 | 0.8596 | 0.1422 | 0.0 | 0.4428 | 0.0944 | 0.5854 | 0.0 | 0.0 | 0.7552 | 0.0679 | 0.5222 | 0.3214 | 0.0494 | nan | 0.1783 | 0.4779 | 0.3635 | 0.0 | 0.8814 | 0.7449 | 0.9474 | 0.0013 | 0.0831 | 0.3613 | 0.0 | | 0.1365 | 17.1 | 3420 | 0.6572 | 0.3892 | 0.4633 | 0.8705 | nan | 0.8382 | 0.9572 | 0.8112 | 0.7848 | 0.5338 | nan | 0.6472 | 0.8088 | 0.0071 | 0.9497 | 0.1396 | 0.0 | 0.5627 | 0.0748 | 0.7273 | 0.0 | 0.0 | 0.9303 | 0.1357 | 0.6070 | 0.4103 | 0.0277 | nan | 0.2033 | 0.5932 | 0.5764 | 0.0 | 0.9469 | 0.8637 | 0.9832 | 0.0022 | 0.2535 | 0.4493 | 0.0 | nan | 0.7516 | 0.8673 | 0.6850 | 0.7140 | 0.3789 | nan | 0.5561 | 0.6163 | 0.0036 | 0.8644 | 0.1202 | 0.0 | 0.4187 | 0.0748 | 0.5798 | 0.0 | 0.0 | 0.7616 | 0.1225 | 0.5294 | 0.3379 | 0.0275 | nan | 0.1374 | 0.4880 | 0.3775 | 0.0 | 0.8883 | 0.7683 | 0.9473 | 0.0013 | 0.0840 | 0.3516 | 0.0 | | 0.1378 | 17.2 | 3440 | 0.6191 | 0.3847 | 0.4545 | 0.8740 | nan | 0.8495 | 0.9487 | 0.7936 | 0.8689 | 0.4940 | nan | 0.6837 | 0.8027 | 0.0011 | 0.9480 | 0.1273 | 0.0 | 0.3769 | 0.0300 | 0.7042 | 0.0 | 0.0 | 0.9262 | 0.1578 | 0.6273 | 0.3385 | 0.0095 | nan | 0.2106 | 0.5912 | 0.5896 | 0.0 | 0.9543 | 0.8506 | 0.9832 | 0.0015 | 0.1770 | 0.4974 | 0.0 | nan | 0.7596 | 0.8815 | 0.6459 | 0.7035 | 0.3778 | nan | 0.5592 | 0.6375 | 0.0007 | 0.8580 | 0.1145 | 0.0 | 0.3289 | 0.0300 | 0.5752 | 0.0 | 0.0 | 0.7606 | 0.1426 | 0.5268 | 0.2930 | 0.0094 | nan | 0.1559 | 0.4862 | 0.3845 | 0.0 | 0.8857 | 0.7672 | 0.9472 | 0.0014 | 0.0916 | 
0.3863 | 0.0 | | 0.1417 | 17.3 | 3460 | 0.6328 | 0.3898 | 0.4660 | 0.8735 | nan | 0.8465 | 0.9501 | 0.7913 | 0.8431 | 0.5206 | nan | 0.6615 | 0.8699 | 0.0089 | 0.9539 | 0.2104 | 0.0 | 0.4512 | 0.0147 | 0.6838 | 0.0 | 0.0 | 0.9255 | 0.2034 | 0.6098 | 0.3673 | 0.0256 | nan | 0.2154 | 0.5991 | 0.6197 | 0.0 | 0.9511 | 0.8755 | 0.9824 | 0.0223 | 0.2381 | 0.4713 | 0.0 | nan | 0.7532 | 0.8780 | 0.6962 | 0.7382 | 0.3778 | nan | 0.5484 | 0.6016 | 0.0075 | 0.8590 | 0.1806 | 0.0 | 0.3458 | 0.0147 | 0.5818 | 0.0 | 0.0 | 0.7599 | 0.1754 | 0.5210 | 0.3029 | 0.0253 | nan | 0.1574 | 0.4859 | 0.3672 | 0.0 | 0.8871 | 0.7619 | 0.9488 | 0.0140 | 0.1032 | 0.3791 | 0.0 | | 0.1277 | 17.4 | 3480 | 0.6331 | 0.3872 | 0.4582 | 0.8722 | nan | 0.8338 | 0.9512 | 0.7777 | 0.8407 | 0.5404 | nan | 0.7127 | 0.8462 | 0.0131 | 0.9527 | 0.1605 | 0.0 | 0.3180 | 0.0268 | 0.6960 | 0.0 | 0.0 | 0.9278 | 0.2041 | 0.6040 | 0.3440 | 0.0067 | nan | 0.2529 | 0.5973 | 0.5703 | 0.0 | 0.9532 | 0.8474 | 0.9828 | 0.0080 | 0.2203 | 0.4740 | 0.0 | nan | 0.7460 | 0.8775 | 0.7051 | 0.7192 | 0.3927 | nan | 0.5591 | 0.6078 | 0.0072 | 0.8635 | 0.1412 | 0.0 | 0.2497 | 0.0268 | 0.5889 | 0.0 | 0.0 | 0.7585 | 0.1791 | 0.5135 | 0.2937 | 0.0066 | nan | 0.2063 | 0.4870 | 0.3853 | 0.0 | 0.8854 | 0.7689 | 0.9492 | 0.0046 | 0.0924 | 0.3759 | 0.0 | | 0.162 | 17.5 | 3500 | 0.6330 | 0.3854 | 0.4526 | 0.8730 | nan | 0.8373 | 0.9588 | 0.7677 | 0.8370 | 0.4894 | nan | 0.6701 | 0.8162 | 0.0102 | 0.9564 | 0.1115 | 0.0 | 0.3171 | 0.0528 | 0.6832 | 0.0 | 0.0 | 0.9220 | 0.2651 | 0.6014 | 0.3459 | 0.0011 | nan | 0.1705 | 0.5879 | 0.5539 | 0.0 | 0.9609 | 0.8585 | 0.9792 | 0.0009 | 0.2789 | 0.4478 | 0.0 | nan | 0.7443 | 0.8716 | 0.7125 | 0.7195 | 0.3858 | nan | 0.5647 | 0.6139 | 0.0049 | 0.8575 | 0.0998 | 0.0 | 0.2619 | 0.0528 | 0.5751 | 0.0 | 0.0 | 0.7591 | 0.2179 | 0.5136 | 0.2962 | 0.0010 | nan | 0.1336 | 0.4891 | 0.3825 | 0.0 | 0.8828 | 0.7760 | 0.9499 | 0.0009 | 0.1019 | 0.3647 | 0.0 | | 0.0967 | 17.6 | 3520 | 0.6176 | 0.3858 | 0.4542 | 
0.8752 | nan | 0.8580 | 0.9520 | 0.7998 | 0.8507 | 0.5326 | nan | 0.6841 | 0.8336 | 0.0020 | 0.9521 | 0.1145 | 0.0 | 0.2032 | 0.0751 | 0.6629 | 0.0 | 0.0 | 0.9342 | 0.2073 | 0.5489 | 0.3947 | 0.0218 | nan | 0.2637 | 0.6003 | 0.5635 | 0.0 | 0.9521 | 0.8432 | 0.9831 | 0.0086 | 0.2128 | 0.4805 | 0.0 | nan | 0.7633 | 0.8815 | 0.6620 | 0.7389 | 0.3922 | nan | 0.5692 | 0.6264 | 0.0013 | 0.8571 | 0.1023 | 0.0 | 0.1693 | 0.0751 | 0.5715 | 0.0 | 0.0 | 0.7558 | 0.1879 | 0.4861 | 0.3087 | 0.0216 | nan | 0.2168 | 0.4879 | 0.3836 | 0.0 | 0.8844 | 0.7608 | 0.9469 | 0.0077 | 0.1060 | 0.3815 | 0.0 | | 0.1003 | 17.7 | 3540 | 0.6420 | 0.3874 | 0.4692 | 0.8713 | nan | 0.8484 | 0.9538 | 0.7812 | 0.8201 | 0.5088 | nan | 0.6856 | 0.8652 | 0.0007 | 0.9537 | 0.1706 | 0.0 | 0.6290 | 0.0624 | 0.7385 | 0.0 | 0.0 | 0.9386 | 0.2066 | 0.5970 | 0.3658 | 0.0267 | nan | 0.2712 | 0.5962 | 0.5393 | 0.0 | 0.9480 | 0.8226 | 0.9837 | 0.0082 | 0.2721 | 0.4197 | 0.0 | nan | 0.7533 | 0.8758 | 0.6771 | 0.6990 | 0.3944 | nan | 0.5631 | 0.5618 | 0.0005 | 0.8596 | 0.1445 | 0.0 | 0.3542 | 0.0624 | 0.5718 | 0.0 | 0.0 | 0.7510 | 0.1778 | 0.4933 | 0.3078 | 0.0262 | nan | 0.2112 | 0.4891 | 0.3718 | 0.0 | 0.8834 | 0.7588 | 0.9469 | 0.0075 | 0.1096 | 0.3450 | 0.0 | | 0.1321 | 17.8 | 3560 | 0.6472 | 0.3854 | 0.4690 | 0.8700 | nan | 0.8190 | 0.9575 | 0.7796 | 0.8176 | 0.5442 | nan | 0.7002 | 0.8532 | 0.0009 | 0.9549 | 0.2109 | 0.0357 | 0.8843 | 0.1077 | 0.7413 | 0.0 | 0.0 | 0.9250 | 0.1177 | 0.6343 | 0.3388 | 0.0348 | nan | 0.1354 | 0.5508 | 0.4907 | 0.0 | 0.9538 | 0.8184 | 0.9834 | 0.0006 | 0.1281 | 0.4906 | 0.0 | nan | 0.7362 | 0.8674 | 0.6764 | 0.7011 | 0.3939 | nan | 0.5691 | 0.5895 | 0.0007 | 0.8628 | 0.1768 | 0.0357 | 0.3133 | 0.1077 | 0.5636 | 0.0 | 0.0 | 0.7612 | 0.1111 | 0.5214 | 0.2874 | 0.0343 | nan | 0.1176 | 0.4788 | 0.3656 | 0.0 | 0.8833 | 0.7533 | 0.9473 | 0.0006 | 0.0951 | 0.3820 | 0.0 | | 0.1012 | 17.9 | 3580 | 0.6327 | 0.3945 | 0.4744 | 0.8750 | nan | 0.8444 | 0.9511 | 0.7980 | 0.8497 | 0.5546 | nan 
| 0.6874 | 0.8387 | 0.0047 | 0.9528 | 0.1516 | 0.0241 | 0.6486 | 0.1604 | 0.7541 | 0.0 | 0.0 | 0.9332 | 0.1317 | 0.6395 | 0.3957 | 0.0456 | nan | 0.1993 | 0.5825 | 0.5292 | 0.0 | 0.9510 | 0.8507 | 0.9821 | 0.0028 | 0.2793 | 0.4377 | 0.0 | nan | 0.7602 | 0.8801 | 0.6560 | 0.7283 | 0.3917 | nan | 0.5599 | 0.6319 | 0.0034 | 0.8636 | 0.1319 | 0.0241 | 0.3515 | 0.1603 | 0.5733 | 0.0 | 0.0 | 0.7636 | 0.1224 | 0.5281 | 0.3166 | 0.0446 | nan | 0.1600 | 0.4877 | 0.3838 | 0.0 | 0.8873 | 0.7671 | 0.9494 | 0.0024 | 0.1333 | 0.3614 | 0.0 | | 0.1115 | 18.0 | 3600 | 0.6471 | 0.3958 | 0.4775 | 0.8741 | nan | 0.8373 | 0.9557 | 0.8146 | 0.8568 | 0.5123 | nan | 0.6811 | 0.8384 | 0.0147 | 0.9499 | 0.1619 | 0.0286 | 0.6451 | 0.2463 | 0.7429 | 0.0 | 0.0 | 0.9359 | 0.1783 | 0.5939 | 0.3325 | 0.0481 | nan | 0.2130 | 0.5999 | 0.5620 | 0.0 | 0.9495 | 0.8572 | 0.9796 | 0.0047 | 0.2841 | 0.4547 | 0.0 | nan | 0.7578 | 0.8810 | 0.6269 | 0.7312 | 0.3883 | nan | 0.5636 | 0.6215 | 0.0093 | 0.8634 | 0.1373 | 0.0285 | 0.3512 | 0.2463 | 0.5963 | 0.0 | 0.0 | 0.7620 | 0.1596 | 0.5112 | 0.2902 | 0.0468 | nan | 0.1611 | 0.4861 | 0.3712 | 0.0 | 0.8848 | 0.7659 | 0.9499 | 0.0038 | 0.1134 | 0.3581 | 0.0 | | 0.1348 | 18.1 | 3620 | 0.6481 | 0.3917 | 0.4693 | 0.8706 | nan | 0.8306 | 0.9477 | 0.8035 | 0.8420 | 0.5514 | nan | 0.7045 | 0.8461 | 0.0400 | 0.9511 | 0.1532 | 0.0096 | 0.6127 | 0.1254 | 0.7203 | 0.0001 | 0.0 | 0.9398 | 0.1342 | 0.5573 | 0.3896 | 0.0337 | nan | 0.2287 | 0.5552 | 0.5724 | 0.0 | 0.9505 | 0.8600 | 0.9802 | 0.0028 | 0.2236 | 0.4512 | 0.0 | nan | 0.7478 | 0.8727 | 0.6513 | 0.7269 | 0.3812 | nan | 0.5678 | 0.6342 | 0.0202 | 0.8619 | 0.1277 | 0.0091 | 0.4013 | 0.1253 | 0.5868 | 0.0001 | 0.0 | 0.7536 | 0.1229 | 0.4895 | 0.2892 | 0.0330 | nan | 0.1947 | 0.4772 | 0.3720 | 0.0 | 0.8859 | 0.7637 | 0.9502 | 0.0024 | 0.1190 | 0.3666 | 0.0 | | 0.1008 | 18.2 | 3640 | 0.6400 | 0.3961 | 0.4778 | 0.8700 | nan | 0.8315 | 0.9466 | 0.7874 | 0.8600 | 0.5177 | nan | 0.6957 | 0.8418 | 0.0366 | 0.9475 | 0.2405 | 
0.0 | 0.6111 | 0.1058 | 0.7397 | 0.0 | 0.0009 | 0.9279 | 0.2627 | 0.5572 | 0.3947 | 0.0730 | nan | 0.2279 | 0.6045 | 0.5431 | 0.0 | 0.9459 | 0.8792 | 0.9865 | 0.0192 | 0.2656 | 0.4386 | 0.0 | nan | 0.7440 | 0.8732 | 0.6501 | 0.7160 | 0.3869 | nan | 0.5630 | 0.6315 | 0.0223 | 0.8652 | 0.2004 | 0.0 | 0.4063 | 0.1058 | 0.5690 | 0.0 | 0.0008 | 0.7539 | 0.2115 | 0.4873 | 0.2980 | 0.0709 | nan | 0.1864 | 0.4805 | 0.3760 | 0.0 | 0.8870 | 0.7709 | 0.9467 | 0.0142 | 0.1040 | 0.3519 | 0.0 | | 0.1345 | 18.3 | 3660 | 0.6472 | 0.3959 | 0.4781 | 0.8706 | nan | 0.8146 | 0.9530 | 0.8235 | 0.8520 | 0.5497 | nan | 0.6968 | 0.8381 | 0.0207 | 0.9516 | 0.1768 | 0.0 | 0.6848 | 0.0982 | 0.7017 | 0.0 | 0.0016 | 0.9232 | 0.2083 | 0.5915 | 0.4181 | 0.0692 | nan | 0.2702 | 0.6141 | 0.5410 | 0.0 | 0.9518 | 0.8357 | 0.9695 | 0.0106 | 0.2471 | 0.4844 | 0.0 | nan | 0.7450 | 0.8758 | 0.6078 | 0.7318 | 0.3818 | nan | 0.5719 | 0.6405 | 0.0114 | 0.8615 | 0.1511 | 0.0 | 0.4094 | 0.0981 | 0.5907 | 0.0 | 0.0014 | 0.7618 | 0.1787 | 0.5066 | 0.3096 | 0.0666 | nan | 0.2119 | 0.4778 | 0.3787 | 0.0 | 0.8857 | 0.7576 | 0.9448 | 0.0082 | 0.1287 | 0.3737 | 0.0 | | 0.1169 | 18.4 | 3680 | 0.6531 | 0.3883 | 0.4689 | 0.8691 | nan | 0.8289 | 0.9521 | 0.7879 | 0.8127 | 0.5339 | nan | 0.6864 | 0.8635 | 0.0215 | 0.9581 | 0.1386 | 0.0544 | 0.8072 | 0.0718 | 0.6534 | 0.0 | 0.0 | 0.9327 | 0.1959 | 0.5625 | 0.3530 | 0.0056 | nan | 0.2673 | 0.5698 | 0.5440 | 0.0 | 0.9512 | 0.8463 | 0.9864 | 0.0051 | 0.1302 | 0.4835 | 0.0 | nan | 0.7451 | 0.8685 | 0.6699 | 0.7229 | 0.3542 | nan | 0.5663 | 0.5963 | 0.0124 | 0.8566 | 0.1181 | 0.0511 | 0.4035 | 0.0717 | 0.5656 | 0.0 | 0.0 | 0.7549 | 0.1712 | 0.4952 | 0.2960 | 0.0056 | nan | 0.2144 | 0.4820 | 0.3652 | 0.0 | 0.8849 | 0.7623 | 0.9443 | 0.0045 | 0.0663 | 0.3781 | 0.0 | | 0.1532 | 18.5 | 3700 | 0.6728 | 0.3907 | 0.4725 | 0.8686 | nan | 0.8117 | 0.9581 | 0.7649 | 0.8659 | 0.5009 | nan | 0.6302 | 0.7146 | 0.1802 | 0.9449 | 0.1444 | 0.0 | 0.7378 | 0.1248 | 0.7747 | 0.0 | 0.0 | 0.9387 
| 0.1801 | 0.6004 | 0.3819 | 0.0629 | nan | 0.1859 | 0.5827 | 0.5751 | 0.0 | 0.9523 | 0.8323 | 0.9805 | 0.0229 | 0.2482 | 0.4215 | 0.0 | nan | 0.7378 | 0.8700 | 0.6787 | 0.6915 | 0.3825 | nan | 0.5584 | 0.6061 | 0.0625 | 0.8632 | 0.1243 | 0.0 | 0.3961 | 0.1248 | 0.5644 | 0.0 | 0.0 | 0.7546 | 0.1585 | 0.5011 | 0.3042 | 0.0613 | nan | 0.1455 | 0.4883 | 0.3701 | 0.0 | 0.8856 | 0.7606 | 0.9497 | 0.0151 | 0.1017 | 0.3440 | 0.0 | | 0.0933 | 18.6 | 3720 | 0.6657 | 0.3892 | 0.4705 | 0.8671 | nan | 0.7966 | 0.9503 | 0.7910 | 0.8426 | 0.5041 | nan | 0.7030 | 0.8504 | 0.0036 | 0.9521 | 0.1420 | 0.0 | 0.7654 | 0.0719 | 0.6865 | 0.0 | 0.0 | 0.9344 | 0.1655 | 0.6533 | 0.3030 | 0.0829 | nan | 0.2544 | 0.5910 | 0.5748 | 0.0 | 0.9539 | 0.8573 | 0.9809 | 0.0126 | 0.1971 | 0.4359 | 0.0 | nan | 0.7158 | 0.8649 | 0.6562 | 0.6928 | 0.3857 | nan | 0.5605 | 0.5842 | 0.0018 | 0.8581 | 0.1180 | 0.0 | 0.4258 | 0.0718 | 0.5858 | 0.0 | 0.0 | 0.7634 | 0.1494 | 0.5398 | 0.2672 | 0.0798 | nan | 0.1992 | 0.4898 | 0.3707 | 0.0 | 0.8857 | 0.7744 | 0.9496 | 0.0094 | 0.0952 | 0.3577 | 0.0 | | 0.1438 | 18.7 | 3740 | 0.6791 | 0.3893 | 0.4663 | 0.8655 | nan | 0.7821 | 0.9529 | 0.7963 | 0.8464 | 0.5156 | nan | 0.6654 | 0.8413 | 0.0087 | 0.9405 | 0.1420 | 0.0 | 0.5114 | 0.1117 | 0.7339 | 0.0 | 0.0 | 0.9284 | 0.2376 | 0.6176 | 0.3744 | 0.1001 | nan | 0.1974 | 0.6001 | 0.5361 | 0.0 | 0.9529 | 0.8762 | 0.9833 | 0.0107 | 0.2175 | 0.4416 | 0.0 | nan | 0.7087 | 0.8600 | 0.6665 | 0.7063 | 0.3783 | nan | 0.5571 | 0.5747 | 0.0045 | 0.8647 | 0.1161 | 0.0 | 0.3470 | 0.1117 | 0.5896 | 0.0 | 0.0 | 0.7619 | 0.2037 | 0.5308 | 0.3091 | 0.0951 | nan | 0.1643 | 0.4884 | 0.3719 | 0.0 | 0.8887 | 0.7752 | 0.9486 | 0.0075 | 0.0816 | 0.3465 | 0.0 | | 0.0849 | 18.8 | 3760 | 0.6566 | 0.3937 | 0.4671 | 0.8737 | nan | 0.8372 | 0.9555 | 0.7537 | 0.8370 | 0.5635 | nan | 0.6648 | 0.8707 | 0.0169 | 0.9553 | 0.1224 | 0.0 | 0.4110 | 0.0835 | 0.7136 | 0.0 | 0.0012 | 0.9242 | 0.2331 | 0.6116 | 0.4140 | 0.0660 | nan | 0.3150 | 0.5872 | 
0.5567 | 0.0 | 0.9522 | 0.8598 | 0.9851 | 0.0051 | 0.1902 | 0.4594 | 0.0 | nan | 0.7514 | 0.8717 | 0.7081 | 0.7288 | 0.3997 | nan | 0.5627 | 0.5637 | 0.0104 | 0.8600 | 0.1056 | 0.0 | 0.3094 | 0.0835 | 0.5931 | 0.0 | 0.0010 | 0.7622 | 0.2070 | 0.5229 | 0.3329 | 0.0644 | nan | 0.2304 | 0.4850 | 0.3810 | 0.0 | 0.8919 | 0.7864 | 0.9474 | 0.0038 | 0.0767 | 0.3569 | 0.0 | | 0.1116 | 18.9 | 3780 | 0.6471 | 0.3966 | 0.4666 | 0.8774 | nan | 0.8720 | 0.9482 | 0.8046 | 0.8431 | 0.5345 | nan | 0.6738 | 0.8331 | 0.0113 | 0.9485 | 0.0919 | 0.0 | 0.4484 | 0.1166 | 0.7373 | 0.0 | 0.0026 | 0.9275 | 0.1592 | 0.6165 | 0.4077 | 0.0179 | nan | 0.3014 | 0.5796 | 0.5465 | 0.0 | 0.9580 | 0.8396 | 0.9802 | 0.0419 | 0.1764 | 0.5117 | 0.0 | nan | 0.7768 | 0.8863 | 0.6934 | 0.7475 | 0.3869 | nan | 0.5524 | 0.6067 | 0.0064 | 0.8653 | 0.0813 | 0.0 | 0.3712 | 0.1166 | 0.6076 | 0.0 | 0.0022 | 0.7640 | 0.1476 | 0.5426 | 0.3372 | 0.0176 | nan | 0.2389 | 0.4833 | 0.3878 | 0.0 | 0.8879 | 0.7702 | 0.9496 | 0.0196 | 0.0705 | 0.3732 | 0.0 | | 0.1391 | 19.0 | 3800 | 0.6216 | 0.3908 | 0.4653 | 0.8761 | nan | 0.8496 | 0.9555 | 0.7994 | 0.8339 | 0.5083 | nan | 0.7163 | 0.8324 | 0.0129 | 0.9491 | 0.1400 | 0.0 | 0.5319 | 0.0279 | 0.7549 | 0.0 | 0.0020 | 0.9199 | 0.1396 | 0.6369 | 0.3589 | 0.0829 | nan | 0.2064 | 0.6205 | 0.5965 | 0.0 | 0.9594 | 0.8644 | 0.9757 | 0.0076 | 0.1460 | 0.4598 | 0.0 | nan | 0.7640 | 0.8820 | 0.6720 | 0.7404 | 0.3806 | nan | 0.5694 | 0.5559 | 0.0067 | 0.8639 | 0.1234 | 0.0 | 0.4172 | 0.0279 | 0.5877 | 0.0 | 0.0018 | 0.7649 | 0.1263 | 0.5285 | 0.3136 | 0.0805 | nan | 0.1776 | 0.4837 | 0.3792 | 0.0 | 0.8865 | 0.7708 | 0.9521 | 0.0050 | 0.0771 | 0.3654 | 0.0 | | 0.1196 | 19.1 | 3820 | 0.6366 | 0.3926 | 0.4722 | 0.8727 | nan | 0.8250 | 0.9484 | 0.8132 | 0.8418 | 0.5690 | nan | 0.7304 | 0.8154 | 0.0102 | 0.9577 | 0.1662 | 0.0002 | 0.6124 | 0.0631 | 0.7290 | 0.0 | 0.0075 | 0.9306 | 0.1782 | 0.6425 | 0.3602 | 0.0330 | nan | 0.2764 | 0.5907 | 0.5957 | 0.0 | 0.9500 | 0.8442 | 0.9833 | 0.0043 
| 0.1566 | 0.4757 | 0.0 | nan | 0.7522 | 0.8838 | 0.6799 | 0.7295 | 0.3469 | nan | 0.5803 | 0.5606 | 0.0042 | 0.8591 | 0.1443 | 0.0002 | 0.4289 | 0.0631 | 0.6002 | 0.0 | 0.0067 | 0.7621 | 0.1557 | 0.5209 | 0.3123 | 0.0322 | nan | 0.2066 | 0.4849 | 0.3736 | 0.0 | 0.8885 | 0.7692 | 0.9480 | 0.0035 | 0.1009 | 0.3666 | 0.0 | | 0.0959 | 19.2 | 3840 | 0.6288 | 0.3935 | 0.4702 | 0.8750 | nan | 0.8575 | 0.9453 | 0.7918 | 0.8497 | 0.5645 | nan | 0.6963 | 0.8299 | 0.0091 | 0.9490 | 0.2160 | 0.0 | 0.4691 | 0.0974 | 0.7566 | 0.0 | 0.0057 | 0.9396 | 0.1275 | 0.5798 | 0.4401 | 0.0737 | nan | 0.2358 | 0.5932 | 0.5844 | 0.0 | 0.9462 | 0.8775 | 0.9824 | 0.0096 | 0.1730 | 0.4441 | 0.0 | nan | 0.7664 | 0.8847 | 0.6672 | 0.7389 | 0.3696 | nan | 0.5766 | 0.5705 | 0.0046 | 0.8629 | 0.1796 | 0.0 | 0.3385 | 0.0973 | 0.5875 | 0.0 | 0.0046 | 0.7543 | 0.1155 | 0.4952 | 0.3465 | 0.0703 | nan | 0.1918 | 0.4893 | 0.3725 | 0.0 | 0.8922 | 0.7813 | 0.9511 | 0.0061 | 0.1146 | 0.3613 | 0.0 | | 0.0953 | 19.3 | 3860 | 0.6623 | 0.3801 | 0.4474 | 0.8713 | nan | 0.8262 | 0.9593 | 0.7969 | 0.8096 | 0.5081 | nan | 0.6969 | 0.8623 | 0.0131 | 0.9501 | 0.1314 | 0.0012 | 0.2272 | 0.0296 | 0.6862 | 0.0 | 0.0 | 0.9403 | 0.1217 | 0.5800 | 0.3706 | 0.0808 | nan | 0.2683 | 0.5718 | 0.5212 | 0.0 | 0.9522 | 0.8629 | 0.9806 | 0.0062 | 0.1071 | 0.4561 | 0.0 | nan | 0.7435 | 0.8717 | 0.7078 | 0.7134 | 0.3755 | nan | 0.5783 | 0.5435 | 0.0070 | 0.8618 | 0.1100 | 0.0012 | 0.1734 | 0.0295 | 0.5762 | 0.0 | 0.0 | 0.7504 | 0.1110 | 0.4996 | 0.3201 | 0.0776 | nan | 0.2004 | 0.4833 | 0.3712 | 0.0 | 0.8898 | 0.7813 | 0.9505 | 0.0037 | 0.0657 | 0.3646 | 0.0 | | 0.1373 | 19.4 | 3880 | 0.6685 | 0.3895 | 0.4625 | 0.8714 | nan | 0.8442 | 0.9510 | 0.7916 | 0.8176 | 0.5431 | nan | 0.6855 | 0.8213 | 0.0075 | 0.9520 | 0.1370 | 0.0018 | 0.5377 | 0.0521 | 0.6675 | 0.0 | 0.0017 | 0.9306 | 0.1636 | 0.5739 | 0.3619 | 0.0569 | nan | 0.2615 | 0.5481 | 0.5454 | 0.0 | 0.9551 | 0.8495 | 0.9859 | 0.0112 | 0.2837 | 0.4612 | 0.0 | nan | 0.7560 | 
0.8749 | 0.7109 | 0.7168 | 0.3679 | nan | 0.5734 | 0.5998 | 0.0039 | 0.8604 | 0.1156 | 0.0018 | 0.3666 | 0.0521 | 0.5734 | 0.0 | 0.0015 | 0.7527 | 0.1418 | 0.4988 | 0.3208 | 0.0558 | nan | 0.1980 | 0.4756 | 0.3725 | 0.0 | 0.8879 | 0.7700 | 0.9445 | 0.0067 | 0.1031 | 0.3614 | 0.0 | | 0.0634 | 19.5 | 3900 | 0.6595 | 0.4015 | 0.4852 | 0.8721 | nan | 0.8256 | 0.9453 | 0.8299 | 0.8586 | 0.5898 | nan | 0.6855 | 0.8259 | 0.0306 | 0.9562 | 0.1417 | 0.0069 | 0.6597 | 0.0814 | 0.7130 | 0.0 | 0.0123 | 0.9206 | 0.2307 | 0.6144 | 0.3796 | 0.1257 | nan | 0.3290 | 0.6291 | 0.5818 | 0.0 | 0.9508 | 0.8416 | 0.9838 | 0.0177 | 0.2719 | 0.4868 | 0.0 | nan | 0.7497 | 0.8837 | 0.6989 | 0.7101 | 0.3746 | nan | 0.5735 | 0.6215 | 0.0142 | 0.8605 | 0.1227 | 0.0069 | 0.4337 | 0.0814 | 0.5765 | 0.0 | 0.0110 | 0.7668 | 0.1922 | 0.5229 | 0.3180 | 0.1211 | nan | 0.2457 | 0.4939 | 0.3811 | 0.0 | 0.8884 | 0.7650 | 0.9487 | 0.0094 | 0.1052 | 0.3722 | 0.0 | | 0.1144 | 19.6 | 3920 | 0.6743 | 0.3943 | 0.4708 | 0.8711 | nan | 0.8363 | 0.9524 | 0.8027 | 0.7968 | 0.5293 | nan | 0.6985 | 0.8225 | 0.1192 | 0.9560 | 0.1437 | 0.0100 | 0.7422 | 0.0896 | 0.6518 | 0.0 | 0.0 | 0.9286 | 0.1067 | 0.6105 | 0.3336 | 0.0772 | nan | 0.2570 | 0.5919 | 0.5545 | 0.0 | 0.9578 | 0.8493 | 0.9787 | 0.0030 | 0.1745 | 0.4916 | 0.0 | nan | 0.7460 | 0.8646 | 0.7031 | 0.7061 | 0.3837 | nan | 0.5698 | 0.6244 | 0.0406 | 0.8595 | 0.1206 | 0.0100 | 0.4231 | 0.0895 | 0.5612 | 0.0 | 0.0 | 0.7606 | 0.0981 | 0.5160 | 0.2920 | 0.0753 | nan | 0.2151 | 0.4904 | 0.3711 | 0.0 | 0.8875 | 0.7760 | 0.9499 | 0.0023 | 0.0951 | 0.3875 | 0.0 | | 0.1188 | 19.7 | 3940 | 0.6705 | 0.4008 | 0.4736 | 0.8727 | nan | 0.8352 | 0.9605 | 0.7775 | 0.7891 | 0.5159 | nan | 0.6744 | 0.8470 | 0.0115 | 0.9497 | 0.1475 | 0.0024 | 0.6870 | 0.1569 | 0.6805 | 0.0 | 0.0003 | 0.9275 | 0.2195 | 0.6293 | 0.3641 | 0.0765 | nan | 0.2214 | 0.6162 | 0.5561 | 0.0 | 0.9539 | 0.8622 | 0.9766 | 0.0012 | 0.2277 | 0.4884 | 0.0 | nan | 0.7518 | 0.8605 | 0.7179 | 0.7051 | 0.3932 | nan 
| 0.5742 | 0.6278 | 0.0058 | 0.8634 | 0.1261 | 0.0024 | 0.4491 | 0.1568 | 0.5786 | 0.0 | 0.0002 | 0.7646 | 0.1904 | 0.5319 | 0.3115 | 0.0745 | nan | 0.1672 | 0.5011 | 0.3704 | 0.0 | 0.8884 | 0.7778 | 0.9518 | 0.0011 | 0.1013 | 0.3798 | 0.0 | | 0.0921 | 19.8 | 3960 | 0.5977 | 0.3969 | 0.4735 | 0.8792 | nan | 0.8769 | 0.9573 | 0.7823 | 0.8502 | 0.4808 | nan | 0.6991 | 0.8520 | 0.0049 | 0.9545 | 0.1090 | 0.0058 | 0.6390 | 0.0450 | 0.6867 | 0.0 | 0.0 | 0.9265 | 0.1901 | 0.6006 | 0.3956 | 0.1275 | nan | 0.2660 | 0.6230 | 0.6010 | 0.0 | 0.9521 | 0.8688 | 0.9846 | 0.0038 | 0.2451 | 0.4230 | 0.0 | nan | 0.7787 | 0.8830 | 0.6752 | 0.7446 | 0.3796 | nan | 0.5851 | 0.6326 | 0.0033 | 0.8611 | 0.0962 | 0.0058 | 0.4192 | 0.0450 | 0.5756 | 0.0 | 0.0 | 0.7626 | 0.1657 | 0.5145 | 0.3129 | 0.1224 | nan | 0.1892 | 0.4940 | 0.3688 | 0.0 | 0.8890 | 0.7783 | 0.9494 | 0.0036 | 0.1144 | 0.3510 | 0.0 | | 0.0998 | 19.9 | 3980 | 0.6695 | 0.3898 | 0.4654 | 0.8695 | nan | 0.8016 | 0.9558 | 0.7933 | 0.8599 | 0.5248 | nan | 0.7058 | 0.8529 | 0.0 | 0.9482 | 0.1023 | 0.0142 | 0.5224 | 0.0380 | 0.7253 | 0.0 | 0.0163 | 0.9390 | 0.1337 | 0.5630 | 0.3497 | 0.1847 | nan | 0.2445 | 0.5742 | 0.5538 | 0.0 | 0.9543 | 0.8615 | 0.9816 | 0.0138 | 0.2513 | 0.4269 | 0.0 | nan | 0.7260 | 0.8725 | 0.7126 | 0.7037 | 0.3809 | nan | 0.5887 | 0.6069 | 0.0 | 0.8603 | 0.0903 | 0.0136 | 0.3584 | 0.0380 | 0.5701 | 0.0 | 0.0148 | 0.7466 | 0.1170 | 0.4905 | 0.2939 | 0.1709 | nan | 0.1710 | 0.4849 | 0.3728 | 0.0 | 0.8872 | 0.7744 | 0.9510 | 0.0121 | 0.1114 | 0.3524 | 0.0 | | 0.1001 | 20.0 | 4000 | 0.6360 | 0.3940 | 0.4690 | 0.8747 | nan | 0.8478 | 0.9596 | 0.8029 | 0.8542 | 0.4806 | nan | 0.7011 | 0.8270 | 0.0153 | 0.9524 | 0.1601 | 0.0 | 0.5663 | 0.0927 | 0.7436 | 0.0 | 0.0156 | 0.9355 | 0.1451 | 0.5621 | 0.3605 | 0.1619 | nan | 0.1624 | 0.6078 | 0.5374 | 0.0 | 0.9556 | 0.8211 | 0.9820 | 0.0169 | 0.3125 | 0.4290 | 0.0 | nan | 0.7591 | 0.8734 | 0.6983 | 0.7558 | 0.3819 | nan | 0.5784 | 0.6133 | 0.0099 | 0.8632 | 0.1415 | 
0.0 | 0.3731 | 0.0927 | 0.5487 | 0.0 | 0.0145 | 0.7561 | 0.1293 | 0.4910 | 0.3003 | 0.1506 | nan | 0.1298 | 0.4896 | 0.3837 | 0.0 | 0.8829 | 0.7556 | 0.9518 | 0.0153 | 0.1128 | 0.3558 | 0.0 | | 0.1253 | 20.1 | 4020 | 0.6375 | 0.3945 | 0.4689 | 0.8761 | nan | 0.8562 | 0.9533 | 0.8124 | 0.8497 | 0.5275 | nan | 0.7055 | 0.8616 | 0.0107 | 0.9555 | 0.2256 | 0.0008 | 0.5598 | 0.1015 | 0.7113 | 0.0 | 0.0024 | 0.9286 | 0.1968 | 0.5871 | 0.3271 | 0.1187 | nan | 0.1832 | 0.5976 | 0.5238 | 0.0 | 0.9577 | 0.8331 | 0.9858 | 0.0137 | 0.1747 | 0.4445 | 0.0 | nan | 0.7673 | 0.8788 | 0.6697 | 0.7518 | 0.3957 | nan | 0.5791 | 0.5823 | 0.0078 | 0.8610 | 0.1961 | 0.0008 | 0.3487 | 0.1015 | 0.5737 | 0.0 | 0.0022 | 0.7595 | 0.1723 | 0.5084 | 0.2912 | 0.1149 | nan | 0.1545 | 0.4909 | 0.3803 | 0.0 | 0.8825 | 0.7555 | 0.9488 | 0.0121 | 0.0726 | 0.3626 | 0.0 | | 0.0963 | 20.2 | 4040 | 0.6343 | 0.3986 | 0.4709 | 0.8758 | nan | 0.8581 | 0.9509 | 0.7889 | 0.8497 | 0.5508 | nan | 0.6903 | 0.8440 | 0.0020 | 0.9561 | 0.1398 | 0.0 | 0.6282 | 0.0328 | 0.6605 | 0.0 | 0.0034 | 0.9246 | 0.2302 | 0.5814 | 0.3759 | 0.1717 | nan | 0.2484 | 0.5943 | 0.5692 | 0.0 | 0.9611 | 0.8160 | 0.9809 | 0.0170 | 0.1377 | 0.5036 | 0.0 | nan | 0.7668 | 0.8783 | 0.6873 | 0.7581 | 0.3905 | nan | 0.5727 | 0.6263 | 0.0016 | 0.8572 | 0.1219 | 0.0 | 0.4296 | 0.0328 | 0.5626 | 0.0 | 0.0032 | 0.7619 | 0.2007 | 0.5207 | 0.2996 | 0.1609 | nan | 0.2098 | 0.4903 | 0.3758 | 0.0 | 0.8799 | 0.7447 | 0.9505 | 0.0132 | 0.0692 | 0.3891 | 0.0 | | 0.1137 | 20.3 | 4060 | 0.6375 | 0.3946 | 0.4692 | 0.8769 | nan | 0.8613 | 0.9550 | 0.8020 | 0.8515 | 0.5362 | nan | 0.7077 | 0.8397 | 0.0242 | 0.9510 | 0.1386 | 0.0 | 0.6277 | 0.0927 | 0.7196 | 0.0 | 0.0015 | 0.9251 | 0.1790 | 0.6060 | 0.3227 | 0.1148 | nan | 0.1834 | 0.5894 | 0.5724 | 0.0 | 0.9528 | 0.8344 | 0.9837 | 0.0036 | 0.1638 | 0.4759 | 0.0 | nan | 0.7764 | 0.8810 | 0.6866 | 0.7539 | 0.3881 | nan | 0.5824 | 0.5827 | 0.0154 | 0.8616 | 0.1163 | 0.0 | 0.4547 | 0.0926 | 0.5692 | 0.0 | 0.0014 
| 0.7587 | 0.1571 | 0.5074 | 0.2865 | 0.1074 | nan | 0.1429 | 0.4889 | 0.3755 | 0.0 | 0.8853 | 0.7612 | 0.9461 | 0.0029 | 0.0755 | 0.3681 | 0.0 | | 0.1062 | 20.4 | 4080 | 0.6567 | 0.3995 | 0.4775 | 0.8730 | nan | 0.8379 | 0.9542 | 0.8219 | 0.8067 | 0.5245 | nan | 0.7027 | 0.8269 | 0.0109 | 0.9520 | 0.1389 | 0.0 | 0.5794 | 0.1255 | 0.7120 | 0.0 | 0.0142 | 0.9348 | 0.1926 | 0.6014 | 0.4170 | 0.1640 | nan | 0.2494 | 0.6041 | 0.5862 | 0.0 | 0.9498 | 0.8683 | 0.9818 | 0.0056 | 0.2779 | 0.4382 | 0.0 | nan | 0.7553 | 0.8689 | 0.6614 | 0.7353 | 0.3795 | nan | 0.5757 | 0.6132 | 0.0066 | 0.8633 | 0.1199 | 0.0 | 0.4289 | 0.1255 | 0.5749 | 0.0 | 0.0132 | 0.7611 | 0.1679 | 0.5161 | 0.3362 | 0.1469 | nan | 0.1785 | 0.4951 | 0.3797 | 0.0 | 0.8900 | 0.7775 | 0.9491 | 0.0047 | 0.1043 | 0.3541 | 0.0 | | 0.1041 | 20.5 | 4100 | 0.6890 | 0.4013 | 0.4795 | 0.8670 | nan | 0.7937 | 0.9591 | 0.7886 | 0.8263 | 0.4823 | nan | 0.7058 | 0.8277 | 0.0570 | 0.9562 | 0.2314 | 0.0089 | 0.5893 | 0.0884 | 0.7255 | 0.0 | 0.0212 | 0.9247 | 0.2537 | 0.6046 | 0.3496 | 0.2725 | nan | 0.2060 | 0.5952 | 0.5589 | 0.0 | 0.9555 | 0.8368 | 0.9832 | 0.0022 | 0.2807 | 0.4606 | 0.0 | nan | 0.7252 | 0.8511 | 0.7002 | 0.7143 | 0.3809 | nan | 0.5730 | 0.5817 | 0.0289 | 0.8606 | 0.1952 | 0.0089 | 0.4421 | 0.0883 | 0.5674 | 0.0 | 0.0208 | 0.7613 | 0.2071 | 0.5142 | 0.3038 | 0.2314 | nan | 0.1489 | 0.4961 | 0.3774 | 0.0 | 0.8847 | 0.7648 | 0.9499 | 0.0020 | 0.1005 | 0.3606 | 0.0 | | 0.1587 | 20.6 | 4120 | 0.6767 | 0.4083 | 0.4812 | 0.8711 | nan | 0.8364 | 0.9537 | 0.7705 | 0.7938 | 0.5484 | nan | 0.6784 | 0.8085 | 0.0284 | 0.9558 | 0.1908 | 0.0 | 0.5209 | 0.2507 | 0.6850 | 0.0 | 0.0004 | 0.9251 | 0.2019 | 0.6493 | 0.3894 | 0.2335 | nan | 0.2594 | 0.5984 | 0.5619 | 0.0 | 0.9466 | 0.8498 | 0.9837 | 0.0110 | 0.2769 | 0.4914 | 0.0 | nan | 0.7442 | 0.8592 | 0.7000 | 0.7254 | 0.3821 | nan | 0.5722 | 0.6206 | 0.0132 | 0.8605 | 0.1656 | 0.0 | 0.4081 | 0.2504 | 0.5632 | 0.0 | 0.0004 | 0.7681 | 0.1775 | 0.5537 | 0.3305 | 0.2045 | 
nan | 0.1971 | 0.4928 | 0.3788 | 0.0 | 0.8873 | 0.7669 | 0.9484 | 0.0090 | 0.1096 | 0.3747 | 0.0 | | 0.1263 | 20.7 | 4140 | 0.6792 | 0.4052 | 0.4798 | 0.8693 | nan | 0.8127 | 0.9591 | 0.8023 | 0.8156 | 0.5044 | nan | 0.6911 | 0.8283 | 0.0084 | 0.9539 | 0.1576 | 0.0 | 0.4543 | 0.2301 | 0.6969 | 0.0 | 0.0087 | 0.9293 | 0.1909 | 0.6220 | 0.4202 | 0.2310 | nan | 0.3082 | 0.6124 | 0.5514 | 0.0 | 0.9555 | 0.8294 | 0.9521 | 0.0289 | 0.3463 | 0.4527 | 0.0 | nan | 0.7352 | 0.8585 | 0.6929 | 0.7207 | 0.3930 | nan | 0.5796 | 0.6131 | 0.0048 | 0.8608 | 0.1425 | 0.0 | 0.3678 | 0.2301 | 0.5863 | 0.0 | 0.0082 | 0.7663 | 0.1664 | 0.5473 | 0.3381 | 0.2010 | nan | 0.2321 | 0.4778 | 0.3794 | 0.0 | 0.8840 | 0.7550 | 0.9309 | 0.0198 | 0.1169 | 0.3594 | 0.0 | | 0.0687 | 20.8 | 4160 | 0.6763 | 0.4026 | 0.4814 | 0.8705 | nan | 0.8374 | 0.9388 | 0.8149 | 0.8092 | 0.6507 | nan | 0.7003 | 0.7479 | 0.0486 | 0.9459 | 0.1209 | 0.0 | 0.4576 | 0.1684 | 0.7734 | 0.0 | 0.0052 | 0.9254 | 0.1351 | 0.6627 | 0.3857 | 0.2935 | nan | 0.3222 | 0.5700 | 0.5956 | 0.0 | 0.9561 | 0.8280 | 0.9860 | 0.0265 | 0.2518 | 0.4466 | 0.0 | nan | 0.7526 | 0.8674 | 0.7003 | 0.7256 | 0.3858 | nan | 0.5779 | 0.5933 | 0.0159 | 0.8653 | 0.1066 | 0.0 | 0.3861 | 0.1684 | 0.5383 | 0.0 | 0.0047 | 0.7688 | 0.1184 | 0.5338 | 0.3312 | 0.2460 | nan | 0.2383 | 0.4819 | 0.3783 | 0.0 | 0.8830 | 0.7539 | 0.9475 | 0.0183 | 0.1309 | 0.3649 | 0.0 | | 0.1022 | 20.9 | 4180 | 0.6515 | 0.4094 | 0.4840 | 0.8714 | nan | 0.8243 | 0.9531 | 0.8246 | 0.8207 | 0.5230 | nan | 0.7051 | 0.8164 | 0.0200 | 0.9530 | 0.1783 | 0.0 | 0.6442 | 0.2868 | 0.7147 | 0.0 | 0.0004 | 0.9344 | 0.1501 | 0.5645 | 0.3791 | 0.2135 | nan | 0.2752 | 0.6152 | 0.5677 | 0.0 | 0.9529 | 0.8576 | 0.9788 | 0.0186 | 0.2173 | 0.5001 | 0.0 | nan | 0.7452 | 0.8705 | 0.6665 | 0.7204 | 0.3726 | nan | 0.5795 | 0.6388 | 0.0120 | 0.8618 | 0.1561 | 0.0 | 0.4798 | 0.2868 | 0.5968 | 0.0 | 0.0004 | 0.7536 | 0.1331 | 0.4925 | 0.3280 | 0.1859 | nan | 0.2251 | 0.4905 | 0.3779 | 0.0 | 0.8884 | 
0.7699 | 0.9526 | 0.0146 | 0.1169 | 0.3848 | 0.0 | | 0.1278 | 21.0 | 4200 | 0.6634 | 0.4075 | 0.4829 | 0.8711 | nan | 0.8245 | 0.9530 | 0.7866 | 0.8225 | 0.5349 | nan | 0.6884 | 0.8640 | 0.0566 | 0.9489 | 0.2054 | 0.0 | 0.5859 | 0.2362 | 0.7052 | 0.0 | 0.0008 | 0.9347 | 0.1644 | 0.6074 | 0.3896 | 0.2307 | nan | 0.3313 | 0.5614 | 0.5768 | 0.0 | 0.9521 | 0.8605 | 0.9856 | 0.0116 | 0.1722 | 0.4631 | 0.0 | nan | 0.7408 | 0.8674 | 0.7110 | 0.7096 | 0.3785 | nan | 0.5794 | 0.5541 | 0.0353 | 0.8636 | 0.1842 | 0.0 | 0.4354 | 0.2361 | 0.5915 | 0.0 | 0.0008 | 0.7613 | 0.1453 | 0.5112 | 0.3317 | 0.2033 | nan | 0.2664 | 0.4866 | 0.3724 | 0.0 | 0.8884 | 0.7736 | 0.9489 | 0.0089 | 0.0912 | 0.3618 | 0.0 | | 0.0846 | 21.1 | 4220 | 0.6718 | 0.3986 | 0.4677 | 0.8711 | nan | 0.8204 | 0.9642 | 0.7579 | 0.8309 | 0.4685 | nan | 0.6837 | 0.8341 | 0.0109 | 0.9496 | 0.1461 | 0.0 | 0.4288 | 0.1567 | 0.6752 | 0.0 | 0.0 | 0.9378 | 0.1636 | 0.5715 | 0.3840 | 0.3030 | nan | 0.2801 | 0.6112 | 0.5545 | 0.0 | 0.9499 | 0.8698 | 0.9818 | 0.0113 | 0.1875 | 0.4330 | 0.0 | nan | 0.7372 | 0.8662 | 0.7208 | 0.7000 | 0.3749 | nan | 0.5771 | 0.5614 | 0.0071 | 0.8616 | 0.1265 | 0.0 | 0.3334 | 0.1567 | 0.5866 | 0.0 | 0.0 | 0.7540 | 0.1424 | 0.5010 | 0.3260 | 0.2542 | nan | 0.2147 | 0.4983 | 0.3801 | 0.0 | 0.8891 | 0.7807 | 0.9504 | 0.0093 | 0.0879 | 0.3571 | 0.0 | | 0.1115 | 21.2 | 4240 | 0.6552 | 0.4076 | 0.4832 | 0.8737 | nan | 0.8317 | 0.9473 | 0.8047 | 0.8574 | 0.5807 | nan | 0.6939 | 0.8171 | 0.0246 | 0.9492 | 0.1527 | 0.0037 | 0.5044 | 0.2308 | 0.6829 | 0.0 | 0.0009 | 0.9300 | 0.1914 | 0.6670 | 0.4029 | 0.3364 | nan | 0.3103 | 0.5589 | 0.5355 | 0.0 | 0.9510 | 0.8408 | 0.9833 | 0.0029 | 0.1955 | 0.4733 | 0.0 | nan | 0.7517 | 0.8763 | 0.6704 | 0.7468 | 0.3714 | nan | 0.5771 | 0.5795 | 0.0150 | 0.8643 | 0.1306 | 0.0037 | 0.3847 | 0.2307 | 0.5843 | 0.0 | 0.0008 | 0.7695 | 0.1615 | 0.5434 | 0.3354 | 0.2658 | nan | 0.2342 | 0.4869 | 0.3854 | 0.0 | 0.8869 | 0.7624 | 0.9506 | 0.0023 | 0.1051 | 0.3660 | 0.0 | | 
0.1084 | 21.3 | 4260 | 0.6892 | 0.3892 | 0.4577 | 0.8640 | nan | 0.7884 | 0.9506 | 0.7385 | 0.8094 | 0.5658 | nan | 0.7285 | 0.8160 | 0.0189 | 0.9520 | 0.1778 | 0.0 | 0.2717 | 0.1533 | 0.6612 | 0.0 | 0.0003 | 0.9476 | 0.1546 | 0.5538 | 0.4004 | 0.1485 | nan | 0.2246 | 0.6010 | 0.5424 | 0.0 | 0.9481 | 0.8537 | 0.9823 | 0.0129 | 0.1972 | 0.4453 | 0.0 | nan | 0.7146 | 0.8556 | 0.7159 | 0.7010 | 0.3691 | nan | 0.5797 | 0.6040 | 0.0121 | 0.8610 | 0.1574 | 0.0 | 0.2095 | 0.1532 | 0.5598 | 0.0 | 0.0002 | 0.7475 | 0.1321 | 0.4993 | 0.3277 | 0.1327 | nan | 0.1811 | 0.4931 | 0.3776 | 0.0 | 0.8874 | 0.7649 | 0.9522 | 0.0083 | 0.0941 | 0.3645 | 0.0 | | 0.1338 | 21.4 | 4280 | 0.6709 | 0.3973 | 0.4641 | 0.8726 | nan | 0.8313 | 0.9602 | 0.7449 | 0.8214 | 0.5155 | nan | 0.7123 | 0.8303 | 0.0138 | 0.9432 | 0.1363 | 0.0 | 0.4912 | 0.1223 | 0.6385 | 0.0 | 0.0003 | 0.9390 | 0.1575 | 0.6410 | 0.3586 | 0.2166 | nan | 0.2556 | 0.5782 | 0.5075 | 0.0 | 0.9518 | 0.8385 | 0.9821 | 0.0096 | 0.2061 | 0.4482 | 0.0 | nan | 0.7484 | 0.8677 | 0.7121 | 0.7044 | 0.3990 | nan | 0.5868 | 0.5942 | 0.0097 | 0.8629 | 0.1184 | 0.0 | 0.3635 | 0.1222 | 0.5545 | 0.0 | 0.0002 | 0.7618 | 0.1388 | 0.5326 | 0.3045 | 0.1839 | nan | 0.2106 | 0.4879 | 0.3757 | 0.0 | 0.8841 | 0.7657 | 0.9515 | 0.0074 | 0.1076 | 0.3582 | 0.0 | | 0.0738 | 21.5 | 4300 | 0.6573 | 0.4033 | 0.4748 | 0.8713 | nan | 0.8115 | 0.9567 | 0.8002 | 0.8162 | 0.5514 | nan | 0.7062 | 0.8208 | 0.0280 | 0.9496 | 0.1044 | 0.0029 | 0.4920 | 0.1151 | 0.6800 | 0.0 | 0.0 | 0.9278 | 0.1859 | 0.6386 | 0.3948 | 0.3072 | nan | 0.3047 | 0.6096 | 0.5394 | 0.0 | 0.9550 | 0.8316 | 0.9853 | 0.0043 | 0.1767 | 0.4969 | 0.0 | nan | 0.7366 | 0.8644 | 0.6744 | 0.7142 | 0.3915 | nan | 0.5877 | 0.6245 | 0.0180 | 0.8641 | 0.0928 | 0.0028 | 0.3760 | 0.1150 | 0.5753 | 0.0 | 0.0 | 0.7681 | 0.1669 | 0.5448 | 0.3265 | 0.2540 | nan | 0.2388 | 0.4946 | 0.3862 | 0.0 | 0.8849 | 0.7629 | 0.9491 | 0.0037 | 0.1074 | 0.3806 | 0.0 | | 0.1029 | 21.6 | 4320 | 0.6469 | 0.4044 | 0.4782 | 
0.8737 | nan | 0.8398 | 0.9561 | 0.7955 | 0.8276 | 0.5186 | nan | 0.7033 | 0.7922 | 0.0450 | 0.9536 | 0.1347 | 0.0001 | 0.6867 | 0.1459 | 0.7371 | 0.0 | 0.0003 | 0.9364 | 0.1667 | 0.6232 | 0.3445 | 0.2556 | nan | 0.2941 | 0.5700 | 0.5599 | 0.0 | 0.9584 | 0.8165 | 0.9835 | 0.0045 | 0.1937 | 0.4606 | 0.0 | nan | 0.7543 | 0.8712 | 0.6931 | 0.7240 | 0.3945 | nan | 0.5843 | 0.6372 | 0.0275 | 0.8614 | 0.1184 | 0.0001 | 0.4375 | 0.1459 | 0.5797 | 0.0 | 0.0003 | 0.7593 | 0.1475 | 0.5194 | 0.2930 | 0.2124 | nan | 0.2283 | 0.4873 | 0.3849 | 0.0 | 0.8807 | 0.7483 | 0.9503 | 0.0040 | 0.1232 | 0.3739 | 0.0 | | 0.1186 | 21.7 | 4340 | 0.6585 | 0.4079 | 0.4883 | 0.8723 | nan | 0.8288 | 0.9558 | 0.7707 | 0.8673 | 0.5224 | nan | 0.6848 | 0.8085 | 0.0581 | 0.9490 | 0.1224 | 0.0 | 0.6958 | 0.0732 | 0.7552 | 0.0 | 0.0 | 0.9204 | 0.2677 | 0.5686 | 0.3612 | 0.3455 | nan | 0.3524 | 0.6461 | 0.5403 | 0.0 | 0.9572 | 0.8196 | 0.9823 | 0.0281 | 0.2567 | 0.4876 | 0.0 | nan | 0.7477 | 0.8743 | 0.7007 | 0.7153 | 0.4094 | nan | 0.5850 | 0.6034 | 0.0355 | 0.8668 | 0.1055 | 0.0 | 0.4602 | 0.0732 | 0.6005 | 0.0 | 0.0 | 0.7550 | 0.2271 | 0.5002 | 0.3071 | 0.2702 | nan | 0.2727 | 0.4949 | 0.3814 | 0.0 | 0.8794 | 0.7426 | 0.9508 | 0.0204 | 0.0945 | 0.3773 | 0.0 | | 0.0677 | 21.8 | 4360 | 0.6361 | 0.3993 | 0.4769 | 0.8750 | nan | 0.8544 | 0.9537 | 0.7839 | 0.8400 | 0.5455 | nan | 0.6787 | 0.8215 | 0.0144 | 0.9538 | 0.1153 | 0.0 | 0.7256 | 0.0217 | 0.7075 | 0.0 | 0.0 | 0.9299 | 0.1329 | 0.5876 | 0.4313 | 0.2619 | nan | 0.2513 | 0.6280 | 0.5770 | 0.0 | 0.9517 | 0.8389 | 0.9813 | 0.0087 | 0.2031 | 0.4615 | 0.0 | nan | 0.7686 | 0.8791 | 0.6233 | 0.7480 | 0.3951 | nan | 0.5785 | 0.6149 | 0.0087 | 0.8604 | 0.0980 | 0.0 | 0.4816 | 0.0217 | 0.5856 | 0.0 | 0.0 | 0.7597 | 0.1237 | 0.5133 | 0.3455 | 0.2226 | nan | 0.2174 | 0.4940 | 0.3838 | 0.0 | 0.8837 | 0.7542 | 0.9505 | 0.0060 | 0.0886 | 0.3699 | 0.0 | | 0.0966 | 21.9 | 4380 | 0.6521 | 0.4025 | 0.4823 | 0.8755 | nan | 0.8575 | 0.9535 | 0.7738 | 0.8313 | 0.5383 
| nan | 0.6988 | 0.8177 | 0.0173 | 0.9548 | 0.1243 | 0.0 | 0.6781 | 0.1098 | 0.7482 | 0.0 | 0.0 | 0.9375 | 0.1988 | 0.5860 | 0.3846 | 0.2233 | nan | 0.2717 | 0.6065 | 0.6044 | 0.0 | 0.9486 | 0.8565 | 0.9802 | 0.0117 | 0.2642 | 0.4555 | 0.0 | nan | 0.7634 | 0.8766 | 0.6806 | 0.7478 | 0.3922 | nan | 0.5740 | 0.6078 | 0.0107 | 0.8622 | 0.1051 | 0.0 | 0.4432 | 0.1098 | 0.5781 | 0.0 | 0.0 | 0.7588 | 0.1735 | 0.5266 | 0.3227 | 0.1965 | nan | 0.2137 | 0.4956 | 0.3696 | 0.0 | 0.8882 | 0.7733 | 0.9508 | 0.0081 | 0.0910 | 0.3591 | 0.0 | | 0.1168 | 22.0 | 4400 | 0.6743 | 0.4063 | 0.4888 | 0.8718 | nan | 0.8217 | 0.9470 | 0.8187 | 0.8473 | 0.5740 | nan | 0.6884 | 0.8092 | 0.0404 | 0.9540 | 0.1233 | 0.0 | 0.7544 | 0.1954 | 0.7602 | 0.0 | 0.0011 | 0.9278 | 0.1784 | 0.6134 | 0.3906 | 0.2784 | nan | 0.1975 | 0.6000 | 0.5872 | 0.0 | 0.9548 | 0.8470 | 0.9854 | 0.0077 | 0.2577 | 0.4814 | 0.0 | nan | 0.7449 | 0.8696 | 0.6629 | 0.7342 | 0.3708 | nan | 0.5749 | 0.6200 | 0.0206 | 0.8656 | 0.1045 | 0.0 | 0.4958 | 0.1954 | 0.5853 | 0.0 | 0.0010 | 0.7691 | 0.1619 | 0.5505 | 0.3202 | 0.2488 | nan | 0.1564 | 0.4904 | 0.3779 | 0.0 | 0.8859 | 0.7710 | 0.9489 | 0.0067 | 0.0918 | 0.3755 | 0.0 | | 0.1196 | 22.1 | 4420 | 0.6573 | 0.4029 | 0.4845 | 0.8736 | nan | 0.8246 | 0.9584 | 0.7864 | 0.8363 | 0.5195 | nan | 0.6933 | 0.7868 | 0.1454 | 0.9523 | 0.1428 | 0.0 | 0.7212 | 0.0582 | 0.7861 | 0.0 | 0.0001 | 0.9330 | 0.2043 | 0.6240 | 0.3923 | 0.2184 | nan | 0.2291 | 0.6001 | 0.5717 | 0.0 | 0.9499 | 0.8588 | 0.9833 | 0.0066 | 0.2396 | 0.4826 | 0.0 | nan | 0.7455 | 0.8696 | 0.6918 | 0.7157 | 0.3893 | nan | 0.5780 | 0.6101 | 0.0592 | 0.8655 | 0.1215 | 0.0 | 0.4956 | 0.0582 | 0.5283 | 0.0 | 0.0001 | 0.7669 | 0.1772 | 0.5426 | 0.3298 | 0.1941 | nan | 0.1912 | 0.4937 | 0.3772 | 0.0 | 0.8886 | 0.7717 | 0.9503 | 0.0059 | 0.0940 | 0.3812 | 0.0 | | 0.0916 | 22.2 | 4440 | 0.6671 | 0.4016 | 0.4835 | 0.8750 | nan | 0.8546 | 0.9526 | 0.7848 | 0.8181 | 0.5306 | nan | 0.7050 | 0.7740 | 0.0639 | 0.9540 | 0.1298 | 0.0 | 
0.7229 | 0.0259 | 0.7771 | 0.0 | 0.0 | 0.9282 | 0.1856 | 0.6414 | 0.4127 | 0.1601 | nan | 0.3675 | 0.6012 | 0.6054 | 0.0 | 0.9586 | 0.8275 | 0.9830 | 0.0156 | 0.2546 | 0.4364 | 0.0 | nan | 0.7593 | 0.8716 | 0.7117 | 0.7337 | 0.3855 | nan | 0.5776 | 0.6147 | 0.0252 | 0.8617 | 0.1121 | 0.0 | 0.4832 | 0.0259 | 0.5267 | 0.0 | 0.0 | 0.7674 | 0.1577 | 0.5443 | 0.3348 | 0.1446 | nan | 0.2485 | 0.4981 | 0.3778 | 0.0 | 0.8851 | 0.7590 | 0.9504 | 0.0125 | 0.1175 | 0.3647 | 0.0 | | 0.0867 | 22.3 | 4460 | 0.6636 | 0.4042 | 0.4808 | 0.8755 | nan | 0.8381 | 0.9610 | 0.7599 | 0.8121 | 0.5300 | nan | 0.6977 | 0.8275 | 0.0271 | 0.9569 | 0.1220 | 0.0 | 0.7163 | 0.0654 | 0.7150 | 0.0 | 0.0 | 0.9224 | 0.2333 | 0.6599 | 0.3537 | 0.1412 | nan | 0.3349 | 0.5894 | 0.5763 | 0.0 | 0.9542 | 0.8557 | 0.9837 | 0.0079 | 0.2481 | 0.4951 | 0.0 | nan | 0.7549 | 0.8665 | 0.7145 | 0.7176 | 0.4058 | nan | 0.5809 | 0.6203 | 0.0147 | 0.8614 | 0.1082 | 0.0 | 0.4864 | 0.0654 | 0.5410 | 0.0 | 0.0 | 0.7728 | 0.2017 | 0.5630 | 0.3100 | 0.1315 | nan | 0.2410 | 0.4916 | 0.3759 | 0.0 | 0.8892 | 0.7703 | 0.9499 | 0.0067 | 0.1155 | 0.3761 | 0.0 | | 0.1021 | 22.4 | 4480 | 0.6624 | 0.4058 | 0.4860 | 0.8766 | nan | 0.8609 | 0.9531 | 0.7598 | 0.8181 | 0.5222 | nan | 0.6821 | 0.8110 | 0.1090 | 0.9507 | 0.1304 | 0.0 | 0.7379 | 0.0732 | 0.7369 | 0.0 | 0.0 | 0.9206 | 0.1910 | 0.6836 | 0.4226 | 0.2346 | nan | 0.2167 | 0.6100 | 0.5774 | 0.0 | 0.9527 | 0.8661 | 0.9817 | 0.0058 | 0.2534 | 0.4897 | 0.0 | nan | 0.7584 | 0.8716 | 0.7144 | 0.7334 | 0.3854 | nan | 0.5682 | 0.5957 | 0.0472 | 0.8653 | 0.1130 | 0.0 | 0.4996 | 0.0732 | 0.5125 | 0.0 | 0.0 | 0.7731 | 0.1765 | 0.5758 | 0.3386 | 0.1997 | nan | 0.1902 | 0.4950 | 0.3736 | 0.0 | 0.8902 | 0.7776 | 0.9509 | 0.0048 | 0.1255 | 0.3748 | 0.0 | | 0.0872 | 22.5 | 4500 | 0.6689 | 0.4063 | 0.4868 | 0.8745 | nan | 0.8574 | 0.9529 | 0.8159 | 0.8035 | 0.5175 | nan | 0.6869 | 0.7413 | 0.2261 | 0.9517 | 0.1434 | 0.0002 | 0.7643 | 0.0877 | 0.6973 | 0.0 | 0.0 | 0.9308 | 0.1619 | 0.5909 | 
0.4018 | 0.3157 | nan | 0.2351 | 0.6278 | 0.5717 | 0.0 | 0.9580 | 0.8582 | 0.9820 | 0.0176 | 0.2352 | 0.4442 | 0.0 | nan | 0.7623 | 0.8723 | 0.6897 | 0.7278 | 0.3847 | nan | 0.5636 | 0.6053 | 0.0667 | 0.8630 | 0.1199 | 0.0002 | 0.5100 | 0.0877 | 0.5358 | 0.0 | 0.0 | 0.7563 | 0.1431 | 0.5301 | 0.3409 | 0.2571 | nan | 0.1989 | 0.4989 | 0.3886 | 0.0 | 0.8874 | 0.7713 | 0.9510 | 0.0122 | 0.1084 | 0.3682 | 0.0 | | 0.0983 | 22.6 | 4520 | 0.7023 | 0.3976 | 0.4705 | 0.8686 | nan | 0.8058 | 0.9578 | 0.8044 | 0.8129 | 0.5313 | nan | 0.6656 | 0.8009 | 0.0983 | 0.9533 | 0.1359 | 0.0 | 0.5326 | 0.0680 | 0.6913 | 0.0 | 0.0016 | 0.9358 | 0.1274 | 0.5976 | 0.3870 | 0.3395 | nan | 0.1922 | 0.6224 | 0.5431 | 0.0 | 0.9529 | 0.8436 | 0.9845 | 0.0118 | 0.2043 | 0.4549 | 0.0 | nan | 0.7301 | 0.8665 | 0.6552 | 0.7167 | 0.3797 | nan | 0.5711 | 0.6217 | 0.0392 | 0.8612 | 0.1163 | 0.0 | 0.4311 | 0.0680 | 0.5479 | 0.0 | 0.0015 | 0.7544 | 0.1128 | 0.5237 | 0.3256 | 0.2716 | nan | 0.1567 | 0.4959 | 0.3898 | 0.0 | 0.8861 | 0.7586 | 0.9494 | 0.0068 | 0.1104 | 0.3756 | 0.0 | | 0.1209 | 22.7 | 4540 | 0.6888 | 0.4058 | 0.4847 | 0.8721 | nan | 0.8228 | 0.9569 | 0.7871 | 0.8190 | 0.5117 | nan | 0.7215 | 0.8375 | 0.0268 | 0.9549 | 0.1433 | 0.0 | 0.5613 | 0.1067 | 0.6863 | 0.0 | 0.0124 | 0.9308 | 0.1523 | 0.6273 | 0.4237 | 0.4561 | nan | 0.2320 | 0.6087 | 0.5332 | 0.0 | 0.9446 | 0.8729 | 0.9841 | 0.0056 | 0.3310 | 0.4608 | 0.0 | nan | 0.7441 | 0.8699 | 0.6695 | 0.7164 | 0.3835 | nan | 0.5778 | 0.6299 | 0.0147 | 0.8619 | 0.1226 | 0.0 | 0.4366 | 0.1066 | 0.5622 | 0.0 | 0.0117 | 0.7632 | 0.1349 | 0.5324 | 0.3502 | 0.3325 | nan | 0.1801 | 0.4954 | 0.3844 | 0.0 | 0.8894 | 0.7696 | 0.9505 | 0.0040 | 0.1223 | 0.3692 | 0.0 | | 0.0824 | 22.8 | 4560 | 0.6736 | 0.3993 | 0.4740 | 0.8745 | nan | 0.8310 | 0.9552 | 0.7977 | 0.8584 | 0.5197 | nan | 0.6920 | 0.8539 | 0.0106 | 0.9566 | 0.1252 | 0.0011 | 0.6397 | 0.0409 | 0.6607 | 0.0 | 0.0032 | 0.9340 | 0.1506 | 0.5987 | 0.3985 | 0.3083 | nan | 0.2090 | 0.5958 | 0.5371 
| 0.0 | 0.9536 | 0.8469 | 0.9855 | 0.0051 | 0.2170 | 0.4835 | 0.0 | nan | 0.7479 | 0.8744 | 0.6568 | 0.7333 | 0.3959 | nan | 0.5785 | 0.6096 | 0.0067 | 0.8609 | 0.1105 | 0.0011 | 0.4306 | 0.0409 | 0.5658 | 0.0 | 0.0031 | 0.7627 | 0.1352 | 0.5206 | 0.3350 | 0.2573 | nan | 0.1663 | 0.4924 | 0.3826 | 0.0 | 0.8878 | 0.7731 | 0.9471 | 0.0040 | 0.1089 | 0.3872 | 0.0 | | 0.0955 | 22.9 | 4580 | 0.6769 | 0.3973 | 0.4726 | 0.8750 | nan | 0.8395 | 0.9518 | 0.8094 | 0.8573 | 0.5389 | nan | 0.6844 | 0.8181 | 0.0240 | 0.9536 | 0.1233 | 0.0 | 0.6615 | 0.0566 | 0.6939 | 0.0 | 0.0015 | 0.9378 | 0.1305 | 0.6115 | 0.3674 | 0.2542 | nan | 0.1922 | 0.5850 | 0.5552 | 0.0 | 0.9538 | 0.8550 | 0.9818 | 0.0104 | 0.2114 | 0.4616 | 0.0 | nan | 0.7574 | 0.8781 | 0.6485 | 0.7476 | 0.4006 | nan | 0.5743 | 0.6027 | 0.0124 | 0.8644 | 0.1103 | 0.0 | 0.4160 | 0.0566 | 0.5664 | 0.0 | 0.0014 | 0.7584 | 0.1177 | 0.5146 | 0.3165 | 0.2226 | nan | 0.1647 | 0.4900 | 0.3889 | 0.0 | 0.8875 | 0.7725 | 0.9506 | 0.0059 | 0.1090 | 0.3789 | 0.0 | | 0.1196 | 23.0 | 4600 | 0.6728 | 0.4035 | 0.4759 | 0.8745 | nan | 0.8255 | 0.9565 | 0.7957 | 0.8567 | 0.5352 | nan | 0.7149 | 0.7660 | 0.0164 | 0.9594 | 0.1322 | 0.0020 | 0.7373 | 0.0622 | 0.6802 | 0.0 | 0.0034 | 0.9197 | 0.1516 | 0.6637 | 0.3872 | 0.2374 | nan | 0.2423 | 0.5818 | 0.5662 | 0.0 | 0.9616 | 0.8030 | 0.9793 | 0.0105 | 0.1726 | 0.5070 | 0.0 | nan | 0.7482 | 0.8715 | 0.6640 | 0.7457 | 0.4033 | nan | 0.5877 | 0.6187 | 0.0062 | 0.8592 | 0.1182 | 0.0020 | 0.4994 | 0.0622 | 0.5617 | 0.0 | 0.0033 | 0.7726 | 0.1381 | 0.5452 | 0.3192 | 0.2098 | nan | 0.1879 | 0.4902 | 0.3920 | 0.0 | 0.8796 | 0.7461 | 0.9529 | 0.0078 | 0.1243 | 0.3965 | 0.0 | | 0.1144 | 23.1 | 4620 | 0.6609 | 0.4076 | 0.4822 | 0.8764 | nan | 0.8449 | 0.9573 | 0.7680 | 0.8435 | 0.5289 | nan | 0.6911 | 0.8111 | 0.0271 | 0.9566 | 0.1334 | 0.0065 | 0.7120 | 0.1387 | 0.6763 | 0.0 | 0.0017 | 0.9177 | 0.1278 | 0.6233 | 0.4501 | 0.2858 | nan | 0.1839 | 0.6163 | 0.5656 | 0.0 | 0.9523 | 0.8649 | 0.9804 | 
0.0073 | 0.2385 | 0.5180 | 0.0 | nan | 0.7512 | 0.8727 | 0.6915 | 0.7513 | 0.4048 | nan | 0.5798 | 0.6187 | 0.0118 | 0.8619 | 0.1208 | 0.0065 | 0.4702 | 0.1386 | 0.5600 | 0.0 | 0.0016 | 0.7692 | 0.1193 | 0.5338 | 0.3470 | 0.2441 | nan | 0.1581 | 0.4946 | 0.3897 | 0.0 | 0.8891 | 0.7725 | 0.9528 | 0.0054 | 0.1337 | 0.3932 | 0.0 | | 0.1147 | 23.2 | 4640 | 0.6673 | 0.4042 | 0.4842 | 0.8765 | nan | 0.8462 | 0.9556 | 0.7864 | 0.8425 | 0.5362 | nan | 0.7036 | 0.8250 | 0.0328 | 0.9598 | 0.1363 | 0.0 | 0.7146 | 0.0945 | 0.6819 | 0.0 | 0.0 | 0.9310 | 0.1955 | 0.6343 | 0.4073 | 0.2949 | nan | 0.2135 | 0.6039 | 0.5779 | 0.0 | 0.9519 | 0.8486 | 0.9804 | 0.0079 | 0.2781 | 0.4553 | 0.0 | nan | 0.7555 | 0.8744 | 0.6860 | 0.7501 | 0.3970 | nan | 0.5739 | 0.6089 | 0.0172 | 0.8584 | 0.1163 | 0.0 | 0.4415 | 0.0944 | 0.5553 | 0.0 | 0.0 | 0.7657 | 0.1683 | 0.5395 | 0.3418 | 0.2349 | nan | 0.1732 | 0.4906 | 0.3635 | 0.0 | 0.8903 | 0.7753 | 0.9524 | 0.0064 | 0.1280 | 0.3766 | 0.0 | | 0.0797 | 23.3 | 4660 | 0.6934 | 0.4041 | 0.4799 | 0.8736 | nan | 0.8226 | 0.9545 | 0.7549 | 0.8589 | 0.5554 | nan | 0.6764 | 0.8197 | 0.0164 | 0.9529 | 0.1220 | 0.0 | 0.6507 | 0.0652 | 0.6900 | 0.0 | 0.0 | 0.9273 | 0.1986 | 0.6386 | 0.3753 | 0.2942 | nan | 0.2419 | 0.6171 | 0.5872 | 0.0 | 0.9531 | 0.8602 | 0.9845 | 0.0076 | 0.2604 | 0.4724 | 0.0 | nan | 0.7388 | 0.8706 | 0.6904 | 0.7298 | 0.3955 | nan | 0.5787 | 0.6149 | 0.0105 | 0.8611 | 0.1066 | 0.0 | 0.4802 | 0.0651 | 0.5646 | 0.0 | 0.0 | 0.7635 | 0.1704 | 0.5359 | 0.3274 | 0.2392 | nan | 0.1948 | 0.4917 | 0.3767 | 0.0 | 0.8902 | 0.7752 | 0.9505 | 0.0069 | 0.1253 | 0.3780 | 0.0 | | 0.0965 | 23.4 | 4680 | 0.7067 | 0.4014 | 0.4765 | 0.8717 | nan | 0.8330 | 0.9523 | 0.7460 | 0.8488 | 0.5634 | nan | 0.6929 | 0.8100 | 0.0555 | 0.9566 | 0.1322 | 0.0 | 0.6135 | 0.1201 | 0.7288 | 0.0 | 0.0 | 0.9314 | 0.1291 | 0.5895 | 0.3716 | 0.3378 | nan | 0.2263 | 0.5871 | 0.5389 | 0.0 | 0.9494 | 0.8527 | 0.9822 | 0.0012 | 0.2427 | 0.4541 | 0.0 | nan | 0.7438 | 0.8718 | 0.6884 
| 0.7370 | 0.3878 | nan | 0.5796 | 0.6060 | 0.0254 | 0.8590 | 0.1164 | 0.0 | 0.4471 | 0.1201 | 0.5629 | 0.0 | 0.0 | 0.7508 | 0.1147 | 0.4962 | 0.3251 | 0.2780 | nan | 0.1828 | 0.4891 | 0.3770 | 0.0 | 0.8855 | 0.7589 | 0.9526 | 0.0011 | 0.1225 | 0.3662 | 0.0 | | 0.1366 | 23.5 | 4700 | 0.6898 | 0.4134 | 0.4982 | 0.8730 | nan | 0.8253 | 0.9597 | 0.7906 | 0.8212 | 0.5107 | nan | 0.6995 | 0.8227 | 0.0240 | 0.9501 | 0.1438 | 0.0149 | 0.8054 | 0.2313 | 0.7367 | 0.0 | 0.0085 | 0.8985 | 0.3260 | 0.6925 | 0.3509 | 0.2820 | nan | 0.3289 | 0.6285 | 0.5403 | 0.0 | 0.9552 | 0.8376 | 0.9854 | 0.0036 | 0.2508 | 0.5173 | 0.0 | nan | 0.7431 | 0.8684 | 0.6978 | 0.7212 | 0.3905 | nan | 0.5702 | 0.6110 | 0.0107 | 0.8661 | 0.1260 | 0.0148 | 0.4738 | 0.2312 | 0.5671 | 0.0 | 0.0082 | 0.7703 | 0.2687 | 0.5575 | 0.3061 | 0.2209 | nan | 0.2486 | 0.4899 | 0.3764 | 0.0 | 0.8827 | 0.7573 | 0.9497 | 0.0033 | 0.1107 | 0.3857 | 0.0 | | 0.0908 | 23.6 | 4720 | 0.6770 | 0.4098 | 0.4881 | 0.8756 | nan | 0.8416 | 0.9511 | 0.7898 | 0.8537 | 0.5607 | nan | 0.6894 | 0.8402 | 0.0040 | 0.9572 | 0.1218 | 0.0037 | 0.6462 | 0.1959 | 0.6786 | 0.0 | 0.0095 | 0.9199 | 0.2052 | 0.6373 | 0.3794 | 0.3195 | nan | 0.3122 | 0.6081 | 0.5116 | 0.0 | 0.9548 | 0.8582 | 0.9822 | 0.0023 | 0.3116 | 0.4751 | 0.0 | nan | 0.7554 | 0.8787 | 0.6755 | 0.7475 | 0.3776 | nan | 0.5807 | 0.6166 | 0.0025 | 0.8624 | 0.1074 | 0.0037 | 0.4293 | 0.1958 | 0.5660 | 0.0 | 0.0091 | 0.7657 | 0.1809 | 0.5559 | 0.3133 | 0.2630 | nan | 0.2546 | 0.4961 | 0.3786 | 0.0 | 0.8875 | 0.7730 | 0.9517 | 0.0019 | 0.1079 | 0.3751 | 0.0 | | 0.1014 | 23.7 | 4740 | 0.6535 | 0.4063 | 0.4823 | 0.8735 | nan | 0.8430 | 0.9501 | 0.8058 | 0.8021 | 0.5354 | nan | 0.7165 | 0.8378 | 0.0109 | 0.9548 | 0.1042 | 0.0133 | 0.6325 | 0.0786 | 0.6767 | 0.0 | 0.0003 | 0.9215 | 0.1791 | 0.6628 | 0.3357 | 0.3620 | nan | 0.3204 | 0.6167 | 0.5248 | 0.0 | 0.9541 | 0.8546 | 0.9855 | 0.0132 | 0.2390 | 0.5036 | 0.0 | nan | 0.7544 | 0.8748 | 0.6597 | 0.7311 | 0.3679 | nan | 0.5745 | 
0.6423 | 0.0075 | 0.8625 | 0.0925 | 0.0133 | 0.4476 | 0.0785 | 0.5741 | 0.0 | 0.0003 | 0.7645 | 0.1578 | 0.5596 | 0.2964 | 0.2935 | nan | 0.2556 | 0.4971 | 0.3898 | 0.0 | 0.8873 | 0.7724 | 0.9498 | 0.0085 | 0.1011 | 0.3875 | 0.0 | | 0.102 | 23.8 | 4760 | 0.6747 | 0.4017 | 0.4770 | 0.8742 | nan | 0.8188 | 0.9561 | 0.8030 | 0.8568 | 0.5305 | nan | 0.6926 | 0.8115 | 0.0258 | 0.9537 | 0.0933 | 0.0 | 0.6117 | 0.0918 | 0.7096 | 0.0 | 0.0003 | 0.9281 | 0.1678 | 0.6539 | 0.3376 | 0.3058 | nan | 0.3111 | 0.6180 | 0.5367 | 0.0 | 0.9519 | 0.8692 | 0.9838 | 0.0037 | 0.1622 | 0.4779 | 0.0 | nan | 0.7398 | 0.8785 | 0.6542 | 0.7370 | 0.3722 | nan | 0.5798 | 0.5885 | 0.0108 | 0.8663 | 0.0825 | 0.0 | 0.4224 | 0.0918 | 0.5818 | 0.0 | 0.0003 | 0.7673 | 0.1502 | 0.5593 | 0.2970 | 0.2544 | nan | 0.2420 | 0.4978 | 0.3933 | 0.0 | 0.8884 | 0.7793 | 0.9499 | 0.0026 | 0.0884 | 0.3787 | 0.0 | | 0.1469 | 23.9 | 4780 | 0.6532 | 0.4060 | 0.4836 | 0.8787 | nan | 0.8692 | 0.9580 | 0.7645 | 0.8406 | 0.5217 | nan | 0.6817 | 0.7427 | 0.0926 | 0.9555 | 0.1089 | 0.0 | 0.5908 | 0.1198 | 0.7532 | 0.0 | 0.0020 | 0.9314 | 0.2335 | 0.6322 | 0.3620 | 0.3100 | nan | 0.3448 | 0.5898 | 0.5816 | 0.0 | 0.9518 | 0.8482 | 0.9816 | 0.0056 | 0.2243 | 0.4756 | 0.0 | nan | 0.7724 | 0.8805 | 0.6834 | 0.7630 | 0.3912 | nan | 0.5737 | 0.5794 | 0.0273 | 0.8663 | 0.0947 | 0.0 | 0.4012 | 0.1198 | 0.5659 | 0.0 | 0.0020 | 0.7664 | 0.2007 | 0.5522 | 0.3059 | 0.2548 | nan | 0.2398 | 0.4910 | 0.3870 | 0.0 | 0.8851 | 0.7671 | 0.9514 | 0.0042 | 0.0953 | 0.3706 | 0.0 | | 0.0832 | 24.0 | 4800 | 0.6606 | 0.4047 | 0.4815 | 0.8758 | nan | 0.8290 | 0.9556 | 0.7845 | 0.8629 | 0.5526 | nan | 0.6877 | 0.8289 | 0.0126 | 0.9567 | 0.1224 | 0.0 | 0.6626 | 0.1311 | 0.6637 | 0.0 | 0.0090 | 0.9326 | 0.0793 | 0.6616 | 0.3595 | 0.3732 | nan | 0.2827 | 0.6086 | 0.5777 | 0.0 | 0.9524 | 0.8613 | 0.9847 | 0.0140 | 0.2216 | 0.4388 | 0.0 | nan | 0.7495 | 0.8800 | 0.6644 | 0.7370 | 0.3917 | nan | 0.5861 | 0.6419 | 0.0082 | 0.8640 | 0.1073 | 0.0 | 0.4011 | 
0.1310 | 0.5753 | 0.0 | 0.0089 | 0.7656 | 0.0747 | 0.5575 | 0.3065 | 0.2953 | nan | 0.2261 | 0.4960 | 0.3872 | 0.0 | 0.8866 | 0.7757 | 0.9502 | 0.0096 | 0.1024 | 0.3698 | 0.0 | | 0.1054 | 24.1 | 4820 | 0.6621 | 0.4131 | 0.4917 | 0.8779 | nan | 0.8668 | 0.9545 | 0.7446 | 0.8404 | 0.5394 | nan | 0.6960 | 0.8436 | 0.0180 | 0.9535 | 0.1369 | 0.0 | 0.6041 | 0.2740 | 0.6958 | 0.0 | 0.0260 | 0.9239 | 0.2214 | 0.6427 | 0.3805 | 0.2423 | nan | 0.4048 | 0.5851 | 0.5905 | 0.0 | 0.9549 | 0.8283 | 0.9830 | 0.0152 | 0.2682 | 0.5001 | 0.0 | nan | 0.7693 | 0.8823 | 0.6863 | 0.7551 | 0.3808 | nan | 0.5851 | 0.6125 | 0.0100 | 0.8680 | 0.1182 | 0.0 | 0.3845 | 0.2736 | 0.5852 | 0.0 | 0.0255 | 0.7721 | 0.1944 | 0.5505 | 0.3180 | 0.2073 | nan | 0.2891 | 0.4955 | 0.3669 | 0.0 | 0.8834 | 0.7560 | 0.9512 | 0.0100 | 0.1112 | 0.3785 | 0.0 | | 0.0628 | 24.2 | 4840 | 0.6659 | 0.4126 | 0.4973 | 0.8764 | nan | 0.8482 | 0.9532 | 0.8024 | 0.8445 | 0.5336 | nan | 0.6979 | 0.8672 | 0.0248 | 0.9586 | 0.1514 | 0.0 | 0.6141 | 0.2115 | 0.6955 | 0.0 | 0.0052 | 0.9133 | 0.2168 | 0.6544 | 0.4175 | 0.3947 | nan | 0.3573 | 0.6188 | 0.5859 | 0.0 | 0.9544 | 0.8420 | 0.9826 | 0.0076 | 0.2786 | 0.4811 | 0.0 | nan | 0.7621 | 0.8762 | 0.6918 | 0.7523 | 0.3880 | nan | 0.5765 | 0.5741 | 0.0149 | 0.8643 | 0.1291 | 0.0 | 0.4119 | 0.2112 | 0.5678 | 0.0 | 0.0052 | 0.7720 | 0.1904 | 0.5424 | 0.3399 | 0.3102 | nan | 0.2791 | 0.4959 | 0.3576 | 0.0 | 0.8856 | 0.7586 | 0.9511 | 0.0060 | 0.1135 | 0.3753 | 0.0 | | 0.0991 | 24.3 | 4860 | 0.6955 | 0.4082 | 0.4875 | 0.8761 | nan | 0.8536 | 0.9518 | 0.8091 | 0.8414 | 0.5575 | nan | 0.6870 | 0.8289 | 0.0324 | 0.9480 | 0.1085 | 0.0 | 0.5813 | 0.1625 | 0.7587 | 0.0 | 0.0232 | 0.9303 | 0.2009 | 0.6298 | 0.3871 | 0.4062 | nan | 0.2810 | 0.5868 | 0.5593 | 0.0 | 0.9548 | 0.8246 | 0.9849 | 0.0032 | 0.2298 | 0.4767 | 0.0 | nan | 0.7655 | 0.8783 | 0.6721 | 0.7540 | 0.3832 | nan | 0.5760 | 0.5723 | 0.0179 | 0.8676 | 0.0964 | 0.0 | 0.4293 | 0.1624 | 0.5609 | 0.0 | 0.0230 | 0.7676 | 0.1764 | 
0.5444 | 0.3243 | 0.3140 | nan | 0.2434 | 0.4931 | 0.3775 | 0.0 | 0.8835 | 0.7535 | 0.9503 | 0.0026 | 0.0972 | 0.3749 | 0.0 | | 0.1244 | 24.4 | 4880 | 0.6913 | 0.4070 | 0.4834 | 0.8750 | nan | 0.8484 | 0.9514 | 0.7952 | 0.8303 | 0.5339 | nan | 0.7220 | 0.8061 | 0.0202 | 0.9484 | 0.1076 | 0.0185 | 0.5105 | 0.1573 | 0.7423 | 0.0 | 0.0147 | 0.9346 | 0.2283 | 0.6113 | 0.3916 | 0.3409 | nan | 0.3059 | 0.6206 | 0.5449 | 0.0 | 0.9550 | 0.8421 | 0.9806 | 0.0050 | 0.2446 | 0.4554 | 0.0 | nan | 0.7589 | 0.8740 | 0.6844 | 0.7453 | 0.3829 | nan | 0.5773 | 0.6150 | 0.0107 | 0.8623 | 0.0949 | 0.0184 | 0.3722 | 0.1573 | 0.5755 | 0.0 | 0.0146 | 0.7645 | 0.1975 | 0.5299 | 0.3193 | 0.2635 | nan | 0.2408 | 0.4983 | 0.3800 | 0.0 | 0.8852 | 0.7619 | 0.9523 | 0.0040 | 0.1092 | 0.3730 | 0.0 | | 0.1156 | 24.5 | 4900 | 0.6872 | 0.4104 | 0.4857 | 0.8740 | nan | 0.8472 | 0.9539 | 0.8035 | 0.7991 | 0.5354 | nan | 0.6961 | 0.8381 | 0.0211 | 0.9500 | 0.1042 | 0.0164 | 0.5515 | 0.2262 | 0.7248 | 0.0 | 0.0216 | 0.9260 | 0.1495 | 0.6373 | 0.4013 | 0.3315 | nan | 0.3628 | 0.5909 | 0.5302 | 0.0 | 0.9539 | 0.8443 | 0.9846 | 0.0044 | 0.2430 | 0.4948 | 0.0 | nan | 0.7509 | 0.8737 | 0.6822 | 0.7162 | 0.3894 | nan | 0.5797 | 0.6099 | 0.0130 | 0.8638 | 0.0916 | 0.0164 | 0.4414 | 0.2259 | 0.5915 | 0.0 | 0.0215 | 0.7678 | 0.1346 | 0.5359 | 0.3177 | 0.2581 | nan | 0.2672 | 0.4943 | 0.3782 | 0.0 | 0.8868 | 0.7687 | 0.9510 | 0.0034 | 0.1181 | 0.3842 | 0.0 | | 0.1161 | 24.6 | 4920 | 0.6739 | 0.4109 | 0.4828 | 0.8753 | nan | 0.8585 | 0.9563 | 0.7776 | 0.8028 | 0.5214 | nan | 0.6980 | 0.8283 | 0.0100 | 0.9572 | 0.1064 | 0.0 | 0.5024 | 0.2197 | 0.7194 | 0.0 | 0.0005 | 0.9343 | 0.1388 | 0.6306 | 0.4056 | 0.2956 | nan | 0.4097 | 0.6059 | 0.5564 | 0.0 | 0.9504 | 0.8251 | 0.9853 | 0.0201 | 0.2503 | 0.4829 | 0.0 | nan | 0.7581 | 0.8764 | 0.6901 | 0.7238 | 0.4003 | nan | 0.5772 | 0.6326 | 0.0064 | 0.8622 | 0.0935 | 0.0 | 0.4033 | 0.2191 | 0.5943 | 0.0 | 0.0005 | 0.7664 | 0.1255 | 0.5445 | 0.3369 | 0.2515 | nan | 0.3306 
| 0.5028 | 0.3723 | 0.0 | 0.8853 | 0.7556 | 0.9504 | 0.0122 | 0.1017 | 0.3758 | 0.0 | | 0.0637 | 24.7 | 4940 | 0.6632 | 0.4097 | 0.4835 | 0.8764 | nan | 0.8464 | 0.9557 | 0.8120 | 0.8365 | 0.5221 | nan | 0.6857 | 0.8429 | 0.0299 | 0.9485 | 0.1338 | 0.0 | 0.4987 | 0.1103 | 0.7168 | 0.0 | 0.0032 | 0.9244 | 0.1512 | 0.6448 | 0.3985 | 0.3803 | nan | 0.3515 | 0.6227 | 0.5915 | 0.0 | 0.9588 | 0.8445 | 0.9812 | 0.0259 | 0.1926 | 0.4600 | 0.0 | nan | 0.7592 | 0.8776 | 0.6662 | 0.7446 | 0.4036 | nan | 0.5652 | 0.6249 | 0.0187 | 0.8637 | 0.1150 | 0.0 | 0.4005 | 0.1101 | 0.5944 | 0.0 | 0.0032 | 0.7679 | 0.1344 | 0.5437 | 0.3184 | 0.3012 | nan | 0.2867 | 0.4977 | 0.3855 | 0.0 | 0.8851 | 0.7606 | 0.9509 | 0.0149 | 0.1346 | 0.3814 | 0.0 | | 0.0985 | 24.8 | 4960 | 0.6682 | 0.4116 | 0.4878 | 0.8794 | nan | 0.8668 | 0.9547 | 0.7974 | 0.8458 | 0.5332 | nan | 0.6825 | 0.8161 | 0.0215 | 0.9582 | 0.1394 | 0.0 | 0.6203 | 0.1289 | 0.6870 | 0.0 | 0.0065 | 0.9294 | 0.1776 | 0.6455 | 0.3973 | 0.4017 | nan | 0.2997 | 0.6134 | 0.5755 | 0.0 | 0.9516 | 0.8513 | 0.9842 | 0.0146 | 0.2445 | 0.4660 | 0.0 | nan | 0.7747 | 0.8812 | 0.6855 | 0.7584 | 0.3930 | nan | 0.5663 | 0.6446 | 0.0122 | 0.8604 | 0.1209 | 0.0 | 0.4040 | 0.1288 | 0.5813 | 0.0 | 0.0065 | 0.7709 | 0.1549 | 0.5461 | 0.3154 | 0.3114 | nan | 0.2470 | 0.4980 | 0.3755 | 0.0 | 0.8899 | 0.7729 | 0.9505 | 0.0092 | 0.1344 | 0.3784 | 0.0 | | 0.0761 | 24.9 | 4980 | 0.6739 | 0.4110 | 0.4890 | 0.8768 | nan | 0.8640 | 0.9506 | 0.7996 | 0.8113 | 0.5416 | nan | 0.6980 | 0.8474 | 0.0488 | 0.9506 | 0.1134 | 0.0036 | 0.6406 | 0.1765 | 0.7130 | 0.0 | 0.0 | 0.9279 | 0.2156 | 0.6395 | 0.3693 | 0.2851 | nan | 0.3728 | 0.5819 | 0.5480 | 0.0 | 0.9541 | 0.8624 | 0.9829 | 0.0202 | 0.2360 | 0.4945 | 0.0 | nan | 0.7687 | 0.8807 | 0.6984 | 0.7350 | 0.3854 | nan | 0.5674 | 0.6164 | 0.0271 | 0.8653 | 0.0989 | 0.0036 | 0.4341 | 0.1763 | 0.5854 | 0.0 | 0.0 | 0.7705 | 0.1856 | 0.5469 | 0.3020 | 0.2374 | nan | 0.2850 | 0.4921 | 0.3710 | 0.0 | 0.8867 | 0.7714 | 0.9507 | 
0.0110 | 0.1154 | 0.3827 | 0.0 | | 0.0735 | 25.0 | 5000 | 0.6830 | 0.4082 | 0.4872 | 0.8746 | nan | 0.8526 | 0.9559 | 0.8056 | 0.7894 | 0.5199 | nan | 0.6850 | 0.8030 | 0.1254 | 0.9522 | 0.1047 | 0.0 | 0.6198 | 0.1374 | 0.7549 | 0.0 | 0.0015 | 0.9279 | 0.1950 | 0.6612 | 0.3706 | 0.2416 | nan | 0.3857 | 0.6293 | 0.5656 | 0.0 | 0.9538 | 0.8347 | 0.9810 | 0.0132 | 0.2424 | 0.4825 | 0.0 | nan | 0.7567 | 0.8744 | 0.6918 | 0.7206 | 0.3850 | nan | 0.5699 | 0.6128 | 0.0516 | 0.8634 | 0.0923 | 0.0 | 0.4538 | 0.1373 | 0.5654 | 0.0 | 0.0014 | 0.7704 | 0.1717 | 0.5527 | 0.3110 | 0.2065 | nan | 0.2869 | 0.4934 | 0.3816 | 0.0 | 0.8848 | 0.7593 | 0.9523 | 0.0082 | 0.1241 | 0.3836 | 0.0 | | 0.0938 | 25.1 | 5020 | 0.6645 | 0.4116 | 0.4882 | 0.8781 | nan | 0.8499 | 0.9551 | 0.8119 | 0.8610 | 0.5020 | nan | 0.6796 | 0.8278 | 0.0481 | 0.9580 | 0.1147 | 0.0 | 0.6345 | 0.1092 | 0.7140 | 0.0 | 0.0107 | 0.9178 | 0.1656 | 0.6692 | 0.4067 | 0.3954 | nan | 0.2975 | 0.6107 | 0.5512 | 0.0 | 0.9543 | 0.8624 | 0.9816 | 0.0058 | 0.2392 | 0.4898 | 0.0 | nan | 0.7644 | 0.8812 | 0.6835 | 0.7389 | 0.3873 | nan | 0.5716 | 0.6184 | 0.0260 | 0.8597 | 0.1006 | 0.0 | 0.4904 | 0.1090 | 0.5648 | 0.0 | 0.0104 | 0.7717 | 0.1496 | 0.5551 | 0.3267 | 0.3090 | nan | 0.2399 | 0.5007 | 0.3807 | 0.0 | 0.8883 | 0.7700 | 0.9509 | 0.0040 | 0.1360 | 0.3809 | 0.0 | | 0.093 | 25.2 | 5040 | 0.6889 | 0.4094 | 0.4870 | 0.8747 | nan | 0.8459 | 0.9616 | 0.7975 | 0.7967 | 0.5299 | nan | 0.6816 | 0.8436 | 0.0369 | 0.9504 | 0.1119 | 0.0 | 0.5495 | 0.1542 | 0.7090 | 0.0 | 0.0110 | 0.9289 | 0.2162 | 0.6300 | 0.3669 | 0.4266 | nan | 0.3106 | 0.6245 | 0.5737 | 0.0 | 0.9493 | 0.8377 | 0.9829 | 0.0034 | 0.2728 | 0.4795 | 0.0 | nan | 0.7568 | 0.8718 | 0.6990 | 0.7256 | 0.3905 | nan | 0.5750 | 0.5860 | 0.0194 | 0.8629 | 0.0962 | 0.0 | 0.4309 | 0.1541 | 0.5898 | 0.0 | 0.0107 | 0.7672 | 0.1901 | 0.5447 | 0.3203 | 0.3227 | nan | 0.2274 | 0.4974 | 0.3704 | 0.0 | 0.8879 | 0.7670 | 0.9509 | 0.0023 | 0.1111 | 0.3720 | 0.0 | | 0.075 | 25.3 | 
5060 | 0.6667 | 0.4123 | 0.4879 | 0.8759 | nan | 0.8341 | 0.9488 | 0.8000 | 0.8593 | 0.5572 | nan | 0.7305 | 0.8057 | 0.0129 | 0.9501 | 0.1211 | 0.0 | 0.5537 | 0.1523 | 0.6850 | 0.0 | 0.0085 | 0.9328 | 0.1743 | 0.6494 | 0.3818 | 0.4368 | nan | 0.3142 | 0.5996 | 0.5596 | 0.0 | 0.9507 | 0.8710 | 0.9842 | 0.0056 | 0.2599 | 0.4734 | 0.0 | nan | 0.7514 | 0.8764 | 0.7078 | 0.7405 | 0.3926 | nan | 0.5763 | 0.6279 | 0.0074 | 0.8605 | 0.1038 | 0.0 | 0.4455 | 0.1521 | 0.5823 | 0.0 | 0.0081 | 0.7689 | 0.1574 | 0.5531 | 0.3275 | 0.3353 | nan | 0.2247 | 0.4981 | 0.3831 | 0.0 | 0.8894 | 0.7808 | 0.9505 | 0.0039 | 0.1115 | 0.3755 | 0.0 | | 0.0883 | 25.4 | 5080 | 0.6826 | 0.4071 | 0.4811 | 0.8782 | nan | 0.8498 | 0.9660 | 0.7923 | 0.8490 | 0.4725 | nan | 0.6617 | 0.8420 | 0.0195 | 0.9434 | 0.1262 | 0.0 | 0.6498 | 0.1398 | 0.6956 | 0.0 | 0.0 | 0.9290 | 0.1854 | 0.6294 | 0.3749 | 0.3810 | nan | 0.1998 | 0.6090 | 0.5451 | 0.0 | 0.9577 | 0.8428 | 0.9847 | 0.0020 | 0.2837 | 0.4641 | 0.0 | nan | 0.7571 | 0.8752 | 0.7074 | 0.7545 | 0.3956 | nan | 0.5682 | 0.5973 | 0.0116 | 0.8641 | 0.1088 | 0.0 | 0.4206 | 0.1397 | 0.5925 | 0.0 | 0.0 | 0.7698 | 0.1680 | 0.5493 | 0.3142 | 0.3030 | nan | 0.1681 | 0.4992 | 0.3759 | 0.0 | 0.8855 | 0.7756 | 0.9511 | 0.0017 | 0.1063 | 0.3670 | 0.0 | | 0.0816 | 25.5 | 5100 | 0.6807 | 0.4114 | 0.4955 | 0.8749 | nan | 0.8345 | 0.9478 | 0.8324 | 0.8536 | 0.5572 | nan | 0.7033 | 0.7725 | 0.1826 | 0.9557 | 0.1266 | 0.0 | 0.7428 | 0.0977 | 0.7280 | 0.0 | 0.0050 | 0.9258 | 0.2657 | 0.6480 | 0.3696 | 0.3062 | nan | 0.3317 | 0.6083 | 0.5698 | 0.0 | 0.9562 | 0.8370 | 0.9822 | 0.0077 | 0.2130 | 0.4945 | 0.0 | nan | 0.7530 | 0.8759 | 0.6533 | 0.7513 | 0.4000 | nan | 0.5683 | 0.6028 | 0.0512 | 0.8618 | 0.1103 | 0.0 | 0.5336 | 0.0977 | 0.5827 | 0.0 | 0.0049 | 0.7748 | 0.2246 | 0.5527 | 0.3123 | 0.2552 | nan | 0.2516 | 0.4962 | 0.3681 | 0.0 | 0.8845 | 0.7606 | 0.9528 | 0.0048 | 0.0989 | 0.3816 | 0.0 | | 0.0916 | 25.6 | 5120 | 0.7149 | 0.4123 | 0.4943 | 0.8733 | nan | 0.8150 | 
0.9578 | 0.8043 | 0.8584 | 0.5208 | nan | 0.6757 | 0.7541 | 0.2774 | 0.9568 | 0.1717 | 0.0 | 0.6621 | 0.0629 | 0.7186 | 0.0 | 0.0065 | 0.9310 | 0.2619 | 0.6287 | 0.3896 | 0.3775 | nan | 0.2891 | 0.6290 | 0.5645 | 0.0 | 0.9525 | 0.8389 | 0.9825 | 0.0055 | 0.2428 | 0.4810 | 0.0 | nan | 0.7377 | 0.8732 | 0.6804 | 0.7161 | 0.3912 | nan | 0.5710 | 0.6170 | 0.0662 | 0.8598 | 0.1508 | 0.0 | 0.4993 | 0.0629 | 0.5755 | 0.0 | 0.0064 | 0.7720 | 0.2194 | 0.5453 | 0.3182 | 0.3081 | nan | 0.2384 | 0.4975 | 0.3752 | 0.0 | 0.8870 | 0.7674 | 0.9532 | 0.0039 | 0.1123 | 0.3868 | 0.0 | | 0.0756 | 25.7 | 5140 | 0.6823 | 0.4106 | 0.4847 | 0.8764 | nan | 0.8456 | 0.9563 | 0.8080 | 0.8325 | 0.5236 | nan | 0.7090 | 0.8032 | 0.0222 | 0.9552 | 0.1291 | 0.0 | 0.6489 | 0.1083 | 0.6883 | 0.0 | 0.0041 | 0.9370 | 0.2403 | 0.6077 | 0.3606 | 0.3683 | nan | 0.3181 | 0.6060 | 0.5516 | 0.0 | 0.9540 | 0.8402 | 0.9832 | 0.0093 | 0.2087 | 0.4923 | 0.0 | nan | 0.7600 | 0.8755 | 0.7023 | 0.7397 | 0.4002 | nan | 0.5797 | 0.6311 | 0.0085 | 0.8632 | 0.1114 | 0.0 | 0.4545 | 0.1082 | 0.5811 | 0.0 | 0.0039 | 0.7603 | 0.2042 | 0.5239 | 0.3056 | 0.2967 | nan | 0.2467 | 0.4979 | 0.3758 | 0.0 | 0.8863 | 0.7610 | 0.9513 | 0.0073 | 0.1088 | 0.3932 | 0.0 | | 0.0803 | 25.8 | 5160 | 0.6809 | 0.4101 | 0.4872 | 0.8765 | nan | 0.8516 | 0.9534 | 0.7921 | 0.8191 | 0.5660 | nan | 0.6904 | 0.8421 | 0.0473 | 0.9555 | 0.1229 | 0.0 | 0.6922 | 0.0765 | 0.7293 | 0.0 | 0.0033 | 0.9232 | 0.1971 | 0.6275 | 0.3768 | 0.3648 | nan | 0.2874 | 0.5885 | 0.5604 | 0.0 | 0.9569 | 0.8549 | 0.9839 | 0.0082 | 0.2141 | 0.5045 | 0.0 | nan | 0.7598 | 0.8736 | 0.7219 | 0.7322 | 0.4040 | nan | 0.5802 | 0.6153 | 0.0208 | 0.8647 | 0.1069 | 0.0 | 0.4975 | 0.0765 | 0.5739 | 0.0 | 0.0033 | 0.7707 | 0.1722 | 0.5427 | 0.3144 | 0.2992 | nan | 0.2225 | 0.4932 | 0.3685 | 0.0 | 0.8873 | 0.7701 | 0.9497 | 0.0060 | 0.1062 | 0.3894 | 0.0 | | 0.0942 | 25.9 | 5180 | 0.6411 | 0.4150 | 0.4918 | 0.8804 | nan | 0.8641 | 0.9544 | 0.8129 | 0.8409 | 0.5218 | nan | 0.7097 | 
0.8362 | 0.0164 | 0.9534 | 0.1471 | 0.0 | 0.6927 | 0.0914 | 0.7230 | 0.0 | 0.0008 | 0.9245 | 0.2612 | 0.6589 | 0.3817 | 0.3129 | nan | 0.2753 | 0.6200 | 0.5529 | 0.0 | 0.9531 | 0.8723 | 0.9840 | 0.0115 | 0.2889 | 0.4774 | 0.0 | nan | 0.7693 | 0.8800 | 0.7346 | 0.7522 | 0.3931 | nan | 0.5823 | 0.6294 | 0.0100 | 0.8652 | 0.1280 | 0.0 | 0.4820 | 0.0913 | 0.5905 | 0.0 | 0.0008 | 0.7745 | 0.2181 | 0.5536 | 0.3162 | 0.2624 | nan | 0.2308 | 0.5001 | 0.3704 | 0.0 | 0.8909 | 0.7826 | 0.9502 | 0.0081 | 0.1282 | 0.3862 | 0.0 | | 0.0807 | 26.0 | 5200 | 0.6570 | 0.4128 | 0.4904 | 0.8781 | nan | 0.8506 | 0.9551 | 0.8030 | 0.8504 | 0.5486 | nan | 0.6728 | 0.8390 | 0.0246 | 0.9582 | 0.1389 | 0.0 | 0.7115 | 0.0633 | 0.7197 | 0.0 | 0.0044 | 0.9195 | 0.2165 | 0.6409 | 0.3865 | 0.4242 | nan | 0.2336 | 0.6239 | 0.5434 | 0.0 | 0.9546 | 0.8558 | 0.9826 | 0.0088 | 0.2644 | 0.4987 | 0.0 | nan | 0.7612 | 0.8776 | 0.7084 | 0.7527 | 0.3977 | nan | 0.5819 | 0.6278 | 0.0145 | 0.8635 | 0.1234 | 0.0 | 0.4967 | 0.0632 | 0.5861 | 0.0 | 0.0043 | 0.7734 | 0.1857 | 0.5453 | 0.3198 | 0.3231 | nan | 0.1996 | 0.5033 | 0.3803 | 0.0 | 0.8897 | 0.7765 | 0.9515 | 0.0058 | 0.1138 | 0.3839 | 0.0 | | 0.0989 | 26.1 | 5220 | 0.6766 | 0.4118 | 0.4929 | 0.8761 | nan | 0.8360 | 0.9501 | 0.7735 | 0.8566 | 0.5484 | nan | 0.7156 | 0.7947 | 0.1476 | 0.9545 | 0.1832 | 0.0 | 0.7047 | 0.0546 | 0.7455 | 0.0 | 0.0060 | 0.9283 | 0.1483 | 0.6515 | 0.4213 | 0.4199 | nan | 0.2458 | 0.6013 | 0.5454 | 0.0 | 0.9561 | 0.8676 | 0.9824 | 0.0061 | 0.2510 | 0.4768 | 0.0 | nan | 0.7478 | 0.8754 | 0.7067 | 0.7406 | 0.3945 | nan | 0.5775 | 0.6051 | 0.0539 | 0.8656 | 0.1609 | 0.0 | 0.4929 | 0.0546 | 0.5695 | 0.0 | 0.0059 | 0.7716 | 0.1330 | 0.5479 | 0.3357 | 0.3199 | nan | 0.2093 | 0.5010 | 0.3795 | 0.0 | 0.8896 | 0.7777 | 0.9525 | 0.0044 | 0.1170 | 0.3885 | 0.0 | | 0.0975 | 26.2 | 5240 | 0.6686 | 0.4147 | 0.4966 | 0.8766 | nan | 0.8481 | 0.9563 | 0.7830 | 0.8141 | 0.5606 | nan | 0.7064 | 0.8431 | 0.0839 | 0.9516 | 0.2203 | 0.0 | 0.6956 | 
0.0612 | 0.7290 | 0.0 | 0.0090 | 0.9187 | 0.2126 | 0.6371 | 0.4440 | 0.4329 | nan | 0.2754 | 0.6159 | 0.5759 | 0.0 | 0.9525 | 0.8437 | 0.9847 | 0.0022 | 0.2114 | 0.5207 | 0.0 | nan | 0.7606 | 0.8729 | 0.7049 | 0.7360 | 0.3947 | nan | 0.5843 | 0.5929 | 0.0388 | 0.8677 | 0.1858 | 0.0 | 0.4800 | 0.0611 | 0.5925 | 0.0 | 0.0081 | 0.7734 | 0.1887 | 0.5505 | 0.3515 | 0.3299 | nan | 0.2270 | 0.5001 | 0.3717 | 0.0 | 0.8893 | 0.7697 | 0.9519 | 0.0017 | 0.0951 | 0.3903 | 0.0 | | 0.1222 | 26.3 | 5260 | 0.7007 | 0.4112 | 0.4917 | 0.8733 | nan | 0.8112 | 0.9544 | 0.7829 | 0.8609 | 0.5339 | nan | 0.6915 | 0.8542 | 0.0131 | 0.9557 | 0.1905 | 0.0 | 0.7522 | 0.0119 | 0.7064 | 0.0 | 0.0188 | 0.9257 | 0.1751 | 0.6510 | 0.4046 | 0.4786 | nan | 0.2755 | 0.6000 | 0.5597 | 0.0 | 0.9580 | 0.8450 | 0.9834 | 0.0081 | 0.2413 | 0.4900 | 0.0 | nan | 0.7343 | 0.8749 | 0.6988 | 0.6905 | 0.4023 | nan | 0.5853 | 0.6266 | 0.0117 | 0.8660 | 0.1613 | 0.0 | 0.4892 | 0.0119 | 0.6045 | 0.0 | 0.0182 | 0.7709 | 0.1540 | 0.5544 | 0.3348 | 0.3448 | nan | 0.2370 | 0.4994 | 0.3791 | 0.0 | 0.8864 | 0.7669 | 0.9530 | 0.0057 | 0.1116 | 0.3860 | 0.0 | | 0.1183 | 26.4 | 5280 | 0.6994 | 0.4117 | 0.4921 | 0.8735 | nan | 0.8099 | 0.9557 | 0.8170 | 0.8511 | 0.5359 | nan | 0.7151 | 0.8481 | 0.0280 | 0.9527 | 0.1402 | 0.0 | 0.7074 | 0.0186 | 0.7115 | 0.0 | 0.0120 | 0.9337 | 0.1862 | 0.6441 | 0.4187 | 0.4533 | nan | 0.3415 | 0.6097 | 0.5326 | 0.0 | 0.9519 | 0.8405 | 0.9825 | 0.0120 | 0.2630 | 0.4755 | 0.0 | nan | 0.7376 | 0.8751 | 0.6819 | 0.6959 | 0.4059 | nan | 0.5829 | 0.6393 | 0.0226 | 0.8642 | 0.1216 | 0.0 | 0.4882 | 0.0186 | 0.5983 | 0.0 | 0.0119 | 0.7705 | 0.1645 | 0.5578 | 0.3381 | 0.3383 | nan | 0.2654 | 0.5020 | 0.3845 | 0.0 | 0.8874 | 0.7637 | 0.9527 | 0.0081 | 0.1143 | 0.3839 | 0.0 | | 0.0751 | 26.5 | 5300 | 0.6749 | 0.4114 | 0.4891 | 0.8763 | nan | 0.8349 | 0.9526 | 0.7899 | 0.8599 | 0.5438 | nan | 0.7032 | 0.8391 | 0.0881 | 0.9516 | 0.1227 | 0.0 | 0.6688 | 0.0251 | 0.7336 | 0.0 | 0.0 | 0.9283 | 0.1750 | 
0.6450 | 0.3704 | 0.4431 | nan | 0.2957 | 0.6250 | 0.5594 | 0.0 | 0.9550 | 0.8531 | 0.9841 | 0.0046 | 0.1953 | 0.5032 | 0.0 | nan | 0.7504 | 0.8752 | 0.7089 | 0.7251 | 0.3993 | nan | 0.5903 | 0.6139 | 0.0498 | 0.8656 | 0.1065 | 0.0 | 0.4799 | 0.0251 | 0.5932 | 0.0 | 0.0 | 0.7705 | 0.1554 | 0.5509 | 0.3146 | 0.3413 | nan | 0.2379 | 0.5074 | 0.3865 | 0.0 | 0.8889 | 0.7779 | 0.9511 | 0.0034 | 0.1035 | 0.3907 | 0.0 | | 0.0861 | 26.6 | 5320 | 0.6729 | 0.4105 | 0.4900 | 0.8744 | nan | 0.8149 | 0.9547 | 0.8105 | 0.8521 | 0.5684 | nan | 0.6942 | 0.8356 | 0.0508 | 0.9531 | 0.1030 | 0.0 | 0.7563 | 0.0308 | 0.7031 | 0.0 | 0.0034 | 0.9281 | 0.2793 | 0.6487 | 0.3257 | 0.4017 | nan | 0.3096 | 0.6137 | 0.5375 | 0.0 | 0.9537 | 0.8597 | 0.9814 | 0.0027 | 0.2138 | 0.4945 | 0.0 | nan | 0.7427 | 0.8728 | 0.6707 | 0.7218 | 0.4035 | nan | 0.5825 | 0.6148 | 0.0303 | 0.8661 | 0.0904 | 0.0 | 0.5141 | 0.0308 | 0.5987 | 0.0 | 0.0034 | 0.7702 | 0.2363 | 0.5536 | 0.2826 | 0.3157 | nan | 0.2324 | 0.5074 | 0.3798 | 0.0 | 0.8882 | 0.7793 | 0.9523 | 0.0021 | 0.1075 | 0.3858 | 0.0 | | 0.1072 | 26.7 | 5340 | 0.6385 | 0.4165 | 0.5051 | 0.8771 | nan | 0.8402 | 0.9509 | 0.8062 | 0.8605 | 0.5389 | nan | 0.7122 | 0.8531 | 0.0539 | 0.9585 | 0.1279 | 0.0 | 0.8263 | 0.0386 | 0.7015 | 0.0 | 0.0135 | 0.9078 | 0.4718 | 0.6603 | 0.3479 | 0.3799 | nan | 0.3560 | 0.6282 | 0.5538 | 0.0 | 0.9560 | 0.8533 | 0.9839 | 0.0029 | 0.2515 | 0.5266 | 0.0 | nan | 0.7579 | 0.8816 | 0.6728 | 0.7196 | 0.3906 | nan | 0.5818 | 0.6118 | 0.0338 | 0.8640 | 0.1143 | 0.0 | 0.5220 | 0.0386 | 0.5987 | 0.0 | 0.0133 | 0.7761 | 0.3066 | 0.5636 | 0.2985 | 0.3097 | nan | 0.2655 | 0.5059 | 0.3742 | 0.0 | 0.8888 | 0.7756 | 0.9515 | 0.0024 | 0.1193 | 0.3895 | 0.0 | | 0.075 | 26.8 | 5360 | 0.6882 | 0.4178 | 0.4994 | 0.8770 | nan | 0.8276 | 0.9542 | 0.8144 | 0.8555 | 0.5746 | nan | 0.6765 | 0.8361 | 0.0220 | 0.9521 | 0.1819 | 0.0 | 0.7861 | 0.0414 | 0.7041 | 0.0 | 0.0025 | 0.9256 | 0.3586 | 0.6814 | 0.3652 | 0.3536 | nan | 0.3433 | 0.6089 | 
0.5657 | 0.0 | 0.9546 | 0.8467 | 0.9802 | 0.0030 | 0.2599 | 0.5062 | 0.0 | nan | 0.7502 | 0.8759 | 0.6701 | 0.7418 | 0.3962 | nan | 0.5801 | 0.6557 | 0.0151 | 0.8644 | 0.1547 | 0.0 | 0.5061 | 0.0414 | 0.6178 | 0.0 | 0.0024 | 0.7767 | 0.2725 | 0.5582 | 0.3139 | 0.2876 | nan | 0.2507 | 0.5072 | 0.3787 | 0.0 | 0.8898 | 0.7779 | 0.9536 | 0.0026 | 0.1324 | 0.3968 | 0.0 | | 0.1095 | 26.9 | 5380 | 0.6906 | 0.4114 | 0.4891 | 0.8781 | nan | 0.8369 | 0.9576 | 0.7971 | 0.8501 | 0.5551 | nan | 0.7060 | 0.8135 | 0.0291 | 0.9532 | 0.1185 | 0.0 | 0.6903 | 0.0319 | 0.7166 | 0.0 | 0.0 | 0.9269 | 0.2260 | 0.6456 | 0.3905 | 0.3490 | nan | 0.3174 | 0.6271 | 0.5923 | 0.0 | 0.9549 | 0.8555 | 0.9855 | 0.0044 | 0.2436 | 0.4768 | 0.0 | nan | 0.7561 | 0.8745 | 0.6692 | 0.7556 | 0.4017 | nan | 0.5824 | 0.6476 | 0.0187 | 0.8661 | 0.1019 | 0.0 | 0.4578 | 0.0319 | 0.6070 | 0.0 | 0.0 | 0.7750 | 0.1968 | 0.5555 | 0.3165 | 0.2879 | nan | 0.2456 | 0.5073 | 0.3771 | 0.0 | 0.8899 | 0.7780 | 0.9496 | 0.0039 | 0.1203 | 0.3899 | 0.0 | | 0.1506 | 27.0 | 5400 | 0.6823 | 0.4105 | 0.4968 | 0.8779 | nan | 0.8466 | 0.9546 | 0.7995 | 0.8469 | 0.5295 | nan | 0.7037 | 0.8394 | 0.0635 | 0.9543 | 0.1163 | 0.0 | 0.7633 | 0.0475 | 0.7615 | 0.0 | 0.0 | 0.9358 | 0.1898 | 0.6437 | 0.3914 | 0.4259 | nan | 0.3250 | 0.6204 | 0.5437 | 0.0 | 0.9495 | 0.8699 | 0.9813 | 0.0103 | 0.3204 | 0.4652 | 0.0 | nan | 0.7623 | 0.8762 | 0.6641 | 0.7556 | 0.3897 | nan | 0.5861 | 0.6245 | 0.0368 | 0.8685 | 0.0992 | 0.0 | 0.4243 | 0.0475 | 0.6058 | 0.0 | 0.0 | 0.7695 | 0.1659 | 0.5516 | 0.3248 | 0.3197 | nan | 0.2507 | 0.5065 | 0.3748 | 0.0 | 0.8914 | 0.7819 | 0.9530 | 0.0080 | 0.1189 | 0.3790 | 0.0 | | 0.0938 | 27.1 | 5420 | 0.6557 | 0.4105 | 0.4863 | 0.8819 | nan | 0.8763 | 0.9536 | 0.7881 | 0.8415 | 0.5490 | nan | 0.7220 | 0.8324 | 0.0224 | 0.9549 | 0.1229 | 0.0 | 0.7538 | 0.0387 | 0.7186 | 0.0 | 0.0004 | 0.9286 | 0.1472 | 0.6398 | 0.3440 | 0.3325 | nan | 0.2764 | 0.6080 | 0.5649 | 0.0 | 0.9557 | 0.8681 | 0.9825 | 0.0012 | 0.2378 | 
0.5020 | 0.0 | nan | 0.7828 | 0.8807 | 0.6887 | 0.7740 | 0.3939 | nan | 0.5933 | 0.6221 | 0.0134 | 0.8674 | 0.1067 | 0.0 | 0.4857 | 0.0387 | 0.6042 | 0.0 | 0.0004 | 0.7721 | 0.1336 | 0.5439 | 0.2990 | 0.2742 | nan | 0.2357 | 0.5013 | 0.3789 | 0.0 | 0.8894 | 0.7802 | 0.9524 | 0.0011 | 0.1293 | 0.3938 | 0.0 | | 0.0922 | 27.2 | 5440 | 0.6789 | 0.4124 | 0.4937 | 0.8778 | nan | 0.8441 | 0.9528 | 0.7871 | 0.8561 | 0.5597 | nan | 0.7062 | 0.8316 | 0.0127 | 0.9555 | 0.1165 | 0.0 | 0.7839 | 0.0331 | 0.7121 | 0.0 | 0.0036 | 0.9277 | 0.2226 | 0.6475 | 0.3821 | 0.3915 | nan | 0.3323 | 0.6105 | 0.6060 | 0.0 | 0.9571 | 0.8456 | 0.9830 | 0.0124 | 0.2411 | 0.4829 | 0.0 | nan | 0.7593 | 0.8755 | 0.7016 | 0.7540 | 0.3872 | nan | 0.5912 | 0.6258 | 0.0077 | 0.8663 | 0.1008 | 0.0 | 0.4567 | 0.0331 | 0.6051 | 0.0 | 0.0034 | 0.7729 | 0.1901 | 0.5497 | 0.3172 | 0.3099 | nan | 0.2680 | 0.5033 | 0.3734 | 0.0 | 0.8883 | 0.7725 | 0.9521 | 0.0100 | 0.1303 | 0.3909 | 0.0 | | 0.096 | 27.3 | 5460 | 0.6704 | 0.4060 | 0.4903 | 0.8754 | nan | 0.8254 | 0.9533 | 0.7920 | 0.8735 | 0.5246 | nan | 0.6949 | 0.8410 | 0.0581 | 0.9554 | 0.1050 | 0.0 | 0.7937 | 0.0133 | 0.7730 | 0.0 | 0.0007 | 0.9311 | 0.2743 | 0.6469 | 0.3820 | 0.3244 | nan | 0.2645 | 0.6060 | 0.5833 | 0.0 | 0.9549 | 0.8581 | 0.9833 | 0.0101 | 0.2081 | 0.4576 | 0.0 | nan | 0.7435 | 0.8795 | 0.6874 | 0.7072 | 0.3941 | nan | 0.5839 | 0.5840 | 0.0354 | 0.8667 | 0.0918 | 0.0 | 0.4936 | 0.0133 | 0.5650 | 0.0 | 0.0007 | 0.7718 | 0.2335 | 0.5515 | 0.3144 | 0.2713 | nan | 0.2367 | 0.5008 | 0.3740 | 0.0 | 0.8891 | 0.7737 | 0.9509 | 0.0084 | 0.0967 | 0.3735 | 0.0 | | 0.0978 | 27.4 | 5480 | 0.6725 | 0.4047 | 0.4838 | 0.8759 | nan | 0.8290 | 0.9558 | 0.8021 | 0.8683 | 0.4995 | nan | 0.7111 | 0.8203 | 0.0517 | 0.9528 | 0.1147 | 0.0 | 0.6795 | 0.0295 | 0.7583 | 0.0 | 0.0 | 0.9286 | 0.2051 | 0.6574 | 0.3569 | 0.3430 | nan | 0.2604 | 0.6237 | 0.5778 | 0.0 | 0.9572 | 0.8412 | 0.9829 | 0.0104 | 0.1870 | 0.4760 | 0.0 | nan | 0.7490 | 0.8809 | 0.6835 | 0.7109 | 
0.3761 | nan | 0.5841 | 0.5973 | 0.0304 | 0.8676 | 0.0987 | 0.0 | 0.4418 | 0.0295 | 0.5886 | 0.0 | 0.0 | 0.7723 | 0.1829 | 0.5593 | 0.3080 | 0.2819 | nan | 0.2282 | 0.5039 | 0.3866 | 0.0 | 0.8879 | 0.7694 | 0.9508 | 0.0081 | 0.0898 | 0.3821 | 0.0 | | 0.1081 | 27.5 | 5500 | 0.6632 | 0.4076 | 0.4835 | 0.8783 | nan | 0.8643 | 0.9516 | 0.7971 | 0.8583 | 0.5222 | nan | 0.7090 | 0.8066 | 0.0313 | 0.9567 | 0.1047 | 0.0 | 0.6207 | 0.0330 | 0.7483 | 0.0 | 0.0 | 0.9300 | 0.2751 | 0.6632 | 0.3831 | 0.2637 | nan | 0.3216 | 0.6242 | 0.5814 | 0.0 | 0.9588 | 0.7810 | 0.9817 | 0.0221 | 0.1954 | 0.4874 | 0.0 | nan | 0.7753 | 0.8867 | 0.6966 | 0.7449 | 0.3831 | nan | 0.5873 | 0.6360 | 0.0203 | 0.8656 | 0.0909 | 0.0 | 0.4102 | 0.0330 | 0.5889 | 0.0 | 0.0 | 0.7705 | 0.2247 | 0.5607 | 0.3196 | 0.2249 | nan | 0.2714 | 0.5076 | 0.3911 | 0.0 | 0.8806 | 0.7286 | 0.9519 | 0.0116 | 0.0938 | 0.3870 | 0.0 | | 0.0712 | 27.6 | 5520 | 0.6932 | 0.4087 | 0.4852 | 0.8768 | nan | 0.8332 | 0.9596 | 0.8028 | 0.8385 | 0.5178 | nan | 0.7153 | 0.8222 | 0.0493 | 0.9471 | 0.1109 | 0.0 | 0.6470 | 0.0600 | 0.7521 | 0.0 | 0.0019 | 0.9282 | 0.2295 | 0.6720 | 0.3390 | 0.2774 | nan | 0.3487 | 0.6283 | 0.5638 | 0.0 | 0.9566 | 0.8371 | 0.9832 | 0.0143 | 0.2063 | 0.4831 | 0.0 | nan | 0.7533 | 0.8757 | 0.6709 | 0.7481 | 0.3994 | nan | 0.5842 | 0.6202 | 0.0310 | 0.8688 | 0.0967 | 0.0 | 0.4354 | 0.0600 | 0.5986 | 0.0 | 0.0018 | 0.7740 | 0.1988 | 0.5605 | 0.2965 | 0.2358 | nan | 0.2775 | 0.5082 | 0.3916 | 0.0 | 0.8852 | 0.7556 | 0.9511 | 0.0095 | 0.1038 | 0.3855 | 0.0 | | 0.1027 | 27.7 | 5540 | 0.6726 | 0.4118 | 0.4918 | 0.8777 | nan | 0.8437 | 0.9552 | 0.7899 | 0.8477 | 0.5416 | nan | 0.7039 | 0.8241 | 0.0552 | 0.9555 | 0.1165 | 0.0 | 0.7872 | 0.0404 | 0.7305 | 0.0 | 0.0 | 0.9259 | 0.2684 | 0.6667 | 0.3302 | 0.3230 | nan | 0.3216 | 0.6274 | 0.5729 | 0.0 | 0.9556 | 0.8468 | 0.9843 | 0.0049 | 0.2345 | 0.4851 | 0.0 | nan | 0.7561 | 0.8770 | 0.6958 | 0.7531 | 0.3973 | nan | 0.5829 | 0.6304 | 0.0335 | 0.8641 | 0.1020 | 0.0 
| 0.4693 | 0.0404 | 0.5951 | 0.0 | 0.0 | 0.7749 | 0.2252 | 0.5508 | 0.2920 | 0.2629 | nan | 0.2534 | 0.5062 | 0.3897 | 0.0 | 0.8872 | 0.7672 | 0.9510 | 0.0033 | 0.1268 | 0.3899 | 0.0 | | 0.0892 | 27.8 | 5560 | 0.6352 | 0.4118 | 0.4884 | 0.8813 | nan | 0.8804 | 0.9525 | 0.7938 | 0.8414 | 0.5485 | nan | 0.7087 | 0.8153 | 0.0402 | 0.9562 | 0.1179 | 0.0 | 0.8002 | 0.0301 | 0.6878 | 0.0 | 0.0 | 0.9299 | 0.1701 | 0.6682 | 0.3440 | 0.4775 | nan | 0.2656 | 0.5936 | 0.5302 | 0.0 | 0.9541 | 0.8372 | 0.9861 | 0.0032 | 0.1952 | 0.5014 | 0.0 | nan | 0.7858 | 0.8827 | 0.7003 | 0.7763 | 0.3848 | nan | 0.5858 | 0.6419 | 0.0244 | 0.8634 | 0.1014 | 0.0 | 0.4713 | 0.0301 | 0.5872 | 0.0 | 0.0 | 0.7710 | 0.1535 | 0.5495 | 0.3039 | 0.3359 | nan | 0.2132 | 0.4990 | 0.3878 | 0.0 | 0.8881 | 0.7675 | 0.9497 | 0.0020 | 0.1255 | 0.3961 | 0.0 | | 0.0857 | 27.9 | 5580 | 0.6976 | 0.4086 | 0.4967 | 0.8745 | nan | 0.8227 | 0.9517 | 0.8055 | 0.8587 | 0.5235 | nan | 0.7266 | 0.8383 | 0.0748 | 0.9593 | 0.1181 | 0.0 | 0.7492 | 0.0193 | 0.7157 | 0.0 | 0.0019 | 0.9173 | 0.2253 | 0.6414 | 0.4203 | 0.4937 | nan | 0.2772 | 0.6241 | 0.5605 | 0.0 | 0.9543 | 0.8592 | 0.9834 | 0.0037 | 0.2642 | 0.5051 | 0.0 | nan | 0.7448 | 0.8754 | 0.6542 | 0.7406 | 0.3817 | nan | 0.5784 | 0.6356 | 0.0402 | 0.8633 | 0.1012 | 0.0 | 0.4460 | 0.0193 | 0.5878 | 0.0 | 0.0018 | 0.7743 | 0.1996 | 0.5449 | 0.3332 | 0.3409 | nan | 0.2216 | 0.5022 | 0.3775 | 0.0 | 0.8887 | 0.7681 | 0.9517 | 0.0029 | 0.1119 | 0.3869 | 0.0 | | 0.0972 | 28.0 | 5600 | 0.6871 | 0.4086 | 0.4893 | 0.8766 | nan | 0.8299 | 0.9635 | 0.8034 | 0.8568 | 0.4856 | nan | 0.6668 | 0.8288 | 0.0648 | 0.9496 | 0.1159 | 0.0 | 0.6802 | 0.0382 | 0.7206 | 0.0 | 0.0052 | 0.9283 | 0.2539 | 0.6530 | 0.3671 | 0.4558 | nan | 0.2910 | 0.6120 | 0.5659 | 0.0 | 0.9574 | 0.8490 | 0.9824 | 0.0049 | 0.2732 | 0.4544 | 0.0 | nan | 0.7460 | 0.8773 | 0.6448 | 0.7394 | 0.3866 | nan | 0.5750 | 0.6389 | 0.0358 | 0.8679 | 0.1019 | 0.0 | 0.4226 | 0.0382 | 0.5962 | 0.0 | 0.0051 | 0.7747 | 0.2163 | 
0.5552 | 0.3197 | 0.3337 | nan | 0.2179 | 0.5055 | 0.3791 | 0.0 | 0.8878 | 0.7706 | 0.9520 | 0.0039 | 0.1103 | 0.3722 | 0.0 | | 0.0743 | 28.1 | 5620 | 0.6541 | 0.4056 | 0.4848 | 0.8790 | nan | 0.8631 | 0.9537 | 0.8229 | 0.8409 | 0.5408 | nan | 0.6942 | 0.7871 | 0.0371 | 0.9509 | 0.1217 | 0.0 | 0.6734 | 0.0553 | 0.7003 | 0.0 | 0.0110 | 0.9429 | 0.1248 | 0.6635 | 0.3811 | 0.5042 | nan | 0.2089 | 0.6030 | 0.5817 | 0.0 | 0.9534 | 0.8205 | 0.9791 | 0.0077 | 0.2077 | 0.4827 | 0.0 | nan | 0.7753 | 0.8886 | 0.5865 | 0.7629 | 0.3910 | nan | 0.5861 | 0.6408 | 0.0199 | 0.8648 | 0.1062 | 0.0 | 0.4367 | 0.0553 | 0.5790 | 0.0 | 0.0104 | 0.7665 | 0.1147 | 0.5501 | 0.3330 | 0.3413 | nan | 0.1779 | 0.5043 | 0.3899 | 0.0 | 0.8862 | 0.7527 | 0.9532 | 0.0058 | 0.1139 | 0.3876 | 0.0 | | 0.0942 | 28.2 | 5640 | 0.6635 | 0.4081 | 0.4960 | 0.8790 | nan | 0.8525 | 0.9555 | 0.8179 | 0.8496 | 0.5351 | nan | 0.7077 | 0.8176 | 0.0422 | 0.9503 | 0.1236 | 0.0101 | 0.7732 | 0.0344 | 0.7263 | 0.0024 | 0.0073 | 0.9277 | 0.1708 | 0.6411 | 0.4058 | 0.5404 | nan | 0.2374 | 0.6133 | 0.5887 | 0.0 | 0.9534 | 0.8582 | 0.9830 | 0.0130 | 0.2590 | 0.4756 | 0.0 | nan | 0.7672 | 0.8834 | 0.6328 | 0.7648 | 0.4010 | nan | 0.5878 | 0.6325 | 0.0224 | 0.8678 | 0.1067 | 0.0091 | 0.4179 | 0.0344 | 0.5821 | 0.0024 | 0.0070 | 0.7730 | 0.1557 | 0.5549 | 0.3368 | 0.3462 | nan | 0.2009 | 0.5048 | 0.3854 | 0.0 | 0.8885 | 0.7682 | 0.9517 | 0.0087 | 0.0918 | 0.3746 | 0.0 | | 0.0886 | 28.3 | 5660 | 0.6853 | 0.4070 | 0.4902 | 0.8768 | nan | 0.8376 | 0.9575 | 0.8021 | 0.8487 | 0.5298 | nan | 0.7247 | 0.8240 | 0.0681 | 0.9529 | 0.1197 | 0.0025 | 0.7321 | 0.0702 | 0.7406 | 0.0 | 0.0094 | 0.9398 | 0.1876 | 0.6085 | 0.3805 | 0.3841 | nan | 0.2739 | 0.6085 | 0.5558 | 0.0 | 0.9502 | 0.8390 | 0.9827 | 0.0155 | 0.2440 | 0.4976 | 0.0 | nan | 0.7588 | 0.8773 | 0.6697 | 0.7653 | 0.3903 | nan | 0.5894 | 0.6166 | 0.0336 | 0.8655 | 0.1039 | 0.0024 | 0.4033 | 0.0702 | 0.5867 | 0.0 | 0.0092 | 0.7629 | 0.1709 | 0.5351 | 0.3236 | 0.2880 | nan | 
0.2085 | 0.5044 | 0.3839 | 0.0 | 0.8877 | 0.7654 | 0.9520 | 0.0106 | 0.0980 | 0.3892 | 0.0 | | 0.0686 | 28.4 | 5680 | 0.6927 | 0.4090 | 0.4976 | 0.8763 | nan | 0.8348 | 0.9564 | 0.8030 | 0.8511 | 0.5345 | nan | 0.7257 | 0.8044 | 0.0892 | 0.9515 | 0.1484 | 0.0079 | 0.7587 | 0.0563 | 0.7866 | 0.0 | 0.0058 | 0.9275 | 0.2435 | 0.6232 | 0.3731 | 0.4607 | nan | 0.2507 | 0.6339 | 0.5748 | 0.0 | 0.9553 | 0.8259 | 0.9828 | 0.0151 | 0.2420 | 0.4988 | 0.0 | nan | 0.7546 | 0.8767 | 0.6618 | 0.7606 | 0.3896 | nan | 0.5871 | 0.6214 | 0.0452 | 0.8665 | 0.1262 | 0.0077 | 0.4076 | 0.0563 | 0.5419 | 0.0 | 0.0057 | 0.7684 | 0.2128 | 0.5385 | 0.3169 | 0.3450 | nan | 0.2047 | 0.5044 | 0.3781 | 0.0 | 0.8854 | 0.7555 | 0.9524 | 0.0115 | 0.1106 | 0.3952 | 0.0 | | 0.0875 | 28.5 | 5700 | 0.6821 | 0.4057 | 0.4935 | 0.8757 | nan | 0.8305 | 0.9575 | 0.8115 | 0.8588 | 0.5157 | nan | 0.7006 | 0.7422 | 0.2954 | 0.9579 | 0.1075 | 0.0196 | 0.8595 | 0.0428 | 0.7431 | 0.0 | 0.0050 | 0.9338 | 0.1709 | 0.6354 | 0.3660 | 0.3553 | nan | 0.2370 | 0.6056 | 0.5617 | 0.0 | 0.9548 | 0.8313 | 0.9851 | 0.0051 | 0.2237 | 0.4795 | 0.0 | nan | 0.7482 | 0.8784 | 0.6547 | 0.7435 | 0.3981 | nan | 0.5876 | 0.5876 | 0.0877 | 0.8614 | 0.0961 | 0.0196 | 0.4713 | 0.0428 | 0.5644 | 0.0 | 0.0050 | 0.7676 | 0.1565 | 0.5372 | 0.3097 | 0.2924 | nan | 0.2009 | 0.5005 | 0.3819 | 0.0 | 0.8853 | 0.7565 | 0.9506 | 0.0041 | 0.1105 | 0.3831 | 0.0 | | 0.1119 | 28.6 | 5720 | 0.6901 | 0.4077 | 0.4900 | 0.8763 | nan | 0.8302 | 0.9586 | 0.7965 | 0.8520 | 0.5337 | nan | 0.7242 | 0.8196 | 0.0748 | 0.9513 | 0.1156 | 0.0012 | 0.7790 | 0.0846 | 0.7200 | 0.0 | 0.0159 | 0.9362 | 0.2396 | 0.6415 | 0.3514 | 0.3662 | nan | 0.2889 | 0.6017 | 0.5220 | 0.0 | 0.9530 | 0.8405 | 0.9830 | 0.0093 | 0.2357 | 0.4550 | 0.0 | nan | 0.7512 | 0.8756 | 0.6802 | 0.7477 | 0.3918 | nan | 0.5922 | 0.5992 | 0.0375 | 0.8659 | 0.1026 | 0.0012 | 0.4171 | 0.0845 | 0.5942 | 0.0 | 0.0158 | 0.7687 | 0.2068 | 0.5397 | 0.3038 | 0.2886 | nan | 0.2047 | 0.5016 | 0.3771 | 0.0 | 
0.8878 | 0.7668 | 0.9526 | 0.0076 | 0.1130 | 0.3719 | 0.0 | | 0.065 | 28.7 | 5740 | 0.6797 | 0.4143 | 0.4985 | 0.8774 | nan | 0.8452 | 0.9559 | 0.8006 | 0.8493 | 0.5016 | nan | 0.7138 | 0.8127 | 0.0544 | 0.9576 | 0.1546 | 0.0006 | 0.7711 | 0.1104 | 0.7272 | 0.0 | 0.0075 | 0.9237 | 0.2652 | 0.6554 | 0.4013 | 0.4589 | nan | 0.3079 | 0.6201 | 0.5540 | 0.0 | 0.9557 | 0.8299 | 0.9835 | 0.0105 | 0.2206 | 0.5035 | 0.0 | nan | 0.7559 | 0.8769 | 0.6712 | 0.7529 | 0.4004 | nan | 0.5876 | 0.6260 | 0.0293 | 0.8636 | 0.1338 | 0.0005 | 0.4466 | 0.1104 | 0.5750 | 0.0 | 0.0075 | 0.7745 | 0.2198 | 0.5464 | 0.3310 | 0.3431 | nan | 0.2321 | 0.5006 | 0.3856 | 0.0 | 0.8848 | 0.7497 | 0.9528 | 0.0091 | 0.1076 | 0.3819 | 0.0 | | 0.0934 | 28.8 | 5760 | 0.6465 | 0.4116 | 0.4904 | 0.8790 | nan | 0.8431 | 0.9547 | 0.7972 | 0.8962 | 0.5132 | nan | 0.7213 | 0.7866 | 0.0628 | 0.9542 | 0.2099 | 0.0010 | 0.6565 | 0.0928 | 0.6967 | 0.0 | 0.0017 | 0.9361 | 0.1630 | 0.6224 | 0.3858 | 0.4652 | nan | 0.2589 | 0.6364 | 0.5783 | 0.0 | 0.9554 | 0.8465 | 0.9811 | 0.0046 | 0.2253 | 0.4466 | 0.0 | nan | 0.7549 | 0.8802 | 0.6827 | 0.7726 | 0.4038 | nan | 0.5928 | 0.6084 | 0.0322 | 0.8658 | 0.1788 | 0.0010 | 0.4057 | 0.0927 | 0.5834 | 0.0 | 0.0017 | 0.7648 | 0.1460 | 0.5400 | 0.3277 | 0.3493 | nan | 0.2143 | 0.5050 | 0.3887 | 0.0 | 0.8878 | 0.7628 | 0.9532 | 0.0042 | 0.1013 | 0.3690 | 0.0 | | 0.0913 | 28.9 | 5780 | 0.7101 | 0.4135 | 0.4998 | 0.8756 | nan | 0.8296 | 0.9634 | 0.8012 | 0.8352 | 0.4835 | nan | 0.7196 | 0.8435 | 0.0852 | 0.9564 | 0.2459 | 0.0 | 0.6448 | 0.0940 | 0.7343 | 0.0 | 0.0033 | 0.9281 | 0.2470 | 0.6256 | 0.4120 | 0.5362 | nan | 0.2655 | 0.5973 | 0.6031 | 0.0 | 0.9490 | 0.8486 | 0.9847 | 0.0051 | 0.2759 | 0.4760 | 0.0 | nan | 0.7505 | 0.8698 | 0.6877 | 0.7422 | 0.3776 | nan | 0.5863 | 0.5962 | 0.0473 | 0.8671 | 0.2066 | 0.0 | 0.4120 | 0.0940 | 0.5942 | 0.0 | 0.0032 | 0.7700 | 0.2057 | 0.5449 | 0.3383 | 0.3747 | nan | 0.2026 | 0.5004 | 0.3717 | 0.0 | 0.8905 | 0.7700 | 0.9527 | 0.0047 | 
0.1002 | 0.3704 | 0.0 | | 0.072 | 29.0 | 5800 | 0.6968 | 0.4099 | 0.4834 | 0.8762 | nan | 0.8334 | 0.9610 | 0.8071 | 0.8424 | 0.4897 | nan | 0.7077 | 0.8352 | 0.0391 | 0.9520 | 0.1520 | 0.0 | 0.5859 | 0.0863 | 0.7320 | 0.0 | 0.0046 | 0.9333 | 0.1649 | 0.6309 | 0.3745 | 0.4073 | nan | 0.3005 | 0.6135 | 0.5685 | 0.0 | 0.9567 | 0.8291 | 0.9849 | 0.0106 | 0.1745 | 0.4902 | 0.0 | nan | 0.7528 | 0.8734 | 0.6771 | 0.7447 | 0.3839 | nan | 0.5856 | 0.6162 | 0.0255 | 0.8688 | 0.1347 | 0.0 | 0.4398 | 0.0863 | 0.5976 | 0.0 | 0.0046 | 0.7651 | 0.1485 | 0.5473 | 0.3246 | 0.3305 | nan | 0.2484 | 0.4965 | 0.3926 | 0.0 | 0.8861 | 0.7581 | 0.9514 | 0.0094 | 0.0807 | 0.3850 | 0.0 | | 0.0781 | 29.1 | 5820 | 0.6973 | 0.4131 | 0.4886 | 0.8773 | nan | 0.8343 | 0.9599 | 0.7973 | 0.8468 | 0.5317 | nan | 0.7142 | 0.8298 | 0.0164 | 0.9593 | 0.1801 | 0.0 | 0.6026 | 0.0882 | 0.7129 | 0.0 | 0.0032 | 0.9345 | 0.2456 | 0.6484 | 0.3544 | 0.4435 | nan | 0.3119 | 0.6138 | 0.5755 | 0.0 | 0.9525 | 0.8419 | 0.9814 | 0.0059 | 0.1920 | 0.4572 | 0.0 | nan | 0.7539 | 0.8760 | 0.6968 | 0.7523 | 0.3918 | nan | 0.5919 | 0.6012 | 0.0116 | 0.8643 | 0.1596 | 0.0 | 0.4408 | 0.0882 | 0.5925 | 0.0 | 0.0032 | 0.7668 | 0.2068 | 0.5522 | 0.3123 | 0.3525 | nan | 0.2465 | 0.4988 | 0.3882 | 0.0 | 0.8884 | 0.7642 | 0.9537 | 0.0050 | 0.0883 | 0.3717 | 0.0 | | 0.0882 | 29.2 | 5840 | 0.6949 | 0.4107 | 0.4873 | 0.8777 | nan | 0.8417 | 0.9601 | 0.7950 | 0.8391 | 0.5112 | nan | 0.7123 | 0.8305 | 0.0195 | 0.9536 | 0.1604 | 0.0 | 0.6562 | 0.0472 | 0.7099 | 0.0 | 0.0 | 0.9325 | 0.1996 | 0.6680 | 0.3698 | 0.5032 | nan | 0.2998 | 0.5804 | 0.5492 | 0.0 | 0.9570 | 0.8352 | 0.9825 | 0.0130 | 0.1928 | 0.4749 | 0.0 | nan | 0.7550 | 0.8757 | 0.6943 | 0.7509 | 0.4055 | nan | 0.5837 | 0.6050 | 0.0133 | 0.8664 | 0.1407 | 0.0 | 0.4336 | 0.0472 | 0.5829 | 0.0 | 0.0 | 0.7708 | 0.1744 | 0.5551 | 0.3162 | 0.3544 | nan | 0.2495 | 0.4980 | 0.3864 | 0.0 | 0.8861 | 0.7591 | 0.9531 | 0.0100 | 0.1005 | 0.3753 | 0.0 | | 0.0707 | 29.3 | 5860 | 0.6947 | 
0.4112 | 0.4895 | 0.8774 | nan | 0.8247 | 0.9609 | 0.7985 | 0.8580 | 0.5161 | nan | 0.7280 | 0.7997 | 0.0293 | 0.9579 | 0.1443 | 0.0 | 0.6550 | 0.0522 | 0.6908 | 0.0 | 0.0013 | 0.9334 | 0.2002 | 0.6496 | 0.3956 | 0.4688 | nan | 0.2862 | 0.6281 | 0.5941 | 0.0 | 0.9545 | 0.8436 | 0.9820 | 0.0123 | 0.2338 | 0.4644 | 0.0 | nan | 0.7498 | 0.8752 | 0.6773 | 0.7486 | 0.4050 | nan | 0.5895 | 0.6463 | 0.0182 | 0.8622 | 0.1285 | 0.0 | 0.4120 | 0.0521 | 0.5773 | 0.0 | 0.0013 | 0.7698 | 0.1752 | 0.5495 | 0.3277 | 0.3557 | nan | 0.2325 | 0.5066 | 0.3845 | 0.0 | 0.8877 | 0.7646 | 0.9535 | 0.0102 | 0.1159 | 0.3829 | 0.0 | | 0.0808 | 29.4 | 5880 | 0.7035 | 0.4153 | 0.5005 | 0.8777 | nan | 0.8409 | 0.9609 | 0.8013 | 0.8432 | 0.5292 | nan | 0.6840 | 0.8192 | 0.0704 | 0.9569 | 0.1940 | 0.0002 | 0.7892 | 0.0759 | 0.6909 | 0.0 | 0.0115 | 0.9330 | 0.2708 | 0.6312 | 0.3761 | 0.4916 | nan | 0.3595 | 0.6176 | 0.5648 | 0.0 | 0.9512 | 0.8414 | 0.9804 | 0.0128 | 0.2294 | 0.4890 | 0.0 | nan | 0.7548 | 0.8765 | 0.6734 | 0.7502 | 0.4074 | nan | 0.5854 | 0.6300 | 0.0378 | 0.8660 | 0.1712 | 0.0002 | 0.3926 | 0.0759 | 0.5810 | 0.0 | 0.0114 | 0.7675 | 0.2253 | 0.5374 | 0.3222 | 0.3649 | nan | 0.2490 | 0.5043 | 0.3850 | 0.0 | 0.8883 | 0.7650 | 0.9545 | 0.0100 | 0.1134 | 0.3900 | 0.0 | | 0.0959 | 29.5 | 5900 | 0.6633 | 0.4123 | 0.4992 | 0.8769 | nan | 0.8516 | 0.9532 | 0.8257 | 0.8463 | 0.5411 | nan | 0.7325 | 0.7945 | 0.0996 | 0.9575 | 0.1970 | 0.0 | 0.8361 | 0.0722 | 0.7040 | 0.0 | 0.0015 | 0.9325 | 0.1979 | 0.6232 | 0.3559 | 0.4814 | nan | 0.3650 | 0.6066 | 0.5344 | 0.0 | 0.9574 | 0.7861 | 0.9828 | 0.0202 | 0.2246 | 0.4946 | 0.0 | nan | 0.7671 | 0.8815 | 0.6558 | 0.7607 | 0.4057 | nan | 0.5869 | 0.6192 | 0.0441 | 0.8637 | 0.1692 | 0.0 | 0.4204 | 0.0721 | 0.5866 | 0.0 | 0.0014 | 0.7655 | 0.1765 | 0.5306 | 0.3051 | 0.3484 | nan | 0.2809 | 0.5017 | 0.3784 | 0.0 | 0.8789 | 0.7286 | 0.9534 | 0.0144 | 0.1115 | 0.3860 | 0.0 | | 0.0771 | 29.6 | 5920 | 0.6884 | 0.4142 | 0.5007 | 0.8764 | nan | 0.8283 | 
0.9584 | 0.8123 | 0.8427 | 0.5157 | nan | 0.7270 | 0.8310 | 0.0579 | 0.9523 | 0.2168 | 0.0 | 0.8208 | 0.0862 | 0.7416 | 0.0 | 0.0 | 0.9275 | 0.1550 | 0.6739 | 0.3409 | 0.5263 | nan | 0.3536 | 0.6307 | 0.5499 | 0.0 | 0.9567 | 0.8381 | 0.9828 | 0.0116 | 0.2159 | 0.4682 | 0.0 | nan | 0.7503 | 0.8739 | 0.6630 | 0.7414 | 0.4013 | nan | 0.5855 | 0.6113 | 0.0326 | 0.8692 | 0.1840 | 0.0 | 0.4372 | 0.0862 | 0.5978 | 0.0 | 0.0 | 0.7733 | 0.1416 | 0.5535 | 0.2984 | 0.3676 | nan | 0.2926 | 0.5067 | 0.3830 | 0.0 | 0.8852 | 0.7577 | 0.9536 | 0.0100 | 0.1185 | 0.3791 | 0.0 | | 0.0705 | 29.7 | 5940 | 0.6747 | 0.4181 | 0.5036 | 0.8780 | nan | 0.8426 | 0.9584 | 0.7923 | 0.8350 | 0.5360 | nan | 0.7188 | 0.8237 | 0.0655 | 0.9586 | 0.2272 | 0.0001 | 0.8067 | 0.0805 | 0.7526 | 0.0 | 0.0 | 0.9323 | 0.2434 | 0.6474 | 0.3673 | 0.5485 | nan | 0.3330 | 0.6053 | 0.5587 | 0.0 | 0.9538 | 0.8479 | 0.9846 | 0.0074 | 0.2233 | 0.4627 | 0.0 | nan | 0.7568 | 0.8740 | 0.6958 | 0.7447 | 0.4053 | nan | 0.5883 | 0.6244 | 0.0361 | 0.8644 | 0.1895 | 0.0001 | 0.4437 | 0.0805 | 0.5893 | 0.0 | 0.0 | 0.7708 | 0.2032 | 0.5473 | 0.3117 | 0.3816 | nan | 0.2713 | 0.5021 | 0.3805 | 0.0 | 0.8884 | 0.7714 | 0.9529 | 0.0068 | 0.1211 | 0.3776 | 0.0 | | 0.0472 | 29.8 | 5960 | 0.7109 | 0.4086 | 0.4956 | 0.8758 | nan | 0.8353 | 0.9621 | 0.8017 | 0.8276 | 0.5187 | nan | 0.7011 | 0.8241 | 0.1174 | 0.9530 | 0.1418 | 0.0 | 0.7473 | 0.0818 | 0.7646 | 0.0 | 0.0 | 0.9265 | 0.1480 | 0.6075 | 0.4195 | 0.5112 | nan | 0.2423 | 0.6126 | 0.5446 | 0.0 | 0.9582 | 0.8396 | 0.9838 | 0.0135 | 0.3227 | 0.4524 | 0.0 | nan | 0.7546 | 0.8730 | 0.6628 | 0.7390 | 0.4098 | nan | 0.5845 | 0.6204 | 0.0569 | 0.8708 | 0.1223 | 0.0 | 0.4144 | 0.0818 | 0.5821 | 0.0 | 0.0 | 0.7687 | 0.1324 | 0.5337 | 0.3295 | 0.3688 | nan | 0.2083 | 0.5037 | 0.3762 | 0.0 | 0.8858 | 0.7640 | 0.9523 | 0.0120 | 0.1100 | 0.3565 | 0.0 | | 0.0856 | 29.9 | 5980 | 0.7068 | 0.4075 | 0.4915 | 0.8769 | nan | 0.8328 | 0.9616 | 0.7846 | 0.8604 | 0.5267 | nan | 0.7001 | 0.8317 | 
0.0490 | 0.9578 | 0.1086 | 0.0 | 0.7661 | 0.0397 | 0.6910 | 0.0 | 0.0025 | 0.9349 | 0.2551 | 0.6181 | 0.3754 | 0.4470 | nan | 0.2880 | 0.6232 | 0.5678 | 0.0 | 0.9521 | 0.8409 | 0.9837 | 0.0102 | 0.2774 | 0.4412 | 0.0 | nan | 0.7531 | 0.8765 | 0.6674 | 0.7471 | 0.4108 | nan | 0.5886 | 0.6380 | 0.0292 | 0.8636 | 0.0955 | 0.0 | 0.4072 | 0.0397 | 0.5824 | 0.0 | 0.0025 | 0.7642 | 0.2054 | 0.5298 | 0.3210 | 0.3295 | nan | 0.2042 | 0.5070 | 0.3838 | 0.0 | 0.8882 | 0.7660 | 0.9520 | 0.0091 | 0.1154 | 0.3630 | 0.0 | | 0.0632 | 30.0 | 6000 | 0.6955 | 0.4046 | 0.4834 | 0.8758 | nan | 0.8442 | 0.9537 | 0.7839 | 0.8517 | 0.5396 | nan | 0.7162 | 0.8110 | 0.0400 | 0.9540 | 0.1004 | 0.0 | 0.7228 | 0.0127 | 0.6716 | 0.0 | 0.0072 | 0.9378 | 0.1141 | 0.6040 | 0.3381 | 0.4786 | nan | 0.3070 | 0.6319 | 0.5860 | 0.0 | 0.9547 | 0.8347 | 0.9842 | 0.0080 | 0.2084 | 0.4721 | 0.0 | nan | 0.7577 | 0.8775 | 0.6634 | 0.7512 | 0.3930 | nan | 0.5910 | 0.6374 | 0.0219 | 0.8630 | 0.0879 | 0.0 | 0.4588 | 0.0127 | 0.5791 | 0.0 | 0.0070 | 0.7596 | 0.1049 | 0.5244 | 0.2932 | 0.3452 | nan | 0.2441 | 0.5018 | 0.3876 | 0.0 | 0.8857 | 0.7614 | 0.9505 | 0.0066 | 0.1052 | 0.3755 | 0.0 | | 0.0753 | 30.1 | 6020 | 0.6949 | 0.4082 | 0.4973 | 0.8758 | nan | 0.8345 | 0.9571 | 0.8362 | 0.8264 | 0.5438 | nan | 0.7323 | 0.8108 | 0.1971 | 0.9566 | 0.1114 | 0.0 | 0.7618 | 0.0325 | 0.7024 | 0.0 | 0.0191 | 0.9319 | 0.1148 | 0.6197 | 0.3594 | 0.5067 | nan | 0.3401 | 0.6437 | 0.5648 | 0.0 | 0.9520 | 0.8443 | 0.9797 | 0.0129 | 0.2513 | 0.4698 | 0.0 | nan | 0.7561 | 0.8772 | 0.6487 | 0.7454 | 0.4034 | nan | 0.5893 | 0.6161 | 0.0728 | 0.8618 | 0.0976 | 0.0 | 0.4558 | 0.0325 | 0.5780 | 0.0 | 0.0188 | 0.7642 | 0.1055 | 0.5314 | 0.3079 | 0.3534 | nan | 0.2505 | 0.5066 | 0.3820 | 0.0 | 0.8864 | 0.7637 | 0.9524 | 0.0108 | 0.1209 | 0.3744 | 0.0 | | 0.1257 | 30.2 | 6040 | 0.6917 | 0.4070 | 0.4841 | 0.8779 | nan | 0.8493 | 0.9572 | 0.8337 | 0.8431 | 0.5213 | nan | 0.6946 | 0.7666 | 0.1476 | 0.9528 | 0.1400 | 0.0 | 0.6802 | 0.0394 | 
0.7341 | 0.0 | 0.0169 | 0.9393 | 0.0977 | 0.6252 | 0.3417 | 0.4498 | nan | 0.2825 | 0.6021 | 0.5489 | 0.0 | 0.9565 | 0.8478 | 0.9835 | 0.0138 | 0.1719 | 0.4552 | 0.0 | nan | 0.7609 | 0.8821 | 0.6538 | 0.7509 | 0.3998 | nan | 0.5869 | 0.6337 | 0.0614 | 0.8635 | 0.1233 | 0.0 | 0.4218 | 0.0394 | 0.5784 | 0.0 | 0.0168 | 0.7609 | 0.0906 | 0.5322 | 0.3005 | 0.3418 | nan | 0.2392 | 0.4983 | 0.3973 | 0.0 | 0.8847 | 0.7660 | 0.9524 | 0.0114 | 0.1011 | 0.3764 | 0.0 |
| 0.1099 | 30.3 | 6060 | 0.6502 | 0.4117 | 0.4958 | 0.8784 | nan | 0.8603 | 0.9469 | 0.8447 | 0.8613 | 0.5677 | nan | 0.6992 | 0.8168 | 0.0617 | 0.9599 | 0.1942 | 0.0 | 0.7344 | 0.0195 | 0.6950 | 0.0 | 0.0363 | 0.9261 | 0.1359 | 0.6417 | 0.3702 | 0.4821 | nan | 0.2626 | 0.6125 | 0.5872 | 0.0 | 0.9514 | 0.8377 | 0.9843 | 0.0076 | 0.2688 | 0.4986 | 0.0 | nan | 0.7764 | 0.8886 | 0.6787 | 0.7535 | 0.3592 | nan | 0.5988 | 0.6382 | 0.0345 | 0.8602 | 0.1687 | 0.0 | 0.4401 | 0.0195 | 0.5770 | 0.0 | 0.0358 | 0.7705 | 0.1235 | 0.5355 | 0.3169 | 0.3607 | nan | 0.2241 | 0.5019 | 0.3836 | 0.0 | 0.8868 | 0.7598 | 0.9519 | 0.0056 | 0.1383 | 0.3852 | 0.0 |
| 0.0772 | 30.4 | 6080 | 0.6619 | 0.4126 | 0.4875 | 0.8790 | nan | 0.8764 | 0.9468 | 0.8257 | 0.8391 | 0.5504 | nan | 0.7166 | 0.8301 | 0.0395 | 0.9459 | 0.1677 | 0.0 | 0.5851 | 0.0885 | 0.7132 | 0.0 | 0.0414 | 0.9361 | 0.1862 | 0.6286 | 0.3409 | 0.3462 | nan | 0.2937 | 0.6241 | 0.5592 | 0.0 | 0.9572 | 0.8257 | 0.9805 | 0.0118 | 0.2612 | 0.4819 | 0.0 | nan | 0.7804 | 0.8881 | 0.7290 | 0.7596 | 0.3569 | nan | 0.5928 | 0.6354 | 0.0243 | 0.8688 | 0.1465 | 0.0 | 0.4213 | 0.0884 | 0.5870 | 0.0 | 0.0406 | 0.7637 | 0.1650 | 0.5315 | 0.3011 | 0.2810 | nan | 0.2343 | 0.5056 | 0.3912 | 0.0 | 0.8844 | 0.7534 | 0.9542 | 0.0080 | 0.1283 | 0.3817 | 0.0 |
| 0.0667 | 30.5 | 6100 | 0.6734 | 0.4156 | 0.4975 | 0.8768 | nan | 0.8659 | 0.9488 | 0.8169 | 0.8425 | 0.5445 | nan | 0.7401 | 0.8129 | 0.1878 | 0.9561 | 0.1335 | 0.0 | 0.6130 | 0.0796 | 0.7363 | 0.0 | 0.0373 | 0.9295 | 0.2599 | 0.5996 | 0.3926 | 0.4041 | nan | 0.3321 | 0.6299 | 0.5870 | 0.0 | 0.9605 | 0.7830 | 0.9819 | 0.0121 | 0.2828 | 0.4500 | 0.0 | nan | 0.7741 | 0.8840 | 0.7218 | 0.7529 | 0.3690 | nan | 0.5993 | 0.6478 | 0.0927 | 0.8646 | 0.1188 | 0.0 | 0.4250 | 0.0795 | 0.5757 | 0.0 | 0.0369 | 0.7654 | 0.2164 | 0.5279 | 0.3266 | 0.3253 | nan | 0.2681 | 0.5043 | 0.3908 | 0.0 | 0.8761 | 0.7178 | 0.9529 | 0.0091 | 0.1090 | 0.3682 | 0.0 |
| 0.0914 | 30.6 | 6120 | 0.6724 | 0.4115 | 0.4907 | 0.8772 | nan | 0.8487 | 0.9539 | 0.8272 | 0.8489 | 0.5201 | nan | 0.7346 | 0.8388 | 0.0661 | 0.9553 | 0.1135 | 0.0 | 0.6458 | 0.0374 | 0.7067 | 0.0 | 0.0146 | 0.9313 | 0.2638 | 0.6162 | 0.3748 | 0.4614 | nan | 0.2940 | 0.6111 | 0.5433 | 0.0 | 0.9517 | 0.8375 | 0.9826 | 0.0122 | 0.2292 | 0.4805 | 0.0 | nan | 0.7623 | 0.8822 | 0.6902 | 0.7479 | 0.3890 | nan | 0.5967 | 0.6272 | 0.0419 | 0.8660 | 0.1011 | 0.0 | 0.4417 | 0.0374 | 0.5728 | 0.0 | 0.0145 | 0.7672 | 0.2226 | 0.5318 | 0.3195 | 0.3483 | nan | 0.2485 | 0.5019 | 0.3951 | 0.0 | 0.8840 | 0.7441 | 0.9528 | 0.0088 | 0.0989 | 0.3731 | 0.0 |
| 0.0661 | 30.7 | 6140 | 0.7170 | 0.4061 | 0.4897 | 0.8731 | nan | 0.8210 | 0.9529 | 0.8176 | 0.8697 | 0.5278 | nan | 0.7301 | 0.8420 | 0.0351 | 0.9562 | 0.1284 | 0.0 | 0.6482 | 0.0361 | 0.7150 | 0.0 | 0.0179 | 0.9353 | 0.2255 | 0.6233 | 0.3457 | 0.5291 | nan | 0.2833 | 0.6036 | 0.5790 | 0.0 | 0.9537 | 0.8027 | 0.9850 | 0.0134 | 0.2334 | 0.4607 | 0.0 | nan | 0.7420 | 0.8802 | 0.6737 | 0.7257 | 0.3774 | nan | 0.5939 | 0.6169 | 0.0226 | 0.8641 | 0.1118 | 0.0 | 0.4206 | 0.0361 | 0.5798 | 0.0 | 0.0178 | 0.7671 | 0.1958 | 0.5363 | 0.2960 | 0.3592 | nan | 0.2366 | 0.4952 | 0.3927 | 0.0 | 0.8805 | 0.7368 | 0.9513 | 0.0107 | 0.1042 | 0.3705 | 0.0 |
| 0.0741 | 30.8 | 6160 | 0.7035 | 0.4059 | 0.4835 | 0.8739 | nan | 0.8332 | 0.9534 | 0.8044 | 0.8594 | 0.5470 | nan | 0.6966 | 0.8466 | 0.0264 | 0.9513 | 0.1178 | 0.0 | 0.6525 | 0.0279 | 0.6862 | 0.0 | 0.0111 | 0.9339 | 0.2435 | 0.6108 | 0.3731 | 0.4133 | nan | 0.2855 | 0.6059 | 0.5583 | 0.0 | 0.9556 | 0.8048 | 0.9822 | 0.0117 | 0.2116 | 0.4694 | 0.0 | nan | 0.7500 | 0.8788 | 0.6764 | 0.7474 | 0.3757 | nan | 0.5922 | 0.6360 | 0.0169 | 0.8647 | 0.1031 | 0.0 | 0.4324 | 0.0278 | 0.5708 | 0.0 | 0.0111 | 0.7611 | 0.2063 | 0.5166 | 0.3174 | 0.3225 | nan | 0.2321 | 0.4967 | 0.3917 | 0.0 | 0.8810 | 0.7370 | 0.9529 | 0.0099 | 0.1061 | 0.3742 | 0.0 |
| 0.058 | 30.9 | 6180 | 0.6949 | 0.4057 | 0.4874 | 0.8758 | nan | 0.8500 | 0.9529 | 0.8062 | 0.8529 | 0.5494 | nan | 0.7168 | 0.8530 | 0.0539 | 0.9533 | 0.1434 | 0.0 | 0.6919 | 0.0333 | 0.7170 | 0.0 | 0.0029 | 0.9345 | 0.1742 | 0.5842 | 0.3866 | 0.4719 | nan | 0.2287 | 0.6061 | 0.5550 | 0.0 | 0.9526 | 0.8203 | 0.9840 | 0.0032 | 0.2581 | 0.4611 | 0.0 | nan | 0.7621 | 0.8822 | 0.6865 | 0.7517 | 0.3810 | nan | 0.5951 | 0.6290 | 0.0340 | 0.8645 | 0.1196 | 0.0 | 0.4348 | 0.0333 | 0.5701 | 0.0 | 0.0029 | 0.7594 | 0.1569 | 0.5025 | 0.3272 | 0.3516 | nan | 0.1977 | 0.4968 | 0.3827 | 0.0 | 0.8834 | 0.7426 | 0.9522 | 0.0028 | 0.1132 | 0.3673 | 0.0 |
| 0.0769 | 31.0 | 6200 | 0.7021 | 0.4053 | 0.4841 | 0.8771 | nan | 0.8492 | 0.9555 | 0.8176 | 0.8517 | 0.5460 | nan | 0.6977 | 0.8309 | 0.0346 | 0.9528 | 0.1484 | 0.0 | 0.6787 | 0.0428 | 0.7329 | 0.0 | 0.0019 | 0.9380 | 0.1775 | 0.5846 | 0.3760 | 0.3971 | nan | 0.2414 | 0.5944 | 0.5549 | 0.0 | 0.9527 | 0.8520 | 0.9845 | 0.0045 | 0.2571 | 0.4371 | 0.0 | nan | 0.7645 | 0.8833 | 0.6655 | 0.7506 | 0.3847 | nan | 0.5952 | 0.6331 | 0.0236 | 0.8665 | 0.1226 | 0.0 | 0.4309 | 0.0428 | 0.5769 | 0.0 | 0.0018 | 0.7567 | 0.1589 | 0.5088 | 0.3230 | 0.3165 | nan | 0.2053 | 0.4955 | 0.3887 | 0.0 | 0.8882 | 0.7661 | 0.9518 | 0.0039 | 0.1047 | 0.3583 | 0.0 |
| 0.1001 | 31.1 | 6220 | 0.6853 | 0.4093 | 0.4854 | 0.8775 | nan | 0.8416 | 0.9577 | 0.8128 | 0.8535 | 0.5338 | nan | 0.7194 | 0.8196 | 0.0204 | 0.9562 | 0.1343 | 0.0 | 0.6562 | 0.0496 | 0.7100 | 0.0 | 0.0097 | 0.9281 | 0.1991 | 0.6091 | 0.3812 | 0.4154 | nan | 0.2431 | 0.6048 | 0.5839 | 0.0 | 0.9558 | 0.8400 | 0.9842 | 0.0073 | 0.2275 | 0.4778 | 0.0 | nan | 0.7583 | 0.8790 | 0.6821 | 0.7485 | 0.3949 | nan | 0.5976 | 0.6516 | 0.0143 | 0.8642 | 0.1142 | 0.0 | 0.4418 | 0.0496 | 0.5859 | 0.0 | 0.0096 | 0.7646 | 0.1751 | 0.5216 | 0.3246 | 0.3335 | nan | 0.1983 | 0.4977 | 0.3874 | 0.0 | 0.8879 | 0.7664 | 0.9515 | 0.0064 | 0.1118 | 0.3783 | 0.0 |
| 0.0825 | 31.2 | 6240 | 0.7171 | 0.4077 | 0.4846 | 0.8742 | nan | 0.8166 | 0.9577 | 0.8072 | 0.8654 | 0.5347 | nan | 0.7027 | 0.8295 | 0.0439 | 0.9538 | 0.1191 | 0.0 | 0.6460 | 0.0475 | 0.7103 | 0.0 | 0.0139 | 0.9326 | 0.1952 | 0.6114 | 0.3621 | 0.3947 | nan | 0.2462 | 0.6264 | 0.5722 | 0.0 | 0.9531 | 0.8397 | 0.9826 | 0.0105 | 0.2558 | 0.4754 | 0.0 | nan | 0.7417 | 0.8761 | 0.7062 | 0.7212 | 0.3914 | nan | 0.5990 | 0.6442 | 0.0283 | 0.8653 | 0.1020 | 0.0 | 0.4481 | 0.0475 | 0.5948 | 0.0 | 0.0138 | 0.7624 | 0.1714 | 0.5208 | 0.3110 | 0.3095 | nan | 0.2061 | 0.5013 | 0.3833 | 0.0 | 0.8875 | 0.7596 | 0.9522 | 0.0089 | 0.1172 | 0.3768 | 0.0 |
| 0.0719 | 31.3 | 6260 | 0.7227 | 0.4029 | 0.4776 | 0.8724 | nan | 0.8252 | 0.9572 | 0.8001 | 0.8361 | 0.5346 | nan | 0.7171 | 0.8383 | 0.0841 | 0.9517 | 0.1249 | 0.0004 | 0.6682 | 0.0438 | 0.7044 | 0.0 | 0.0090 | 0.9348 | 0.1377 | 0.5864 | 0.3301 | 0.3852 | nan | 0.2310 | 0.5876 | 0.5513 | 0.0 | 0.9574 | 0.8105 | 0.9836 | 0.0052 | 0.1920 | 0.4961 | 0.0 | nan | 0.7465 | 0.8720 | 0.6935 | 0.7138 | 0.3899 | nan | 0.5954 | 0.6378 | 0.0479 | 0.8665 | 0.1055 | 0.0004 | 0.4430 | 0.0438 | 0.5924 | 0.0 | 0.0089 | 0.7560 | 0.1232 | 0.5036 | 0.2936 | 0.3103 | nan | 0.1915 | 0.4936 | 0.3802 | 0.0 | 0.8833 | 0.7456 | 0.9516 | 0.0045 | 0.1095 | 0.3888 | 0.0 |
| 0.0706 | 31.4 | 6280 | 0.7254 | 0.4055 | 0.4860 | 0.8726 | nan | 0.8178 | 0.9526 | 0.8173 | 0.8460 | 0.5517 | nan | 0.7147 | 0.8187 | 0.0855 | 0.9556 | 0.1163 | 0.0048 | 0.6842 | 0.0582 | 0.7201 | 0.0 | 0.0240 | 0.9390 | 0.1200 | 0.5932 | 0.3625 | 0.4347 | nan | 0.2568 | 0.5897 | 0.5849 | 0.0 | 0.9546 | 0.8502 | 0.9816 | 0.0098 | 0.2429 | 0.4652 | 0.0 | nan | 0.7442 | 0.8756 | 0.6588 | 0.7151 | 0.3918 | nan | 0.5943 | 0.6401 | 0.0474 | 0.8644 | 0.0997 | 0.0046 | 0.4438 | 0.0581 | 0.5896 | 0.0 | 0.0238 | 0.7558 | 0.1079 | 0.5100 | 0.3098 | 0.3399 | nan | 0.2168 | 0.4966 | 0.3814 | 0.0 | 0.8874 | 0.7607 | 0.9527 | 0.0083 | 0.1194 | 0.3773 | 0.0 |
| 0.0874 | 31.5 | 6300 | 0.7268 | 0.4020 | 0.4789 | 0.8717 | nan | 0.8095 | 0.9606 | 0.8034 | 0.8561 | 0.5209 | nan | 0.7001 | 0.8063 | 0.0546 | 0.9517 | 0.1006 | 0.0039 | 0.6399 | 0.0730 | 0.7356 | 0.0 | 0.0078 | 0.9383 | 0.1055 | 0.5675 | 0.3776 | 0.4526 | nan | 0.2018 | 0.5976 | 0.5706 | 0.0 | 0.9536 | 0.8297 | 0.9801 | 0.0118 | 0.2338 | 0.4798 | 0.0 | nan | 0.7348 | 0.8737 | 0.6748 | 0.7096 | 0.3915 | nan | 0.5888 | 0.6465 | 0.0313 | 0.8675 | 0.0893 | 0.0037 | 0.4096 | 0.0730 | 0.5995 | 0.0 | 0.0078 | 0.7556 | 0.0964 | 0.5046 | 0.3131 | 0.3490 | nan | 0.1735 | 0.4953 | 0.3811 | 0.0 | 0.8872 | 0.7585 | 0.9528 | 0.0095 | 0.1093 | 0.3771 | 0.0 |
| 0.1078 | 31.6 | 6320 | 0.7038 | 0.4033 | 0.4803 | 0.8737 | nan | 0.8193 | 0.9549 | 0.8142 | 0.8598 | 0.5280 | nan | 0.7177 | 0.8195 | 0.0581 | 0.9601 | 0.1067 | 0.0096 | 0.6746 | 0.0319 | 0.7294 | 0.0 | 0.0 | 0.9362 | 0.1320 | 0.6057 | 0.3791 | 0.3374 | nan | 0.2643 | 0.5980 | 0.5489 | 0.0 | 0.9565 | 0.8236 | 0.9834 | 0.0105 | 0.2318 | 0.4768 | 0.0 | nan | 0.7444 | 0.8757 | 0.6466 | 0.7298 | 0.4080 | nan | 0.5887 | 0.6389 | 0.0335 | 0.8631 | 0.0937 | 0.0093 | 0.4737 | 0.0319 | 0.5699 | 0.0 | 0.0 | 0.7627 | 0.1200 | 0.5175 | 0.3115 | 0.2830 | nan | 0.2302 | 0.4973 | 0.3836 | 0.0 | 0.8837 | 0.7454 | 0.9522 | 0.0086 | 0.1224 | 0.3814 | 0.0 |
| 0.0711 | 31.7 | 6340 | 0.7174 | 0.4066 | 0.4866 | 0.8736 | nan | 0.8243 | 0.9516 | 0.8095 | 0.8656 | 0.5432 | nan | 0.7135 | 0.8475 | 0.0550 | 0.9579 | 0.1080 | 0.0546 | 0.7416 | 0.0321 | 0.7189 | 0.0 | 0.0 | 0.9370 | 0.1802 | 0.6009 | 0.3674 | 0.3297 | nan | 0.2785 | 0.6281 | 0.5601 | 0.0 | 0.9530 | 0.8254 | 0.9845 | 0.0107 | 0.2219 | 0.4711 | 0.0 | nan | 0.7459 | 0.8751 | 0.6467 | 0.7309 | 0.3992 | nan | 0.5923 | 0.6307 | 0.0327 | 0.8658 | 0.0929 | 0.0520 | 0.4931 | 0.0321 | 0.5788 | 0.0 | 0.0 | 0.7610 | 0.1603 | 0.5105 | 0.3171 | 0.2766 | nan | 0.2344 | 0.5053 | 0.3758 | 0.0 | 0.8858 | 0.7519 | 0.9518 | 0.0092 | 0.1202 | 0.3818 | 0.0 |
| 0.0753 | 31.8 | 6360 | 0.7175 | 0.4085 | 0.4870 | 0.8750 | nan | 0.8346 | 0.9524 | 0.7899 | 0.8642 | 0.5514 | nan | 0.7120 | 0.8405 | 0.0501 | 0.9590 | 0.1016 | 0.0091 | 0.6982 | 0.0294 | 0.6857 | 0.0 | 0.0012 | 0.9302 | 0.1419 | 0.6116 | 0.3700 | 0.5232 | nan | 0.2718 | 0.6032 | 0.5396 | 0.0 | 0.9547 | 0.8306 | 0.9835 | 0.0090 | 0.2432 | 0.4908 | 0.0 | nan | 0.7515 | 0.8752 | 0.6752 | 0.7407 | 0.3913 | nan | 0.5938 | 0.6172 | 0.0311 | 0.8641 | 0.0885 | 0.0088 | 0.5006 | 0.0294 | 0.5643 | 0.0 | 0.0012 | 0.7671 | 0.1301 | 0.5224 | 0.3169 | 0.3806 | nan | 0.2300 | 0.5022 | 0.3785 | 0.0 | 0.8872 | 0.7572 | 0.9525 | 0.0075 | 0.1284 | 0.3792 | 0.0 |
| 0.0705 | 31.9 | 6380 | 0.7062 | 0.4112 | 0.4900 | 0.8763 | nan | 0.8427 | 0.9512 | 0.7884 | 0.8599 | 0.5317 | nan | 0.7283 | 0.8216 | 0.0484 | 0.9527 | 0.1100 | 0.0072 | 0.6203 | 0.0666 | 0.7012 | 0.0 | 0.0029 | 0.9403 | 0.1995 | 0.6033 | 0.3720 | 0.5112 | nan | 0.3261 | 0.6278 | 0.5613 | 0.0 | 0.9518 | 0.8619 | 0.9833 | 0.0094 | 0.2489 | 0.4499 | 0.0 | nan | 0.7558 | 0.8745 | 0.6857 | 0.7456 | 0.3975 | nan | 0.5863 | 0.6238 | 0.0294 | 0.8666 | 0.0955 | 0.0069 | 0.4483 | 0.0666 | 0.5581 | 0.0 | 0.0029 | 0.7610 | 0.1724 | 0.5200 | 0.3221 | 0.3774 | nan | 0.2558 | 0.5087 | 0.3873 | 0.0 | 0.8909 | 0.7734 | 0.9525 | 0.0079 | 0.1195 | 0.3665 | 0.0 |
| 0.0822 | 32.0 | 6400 | 0.6943 | 0.4123 | 0.4897 | 0.8763 | nan | 0.8249 | 0.9565 | 0.8046 | 0.8503 | 0.5427 | nan | 0.7243 | 0.8296 | 0.0390 | 0.9608 | 0.0953 | 0.0 | 0.6325 | 0.0404 | 0.6859 | 0.0 | 0.0005 | 0.9273 | 0.2207 | 0.6202 | 0.4016 | 0.4459 | nan | 0.3713 | 0.6071 | 0.5696 | 0.0 | 0.9564 | 0.8529 | 0.9832 | 0.0115 | 0.2157 | 0.4988 | 0.0 | nan | 0.7500 | 0.8751 | 0.6614 | 0.7480 | 0.4049 | nan | 0.5941 | 0.6398 | 0.0240 | 0.8622 | 0.0844 | 0.0 | 0.4865 | 0.0404 | 0.5522 | 0.0 | 0.0005 | 0.7696 | 0.1895 | 0.5334 | 0.3299 | 0.3518 | nan | 0.2886 | 0.5030 | 0.3887 | 0.0 | 0.8885 | 0.7667 | 0.9526 | 0.0091 | 0.1140 | 0.3856 | 0.0 |
| 0.0668 | 32.1 | 6420 | 0.7170 | 0.4117 | 0.4908 | 0.8756 | nan | 0.8303 | 0.9569 | 0.8044 | 0.8574 | 0.5256 | nan | 0.7134 | 0.8253 | 0.0717 | 0.9551 | 0.1055 | 0.0004 | 0.6658 | 0.0383 | 0.6925 | 0.0 | 0.0016 | 0.9343 | 0.2526 | 0.6060 | 0.3833 | 0.4895 | nan | 0.3324 | 0.6097 | 0.5615 | 0.0 | 0.9505 | 0.8486 | 0.9849 | 0.0083 | 0.2182 | 0.4814 | 0.0 | nan | 0.7487 | 0.8756 | 0.6546 | 0.7416 | 0.4008 | nan | 0.5996 | 0.6438 | 0.0427 | 0.8637 | 0.0916 | 0.0004 | 0.4797 | 0.0383 | 0.5574 | 0.0 | 0.0016 | 0.7635 | 0.2075 | 0.5271 | 0.3201 | 0.3666 | nan | 0.2627 | 0.4996 | 0.3834 | 0.0 | 0.8883 | 0.7686 | 0.9515 | 0.0066 | 0.1079 | 0.3810 | 0.0 |
| 0.109 | 32.2 | 6440 | 0.6935 | 0.4073 | 0.4896 | 0.8756 | nan | 0.8346 | 0.9530 | 0.7909 | 0.8605 | 0.5453 | nan | 0.7193 | 0.8166 | 0.1420 | 0.9557 | 0.1090 | 0.0 | 0.6780 | 0.0411 | 0.7133 | 0.0 | 0.0003 | 0.9332 | 0.1200 | 0.6218 | 0.3600 | 0.4940 | nan | 0.2634 | 0.6039 | 0.5729 | 0.0 | 0.9543 | 0.8559 | 0.9829 | 0.0058 | 0.2842 | 0.4566 | 0.0 | nan | 0.7505 | 0.8765 | 0.6700 | 0.7381 | 0.3989 | nan | 0.6023 | 0.6275 | 0.0790 | 0.8640 | 0.0946 | 0.0 | 0.4644 | 0.0411 | 0.5404 | 0.0 | 0.0003 | 0.7660 | 0.1098 | 0.5411 | 0.3056 | 0.3728 | nan | 0.2263 | 0.4973 | 0.3782 | 0.0 | 0.8881 | 0.7704 | 0.9521 | 0.0049 | 0.1110 | 0.3630 | 0.0 |
| 0.1092 | 32.3 | 6460 | 0.6913 | 0.4103 | 0.4887 | 0.8780 | nan | 0.8523 | 0.9547 | 0.7885 | 0.8480 | 0.5476 | nan | 0.7248 | 0.8243 | 0.0524 | 0.9527 | 0.1199 | 0.0 | 0.6414 | 0.0525 | 0.6923 | 0.0 | 0.0012 | 0.9293 | 0.1572 | 0.6457 | 0.3509 | 0.5211 | nan | 0.2821 | 0.6106 | 0.5562 | 0.0 | 0.9529 | 0.8452 | 0.9834 | 0.0110 | 0.2626 | 0.4776 | 0.0 | nan | 0.7633 | 0.8792 | 0.6827 | 0.7520 | 0.3946 | nan | 0.5972 | 0.6374 | 0.0305 | 0.8661 | 0.1029 | 0.0 | 0.4491 | 0.0525 | 0.5630 | 0.0 | 0.0012 | 0.7682 | 0.1397 | 0.5439 | 0.3047 | 0.3765 | nan | 0.2310 | 0.5023 | 0.3822 | 0.0 | 0.8886 | 0.7707 | 0.9526 | 0.0090 | 0.1141 | 0.3733 | 0.0 |
| 0.0882 | 32.4 | 6480 | 0.6950 | 0.4128 | 0.4911 | 0.8775 | nan | 0.8384 | 0.9547 | 0.8099 | 0.8561 | 0.5512 | nan | 0.7151 | 0.8269 | 0.0289 | 0.9541 | 0.0924 | 0.0 | 0.6422 | 0.0898 | 0.6895 | 0.0 | 0.0008 | 0.9313 | 0.2086 | 0.6159 | 0.4136 | 0.4909 | nan | 0.2832 | 0.6301 | 0.5775 | 0.0 | 0.9506 | 0.8589 | 0.9833 | 0.0161 | 0.2169 | 0.4886 | 0.0 | nan | 0.7574 | 0.8779 | 0.6794 | 0.7450 | 0.4003 | nan | 0.5922 | 0.6370 | 0.0167 | 0.8665 | 0.0826 | 0.0 | 0.4593 | 0.0898 | 0.5692 | 0.0 | 0.0008 | 0.7670 | 0.1782 | 0.5370 | 0.3388 | 0.3688 | nan | 0.2250 | 0.5074 | 0.3948 | 0.0 | 0.8908 | 0.7754 | 0.9528 | 0.0123 | 0.1047 | 0.3829 | 0.0 |
| 0.1078 | 32.5 | 6500 | 0.7008 | 0.4077 | 0.4851 | 0.8768 | nan | 0.8260 | 0.9568 | 0.8089 | 0.8641 | 0.5325 | nan | 0.7276 | 0.8352 | 0.0344 | 0.9569 | 0.0890 | 0.0 | 0.6498 | 0.0799 | 0.7141 | 0.0 | 0.0003 | 0.9332 | 0.1337 | 0.6248 | 0.4004 | 0.4473 | nan | 0.2084 | 0.6083 | 0.5986 | 0.0 | 0.9546 | 0.8545 | 0.9826 | 0.0117 | 0.2128 | 0.4765 | 0.0 | nan | 0.7537 | 0.8767 | 0.6901 | 0.7304 | 0.4085 | nan | 0.5935 | 0.6295 | 0.0208 | 0.8652 | 0.0800 | 0.0 | 0.4612 | 0.0799 | 0.5707 | 0.0 | 0.0003 | 0.7653 | 0.1212 | 0.5288 | 0.3344 | 0.3514 | nan | 0.1775 | 0.5027 | 0.3925 | 0.0 | 0.8896 | 0.7701 | 0.9528 | 0.0095 | 0.1122 | 0.3797 | 0.0 |
| 0.0839 | 32.6 | 6520 | 0.6931 | 0.4070 | 0.4811 | 0.8762 | nan | 0.8256 | 0.9578 | 0.7687 | 0.8671 | 0.5476 | nan | 0.7239 | 0.8142 | 0.0359 | 0.9583 | 0.0924 | 0.0 | 0.6538 | 0.0468 | 0.6923 | 0.0 | 0.0052 | 0.9296 | 0.1578 | 0.6324 | 0.3887 | 0.4013 | nan | 0.2183 | 0.6161 | 0.5845 | 0.0 | 0.9555 | 0.8465 | 0.9810 | 0.0069 | 0.2266 | 0.4620 | 0.0 | nan | 0.7481 | 0.8773 | 0.7198 | 0.7139 | 0.4052 | nan | 0.5951 | 0.6356 | 0.0227 | 0.8628 | 0.0832 | 0.0 | 0.4647 | 0.0468 | 0.5635 | 0.0 | 0.0052 | 0.7671 | 0.1405 | 0.5332 | 0.3266 | 0.3282 | nan | 0.1798 | 0.5013 | 0.3971 | 0.0 | 0.8898 | 0.7736 | 0.9531 | 0.0060 | 0.1144 | 0.3709 | 0.0 |
| 0.0993 | 32.7 | 6540 | 0.7215 | 0.4052 | 0.4845 | 0.8741 | nan | 0.8252 | 0.9563 | 0.7900 | 0.8293 | 0.5540 | nan | 0.7042 | 0.8283 | 0.1096 | 0.9620 | 0.0997 | 0.0 | 0.6786 | 0.0450 | 0.6849 | 0.0 | 0.0052 | 0.9311 | 0.1770 | 0.6198 | 0.3774 | 0.3792 | nan | 0.1926 | 0.6116 | 0.5805 | 0.0 | 0.9537 | 0.8576 | 0.9825 | 0.0017 | 0.2979 | 0.4689 | 0.0 | nan | 0.7450 | 0.8710 | 0.6980 | 0.7201 | 0.4009 | nan | 0.5858 | 0.6205 | 0.0619 | 0.8594 | 0.0887 | 0.0 | 0.4599 | 0.0450 | 0.5572 | 0.0 | 0.0052 | 0.7674 | 0.1559 | 0.5378 | 0.3225 | 0.3122 | nan | 0.1615 | 0.5010 | 0.3915 | 0.0 | 0.8896 | 0.7746 | 0.9524 | 0.0015 | 0.1138 | 0.3657 | 0.0 |
| 0.0767 | 32.8 | 6560 | 0.7154 | 0.4119 | 0.4878 | 0.8752 | nan | 0.8259 | 0.9534 | 0.8095 | 0.8543 | 0.5497 | nan | 0.6975 | 0.7829 | 0.1343 | 0.9474 | 0.1097 | 0.0 | 0.6268 | 0.0537 | 0.7137 | 0.0 | 0.0033 | 0.9335 | 0.1615 | 0.6321 | 0.4182 | 0.4522 | nan | 0.2498 | 0.6249 | 0.5742 | 0.0 | 0.9536 | 0.8550 | 0.9824 | 0.0084 | 0.2087 | 0.4922 | 0.0 | nan | 0.7454 | 0.8740 | 0.6896 | 0.7329 | 0.3946 | nan | 0.5824 | 0.6412 | 0.0629 | 0.8653 | 0.0960 | 0.0 | 0.4605 | 0.0537 | 0.5902 | 0.0 | 0.0033 | 0.7651 | 0.1420 | 0.5385 | 0.3385 | 0.3546 | nan | 0.2151 | 0.5038 | 0.3986 | 0.0 | 0.8898 | 0.7714 | 0.9525 | 0.0068 | 0.1229 | 0.3897 | 0.0 |
| 0.07 | 32.9 | 6580 | 0.7151 | 0.4086 | 0.4868 | 0.8742 | nan | 0.8154 | 0.9566 | 0.8130 | 0.8613 | 0.5386 | nan | 0.7003 | 0.8201 | 0.0959 | 0.9565 | 0.1221 | 0.0079 | 0.6187 | 0.0545 | 0.7487 | 0.0 | 0.0009 | 0.9347 | 0.2350 | 0.6063 | 0.3738 | 0.4407 | nan | 0.2614 | 0.6001 | 0.5478 | 0.0 | 0.9557 | 0.8371 | 0.9858 | 0.0061 | 0.2000 | 0.4813 | 0.0 | nan | 0.7412 | 0.8754 | 0.6716 | 0.7269 | 0.3948 | nan | 0.5877 | 0.6248 | 0.0466 | 0.8617 | 0.1046 | 0.0078 | 0.4182 | 0.0545 | 0.6030 | 0.0 | 0.0009 | 0.7639 | 0.1969 | 0.5336 | 0.3179 | 0.3477 | nan | 0.2122 | 0.4997 | 0.3894 | 0.0 | 0.8867 | 0.7627 | 0.9507 | 0.0050 | 0.1037 | 0.3853 | 0.0 |
| 0.0869 | 33.0 | 6600 | 0.7291 | 0.4098 | 0.4886 | 0.8733 | nan | 0.8000 | 0.9569 | 0.8014 | 0.8654 | 0.5538 | nan | 0.7208 | 0.8246 | 0.0684 | 0.9506 | 0.1201 | 0.0254 | 0.6151 | 0.0426 | 0.7512 | 0.0 | 0.0015 | 0.9316 | 0.2126 | 0.6254 | 0.3746 | 0.4522 | nan | 0.2562 | 0.6426 | 0.5521 | 0.0 | 0.9547 | 0.8444 | 0.9837 | 0.0108 | 0.2265 | 0.4712 | 0.0 | nan | 0.7315 | 0.8745 | 0.6940 | 0.7067 | 0.4021 | nan | 0.5896 | 0.6237 | 0.0401 | 0.8644 | 0.1045 | 0.0251 | 0.4398 | 0.0425 | 0.6079 | 0.0 | 0.0015 | 0.7688 | 0.1816 | 0.5406 | 0.3177 | 0.3504 | nan | 0.1928 | 0.5069 | 0.3942 | 0.0 | 0.8878 | 0.7669 | 0.9524 | 0.0084 | 0.1167 | 0.3818 | 0.0 |
| 0.0907 | 33.1 | 6620 | 0.7083 | 0.4140 | 0.4938 | 0.8760 | nan | 0.8313 | 0.9492 | 0.7967 | 0.8513 | 0.5814 | nan | 0.7147 | 0.8239 | 0.0723 | 0.9481 | 0.1296 | 0.0329 | 0.6339 | 0.0548 | 0.7398 | 0.0 | 0.0030 | 0.9302 | 0.2007 | 0.6677 | 0.3820 | 0.4066 | nan | 0.3267 | 0.6150 | 0.5558 | 0.0 | 0.9557 | 0.8548 | 0.9816 | 0.0128 | 0.2743 | 0.4734 | 0.0 | nan | 0.7510 | 0.8790 | 0.6874 | 0.7512 | 0.3778 | nan | 0.5872 | 0.6345 | 0.0395 | 0.8648 | 0.1112 | 0.0328 | 0.4481 | 0.0548 | 0.5977 | 0.0 | 0.0030 | 0.7735 | 0.1747 | 0.5569 | 0.3216 | 0.3227 | nan | 0.2508 | 0.5051 | 0.3969 | 0.0 | 0.8878 | 0.7636 | 0.9535 | 0.0094 | 0.1310 | 0.3795 | 0.0 |
| 0.0662 | 33.2 | 6640 | 0.7039 | 0.4131 | 0.4916 | 0.8772 | nan | 0.8298 | 0.9586 | 0.8088 | 0.8580 | 0.5375 | nan | 0.7101 | 0.7987 | 0.1955 | 0.9538 | 0.1375 | 0.0 | 0.6408 | 0.0635 | 0.7458 | 0.0 | 0.0037 | 0.9299 | 0.1396 | 0.6343 | 0.3837 | 0.4410 | nan | 0.2617 | 0.6124 | 0.5711 | 0.0 | 0.9535 | 0.8593 | 0.9840 | 0.0078 | 0.2362 | 0.4741 | 0.0 | nan | 0.7508 | 0.8788 | 0.6696 | 0.7440 | 0.3979 | nan | 0.5928 | 0.6328 | 0.0921 | 0.8642 | 0.1173 | 0.0 | 0.4648 | 0.0635 | 0.5916 | 0.0 | 0.0037 | 0.7705 | 0.1264 | 0.5459 | 0.3223 | 0.3503 | nan | 0.2286 | 0.5051 | 0.3944 | 0.0 | 0.8893 | 0.7750 | 0.9522 | 0.0059 | 0.1141 | 0.3754 | 0.0 |
| 0.0726 | 33.3 | 6660 | 0.7065 | 0.4084 | 0.4842 | 0.8759 | nan | 0.8363 | 0.9559 | 0.8071 | 0.8274 | 0.5425 | nan | 0.7215 | 0.8516 | 0.0306 | 0.9505 | 0.1148 | 0.0 | 0.6430 | 0.0477 | 0.6927 | 0.0 | 0.0025 | 0.9344 | 0.1697 | 0.6262 | 0.3146 | 0.4589 | nan | 0.2954 | 0.6111 | 0.5492 | 0.0 | 0.9557 | 0.8535 | 0.9834 | 0.0044 | 0.2195 | 0.4939 | 0.0 | nan | 0.7537 | 0.8756 | 0.6706 | 0.7371 | 0.4011 | nan | 0.5870 | 0.6198 | 0.0206 | 0.8659 | 0.0991 | 0.0 | 0.4766 | 0.0477 | 0.5819 | 0.0 | 0.0025 | 0.7675 | 0.1532 | 0.5420 | 0.2811 | 0.3503 | nan | 0.2542 | 0.5009 | 0.3853 | 0.0 | 0.8869 | 0.7699 | 0.9523 | 0.0035 | 0.1047 | 0.3782 | 0.0 |
| 0.0765 | 33.4 | 6680 | 0.7081 | 0.4149 | 0.4950 | 0.8765 | nan | 0.8328 | 0.9578 | 0.8029 | 0.8399 | 0.5425 | nan | 0.7116 | 0.8201 | 0.0353 | 0.9547 | 0.1215 | 0.0 | 0.6405 | 0.0628 | 0.7235 | 0.0 | 0.0085 | 0.9322 | 0.1502 | 0.6530 | 0.4228 | 0.6183 | nan | 0.3093 | 0.6097 | 0.5750 | 0.0 | 0.9516 | 0.8377 | 0.9840 | 0.0098 | 0.2520 | 0.4802 | 0.0 | nan | 0.7528 | 0.8742 | 0.6729 | 0.7404 | 0.4038 | nan | 0.5890 | 0.6490 | 0.0240 | 0.8631 | 0.1061 | 0.0 | 0.4733 | 0.0628 | 0.5778 | 0.0 | 0.0085 | 0.7719 | 0.1361 | 0.5468 | 0.3464 | 0.4091 | nan | 0.2539 | 0.5036 | 0.3912 | 0.0 | 0.8882 | 0.7625 | 0.9521 | 0.0076 | 0.1290 | 0.3809 | 0.0 |
| 0.0897 | 33.5 | 6700 | 0.7159 | 0.4094 | 0.4878 | 0.8751 | nan | 0.8256 | 0.9557 | 0.8043 | 0.8559 | 0.5403 | nan | 0.7293 | 0.8425 | 0.0251 | 0.9597 | 0.0810 | 0.0 | 0.6896 | 0.0414 | 0.7049 | 0.0 | 0.0086 | 0.9321 | 0.1699 | 0.6306 | 0.3758 | 0.5154 | nan | 0.2817 | 0.6137 | 0.5449 | 0.0 | 0.9536 | 0.8209 | 0.9839 | 0.0041 | 0.2244 | 0.4935 | 0.0 | nan | 0.7496 | 0.8754 | 0.6659 | 0.7466 | 0.3971 | nan | 0.5921 | 0.6412 | 0.0174 | 0.8602 | 0.0730 | 0.0 | 0.4721 | 0.0414 | 0.5749 | 0.0 | 0.0086 | 0.7695 | 0.1523 | 0.5378 | 0.3204 | 0.3785 | nan | 0.2446 | 0.5059 | 0.3870 | 0.0 | 0.8839 | 0.7483 | 0.9526 | 0.0031 | 0.1198 | 0.3810 | 0.0 |
| 0.0776 | 33.6 | 6720 | 0.7226 | 0.4115 | 0.4950 | 0.8749 | nan | 0.8214 | 0.9543 | 0.8080 | 0.8569 | 0.5551 | nan | 0.7005 | 0.8607 | 0.0413 | 0.9565 | 0.0964 | 0.0 | 0.6873 | 0.0378 | 0.7132 | 0.0 | 0.0026 | 0.9314 | 0.2169 | 0.6416 | 0.3864 | 0.5312 | nan | 0.3206 | 0.6181 | 0.5986 | 0.0 | 0.9525 | 0.8474 | 0.9843 | 0.0084 | 0.2475 | 0.4630 | 0.0 | nan | 0.7465 | 0.8753 | 0.6633 | 0.7468 | 0.3910 | nan | 0.5916 | 0.6261 | 0.0278 | 0.8626 | 0.0849 | 0.0 | 0.4609 | 0.0378 | 0.5827 | 0.0 | 0.0026 | 0.7684 | 0.1841 | 0.5377 | 0.3260 | 0.3794 | nan | 0.2578 | 0.5090 | 0.3896 | 0.0 | 0.8880 | 0.7629 | 0.9518 | 0.0058 | 0.1302 | 0.3758 | 0.0 |
| 0.1109 | 33.7 | 6740 | 0.7219 | 0.4140 | 0.4975 | 0.8759 | nan | 0.8275 | 0.9576 | 0.8050 | 0.8541 | 0.5187 | nan | 0.7415 | 0.8474 | 0.0535 | 0.9539 | 0.1190 | 0.0 | 0.7124 | 0.0666 | 0.7268 | 0.0 | 0.0017 | 0.9352 | 0.1979 | 0.6278 | 0.3975 | 0.5485 | nan | 0.3189 | 0.6067 | 0.5829 | 0.0 | 0.9535 | 0.8316 | 0.9827 | 0.0100 | 0.2786 | 0.4640 | 0.0 | nan | 0.7494 | 0.8761 | 0.6653 | 0.7520 | 0.3944 | nan | 0.5889 | 0.6479 | 0.0342 | 0.8640 | 0.1020 | 0.0 | 0.4684 | 0.0666 | 0.5976 | 0.0 | 0.0017 | 0.7665 | 0.1717 | 0.5348 | 0.3320 | 0.3754 | nan | 0.2489 | 0.5057 | 0.3869 | 0.0 | 0.8872 | 0.7600 | 0.9523 | 0.0073 | 0.1342 | 0.3753 | 0.0 |
| 0.0714 | 33.8 | 6760 | 0.7040 | 0.4191 | 0.5104 | 0.8769 | nan | 0.8311 | 0.9574 | 0.8046 | 0.8543 | 0.5370 | nan | 0.7318 | 0.8065 | 0.3629 | 0.9544 | 0.1247 | 0.0 | 0.7249 | 0.1480 | 0.7440 | 0.0 | 0.0020 | 0.9248 | 0.2265 | 0.6234 | 0.4050 | 0.5126 | nan | 0.2951 | 0.6125 | 0.5795 | 0.0 | 0.9579 | 0.8385 | 0.9825 | 0.0048 | 0.3099 | 0.4775 | 0.0 | nan | 0.7546 | 0.8781 | 0.6648 | 0.7504 | 0.4058 | nan | 0.5879 | 0.6292 | 0.1124 | 0.8647 | 0.1069 | 0.0 | 0.4684 | 0.1479 | 0.5998 | 0.0 | 0.0020 | 0.7721 | 0.1987 | 0.5353 | 0.3365 | 0.3703 | nan | 0.2205 | 0.5051 | 0.3877 | 0.0 | 0.8865 | 0.7618 | 0.9521 | 0.0035 | 0.1365 | 0.3713 | 0.0 |
| 0.0615 | 33.9 | 6780 | 0.7146 | 0.4125 | 0.4896 | 0.8775 | nan | 0.8301 | 0.9632 | 0.7929 | 0.8508 | 0.5108 | nan | 0.7121 | 0.8431 | 0.0684 | 0.9536 | 0.1265 | 0.0 | 0.7271 | 0.0748 | 0.6713 | 0.0 | 0.0 | 0.9263 | 0.2143 | 0.6247 | 0.3687 | 0.4975 | nan | 0.2420 | 0.6167 | 0.5793 | 0.0 | 0.9579 | 0.8454 | 0.9836 | 0.0041 | 0.1795 | 0.5033 | 0.0 | nan | 0.7504 | 0.8762 | 0.6669 | 0.7404 | 0.4170 | nan | 0.5865 | 0.6454 | 0.0385 | 0.8639 | 0.1077 | 0.0 | 0.4739 | 0.0748 | 0.5780 | 0.0 | 0.0 | 0.7720 | 0.1918 | 0.5342 | 0.3126 | 0.3662 | nan | 0.1871 | 0.5051 | 0.3988 | 0.0 | 0.8858 | 0.7658 | 0.9516 | 0.0035 | 0.1193 | 0.3869 | 0.0 |
| 0.0893 | 34.0 | 6800 | 0.7000 | 0.4168 | 0.4974 | 0.8783 | nan | 0.8374 | 0.9617 | 0.7923 | 0.8609 | 0.5329 | nan | 0.7031 | 0.7904 | 0.1178 | 0.9539 | 0.1435 | 0.0022 | 0.8039 | 0.0996 | 0.7168 | 0.0 | 0.0020 | 0.9357 | 0.2202 | 0.6263 | 0.3937 | 0.5509 | nan | 0.2026 | 0.6028 | 0.5671 | 0.0 | 0.9493 | 0.8424 | 0.9830 | 0.0082 | 0.2195 | 0.4952 | 0.0 | nan | 0.7542 | 0.8793 | 0.6748 | 0.7430 | 0.4192 | nan | 0.5891 | 0.6673 | 0.0697 | 0.8621 | 0.1203 | 0.0022 | 0.4841 | 0.0995 | 0.5736 | 0.0 | 0.0020 | 0.7679 | 0.1916 | 0.5293 | 0.3269 | 0.3909 | nan | 0.1635 | 0.5054 | 0.3861 | 0.0 | 0.8889 | 0.7676 | 0.9519 | 0.0068 | 0.1331 | 0.3868 | 0.0 |
| 0.0643 | 34.1 | 6820 | 0.7305 | 0.4052 | 0.4840 | 0.8741 | nan | 0.8217 | 0.9635 | 0.8015 | 0.8551 | 0.4787 | nan | 0.7021 | 0.8348 | 0.0715 | 0.9554 | 0.1245 | 0.0 | 0.6547 | 0.0495 | 0.7125 | 0.0 | 0.0079 | 0.9311 | 0.2483 | 0.5820 | 0.3625 | 0.4930 | nan | 0.1410 | 0.6551 | 0.5725 | 0.0 | 0.9555 | 0.8247 | 0.9831 | 0.0033 | 0.2233 | 0.4796 | 0.0 | nan | 0.7460 | 0.8727 | 0.6596 | 0.7339 | 0.3828 | nan | 0.5858 | 0.6495 | 0.0449 | 0.8623 | 0.1080 | 0.0 | 0.4414 | 0.0495 | 0.5549 | 0.0 | 0.0079 | 0.7635 | 0.2121 | 0.5171 | 0.3097 | 0.3602 | nan | 0.1169 | 0.5045 | 0.3949 | 0.0 | 0.8853 | 0.7568 | 0.9522 | 0.0027 | 0.1130 | 0.3797 | 0.0 |
| 0.0835 | 34.2 | 6840 | 0.7173 | 0.4094 | 0.4836 | 0.8773 | nan | 0.8376 | 0.9589 | 0.7892 | 0.8537 | 0.5312 | nan | 0.7159 | 0.8236 | 0.0417 | 0.9520 | 0.1248 | 0.0 | 0.6412 | 0.0239 | 0.6903 | 0.0 | 0.0034 | 0.9344 | 0.1832 | 0.6327 | 0.3625 | 0.5151 | nan | 0.2228 | 0.6314 | 0.5617 | 0.0 | 0.9543 | 0.8310 | 0.9842 | 0.0093 | 0.1743 | 0.4925 | 0.0 | nan | 0.7552 | 0.8780 | 0.6802 | 0.7442 | 0.4123 | nan | 0.5862 | 0.6551 | 0.0259 | 0.8620 | 0.1086 | 0.0 | 0.4597 | 0.0239 | 0.5638 | 0.0 | 0.0034 | 0.7681 | 0.1653 | 0.5335 | 0.3098 | 0.3808 | nan | 0.1794 | 0.5121 | 0.3907 | 0.0 | 0.8856 | 0.7535 | 0.9520 | 0.0069 | 0.1149 | 0.3910 | 0.0 |
| 0.0742 | 34.3 | 6860 | 0.7048 | 0.4105 | 0.4888 | 0.8782 | nan | 0.8362 | 0.9591 | 0.8005 | 0.8552 | 0.5264 | nan | 0.7207 | 0.8449 | 0.0987 | 0.9547 | 0.1075 | 0.0 | 0.6571 | 0.0278 | 0.6951 | 0.0 | 0.0081 | 0.9278 | 0.1670 | 0.6597 | 0.3732 | 0.5320 | nan | 0.2193 | 0.6198 | 0.5725 | 0.0 | 0.9523 | 0.8574 | 0.9834 | 0.0013 | 0.1958 | 0.4896 | 0.0 | nan | 0.7562 | 0.8790 | 0.6689 | 0.7413 | 0.4090 | nan | 0.5854 | 0.6317 | 0.0561 | 0.8609 | 0.0946 | 0.0 | 0.4557 | 0.0278 | 0.5638 | 0.0 | 0.0081 | 0.7737 | 0.1521 | 0.5446 | 0.3168 | 0.3893 | nan | 0.1820 | 0.5130 | 0.3914 | 0.0 | 0.8888 | 0.7689 | 0.9520 | 0.0010 | 0.1353 | 0.3876 | 0.0 |
| 0.1168 | 34.4 | 6880 | 0.6841 | 0.4150 | 0.4977 | 0.8794 | nan | 0.8529 | 0.9527 | 0.8020 | 0.8523 | 0.5574 | nan | 0.7265 | 0.8365 | 0.0637 | 0.9550 | 0.1016 | 0.0 | 0.6778 | 0.0272 | 0.7031 | 0.0 | 0.0212 | 0.9231 | 0.2925 | 0.6511 | 0.3985 | 0.5207 | nan | 0.2553 | 0.6245 | 0.5977 | 0.0 | 0.9545 | 0.8385 | 0.9839 | 0.0018 | 0.2349 | 0.5202 | 0.0 | nan | 0.7699 | 0.8812 | 0.6835 | 0.7501 | 0.3954 | nan | 0.5866 | 0.6470 | 0.0383 | 0.8615 | 0.0893 | 0.0 | 0.4509 | 0.0272 | 0.5621 | 0.0 | 0.0212 | 0.7771 | 0.2466 | 0.5537 | 0.3301 | 0.3799 | nan | 0.2045 | 0.5060 | 0.3862 | 0.0 | 0.8881 | 0.7639 | 0.9515 | 0.0015 | 0.1364 | 0.3917 | 0.0 |
| 0.0595 | 34.5 | 6900 | 0.7130 | 0.4101 | 0.4904 | 0.8775 | nan | 0.8442 | 0.9532 | 0.8013 | 0.8490 | 0.5569 | nan | 0.7149 | 0.8149 | 0.1256 | 0.9551 | 0.1126 | 0.0 | 0.6683 | 0.0307 | 0.7014 | 0.0 | 0.0049 | 0.9346 | 0.2040 | 0.6351 | 0.3813 | 0.4835 | nan | 0.2342 | 0.6111 | 0.5948 | 0.0 | 0.9543 | 0.8439 | 0.9837 | 0.0032 | 0.2197 | 0.4776 | 0.0 | nan | 0.7597 | 0.8772 | 0.6766 | 0.7504 | 0.3995 | nan | 0.5850 | 0.6392 | 0.0631 | 0.8600 | 0.0989 | 0.0 | 0.4336 | 0.0307 | 0.5614 | 0.0 | 0.0049 | 0.7709 | 0.1801 | 0.5459 | 0.3220 | 0.3608 | nan | 0.1852 | 0.5028 | 0.3971 | 0.0 | 0.8874 | 0.7644 | 0.9520 | 0.0026 | 0.1309 | 0.3810 | 0.0 |
| 0.0885 | 34.6 | 6920 | 0.7065 | 0.4110 | 0.4956 | 0.8773 | nan | 0.8436 | 0.9571 | 0.7963 | 0.8438 | 0.5365 | nan | 0.7003 | 0.8339 | 0.1525 | 0.9518 | 0.1117 | 0.0 | 0.6927 | 0.0622 | 0.7234 | 0.0 | 0.0065 | 0.9294 | 0.2014 | 0.6350 | 0.3897 | 0.5032 | nan | 0.2568 | 0.6164 | 0.5865 | 0.0 | 0.9536 | 0.8522 | 0.9817 | 0.0117 | 0.2515 | 0.4782 | 0.0 | nan | 0.7598 | 0.8766 | 0.6939 | 0.7435 | 0.4051 | nan | 0.5845 | 0.6021 | 0.0802 | 0.8648 | 0.0967 | 0.0 | 0.4335 | 0.0622 | 0.5517 | 0.0 | 0.0065 | 0.7722 | 0.1793 | 0.5491 | 0.3233 | 0.3673 | nan | 0.1997 | 0.5005 | 0.3978 | 0.0 | 0.8879 | 0.7661 | 0.9532 | 0.0080 | 0.1146 | 0.3710 | 0.0 |
| 0.0715 | 34.7 | 6940 | 0.7091 | 0.4125 | 0.4943 | 0.8776 | nan | 0.8471 | 0.9560 | 0.7982 | 0.8358 | 0.5498 | nan | 0.7191 | 0.8264 | 0.0810 | 0.9588 | 0.1080 | 0.0 | 0.6942 | 0.0434 | 0.6989 | 0.0 | 0.0072 | 0.9267 | 0.2175 | 0.6463 | 0.3871 | 0.4835 | nan | 0.3013 | 0.6419 | 0.5906 | 0.0 | 0.9537 | 0.8299 | 0.9838 | 0.0065 | 0.2428 | 0.4815 | 0.0 | nan | 0.7602 | 0.8772 | 0.7021 | 0.7450 | 0.4044 | nan | 0.5904 | 0.6210 | 0.0448 | 0.8598 | 0.0944 | 0.0 | 0.4384 | 0.0434 | 0.5587 | 0.0 | 0.0072 | 0.7725 | 0.1907 | 0.5470 | 0.3270 | 0.3721 | nan | 0.2330 | 0.5032 | 0.4042 | 0.0 | 0.8870 | 0.7600 | 0.9526 | 0.0048 | 0.1173 | 0.3813 | 0.0 |
| 0.0908 | 34.8 | 6960 | 0.7034 | 0.4133 | 0.4963 | 0.8781 | nan | 0.8463 | 0.9567 | 0.7999 | 0.8399 | 0.5551 | nan | 0.7234 | 0.8263 | 0.1250 | 0.9508 | 0.1135 | 0.0 | 0.6795 | 0.0507 | 0.7067 | 0.0 | 0.0101 | 0.9316 | 0.2536 | 0.6314 | 0.3883 | 0.4968 | nan | 0.2776 | 0.6215 | 0.5755 | 0.0 | 0.9553 | 0.8408 | 0.9832 | 0.0066 | 0.2792 | 0.4559 | 0.0 | nan | 0.7646 | 0.8796 | 0.6948 | 0.7483 | 0.4008 | nan | 0.5927 | 0.6159 | 0.0623 | 0.8653 | 0.0994 | 0.0 | 0.4262 | 0.0507 | 0.5722 | 0.0 | 0.0101 | 0.7701 | 0.2134 | 0.5379 | 0.3254 | 0.3697 | nan | 0.2032 | 0.5077 | 0.4076 | 0.0 | 0.8870 | 0.7637 | 0.9529 | 0.0047 | 0.1300 | 0.3709 | 0.0 |
| 0.0926 | 34.9 | 6980 | 0.7226 | 0.4129 | 0.5021 | 0.8743 | nan | 0.8084 | 0.9567 | 0.7951 | 0.8701 | 0.5444 | nan | 0.7083 | 0.8245 | 0.1514 | 0.9543 | 0.1348 | 0.0 | 0.6805 | 0.0463 | 0.7060 | 0.0 | 0.0069 | 0.9234 | 0.2647 | 0.6287 | 0.3968 | 0.5927 | nan | 0.2616 | 0.6449 | 0.5906 | 0.0 | 0.9519 | 0.8647 | 0.9833 | 0.0077 | 0.2975 | 0.4714 | 0.0 | nan | 0.7343 | 0.8782 | 0.6921 | 0.6995 | 0.4072 | nan | 0.5899 | 0.6290 | 0.0722 | 0.8616 | 0.1157 | 0.0 | 0.4273 | 0.0463 | 0.5762 | 0.0 | 0.0069 | 0.7722 | 0.2214 | 0.5357 | 0.3314 | 0.3707 | nan | 0.1966 | 0.5107 | 0.3979 | 0.0 | 0.8896 | 0.7743 | 0.9526 | 0.0057 | 0.1434 | 0.3734 | 0.0 |
| 0.072 | 35.0 | 7000 | 0.7112 | 0.4118 | 0.4961 | 0.8770 | nan | 0.8373 | 0.9527 | 0.7939 | 0.8673 | 0.5509 | nan | 0.7249 | 0.8349 | 0.1411 | 0.9566 | 0.1426 | 0.0 | 0.6715 | 0.0284 | 0.7095 | 0.0 | 0.0032 | 0.9373 | 0.2113 | 0.6021 | 0.3573 | 0.5783 | nan | 0.2289 | 0.6315 | 0.5659 | 0.0 | 0.9492 | 0.8623 | 0.9810 | 0.0086 | 0.2546 | 0.4921 | 0.0 | nan | 0.7609 | 0.8859 | 0.6978 | 0.7222 | 0.3807 | nan | 0.5923 | 0.6311 | 0.0732 | 0.8622 | 0.1218 | 0.0 | 0.4430 | 0.0284 | 0.5737 | 0.0 | 0.0032 | 0.7643 | 0.1859 | 0.5229 | 0.3097 | 0.3805 | nan | 0.1885 | 0.5066 | 0.4033 | 0.0 | 0.8896 | 0.7731 | 0.9540 | 0.0062 | 0.1348 | 0.3822 | 0.0 |
| 0.0878 | 35.1 | 7020 | 0.7032 | 0.4121 | 0.4950 | 0.8796 | nan | 0.8632 | 0.9551 | 0.8148 | 0.8522 | 0.5301 | nan | 0.7068 | 0.8316 | 0.0810 | 0.9546 | 0.1448 | 0.0 | 0.6698 | 0.0374 | 0.7077 | 0.0 | 0.0 | 0.9315 | 0.1499 | 0.6055 | 0.3846 | 0.6106 | nan | 0.2612 | 0.6163 | 0.5929 | 0.0 | 0.9509 | 0.8546 | 0.9852 | 0.0106 | 0.2482 | 0.4902 | 0.0 | nan | 0.7732 | 0.8875 | 0.6897 | 0.7523 | 0.3764 | nan | 0.5895 | 0.6392 | 0.0493 | 0.8625 | 0.1203 | 0.0 | 0.4445 | 0.0374 | 0.5728 | 0.0 | 0.0 | 0.7643 | 0.1360 | 0.5188 | 0.3272 | 0.3897 | nan | 0.2108 | 0.5050 | 0.3999 | 0.0 | 0.8897 | 0.7721 | 0.9513 | 0.0080 | 0.1363 | 0.3830 | 0.0 |
| 0.0617 | 35.2 | 7040 | 0.7085 | 0.4076 | 0.4862 | 0.8779 | nan | 0.8469 | 0.9541 | 0.8175 | 0.8613 | 0.5417 | nan | 0.7101 | 0.8428 | 0.0342 | 0.9524 | 0.1063 | 0.0 | 0.6526 | 0.0122 | 0.7009 | 0.0 | 0.0007 | 0.9350 | 0.1259 | 0.5991 | 0.3807 | 0.5674 | nan | 0.2156 | 0.6201 | 0.5787 | 0.0 | 0.9548 | 0.8446 | 0.9818 | 0.0100 | 0.2173 | 0.4933 | 0.0 | nan | 0.7655 | 0.8837 | 0.6960 | 0.7419 | 0.3874 | nan | 0.5899 | 0.6424 | 0.0234 | 0.8619 | 0.0929 | 0.0 | 0.4448 | 0.0122 | 0.5773 | 0.0 | 0.0007 | 0.7608 | 0.1156 | 0.5165 | 0.3232 | 0.3777 | nan | 0.1875 | 0.5068 | 0.4089 | 0.0 | 0.8878 | 0.7641 | 0.9526 | 0.0079 | 0.1262 | 0.3872 | 0.0 |
| 0.0636 | 35.3 | 7060 | 0.7015 | 0.4130 | 0.4922 | 0.8782 | nan | 0.8488 | 0.9566 | 0.8009 | 0.8527 | 0.5343 | nan | 0.7245 | 0.8251 | 0.0319 | 0.9549 | 0.1212 | 0.0 | 0.6995 | 0.0283 | 0.6915 | 0.0 | 0.0028 | 0.9273 | 0.2412 | 0.6259 | 0.3709 | 0.5330 | nan | 0.2623 | 0.6272 | 0.5727 | 0.0 | 0.9555 | 0.8255 | 0.9827 | 0.0109 | 0.2392 | 0.5022 | 0.0 | nan | 0.7659 | 0.8814 | 0.6998 | 0.7451 | 0.3928 | nan | 0.5907 | 0.6495 | 0.0203 | 0.8597 | 0.1045 | 0.0 | 0.4582 | 0.0283 | 0.5761 | 0.0 | 0.0028 | 0.7697 | 0.2071 | 0.5300 | 0.3199 | 0.3712 | nan | 0.2169 | 0.5065 | 0.4021 | 0.0 | 0.8852 | 0.7539 | 0.9524 | 0.0086 | 0.1298 | 0.3871 | 0.0 |
| 0.0551 | 35.4 | 7080 | 0.6996 | 0.4136 | 0.4917 | 0.8791 | nan | 0.8576 | 0.9591 | 0.8023 | 0.8473 | 0.5101 | nan | 0.7156 | 0.8227 | 0.0597 | 0.9540 | 0.1384 | 0.0 | 0.6836 | 0.0255 | 0.7013 | 0.0 | 0.0024 | 0.9314 | 0.2454 | 0.6274 | 0.3617 | 0.5197 | nan | 0.2522 | 0.6283 | 0.5832 | 0.0 | 0.9528 | 0.8283 | 0.9866 | 0.0045 | 0.2473 | 0.4846 | 0.0 | nan | 0.7706 | 0.8814 | 0.7086 | 0.7500 | 0.3846 | nan | 0.5886 | 0.6455 | 0.0359 | 0.8613 | 0.1172 | 0.0 | 0.4454 | 0.0255 | 0.5828 | 0.0 | 0.0024 | 0.7678 | 0.2096 | 0.5277 | 0.3131 | 0.3811 | nan | 0.2117 | 0.5064 | 0.4019 | 0.0 | 0.8859 | 0.7581 | 0.9508 | 0.0038 | 0.1357 | 0.3816 | 0.0 |
| 0.0555 | 35.5 | 7100 | 0.7008 | 0.4108 | 0.4897 | 0.8788 | nan | 0.8481 | 0.9579 | 0.8094 | 0.8626 | 0.5052 | nan | 0.7268 | 0.8382 | 0.0548 | 0.9542 | 0.1216 | 0.0 | 0.6768 | 0.0288 | 0.7086 | 0.0 | 0.0040 | 0.9293 | 0.1859 | 0.6175 | 0.3534 | 0.5334 | nan | 0.2739 | 0.6032 | 0.5648 | 0.0 | 0.9600 | 0.8306 | 0.9847 | 0.0082 | 0.2428 | 0.4855 | 0.0 | nan | 0.7684 | 0.8847 | 0.6922 | 0.7430 | 0.3884 | nan | 0.5942 | 0.6428 | 0.0347 | 0.8619 | 0.1038 | 0.0 | 0.4355 | 0.0288 | 0.5760 | 0.0 | 0.0040 | 0.7669 | 0.1662 | 0.5237 | 0.3013 | 0.3811 | nan | 0.2261 | 0.5022 | 0.4037 | 0.0 | 0.8831 | 0.7549 | 0.9516 | 0.0065 | 0.1401 | 0.3809 | 0.0 |
| 0.0685 | 35.6 | 7120 | 0.6991 | 0.4137 | 0.4954 | 0.8780 | nan | 0.8426 | 0.9550 | 0.8109 | 0.8636 | 0.5350 | nan | 0.7355 | 0.8334 | 0.0566 | 0.9551 | 0.1427 | 0.0012 | 0.7120 | 0.0213 | 0.6881 | 0.0 | 0.0188 | 0.9347 | 0.2164 | 0.6212 | 0.3516 | 0.5650 | nan | 0.2964 | 0.6247 | 0.5762 | 0.0 | 0.9528 | 0.8363 | 0.9829 | 0.0074 | 0.2304 | 0.4844 | 0.0 | nan | 0.7640 | 0.8856 | 0.6964 | 0.7336 | 0.3824 | nan | 0.5942 | 0.6517 | 0.0340 | 0.8610 | 0.1194 | 0.0012 | 0.4459 | 0.0213 | 0.5836 | 0.0 | 0.0187 | 0.7668 | 0.1879 | 0.5232 | 0.3031 | 0.3881 | nan | 0.2392 | 0.5056 | 0.4001 | 0.0 | 0.8858 | 0.7562 | 0.9529 | 0.0062 | 0.1420 | 0.3893 | 0.0 |
| 0.0688 | 35.7 | 7140 | 0.7115 | 0.4137 | 0.4982 | 0.8749 | nan | 0.8133 | 0.9552 | 0.7908 | 0.8666 | 0.5457 | nan | 0.7425 | 0.8438 | 0.0803 | 0.9552 | 0.1694 | 0.0015 | 0.7415 | 0.0262 | 0.6994 | 0.0 | 0.0347 | 0.9313 | 0.1861 | 0.6127 | 0.3975 | 0.5372 | nan | 0.2925 | 0.6242 | 0.5672 | 0.0 | 0.9508 | 0.8592 | 0.9825 | 0.0076 | 0.2294 | 0.4985 | 0.0 | nan | 0.7406 | 0.8803 | 0.7048 | 0.7144 | 0.3771 | nan | 0.5912 | 0.6374 | 0.0471 | 0.8628 | 0.1437 | 0.0015 | 0.4355 | 0.0262 | 0.5822 | 0.0 | 0.0345 | 0.7694 | 0.1668 | 0.5288 | 0.3207 | 0.3802 | nan | 0.2327 | 0.5057 | 0.3945 | 0.0 | 0.8890 | 0.7673 | 0.9535 | 0.0067 | 0.1502 | 0.3926 | 0.0 |
| 0.0911 | 35.8 | 7160 | 0.7127 | 0.4171 | 0.5055 | 0.8749 | nan | 0.8198 | 0.9536 | 0.7962 | 0.8677 | 0.5529 | nan | 0.7179 | 0.8123 | 0.2586 | 0.9532 | 0.1314 | 0.0 | 0.7348 | 0.0632 | 0.7108 | 0.0 | 0.0530 | 0.9332 | 0.2061 | 0.6238 | 0.4007 | 0.5218 | nan | 0.3449 | 0.6127 | 0.5598 | 0.0 | 0.9534 | 0.8369 | 0.9825 | 0.0082 | 0.2718 | 0.4932 | 0.0 | nan | 0.7420 | 0.8851 | 0.6962 | 0.7035 | 0.3731 | nan | 0.5890 | 0.6483 | 0.1169 | 0.8648 | 0.1145 | 0.0 | 0.4173 | 0.0632 | 0.5896 | 0.0 | 0.0524 | 0.7712 | 0.1831 | 0.5377 | 0.3297 | 0.3778 | nan | 0.2610 | 0.5062 | 0.3907 | 0.0 | 0.8873 | 0.7594 | 0.9537 | 0.0071 | 0.1381 | 0.3888 | 0.0 |
| 0.0745 | 35.9 | 7180 | 0.6970 | 0.4162 | 0.5017 | 0.8765 | nan | 0.8326 | 0.9566 | 0.7924 | 0.8683 | 0.5136 | nan | 0.7218 | 0.7968 | 0.3063 | 0.9562 | 0.1121 | 0.0 | 0.6826 | 0.0618 | 0.7158 | 0.0 | 0.0413 | 0.9271 | 0.2218 | 0.6374 | 0.3751 | 0.4786 | nan | 0.3344 | 0.6221 | 0.5697 | 0.0 | 0.9569 | 0.8356 | 0.9849 | 0.0138 | 0.2779 | 0.4618 | 0.0 | nan | 0.7538 | 0.8827 | 0.7012 | 0.7188 | 0.3717 | nan | 0.5949 | 0.6410 | 0.1112 | 0.8610 | 0.0984 | 0.0 | 0.4344 | 0.0618 | 0.5806 | 0.0 | 0.0409 | 0.7695 | 0.1922 | 0.5333 | 0.3247 | 0.3578 | nan | 0.2487 | 0.5049 | 0.3946 | 0.0 | 0.8862 | 0.7612 | 0.9519 | 0.0110 | 0.1510 | 0.3789 | 0.0 |
| 0.098 | 36.0 | 7200 | 0.7067 | 0.4167 | 0.5020 | 0.8761 | nan | 0.8272 | 0.9580 | 0.7997 | 0.8615 | 0.5146 | nan | 0.7204 | 0.7786 | 0.3260 | 0.9567 | 0.1210 | 0.0 | 0.7143 | 0.0673 | 0.7219 | 0.0 | 0.0364 | 0.9272 | 0.1989 | 0.6440 | 0.3668 | 0.4954 | nan | 0.3236 | 0.6110 | 0.5818 | 0.0 | 0.9562 | 0.8395 | 0.9854 | 0.0191 | 0.2414 | 0.4706 | 0.0 | nan | 0.7499 | 0.8770 | 0.6894 | 0.7338 | 0.3880 | nan | 0.5896 | 0.6261 | 0.1040 | 0.8598 | 0.1053 | 0.0 | 0.4644 | 0.0673 | 0.5807 | 0.0 | 0.0362 | 0.7711 | 0.1764 | 0.5369 | 0.3151 | 0.3816 | nan | 0.2475 | 0.5046 | 0.3927 | 0.0 | 0.8861 | 0.7613 | 0.9518 | 0.0145 | 0.1437 | 0.3786 | 0.0 |
| 0.0913 | 36.1 | 7220 | 0.6997 | 0.4150 | 0.4988 | 0.8783 | nan | 0.8518 | 0.9561 | 0.8063 | 0.8402 | 0.5311 | nan | 0.7216 | 0.7947 | 0.2903 | 0.9534 | 0.1148 | 0.0 | 0.6703 | 0.0730 | 0.7298 | 0.0 | 0.0158 | 0.9356 | 0.1644 | 0.6500 | 0.3514 | 0.4716 | nan | 0.3353 | 0.6106 | 0.5744 | 0.0 | 0.9499 | 0.8522 | 0.9845 | 0.0146 | 0.2490 | 0.4688 | 0.0 | nan | 0.7663 | 0.8818 | 0.6878 | 0.7574 | 0.3797 | nan | 0.5869 | 0.6227 | 0.1008 | 0.8619 | 0.1000 | 0.0 | 0.4523 | 0.0730 | 0.5852 | 0.0 | 0.0157 | 0.7675 | 0.1473 | 0.5364 | 0.3086 | 0.3620 | nan | 0.2524 | 0.5042 | 0.3920 | 0.0 | 0.8884 | 0.7666 | 0.9527 | 0.0105 | 0.1416 | 0.3776 | 0.0 |
| 0.0637 | 36.2 | 7240 | 0.6977 | 0.4142 | 0.4944 | 0.8788 | nan | 0.8588 | 0.9543 | 0.8064 | 0.8406 | 0.5492 | nan | 0.7120 | 0.8186 | 0.1689 | 0.9555 | 0.1137 | 0.0 | 0.6676 | 0.0581 | 0.7305 | 0.0 | 0.0312 | 0.9309 | 0.1872 | 0.6187 | 0.3752 | 0.4368 | nan | 0.2872 | 0.6168 | 0.5789 | 0.0 | 0.9575 | 0.8339 | 0.9822 | 0.0152 | 0.2486 | 0.4868 | 0.0 | nan | 0.7745 | 0.8842 | 0.6891 | 0.7590 | 0.3856 | nan | 0.5883 | 0.6318 | 0.0757 | 0.8617 | 0.1009 | 0.0 | 0.4530 | 0.0581 | 0.5816 | 0.0 | 0.0312 | 0.7674 | 0.1672 | 0.5317 | 0.3163 | 0.3459 | nan | 0.2318 | 0.5034 | 0.3969 | 0.0 | 0.8852 | 0.7579 | 0.9540 | 0.0101 | 0.1300 | 0.3808 | 0.0 |
| 0.0577 | 36.3 | 7260 | 0.6996 | 0.4130 | 0.4876 | 0.8784 | nan | 0.8582 | 0.9553 | 0.7909 | 0.8403 | 0.5440 | nan | 0.7311 | 0.8018 | 0.1001 | 0.9540 | 0.1164 | 0.0 | 0.6541 | 0.0438 | 0.6995 | 0.0 | 0.0241 | 0.9389 | 0.1828 | 0.6238 | 0.3535 | 0.4863 | nan | 0.2532 | 0.6019 | 0.5866 | 0.0 | 0.9520 | 0.8268 | 0.9827 | 0.0112 | 0.1905 | 0.5003 | 0.0 | nan | 0.7702 | 0.8815 | 0.7048 | 0.7527 | 0.3851 | nan | 0.5930 | 0.6534 | 0.0464 | 0.8612 | 0.1019 | 0.0 | 0.4455 | 0.0438 | 0.5897 | 0.0 | 0.0240 | 0.7643 | 0.1630 | 0.5292 | 0.3078 | 0.3715 | nan | 0.2076 | 0.5011 | 0.3996 | 0.0 | 0.8853 | 0.7536 | 0.9532 | 0.0087 | 0.1253 | 0.3925 | 0.0 |
| 0.0702 | 36.4 | 7280 | 0.6983 | 0.4149 | 0.4925 | 0.8794 | nan | 0.8620 | 0.9546 | 0.8033 | 0.8408 | 0.5509 | nan | 0.7143 | 0.8175 | 0.0806 | 0.9553 | 0.1160 | 0.0 | 0.6629 | 0.0396 | 0.6983 | 0.0 | 0.0254 | 0.9273 | 0.2064 | 0.6429 | 0.3873 | 0.4986 | nan | 0.2752 | 0.6216 | 0.5645 | 0.0 | 0.9558 | 0.8278 | 0.9822 | 0.0105 | 0.2456 | 0.4927 | 0.0 | nan | 0.7758 | 0.8830 | 0.6970 | 0.7579 | 0.3846 | nan | 0.5898 | 0.6602 | 0.0442 | 0.8605 | 0.1009 | 0.0 | 0.4327 | 0.0396 | 0.5747 | 0.0 | 0.0254 | 0.7718 | 0.1811 | 0.5403 | 0.3225 | 0.3797 | nan | 0.2263 | 0.5026 | 0.3968 | 0.0 | 0.8848 | 0.7530 | 0.9532 | 0.0078 | 0.1448 | 0.3873 | 0.0 |
| 0.0964 | 36.5 | 7300 | 0.7178 | 0.4104 | 0.4857 | 0.8780 | nan | 0.8624 | 0.9553 | 0.8127 | 0.8304 | 0.5364 | nan | 0.7039 | 0.8188 | 0.0519 | 0.9557 | 0.1239 | 0.0 | 0.6651 | 0.0256 | 0.6564 | 0.0 | 0.0216 | 0.9335 | 0.2001 | 0.6143 | 0.3561 | 0.4989 | nan | 0.2867 | 0.6079 | 0.5412 | 0.0 | 0.9570 | 0.8218 | 0.9826 | 0.0098 | 0.2261 | 0.4856 | 0.0 | nan | 0.7748 | 0.8830 | 0.6690 | 0.7528 | 0.3793 | nan | 0.5862 | 0.6547 | 0.0308 | 0.8591 | 0.1060 | 0.0 | 0.4377 | 0.0256 | 0.5692 | 0.0 | 0.0215 | 0.7674 | 0.1777 | 0.5325 | 0.3070 | 0.3708 | nan | 0.2353 | 0.5018 | 0.3911 | 0.0 | 0.8832 | 0.7496 | 0.9528 | 0.0068 | 0.1247 | 0.3816 | 0.0 |
| 0.0779 | 36.6 | 7320 | 0.7006 | 0.4113 | 0.4883 | 0.8802 | nan | 0.8665 | 0.9604 | 0.7948 | 0.8530 | 0.4945 | nan | 0.6943 | 0.8122 | 0.0683 | 0.9578 | 0.1127 | 0.0 | 0.6658 | 0.0479 | 0.6622 | 0.0 | 0.0275 | 0.9319 | 0.1801 | 0.6153 | 0.3630 | 0.5298 | nan | 0.2535 | 0.6365 | 0.5624 | 0.0 | 0.9528 | 0.8414 | 0.9845 | 0.0084 | 0.2596 | 0.4869 | 0.0 | nan | 0.7764 | 0.8868 | 0.6850 | 0.7540 | 0.3797 | nan | 0.5877 | 0.6487 | 0.0401 | 0.8588 | 0.0975 | 0.0 | 0.4239 | 0.0478 | 0.5621 | 0.0 | 0.0273 | 0.7669 | 0.1615 | 0.5356 | 0.3141 | 0.3862
| nan | 0.2107 | 0.5080 | 0.3909 | 0.0 | 0.8876 | 0.7632 | 0.9520 | 0.0057 | 0.1236 | 0.3791 | 0.0 | | 0.0819 | 36.7 | 7340 | 0.6940 | 0.4133 | 0.4937 | 0.8800 | nan | 0.8635 | 0.9556 | 0.8034 | 0.8453 | 0.5412 | nan | 0.7088 | 0.8302 | 0.0977 | 0.9553 | 0.1181 | 0.0 | 0.6735 | 0.0520 | 0.6906 | 0.0024 | 0.0285 | 0.9300 | 0.1750 | 0.6211 | 0.3797 | 0.4705 | nan | 0.2997 | 0.6162 | 0.5961 | 0.0 | 0.9550 | 0.8468 | 0.9837 | 0.0117 | 0.2575 | 0.4882 | 0.0 | nan | 0.7777 | 0.8874 | 0.6822 | 0.7598 | 0.3861 | nan | 0.5925 | 0.6507 | 0.0582 | 0.8616 | 0.1026 | 0.0 | 0.4242 | 0.0520 | 0.5613 | 0.0024 | 0.0280 | 0.7668 | 0.1571 | 0.5304 | 0.3237 | 0.3636 | nan | 0.2547 | 0.5030 | 0.3807 | 0.0 | 0.8867 | 0.7599 | 0.9527 | 0.0080 | 0.1316 | 0.3807 | 0.0 | | 0.0808 | 36.8 | 7360 | 0.7044 | 0.4140 | 0.4945 | 0.8787 | nan | 0.8521 | 0.9593 | 0.8008 | 0.8295 | 0.5274 | nan | 0.7146 | 0.8379 | 0.0985 | 0.9559 | 0.1124 | 0.0 | 0.7056 | 0.0324 | 0.7043 | 0.0 | 0.0238 | 0.9325 | 0.2648 | 0.6339 | 0.3744 | 0.4287 | nan | 0.3486 | 0.6128 | 0.5930 | 0.0 | 0.9544 | 0.8436 | 0.9831 | 0.0106 | 0.2225 | 0.4661 | 0.0 | nan | 0.7659 | 0.8791 | 0.6990 | 0.7507 | 0.3942 | nan | 0.5905 | 0.6398 | 0.0591 | 0.8624 | 0.0988 | 0.0 | 0.4251 | 0.0324 | 0.5589 | 0.0 | 0.0237 | 0.7672 | 0.2195 | 0.5346 | 0.3235 | 0.3453 | nan | 0.2728 | 0.5047 | 0.3876 | 0.0 | 0.8877 | 0.7607 | 0.9530 | 0.0081 | 0.1244 | 0.3778 | 0.0 | | 0.0728 | 36.9 | 7380 | 0.7084 | 0.4142 | 0.4939 | 0.8775 | nan | 0.8526 | 0.9574 | 0.7959 | 0.8252 | 0.5435 | nan | 0.7087 | 0.8486 | 0.0754 | 0.9540 | 0.1073 | 0.0 | 0.7192 | 0.0265 | 0.6834 | 0.0 | 0.0597 | 0.9352 | 0.2742 | 0.6030 | 0.3712 | 0.4912 | nan | 0.3306 | 0.6073 | 0.5723 | 0.0 | 0.9549 | 0.8389 | 0.9843 | 0.0061 | 0.1954 | 0.4826 | 0.0 | nan | 0.7638 | 0.8777 | 0.7027 | 0.7416 | 0.3910 | nan | 0.5880 | 0.6377 | 0.0475 | 0.8630 | 0.0950 | 0.0 | 0.4225 | 0.0265 | 0.5725 | 0.0 | 0.0590 | 0.7648 | 0.2240 | 0.5250 | 0.3208 | 0.3676 | nan | 0.2636 | 0.5029 | 0.3931 | 0.0 | 
0.8882 | 0.7638 | 0.9525 | 0.0050 | 0.1131 | 0.3814 | 0.0 | | 0.1764 | 37.0 | 7400 | 0.7037 | 0.4157 | 0.4940 | 0.8778 | nan | 0.8468 | 0.9595 | 0.7919 | 0.8447 | 0.5265 | nan | 0.6947 | 0.8410 | 0.0717 | 0.9525 | 0.1123 | 0.0 | 0.6746 | 0.0423 | 0.6907 | 0.0 | 0.0719 | 0.9325 | 0.2086 | 0.6238 | 0.3980 | 0.5253 | nan | 0.3076 | 0.6047 | 0.5729 | 0.0 | 0.9522 | 0.8394 | 0.9850 | 0.0059 | 0.2324 | 0.4974 | 0.0 | nan | 0.7613 | 0.8786 | 0.7001 | 0.7416 | 0.3913 | nan | 0.5885 | 0.6472 | 0.0451 | 0.8637 | 0.0987 | 0.0 | 0.4451 | 0.0423 | 0.5728 | 0.0 | 0.0705 | 0.7681 | 0.1829 | 0.5288 | 0.3336 | 0.3785 | nan | 0.2514 | 0.5018 | 0.3933 | 0.0 | 0.8883 | 0.7619 | 0.9517 | 0.0048 | 0.1275 | 0.3840 | 0.0 | | 0.0895 | 37.1 | 7420 | 0.6999 | 0.4126 | 0.4867 | 0.8771 | nan | 0.8503 | 0.9592 | 0.7848 | 0.8346 | 0.5228 | nan | 0.7162 | 0.8293 | 0.0593 | 0.9567 | 0.1175 | 0.0 | 0.6605 | 0.0387 | 0.6881 | 0.0 | 0.0442 | 0.9354 | 0.1439 | 0.6318 | 0.3631 | 0.4673 | nan | 0.3297 | 0.6239 | 0.5750 | 0.0 | 0.9561 | 0.8125 | 0.9823 | 0.0063 | 0.2213 | 0.4635 | 0.0 | nan | 0.7630 | 0.8783 | 0.7080 | 0.7374 | 0.3937 | nan | 0.5928 | 0.6596 | 0.0380 | 0.8603 | 0.1037 | 0.0 | 0.4432 | 0.0387 | 0.5748 | 0.0 | 0.0438 | 0.7654 | 0.1303 | 0.5292 | 0.3127 | 0.3603 | nan | 0.2676 | 0.5057 | 0.3996 | 0.0 | 0.8828 | 0.7418 | 0.9535 | 0.0050 | 0.1345 | 0.3791 | 0.0 | | 0.0634 | 37.2 | 7440 | 0.7077 | 0.4125 | 0.4917 | 0.8772 | nan | 0.8502 | 0.9576 | 0.7968 | 0.8249 | 0.5279 | nan | 0.7244 | 0.8363 | 0.0593 | 0.9579 | 0.1177 | 0.0 | 0.7277 | 0.0377 | 0.6814 | 0.0 | 0.0241 | 0.9348 | 0.1902 | 0.6227 | 0.3439 | 0.4986 | nan | 0.3346 | 0.6298 | 0.5920 | 0.0 | 0.9556 | 0.8270 | 0.9823 | 0.0088 | 0.2083 | 0.4821 | 0.0 | nan | 0.7617 | 0.8781 | 0.7008 | 0.7425 | 0.3955 | nan | 0.5903 | 0.6593 | 0.0376 | 0.8607 | 0.1038 | 0.0 | 0.4193 | 0.0377 | 0.5790 | 0.0 | 0.0239 | 0.7656 | 0.1680 | 0.5331 | 0.3008 | 0.3671 | nan | 0.2706 | 0.5073 | 0.3960 | 0.0 | 0.8836 | 0.7501 | 0.9531 | 0.0067 | 0.1227 | 0.3856 
| 0.0 | | 0.0869 | 37.3 | 7460 | 0.7203 | 0.4136 | 0.4989 | 0.8760 | nan | 0.8378 | 0.9566 | 0.8226 | 0.8214 | 0.5284 | nan | 0.7195 | 0.8496 | 0.0744 | 0.9566 | 0.1154 | 0.0 | 0.7636 | 0.0293 | 0.6973 | 0.0 | 0.0175 | 0.9256 | 0.2485 | 0.6449 | 0.3697 | 0.5119 | nan | 0.3441 | 0.6346 | 0.5931 | 0.0 | 0.9548 | 0.8344 | 0.9848 | 0.0120 | 0.2231 | 0.4933 | 0.0 | nan | 0.7570 | 0.8759 | 0.6561 | 0.7358 | 0.3974 | nan | 0.5874 | 0.6473 | 0.0484 | 0.8628 | 0.1012 | 0.0 | 0.4442 | 0.0293 | 0.5714 | 0.0 | 0.0173 | 0.7723 | 0.2101 | 0.5453 | 0.3139 | 0.3701 | nan | 0.2793 | 0.5088 | 0.3939 | 0.0 | 0.8855 | 0.7551 | 0.9520 | 0.0085 | 0.1230 | 0.3860 | 0.0 | | 0.0597 | 37.4 | 7480 | 0.7213 | 0.4157 | 0.4955 | 0.8759 | nan | 0.8351 | 0.9555 | 0.8265 | 0.8139 | 0.5288 | nan | 0.7252 | 0.8324 | 0.0834 | 0.9548 | 0.1119 | 0.0 | 0.6856 | 0.0357 | 0.6952 | 0.0 | 0.0188 | 0.9232 | 0.2232 | 0.6509 | 0.4116 | 0.4663 | nan | 0.3377 | 0.6356 | 0.5721 | 0.0 | 0.9563 | 0.8457 | 0.9834 | 0.0146 | 0.2253 | 0.5080 | 0.0 | nan | 0.7572 | 0.8750 | 0.6576 | 0.7290 | 0.3910 | nan | 0.5886 | 0.6493 | 0.0514 | 0.8631 | 0.0988 | 0.0 | 0.4646 | 0.0357 | 0.5736 | 0.0 | 0.0186 | 0.7752 | 0.1952 | 0.5484 | 0.3319 | 0.3610 | nan | 0.2802 | 0.5078 | 0.4058 | 0.0 | 0.8867 | 0.7619 | 0.9530 | 0.0098 | 0.1402 | 0.3920 | 0.0 | | 0.1193 | 37.5 | 7500 | 0.7207 | 0.4120 | 0.4897 | 0.8770 | nan | 0.8432 | 0.9606 | 0.8109 | 0.8180 | 0.5229 | nan | 0.7120 | 0.8275 | 0.1034 | 0.9543 | 0.1219 | 0.0 | 0.6833 | 0.0345 | 0.6969 | 0.0 | 0.0102 | 0.9327 | 0.1481 | 0.6324 | 0.3758 | 0.4912 | nan | 0.2921 | 0.6100 | 0.5838 | 0.0 | 0.9552 | 0.8455 | 0.9850 | 0.0088 | 0.2433 | 0.4661 | 0.0 | nan | 0.7599 | 0.8764 | 0.6587 | 0.7355 | 0.4006 | nan | 0.5885 | 0.6496 | 0.0616 | 0.8628 | 0.1051 | 0.0 | 0.4668 | 0.0345 | 0.5765 | 0.0 | 0.0101 | 0.7681 | 0.1350 | 0.5379 | 0.3189 | 0.3620 | nan | 0.2479 | 0.5048 | 0.3974 | 0.0 | 0.8879 | 0.7671 | 0.9516 | 0.0066 | 0.1352 | 0.3757 | 0.0 | | 0.0614 | 37.6 | 7520 | 0.6856 | 0.4164 | 
0.4986 | 0.8806 | nan | 0.8557 | 0.9552 | 0.8226 | 0.8593 | 0.5294 | nan | 0.7159 | 0.8328 | 0.1112 | 0.9576 | 0.1192 | 0.0 | 0.6645 | 0.0129 | 0.6900 | 0.0 | 0.0179 | 0.9247 | 0.1651 | 0.6610 | 0.4152 | 0.5309 | nan | 0.3175 | 0.6542 | 0.5826 | 0.0 | 0.9500 | 0.8539 | 0.9860 | 0.0117 | 0.2712 | 0.4871 | 0.0 | nan | 0.7709 | 0.8876 | 0.6646 | 0.7462 | 0.3925 | nan | 0.5944 | 0.6430 | 0.0640 | 0.8613 | 0.1034 | 0.0 | 0.4655 | 0.0129 | 0.5776 | 0.0 | 0.0176 | 0.7756 | 0.1487 | 0.5568 | 0.3333 | 0.3772 | nan | 0.2724 | 0.5143 | 0.3912 | 0.0 | 0.8902 | 0.7694 | 0.9512 | 0.0086 | 0.1547 | 0.3807 | 0.0 | | 0.0606 | 37.7 | 7540 | 0.6941 | 0.4182 | 0.5017 | 0.8805 | nan | 0.8650 | 0.9560 | 0.7933 | 0.8414 | 0.5326 | nan | 0.7177 | 0.8293 | 0.2166 | 0.9546 | 0.1315 | 0.0 | 0.6449 | 0.0166 | 0.7135 | 0.0 | 0.0131 | 0.9264 | 0.1896 | 0.6603 | 0.3691 | 0.5088 | nan | 0.3542 | 0.6202 | 0.5767 | 0.0 | 0.9502 | 0.8751 | 0.9825 | 0.0109 | 0.3413 | 0.4621 | 0.0 | nan | 0.7733 | 0.8840 | 0.6919 | 0.7561 | 0.3889 | nan | 0.5943 | 0.6378 | 0.1103 | 0.8633 | 0.1141 | 0.0 | 0.4581 | 0.0166 | 0.5770 | 0.0 | 0.0129 | 0.7770 | 0.1652 | 0.5553 | 0.3172 | 0.3632 | nan | 0.2839 | 0.5131 | 0.3938 | 0.0 | 0.8873 | 0.7691 | 0.9542 | 0.0079 | 0.1494 | 0.3677 | 0.0 | | 0.0705 | 37.8 | 7560 | 0.7229 | 0.4164 | 0.4947 | 0.8785 | nan | 0.8464 | 0.9580 | 0.8100 | 0.8481 | 0.5249 | nan | 0.7054 | 0.8062 | 0.1369 | 0.9560 | 0.1133 | 0.0 | 0.6437 | 0.0216 | 0.7081 | 0.0 | 0.0221 | 0.9333 | 0.1667 | 0.6620 | 0.3972 | 0.4877 | nan | 0.3476 | 0.6257 | 0.5574 | 0.0 | 0.9519 | 0.8326 | 0.9843 | 0.0128 | 0.2983 | 0.4736 | 0.0 | nan | 0.7628 | 0.8790 | 0.6833 | 0.7575 | 0.3933 | nan | 0.5903 | 0.6564 | 0.0770 | 0.8623 | 0.0997 | 0.0 | 0.4658 | 0.0216 | 0.5761 | 0.0 | 0.0218 | 0.7726 | 0.1472 | 0.5519 | 0.3323 | 0.3607 | nan | 0.2752 | 0.5137 | 0.3994 | 0.0 | 0.8860 | 0.7577 | 0.9535 | 0.0095 | 0.1437 | 0.3755 | 0.0 | | 0.0537 | 37.9 | 7580 | 0.7356 | 0.4172 | 0.5007 | 0.8764 | nan | 0.8459 | 0.9600 | 0.8097 | 
0.8236 | 0.5127 | nan | 0.7044 | 0.8155 | 0.2175 | 0.9523 | 0.1212 | 0.0 | 0.6535 | 0.0820 | 0.7624 | 0.0 | 0.0245 | 0.9178 | 0.2273 | 0.6465 | 0.3677 | 0.4937 | nan | 0.3210 | 0.6487 | 0.5791 | 0.0 | 0.9585 | 0.8132 | 0.9828 | 0.0122 | 0.2730 | 0.4941 | 0.0 | nan | 0.7605 | 0.8744 | 0.6860 | 0.7425 | 0.3914 | nan | 0.5853 | 0.6235 | 0.1081 | 0.8668 | 0.1056 | 0.0 | 0.4666 | 0.0820 | 0.5846 | 0.0 | 0.0237 | 0.7741 | 0.1965 | 0.5475 | 0.3153 | 0.3550 | nan | 0.2637 | 0.5131 | 0.3937 | 0.0 | 0.8822 | 0.7447 | 0.9532 | 0.0093 | 0.1260 | 0.3765 | 0.0 | | 0.0794 | 38.0 | 7600 | 0.7181 | 0.4144 | 0.4909 | 0.8764 | nan | 0.8466 | 0.9593 | 0.7987 | 0.8244 | 0.5173 | nan | 0.6945 | 0.8151 | 0.1500 | 0.9550 | 0.1158 | 0.0 | 0.6362 | 0.0640 | 0.7349 | 0.0 | 0.0074 | 0.9308 | 0.2132 | 0.6209 | 0.3703 | 0.4600 | nan | 0.3048 | 0.6224 | 0.5582 | 0.0 | 0.9569 | 0.8241 | 0.9826 | 0.0143 | 0.2317 | 0.5001 | 0.0 | nan | 0.7596 | 0.8736 | 0.6989 | 0.7417 | 0.3967 | nan | 0.5822 | 0.6178 | 0.0823 | 0.8657 | 0.1009 | 0.0 | 0.4681 | 0.0640 | 0.5913 | 0.0 | 0.0073 | 0.7699 | 0.1893 | 0.5379 | 0.3136 | 0.3458 | nan | 0.2566 | 0.5099 | 0.3935 | 0.0 | 0.8832 | 0.7484 | 0.9532 | 0.0109 | 0.1155 | 0.3826 | 0.0 | | 0.0976 | 38.1 | 7620 | 0.7275 | 0.4167 | 0.4948 | 0.8778 | nan | 0.8452 | 0.9596 | 0.7945 | 0.8375 | 0.5127 | nan | 0.7134 | 0.8046 | 0.2441 | 0.9569 | 0.1222 | 0.0 | 0.6225 | 0.0448 | 0.7151 | 0.0 | 0.0046 | 0.9291 | 0.1728 | 0.6484 | 0.3735 | 0.5232 | nan | 0.3094 | 0.6309 | 0.5690 | 0.0 | 0.9544 | 0.8394 | 0.9851 | 0.0144 | 0.2225 | 0.4840 | 0.0 | nan | 0.7599 | 0.8755 | 0.7011 | 0.7472 | 0.3964 | nan | 0.5875 | 0.6215 | 0.1186 | 0.8636 | 0.1064 | 0.0 | 0.4711 | 0.0448 | 0.5874 | 0.0 | 0.0046 | 0.7716 | 0.1564 | 0.5438 | 0.3163 | 0.3747 | nan | 0.2580 | 0.5110 | 0.3998 | 0.0 | 0.8870 | 0.7616 | 0.9522 | 0.0109 | 0.1265 | 0.3795 | 0.0 | | 0.1365 | 38.2 | 7640 | 0.7119 | 0.4121 | 0.4928 | 0.8766 | nan | 0.8474 | 0.9524 | 0.8074 | 0.8526 | 0.5495 | nan | 0.6898 | 0.8332 | 0.0948 | 
0.9565 | 0.1132 | 0.0 | 0.6853 | 0.0279 | 0.7159 | 0.0 | 0.0181 | 0.9358 | 0.1758 | 0.6504 | 0.3769 | 0.5242 | nan | 0.2627 | 0.6129 | 0.5666 | 0.0 | 0.9530 | 0.8232 | 0.9847 | 0.0102 | 0.2972 | 0.4530 | 0.0 | nan | 0.7633 | 0.8785 | 0.6964 | 0.7575 | 0.3707 | nan | 0.5854 | 0.6259 | 0.0489 | 0.8638 | 0.0986 | 0.0 | 0.4741 | 0.0279 | 0.5917 | 0.0 | 0.0180 | 0.7694 | 0.1586 | 0.5437 | 0.3192 | 0.3748 | nan | 0.2147 | 0.5088 | 0.3942 | 0.0 | 0.8850 | 0.7495 | 0.9525 | 0.0080 | 0.1414 | 0.3649 | 0.0 | | 0.083 | 38.3 | 7660 | 0.6949 | 0.4127 | 0.4875 | 0.8774 | nan | 0.8514 | 0.9562 | 0.7999 | 0.8520 | 0.5273 | nan | 0.6936 | 0.8133 | 0.0601 | 0.9558 | 0.1203 | 0.0 | 0.6775 | 0.0249 | 0.6821 | 0.0 | 0.0110 | 0.9317 | 0.2019 | 0.6379 | 0.3723 | 0.4744 | nan | 0.2766 | 0.6250 | 0.5635 | 0.0 | 0.9551 | 0.8210 | 0.9834 | 0.0084 | 0.2563 | 0.4670 | 0.0 | nan | 0.7648 | 0.8777 | 0.7051 | 0.7562 | 0.3775 | nan | 0.5838 | 0.6511 | 0.0339 | 0.8629 | 0.1044 | 0.0 | 0.4645 | 0.0249 | 0.5882 | 0.0 | 0.0109 | 0.7678 | 0.1786 | 0.5393 | 0.3193 | 0.3608 | nan | 0.2288 | 0.5063 | 0.3969 | 0.0 | 0.8832 | 0.7467 | 0.9530 | 0.0070 | 0.1366 | 0.3761 | 0.0 | | 0.0565 | 38.4 | 7680 | 0.7109 | 0.4145 | 0.4937 | 0.8779 | nan | 0.8473 | 0.9554 | 0.7978 | 0.8475 | 0.5370 | nan | 0.7277 | 0.8299 | 0.0863 | 0.9572 | 0.1351 | 0.0 | 0.7158 | 0.0126 | 0.6839 | 0.0 | 0.0220 | 0.9303 | 0.1862 | 0.6312 | 0.3700 | 0.4975 | nan | 0.3249 | 0.6206 | 0.5891 | 0.0 | 0.9540 | 0.8441 | 0.9846 | 0.0100 | 0.2242 | 0.4755 | 0.0 | nan | 0.7633 | 0.8766 | 0.7151 | 0.7567 | 0.3933 | nan | 0.5880 | 0.6438 | 0.0540 | 0.8617 | 0.1155 | 0.0 | 0.4699 | 0.0126 | 0.5720 | 0.0 | 0.0219 | 0.7696 | 0.1657 | 0.5374 | 0.3103 | 0.3687 | nan | 0.2571 | 0.5025 | 0.3951 | 0.0 | 0.8852 | 0.7603 | 0.9523 | 0.0079 | 0.1289 | 0.3780 | 0.0 | | 0.0839 | 38.5 | 7700 | 0.7159 | 0.4131 | 0.4910 | 0.8775 | nan | 0.8444 | 0.9584 | 0.8049 | 0.8463 | 0.5358 | nan | 0.7017 | 0.8419 | 0.0834 | 0.9532 | 0.1406 | 0.0 | 0.7470 | 0.0308 | 0.6798 | 
0.0014 | 0.0332 | 0.9384 | 0.1592 | 0.6145 | 0.3729 | 0.5277 | nan | 0.2538 | 0.6062 | 0.5783 | 0.0 | 0.9537 | 0.8386 | 0.9830 | 0.0098 | 0.2066 | 0.4680 | 0.0 | nan | 0.7614 | 0.8756 | 0.7064 | 0.7559 | 0.3953 | nan | 0.5875 | 0.6396 | 0.0518 | 0.8645 | 0.1198 | 0.0 | 0.4784 | 0.0308 | 0.5764 | 0.0014 | 0.0330 | 0.7649 | 0.1447 | 0.5283 | 0.3164 | 0.3755 | nan | 0.2125 | 0.5022 | 0.3936 | 0.0 | 0.8854 | 0.7581 | 0.9531 | 0.0079 | 0.1206 | 0.3774 | 0.0 | | 0.0716 | 38.6 | 7720 | 0.7162 | 0.4187 | 0.5017 | 0.8788 | nan | 0.8526 | 0.9551 | 0.8050 | 0.8509 | 0.5184 | nan | 0.7209 | 0.8335 | 0.1556 | 0.9543 | 0.1366 | 0.0 | 0.7470 | 0.0510 | 0.7137 | 0.0 | 0.0399 | 0.9287 | 0.1901 | 0.6347 | 0.3853 | 0.5274 | nan | 0.3012 | 0.6383 | 0.5748 | 0.0 | 0.9536 | 0.8538 | 0.9817 | 0.0128 | 0.2525 | 0.4866 | 0.0 | nan | 0.7630 | 0.8781 | 0.7102 | 0.7565 | 0.3939 | nan | 0.5860 | 0.6298 | 0.0866 | 0.8660 | 0.1174 | 0.0 | 0.4762 | 0.0510 | 0.5783 | 0.0 | 0.0391 | 0.7713 | 0.1708 | 0.5383 | 0.3233 | 0.3755 | nan | 0.2561 | 0.5111 | 0.3920 | 0.0 | 0.8877 | 0.7657 | 0.9540 | 0.0100 | 0.1305 | 0.3816 | 0.0 | | 0.0676 | 38.7 | 7740 | 0.7302 | 0.4197 | 0.5022 | 0.8779 | nan | 0.8427 | 0.9615 | 0.8072 | 0.8324 | 0.5191 | nan | 0.7091 | 0.8190 | 0.1884 | 0.9549 | 0.1308 | 0.0 | 0.7349 | 0.0611 | 0.7321 | 0.0 | 0.0403 | 0.9298 | 0.2212 | 0.6389 | 0.3802 | 0.5425 | nan | 0.3131 | 0.6210 | 0.5711 | 0.0 | 0.9553 | 0.8361 | 0.9837 | 0.0132 | 0.2581 | 0.4737 | 0.0 | nan | 0.7592 | 0.8737 | 0.7083 | 0.7446 | 0.4115 | nan | 0.5851 | 0.6418 | 0.1032 | 0.8648 | 0.1133 | 0.0 | 0.4610 | 0.0611 | 0.5807 | 0.0 | 0.0396 | 0.7703 | 0.1939 | 0.5360 | 0.3207 | 0.3842 | nan | 0.2520 | 0.5080 | 0.3975 | 0.0 | 0.8857 | 0.7559 | 0.9533 | 0.0114 | 0.1337 | 0.3807 | 0.0 | | 0.0372 | 38.8 | 7760 | 0.7275 | 0.4183 | 0.4979 | 0.8781 | nan | 0.8438 | 0.9619 | 0.8070 | 0.8256 | 0.5068 | nan | 0.7279 | 0.8360 | 0.1696 | 0.9537 | 0.1189 | 0.0 | 0.7087 | 0.0612 | 0.7082 | 0.0 | 0.0407 | 0.9307 | 0.2306 | 0.6416 | 
0.3363 | 0.4796 | nan | 0.3186 | 0.6241 | 0.5594 | 0.0 | 0.9526 | 0.8501 | 0.9834 | 0.0108 | 0.2527 | 0.4908 | 0.0 | nan | 0.7588 | 0.8734 | 0.7065 | 0.7448 | 0.4069 | nan | 0.5842 | 0.6397 | 0.0891 | 0.8659 | 0.1040 | 0.0 | 0.4616 | 0.0612 | 0.5898 | 0.0 | 0.0400 | 0.7725 | 0.2028 | 0.5434 | 0.2970 | 0.3622 | nan | 0.2571 | 0.5070 | 0.3964 | 0.0 | 0.8866 | 0.7624 | 0.9531 | 0.0095 | 0.1274 | 0.3815 | 0.0 | | 0.1237 | 38.9 | 7780 | 0.7390 | 0.4173 | 0.4961 | 0.8774 | nan | 0.8399 | 0.9606 | 0.8099 | 0.8417 | 0.5393 | nan | 0.7047 | 0.8089 | 0.1372 | 0.9554 | 0.1194 | 0.0 | 0.7419 | 0.0579 | 0.7088 | 0.0 | 0.0401 | 0.9338 | 0.2093 | 0.6252 | 0.3535 | 0.5579 | nan | 0.2733 | 0.6027 | 0.5453 | 0.0 | 0.9556 | 0.8218 | 0.9823 | 0.0119 | 0.2484 | 0.4868 | 0.0 | nan | 0.7578 | 0.8762 | 0.6966 | 0.7555 | 0.4077 | nan | 0.5903 | 0.6490 | 0.0734 | 0.8654 | 0.1039 | 0.0 | 0.4925 | 0.0579 | 0.5851 | 0.0 | 0.0395 | 0.7697 | 0.1866 | 0.5410 | 0.3043 | 0.3889 | nan | 0.2289 | 0.5039 | 0.3952 | 0.0 | 0.8828 | 0.7474 | 0.9537 | 0.0093 | 0.1161 | 0.3756 | 0.0 | | 0.0878 | 39.0 | 7800 | 0.7379 | 0.4135 | 0.4932 | 0.8779 | nan | 0.8465 | 0.9608 | 0.8103 | 0.8408 | 0.5221 | nan | 0.7165 | 0.8213 | 0.1609 | 0.9555 | 0.1149 | 0.0 | 0.6741 | 0.0498 | 0.7193 | 0.0 | 0.0385 | 0.9362 | 0.1633 | 0.6127 | 0.3813 | 0.5541 | nan | 0.2256 | 0.6193 | 0.5516 | 0.0 | 0.9528 | 0.8320 | 0.9839 | 0.0110 | 0.2697 | 0.4585 | 0.0 | nan | 0.7617 | 0.8781 | 0.6954 | 0.7586 | 0.4065 | nan | 0.5940 | 0.6398 | 0.0858 | 0.8650 | 0.0996 | 0.0 | 0.4466 | 0.0498 | 0.5844 | 0.0 | 0.0380 | 0.7653 | 0.1480 | 0.5314 | 0.3164 | 0.3876 | nan | 0.1939 | 0.5069 | 0.3963 | 0.0 | 0.8841 | 0.7475 | 0.9528 | 0.0084 | 0.1212 | 0.3684 | 0.0 | | 0.062 | 39.1 | 7820 | 0.7108 | 0.4157 | 0.4923 | 0.8784 | nan | 0.8438 | 0.9597 | 0.7949 | 0.8504 | 0.5365 | nan | 0.7155 | 0.8291 | 0.0994 | 0.9575 | 0.1482 | 0.0 | 0.6562 | 0.0225 | 0.6960 | 0.0 | 0.0527 | 0.9367 | 0.1711 | 0.6192 | 0.3725 | 0.5414 | nan | 0.2670 | 0.6328 | 0.5576 | 
0.0 | 0.9509 | 0.8439 | 0.9822 | 0.0065 | 0.2352 | 0.4735 | 0.0 | nan | 0.7621 | 0.8780 | 0.7092 | 0.7552 | 0.4032 | nan | 0.5936 | 0.6374 | 0.0592 | 0.8633 | 0.1272 | 0.0 | 0.4597 | 0.0225 | 0.5862 | 0.0 | 0.0517 | 0.7645 | 0.1543 | 0.5282 | 0.3163 | 0.3868 | nan | 0.2224 | 0.5101 | 0.4008 | 0.0 | 0.8880 | 0.7637 | 0.9543 | 0.0051 | 0.1229 | 0.3766 | 0.0 | | 0.1004 | 39.2 | 7840 | 0.7033 | 0.4185 | 0.4942 | 0.8791 | nan | 0.8559 | 0.9565 | 0.7930 | 0.8491 | 0.5355 | nan | 0.7101 | 0.8301 | 0.1027 | 0.9562 | 0.1725 | 0.0 | 0.6637 | 0.0242 | 0.7070 | 0.0 | 0.0722 | 0.9286 | 0.1791 | 0.6424 | 0.3857 | 0.5214 | nan | 0.2739 | 0.6100 | 0.5467 | 0.0 | 0.9586 | 0.8115 | 0.9831 | 0.0036 | 0.2308 | 0.5100 | 0.0 | nan | 0.7675 | 0.8797 | 0.7109 | 0.7577 | 0.3964 | nan | 0.5891 | 0.6448 | 0.0609 | 0.8643 | 0.1490 | 0.0 | 0.4624 | 0.0242 | 0.5880 | 0.0 | 0.0703 | 0.7721 | 0.1616 | 0.5424 | 0.3222 | 0.3832 | nan | 0.2333 | 0.5032 | 0.3985 | 0.0 | 0.8826 | 0.7440 | 0.9536 | 0.0029 | 0.1356 | 0.3931 | 0.0 | | 0.0602 | 39.3 | 7860 | 0.7216 | 0.4169 | 0.4948 | 0.8780 | nan | 0.8340 | 0.9595 | 0.8077 | 0.8507 | 0.5321 | nan | 0.7220 | 0.8371 | 0.1258 | 0.9539 | 0.1473 | 0.0 | 0.6460 | 0.0272 | 0.7104 | 0.0 | 0.0401 | 0.9283 | 0.2397 | 0.6438 | 0.3579 | 0.4705 | nan | 0.2827 | 0.6228 | 0.5608 | 0.0 | 0.9569 | 0.8349 | 0.9816 | 0.0061 | 0.2532 | 0.4996 | 0.0 | nan | 0.7584 | 0.8766 | 0.6849 | 0.7489 | 0.3992 | nan | 0.5886 | 0.6289 | 0.0706 | 0.8663 | 0.1302 | 0.0 | 0.4607 | 0.0272 | 0.5851 | 0.0 | 0.0398 | 0.7749 | 0.2118 | 0.5500 | 0.3089 | 0.3623 | nan | 0.2385 | 0.5067 | 0.3993 | 0.0 | 0.8862 | 0.7622 | 0.9543 | 0.0048 | 0.1298 | 0.3845 | 0.0 | | 0.0722 | 39.4 | 7880 | 0.7279 | 0.4158 | 0.4944 | 0.8793 | nan | 0.8447 | 0.9610 | 0.8107 | 0.8430 | 0.5297 | nan | 0.7024 | 0.8319 | 0.1052 | 0.9593 | 0.1157 | 0.0 | 0.6465 | 0.0196 | 0.7043 | 0.0 | 0.0278 | 0.9274 | 0.2287 | 0.6582 | 0.3815 | 0.4803 | nan | 0.2805 | 0.6389 | 0.5961 | 0.0 | 0.9555 | 0.8343 | 0.9837 | 0.0073 | 0.2698 | 
0.4762 | 0.0 | nan | 0.7635 | 0.8777 | 0.6774 | 0.7572 | 0.4057 | nan | 0.5842 | 0.6384 | 0.0617 | 0.8630 | 0.1021 | 0.0 | 0.4500 | 0.0196 | 0.5784 | 0.0 | 0.0277 | 0.7755 | 0.2023 | 0.5536 | 0.3216 | 0.3683 | nan | 0.2345 | 0.5113 | 0.4037 | 0.0 | 0.8876 | 0.7632 | 0.9530 | 0.0058 | 0.1393 | 0.3805 | 0.0 | | 0.0719 | 39.5 | 7900 | 0.7208 | 0.4149 | 0.4961 | 0.8786 | nan | 0.8398 | 0.9592 | 0.8124 | 0.8508 | 0.5282 | nan | 0.7058 | 0.8316 | 0.1139 | 0.9600 | 0.1281 | 0.0 | 0.6649 | 0.0164 | 0.7065 | 0.0 | 0.0230 | 0.9300 | 0.1827 | 0.6428 | 0.4054 | 0.5390 | nan | 0.2653 | 0.6270 | 0.5696 | 0.0 | 0.9535 | 0.8449 | 0.9823 | 0.0082 | 0.2985 | 0.4852 | 0.0 | nan | 0.7593 | 0.8784 | 0.6730 | 0.7511 | 0.4080 | nan | 0.5888 | 0.6418 | 0.0683 | 0.8619 | 0.1105 | 0.0 | 0.4541 | 0.0164 | 0.5749 | 0.0 | 0.0230 | 0.7736 | 0.1657 | 0.5479 | 0.3315 | 0.3771 | nan | 0.2269 | 0.5095 | 0.4005 | 0.0 | 0.8888 | 0.7629 | 0.9537 | 0.0065 | 0.1402 | 0.3813 | 0.0 | | 0.0846 | 39.6 | 7920 | 0.6984 | 0.4183 | 0.4961 | 0.8808 | nan | 0.8589 | 0.9569 | 0.8112 | 0.8473 | 0.5346 | nan | 0.7155 | 0.8169 | 0.0974 | 0.9561 | 0.1416 | 0.0 | 0.6767 | 0.0101 | 0.7005 | 0.0 | 0.0254 | 0.9290 | 0.1983 | 0.6502 | 0.3968 | 0.5309 | nan | 0.2897 | 0.6273 | 0.5516 | 0.0 | 0.9538 | 0.8456 | 0.9819 | 0.0081 | 0.2572 | 0.5060 | 0.0 | nan | 0.7752 | 0.8828 | 0.6751 | 0.7567 | 0.4072 | nan | 0.5912 | 0.6550 | 0.0613 | 0.8624 | 0.1209 | 0.0 | 0.4808 | 0.0101 | 0.5745 | 0.0 | 0.0253 | 0.7752 | 0.1782 | 0.5485 | 0.3296 | 0.3791 | nan | 0.2458 | 0.5081 | 0.4058 | 0.0 | 0.8873 | 0.7594 | 0.9539 | 0.0062 | 0.1392 | 0.3899 | 0.0 | | 0.0809 | 39.7 | 7940 | 0.7169 | 0.4148 | 0.4923 | 0.8793 | nan | 0.8536 | 0.9586 | 0.8105 | 0.8475 | 0.5322 | nan | 0.6982 | 0.8256 | 0.0994 | 0.9558 | 0.1317 | 0.0 | 0.6672 | 0.0141 | 0.7055 | 0.0 | 0.0191 | 0.9298 | 0.1881 | 0.6427 | 0.3488 | 0.5112 | nan | 0.2933 | 0.6179 | 0.5871 | 0.0 | 0.9564 | 0.8229 | 0.9850 | 0.0119 | 0.2337 | 0.5052 | 0.0 | nan | 0.7701 | 0.8802 | 0.6757 | 
0.7586 | 0.4029 | nan | 0.5886 | 0.6480 | 0.0595 | 0.8637 | 0.1131 | 0.0 | 0.4732 | 0.0141 | 0.5779 | 0.0 | 0.0190 | 0.7734 | 0.1697 | 0.5488 | 0.3030 | 0.3724 | nan | 0.2432 | 0.5065 | 0.4009 | 0.0 | 0.8848 | 0.7553 | 0.9522 | 0.0083 | 0.1235 | 0.3883 | 0.0 | | 0.0945 | 39.8 | 7960 | 0.7017 | 0.4152 | 0.4958 | 0.8797 | nan | 0.8576 | 0.9532 | 0.8158 | 0.8498 | 0.5588 | nan | 0.7138 | 0.8261 | 0.1482 | 0.9577 | 0.1459 | 0.0 | 0.6522 | 0.0102 | 0.7086 | 0.0 | 0.0185 | 0.9304 | 0.1772 | 0.6265 | 0.3742 | 0.5060 | nan | 0.2929 | 0.6277 | 0.5883 | 0.0 | 0.9563 | 0.8431 | 0.9832 | 0.0089 | 0.2584 | 0.4771 | 0.0 | nan | 0.7751 | 0.8855 | 0.6741 | 0.7566 | 0.3922 | nan | 0.5920 | 0.6459 | 0.0836 | 0.8614 | 0.1255 | 0.0 | 0.4616 | 0.0102 | 0.5758 | 0.0 | 0.0184 | 0.7704 | 0.1604 | 0.5422 | 0.3159 | 0.3706 | nan | 0.2470 | 0.5085 | 0.3991 | 0.0 | 0.8875 | 0.7653 | 0.9526 | 0.0060 | 0.1240 | 0.3785 | 0.0 | | 0.0709 | 39.9 | 7980 | 0.6824 | 0.4177 | 0.4990 | 0.8802 | nan | 0.8637 | 0.9545 | 0.8064 | 0.8450 | 0.5502 | nan | 0.7172 | 0.8241 | 0.1955 | 0.9567 | 0.1354 | 0.0 | 0.6952 | 0.0083 | 0.7013 | 0.0 | 0.0200 | 0.9256 | 0.1924 | 0.6498 | 0.3677 | 0.5193 | nan | 0.3254 | 0.6198 | 0.5777 | 0.0 | 0.9556 | 0.8344 | 0.9831 | 0.0078 | 0.2394 | 0.4968 | 0.0 | nan | 0.7781 | 0.8862 | 0.7001 | 0.7581 | 0.3762 | nan | 0.5914 | 0.6247 | 0.1057 | 0.8610 | 0.1174 | 0.0 | 0.4881 | 0.0083 | 0.5716 | 0.0 | 0.0199 | 0.7752 | 0.1735 | 0.5509 | 0.3146 | 0.3802 | nan | 0.2705 | 0.5070 | 0.3950 | 0.0 | 0.8869 | 0.7625 | 0.9528 | 0.0055 | 0.1265 | 0.3797 | 0.0 | | 0.0547 | 40.0 | 8000 | 0.7111 | 0.4160 | 0.4967 | 0.8793 | nan | 0.8592 | 0.9570 | 0.7988 | 0.8468 | 0.5238 | nan | 0.7063 | 0.8328 | 0.1482 | 0.9541 | 0.1329 | 0.0 | 0.6587 | 0.0168 | 0.7095 | 0.0 | 0.0289 | 0.9318 | 0.2070 | 0.6181 | 0.3898 | 0.5362 | nan | 0.3203 | 0.6264 | 0.5822 | 0.0 | 0.9551 | 0.8464 | 0.9843 | 0.0083 | 0.2529 | 0.4631 | 0.0 | nan | 0.7693 | 0.8781 | 0.7153 | 0.7588 | 0.3972 | nan | 0.5880 | 0.6141 | 0.0841 | 
0.8626 | 0.1162 | 0.0 | 0.4551 | 0.0168 | 0.5758 | 0.0 | 0.0288 | 0.7695 | 0.1828 | 0.5380 | 0.3252 | 0.3743 | nan | 0.2568 | 0.5091 | 0.3966 | 0.0 | 0.8883 | 0.7678 | 0.9528 | 0.0061 | 0.1174 | 0.3673 | 0.0 | | 0.0858 | 40.1 | 8020 | 0.7221 | 0.4135 | 0.4925 | 0.8782 | nan | 0.8552 | 0.9559 | 0.7976 | 0.8474 | 0.5309 | nan | 0.7098 | 0.8381 | 0.1045 | 0.9552 | 0.1035 | 0.0 | 0.6336 | 0.0115 | 0.7002 | 0.0 | 0.0388 | 0.9335 | 0.2127 | 0.5985 | 0.3998 | 0.5091 | nan | 0.3033 | 0.6293 | 0.5819 | 0.0 | 0.9555 | 0.8360 | 0.9838 | 0.0103 | 0.2591 | 0.4654 | 0.0 | nan | 0.7694 | 0.8781 | 0.7211 | 0.7589 | 0.3944 | nan | 0.5895 | 0.6134 | 0.0608 | 0.8625 | 0.0920 | 0.0 | 0.4409 | 0.0115 | 0.5837 | 0.0 | 0.0385 | 0.7643 | 0.1876 | 0.5244 | 0.3306 | 0.3740 | nan | 0.2456 | 0.5079 | 0.3983 | 0.0 | 0.8871 | 0.7570 | 0.9532 | 0.0073 | 0.1129 | 0.3659 | 0.0 | | 0.0695 | 40.2 | 8040 | 0.7099 | 0.4159 | 0.4970 | 0.8800 | nan | 0.8637 | 0.9564 | 0.7999 | 0.8444 | 0.5332 | nan | 0.7073 | 0.8490 | 0.1056 | 0.9560 | 0.1068 | 0.0 | 0.6443 | 0.0205 | 0.7023 | 0.0 | 0.0346 | 0.9285 | 0.2501 | 0.6405 | 0.3662 | 0.5070 | nan | 0.3411 | 0.6337 | 0.5808 | 0.0 | 0.9531 | 0.8505 | 0.9833 | 0.0080 | 0.2765 | 0.4612 | 0.0 | nan | 0.7759 | 0.8817 | 0.7130 | 0.7633 | 0.3899 | nan | 0.5911 | 0.6022 | 0.0613 | 0.8624 | 0.0944 | 0.0 | 0.4481 | 0.0205 | 0.5844 | 0.0 | 0.0342 | 0.7705 | 0.2157 | 0.5417 | 0.3178 | 0.3717 | nan | 0.2610 | 0.5101 | 0.3980 | 0.0 | 0.8885 | 0.7637 | 0.9531 | 0.0057 | 0.1230 | 0.3653 | 0.0 | | 0.0796 | 40.3 | 8060 | 0.7107 | 0.4157 | 0.4954 | 0.8796 | nan | 0.8609 | 0.9568 | 0.7905 | 0.8493 | 0.5125 | nan | 0.7184 | 0.8283 | 0.1212 | 0.9543 | 0.1012 | 0.0 | 0.6666 | 0.0362 | 0.7047 | 0.0 | 0.0299 | 0.9276 | 0.1844 | 0.6379 | 0.3452 | 0.5341 | nan | 0.3662 | 0.6248 | 0.5844 | 0.0 | 0.9572 | 0.8399 | 0.9828 | 0.0140 | 0.2255 | 0.4967 | 0.0 | nan | 0.7721 | 0.8809 | 0.7172 | 0.7599 | 0.3866 | nan | 0.5904 | 0.6015 | 0.0668 | 0.8629 | 0.0898 | 0.0 | 0.4694 | 0.0362 | 0.5856 | 
0.0 | 0.0297 | 0.7715 | 0.1673 | 0.5417 | 0.3024 | 0.3677 | nan | 0.2845 | 0.5088 | 0.3984 | 0.0 | 0.8865 | 0.7622 | 0.9529 | 0.0097 | 0.1214 | 0.3776 | 0.0 | | 0.0724 | 40.4 | 8080 | 0.7106 | 0.4169 | 0.5013 | 0.8799 | nan | 0.8624 | 0.9575 | 0.7998 | 0.8468 | 0.5267 | nan | 0.7205 | 0.8300 | 0.1562 | 0.9553 | 0.1078 | 0.0 | 0.6866 | 0.0474 | 0.7193 | 0.0 | 0.0365 | 0.9303 | 0.1972 | 0.6227 | 0.3779 | 0.5839 | nan | 0.3352 | 0.6225 | 0.5880 | 0.0 | 0.9522 | 0.8457 | 0.9839 | 0.0096 | 0.2625 | 0.4761 | 0.0 | nan | 0.7742 | 0.8807 | 0.7120 | 0.7606 | 0.3870 | nan | 0.5935 | 0.6092 | 0.0852 | 0.8635 | 0.0951 | 0.0 | 0.4542 | 0.0474 | 0.5808 | 0.0 | 0.0362 | 0.7702 | 0.1771 | 0.5395 | 0.3223 | 0.3795 | nan | 0.2676 | 0.5092 | 0.3938 | 0.0 | 0.8893 | 0.7672 | 0.9525 | 0.0073 | 0.1162 | 0.3706 | 0.0 | | 0.0861 | 40.5 | 8100 | 0.7126 | 0.4154 | 0.4968 | 0.8788 | nan | 0.8509 | 0.9591 | 0.8039 | 0.8473 | 0.5250 | nan | 0.7123 | 0.8259 | 0.1234 | 0.9525 | 0.1220 | 0.0 | 0.6810 | 0.0406 | 0.7185 | 0.0 | 0.0481 | 0.9290 | 0.2089 | 0.6287 | 0.3642 | 0.5502 | nan | 0.2832 | 0.6238 | 0.5661 | 0.0 | 0.9556 | 0.8417 | 0.9821 | 0.0075 | 0.2778 | 0.4699 | 0.0 | nan | 0.7683 | 0.8775 | 0.7027 | 0.7596 | 0.3931 | nan | 0.5943 | 0.6091 | 0.0731 | 0.8649 | 0.1067 | 0.0 | 0.4585 | 0.0406 | 0.5824 | 0.0 | 0.0475 | 0.7699 | 0.1858 | 0.5358 | 0.3135 | 0.3763 | nan | 0.2335 | 0.5090 | 0.3943 | 0.0 | 0.8875 | 0.7648 | 0.9533 | 0.0059 | 0.1189 | 0.3664 | 0.0 | | 0.0864 | 40.6 | 8120 | 0.7126 | 0.4158 | 0.4996 | 0.8781 | nan | 0.8529 | 0.9577 | 0.7999 | 0.8500 | 0.5235 | nan | 0.7092 | 0.8165 | 0.1371 | 0.9546 | 0.1320 | 0.0 | 0.7268 | 0.0451 | 0.7072 | 0.0 | 0.0315 | 0.9346 | 0.1732 | 0.6225 | 0.3816 | 0.6257 | nan | 0.2730 | 0.6131 | 0.5802 | 0.0 | 0.9530 | 0.8303 | 0.9835 | 0.0093 | 0.2946 | 0.4686 | 0.0 | nan | 0.7672 | 0.8786 | 0.7101 | 0.7576 | 0.3883 | nan | 0.5942 | 0.6285 | 0.0779 | 0.8631 | 0.1131 | 0.0 | 0.4647 | 0.0451 | 0.5853 | 0.0 | 0.0312 | 0.7671 | 0.1563 | 0.5319 | 0.3204 | 
0.4060 | nan | 0.2306 | 0.5065 | 0.3911 | 0.0 | 0.8866 | 0.7560 | 0.9531 | 0.0071 | 0.1212 | 0.3685 | 0.0 | | 0.0489 | 40.7 | 8140 | 0.7001 | 0.4168 | 0.4990 | 0.8786 | nan | 0.8581 | 0.9576 | 0.7935 | 0.8392 | 0.5069 | nan | 0.7259 | 0.7895 | 0.2352 | 0.9557 | 0.1464 | 0.0 | 0.7005 | 0.0433 | 0.6987 | 0.0 | 0.0159 | 0.9282 | 0.1917 | 0.6173 | 0.3791 | 0.5618 | nan | 0.2642 | 0.6416 | 0.5948 | 0.0 | 0.9565 | 0.8391 | 0.9837 | 0.0108 | 0.2517 | 0.4799 | 0.0 | nan | 0.7651 | 0.8791 | 0.7217 | 0.7586 | 0.3793 | nan | 0.5944 | 0.6412 | 0.0998 | 0.8614 | 0.1255 | 0.0 | 0.4618 | 0.0433 | 0.5817 | 0.0 | 0.0158 | 0.7689 | 0.1718 | 0.5362 | 0.3195 | 0.3800 | nan | 0.2271 | 0.5068 | 0.3928 | 0.0 | 0.8861 | 0.7587 | 0.9531 | 0.0083 | 0.1205 | 0.3787 | 0.0 | | 0.0885 | 40.8 | 8160 | 0.7055 | 0.4151 | 0.4967 | 0.8789 | nan | 0.8546 | 0.9578 | 0.8063 | 0.8368 | 0.5227 | nan | 0.7276 | 0.8306 | 0.1052 | 0.9568 | 0.1536 | 0.0 | 0.7151 | 0.0256 | 0.6851 | 0.0 | 0.0230 | 0.9323 | 0.1821 | 0.6141 | 0.3939 | 0.5730 | nan | 0.2549 | 0.6408 | 0.5897 | 0.0 | 0.9506 | 0.8533 | 0.9838 | 0.0067 | 0.2379 | 0.4792 | 0.0 | nan | 0.7664 | 0.8800 | 0.7130 | 0.7574 | 0.3786 | nan | 0.5950 | 0.6417 | 0.0597 | 0.8608 | 0.1306 | 0.0 | 0.4675 | 0.0256 | 0.5791 | 0.0 | 0.0230 | 0.7671 | 0.1637 | 0.5323 | 0.3301 | 0.3915 | nan | 0.2119 | 0.5075 | 0.3876 | 0.0 | 0.8901 | 0.7687 | 0.9532 | 0.0054 | 0.1181 | 0.3782 | 0.0 | | 0.0666 | 40.9 | 8180 | 0.6983 | 0.4148 | 0.4943 | 0.8804 | nan | 0.8615 | 0.9583 | 0.8135 | 0.8462 | 0.5189 | nan | 0.7181 | 0.8330 | 0.0987 | 0.9506 | 0.1547 | 0.0 | 0.6725 | 0.0495 | 0.7100 | 0.0 | 0.0287 | 0.9267 | 0.1776 | 0.6343 | 0.3780 | 0.5432 | nan | 0.2443 | 0.6137 | 0.5769 | 0.0 | 0.9584 | 0.8384 | 0.9828 | 0.0066 | 0.2333 | 0.4880 | 0.0 | nan | 0.7760 | 0.8835 | 0.6975 | 0.7577 | 0.3847 | nan | 0.5952 | 0.6365 | 0.0593 | 0.8659 | 0.1333 | 0.0 | 0.4577 | 0.0495 | 0.5792 | 0.0 | 0.0286 | 0.7722 | 0.1602 | 0.5420 | 0.3217 | 0.3705 | nan | 0.2078 | 0.5065 | 0.3912 | 0.0 | 
0.8859 | 0.7595 | 0.9537 | 0.0053 | 0.1169 | 0.3758 | 0.0 | | 0.0927 | 41.0 | 8200 | 0.7110 | 0.4152 | 0.4938 | 0.8796 | nan | 0.8595 | 0.9572 | 0.8074 | 0.8472 | 0.5385 | nan | 0.7135 | 0.8222 | 0.0817 | 0.9568 | 0.1376 | 0.0 | 0.6614 | 0.0422 | 0.7135 | 0.0 | 0.0282 | 0.9262 | 0.1968 | 0.6167 | 0.3808 | 0.5397 | nan | 0.2615 | 0.6287 | 0.5838 | 0.0 | 0.9574 | 0.8236 | 0.9837 | 0.0121 | 0.2220 | 0.5007 | 0.0 | nan | 0.7749 | 0.8832 | 0.7024 | 0.7589 | 0.3841 | nan | 0.5957 | 0.6435 | 0.0512 | 0.8635 | 0.1196 | 0.0 | 0.4658 | 0.0422 | 0.5789 | 0.0 | 0.0281 | 0.7691 | 0.1761 | 0.5324 | 0.3221 | 0.3748 | nan | 0.2213 | 0.5070 | 0.3918 | 0.0 | 0.8855 | 0.7553 | 0.9531 | 0.0094 | 0.1149 | 0.3822 | 0.0 | | 0.0784 | 41.1 | 8220 | 0.7319 | 0.4160 | 0.4940 | 0.8790 | nan | 0.8561 | 0.9583 | 0.8089 | 0.8384 | 0.5278 | nan | 0.7092 | 0.8246 | 0.0814 | 0.9545 | 0.1391 | 0.0 | 0.6316 | 0.0590 | 0.7229 | 0.0 | 0.0238 | 0.9329 | 0.2112 | 0.6189 | 0.3859 | 0.5190 | nan | 0.2873 | 0.6168 | 0.5891 | 0.0 | 0.9543 | 0.8355 | 0.9833 | 0.0165 | 0.2376 | 0.4825 | 0.0 | nan | 0.7691 | 0.8808 | 0.6975 | 0.7617 | 0.3839 | nan | 0.5908 | 0.6494 | 0.0511 | 0.8644 | 0.1202 | 0.0 | 0.4527 | 0.0590 | 0.5800 | 0.0 | 0.0237 | 0.7681 | 0.1869 | 0.5344 | 0.3263 | 0.3720 | nan | 0.2333 | 0.5073 | 0.3929 | 0.0 | 0.8865 | 0.7570 | 0.9537 | 0.0123 | 0.1167 | 0.3791 | 0.0 | | 0.0667 | 41.2 | 8240 | 0.7222 | 0.4126 | 0.4903 | 0.8783 | nan | 0.8543 | 0.9550 | 0.8106 | 0.8437 | 0.5405 | nan | 0.7182 | 0.8463 | 0.0717 | 0.9550 | 0.1303 | 0.0 | 0.6477 | 0.0313 | 0.7111 | 0.0 | 0.0306 | 0.9294 | 0.1755 | 0.6283 | 0.3626 | 0.4902 | nan | 0.2610 | 0.6342 | 0.5772 | 0.0 | 0.9563 | 0.8257 | 0.9828 | 0.0121 | 0.2199 | 0.4890 | 0.0 | nan | 0.7670 | 0.8804 | 0.6984 | 0.7617 | 0.3820 | nan | 0.5916 | 0.6406 | 0.0455 | 0.8624 | 0.1137 | 0.0 | 0.4502 | 0.0313 | 0.5807 | 0.0 | 0.0305 | 0.7701 | 0.1591 | 0.5380 | 0.3146 | 0.3684 | nan | 0.2147 | 0.5078 | 0.3928 | 0.0 | 0.8845 | 0.7513 | 0.9536 | 0.0095 | 0.1195 | 0.3833 
| 0.0 | | 0.0719 | 41.3 | 8260 | 0.7150 | 0.4129 | 0.4889 | 0.8786 | nan | 0.8482 | 0.9565 | 0.8046 | 0.8534 | 0.5374 | nan | 0.7099 | 0.8436 | 0.0664 | 0.9558 | 0.1160 | 0.0 | 0.6357 | 0.0288 | 0.7091 | 0.0 | 0.0459 | 0.9284 | 0.1771 | 0.6290 | 0.3729 | 0.4730 | nan | 0.2386 | 0.6250 | 0.5816 | 0.0 | 0.9566 | 0.8394 | 0.9835 | 0.0035 | 0.2339 | 0.4895 | 0.0 | nan | 0.7638 | 0.8790 | 0.7074 | 0.7540 | 0.3897 | nan | 0.5946 | 0.6470 | 0.0429 | 0.8617 | 0.1015 | 0.0 | 0.4443 | 0.0288 | 0.5846 | 0.0 | 0.0458 | 0.7694 | 0.1605 | 0.5365 | 0.3197 | 0.3608 | nan | 0.2000 | 0.5086 | 0.3954 | 0.0 | 0.8862 | 0.7583 | 0.9526 | 0.0030 | 0.1288 | 0.3859 | 0.0 | | 0.058 | 41.4 | 8280 | 0.7249 | 0.4149 | 0.4906 | 0.8788 | nan | 0.8595 | 0.9577 | 0.8047 | 0.8316 | 0.5300 | nan | 0.7112 | 0.8323 | 0.0734 | 0.9546 | 0.1188 | 0.0 | 0.6529 | 0.0564 | 0.7018 | 0.0 | 0.0388 | 0.9337 | 0.2297 | 0.6200 | 0.3705 | 0.4951 | nan | 0.2212 | 0.6194 | 0.5791 | 0.0 | 0.9535 | 0.8400 | 0.9823 | 0.0089 | 0.2406 | 0.4811 | 0.0 | nan | 0.7669 | 0.8796 | 0.7100 | 0.7539 | 0.3916 | nan | 0.5929 | 0.6481 | 0.0462 | 0.8639 | 0.1034 | 0.0 | 0.4556 | 0.0564 | 0.5855 | 0.0 | 0.0385 | 0.7658 | 0.1999 | 0.5280 | 0.3195 | 0.3666 | nan | 0.1890 | 0.5092 | 0.3935 | 0.0 | 0.8874 | 0.7598 | 0.9533 | 0.0071 | 0.1239 | 0.3806 | 0.0 | | 0.0713 | 41.5 | 8300 | 0.7246 | 0.4159 | 0.4938 | 0.8796 | nan | 0.8617 | 0.9582 | 0.8017 | 0.8346 | 0.5269 | nan | 0.7055 | 0.8484 | 0.0846 | 0.9547 | 0.1224 | 0.0 | 0.6546 | 0.0629 | 0.7093 | 0.0 | 0.0433 | 0.9283 | 0.2270 | 0.6212 | 0.3884 | 0.5291 | nan | 0.2168 | 0.6255 | 0.5767 | 0.0 | 0.9544 | 0.8536 | 0.9821 | 0.0084 | 0.2440 | 0.4769 | 0.0 | nan | 0.7708 | 0.8794 | 0.7099 | 0.7561 | 0.3945 | nan | 0.5920 | 0.6368 | 0.0546 | 0.8646 | 0.1063 | 0.0 | 0.4508 | 0.0629 | 0.5847 | 0.0 | 0.0426 | 0.7672 | 0.1970 | 0.5285 | 0.3282 | 0.3829 | nan | 0.1803 | 0.5096 | 0.3944 | 0.0 | 0.8888 | 0.7674 | 0.9534 | 0.0067 | 0.1231 | 0.3763 | 0.0 | | 0.0749 | 41.6 | 8320 | 0.7235 | 0.4159 | 
0.4939 | 0.8791 | nan | 0.8596 | 0.9574 | 0.7982 | 0.8473 | 0.5339 | nan | 0.7065 | 0.8321 | 0.1159 | 0.9565 | 0.1260 | 0.0 | 0.6461 | 0.0717 | 0.7121 | 0.0 | 0.0399 | 0.9331 | 0.2266 | 0.6165 | 0.3872 | 0.5393 | nan | 0.2080 | 0.6241 | 0.5663 | 0.0 | 0.9519 | 0.8333 | 0.9848 | 0.0066 | 0.2445 | 0.4781 | 0.0 | nan | 0.7714 | 0.8796 | 0.7094 | 0.7562 | 0.3935 | nan | 0.5940 | 0.6356 | 0.0654 | 0.8632 | 0.1084 | 0.0 | 0.4528 | 0.0717 | 0.5806 | 0.0 | 0.0393 | 0.7653 | 0.1969 | 0.5249 | 0.3302 | 0.3871 | nan | 0.1723 | 0.5094 | 0.3931 | 0.0 | 0.8876 | 0.7604 | 0.9525 | 0.0054 | 0.1249 | 0.3762 | 0.0 | | 0.0797 | 41.7 | 8340 | 0.7274 | 0.4130 | 0.4909 | 0.8792 | nan | 0.8519 | 0.9562 | 0.8035 | 0.8531 | 0.5439 | nan | 0.7088 | 0.8430 | 0.0806 | 0.9565 | 0.1222 | 0.0 | 0.6614 | 0.0473 | 0.7077 | 0.0 | 0.0384 | 0.9308 | 0.1828 | 0.6351 | 0.3837 | 0.5151 | nan | 0.1893 | 0.6149 | 0.5878 | 0.0 | 0.9549 | 0.8449 | 0.9851 | 0.0052 | 0.2324 | 0.4736 | 0.0 | nan | 0.7700 | 0.8795 | 0.7001 | 0.7552 | 0.3956 | nan | 0.5961 | 0.6341 | 0.0483 | 0.8633 | 0.1057 | 0.0 | 0.4576 | 0.0472 | 0.5786 | 0.0 | 0.0380 | 0.7683 | 0.1633 | 0.5348 | 0.3249 | 0.3814 | nan | 0.1590 | 0.5099 | 0.3968 | 0.0 | 0.8878 | 0.7646 | 0.9524 | 0.0043 | 0.1245 | 0.3759 | 0.0 | | 0.0992 | 41.8 | 8360 | 0.7200 | 0.4129 | 0.4919 | 0.8783 | nan | 0.8496 | 0.9564 | 0.8070 | 0.8435 | 0.5330 | nan | 0.7220 | 0.8368 | 0.0732 | 0.9572 | 0.1298 | 0.0 | 0.6709 | 0.0622 | 0.7007 | 0.0 | 0.0269 | 0.9303 | 0.1829 | 0.6236 | 0.3705 | 0.5537 | nan | 0.1950 | 0.6268 | 0.5977 | 0.0 | 0.9553 | 0.8440 | 0.9844 | 0.0053 | 0.2268 | 0.4750 | 0.0 | nan | 0.7641 | 0.8773 | 0.6907 | 0.7587 | 0.3943 | nan | 0.5929 | 0.6437 | 0.0429 | 0.8629 | 0.1115 | 0.0 | 0.4596 | 0.0622 | 0.5825 | 0.0 | 0.0267 | 0.7682 | 0.1647 | 0.5321 | 0.3166 | 0.3840 | nan | 0.1642 | 0.5102 | 0.3949 | 0.0 | 0.8868 | 0.7622 | 0.9528 | 0.0044 | 0.1231 | 0.3788 | 0.0 | | 0.0582 | 41.9 | 8380 | 0.7296 | 0.4136 | 0.4924 | 0.8785 | nan | 0.8458 | 0.9592 | 0.8061 | 
0.8447 | 0.5136 | nan | 0.7207 | 0.8342 | 0.0641 | 0.9576 | 0.1245 | 0.0 | 0.6799 | 0.0590 | 0.7184 | 0.0 | 0.0293 | 0.9313 | 0.1980 | 0.6216 | 0.3837 | 0.5488 | nan | 0.2023 | 0.6359 | 0.5885 | 0.0 | 0.9538 | 0.8449 | 0.9838 | 0.0088 | 0.2121 | 0.4875 | 0.0 | nan | 0.7637 | 0.8770 | 0.6863 | 0.7567 | 0.3963 | nan | 0.5920 | 0.6472 | 0.0403 | 0.8635 | 0.1081 | 0.0 | 0.4642 | 0.0590 | 0.5765 | 0.0 | 0.0291 | 0.7673 | 0.1775 | 0.5307 | 0.3274 | 0.3845 | nan | 0.1720 | 0.5113 | 0.3951 | 0.0 | 0.8879 | 0.7644 | 0.9531 | 0.0069 | 0.1164 | 0.3815 | 0.0 | | 0.0701 | 42.0 | 8400 | 0.7337 | 0.4166 | 0.4941 | 0.8794 | nan | 0.8473 | 0.9594 | 0.8017 | 0.8461 | 0.5233 | nan | 0.7117 | 0.8309 | 0.0828 | 0.9545 | 0.1145 | 0.0 | 0.6676 | 0.0605 | 0.7218 | 0.0 | 0.0342 | 0.9306 | 0.2053 | 0.6414 | 0.3719 | 0.5456 | nan | 0.2417 | 0.6270 | 0.5795 | 0.0 | 0.9551 | 0.8547 | 0.9828 | 0.0081 | 0.2259 | 0.4839 | 0.0 | nan | 0.7642 | 0.8767 | 0.7027 | 0.7549 | 0.3968 | nan | 0.5939 | 0.6486 | 0.0503 | 0.8654 | 0.1005 | 0.0 | 0.4616 | 0.0605 | 0.5804 | 0.0 | 0.0339 | 0.7705 | 0.1820 | 0.5421 | 0.3226 | 0.3825 | nan | 0.2008 | 0.5129 | 0.4000 | 0.0 | 0.8890 | 0.7714 | 0.9532 | 0.0065 | 0.1244 | 0.3819 | 0.0 | | 0.0728 | 42.1 | 8420 | 0.7088 | 0.4176 | 0.4970 | 0.8802 | nan | 0.8586 | 0.9580 | 0.7946 | 0.8437 | 0.5241 | nan | 0.7327 | 0.8391 | 0.0650 | 0.9594 | 0.1225 | 0.0 | 0.6719 | 0.0386 | 0.7056 | 0.0 | 0.0373 | 0.9302 | 0.2239 | 0.6361 | 0.3748 | 0.5825 | nan | 0.2644 | 0.6285 | 0.5914 | 0.0 | 0.9539 | 0.8399 | 0.9820 | 0.0074 | 0.2436 | 0.4938 | 0.0 | nan | 0.7709 | 0.8803 | 0.7146 | 0.7559 | 0.3943 | nan | 0.5952 | 0.6516 | 0.0407 | 0.8623 | 0.1078 | 0.0 | 0.4596 | 0.0386 | 0.5840 | 0.0 | 0.0372 | 0.7724 | 0.1968 | 0.5428 | 0.3211 | 0.3926 | nan | 0.2153 | 0.5124 | 0.3964 | 0.0 | 0.8873 | 0.7598 | 0.9536 | 0.0059 | 0.1290 | 0.3856 | 0.0 | | 0.0779 | 42.2 | 8440 | 0.7205 | 0.4161 | 0.4943 | 0.8797 | nan | 0.8508 | 0.9577 | 0.7977 | 0.8524 | 0.5320 | nan | 0.7101 | 0.8392 | 0.0628 | 
0.9558 | 0.1263 | 0.0 | 0.6522 | 0.0314 | 0.7007 | 0.0 | 0.0404 | 0.9291 | 0.2000 | 0.6533 | 0.3771 | 0.5758 | nan | 0.2348 | 0.6217 | 0.5855 | 0.0 | 0.9555 | 0.8560 | 0.9833 | 0.0088 | 0.2664 | 0.4606 | 0.0 | nan | 0.7671 | 0.8793 | 0.7060 | 0.7545 | 0.3968 | nan | 0.5946 | 0.6539 | 0.0420 | 0.8647 | 0.1110 | 0.0 | 0.4508 | 0.0314 | 0.5844 | 0.0 | 0.0401 | 0.7719 | 0.1772 | 0.5459 | 0.3214 | 0.3899 | nan | 0.1958 | 0.5120 | 0.4020 | 0.0 | 0.8878 | 0.7681 | 0.9529 | 0.0068 | 0.1305 | 0.3751 | 0.0 | | 0.0871 | 42.3 | 8460 | 0.7242 | 0.4185 | 0.4984 | 0.8793 | nan | 0.8523 | 0.9558 | 0.8010 | 0.8471 | 0.5390 | nan | 0.7165 | 0.8376 | 0.0732 | 0.9575 | 0.1268 | 0.0 | 0.6554 | 0.0401 | 0.7135 | 0.0 | 0.0391 | 0.9302 | 0.2009 | 0.6376 | 0.3938 | 0.5685 | nan | 0.2904 | 0.6257 | 0.6011 | 0.0 | 0.9547 | 0.8443 | 0.9831 | 0.0072 | 0.2709 | 0.4865 | 0.0 | nan | 0.7665 | 0.8795 | 0.7045 | 0.7569 | 0.3943 | nan | 0.5916 | 0.6578 | 0.0482 | 0.8641 | 0.1112 | 0.0 | 0.4562 | 0.0401 | 0.5822 | 0.0 | 0.0388 | 0.7715 | 0.1783 | 0.5404 | 0.3289 | 0.3985 | nan | 0.2423 | 0.5129 | 0.3986 | 0.0 | 0.8878 | 0.7631 | 0.9532 | 0.0056 | 0.1355 | 0.3838 | 0.0 | | 0.0671 | 42.4 | 8480 | 0.7098 | 0.4188 | 0.4982 | 0.8806 | nan | 0.8619 | 0.9552 | 0.8057 | 0.8452 | 0.5423 | nan | 0.7258 | 0.8334 | 0.0872 | 0.9566 | 0.1294 | 0.0 | 0.6649 | 0.0448 | 0.7137 | 0.0 | 0.0332 | 0.9312 | 0.1886 | 0.6284 | 0.3881 | 0.5432 | nan | 0.3220 | 0.6299 | 0.5896 | 0.0 | 0.9543 | 0.8488 | 0.9821 | 0.0093 | 0.2349 | 0.4914 | 0.0 | nan | 0.7764 | 0.8833 | 0.6977 | 0.7602 | 0.3922 | nan | 0.5932 | 0.6581 | 0.0533 | 0.8642 | 0.1131 | 0.0 | 0.4540 | 0.0448 | 0.5776 | 0.0 | 0.0331 | 0.7703 | 0.1689 | 0.5359 | 0.3266 | 0.3886 | nan | 0.2650 | 0.5127 | 0.3999 | 0.0 | 0.8876 | 0.7635 | 0.9537 | 0.0070 | 0.1350 | 0.3869 | 0.0 | | 0.0736 | 42.5 | 8500 | 0.7222 | 0.4162 | 0.4933 | 0.8795 | nan | 0.8571 | 0.9566 | 0.8087 | 0.8463 | 0.5427 | nan | 0.7186 | 0.8216 | 0.0681 | 0.9545 | 0.1284 | 0.0 | 0.6721 | 0.0528 | 0.7124 | 
0.0 | 0.0335 | 0.9342 | 0.1955 | 0.6150 | 0.3653 | 0.5562 | nan | 0.2725 | 0.6109 | 0.5731 | 0.0 | 0.9558 | 0.8340 | 0.9829 | 0.0079 | 0.2154 | 0.4945 | 0.0 | nan | 0.7733 | 0.8818 | 0.6834 | 0.7600 | 0.3922 | nan | 0.5932 | 0.6659 | 0.0427 | 0.8649 | 0.1118 | 0.0 | 0.4561 | 0.0528 | 0.5790 | 0.0 | 0.0332 | 0.7670 | 0.1743 | 0.5256 | 0.3162 | 0.3880 | nan | 0.2272 | 0.5090 | 0.3999 | 0.0 | 0.8859 | 0.7582 | 0.9532 | 0.0062 | 0.1276 | 0.3903 | 0.0 | | 0.0666 | 42.6 | 8520 | 0.7110 | 0.4198 | 0.4967 | 0.8812 | nan | 0.8616 | 0.9569 | 0.7996 | 0.8491 | 0.5364 | nan | 0.7060 | 0.8177 | 0.0932 | 0.9571 | 0.1263 | 0.0 | 0.6658 | 0.0576 | 0.7184 | 0.0 | 0.0473 | 0.9285 | 0.2135 | 0.6561 | 0.3734 | 0.5421 | nan | 0.2644 | 0.6309 | 0.5902 | 0.0 | 0.9542 | 0.8551 | 0.9831 | 0.0090 | 0.2125 | 0.4895 | 0.0 | nan | 0.7743 | 0.8819 | 0.6957 | 0.7576 | 0.3946 | nan | 0.5919 | 0.6597 | 0.0575 | 0.8641 | 0.1115 | 0.0 | 0.4605 | 0.0576 | 0.5757 | 0.0 | 0.0468 | 0.7745 | 0.1876 | 0.5488 | 0.3221 | 0.3951 | nan | 0.2200 | 0.5136 | 0.4034 | 0.0 | 0.8891 | 0.7712 | 0.9533 | 0.0069 | 0.1312 | 0.3877 | 0.0 | | 0.0695 | 42.7 | 8540 | 0.7218 | 0.4189 | 0.4977 | 0.8780 | nan | 0.8364 | 0.9574 | 0.7969 | 0.8431 | 0.5503 | nan | 0.7173 | 0.8123 | 0.1159 | 0.9577 | 0.1220 | 0.0 | 0.6804 | 0.0625 | 0.7212 | 0.0 | 0.0555 | 0.9310 | 0.1974 | 0.6558 | 0.3836 | 0.5621 | nan | 0.2385 | 0.6290 | 0.6051 | 0.0 | 0.9517 | 0.8515 | 0.9845 | 0.0038 | 0.2094 | 0.4955 | 0.0 | nan | 0.7579 | 0.8747 | 0.7015 | 0.7469 | 0.3890 | nan | 0.5928 | 0.6626 | 0.0668 | 0.8629 | 0.1075 | 0.0 | 0.4649 | 0.0625 | 0.5803 | 0.0 | 0.0551 | 0.7735 | 0.1761 | 0.5460 | 0.3289 | 0.4016 | nan | 0.1946 | 0.5145 | 0.4018 | 0.0 | 0.8895 | 0.7713 | 0.9526 | 0.0032 | 0.1358 | 0.3916 | 0.0 | | 0.0572 | 42.8 | 8560 | 0.7331 | 0.4175 | 0.4938 | 0.8775 | nan | 0.8462 | 0.9556 | 0.7954 | 0.8294 | 0.5532 | nan | 0.7014 | 0.8166 | 0.1012 | 0.9509 | 0.1225 | 0.0 | 0.6679 | 0.0907 | 0.7107 | 0.0 | 0.0320 | 0.9310 | 0.1957 | 0.6430 | 0.3670 | 
0.5316 | nan | 0.2325 | 0.6179 | 0.5918 | 0.0 | 0.9551 | 0.8517 | 0.9839 | 0.0061 | 0.2207 | 0.5003 | 0.0 | nan | 0.7595 | 0.8746 | 0.7054 | 0.7463 | 0.3845 | nan | 0.5869 | 0.6623 | 0.0572 | 0.8663 | 0.1078 | 0.0 | 0.4633 | 0.0906 | 0.5868 | 0.0 | 0.0318 | 0.7716 | 0.1738 | 0.5426 | 0.3163 | 0.3871 | nan | 0.1916 | 0.5119 | 0.3987 | 0.0 | 0.8883 | 0.7697 | 0.9529 | 0.0050 | 0.1348 | 0.3928 | 0.0 | | 0.0852 | 42.9 | 8580 | 0.7294 | 0.4218 | 0.5035 | 0.8781 | nan | 0.8474 | 0.9539 | 0.7988 | 0.8363 | 0.5591 | nan | 0.7173 | 0.8223 | 0.1199 | 0.9574 | 0.1374 | 0.0 | 0.6863 | 0.0966 | 0.7122 | 0.0002 | 0.0356 | 0.9282 | 0.2153 | 0.6422 | 0.3889 | 0.5794 | nan | 0.3187 | 0.6249 | 0.5866 | 0.0 | 0.9541 | 0.8453 | 0.9840 | 0.0073 | 0.2534 | 0.5031 | 0.0 | nan | 0.7629 | 0.8772 | 0.7008 | 0.7526 | 0.3834 | nan | 0.5882 | 0.6610 | 0.0666 | 0.8639 | 0.1194 | 0.0 | 0.4654 | 0.0966 | 0.5804 | 0.0002 | 0.0351 | 0.7734 | 0.1897 | 0.5443 | 0.3258 | 0.4037 | nan | 0.2523 | 0.5123 | 0.3962 | 0.0 | 0.8888 | 0.7665 | 0.9529 | 0.0059 | 0.1388 | 0.3924 | 0.0 | | 0.088 | 43.0 | 8600 | 0.7267 | 0.4198 | 0.4991 | 0.8778 | nan | 0.8440 | 0.9545 | 0.8047 | 0.8429 | 0.5587 | nan | 0.7048 | 0.8152 | 0.1090 | 0.9560 | 0.1211 | 0.0 | 0.7106 | 0.0746 | 0.6981 | 0.0001 | 0.0267 | 0.9307 | 0.2232 | 0.6422 | 0.3654 | 0.5411 | nan | 0.3333 | 0.6245 | 0.5647 | 0.0 | 0.9534 | 0.8454 | 0.9850 | 0.0061 | 0.2331 | 0.5013 | 0.0 | nan | 0.7621 | 0.8776 | 0.6877 | 0.7496 | 0.3866 | nan | 0.5884 | 0.6568 | 0.0592 | 0.8628 | 0.1055 | 0.0 | 0.4838 | 0.0746 | 0.5828 | 0.0001 | 0.0265 | 0.7720 | 0.1964 | 0.5455 | 0.3158 | 0.3866 | nan | 0.2690 | 0.5124 | 0.3965 | 0.0 | 0.8881 | 0.7673 | 0.9523 | 0.0047 | 0.1319 | 0.3907 | 0.0 | | 0.065 | 43.1 | 8620 | 0.7317 | 0.4196 | 0.5005 | 0.8781 | nan | 0.8451 | 0.9537 | 0.8098 | 0.8427 | 0.5527 | nan | 0.7069 | 0.8223 | 0.1230 | 0.9560 | 0.1239 | 0.0 | 0.6888 | 0.0726 | 0.7129 | 0.0 | 0.0271 | 0.9294 | 0.2125 | 0.6537 | 0.3741 | 0.5544 | nan | 0.3373 | 0.6257 | 0.5650 | 
0.0 | 0.9528 | 0.8529 | 0.9834 | 0.0072 | 0.2298 | 0.5000 | 0.0 | nan | 0.7622 | 0.8785 | 0.6837 | 0.7518 | 0.3855 | nan | 0.5875 | 0.6530 | 0.0684 | 0.8638 | 0.1080 | 0.0 | 0.4804 | 0.0725 | 0.5771 | 0.0 | 0.0269 | 0.7726 | 0.1868 | 0.5450 | 0.3229 | 0.3885 | nan | 0.2678 | 0.5116 | 0.3946 | 0.0 | 0.8892 | 0.7700 | 0.9534 | 0.0056 | 0.1319 | 0.3878 | 0.0 | | 0.0697 | 43.2 | 8640 | 0.7204 | 0.4192 | 0.5005 | 0.8785 | nan | 0.8482 | 0.9530 | 0.8100 | 0.8455 | 0.5573 | nan | 0.7103 | 0.8250 | 0.1150 | 0.9551 | 0.1219 | 0.0 | 0.6710 | 0.0804 | 0.7151 | 0.0 | 0.0266 | 0.9303 | 0.2002 | 0.6498 | 0.3774 | 0.5495 | nan | 0.3482 | 0.6393 | 0.5789 | 0.0 | 0.9557 | 0.8447 | 0.9829 | 0.0081 | 0.2347 | 0.4812 | 0.0 | nan | 0.7656 | 0.8803 | 0.6869 | 0.7549 | 0.3853 | nan | 0.5889 | 0.6460 | 0.0617 | 0.8646 | 0.1058 | 0.0 | 0.4654 | 0.0803 | 0.5804 | 0.0 | 0.0264 | 0.7723 | 0.1779 | 0.5491 | 0.3224 | 0.3888 | nan | 0.2797 | 0.5128 | 0.3962 | 0.0 | 0.8874 | 0.7630 | 0.9534 | 0.0062 | 0.1306 | 0.3827 | 0.0 | | 0.0618 | 43.3 | 8660 | 0.7165 | 0.4183 | 0.4983 | 0.8792 | nan | 0.8572 | 0.9548 | 0.8042 | 0.8479 | 0.5460 | nan | 0.7072 | 0.8212 | 0.1021 | 0.9566 | 0.1165 | 0.0 | 0.6709 | 0.0647 | 0.7131 | 0.0 | 0.0285 | 0.9315 | 0.2115 | 0.6370 | 0.3804 | 0.5576 | nan | 0.3268 | 0.6234 | 0.5779 | 0.0 | 0.9542 | 0.8376 | 0.9834 | 0.0093 | 0.2454 | 0.4792 | 0.0 | nan | 0.7710 | 0.8820 | 0.7006 | 0.7550 | 0.3833 | nan | 0.5902 | 0.6422 | 0.0543 | 0.8639 | 0.1019 | 0.0 | 0.4641 | 0.0647 | 0.5811 | 0.0 | 0.0281 | 0.7700 | 0.1853 | 0.5414 | 0.3240 | 0.3977 | nan | 0.2670 | 0.5098 | 0.3917 | 0.0 | 0.8875 | 0.7613 | 0.9529 | 0.0073 | 0.1270 | 0.3791 | 0.0 | | 0.06 | 43.4 | 8680 | 0.7109 | 0.4201 | 0.5011 | 0.8800 | nan | 0.8595 | 0.9558 | 0.7967 | 0.8494 | 0.5272 | nan | 0.7195 | 0.8216 | 0.1296 | 0.9587 | 0.1180 | 0.0 | 0.6750 | 0.0609 | 0.7230 | 0.0 | 0.0352 | 0.9277 | 0.2070 | 0.6379 | 0.3972 | 0.5523 | nan | 0.3292 | 0.6208 | 0.5823 | 0.0 | 0.9550 | 0.8361 | 0.9826 | 0.0137 | 0.2559 | 
0.5083 | 0.0 | nan | 0.7730 | 0.8823 | 0.7029 | 0.7538 | 0.3909 | nan | 0.5893 | 0.6437 | 0.0670 | 0.8638 | 0.1036 | 0.0 | 0.4610 | 0.0609 | 0.5810 | 0.0 | 0.0346 | 0.7733 | 0.1834 | 0.5459 | 0.3286 | 0.4014 | nan | 0.2719 | 0.5098 | 0.3906 | 0.0 | 0.8872 | 0.7607 | 0.9534 | 0.0097 | 0.1288 | 0.3893 | 0.0 | | 0.0637 | 43.5 | 8700 | 0.7112 | 0.4197 | 0.5019 | 0.8798 | nan | 0.8598 | 0.9564 | 0.7924 | 0.8452 | 0.5278 | nan | 0.7139 | 0.8459 | 0.0673 | 0.9563 | 0.1162 | 0.0 | 0.7044 | 0.0499 | 0.7242 | 0.0046 | 0.0234 | 0.9320 | 0.2177 | 0.6367 | 0.4050 | 0.5829 | nan | 0.3565 | 0.6146 | 0.5795 | 0.0 | 0.9508 | 0.8419 | 0.9832 | 0.0151 | 0.2453 | 0.5112 | 0.0 | nan | 0.7701 | 0.8810 | 0.7087 | 0.7526 | 0.3938 | nan | 0.5881 | 0.6374 | 0.0410 | 0.8654 | 0.1019 | 0.0 | 0.4544 | 0.0498 | 0.5856 | 0.0044 | 0.0231 | 0.7726 | 0.1907 | 0.5461 | 0.3340 | 0.4087 | nan | 0.2842 | 0.5077 | 0.3904 | 0.0 | 0.8887 | 0.7634 | 0.9534 | 0.0104 | 0.1279 | 0.3934 | 0.0 | | 0.0686 | 43.6 | 8720 | 0.7090 | 0.4197 | 0.4989 | 0.8799 | nan | 0.8557 | 0.9581 | 0.7838 | 0.8390 | 0.5234 | nan | 0.7065 | 0.8386 | 0.0559 | 0.9563 | 0.1346 | 0.0 | 0.7263 | 0.0440 | 0.7063 | 0.0053 | 0.0168 | 0.9253 | 0.2019 | 0.6655 | 0.4033 | 0.5193 | nan | 0.3487 | 0.6270 | 0.5874 | 0.0 | 0.9547 | 0.8525 | 0.9830 | 0.0137 | 0.2328 | 0.4980 | 0.0 | nan | 0.7640 | 0.8780 | 0.7104 | 0.7531 | 0.4014 | nan | 0.5854 | 0.6436 | 0.0343 | 0.8650 | 0.1170 | 0.0 | 0.4675 | 0.0440 | 0.5847 | 0.0052 | 0.0166 | 0.7773 | 0.1781 | 0.5546 | 0.3319 | 0.3880 | nan | 0.2757 | 0.5094 | 0.3956 | 0.0 | 0.8885 | 0.7679 | 0.9534 | 0.0100 | 0.1391 | 0.3922 | 0.0 | | 0.0807 | 43.7 | 8740 | 0.7229 | 0.4180 | 0.4983 | 0.8792 | nan | 0.8466 | 0.9586 | 0.7929 | 0.8472 | 0.5371 | nan | 0.7049 | 0.8364 | 0.0621 | 0.9543 | 0.1221 | 0.0 | 0.7106 | 0.0559 | 0.7133 | 0.0062 | 0.0340 | 0.9336 | 0.1828 | 0.6558 | 0.3936 | 0.5337 | nan | 0.3272 | 0.6369 | 0.6002 | 0.0 | 0.9521 | 0.8448 | 0.9834 | 0.0054 | 0.2294 | 0.4832 | 0.0 | nan | 0.7613 | 0.8771 
| 0.7048 | 0.7507 | 0.4042 | nan | 0.5878 | 0.6479 | 0.0383 | 0.8658 | 0.1062 | 0.0 | 0.4355 | 0.0559 | 0.5919 | 0.0061 | 0.0334 | 0.7728 | 0.1631 | 0.5500 | 0.3294 | 0.3916 | nan | 0.2608 | 0.5128 | 0.3933 | 0.0 | 0.8887 | 0.7658 | 0.9533 | 0.0044 | 0.1359 | 0.3871 | 0.0 | | 0.0557 | 43.8 | 8760 | 0.7293 | 0.4202 | 0.5003 | 0.8787 | nan | 0.8414 | 0.9593 | 0.7976 | 0.8454 | 0.5346 | nan | 0.7111 | 0.8324 | 0.0655 | 0.9556 | 0.1376 | 0.0 | 0.7028 | 0.0795 | 0.7066 | 0.0112 | 0.0297 | 0.9321 | 0.2095 | 0.6408 | 0.3838 | 0.5562 | nan | 0.3337 | 0.6187 | 0.5912 | 0.0 | 0.9529 | 0.8491 | 0.9826 | 0.0117 | 0.2366 | 0.4989 | 0.0 | nan | 0.7594 | 0.8760 | 0.7044 | 0.7460 | 0.4061 | nan | 0.5894 | 0.6491 | 0.0391 | 0.8650 | 0.1192 | 0.0 | 0.4528 | 0.0795 | 0.5932 | 0.0110 | 0.0292 | 0.7729 | 0.1842 | 0.5456 | 0.3261 | 0.3934 | nan | 0.2579 | 0.5118 | 0.3937 | 0.0 | 0.8887 | 0.7667 | 0.9542 | 0.0090 | 0.1334 | 0.3893 | 0.0 | | 0.0655 | 43.9 | 8780 | 0.7311 | 0.4198 | 0.5014 | 0.8783 | nan | 0.8393 | 0.9554 | 0.8100 | 0.8526 | 0.5444 | nan | 0.7158 | 0.8314 | 0.0777 | 0.9564 | 0.1426 | 0.0 | 0.7048 | 0.0661 | 0.7165 | 0.0070 | 0.0355 | 0.9295 | 0.2021 | 0.6563 | 0.3785 | 0.5421 | nan | 0.3485 | 0.6098 | 0.5809 | 0.0 | 0.9562 | 0.8432 | 0.9819 | 0.0113 | 0.2569 | 0.4911 | 0.0 | nan | 0.7591 | 0.8774 | 0.6880 | 0.7491 | 0.4010 | nan | 0.5890 | 0.6512 | 0.0459 | 0.8645 | 0.1234 | 0.0 | 0.4569 | 0.0661 | 0.5894 | 0.0069 | 0.0348 | 0.7745 | 0.1777 | 0.5502 | 0.3229 | 0.3875 | nan | 0.2705 | 0.5105 | 0.3950 | 0.0 | 0.8871 | 0.7637 | 0.9543 | 0.0087 | 0.1385 | 0.3887 | 0.0 | | 0.1089 | 44.0 | 8800 | 0.7312 | 0.4192 | 0.4999 | 0.8783 | nan | 0.8397 | 0.9578 | 0.8063 | 0.8582 | 0.5364 | nan | 0.6964 | 0.8238 | 0.0701 | 0.9568 | 0.1412 | 0.0 | 0.6988 | 0.0710 | 0.7186 | 0.0024 | 0.0395 | 0.9293 | 0.1825 | 0.6419 | 0.3726 | 0.5495 | nan | 0.3361 | 0.6296 | 0.5986 | 0.0 | 0.9542 | 0.8373 | 0.9846 | 0.0106 | 0.2464 | 0.5059 | 0.0 | nan | 0.7582 | 0.8774 | 0.6870 | 0.7482 | 0.4026 | nan | 
0.5888 | 0.6556 | 0.0403 | 0.8647 | 0.1220 | 0.0 | 0.4713 | 0.0709 | 0.5898 | 0.0024 | 0.0388 | 0.7742 | 0.1633 | 0.5487 | 0.3173 | 0.3890 | nan | 0.2694 | 0.5097 | 0.3926 | 0.0 | 0.8871 | 0.7616 | 0.9533 | 0.0082 | 0.1322 | 0.3913 | 0.0 | | 0.0651 | 44.1 | 8820 | 0.7258 | 0.4167 | 0.4960 | 0.8781 | nan | 0.8389 | 0.9580 | 0.8069 | 0.8540 | 0.5310 | nan | 0.7010 | 0.8305 | 0.0672 | 0.9582 | 0.1303 | 0.0 | 0.6921 | 0.0439 | 0.7103 | 0.0 | 0.0331 | 0.9301 | 0.1987 | 0.6444 | 0.3661 | 0.5154 | nan | 0.3231 | 0.6235 | 0.5817 | 0.0 | 0.9546 | 0.8372 | 0.9849 | 0.0090 | 0.2459 | 0.5016 | 0.0 | nan | 0.7581 | 0.8772 | 0.6853 | 0.7509 | 0.4018 | nan | 0.5875 | 0.6505 | 0.0400 | 0.8634 | 0.1134 | 0.0 | 0.4602 | 0.0439 | 0.5862 | 0.0 | 0.0328 | 0.7733 | 0.1758 | 0.5474 | 0.3146 | 0.3825 | nan | 0.2584 | 0.5083 | 0.3952 | 0.0 | 0.8866 | 0.7597 | 0.9530 | 0.0068 | 0.1321 | 0.3900 | 0.0 | | 0.0692 | 44.2 | 8840 | 0.7330 | 0.4161 | 0.4947 | 0.8773 | nan | 0.8412 | 0.9553 | 0.8049 | 0.8450 | 0.5478 | nan | 0.7073 | 0.8187 | 0.0795 | 0.9572 | 0.1261 | 0.0 | 0.6916 | 0.0548 | 0.7093 | 0.0029 | 0.0275 | 0.9332 | 0.1765 | 0.6301 | 0.3681 | 0.5453 | nan | 0.2985 | 0.6152 | 0.5738 | 0.0 | 0.9541 | 0.8444 | 0.9853 | 0.0108 | 0.2370 | 0.4879 | 0.0 | nan | 0.7584 | 0.8765 | 0.6907 | 0.7500 | 0.3961 | nan | 0.5871 | 0.6576 | 0.0452 | 0.8634 | 0.1096 | 0.0 | 0.4611 | 0.0548 | 0.5893 | 0.0028 | 0.0274 | 0.7684 | 0.1583 | 0.5353 | 0.3184 | 0.3885 | nan | 0.2442 | 0.5093 | 0.3966 | 0.0 | 0.8871 | 0.7617 | 0.9526 | 0.0082 | 0.1301 | 0.3866 | 0.0 | | 0.0654 | 44.3 | 8860 | 0.7268 | 0.4164 | 0.4958 | 0.8779 | nan | 0.8438 | 0.9578 | 0.7944 | 0.8479 | 0.5362 | nan | 0.6987 | 0.8252 | 0.0859 | 0.9576 | 0.1255 | 0.0 | 0.6884 | 0.0533 | 0.7208 | 0.0038 | 0.0228 | 0.9300 | 0.1571 | 0.6303 | 0.4273 | 0.5702 | nan | 0.2575 | 0.6223 | 0.5944 | 0.0 | 0.9520 | 0.8468 | 0.9850 | 0.0115 | 0.2333 | 0.4859 | 0.0 | nan | 0.7596 | 0.8765 | 0.6991 | 0.7497 | 0.3995 | nan | 0.5865 | 0.6540 | 0.0503 | 0.8638 | 
0.1088 | 0.0 | 0.4636 | 0.0533 | 0.5866 | 0.0037 | 0.0227 | 0.7688 | 0.1423 | 0.5312 | 0.3468 | 0.4065 | nan | 0.2179 | 0.5087 | 0.3934 | 0.0 | 0.8891 | 0.7646 | 0.9523 | 0.0087 | 0.1321 | 0.3839 | 0.0 | | 0.0886 | 44.4 | 8880 | 0.7317 | 0.4162 | 0.4946 | 0.8776 | nan | 0.8453 | 0.9574 | 0.7868 | 0.8436 | 0.5339 | nan | 0.7076 | 0.8323 | 0.0835 | 0.9565 | 0.1189 | 0.0 | 0.6836 | 0.0644 | 0.7184 | 0.0029 | 0.0224 | 0.9334 | 0.1645 | 0.6164 | 0.4076 | 0.5555 | nan | 0.2833 | 0.6216 | 0.5878 | 0.0 | 0.9531 | 0.8455 | 0.9836 | 0.0136 | 0.2147 | 0.4873 | 0.0 | nan | 0.7597 | 0.8757 | 0.7058 | 0.7446 | 0.4026 | nan | 0.5884 | 0.6438 | 0.0485 | 0.8646 | 0.1042 | 0.0 | 0.4628 | 0.0644 | 0.5891 | 0.0028 | 0.0223 | 0.7662 | 0.1479 | 0.5247 | 0.3425 | 0.3939 | nan | 0.2349 | 0.5086 | 0.3960 | 0.0 | 0.8889 | 0.7651 | 0.9531 | 0.0099 | 0.1246 | 0.3835 | 0.0 | | 0.0592 | 44.5 | 8900 | 0.7325 | 0.4189 | 0.4975 | 0.8784 | nan | 0.8458 | 0.9586 | 0.7892 | 0.8429 | 0.5300 | nan | 0.7085 | 0.8330 | 0.1017 | 0.9566 | 0.1281 | 0.0 | 0.6934 | 0.0763 | 0.7256 | 0.0006 | 0.0309 | 0.9290 | 0.2149 | 0.6268 | 0.3929 | 0.5390 | nan | 0.2751 | 0.6182 | 0.5806 | 0.0 | 0.9533 | 0.8541 | 0.9831 | 0.0123 | 0.2190 | 0.5009 | 0.0 | nan | 0.7606 | 0.8758 | 0.7049 | 0.7442 | 0.4042 | nan | 0.5897 | 0.6434 | 0.0576 | 0.8656 | 0.1122 | 0.0 | 0.4635 | 0.0763 | 0.5883 | 0.0005 | 0.0305 | 0.7700 | 0.1886 | 0.5313 | 0.3367 | 0.3915 | nan | 0.2296 | 0.5086 | 0.3965 | 0.0 | 0.8897 | 0.7701 | 0.9537 | 0.0092 | 0.1255 | 0.3859 | 0.0 | | 0.0756 | 44.6 | 8920 | 0.7285 | 0.4177 | 0.4947 | 0.8778 | nan | 0.8449 | 0.9569 | 0.7980 | 0.8439 | 0.5414 | nan | 0.7156 | 0.8249 | 0.0876 | 0.9550 | 0.1230 | 0.0 | 0.6976 | 0.0827 | 0.7212 | 0.0011 | 0.0269 | 0.9353 | 0.2019 | 0.6128 | 0.3814 | 0.5312 | nan | 0.2838 | 0.6041 | 0.5666 | 0.0 | 0.9534 | 0.8478 | 0.9831 | 0.0133 | 0.2039 | 0.4928 | 0.0 | nan | 0.7604 | 0.8769 | 0.6981 | 0.7464 | 0.4036 | nan | 0.5917 | 0.6492 | 0.0511 | 0.8661 | 0.1076 | 0.0 | 0.4700 | 0.0827 | 
0.5898 | 0.0011 | 0.0267 | 0.7655 | 0.1780 | 0.5223 | 0.3287 | 0.3834 | nan | 0.2358 | 0.5075 | 0.3975 | 0.0 | 0.8880 | 0.7659 | 0.9536 | 0.0100 | 0.1227 | 0.3856 | 0.0 | | 0.0552 | 44.7 | 8940 | 0.7286 | 0.4178 | 0.4960 | 0.8785 | nan | 0.8500 | 0.9568 | 0.7997 | 0.8454 | 0.5380 | nan | 0.7104 | 0.8261 | 0.0826 | 0.9550 | 0.1311 | 0.0 | 0.6737 | 0.0857 | 0.7207 | 0.0013 | 0.0228 | 0.9298 | 0.1966 | 0.6192 | 0.3907 | 0.5334 | nan | 0.2912 | 0.6324 | 0.5819 | 0.0 | 0.9558 | 0.8433 | 0.9824 | 0.0117 | 0.2209 | 0.4822 | 0.0 | nan | 0.7627 | 0.8784 | 0.6998 | 0.7509 | 0.4015 | nan | 0.5918 | 0.6423 | 0.0479 | 0.8664 | 0.1136 | 0.0 | 0.4565 | 0.0857 | 0.5902 | 0.0013 | 0.0225 | 0.7685 | 0.1739 | 0.5288 | 0.3294 | 0.3864 | nan | 0.2423 | 0.5103 | 0.3982 | 0.0 | 0.8870 | 0.7651 | 0.9538 | 0.0089 | 0.1235 | 0.3812 | 0.0 | | 0.0666 | 44.8 | 8960 | 0.7386 | 0.4175 | 0.4952 | 0.8780 | nan | 0.8404 | 0.9579 | 0.8046 | 0.8503 | 0.5445 | nan | 0.7113 | 0.8195 | 0.1001 | 0.9529 | 0.1387 | 0.0 | 0.6556 | 0.0980 | 0.7253 | 0.0010 | 0.0275 | 0.9326 | 0.1886 | 0.6285 | 0.3736 | 0.5390 | nan | 0.2726 | 0.6155 | 0.5677 | 0.0 | 0.9554 | 0.8414 | 0.9833 | 0.0083 | 0.2342 | 0.4780 | 0.0 | nan | 0.7598 | 0.8779 | 0.6886 | 0.7473 | 0.4052 | nan | 0.5924 | 0.6417 | 0.0559 | 0.8671 | 0.1196 | 0.0 | 0.4493 | 0.0980 | 0.5946 | 0.0010 | 0.0273 | 0.7685 | 0.1679 | 0.5325 | 0.3207 | 0.3926 | nan | 0.2302 | 0.5093 | 0.3996 | 0.0 | 0.8867 | 0.7644 | 0.9536 | 0.0063 | 0.1233 | 0.3797 | 0.0 | | 0.062 | 44.9 | 8980 | 0.7354 | 0.4167 | 0.4943 | 0.8776 | nan | 0.8392 | 0.9559 | 0.8056 | 0.8510 | 0.5501 | nan | 0.7168 | 0.8244 | 0.0703 | 0.9562 | 0.1388 | 0.0 | 0.6559 | 0.0825 | 0.7042 | 0.0 | 0.0273 | 0.9347 | 0.2042 | 0.6299 | 0.3684 | 0.5330 | nan | 0.2867 | 0.6248 | 0.5721 | 0.0 | 0.9543 | 0.8418 | 0.9828 | 0.0084 | 0.2265 | 0.4720 | 0.0 | nan | 0.7588 | 0.8780 | 0.6875 | 0.7494 | 0.4034 | nan | 0.5932 | 0.6421 | 0.0418 | 0.8650 | 0.1199 | 0.0 | 0.4503 | 0.0825 | 0.5912 | 0.0 | 0.0270 | 0.7679 | 
0.1794 | 0.5320 | 0.3182 | 0.3908 | nan | 0.2400 | 0.5106 | 0.3981 | 0.0 | 0.8870 | 0.7627 | 0.9540 | 0.0062 | 0.1210 | 0.3772 | 0.0 | | 0.0723 | 45.0 | 9000 | 0.7471 | 0.4173 | 0.4954 | 0.8778 | nan | 0.8333 | 0.9563 | 0.8064 | 0.8629 | 0.5388 | nan | 0.7114 | 0.8171 | 0.0703 | 0.9572 | 0.1278 | 0.0 | 0.6566 | 0.0568 | 0.7102 | 0.0 | 0.0346 | 0.9322 | 0.1989 | 0.6334 | 0.3889 | 0.5232 | nan | 0.3184 | 0.6227 | 0.5818 | 0.0 | 0.9517 | 0.8613 | 0.9836 | 0.0112 | 0.2188 | 0.4869 | 0.0 | nan | 0.7549 | 0.8780 | 0.6770 | 0.7456 | 0.4055 | nan | 0.5931 | 0.6419 | 0.0425 | 0.8642 | 0.1121 | 0.0 | 0.4577 | 0.0568 | 0.5839 | 0.0 | 0.0343 | 0.7700 | 0.1761 | 0.5356 | 0.3285 | 0.3927 | nan | 0.2656 | 0.5089 | 0.3986 | 0.0 | 0.8898 | 0.7708 | 0.9535 | 0.0083 | 0.1270 | 0.3819 | 0.0 | | 0.0648 | 45.1 | 9020 | 0.7314 | 0.4180 | 0.4975 | 0.8780 | nan | 0.8368 | 0.9564 | 0.8026 | 0.8635 | 0.5396 | nan | 0.7064 | 0.8212 | 0.0805 | 0.9556 | 0.1300 | 0.0 | 0.6996 | 0.0558 | 0.7156 | 0.0 | 0.0289 | 0.9301 | 0.2240 | 0.6456 | 0.3784 | 0.5095 | nan | 0.3363 | 0.6191 | 0.5911 | 0.0 | 0.9550 | 0.8447 | 0.9817 | 0.0094 | 0.2181 | 0.4845 | 0.0 | nan | 0.7572 | 0.8784 | 0.6826 | 0.7447 | 0.4042 | nan | 0.5951 | 0.6395 | 0.0479 | 0.8653 | 0.1139 | 0.0 | 0.4665 | 0.0558 | 0.5832 | 0.0 | 0.0287 | 0.7728 | 0.1957 | 0.5406 | 0.3217 | 0.3863 | nan | 0.2713 | 0.5095 | 0.3938 | 0.0 | 0.8876 | 0.7644 | 0.9542 | 0.0069 | 0.1283 | 0.3792 | 0.0 | | 0.0623 | 45.2 | 9040 | 0.7278 | 0.4188 | 0.4995 | 0.8778 | nan | 0.8387 | 0.9556 | 0.8078 | 0.8564 | 0.5497 | nan | 0.7147 | 0.8184 | 0.0845 | 0.9571 | 0.1374 | 0.0 | 0.7057 | 0.0565 | 0.7144 | 0.0 | 0.0271 | 0.9300 | 0.2166 | 0.6403 | 0.3961 | 0.5277 | nan | 0.3465 | 0.6230 | 0.5861 | 0.0 | 0.9538 | 0.8378 | 0.9831 | 0.0104 | 0.2284 | 0.4816 | 0.0 | nan | 0.7591 | 0.8794 | 0.6766 | 0.7506 | 0.3999 | nan | 0.5965 | 0.6427 | 0.0491 | 0.8646 | 0.1196 | 0.0 | 0.4744 | 0.0565 | 0.5865 | 0.0 | 0.0269 | 0.7725 | 0.1910 | 0.5386 | 0.3288 | 0.3946 | nan | 0.2760 | 
0.5099 | 0.3924 | 0.0 | 0.8870 | 0.7582 | 0.9538 | 0.0074 | 0.1293 | 0.3795 | 0.0 | | 0.0933 | 45.3 | 9060 | 0.7315 | 0.4165 | 0.4958 | 0.8770 | nan | 0.8354 | 0.9562 | 0.8112 | 0.8543 | 0.5486 | nan | 0.7124 | 0.8203 | 0.0950 | 0.9562 | 0.1240 | 0.0 | 0.6678 | 0.0667 | 0.7217 | 0.0 | 0.0277 | 0.9344 | 0.1949 | 0.6166 | 0.3834 | 0.5225 | nan | 0.3350 | 0.6123 | 0.5716 | 0.0 | 0.9552 | 0.8340 | 0.9832 | 0.0097 | 0.2344 | 0.4819 | 0.0 | nan | 0.7565 | 0.8785 | 0.6719 | 0.7522 | 0.4025 | nan | 0.5940 | 0.6437 | 0.0542 | 0.8649 | 0.1081 | 0.0 | 0.4599 | 0.0667 | 0.5888 | 0.0 | 0.0274 | 0.7680 | 0.1737 | 0.5282 | 0.3233 | 0.3892 | nan | 0.2649 | 0.5078 | 0.3946 | 0.0 | 0.8857 | 0.7566 | 0.9535 | 0.0070 | 0.1277 | 0.3796 | 0.0 | | 0.0677 | 45.4 | 9080 | 0.7344 | 0.4174 | 0.4972 | 0.8772 | nan | 0.8379 | 0.9557 | 0.8075 | 0.8528 | 0.5457 | nan | 0.7049 | 0.8222 | 0.1007 | 0.9576 | 0.1268 | 0.0 | 0.6637 | 0.0536 | 0.7270 | 0.0 | 0.0301 | 0.9329 | 0.1934 | 0.6329 | 0.3872 | 0.5320 | nan | 0.3338 | 0.6185 | 0.5816 | 0.0 | 0.9532 | 0.8411 | 0.9840 | 0.0124 | 0.2396 | 0.4811 | 0.0 | nan | 0.7567 | 0.8781 | 0.6787 | 0.7525 | 0.3997 | nan | 0.5912 | 0.6507 | 0.0581 | 0.8639 | 0.1107 | 0.0 | 0.4616 | 0.0536 | 0.5893 | 0.0 | 0.0297 | 0.7699 | 0.1720 | 0.5350 | 0.3262 | 0.3926 | nan | 0.2656 | 0.5087 | 0.3940 | 0.0 | 0.8867 | 0.7585 | 0.9531 | 0.0086 | 0.1308 | 0.3809 | 0.0 | | 0.0562 | 45.5 | 9100 | 0.7293 | 0.4182 | 0.4980 | 0.8775 | nan | 0.8373 | 0.9568 | 0.8015 | 0.8531 | 0.5431 | nan | 0.7127 | 0.8264 | 0.0872 | 0.9570 | 0.1405 | 0.0 | 0.6721 | 0.0682 | 0.7243 | 0.0 | 0.0273 | 0.9280 | 0.2170 | 0.6294 | 0.3873 | 0.5305 | nan | 0.3120 | 0.6229 | 0.5872 | 0.0 | 0.9560 | 0.8360 | 0.9832 | 0.0102 | 0.2377 | 0.4911 | 0.0 | nan | 0.7569 | 0.8778 | 0.6855 | 0.7522 | 0.4006 | nan | 0.5921 | 0.6538 | 0.0508 | 0.8642 | 0.1226 | 0.0 | 0.4610 | 0.0682 | 0.5914 | 0.0 | 0.0270 | 0.7720 | 0.1898 | 0.5368 | 0.3270 | 0.3875 | nan | 0.2549 | 0.5103 | 0.3902 | 0.0 | 0.8855 | 0.7555 | 0.9537 | 
0.0074 | 0.1266 | 0.3816 | 0.0 | | 0.0722 | 45.6 | 9120 | 0.7235 | 0.4166 | 0.4948 | 0.8775 | nan | 0.8413 | 0.9552 | 0.7981 | 0.8550 | 0.5459 | nan | 0.7196 | 0.8248 | 0.0916 | 0.9560 | 0.1451 | 0.0 | 0.6633 | 0.0771 | 0.7236 | 0.0 | 0.0392 | 0.9341 | 0.1956 | 0.6222 | 0.3749 | 0.5074 | nan | 0.2789 | 0.6205 | 0.5841 | 0.0 | 0.9543 | 0.8399 | 0.9834 | 0.0040 | 0.2227 | 0.4759 | 0.0 | nan | 0.7596 | 0.8782 | 0.6895 | 0.7517 | 0.3923 | nan | 0.5944 | 0.6488 | 0.0515 | 0.8644 | 0.1259 | 0.0 | 0.4444 | 0.0771 | 0.5943 | 0.0 | 0.0388 | 0.7686 | 0.1723 | 0.5313 | 0.3210 | 0.3871 | nan | 0.2354 | 0.5105 | 0.3925 | 0.0 | 0.8865 | 0.7586 | 0.9536 | 0.0031 | 0.1195 | 0.3787 | 0.0 | | 0.0855 | 45.7 | 9140 | 0.7305 | 0.4143 | 0.4917 | 0.8777 | nan | 0.8419 | 0.9569 | 0.8012 | 0.8498 | 0.5367 | nan | 0.7175 | 0.8233 | 0.0903 | 0.9561 | 0.1382 | 0.0 | 0.6537 | 0.0677 | 0.7246 | 0.0 | 0.0283 | 0.9335 | 0.1604 | 0.6322 | 0.3685 | 0.5063 | nan | 0.2680 | 0.6185 | 0.5825 | 0.0 | 0.9568 | 0.8389 | 0.9840 | 0.0062 | 0.2316 | 0.4611 | 0.0 | nan | 0.7593 | 0.8777 | 0.6863 | 0.7509 | 0.3992 | nan | 0.5939 | 0.6446 | 0.0520 | 0.8650 | 0.1202 | 0.0 | 0.4408 | 0.0677 | 0.5886 | 0.0 | 0.0282 | 0.7690 | 0.1451 | 0.5347 | 0.3159 | 0.3871 | nan | 0.2261 | 0.5104 | 0.3969 | 0.0 | 0.8861 | 0.7610 | 0.9529 | 0.0048 | 0.1205 | 0.3735 | 0.0 | | 0.1003 | 45.8 | 9160 | 0.7352 | 0.4134 | 0.4917 | 0.8775 | nan | 0.8407 | 0.9570 | 0.8053 | 0.8416 | 0.5420 | nan | 0.7239 | 0.8304 | 0.0839 | 0.9577 | 0.1438 | 0.0 | 0.6563 | 0.0487 | 0.7149 | 0.0 | 0.0286 | 0.9311 | 0.1542 | 0.6253 | 0.3811 | 0.5337 | nan | 0.2558 | 0.6163 | 0.5718 | 0.0 | 0.9561 | 0.8423 | 0.9833 | 0.0079 | 0.2251 | 0.4753 | 0.0 | nan | 0.7589 | 0.8770 | 0.6796 | 0.7526 | 0.4003 | nan | 0.5916 | 0.6395 | 0.0499 | 0.8648 | 0.1252 | 0.0 | 0.4420 | 0.0487 | 0.5860 | 0.0 | 0.0284 | 0.7701 | 0.1400 | 0.5330 | 0.3208 | 0.3949 | nan | 0.2163 | 0.5095 | 0.3961 | 0.0 | 0.8868 | 0.7634 | 0.9535 | 0.0060 | 0.1182 | 0.3753 | 0.0 | | 0.0856 | 45.9 | 
9180 | 0.7275 | 0.4160 | 0.4940 | 0.8780 | nan | 0.8410 | 0.9569 | 0.8037 | 0.8525 | 0.5318 | nan | 0.7125 | 0.8325 | 0.0641 | 0.9577 | 0.1498 | 0.0 | 0.6451 | 0.0527 | 0.7146 | 0.0 | 0.0274 | 0.9315 | 0.1914 | 0.6320 | 0.3802 | 0.5320 | nan | 0.2852 | 0.6298 | 0.5780 | 0.0 | 0.9526 | 0.8491 | 0.9825 | 0.0073 | 0.2231 | 0.4918 | 0.0 | nan | 0.7590 | 0.8777 | 0.6857 | 0.7505 | 0.3972 | nan | 0.5930 | 0.6430 | 0.0397 | 0.8646 | 0.1305 | 0.0 | 0.4510 | 0.0527 | 0.5873 | 0.0 | 0.0272 | 0.7714 | 0.1695 | 0.5386 | 0.3233 | 0.3932 | nan | 0.2411 | 0.5101 | 0.3956 | 0.0 | 0.8883 | 0.7657 | 0.9541 | 0.0055 | 0.1176 | 0.3792 | 0.0 | | 0.0818 | 46.0 | 9200 | 0.7458 | 0.4147 | 0.4941 | 0.8772 | nan | 0.8414 | 0.9570 | 0.7964 | 0.8409 | 0.5371 | nan | 0.7188 | 0.8320 | 0.0784 | 0.9571 | 0.1535 | 0.0 | 0.6523 | 0.0480 | 0.7226 | 0.0 | 0.0183 | 0.9319 | 0.1858 | 0.6348 | 0.3898 | 0.5330 | nan | 0.2789 | 0.6231 | 0.5943 | 0.0 | 0.9558 | 0.8330 | 0.9843 | 0.0110 | 0.2373 | 0.4636 | 0.0 | nan | 0.7578 | 0.8759 | 0.6976 | 0.7506 | 0.3975 | nan | 0.5916 | 0.6398 | 0.0482 | 0.8652 | 0.1325 | 0.0 | 0.4373 | 0.0480 | 0.5842 | 0.0 | 0.0181 | 0.7708 | 0.1639 | 0.5377 | 0.3260 | 0.3928 | nan | 0.2324 | 0.5116 | 0.3963 | 0.0 | 0.8858 | 0.7563 | 0.9532 | 0.0081 | 0.1199 | 0.3716 | 0.0 | | 0.0727 | 46.1 | 9220 | 0.7385 | 0.4162 | 0.4947 | 0.8773 | nan | 0.8434 | 0.9572 | 0.7878 | 0.8435 | 0.5369 | nan | 0.7161 | 0.8285 | 0.0850 | 0.9556 | 0.1520 | 0.0 | 0.6477 | 0.0513 | 0.7257 | 0.0 | 0.0171 | 0.9305 | 0.2151 | 0.6339 | 0.3868 | 0.5109 | nan | 0.2958 | 0.6302 | 0.5912 | 0.0 | 0.9548 | 0.8314 | 0.9839 | 0.0083 | 0.2354 | 0.4753 | 0.0 | nan | 0.7582 | 0.8760 | 0.7016 | 0.7485 | 0.3972 | nan | 0.5919 | 0.6417 | 0.0512 | 0.8663 | 0.1306 | 0.0 | 0.4412 | 0.0513 | 0.5872 | 0.0 | 0.0169 | 0.7718 | 0.1877 | 0.5386 | 0.3260 | 0.3906 | nan | 0.2423 | 0.5121 | 0.3961 | 0.0 | 0.8860 | 0.7552 | 0.9534 | 0.0062 | 0.1189 | 0.3750 | 0.0 | | 0.0575 | 46.2 | 9240 | 0.7399 | 0.4176 | 0.4962 | 0.8774 | nan | 
0.8411 | 0.9561 | 0.7987 | 0.8454 | 0.5384 | nan | 0.7186 | 0.8241 | 0.0766 | 0.9531 | 0.1534 | 0.0 | 0.6517 | 0.0703 | 0.7375 | 0.0 | 0.0285 | 0.9308 | 0.2062 | 0.6480 | 0.3815 | 0.5007 | nan | 0.3178 | 0.6272 | 0.5761 | 0.0 | 0.9552 | 0.8348 | 0.9836 | 0.0087 | 0.2379 | 0.4757 | 0.0 | nan | 0.7579 | 0.8761 | 0.6950 | 0.7505 | 0.3951 | nan | 0.5925 | 0.6410 | 0.0465 | 0.8681 | 0.1319 | 0.0 | 0.4437 | 0.0703 | 0.5958 | 0.0 | 0.0279 | 0.7733 | 0.1823 | 0.5453 | 0.3248 | 0.3814 | nan | 0.2603 | 0.5126 | 0.3966 | 0.0 | 0.8860 | 0.7557 | 0.9535 | 0.0066 | 0.1182 | 0.3746 | 0.0 | | 0.0972 | 46.3 | 9260 | 0.7251 | 0.4175 | 0.4958 | 0.8779 | nan | 0.8462 | 0.9550 | 0.7966 | 0.8451 | 0.5448 | nan | 0.7146 | 0.8252 | 0.0723 | 0.9575 | 0.1456 | 0.0 | 0.6676 | 0.0647 | 0.7245 | 0.0 | 0.0273 | 0.9313 | 0.1948 | 0.6500 | 0.3861 | 0.5225 | nan | 0.3129 | 0.6088 | 0.5722 | 0.0 | 0.9545 | 0.8397 | 0.9830 | 0.0075 | 0.2369 | 0.4783 | 0.0 | nan | 0.7610 | 0.8766 | 0.7007 | 0.7536 | 0.3934 | nan | 0.5942 | 0.6424 | 0.0442 | 0.8653 | 0.1253 | 0.0 | 0.4524 | 0.0647 | 0.5900 | 0.0 | 0.0269 | 0.7730 | 0.1726 | 0.5446 | 0.3287 | 0.3898 | nan | 0.2568 | 0.5100 | 0.3965 | 0.0 | 0.8871 | 0.7584 | 0.9537 | 0.0058 | 0.1184 | 0.3745 | 0.0 | | 0.0528 | 46.4 | 9280 | 0.7302 | 0.4166 | 0.4933 | 0.8784 | nan | 0.8492 | 0.9565 | 0.7981 | 0.8473 | 0.5424 | nan | 0.6950 | 0.8226 | 0.0683 | 0.9571 | 0.1401 | 0.0 | 0.6675 | 0.0655 | 0.7211 | 0.0 | 0.0256 | 0.9300 | 0.1869 | 0.6525 | 0.3817 | 0.5119 | nan | 0.2976 | 0.6154 | 0.5809 | 0.0 | 0.9544 | 0.8396 | 0.9842 | 0.0088 | 0.2052 | 0.4807 | 0.0 | nan | 0.7626 | 0.8770 | 0.6993 | 0.7526 | 0.3962 | nan | 0.5926 | 0.6401 | 0.0423 | 0.8654 | 0.1211 | 0.0 | 0.4546 | 0.0655 | 0.5869 | 0.0 | 0.0253 | 0.7730 | 0.1662 | 0.5443 | 0.3279 | 0.3840 | nan | 0.2467 | 0.5094 | 0.3965 | 0.0 | 0.8873 | 0.7608 | 0.9531 | 0.0066 | 0.1164 | 0.3777 | 0.0 | | 0.0629 | 46.5 | 9300 | 0.7256 | 0.4175 | 0.4942 | 0.8785 | nan | 0.8492 | 0.9565 | 0.7925 | 0.8456 | 0.5273 | nan | 
0.7210 | 0.8175 | 0.0515 | 0.9588 | 0.1461 | 0.0 | 0.6703 | 0.0769 | 0.7161 | 0.0 | 0.0221 | 0.9329 | 0.1864 | 0.6420 | 0.3898 | 0.5235 | nan | 0.2992 | 0.6153 | 0.5840 | 0.0 | 0.9537 | 0.8408 | 0.9846 | 0.0105 | 0.2214 | 0.4793 | 0.0 | nan | 0.7628 | 0.8771 | 0.7040 | 0.7518 | 0.3966 | nan | 0.5926 | 0.6505 | 0.0332 | 0.8643 | 0.1260 | 0.0 | 0.4539 | 0.0769 | 0.5916 | 0.0 | 0.0219 | 0.7715 | 0.1656 | 0.5414 | 0.3303 | 0.3870 | nan | 0.2515 | 0.5090 | 0.3982 | 0.0 | 0.8873 | 0.7597 | 0.9529 | 0.0079 | 0.1177 | 0.3779 | 0.0 | | 0.0586 | 46.6 | 9320 | 0.7344 | 0.4168 | 0.4927 | 0.8779 | nan | 0.8464 | 0.9545 | 0.8018 | 0.8483 | 0.5394 | nan | 0.7172 | 0.8116 | 0.0486 | 0.9574 | 0.1387 | 0.0 | 0.6651 | 0.0812 | 0.7010 | 0.0 | 0.0252 | 0.9338 | 0.1746 | 0.6301 | 0.3853 | 0.5186 | nan | 0.3050 | 0.6162 | 0.5729 | 0.0 | 0.9552 | 0.8382 | 0.9841 | 0.0090 | 0.2169 | 0.4890 | 0.0 | nan | 0.7616 | 0.8774 | 0.6929 | 0.7521 | 0.3965 | nan | 0.5919 | 0.6575 | 0.0306 | 0.8647 | 0.1198 | 0.0 | 0.4507 | 0.0812 | 0.5933 | 0.0 | 0.0249 | 0.7696 | 0.1567 | 0.5370 | 0.3266 | 0.3832 | nan | 0.2577 | 0.5076 | 0.3988 | 0.0 | 0.8866 | 0.7574 | 0.9530 | 0.0069 | 0.1191 | 0.3816 | 0.0 | | 0.0568 | 46.7 | 9340 | 0.7314 | 0.4166 | 0.4932 | 0.8782 | nan | 0.8463 | 0.9563 | 0.8034 | 0.8480 | 0.5337 | nan | 0.7172 | 0.8189 | 0.0663 | 0.9556 | 0.1397 | 0.0 | 0.6488 | 0.0903 | 0.7265 | 0.0 | 0.0278 | 0.9337 | 0.1693 | 0.6203 | 0.3956 | 0.5112 | nan | 0.2894 | 0.6188 | 0.5791 | 0.0 | 0.9555 | 0.8414 | 0.9835 | 0.0048 | 0.2222 | 0.4803 | 0.0 | nan | 0.7618 | 0.8772 | 0.6908 | 0.7524 | 0.3994 | nan | 0.5930 | 0.6552 | 0.0414 | 0.8665 | 0.1207 | 0.0 | 0.4390 | 0.0903 | 0.5985 | 0.0 | 0.0274 | 0.7683 | 0.1518 | 0.5324 | 0.3318 | 0.3791 | nan | 0.2433 | 0.5085 | 0.3998 | 0.0 | 0.8870 | 0.7603 | 0.9532 | 0.0039 | 0.1184 | 0.3791 | 0.0 | | 0.0611 | 46.8 | 9360 | 0.7297 | 0.4186 | 0.4960 | 0.8783 | nan | 0.8479 | 0.9551 | 0.8031 | 0.8471 | 0.5461 | nan | 0.7064 | 0.8101 | 0.0863 | 0.9576 | 0.1442 | 0.0 | 
0.6391 | 0.0892 | 0.7285 | 0.0 | 0.0249 | 0.9313 | 0.1826 | 0.6372 | 0.3967 | 0.5232 | nan | 0.3067 | 0.6147 | 0.5874 | 0.0 | 0.9550 | 0.8416 | 0.9838 | 0.0089 | 0.2377 | 0.4788 | 0.0 | nan | 0.7624 | 0.8777 | 0.6896 | 0.7528 | 0.3999 | nan | 0.5928 | 0.6580 | 0.0516 | 0.8651 | 0.1244 | 0.0 | 0.4500 | 0.0892 | 0.5927 | 0.0 | 0.0246 | 0.7715 | 0.1615 | 0.5397 | 0.3325 | 0.3890 | nan | 0.2582 | 0.5085 | 0.3975 | 0.0 | 0.8872 | 0.7598 | 0.9531 | 0.0068 | 0.1216 | 0.3773 | 0.0 | | 0.1038 | 46.9 | 9380 | 0.7274 | 0.4177 | 0.4942 | 0.8783 | nan | 0.8465 | 0.9568 | 0.7992 | 0.8453 | 0.5452 | nan | 0.7094 | 0.8165 | 0.0852 | 0.9587 | 0.1454 | 0.0 | 0.6590 | 0.0907 | 0.7254 | 0.0 | 0.0270 | 0.9278 | 0.1996 | 0.6471 | 0.3748 | 0.4610 | nan | 0.2958 | 0.6211 | 0.5866 | 0.0 | 0.9559 | 0.8365 | 0.9833 | 0.0078 | 0.2260 | 0.4811 | 0.0 | nan | 0.7621 | 0.8770 | 0.6919 | 0.7533 | 0.4018 | nan | 0.5925 | 0.6552 | 0.0508 | 0.8641 | 0.1258 | 0.0 | 0.4486 | 0.0907 | 0.5881 | 0.0 | 0.0267 | 0.7738 | 0.1758 | 0.5457 | 0.3221 | 0.3668 | nan | 0.2496 | 0.5085 | 0.3966 | 0.0 | 0.8863 | 0.7590 | 0.9533 | 0.0059 | 0.1182 | 0.3766 | 0.0 | | 0.0721 | 47.0 | 9400 | 0.7309 | 0.4177 | 0.4954 | 0.8785 | nan | 0.8496 | 0.9579 | 0.7981 | 0.8473 | 0.5252 | nan | 0.7008 | 0.8264 | 0.0810 | 0.9573 | 0.1389 | 0.0 | 0.6520 | 0.0924 | 0.7349 | 0.0 | 0.0209 | 0.9318 | 0.1881 | 0.6388 | 0.3893 | 0.5309 | nan | 0.2970 | 0.6226 | 0.5797 | 0.0 | 0.9539 | 0.8383 | 0.9824 | 0.0103 | 0.2231 | 0.4843 | 0.0 | nan | 0.7617 | 0.8773 | 0.6923 | 0.7515 | 0.4019 | nan | 0.5902 | 0.6515 | 0.0488 | 0.8656 | 0.1199 | 0.0 | 0.4441 | 0.0924 | 0.5927 | 0.0 | 0.0206 | 0.7722 | 0.1676 | 0.5419 | 0.3287 | 0.3841 | nan | 0.2496 | 0.5102 | 0.3969 | 0.0 | 0.8872 | 0.7594 | 0.9538 | 0.0076 | 0.1178 | 0.3788 | 0.0 | | 0.0765 | 47.1 | 9420 | 0.7436 | 0.4173 | 0.4947 | 0.8780 | nan | 0.8395 | 0.9568 | 0.7953 | 0.8553 | 0.5420 | nan | 0.7069 | 0.8268 | 0.0823 | 0.9524 | 0.1397 | 0.0 | 0.6590 | 0.0776 | 0.7365 | 0.0 | 0.0291 | 0.9325 | 
0.1797 | 0.6467 | 0.3971 | 0.5116 | nan | 0.2776 | 0.6242 | 0.5679 | 0.0 | 0.9525 | 0.8448 | 0.9835 | 0.0085 | 0.2178 | 0.4879 | 0.0 | nan | 0.7576 | 0.8766 | 0.6852 | 0.7475 | 0.4050 | nan | 0.5935 | 0.6497 | 0.0490 | 0.8680 | 0.1209 | 0.0 | 0.4457 | 0.0776 | 0.5973 | 0.0 | 0.0285 | 0.7724 | 0.1608 | 0.5445 | 0.3343 | 0.3842 | nan | 0.2379 | 0.5114 | 0.3974 | 0.0 | 0.8881 | 0.7620 | 0.9534 | 0.0065 | 0.1186 | 0.3802 | 0.0 | | 0.0778 | 47.2 | 9440 | 0.7376 | 0.4153 | 0.4919 | 0.8779 | nan | 0.8420 | 0.9571 | 0.7947 | 0.8531 | 0.5307 | nan | 0.7137 | 0.8216 | 0.0704 | 0.9525 | 0.1361 | 0.0 | 0.6425 | 0.0739 | 0.7302 | 0.0 | 0.0221 | 0.9352 | 0.1703 | 0.6248 | 0.3884 | 0.5102 | nan | 0.2715 | 0.6262 | 0.5748 | 0.0 | 0.9541 | 0.8482 | 0.9821 | 0.0113 | 0.2260 | 0.4762 | 0.0 | nan | 0.7585 | 0.8773 | 0.6851 | 0.7481 | 0.4032 | nan | 0.5925 | 0.6512 | 0.0433 | 0.8679 | 0.1177 | 0.0 | 0.4427 | 0.0739 | 0.5972 | 0.0 | 0.0217 | 0.7690 | 0.1527 | 0.5360 | 0.3284 | 0.3733 | nan | 0.2334 | 0.5107 | 0.3990 | 0.0 | 0.8876 | 0.7631 | 0.9538 | 0.0084 | 0.1165 | 0.3760 | 0.0 | | 0.0621 | 47.3 | 9460 | 0.7279 | 0.4137 | 0.4915 | 0.8774 | nan | 0.8375 | 0.9566 | 0.8020 | 0.8541 | 0.5428 | nan | 0.7096 | 0.8224 | 0.0794 | 0.9576 | 0.1345 | 0.0 | 0.6602 | 0.0616 | 0.7244 | 0.0 | 0.0222 | 0.9324 | 0.1667 | 0.6315 | 0.3798 | 0.5007 | nan | 0.2676 | 0.6308 | 0.5884 | 0.0 | 0.9567 | 0.8354 | 0.9810 | 0.0079 | 0.2086 | 0.4761 | 0.0 | nan | 0.7575 | 0.8770 | 0.6783 | 0.7480 | 0.4013 | nan | 0.5942 | 0.6488 | 0.0470 | 0.8652 | 0.1168 | 0.0 | 0.4413 | 0.0616 | 0.5908 | 0.0 | 0.0220 | 0.7701 | 0.1501 | 0.5382 | 0.3234 | 0.3744 | nan | 0.2276 | 0.5094 | 0.3971 | 0.0 | 0.8862 | 0.7601 | 0.9539 | 0.0060 | 0.1150 | 0.3776 | 0.0 | | 0.0558 | 47.4 | 9480 | 0.7360 | 0.4143 | 0.4952 | 0.8770 | nan | 0.8385 | 0.9552 | 0.8044 | 0.8521 | 0.5415 | nan | 0.7143 | 0.8293 | 0.0805 | 0.9582 | 0.1368 | 0.0 | 0.6804 | 0.0656 | 0.7297 | 0.0 | 0.0222 | 0.9316 | 0.1784 | 0.6315 | 0.3912 | 0.5249 | nan | 0.2854 | 
0.6190 | 0.5898 | 0.0 | 0.9560 | 0.8343 | 0.9832 | 0.0095 | 0.2359 | 0.4682 | 0.0 | nan | 0.7570 | 0.8768 | 0.6773 | 0.7489 | 0.3980 | nan | 0.5937 | 0.6465 | 0.0482 | 0.8650 | 0.1187 | 0.0 | 0.4391 | 0.0656 | 0.5893 | 0.0 | 0.0220 | 0.7709 | 0.1589 | 0.5394 | 0.3281 | 0.3818 | nan | 0.2384 | 0.5090 | 0.3943 | 0.0 | 0.8860 | 0.7567 | 0.9533 | 0.0072 | 0.1149 | 0.3720 | 0.0 | | 0.0726 | 47.5 | 9500 | 0.7314 | 0.4160 | 0.4960 | 0.8772 | nan | 0.8422 | 0.9544 | 0.7934 | 0.8513 | 0.5438 | nan | 0.7173 | 0.8243 | 0.0799 | 0.9579 | 0.1533 | 0.0 | 0.7087 | 0.0749 | 0.7255 | 0.0 | 0.0160 | 0.9311 | 0.2048 | 0.6372 | 0.3851 | 0.5228 | nan | 0.2714 | 0.6266 | 0.5853 | 0.0 | 0.9553 | 0.8319 | 0.9832 | 0.0115 | 0.2038 | 0.4784 | 0.0 | nan | 0.7578 | 0.8768 | 0.6892 | 0.7488 | 0.3959 | nan | 0.5926 | 0.6495 | 0.0482 | 0.8648 | 0.1313 | 0.0 | 0.4538 | 0.0749 | 0.5859 | 0.0 | 0.0158 | 0.7718 | 0.1794 | 0.5413 | 0.3257 | 0.3844 | nan | 0.2306 | 0.5098 | 0.3921 | 0.0 | 0.8859 | 0.7547 | 0.9535 | 0.0087 | 0.1108 | 0.3771 | 0.0 | | 0.0779 | 47.6 | 9520 | 0.7363 | 0.4140 | 0.4934 | 0.8770 | nan | 0.8399 | 0.9569 | 0.7949 | 0.8506 | 0.5383 | nan | 0.7144 | 0.8275 | 0.0841 | 0.9571 | 0.1461 | 0.0 | 0.6956 | 0.0652 | 0.7286 | 0.0 | 0.0193 | 0.9326 | 0.1878 | 0.6376 | 0.3843 | 0.5204 | nan | 0.2447 | 0.6152 | 0.5780 | 0.0 | 0.9556 | 0.8259 | 0.9832 | 0.0079 | 0.2296 | 0.4659 | 0.0 | nan | 0.7575 | 0.8757 | 0.6880 | 0.7494 | 0.3984 | nan | 0.5922 | 0.6447 | 0.0499 | 0.8652 | 0.1262 | 0.0 | 0.4449 | 0.0652 | 0.5913 | 0.0 | 0.0191 | 0.7713 | 0.1667 | 0.5406 | 0.3256 | 0.3877 | nan | 0.2083 | 0.5086 | 0.3915 | 0.0 | 0.8857 | 0.7537 | 0.9534 | 0.0062 | 0.1125 | 0.3698 | 0.0 | | 0.0684 | 47.7 | 9540 | 0.7313 | 0.4147 | 0.4948 | 0.8773 | nan | 0.8394 | 0.9563 | 0.7983 | 0.8544 | 0.5386 | nan | 0.7088 | 0.8248 | 0.0835 | 0.9565 | 0.1451 | 0.0 | 0.6814 | 0.0755 | 0.7316 | 0.0 | 0.0193 | 0.9340 | 0.1910 | 0.6379 | 0.3899 | 0.5246 | nan | 0.2466 | 0.6246 | 0.5960 | 0.0 | 0.9549 | 0.8363 | 0.9820 | 
0.0070 | 0.2391 | 0.4575 | 0.0 | nan | 0.7574 | 0.8760 | 0.6850 | 0.7473 | 0.3987 | nan | 0.5922 | 0.6459 | 0.0498 | 0.8657 | 0.1249 | 0.0 | 0.4395 | 0.0755 | 0.5940 | 0.0 | 0.0191 | 0.7708 | 0.1686 | 0.5411 | 0.3285 | 0.3841 | nan | 0.2108 | 0.5105 | 0.3936 | 0.0 | 0.8873 | 0.7602 | 0.9538 | 0.0055 | 0.1153 | 0.3687 | 0.0 | | 0.0506 | 47.8 | 9560 | 0.7419 | 0.4159 | 0.4964 | 0.8773 | nan | 0.8391 | 0.9555 | 0.7997 | 0.8506 | 0.5418 | nan | 0.7197 | 0.8267 | 0.0754 | 0.9557 | 0.1434 | 0.0 | 0.6838 | 0.0789 | 0.7314 | 0.0 | 0.0211 | 0.9315 | 0.2033 | 0.6406 | 0.3975 | 0.5284 | nan | 0.2621 | 0.6212 | 0.5835 | 0.0 | 0.9555 | 0.8379 | 0.9829 | 0.0084 | 0.2466 | 0.4632 | 0.0 | nan | 0.7574 | 0.8760 | 0.6866 | 0.7498 | 0.3988 | nan | 0.5916 | 0.6472 | 0.0461 | 0.8665 | 0.1237 | 0.0 | 0.4367 | 0.0789 | 0.5946 | 0.0 | 0.0208 | 0.7718 | 0.1779 | 0.5414 | 0.3328 | 0.3886 | nan | 0.2215 | 0.5104 | 0.3945 | 0.0 | 0.8871 | 0.7591 | 0.9536 | 0.0066 | 0.1176 | 0.3700 | 0.0 | | 0.088 | 47.9 | 9580 | 0.7372 | 0.4154 | 0.4948 | 0.8770 | nan | 0.8372 | 0.9557 | 0.7968 | 0.8555 | 0.5386 | nan | 0.7190 | 0.8235 | 0.0752 | 0.9572 | 0.1330 | 0.0 | 0.6780 | 0.0705 | 0.7250 | 0.0 | 0.0221 | 0.9338 | 0.2007 | 0.6348 | 0.3928 | 0.5277 | nan | 0.2733 | 0.6200 | 0.5778 | 0.0 | 0.9557 | 0.8324 | 0.9817 | 0.0100 | 0.2410 | 0.4647 | 0.0 | nan | 0.7565 | 0.8758 | 0.6889 | 0.7478 | 0.3999 | nan | 0.5922 | 0.6489 | 0.0459 | 0.8654 | 0.1161 | 0.0 | 0.4375 | 0.0705 | 0.5903 | 0.0 | 0.0219 | 0.7702 | 0.1762 | 0.5390 | 0.3303 | 0.3902 | nan | 0.2294 | 0.5097 | 0.3954 | 0.0 | 0.8866 | 0.7577 | 0.9539 | 0.0076 | 0.1170 | 0.3704 | 0.0 | | 0.0708 | 48.0 | 9600 | 0.7499 | 0.4138 | 0.4932 | 0.8768 | nan | 0.8380 | 0.9567 | 0.8049 | 0.8498 | 0.5414 | nan | 0.7080 | 0.8187 | 0.0876 | 0.9575 | 0.1376 | 0.0 | 0.6679 | 0.0770 | 0.7211 | 0.0 | 0.0208 | 0.9348 | 0.1811 | 0.6242 | 0.3758 | 0.5263 | nan | 0.2632 | 0.6166 | 0.5870 | 0.0 | 0.9565 | 0.8334 | 0.9838 | 0.0097 | 0.2509 | 0.4512 | 0.0 | nan | 0.7566 | 
0.8760 | 0.6799 | 0.7510 | 0.4006 | nan | 0.5910 | 0.6498 | 0.0504 | 0.8651 | 0.1189 | 0.0 | 0.4321 | 0.0770 | 0.5895 | 0.0 | 0.0206 | 0.7681 | 0.1609 | 0.5345 | 0.3214 | 0.3879 | nan | 0.2187 | 0.5091 | 0.3942 | 0.0 | 0.8860 | 0.7582 | 0.9531 | 0.0074 | 0.1174 | 0.3658 | 0.0 | | 0.0619 | 48.1 | 9620 | 0.7347 | 0.4134 | 0.4906 | 0.8774 | nan | 0.8400 | 0.9564 | 0.7991 | 0.8501 | 0.5427 | nan | 0.7101 | 0.8264 | 0.0870 | 0.9591 | 0.1333 | 0.0 | 0.6645 | 0.0631 | 0.7117 | 0.0 | 0.0173 | 0.9342 | 0.1844 | 0.6364 | 0.3685 | 0.4912 | nan | 0.2683 | 0.6247 | 0.5788 | 0.0 | 0.9545 | 0.8391 | 0.9843 | 0.0087 | 0.2019 | 0.4636 | 0.0 | nan | 0.7576 | 0.8762 | 0.6888 | 0.7512 | 0.4006 | nan | 0.5913 | 0.6446 | 0.0503 | 0.8633 | 0.1157 | 0.0 | 0.4366 | 0.0631 | 0.5840 | 0.0 | 0.0172 | 0.7691 | 0.1639 | 0.5377 | 0.3190 | 0.3767 | nan | 0.2234 | 0.5091 | 0.3951 | 0.0 | 0.8869 | 0.7603 | 0.9529 | 0.0067 | 0.1134 | 0.3731 | 0.0 | | 0.0679 | 48.2 | 9640 | 0.7443 | 0.4146 | 0.4932 | 0.8774 | nan | 0.8398 | 0.9563 | 0.8032 | 0.8523 | 0.5367 | nan | 0.7114 | 0.8243 | 0.0852 | 0.9570 | 0.1333 | 0.0 | 0.6632 | 0.0695 | 0.7156 | 0.0 | 0.0205 | 0.9321 | 0.1847 | 0.6359 | 0.3747 | 0.5341 | nan | 0.2829 | 0.6160 | 0.5802 | 0.0 | 0.9557 | 0.8381 | 0.9843 | 0.0100 | 0.2119 | 0.4738 | 0.0 | nan | 0.7574 | 0.8765 | 0.6845 | 0.7502 | 0.4017 | nan | 0.5913 | 0.6466 | 0.0493 | 0.8652 | 0.1155 | 0.0 | 0.4376 | 0.0695 | 0.5849 | 0.0 | 0.0204 | 0.7699 | 0.1649 | 0.5374 | 0.3215 | 0.3882 | nan | 0.2340 | 0.5089 | 0.3947 | 0.0 | 0.8867 | 0.7605 | 0.9530 | 0.0075 | 0.1149 | 0.3748 | 0.0 | | 0.0938 | 48.3 | 9660 | 0.7429 | 0.4140 | 0.4931 | 0.8774 | nan | 0.8406 | 0.9563 | 0.8019 | 0.8513 | 0.5320 | nan | 0.7138 | 0.8390 | 0.0770 | 0.9569 | 0.1339 | 0.0 | 0.6531 | 0.0543 | 0.7198 | 0.0 | 0.0187 | 0.9321 | 0.1847 | 0.6366 | 0.3761 | 0.5390 | nan | 0.2840 | 0.6273 | 0.5741 | 0.0 | 0.9549 | 0.8357 | 0.9836 | 0.0111 | 0.2154 | 0.4776 | 0.0 | nan | 0.7576 | 0.8767 | 0.6863 | 0.7500 | 0.4008 | nan | 0.5908 | 
0.6389 | 0.0470 | 0.8657 | 0.1162 | 0.0 | 0.4395 | 0.0543 | 0.5858 | 0.0 | 0.0185 | 0.7704 | 0.1651 | 0.5387 | 0.3217 | 0.3894 | nan | 0.2368 | 0.5099 | 0.3934 | 0.0 | 0.8866 | 0.7584 | 0.9532 | 0.0083 | 0.1141 | 0.3750 | 0.0 | | 0.0591 | 48.4 | 9680 | 0.7427 | 0.4138 | 0.4931 | 0.8771 | nan | 0.8375 | 0.9563 | 0.8043 | 0.8503 | 0.5410 | nan | 0.7166 | 0.8399 | 0.0708 | 0.9560 | 0.1297 | 0.0 | 0.6557 | 0.0510 | 0.7179 | 0.0 | 0.0270 | 0.9321 | 0.1910 | 0.6275 | 0.3767 | 0.5428 | nan | 0.2835 | 0.6128 | 0.5744 | 0.0 | 0.9556 | 0.8359 | 0.9835 | 0.0084 | 0.2141 | 0.4868 | 0.0 | nan | 0.7571 | 0.8764 | 0.6832 | 0.7505 | 0.4016 | nan | 0.5924 | 0.6416 | 0.0450 | 0.8665 | 0.1128 | 0.0 | 0.4379 | 0.0510 | 0.5873 | 0.0 | 0.0268 | 0.7699 | 0.1697 | 0.5351 | 0.3212 | 0.3897 | nan | 0.2340 | 0.5080 | 0.3918 | 0.0 | 0.8863 | 0.7582 | 0.9532 | 0.0065 | 0.1118 | 0.3760 | 0.0 | | 0.0523 | 48.5 | 9700 | 0.7421 | 0.4144 | 0.4949 | 0.8775 | nan | 0.8384 | 0.9572 | 0.8026 | 0.8506 | 0.5378 | nan | 0.7151 | 0.8431 | 0.0752 | 0.9573 | 0.1406 | 0.0 | 0.6618 | 0.0523 | 0.7143 | 0.0 | 0.0241 | 0.9290 | 0.1937 | 0.6402 | 0.3804 | 0.5604 | nan | 0.2732 | 0.6196 | 0.5871 | 0.0 | 0.9550 | 0.8367 | 0.9846 | 0.0083 | 0.2182 | 0.4810 | 0.0 | nan | 0.7572 | 0.8761 | 0.6848 | 0.7501 | 0.4023 | nan | 0.5922 | 0.6393 | 0.0462 | 0.8657 | 0.1215 | 0.0 | 0.4336 | 0.0523 | 0.5862 | 0.0 | 0.0239 | 0.7723 | 0.1720 | 0.5407 | 0.3235 | 0.3985 | nan | 0.2255 | 0.5089 | 0.3921 | 0.0 | 0.8870 | 0.7597 | 0.9527 | 0.0065 | 0.1145 | 0.3759 | 0.0 | | 0.0637 | 48.6 | 9720 | 0.7331 | 0.4134 | 0.4923 | 0.8774 | nan | 0.8411 | 0.9564 | 0.8016 | 0.8486 | 0.5389 | nan | 0.7131 | 0.8436 | 0.0734 | 0.9559 | 0.1355 | 0.0 | 0.6678 | 0.0479 | 0.7143 | 0.0 | 0.0212 | 0.9321 | 0.1742 | 0.6388 | 0.3770 | 0.5414 | nan | 0.2821 | 0.6092 | 0.5614 | 0.0 | 0.9559 | 0.8377 | 0.9840 | 0.0100 | 0.2209 | 0.4709 | 0.0 | nan | 0.7578 | 0.8764 | 0.6842 | 0.7507 | 0.4014 | nan | 0.5907 | 0.6393 | 0.0455 | 0.8661 | 0.1174 | 0.0 | 0.4289 | 
0.0479 | 0.5867 | 0.0 | 0.0211 | 0.7708 | 0.1560 | 0.5389 | 0.3215 | 0.3930 | nan | 0.2359 | 0.5076 | 0.3937 | 0.0 | 0.8865 | 0.7603 | 0.9530 | 0.0076 | 0.1152 | 0.3744 | 0.0 | | 0.0684 | 48.7 | 9740 | 0.7413 | 0.4147 | 0.4926 | 0.8777 | nan | 0.8393 | 0.9575 | 0.7960 | 0.8495 | 0.5399 | nan | 0.7155 | 0.8305 | 0.0608 | 0.9568 | 0.1343 | 0.0 | 0.6793 | 0.0566 | 0.7062 | 0.0 | 0.0236 | 0.9330 | 0.1737 | 0.6419 | 0.3874 | 0.5362 | nan | 0.2728 | 0.6194 | 0.5792 | 0.0 | 0.9542 | 0.8375 | 0.9835 | 0.0097 | 0.2096 | 0.4797 | 0.0 | nan | 0.7577 | 0.8758 | 0.6888 | 0.7504 | 0.4032 | nan | 0.5920 | 0.6493 | 0.0382 | 0.8655 | 0.1165 | 0.0 | 0.4411 | 0.0566 | 0.5861 | 0.0 | 0.0234 | 0.7713 | 0.1561 | 0.5405 | 0.3279 | 0.3945 | nan | 0.2280 | 0.5106 | 0.3945 | 0.0 | 0.8875 | 0.7615 | 0.9533 | 0.0074 | 0.1156 | 0.3780 | 0.0 | | 0.0569 | 48.8 | 9760 | 0.7418 | 0.4147 | 0.4920 | 0.8776 | nan | 0.8406 | 0.9572 | 0.7979 | 0.8524 | 0.5363 | nan | 0.7072 | 0.8246 | 0.0664 | 0.9582 | 0.1298 | 0.0 | 0.6691 | 0.0599 | 0.7100 | 0.0 | 0.0233 | 0.9336 | 0.1790 | 0.6340 | 0.3755 | 0.5214 | nan | 0.2883 | 0.6211 | 0.5849 | 0.0 | 0.9546 | 0.8381 | 0.9827 | 0.0090 | 0.2097 | 0.4801 | 0.0 | nan | 0.7578 | 0.8763 | 0.6864 | 0.7495 | 0.4025 | nan | 0.5912 | 0.6513 | 0.0409 | 0.8647 | 0.1129 | 0.0 | 0.4435 | 0.0599 | 0.5852 | 0.0 | 0.0232 | 0.7699 | 0.1597 | 0.5369 | 0.3219 | 0.3894 | nan | 0.2395 | 0.5099 | 0.3934 | 0.0 | 0.8873 | 0.7619 | 0.9536 | 0.0069 | 0.1146 | 0.3783 | 0.0 | | 0.0643 | 48.9 | 9780 | 0.7366 | 0.4135 | 0.4904 | 0.8775 | nan | 0.8419 | 0.9566 | 0.7893 | 0.8522 | 0.5418 | nan | 0.7185 | 0.8334 | 0.0726 | 0.9567 | 0.1313 | 0.0 | 0.6664 | 0.0590 | 0.7157 | 0.0 | 0.0185 | 0.9318 | 0.1687 | 0.6233 | 0.3777 | 0.5091 | nan | 0.2657 | 0.6215 | 0.5780 | 0.0 | 0.9570 | 0.8320 | 0.9825 | 0.0083 | 0.1973 | 0.4853 | 0.0 | nan | 0.7581 | 0.8762 | 0.6937 | 0.7500 | 0.4024 | nan | 0.5925 | 0.6464 | 0.0436 | 0.8658 | 0.1139 | 0.0 | 0.4394 | 0.0590 | 0.5873 | 0.0 | 0.0184 | 0.7697 | 0.1521 | 
0.5328 | 0.3214 | 0.3823 | nan | 0.2255 | 0.5103 | 0.3930 | 0.0 | 0.8863 | 0.7600 | 0.9536 | 0.0065 | 0.1109 | 0.3795 | 0.0 | | 0.0618 | 49.0 | 9800 | 0.7324 | 0.4133 | 0.4908 | 0.8776 | nan | 0.8385 | 0.9577 | 0.8009 | 0.8520 | 0.5329 | nan | 0.7132 | 0.8277 | 0.0697 | 0.9596 | 0.1262 | 0.0 | 0.6673 | 0.0580 | 0.7046 | 0.0 | 0.0220 | 0.9325 | 0.1735 | 0.6305 | 0.3849 | 0.5274 | nan | 0.2633 | 0.6253 | 0.5768 | 0.0 | 0.9547 | 0.8406 | 0.9843 | 0.0085 | 0.1999 | 0.4747 | 0.0 | nan | 0.7577 | 0.8761 | 0.6846 | 0.7503 | 0.4020 | nan | 0.5919 | 0.6485 | 0.0424 | 0.8634 | 0.1100 | 0.0 | 0.4368 | 0.0580 | 0.5832 | 0.0 | 0.0219 | 0.7696 | 0.1555 | 0.5351 | 0.3255 | 0.3876 | nan | 0.2220 | 0.5101 | 0.3954 | 0.0 | 0.8875 | 0.7621 | 0.9528 | 0.0066 | 0.1131 | 0.3773 | 0.0 | | 0.0722 | 49.1 | 9820 | 0.7347 | 0.4152 | 0.4928 | 0.8777 | nan | 0.8403 | 0.9565 | 0.7873 | 0.8557 | 0.5384 | nan | 0.7119 | 0.8282 | 0.0710 | 0.9564 | 0.1294 | 0.0 | 0.6734 | 0.0744 | 0.7173 | 0.0 | 0.0147 | 0.9324 | 0.1844 | 0.6363 | 0.3878 | 0.5176 | nan | 0.2736 | 0.6280 | 0.5896 | 0.0 | 0.9558 | 0.8401 | 0.9836 | 0.0111 | 0.2020 | 0.4727 | 0.0 | nan | 0.7581 | 0.8762 | 0.6974 | 0.7481 | 0.4016 | nan | 0.5941 | 0.6497 | 0.0431 | 0.8656 | 0.1120 | 0.0 | 0.4399 | 0.0744 | 0.5860 | 0.0 | 0.0146 | 0.7702 | 0.1637 | 0.5370 | 0.3277 | 0.3850 | nan | 0.2310 | 0.5107 | 0.3964 | 0.0 | 0.8870 | 0.7609 | 0.9530 | 0.0084 | 0.1158 | 0.3774 | 0.0 | | 0.0848 | 49.2 | 9840 | 0.7302 | 0.4154 | 0.4924 | 0.8779 | nan | 0.8416 | 0.9570 | 0.7898 | 0.8545 | 0.5362 | nan | 0.7096 | 0.8272 | 0.0683 | 0.9567 | 0.1272 | 0.0 | 0.6605 | 0.0652 | 0.7112 | 0.0 | 0.0269 | 0.9322 | 0.1873 | 0.6383 | 0.3877 | 0.5232 | nan | 0.2786 | 0.6276 | 0.5780 | 0.0 | 0.9544 | 0.8408 | 0.9845 | 0.0039 | 0.2122 | 0.4777 | 0.0 | nan | 0.7582 | 0.8762 | 0.6952 | 0.7485 | 0.4011 | nan | 0.5941 | 0.6516 | 0.0409 | 0.8653 | 0.1102 | 0.0 | 0.4407 | 0.0652 | 0.5879 | 0.0 | 0.0267 | 0.7704 | 0.1661 | 0.5384 | 0.3281 | 0.3896 | nan | 0.2317 | 0.5104 | 
0.3957 | 0.0 | 0.8876 | 0.7614 | 0.9526 | 0.0032 | 0.1171 | 0.3778 | 0.0 | | 0.0643 | 49.3 | 9860 | 0.7355 | 0.4159 | 0.4939 | 0.8777 | nan | 0.8423 | 0.9557 | 0.7939 | 0.8495 | 0.5429 | nan | 0.7175 | 0.8320 | 0.0748 | 0.9574 | 0.1283 | 0.0 | 0.6710 | 0.0728 | 0.7084 | 0.0 | 0.0211 | 0.9312 | 0.1863 | 0.6321 | 0.3859 | 0.5260 | nan | 0.2964 | 0.6212 | 0.5756 | 0.0 | 0.9555 | 0.8396 | 0.9838 | 0.0094 | 0.2111 | 0.4841 | 0.0 | nan | 0.7584 | 0.8765 | 0.6940 | 0.7514 | 0.4006 | nan | 0.5930 | 0.6498 | 0.0438 | 0.8652 | 0.1112 | 0.0 | 0.4418 | 0.0728 | 0.5869 | 0.0 | 0.0209 | 0.7707 | 0.1661 | 0.5366 | 0.3264 | 0.3899 | nan | 0.2456 | 0.5099 | 0.3942 | 0.0 | 0.8869 | 0.7603 | 0.9532 | 0.0072 | 0.1160 | 0.3788 | 0.0 | | 0.0671 | 49.4 | 9880 | 0.7447 | 0.4160 | 0.4946 | 0.8777 | nan | 0.8400 | 0.9564 | 0.8003 | 0.8508 | 0.5393 | nan | 0.7193 | 0.8202 | 0.0841 | 0.9561 | 0.1303 | 0.0 | 0.6808 | 0.0803 | 0.7160 | 0.0 | 0.0180 | 0.9329 | 0.1805 | 0.6364 | 0.3888 | 0.5383 | nan | 0.2899 | 0.6208 | 0.5842 | 0.0 | 0.9553 | 0.8362 | 0.9844 | 0.0111 | 0.1967 | 0.4782 | 0.0 | nan | 0.7581 | 0.8765 | 0.6879 | 0.7514 | 0.4014 | nan | 0.5924 | 0.6533 | 0.0488 | 0.8656 | 0.1124 | 0.0 | 0.4460 | 0.0803 | 0.5854 | 0.0 | 0.0179 | 0.7700 | 0.1612 | 0.5372 | 0.3270 | 0.3889 | nan | 0.2425 | 0.5099 | 0.3944 | 0.0 | 0.8867 | 0.7602 | 0.9529 | 0.0084 | 0.1167 | 0.3799 | 0.0 | | 0.0706 | 49.5 | 9900 | 0.7401 | 0.4150 | 0.4933 | 0.8775 | nan | 0.8401 | 0.9563 | 0.7972 | 0.8519 | 0.5382 | nan | 0.7231 | 0.8251 | 0.0681 | 0.9572 | 0.1265 | 0.0 | 0.6689 | 0.0728 | 0.7077 | 0.0 | 0.0200 | 0.9336 | 0.1805 | 0.6236 | 0.3934 | 0.5249 | nan | 0.2937 | 0.6237 | 0.5834 | 0.0 | 0.9554 | 0.8395 | 0.9831 | 0.0091 | 0.2201 | 0.4694 | 0.0 | nan | 0.7584 | 0.8766 | 0.6901 | 0.7510 | 0.4012 | nan | 0.5928 | 0.6518 | 0.0408 | 0.8652 | 0.1098 | 0.0 | 0.4336 | 0.0728 | 0.5861 | 0.0 | 0.0199 | 0.7688 | 0.1606 | 0.5332 | 0.3284 | 0.3879 | nan | 0.2446 | 0.5098 | 0.3961 | 0.0 | 0.8869 | 0.7609 | 0.9535 | 0.0070 | 
0.1169 | 0.3748 | 0.0 | | 0.1163 | 49.6 | 9920 | 0.7314 | 0.4154 | 0.4931 | 0.8776 | nan | 0.8406 | 0.9555 | 0.7982 | 0.8534 | 0.5451 | nan | 0.7131 | 0.8257 | 0.0790 | 0.9569 | 0.1210 | 0.0 | 0.6700 | 0.0738 | 0.7103 | 0.0 | 0.0224 | 0.9322 | 0.1839 | 0.6400 | 0.3738 | 0.5067 | nan | 0.3024 | 0.6231 | 0.5802 | 0.0 | 0.9561 | 0.8347 | 0.9831 | 0.0082 | 0.2098 | 0.4802 | 0.0 | nan | 0.7584 | 0.8768 | 0.6893 | 0.7502 | 0.4007 | nan | 0.5928 | 0.6520 | 0.0460 | 0.8651 | 0.1058 | 0.0 | 0.4382 | 0.0738 | 0.5842 | 0.0 | 0.0223 | 0.7704 | 0.1638 | 0.5387 | 0.3210 | 0.3845 | nan | 0.2505 | 0.5099 | 0.3950 | 0.0 | 0.8865 | 0.7604 | 0.9534 | 0.0063 | 0.1171 | 0.3788 | 0.0 | | 0.0655 | 49.7 | 9940 | 0.7432 | 0.4168 | 0.4966 | 0.8777 | nan | 0.8411 | 0.9569 | 0.7993 | 0.8503 | 0.5401 | nan | 0.7121 | 0.8312 | 0.0845 | 0.9536 | 0.1329 | 0.0 | 0.6689 | 0.0911 | 0.7217 | 0.0 | 0.0183 | 0.9332 | 0.1866 | 0.6354 | 0.3938 | 0.5485 | nan | 0.2954 | 0.6268 | 0.5871 | 0.0 | 0.9534 | 0.8387 | 0.9836 | 0.0106 | 0.2143 | 0.4811 | 0.0 | nan | 0.7581 | 0.8764 | 0.6886 | 0.7514 | 0.4024 | nan | 0.5921 | 0.6494 | 0.0486 | 0.8673 | 0.1144 | 0.0 | 0.4376 | 0.0911 | 0.5913 | 0.0 | 0.0181 | 0.7701 | 0.1660 | 0.5373 | 0.3309 | 0.3936 | nan | 0.2439 | 0.5109 | 0.3925 | 0.0 | 0.8879 | 0.7621 | 0.9533 | 0.0080 | 0.1168 | 0.3782 | 0.0 | | 0.0654 | 49.8 | 9960 | 0.7503 | 0.4162 | 0.4953 | 0.8776 | nan | 0.8410 | 0.9559 | 0.8005 | 0.8522 | 0.5450 | nan | 0.7082 | 0.8301 | 0.0806 | 0.9556 | 0.1303 | 0.0 | 0.6670 | 0.0746 | 0.7186 | 0.0 | 0.0215 | 0.9323 | 0.1870 | 0.6362 | 0.3906 | 0.5446 | nan | 0.3029 | 0.6134 | 0.5720 | 0.0 | 0.9558 | 0.8356 | 0.9824 | 0.0106 | 0.2257 | 0.4792 | 0.0 | nan | 0.7582 | 0.8768 | 0.6872 | 0.7508 | 0.4013 | nan | 0.5918 | 0.6502 | 0.0475 | 0.8662 | 0.1127 | 0.0 | 0.4361 | 0.0745 | 0.5863 | 0.0 | 0.0213 | 0.7704 | 0.1661 | 0.5377 | 0.3288 | 0.3947 | nan | 0.2522 | 0.5096 | 0.3940 | 0.0 | 0.8870 | 0.7608 | 0.9537 | 0.0080 | 0.1186 | 0.3769 | 0.0 | | 0.0668 | 49.9 | 9980 | 
0.7339 | 0.4160 | 0.4957 | 0.8778 | nan | 0.8397 | 0.9574 | 0.8012 | 0.8533 | 0.5329 | nan | 0.7149 | 0.8324 | 0.0808 | 0.9581 | 0.1315 | 0.0 | 0.6686 | 0.0743 | 0.7097 | 0.0 | 0.0236 | 0.9318 | 0.1824 | 0.6321 | 0.3937 | 0.5660 | nan | 0.2925 | 0.6250 | 0.5827 | 0.0 | 0.9539 | 0.8401 | 0.9840 | 0.0107 | 0.2145 | 0.4754 | 0.0 | nan | 0.7578 | 0.8765 | 0.6865 | 0.7502 | 0.4021 | nan | 0.5921 | 0.6487 | 0.0468 | 0.8647 | 0.1139 | 0.0 | 0.4381 | 0.0743 | 0.5855 | 0.0 | 0.0234 | 0.7708 | 0.1630 | 0.5373 | 0.3288 | 0.3982 | nan | 0.2423 | 0.5108 | 0.3945 | 0.0 | 0.8876 | 0.7623 | 0.9531 | 0.0081 | 0.1176 | 0.3773 | 0.0 | | 0.0611 | 50.0 | 10000 | 0.7413 | 0.4159 | 0.4945 | 0.8774 | nan | 0.8388 | 0.9573 | 0.8007 | 0.8552 | 0.5279 | nan | 0.7175 | 0.8243 | 0.0845 | 0.9557 | 0.1300 | 0.0 | 0.6660 | 0.0883 | 0.7186 | 0.0 | 0.0195 | 0.9361 | 0.1788 | 0.6178 | 0.3916 | 0.5516 | nan | 0.2882 | 0.6120 | 0.5812 | 0.0 | 0.9537 | 0.8377 | 0.9844 | 0.0132 | 0.2116 | 0.4815 | 0.0 | nan | 0.7575 | 0.8765 | 0.6860 | 0.7491 | 0.4030 | nan | 0.5916 | 0.6521 | 0.0488 | 0.8662 | 0.1122 | 0.0 | 0.4442 | 0.0883 | 0.5874 | 0.0 | 0.0193 | 0.7672 | 0.1600 | 0.5304 | 0.3282 | 0.3920 | nan | 0.2410 | 0.5084 | 0.3941 | 0.0 | 0.8872 | 0.7611 | 0.9529 | 0.0098 | 0.1155 | 0.3780 | 0.0 | ### Framework versions - Transformers 4.32.1 - Pytorch 2.0.1 - Datasets 2.12.0 - Tokenizers 0.13.2
Audi24/fire_classifier
Audi24
2023-09-22T00:06:37Z
65
1
transformers
[ "transformers", "tf", "vit", "image-classification", "generated_from_keras_callback", "base_model:google/vit-base-patch16-224-in21k", "base_model:finetune:google/vit-base-patch16-224-in21k", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
image-classification
2023-09-20T01:08:00Z
--- license: apache-2.0 base_model: google/vit-base-patch16-224-in21k tags: - generated_from_keras_callback model-index: - name: Audi24/fire_classifier results: [] --- <!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # Audi24/fire_classifier This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 0.1936 - Validation Loss: 0.1743 - Train Accuracy: 0.9889 - Epoch: 4 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 1755, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01} - training_precision: float32 ### Training results | Train Loss | Validation Loss | Train Accuracy | Epoch | |:----------:|:---------------:|:--------------:|:-----:| | 1.0088 | 0.8898 | 0.8667 | 0 | | 0.7325 | 0.6165 | 0.9333 | 1 | | 0.4620 | 0.3794 | 0.9444 | 2 | | 0.3100 | 0.2546 | 0.9667 | 3 | | 0.1936 | 0.1743 | 0.9889 | 4 | ### Framework versions - Transformers 4.33.2 - TensorFlow 2.13.0 - Datasets 2.14.5 - Tokenizers 0.13.3
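The card above trains with a Keras `PolynomialDecay` schedule (`initial_learning_rate` 3e-05, `decay_steps` 1755, `end_learning_rate` 0.0, `power` 1.0, `cycle` False). With `power=1.0` that is just a linear ramp from 3e-05 down to 0 over 1755 steps, then constant. A minimal pure-Python sketch of the schedule (the function name is ours, not part of the card):

```python
def polynomial_decay(step, initial_lr=3e-05, decay_steps=1755,
                     end_lr=0.0, power=1.0):
    """Pure-Python mirror of Keras PolynomialDecay with cycle=False.

    With power=1.0 this is a straight linear ramp from initial_lr
    down to end_lr over decay_steps, then constant at end_lr.
    """
    step = min(step, decay_steps)           # clamp once decay is finished
    fraction = 1.0 - step / decay_steps
    return (initial_lr - end_lr) * fraction ** power + end_lr
```

For example, at step 351 (20% of the way through) the rate is 80% of the initial value, 2.4e-05.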
Saul98lm/Prueba4
Saul98lm
2023-09-21T23:39:32Z
106
0
transformers
[ "transformers", "pytorch", "tensorboard", "roberta", "text-classification", "generated_from_trainer", "dataset:glue", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
text-classification
2023-09-21T22:31:37Z
--- license: apache-2.0 tags: - generated_from_trainer datasets: - glue metrics: - accuracy - f1 model-index: - name: Prueba4 results: - task: name: Text Classification type: text-classification dataset: name: glue type: glue config: mrpc split: validation args: mrpc metrics: - name: Accuracy type: accuracy value: 0.8406862745098039 - name: F1 type: f1 value: 0.8845470692717585 --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Prueba4 This model is a fine-tuned version of [distilroberta-base](https://huggingface.co/distilroberta-base) on the glue dataset. It achieves the following results on the evaluation set: - Loss: 1.1542 - Accuracy: 0.8407 - F1: 0.8845 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.0931 | 1.09 | 500 | 1.1624 | 0.8260 | 0.8807 | | 0.0917 | 2.18 | 1000 | 1.1542 | 0.8407 | 0.8845 | ### Framework versions - Transformers 4.28.1 - Pytorch 2.0.1+cu118 - Datasets 2.14.5 - Tokenizers 0.13.3
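The card above reports both accuracy (0.8407) and F1 (0.8845) because MRPC is imbalanced toward the positive (paraphrase) class, so binary F1 typically sits above accuracy. A minimal sketch of how the two metrics relate, computed from confusion-matrix counts (the counts in the test are illustrative, not from this run):

```python
def accuracy_and_f1(tp, fp, fn, tn):
    """Accuracy and binary F1 from confusion-matrix counts,
    as used for GLUE/MRPC evaluation. Returns (accuracy, f1)."""
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, f1
```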
daochf/Lora-MetaLlama2-7b-hf-PuceDs05-v01
daochf
2023-09-21T23:31:28Z
0
0
peft
[ "peft", "region:us" ]
null
2023-09-21T23:30:42Z
--- library_name: peft --- ## Training procedure The following `bitsandbytes` quantization config was used during training: - quant_method: bitsandbytes - load_in_8bit: False - load_in_4bit: True - llm_int8_threshold: 6.0 - llm_int8_skip_modules: None - llm_int8_enable_fp32_cpu_offload: False - llm_int8_has_fp16_weight: False - bnb_4bit_quant_type: nf4 - bnb_4bit_use_double_quant: True - bnb_4bit_compute_dtype: float16 ### Framework versions - PEFT 0.5.0
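The bitsandbytes settings listed above (4-bit NF4, double quantization, fp16 compute) describe how the base model was loaded during LoRA training; loading the adapter for inference would reuse the same quantization config. A sketch, assuming the adapter repo is public, access to the gated Llama-2 base has been granted, and a GPU is available (heavy imports are kept inside the function so the sketch stays importable without `transformers`/`peft` installed):

```python
# Mirrors the bitsandbytes quantization config listed in the card.
QUANT_CONFIG = {
    "load_in_4bit": True,
    "bnb_4bit_quant_type": "nf4",
    "bnb_4bit_use_double_quant": True,
    "bnb_4bit_compute_dtype": "float16",
}

def load_adapter(base_id="meta-llama/Llama-2-7b-hf",
                 adapter_id="daochf/Lora-MetaLlama2-7b-hf-PuceDs05-v01"):
    """Load the 4-bit base model and attach the LoRA adapter."""
    import torch
    from transformers import AutoModelForCausalLM, BitsAndBytesConfig
    from peft import PeftModel

    bnb = BitsAndBytesConfig(
        load_in_4bit=QUANT_CONFIG["load_in_4bit"],
        bnb_4bit_quant_type=QUANT_CONFIG["bnb_4bit_quant_type"],
        bnb_4bit_use_double_quant=QUANT_CONFIG["bnb_4bit_use_double_quant"],
        bnb_4bit_compute_dtype=torch.float16,
    )
    base = AutoModelForCausalLM.from_pretrained(
        base_id, quantization_config=bnb, device_map="auto")
    return PeftModel.from_pretrained(base, adapter_id)
```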
aadel4/kid-whisper-small-en-myst_cslu
aadel4
2023-09-21T23:17:44Z
113
0
transformers
[ "transformers", "pytorch", "whisper", "automatic-speech-recognition", "generated_from_trainer", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2023-09-20T21:14:18Z
--- license: apache-2.0 tags: - generated_from_trainer metrics: - wer model-index: - name: openai/whisper-small-en results: - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: myst-test type: asr config: en split: test metrics: - type: wer value: 9.21 name: WER - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: cslu_scripted type: asr config: en split: test metrics: - type: wer value: 2.59 name: WER - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: cslu_spontaneous type: asr config: en split: test metrics: - type: wer value: 27.16 name: WER - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: librispeech type: asr config: en split: testclean metrics: - type: wer value: 4.74 name: WER --- # openai/whisper-small-en This model is a fine-tuned version of [openai/whisper-small-en](https://huggingface.co/openai/whisper-small-en) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.24359586834907532 - Wer: 8.182209008725348 ## Training and evaluation data Training data: Myst Train (125 hours) + CSLU Scripted train (35 hours) Evaluation data: Myst Dev (20.9 hours) + CSLU Scripted Dev (4.8 hours) ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 64 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - training_steps: 10000 - converged_after: 1000
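The Whisper fine-tuning runs above all use `lr_scheduler_type: linear` with 500 warmup steps over 10000 training steps, i.e. the learning rate ramps linearly from 0 to the peak 1e-05, then decays linearly back to 0 (the semantics of `transformers`' `get_linear_schedule_with_warmup`). A minimal pure-Python sketch of the per-step learning rate (the function name is ours):

```python
def linear_schedule_with_warmup(step, peak_lr=1e-05,
                                warmup_steps=500, total_steps=10000):
    """Learning rate at a given optimizer step for the card's schedule:
    linear warmup to peak_lr, then linear decay to 0."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

The peak is reached exactly at step 500 and the rate hits 0 at step 10000; halfway through the decay phase (step 5250) the rate is half the peak.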
aadel4/kid-whisper-small-en-myst
aadel4
2023-09-21T23:17:01Z
115
0
transformers
[ "transformers", "pytorch", "whisper", "automatic-speech-recognition", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2023-09-20T20:50:44Z
--- license: apache-2.0 metrics: - wer model-index: - name: openai/whisper-small-en results: - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: myst-test type: asr config: en split: test metrics: - type: wer value: 9.11 name: WER - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: cslu_scripted type: asr config: en split: test metrics: - type: wer value: 33.85 name: WER - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: cslu_spontaneous type: asr config: en split: test metrics: - type: wer value: 28.47 name: WER - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: librispeech type: asr config: en split: testclean metrics: - type: wer value: 4.18 name: WER --- # openai/whisper-small-en This model is a fine-tuned version of [openai/whisper-small-en](https://huggingface.co/openai/whisper-small-en) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.26971688866615295 - Wer: 8.508066331024994 ## Training and evaluation data - Training data: Myst Train (125 hours) - Evaluation data: Myst Dev (20.9 hours) ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 64 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - training_steps: 10000 - converged_after: 1000
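All of the ASR cards above report word error rate (WER): the word-level Levenshtein distance (substitutions + insertions + deletions) between reference and hypothesis transcripts, divided by the reference length. A self-contained sketch of the metric, for readers who want to sanity-check the reported numbers on their own data (production evaluations usually also normalize casing and punctuation first, which this sketch omits):

```python
def wer(reference, hypothesis):
    """Word error rate: word-level edit distance between the reference
    and hypothesis transcripts, divided by the reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # prev[j] holds the edit distance between ref[:i-1] and hyp[:j]
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        curr = [i]
        for j, h in enumerate(hyp, 1):
            cost = 0 if r == h else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution / match
        prev = curr
    return prev[-1] / len(ref)
```

For example, one inserted word against a three-word reference gives a WER of 1/3.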
aadel4/kid-whisper-small-myst
aadel4
2023-09-21T23:15:34Z
114
0
transformers
[ "transformers", "pytorch", "whisper", "automatic-speech-recognition", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
automatic-speech-recognition
2023-09-21T22:48:46Z
--- license: apache-2.0 metrics: - wer model-index: - name: openai/whisper-small results: - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: myst-test type: asr config: en split: test metrics: - type: wer value: 11.80 name: WER - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: cslu_scripted type: asr config: en split: test metrics: - type: wer value: 55.51 name: WER - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: cslu_spontaneous type: asr config: en split: test metrics: - type: wer value: 28.53 name: WER - task: type: automatic-speech-recognition name: Automatic Speech Recognition dataset: name: librispeech type: asr config: en split: testclean metrics: - type: wer value: 6.23 name: WER --- # openai/whisper-small This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.26971688866615295 - Wer: 8.508066331024994 ## Training and evaluation data - Training data: Myst Train (125 hours) - Evaluation data: Myst Dev (20.9 hours) ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 64 - eval_batch_size: 32 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - training_steps: 10000 - converged_after: 2500
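Checkpoints like the ones above can be run through the standard `transformers` ASR pipeline. A sketch, assuming the repo is public and `transformers` plus an audio backend (ffmpeg) are installed; the import is kept inside the function so the sketch stays importable without downloading anything:

```python
def transcribe(audio_path, model_id="aadel4/kid-whisper-small-myst"):
    """Sketch: transcribe one audio file with a fine-tuned Whisper checkpoint."""
    from transformers import pipeline  # heavy import kept local
    asr = pipeline("automatic-speech-recognition", model=model_id)
    return asr(audio_path)["text"]
```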